new advisories work and feature gaps work
# Sprint 20260112-002-EVIDENCE - EvidenceLocker Audit Pack Hardening

## Topic & Scope

- Extend EvidenceLocker bundle metadata and manifests with transparency and RFC3161 timestamp references aligned to the new evidence pack schemas.
- Add explicit object-lock configuration and enforcement in S3 storage to support WORM retention and legal hold behavior.
- Evidence to produce: code and tests under `src/EvidenceLocker/StellaOps.EvidenceLocker` plus updated EvidenceLocker AGENTS entries.
- **Working directory:** `src/EvidenceLocker/StellaOps.EvidenceLocker`.
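
The first scope item adds transparency and RFC3161 timestamp references to bundle metadata. A minimal sketch of what such metadata records might look like follows; the type and property names (`TransparencyReference`, `TimestampReference`, `EvidenceBundleMetadata`) mirror the execution log below, but the field sets are illustrative assumptions, not the exact shapes in `EvidenceBundleBuildModels.cs`.

```csharp
using System;
using System.Collections.Immutable;

// Hypothetical shapes for the manifest metadata extension; the real records
// live in EvidenceBundleBuildModels.cs and may differ in naming and fields.
public sealed record TransparencyReference(
    string LogUrl,          // e.g. a Rekor instance or mirrored transparency log
    long LogIndex,          // entry index in the log
    string EntryUuid,       // log entry identifier
    string IntegratedTime); // RFC3339 timestamp reported by the log

public sealed record TimestampReference(
    string TsaUrl,          // RFC3161 timestamp authority that issued the token
    string TokenDigest,     // SHA-256 of the DER-encoded timestamp token
    DateTimeOffset GenTime);

public sealed record EvidenceBundleMetadata(
    string BundleDigest,
    ImmutableArray<TransparencyReference> TransparencyReferences,
    ImmutableArray<TimestampReference> TimestampReferences);
```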
## Dependencies & Concurrency

- Depends on SPRINT_20260112_001_DOCS for schema definitions and documentation alignment.
- Concurrency: implementation can proceed in parallel after schema field names are finalized.

## Documentation Prerequisites

- `docs/README.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md`
- `docs/modules/platform/architecture-overview.md`
- `docs/modules/evidence-locker/architecture.md`
- `docs/modules/evidence-locker/export-format.md`
- `docs/modules/evidence-locker/bundle-packaging.md`
- `docs/modules/evidence-locker/attestation-contract.md`
- `docs/modules/attestor/transparency.md`
- `src/EvidenceLocker/AGENTS.md`
- `src/EvidenceLocker/StellaOps.EvidenceLocker/AGENTS.md`
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | EVID-CEPACK-001 | DONE | After DOCS-CEPACK-001 schema fields are final | EvidenceLocker Guild | Update EvidenceLocker manifest models and builders to record transparency and timestamp references in bundle metadata (align with `docs/modules/evidence-locker/schemas/bundle.manifest.schema.json` and the new evidence pack schema). Touch: `src/EvidenceLocker/StellaOps.EvidenceLocker/StellaOps.EvidenceLocker.Infrastructure/Builders/EvidenceBundleBuilder.cs` and related domain models. |
| 2 | EVID-CEPACK-002 | DONE | After EVID-CEPACK-001 | EvidenceLocker Guild | Propagate RFC3161 timestamp metadata from signing to bundle packaging and verification flows; add unit tests under `src/EvidenceLocker/StellaOps.EvidenceLocker/StellaOps.EvidenceLocker.Tests`. |
| 3 | EVID-CEPACK-003 | DONE | After DOCS-CEPACK-001 schema fields are final | EvidenceLocker Guild | Add Object Lock configuration to `EvidenceLockerOptions` and enforce retention/legal hold headers in `S3EvidenceObjectStore`; validate config at startup and add tests. |
| 4 | EVID-CEPACK-004 | DONE | After EVID-CEPACK-001 | EvidenceLocker Guild / QA | Add determinism and schema evolution tests covering new manifest fields and checksum ordering (use existing EvidenceLocker test suites). |
| 5 | EVID-CEPACK-005 | DONE | After EVID-CEPACK-003 | EvidenceLocker Guild | Update `src/EvidenceLocker/AGENTS.md` and `src/EvidenceLocker/StellaOps.EvidenceLocker/AGENTS.md` to include object-lock and transparency/timestamp requirements. |
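
To make task 3 (EVID-CEPACK-003) concrete, here is a minimal sketch of an object-lock options block and how retention/legal-hold headers could be applied to an S3 upload. It assumes the AWS SDK for .NET (`AWSSDK.S3`) and option names taken from the execution log (`Mode`, `DefaultRetentionDays`, `DefaultLegalHold`); the real `ObjectLockOptions` and `S3EvidenceObjectStore` wiring may differ.

```csharp
using System;
using Amazon.S3;
using Amazon.S3.Model;

// Illustrative options block; the real type hangs off AmazonS3StoreOptions.
public sealed class ObjectLockOptions
{
    public string Mode { get; init; } = "COMPLIANCE";   // or "GOVERNANCE"
    public int DefaultRetentionDays { get; init; } = 365;
    public bool DefaultLegalHold { get; init; }

    // Intended to run at startup so misconfiguration fails fast.
    public void Validate()
    {
        if (Mode is not ("COMPLIANCE" or "GOVERNANCE"))
            throw new InvalidOperationException($"Unsupported object-lock mode '{Mode}'.");
        if (DefaultRetentionDays <= 0)
            throw new InvalidOperationException("DefaultRetentionDays must be positive.");
    }
}

public static class ObjectLockRequestExtensions
{
    // Applies WORM retention and legal-hold settings to an upload request.
    public static void ApplyObjectLock(this PutObjectRequest request, ObjectLockOptions options, DateTime utcNow)
    {
        request.ObjectLockMode = options.Mode == "GOVERNANCE"
            ? ObjectLockMode.Governance
            : ObjectLockMode.Compliance;
        request.ObjectLockRetainUntilDate = utcNow.AddDays(options.DefaultRetentionDays);
        request.ObjectLockLegalHoldStatus = options.DefaultLegalHold
            ? ObjectLockLegalHoldStatus.On
            : ObjectLockLegalHoldStatus.Off;
    }
}
```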
## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; focuses on EvidenceLocker metadata, object-lock enforcement, and tests. | Planning |
| 2026-01-14 | EVID-CEPACK-001: Added TransparencyReference and TimestampReference records to EvidenceBundleBuildModels.cs; updated EvidenceSignatureService to serialize the new fields in the manifest payload. | Agent |
| 2026-01-14 | EVID-CEPACK-002: Existing RFC3161 client already propagates timestamps; added 3 new unit tests for transparency/timestamp reference serialization. | Agent |
| 2026-01-14 | EVID-CEPACK-003: Added ObjectLockOptions to AmazonS3StoreOptions with Mode, DefaultRetentionDays, DefaultLegalHold; updated S3EvidenceObjectStore with ApplyObjectLockSettings and ApplyLegalHoldAsync methods; added startup validation. | Agent |
| 2026-01-14 | EVID-CEPACK-004: Added tests for transparency serialization, timestamp serialization, and empty-array omission in EvidenceSignatureServiceTests. | Agent |
| 2026-01-14 | EVID-CEPACK-005: Updated src/EvidenceLocker/AGENTS.md with object-lock and transparency/timestamp requirements. | Agent |
## Decisions & Risks

- Object Lock semantics (governance vs compliance) require a single default and may need explicit approval from platform governance.
- Doc updates to EvidenceLocker packaging and verification guides must be coordinated with the docs sprint to avoid cross-module drift.

## Next Checkpoints

- 2026-01-20: EvidenceLocker schema and Object Lock design review.
# Sprint 20260112.004.ATTESTOR · VEX Override Attestation Predicate

## Topic & Scope

- Define and implement a DSSE/in-toto predicate for VEX override attestations (operator decisions such as not_affected or compensating controls).
- Support optional Rekor anchoring and offline verification paths without changing existing attestation workflows.
- Working directory: `src/Attestor`. Evidence: predicate schema, builder, verification tests, and sample payloads.
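
The predicate carries the operator decision plus everything needed to re-verify it offline. A minimal sketch of such a typed model follows; the names echo the execution log (`VexOverridePredicate`, `VexOverrideDecision`, `EvidenceReference`, `ToolInfo`), but the exact fields under `StellaOps.Attestor.StandardPredicates/VexOverride/` may differ.

```csharp
using System.Collections.Immutable;

// Illustrative model of the VEX override predicate body (the part wrapped in
// an in-toto statement and signed as a DSSE envelope).
public enum VexOverrideDecision
{
    NotAffected,
    Affected,
    Fixed,
    UnderInvestigation
}

public sealed record EvidenceReference(string Uri, string Sha256);

public sealed record ToolInfo(string Name, string Version);

public sealed record VexOverridePredicate(
    string VulnerabilityId,                       // e.g. a CVE identifier
    VexOverrideDecision Decision,
    string? Justification,                        // e.g. compensating-control description
    string ArtifactDigest,                        // digest of the affected artifact
    string TraceHash,                             // hash of the decision trace
    ImmutableArray<string> RuleDigests,           // digests of the rules that were applied
    ImmutableArray<EvidenceReference> Evidence,
    ImmutableArray<ToolInfo> Tools);
```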
## Dependencies & Concurrency

- Downstream: `SPRINT_20260112_004_VULN_vex_override_workflow.md` consumes the predicate to mint attestations.
- Parallel-safe with Scanner and Findings sprints.

## Documentation Prerequisites

- `docs/README.md`
- `docs/ARCHITECTURE_OVERVIEW.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md`
- `docs/modules/attestor/architecture.md`
- `docs/modules/attestor/rekor-verification-design.md`
- `docs/VEX_CONSENSUS_GUIDE.md`
- `docs/architecture/EVIDENCE_PIPELINE_ARCHITECTURE.md`
- `src/__Libraries/StellaOps.Canonical.Json/README.md`

## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | ATT-VEX-001 | DONE | Predicate spec | Attestor Guild | Add VEX override predicate schema and typed model (decision, evidence refs, tool versions, rule digests, artifact digest, trace hash). |
| 2 | ATT-VEX-002 | DONE | Builder + verify | Attestor Guild | Implement predicate builder and DSSE envelope creation/verification; canonicalize predicate payloads with `StellaOps.Canonical.Json` before hashing; add unit and integration tests. |
| 3 | ATT-VEX-003 | DONE | Cross-module docs | Attestor Guild | Document predicate and include a sample payload in `docs/modules/attestor/` and referenced schemas. |
| 4 | ATT-VEX-004 | DONE | Canonicalization contract | Attestor Guild | Document canonicalization rules and required serializer options (no CamelCase, default encoder) for the VEX override predicate. |
## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | ATT-VEX-001: Created VexOverridePredicate.cs with the VexOverrideDecision enum and the EvidenceReference and ToolInfo records in src/Attestor/__Libraries/StellaOps.Attestor.StandardPredicates/VexOverride/. | Agent |
| 2026-01-14 | ATT-VEX-002: Created VexOverridePredicateParser.cs (IPredicateParser implementation) and VexOverridePredicateBuilder.cs with RFC 8785 canonicalization; added 23 unit tests in the VexOverride directory. | Agent |
| 2026-01-14 | Fixed a pre-existing bug in BinaryDiffTestData.cs (renamed the FixedTimeProvider field to TestTimeProvider to avoid name shadowing with a nested class). | Agent |
| 2026-01-14 | ATT-VEX-003/004: Created docs/modules/attestor/vex-override-predicate.md with the schema spec, a sample payload, and RFC 8785 canonicalization rules. | Agent |
## Decisions & Risks

- Predicate must use RFC 8785 canonicalization via `StellaOps.Canonical.Json` with explicit serializer options (no CamelCase, default encoder) and DSSE PAE helper; no custom encoding.
- Rekor anchoring is optional; offline verification must still succeed with embedded proofs.
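
To make the canonicalization and PAE constraints above concrete: the sketch below shows the DSSE v1 pre-authentication encoding as defined by the DSSE specification, plus a SHA-256 digest over a payload that is assumed to have already been canonicalized per RFC 8785 (for example by `StellaOps.Canonical.Json`, whose exact API is not shown here).

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class DssePae
{
    // DSSE v1 pre-authentication encoding:
    // PAE(type, body) = "DSSEv1" SP LEN(type) SP type SP LEN(body) SP body
    public static byte[] Encode(string payloadType, byte[] payload)
    {
        var typeBytes = Encoding.UTF8.GetBytes(payloadType);
        var header = Encoding.UTF8.GetBytes(
            $"DSSEv1 {typeBytes.Length} {payloadType} {payload.Length} ");
        var result = new byte[header.Length + payload.Length];
        Buffer.BlockCopy(header, 0, result, 0, header.Length);
        Buffer.BlockCopy(payload, 0, result, header.Length, payload.Length);
        return result;
    }
}

public static class PredicateHashing
{
    // Expects JSON that has already been canonicalized (RFC 8785 / JCS),
    // e.g. by the StellaOps.Canonical.Json serializer, so that the digest
    // is stable across platforms and runs.
    public static string ComputeSha256Hex(string canonicalJson)
    {
        byte[] digest = SHA256.HashData(Encoding.UTF8.GetBytes(canonicalJson));
        return Convert.ToHexString(digest).ToLowerInvariant();
    }
}
```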
## Next Checkpoints

- TBD: confirm predicate field set with Policy and VEX Lens consumers.
# Sprint 20260112.004.DOC · CI/CD Gate Verification Step

## Topic & Scope

- Document a required verification step in CI/CD gates that checks DSSE witness signatures and Rekor inclusion (or offline ledger).
- Provide example commands for online and offline flows using `stella proof verify` and cosign equivalents.
- Working directory: `docs`. Evidence: updated CI/CD flow and proof verification runbooks.

## Dependencies & Concurrency

- Parallel-safe with code sprints; no upstream dependencies required.

## Documentation Prerequisites

- `docs/README.md`
- `docs/ARCHITECTURE_OVERVIEW.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md`
- `docs/flows/10-cicd-gate-flow.md`
- `docs/operations/score-proofs-runbook.md`
- `docs/operations/proof-verification-runbook.md`

## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | DOC-CICD-001 | DONE | Flow edits | Docs Guild | Update `docs/flows/10-cicd-gate-flow.md` to include DSSE witness verification and Rekor inclusion checks with offline fallback. |
| 2 | DOC-CICD-002 | DONE | Runbook links | Docs Guild | Add concise command snippets to `docs/operations/score-proofs-runbook.md` and link to `docs/operations/proof-verification-runbook.md`. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | DOC-CICD-001: Added section 5a "DSSE Witness Verification (Required)" to cicd-gate-flow.md with online/offline commands, cosign equivalents, and GitHub/GitLab integration examples. | Agent |
| 2026-01-14 | DOC-CICD-002: Added section 3.2a "CI/CD Gate Verification Quick Reference" to score-proofs-runbook.md with concise commands and cross-links. | Agent |

## Decisions & Risks

- Verification examples must be offline-friendly and avoid external URLs not already present.
- CI gate examples must remain deterministic and avoid non-ASCII characters in commands.

## Next Checkpoints

- TBD: confirm with Release Engineering that the flow matches current CLI behavior.
# Sprint 20260112.004.LB · Doctor Evidence Integrity Checks

## Topic & Scope

- Add Doctor checks that validate DSSE signatures, Rekor inclusion (or offline ledger), and evidence hash consistency.
- Surface results in Doctor UI exports and keep outputs deterministic and offline-friendly.
- Working directory: `src/__Libraries`. Evidence: new doctor checks, tests, and doc updates.

## Dependencies & Concurrency

- Parallel-safe with other sprints; can proceed independently once proof verification utilities are available.

## Documentation Prerequisites

- `docs/README.md`
- `docs/ARCHITECTURE_OVERVIEW.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md`
- `docs/doctor/doctor-capabilities.md`
- `docs/operations/score-proofs-runbook.md`
- `src/__Libraries/StellaOps.Canonical.Json/README.md`

## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | DOCHECK-001 | DONE | Check spec | Doctor Guild | Implement a security Doctor check that verifies DSSE signature validity and Rekor inclusion (or offline ledger) for a provided proof bundle or attestation; recompute hashes using `StellaOps.Canonical.Json`. |
| 2 | DOCHECK-002 | DONE | Tests | Doctor Guild | Add unit/integration tests for deterministic check output, including offline mode. |
| 3 | DOCHECK-003 | DONE | Cross-module docs | Doctor Guild | Update `docs/doctor/doctor-capabilities.md` to describe the new evidence integrity check. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | DOCHECK-001: Created EvidenceIntegrityCheck.cs in the Security plugin with DSSE/Rekor/hash verification. | Agent |
| 2026-01-14 | DOCHECK-001: Registered the check in the SecurityPlugin.cs GetChecks() method. | Agent |
| 2026-01-14 | DOCHECK-002: Created EvidenceIntegrityCheckTests.cs with 15 tests covering all verification paths. All tests pass. | Agent |
| 2026-01-14 | DOCHECK-003: Added check.security.evidence.integrity documentation to doctor-capabilities.md section 9.4. | Agent |
## Decisions & Risks

- Doctor checks must not call external networks; use local proof bundles or offline ledgers.
- Ensure any evidence hash validation uses `StellaOps.Canonical.Json` with explicit serializer options and stable ordering.
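
A minimal sketch of the hash-consistency portion of such a check: canonicalized evidence entries (produced upstream by `StellaOps.Canonical.Json`) are re-hashed and compared against recorded digests in a stable, path-ordered sequence so the check output stays deterministic. The `EvidenceHashFinding` shape is illustrative, not the actual Doctor check model.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

// Illustrative result type; the real Doctor check model differs.
public sealed record EvidenceHashFinding(string EntryPath, string Expected, string Actual, bool Matches);

public static class EvidenceHashConsistency
{
    // Recomputes digests for canonicalized evidence entries and reports
    // mismatches in a deterministic (path-ordered) sequence so that
    // repeated runs produce byte-identical output.
    public static IReadOnlyList<EvidenceHashFinding> Verify(
        IReadOnlyDictionary<string, string> canonicalJsonByPath,
        IReadOnlyDictionary<string, string> expectedSha256ByPath)
    {
        return canonicalJsonByPath
            .OrderBy(kv => kv.Key, StringComparer.Ordinal) // stable ordering
            .Select(kv =>
            {
                string actual = Convert.ToHexString(
                    SHA256.HashData(Encoding.UTF8.GetBytes(kv.Value))).ToLowerInvariant();
                expectedSha256ByPath.TryGetValue(kv.Key, out var expected);
                return new EvidenceHashFinding(kv.Key, expected ?? "<missing>", actual,
                    string.Equals(expected, actual, StringComparison.OrdinalIgnoreCase));
            })
            .ToList();
    }
}
```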
## Next Checkpoints

- TBD: confirm proof bundle inputs and UX in Doctor dashboard.
# Sprint 20260112-004-LB-evidence-card-core - Evidence Card Core

## Topic & Scope

- Build a single-file evidence card export that packages an SBOM excerpt, DSSE envelope, and Rekor receipt for a finding evidence pack; output is deterministic and offline-friendly.
- Current state evidence: Evidence packs only export json/signedjson/markdown/html/pdf and do not carry Rekor receipts (`src/__Libraries/StellaOps.Evidence.Pack/Models/SignedEvidencePack.cs`, `src/__Libraries/StellaOps.Evidence.Pack/EvidencePackService.cs`).
- Evidence to produce: EvidenceCard model, evidence-card export format, receipt wiring in signed packs, and determinism tests.
- **Working directory:** `src/__Libraries/StellaOps.Evidence.Pack`.
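
A minimal sketch of the single-file card described above (SBOM excerpt + DSSE envelope + optional Rekor receipt). The record names mirror the execution log (`EvidenceCard`, `SbomExcerpt`, `RekorReceiptMetadata`), but the fields are illustrative assumptions rather than the actual models in `EvidenceCard.cs`.

```csharp
using System.Collections.Immutable;

// Illustrative single-file evidence card: everything needed to re-verify one
// finding offline, packaged deterministically.
public sealed record SbomExcerpt(
    string Format,                      // e.g. "cyclonedx" or "spdx"
    string ComponentPurl,
    string ExcerptJson);                // canonical JSON fragment for the component

public sealed record DsseEnvelope(
    string PayloadType,
    string PayloadBase64,
    ImmutableArray<string> Signatures);

public sealed record RekorReceiptMetadata(
    long LogIndex,
    string EntryUuid,
    string InclusionProofRootHash);     // enables offline inclusion verification

public sealed record EvidenceCard(
    string FindingId,
    string ArtifactDigest,
    SbomExcerpt Sbom,
    DsseEnvelope Envelope,
    RekorReceiptMetadata? Receipt);     // optional: may be absent in air-gapped captures
```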
## Dependencies & Concurrency

- Depends on Attestor receipt types already present in `src/Attestor/StellaOps.Attestor/StellaOps.Attestor.Core/Rekor/RekorReceipt.cs`.
- Parallel-safe with remediation PR and UI sprints; no shared DB migrations or schema changes.

## Documentation Prerequisites

- `docs/README.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md`
- `docs/modules/platform/architecture-overview.md`
- `docs/modules/attestor/architecture.md`
- `docs/product/VISION.md`
- `docs/modules/cli/guides/commands/evidence-bundle-format.md`

## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | EVPCARD-LB-001 | DONE | None | Evidence Guild | Add EvidenceCard model and receipt metadata for single-file export. |
| 2 | EVPCARD-LB-002 | DONE | EVPCARD-LB-001 | Evidence Guild | Implement evidence-card export format in EvidencePackService (SBOM excerpt + DSSE + receipt). |
| 3 | EVPCARD-LB-003 | DONE | EVPCARD-LB-001 | Evidence Guild | Wire Rekor receipt capture into signed evidence packs using Attestor receipt types. |
| 4 | EVPCARD-LB-004 | DONE | EVPCARD-LB-002 | Evidence Guild | Add determinism and export tests for evidence-card output. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | EVPCARD-LB-001: Created EvidenceCard.cs with models for EvidenceCard, SbomExcerpt, RekorReceiptMetadata, CheckpointSignature. | Agent |
| 2026-01-14 | EVPCARD-LB-002: Created EvidenceCardService.cs with CreateCardAsync, ExportCardAsync (Json/CompactJson/CanonicalJson), VerifyCardAsync. | Agent |
| 2026-01-14 | EVPCARD-LB-003: Created IEvidenceCardService.cs with RekorReceiptMetadata integration for offline verification. | Agent |
| 2026-01-14 | EVPCARD-LB-004: Created EvidenceCardServiceTests.cs with 11 determinism and export tests. All 42 evidence pack tests pass. | Agent |
| 2026-01-14 | Added StellaOps.Determinism.Abstractions project reference for IGuidProvider. | Agent |

## Decisions & Risks

- Decide evidence-card schema fields and SBOM excerpt selection rules (size limits, deterministic ordering).
- Rekor receipt availability in air-gap must be optional; define fallback behavior when receipts are missing.
- Cross-module docs and API wiring occur in dependent sprints; note in commits when touching `docs/**`.

## Next Checkpoints

- TBD (set once staffed).
# Feature Gaps Report - Stella Ops Suite

*(Auto-generated during feature matrix completion)*

This report documents:

1. Features discovered in code but not listed in FEATURE_MATRIX.md
2. CLI/UI coverage gaps for existing features

---
## Batch 1: SBOM & Ingestion

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| SPDX 3.0 Build Attestation | Attestor | `BuildAttestationMapper.cs`, `DsseSpdx3Signer.cs`, `CombinedDocumentBuilder.cs` | - | - | Attestation & Signing |
| CycloneDX CBOM Support | Scanner | `CycloneDxCbomWriter.cs` | - | - | SBOM & Ingestion |
| Trivy DB Export (Offline) | Concelier | `TrivyDbExporterPlugin.cs`, `TrivyDbOrasPusher.cs`, `TrivyDbExportPlanner.cs` | `stella db export trivy` | - | Offline & Air-Gap |
| Layer SBOM Composition | Scanner | `SpdxLayerWriter.cs`, `CycloneDxLayerWriter.cs`, `LayerSbomService.cs` | `stella sbomer layer`, `stella scan layer-sbom` | - | SBOM & Ingestion |
| SBOM Advisory Matching | Concelier | `SbomAdvisoryMatcher.cs`, `SbomRegistryService.cs`, `ValkeyPurlCanonicalIndex.cs` | - | - | Advisory Sources |
| Graph Lineage Service | Graph | `IGraphLineageService.cs`, `InMemoryGraphLineageService.cs`, `LineageContracts.cs` | - | `/graph` | SBOM & Ingestion |
| Evidence Cards (SBOM excerpts) | Evidence.Pack | `IEvidenceCardService.cs`, `EvidenceCardService.cs`, `EvidenceCard.cs` | - | Evidence drawer | Evidence & Findings |
| AirGap SBOM Parsing | AirGap | `SpdxParser.cs`, `CycloneDxParser.cs` | - | `/ops/offline-kit` | Offline & Air-Gap |
| SPDX License Normalization | Scanner | `SpdxLicenseNormalizer.cs`, `SpdxLicenseExpressions.cs`, `SpdxLicenseList.cs` | - | - | Scanning & Detection |
| SBOM Format Conversion | Scanner | `SpdxCycloneDxConverter.cs` | - | - | SBOM & Ingestion |
| SBOM Validation Pipeline | Scanner | `SbomValidationPipeline.cs`, `SemanticSbomExtensions.cs` | - | - | SBOM & Ingestion |
| CycloneDX Evidence Mapping | Scanner | `CycloneDxEvidenceMapper.cs` | - | - | SBOM & Ingestion |
| CycloneDX Pedigree Mapping | Scanner | `CycloneDxPedigreeMapper.cs` | - | - | SBOM & Ingestion |
| SBOM Snapshot Export | Graph | `SbomSnapshot.cs`, `SbomSnapshotExporter.cs` | - | - | Evidence & Findings |
| Lineage Evidence Packs | ExportCenter | `ILineageEvidencePackService.cs`, `LineageEvidencePack.cs`, `LineageExportEndpoints.cs` | - | `/triage/audit-bundles` | Evidence & Findings |

### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| Delta-SBOM Cache | SbomService | No | No | Internal optimization - no action needed |
| SBOM Lineage Ledger | SbomService | No | Yes | Add `stella sbom lineage list/show` commands |
| SBOM Lineage API | SbomService | No | Yes | Add `stella sbom lineage export` command |
| SPDX 3.0 Build Attestation | Attestor | No | No | Add to Attestation & Signing matrix section |
| Graph Lineage Service | Graph | No | Yes | Consider `stella graph lineage` command |
| Trivy DB Export | Concelier | Partial | No | `stella db export trivy` exists but may need UI |

---
## Batch 2: Scanning & Detection

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| Secrets Detection (Regex+Entropy) | Scanner | `SecretsAnalyzer.cs`, `RegexDetector.cs`, `EntropyDetector.cs`, `CompositeSecretDetector.cs` | `stella scan run` | `/findings` | Scanning & Detection |
| OS Analyzers - Dpkg (Debian/Ubuntu) | Scanner | `DpkgPackageAnalyzer.cs`, `DpkgStatusParser.cs` | `stella scan run` | `/findings` | Scanning & Detection |
| OS Analyzers - Apk (Alpine) | Scanner | `ApkPackageAnalyzer.cs`, `ApkDatabaseParser.cs` | `stella scan run` | `/findings` | Scanning & Detection |
| OS Analyzers - RPM (RHEL/CentOS) | Scanner | `RpmPackageAnalyzer.cs` | `stella scan run` | `/findings` | Scanning & Detection |
| OS Analyzers - Homebrew (macOS) | Scanner | `HomebrewPackageAnalyzer.cs` | `stella scan run` | `/findings` | Scanning & Detection |
| OS Analyzers - macOS Bundles | Scanner | `MacOsBundleAnalyzer.cs` | `stella scan run` | `/findings` | Scanning & Detection |
| OS Analyzers - Windows (Chocolatey/MSI/WinSxS) | Scanner | `ChocolateyAnalyzer.cs`, `MsiAnalyzer.cs`, `WinSxSAnalyzer.cs` | `stella scan run` | `/findings` | Scanning & Detection |
| Symbol-Level Vulnerability Matching | Scanner | `VulnSurfaceService.cs`, `AdvisorySymbolMapping.cs`, `AffectedSymbol.cs` | - | - | Scanning & Detection |
| SARIF 2.1.0 Export | Scanner | SARIF export in CLI | `stella scan sarif` | - | Scanning & Detection |
| Fidelity Upgrade (Quick->Standard->Deep) | Scanner | `FidelityAwareAnalyzer.UpgradeFidelityAsync()` | - | - | Scanning & Detection |
| OCI Multi-Architecture Support | Scanner | `OciImageInspector.cs` (amd64, arm64, etc.) | `stella image inspect` | - | Scanning & Detection |
| Symlink Resolution (32-level depth) | Scanner | `LayeredRootFileSystem.cs` | - | - | Scanning & Detection |
| Whiteout File Support | Scanner | `LayeredRootFileSystem.cs` | - | - | Scanning & Detection |
| NATS/Redis Scan Queue | Scanner | `NatsScanQueue.cs`, `RedisScanQueue.cs` | - | `/ops/scanner` | Operations |
| Determinism Controls | Scanner | `DeterminismContext.cs`, `DeterministicTimeProvider.cs`, `DeterministicRandomProvider.cs` | `stella scan replay` | `/ops/scanner` | Determinism & Reproducibility |
| Lease-Based Job Processing | Scanner | `LeaseHeartbeatService.cs`, `ScanJobProcessor.cs` | - | - | Operations |

### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| License-Risk Detection | Scanner | No | No | Planned Q4-2025 - not yet implemented |
| Secrets Detection | Scanner | Implicit | Implicit | Document in matrix (runs automatically during scan) |
| OS Package Analyzers | Scanner | Implicit | Implicit | Document in matrix (6 OS-level analyzers) |
| Symbol-Level Matching | Scanner | No | No | Advanced feature - consider exposing in findings detail |
| SARIF Export | Scanner | Yes | No | Consider adding SARIF download in UI |
| Concurrent Worker Config | Scanner | No | Yes | CLI option for worker count would help CI/CD |

---
## Batch 3: Reachability Analysis

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| 8-State Reachability Lattice | Reachability.Core | `ReachabilityLattice.cs` (28 state transitions) | - | `/reachability` | Reachability Analysis |
| Confidence Calculator | Reachability.Core | `ConfidenceCalculator.cs` (path/guard/hit bonuses) | - | - | Reachability Analysis |
| Evidence Weighted Score (EWS) | Signals | `EvidenceWeightedScoreCalculator.cs` (6 dimensions: RCH/RTS/BKP/XPL/SRC/MIT) | - | - | Scoring & Risk |
| Attested Reduction Scoring | Signals | VEX anchoring with short-circuit rules | - | - | Scoring & Risk |
| Hybrid Reachability Query | Reachability.Core | `IReachabilityIndex.cs` (static/runtime/hybrid/batch modes) | `stella reachgraph slice` | `/reachability` | Reachability Analysis |
| Reachability Replay/Verify | ReachGraph | `IReachabilityReplayService.VerifyAsync()` | `stella reachgraph replay/verify` | - | Determinism & Reproducibility |
| Graph Triple-Layer Storage | ReachGraph | `ReachGraphStoreService.cs` (Cache->DB->Archive) | - | - | Operations |
| Per-Graph Signing | ReachGraph | SHA256 artifact/provenance digests | - | - | Attestation & Signing |
| GraphViz/Mermaid Export | CLI | `stella reachability show --format dot/mermaid` | `stella reachability show` | - | Reachability Analysis |
| Reachability Drift Alerts | Docs | `19-reachability-drift-alert-flow.md` (state transition monitoring) | `stella drift` | - | Reachability Analysis |
| Evidence URIs | ReachGraph | `stella://reachgraph/{digest}/slice/{symbolId}` format | - | - | Evidence & Findings |
| Environment Guard Detection | Scanner | 20+ patterns (process.env, sys.platform, etc.) | - | `/reachability` | Reachability Analysis |
| Dynamic Loading Detection | Scanner | require(variable), import(variable), Class.forName() | - | - | Reachability Analysis |
| Reflection Call Detection | Scanner | Confidence scoring 0.5-0.6 for dynamic paths | - | - | Reachability Analysis |
| EWS Guardrails | Signals | Speculative cap (45), not-affected cap (15), runtime floor (60) | - | - | Scoring & Risk |
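
To illustrate how the EWS guardrails in the last row could interact with a raw score: the documented thresholds amount to caps and a floor applied around the six-dimension weighting. The sketch below is an assumption about that clamping, including its precedence order; it is not taken from `EvidenceWeightedScoreCalculator.cs`.

```csharp
using System;

// Illustrative guardrail application for an Evidence Weighted Score (0-100).
public static class EwsGuardrails
{
    public const int SpeculativeCap = 45;   // reachability evidence is speculative only
    public const int NotAffectedCap = 15;   // trusted VEX says not_affected
    public const int RuntimeFloor = 60;     // runtime signals observed the symbol executing

    public static int Apply(int rawScore, bool speculativeOnly, bool vexNotAffected, bool runtimeObserved)
    {
        // Assumed precedence: caps first, then the runtime floor; the real
        // calculator may order these differently or short-circuit on VEX.
        int score = Math.Clamp(rawScore, 0, 100);
        if (speculativeOnly) score = Math.Min(score, SpeculativeCap);
        if (vexNotAffected) score = Math.Min(score, NotAffectedCap);
        if (runtimeObserved) score = Math.Max(score, RuntimeFloor);
        return score;
    }
}
```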
### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| Runtime Signal Correlation | Signals | No | Yes | Add `stella signals inspect` command |
| Gate Detection | Scanner | No | Yes | Consider `stella reachability guards` command |
| Path Witness Generation | ReachGraph | Yes | No | Add witness path visualization in UI |
| Confidence Calculator | Reachability.Core | No | No | Internal implementation - consider exposing in findings |
| Evidence Weighted Score | Signals | No | Partial | Add `stella score explain` command |
| Graph Triple-Layer Storage | ReachGraph | No | No | Ops concern - consider admin commands |

---
## Batch 4: Binary Analysis

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| 4 Fingerprint Algorithm Types | BinaryIndex | `BasicBlockFingerprintGenerator.cs`, `ControlFlowGraphFingerprintGenerator.cs`, `StringRefsFingerprintGenerator.cs` | `stella binary fingerprint` | - | Binary Analysis |
| Alpine Corpus Support | BinaryIndex | `AlpineCorpusConnector.cs` | - | - | Binary Analysis |
| VEX Evidence Bridge | BinaryIndex | `IVexEvidenceGenerator.cs` | - | - | VEX Processing |
| Delta Signature Matching | BinaryIndex | `LookupByDeltaSignatureAsync()` | `stella deltasig` | - | Binary Analysis |
| Symbol Hash Matching | BinaryIndex | `LookupBySymbolHashAsync()` | `stella binary symbols` | - | Binary Analysis |
| Corpus Function Identification | BinaryIndex | `IdentifyFunctionFromCorpusAsync()` | - | - | Binary Analysis |
| Binary Call Graph Extraction | BinaryIndex | `binary callgraph` command | `stella binary callgraph` | - | Binary Analysis |
| 3-Tier Identification Strategy | BinaryIndex | Package/Build-ID/Fingerprint tiers | - | - | Binary Analysis |
| Fingerprint Validation Stats | BinaryIndex | `FingerprintValidationStats.cs` (TP/FP/TN/FN) | - | - | Binary Analysis |
| Changelog CVE Parsing | BinaryIndex | `DebianChangelogParser.cs` (CVE pattern extraction) | - | - | Binary Analysis |
| Secfixes Parsing | BinaryIndex | `ISecfixesParser.cs` (Alpine format) | - | - | Binary Analysis |
| Batch Binary Operations | BinaryIndex | All lookup methods support batching | - | - | Binary Analysis |
| Binary Match Confidence Scoring | BinaryIndex | 0.0-1.0 confidence for all matches | - | - | Binary Analysis |
| Architecture-Aware Filtering | BinaryIndex | Match filtering by architecture | - | - | Binary Analysis |

### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| Alpine Corpus | BinaryIndex | No | No | Add to matrix as additional corpus |
| Corpus Ingestion UI | BinaryIndex | No | No | Consider admin UI for corpus management |
| VEX Evidence Bridge | BinaryIndex | No | No | Internal integration - document in VEX section |
| Fingerprint Visualization | BinaryIndex | Yes | No | Consider UI for function fingerprint display |
| Batch Operations | BinaryIndex | No | No | Internal API - consider batch CLI commands |
| Delta Signatures | BinaryIndex | Yes | No | Consider UI integration for patch detection |

---
## Batch 5: Advisory Sources

### Discovered Features (Not in Matrix)

**CRITICAL: Matrix lists 11 sources, but codebase has 33+ connectors!**

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| **SUSE Connector** | Concelier | `Connector.Distro.Suse/` | `stella db fetch suse` | - | Advisory Sources |
| **Astra Linux Connector** | Concelier | `Connector.Astra/` (FSTEC-certified Russian) | `stella db fetch astra` | - | Advisory Sources |
| **Microsoft MSRC** | Concelier | `vndr.msrc` vendor connector | - | - | Advisory Sources |
| **Oracle Connector** | Concelier | `vndr.oracle` vendor connector | - | - | Advisory Sources |
| **Adobe Connector** | Concelier | `vndr.adobe` vendor connector | - | - | Advisory Sources |
| **Apple Connector** | Concelier | `vndr.apple` vendor connector | - | - | Advisory Sources |
| **Cisco Connector** | Concelier | `vndr.cisco` vendor connector | - | - | Advisory Sources |
| **Chromium Connector** | Concelier | `vndr.chromium` vendor connector | - | - | Advisory Sources |
| **VMware Connector** | Concelier | `vndr.vmware` vendor connector | - | - | Advisory Sources |
| **JVN (Japan) CERT** | Concelier | `Connector.Jvn/` | - | - | Advisory Sources |
| **ACSC (Australia) CERT** | Concelier | `Connector.Acsc/` | - | - | Advisory Sources |
| **CCCS (Canada) CERT** | Concelier | `Connector.Cccs/` | - | - | Advisory Sources |
| **CertFr (France) CERT** | Concelier | `Connector.CertFr/` | - | - | Advisory Sources |
| **CertBund (Germany) CERT** | Concelier | `Connector.CertBund/` | - | - | Advisory Sources |
| **CertCc CERT** | Concelier | `Connector.CertCc/` | - | - | Advisory Sources |
| **CertIn (India) CERT** | Concelier | `Connector.CertIn/` | - | - | Advisory Sources |
| **RU-BDU (Russia) CERT** | Concelier | `Connector.Ru.Bdu/` | - | - | Advisory Sources |
| **RU-NKCKI (Russia) CERT** | Concelier | `Connector.Ru.Nkcki/` | - | - | Advisory Sources |
| **KISA (South Korea) CERT** | Concelier | `Connector.Kisa/` | - | - | Advisory Sources |
| **ICS-CISA (Industrial)** | Concelier | `Connector.Ics.Cisa/` | - | - | Advisory Sources |
| **ICS-Kaspersky (Industrial)** | Concelier | `Connector.Ics.Kaspersky/` | - | - | Advisory Sources |
| **StellaOpsMirror (Internal)** | Concelier | `Connector.StellaOpsMirror/` | - | - | Advisory Sources |
| Backport-Aware Precedence | Concelier | `ConfigurableSourcePrecedenceLattice.cs` | - | - | Advisory Sources |
| Link-Not-Merge Architecture | Concelier | Transitioning from merge to observation/linkset | - | - | Advisory Sources |
| Canonical Deduplication | Concelier | `ICanonicalAdvisoryService`, `CanonicalMerger.cs` | - | - | Advisory Sources |
| Change History Tracking | Concelier | `IChangeHistoryStore` (field-level diffs) | - | - | Advisory Sources |
| Feed Epoch Events | Concelier | `FeedEpochAdvancedEvent` (Provcache invalidation) | - | - | Advisory Sources |
| JSON Exporter | Concelier | `Exporter.Json/` (manifest-driven export) | `stella db export json` | - | Offline & Air-Gap |
| Trivy DB Exporter | Concelier | `Exporter.TrivyDb/` | `stella db export trivy` | - | Offline & Air-Gap |

### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| **22+ Connectors Missing from Matrix** | Concelier | Partial | No | ADD TO MATRIX - major documentation gap |
| Vendor PSIRTs (7 connectors) | Concelier | No | No | Add vendor section to matrix |
| Regional CERTs (11 connectors) | Concelier | No | No | Add regional CERT section to matrix |
| Industrial/ICS (2 connectors) | Concelier | No | No | Add ICS section to matrix |
| Link-Not-Merge Transition | Concelier | No | No | Document new architecture in matrix |
| Backport Precedence | Concelier | No | No | Document in merge engine section |
| Change History | Concelier | No | No | Consider audit trail UI |

### Matrix Update Recommendations

The FEATURE_MATRIX.md seriously underrepresents Concelier capabilities:

- **Listed:** 11 sources
- **Actual:** 33+ connectors

Recommended additions:

1. Add "Vendor PSIRTs" section (Microsoft, Oracle, Adobe, Apple, Cisco, Chromium, VMware)
2. Add "Regional CERTs" section (JVN, ACSC, CCCS, CertFr, CertBund, CertIn, RU-BDU, KISA, etc.)
3. Add "Industrial/ICS" section (ICS-CISA, ICS-Kaspersky)
4. Add "Additional Distros" section (SUSE, Astra Linux)
5. Document backport-aware precedence configuration

---
## Batch 6: VEX Processing

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| VEX Consensus Engine (5-state lattice) | VexLens | `VexConsensusEngine.cs`, `IVexConsensusEngine.cs` | `stella vex consensus` | `/vex` | VEX Processing |
| Trust Decay Service | VexLens | `TrustDecayService.cs`, `TrustDecayCalculator.cs` | - | - | VEX Processing |
| Noise Gate Service | VexLens | `NoiseGateService.cs` | - | `/vex` | VEX Processing |
| Consensus Rationale Service | VexLens | `IConsensusRationaleService.cs`, `ConsensusRationaleModels.cs` | - | `/vex` | VEX Processing |
| VEX Linkset Extraction | Excititor | `VexLinksetExtractionService.cs` | - | - | VEX Processing |
| VEX Linkset Disagreement Detection | Excititor | `VexLinksetDisagreementService.cs` | - | `/vex` | VEX Processing |
| VEX Statement Backfill | Excititor | `VexStatementBackfillService.cs` | - | - | VEX Processing |
| VEX Evidence Chunking | Excititor | `VexEvidenceChunkService.cs` | - | - | VEX Processing |
| Auto-VEX Downgrade | Excititor | `AutoVexDowngradeService.cs` | - | - | VEX Processing |
| Risk Feed Service | Excititor | `RiskFeedService.cs`, `RiskFeedEndpoints.cs` | - | - | VEX Processing |
| Trust Calibration Service | Excititor | `TrustCalibrationService.cs` | - | - | VEX Processing |
| VEX Hashing Service (deterministic) | Excititor | `VexHashingService.cs` | - | - | VEX Processing |
| CSAF Provider Connectors (7 total) | Excititor | `Connectors.*.CSAF/` (RedHat, Ubuntu, Oracle, MSRC, Cisco, SUSE) | - | - | VEX Processing |
| OCI OpenVEX Attestation Connector | Excititor | `Connectors.OCI.OpenVEX.Attest/` | - | - | VEX Processing |
| Issuer Key Lifecycle Management | IssuerDirectory | Key create/rotate/revoke endpoints | - | `/issuer-directory` | VEX Processing |
| Issuer Trust Override | IssuerDirectory | Trust override endpoints | - | `/issuer-directory` | VEX Processing |
| CSAF Publisher Bootstrap | IssuerDirectory | `csaf-publishers.json` seeding | - | - | VEX Processing |
| VEX Webhook Distribution | VexHub | `IWebhookService.cs`, `IWebhookSubscriptionRepository.cs` | - | - | VEX Processing |
| VEX Conflict Flagging | VexHub | `IStatementFlaggingService.cs` | - | - | VEX Processing |
| VEX from Drift Generation | CLI | `VexGenCommandGroup.cs` | `stella vex gen --from-drift` | - | VEX Processing |
| VEX Decision Signing | Policy | `VexDecisionSigningService.cs` | - | - | Policy Engine |
| VEX Proof Spine | Policy | `VexProofSpineService.cs` | - | - | Policy Engine |
| Consensus Propagation Rules | VexLens | `IPropagationRuleEngine.cs` | - | - | VEX Processing |
| Consensus Delta Computation | VexLens | `VexDeltaComputeService.cs` | - | - | VEX Processing |
| Triple-Layer Consensus Storage | VexLens | Cache->DB->Archive with `IConsensusProjectionStore.cs` | - | - | Operations |

### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| CSAF Provider Connectors | Excititor | No | No | Consider connector status UI in ops |
| Trust Weight Configuration | VexLens | No | Partial | Add `stella vex trust configure` command |
| VEX Distribution Webhooks | VexHub | No | No | Add webhook management UI/CLI |
| Conflict Resolution | VexLens | No | Partial | Interactive conflict resolution needed |
| Issuer Key Management | IssuerDirectory | No | Yes | Add `stella issuer keys` CLI |
| Risk Feed Distribution | Excititor | No | No | Consider risk feed CLI |
| Consensus Replay/Verify | VexLens | No | No | Add `stella vex verify` command |
| VEX Evidence Export | Excititor | No | No | Add `stella vex evidence export` |

### Matrix Update Recommendations

The FEATURE_MATRIX.md VEX section is significantly underspecified:

- **Listed:** Basic VEX support (OpenVEX, CSAF, CycloneDX)
- **Actual:** Full consensus engine with 5-state lattice, 9 trust factors, 7 CSAF connectors, conflict detection, issuer registry

Recommended additions:

1. Add "VEX Consensus Engine" as major feature (VexLens)
2. Add "Trust Weight Scoring" with 9 factors documented
3. Add "CSAF Provider Connectors" section (7 vendors)
4. Add "Issuer Trust Registry" (IssuerDirectory)
5. Add "VEX Distribution" (VexHub webhooks)
6. Document AOC (Aggregation-Only Contract) compliance
7. Add "VEX from Drift" generation capability

---
## Batch 7: Policy Engine

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| K4 Lattice (Belnap Four-Valued Logic) | Policy | `K4Lattice.cs`, `TrustLatticeEngine.cs`, `ClaimScoreMerger.cs` | - | `/policy` | Policy Engine |
| 10+ Policy Gate Types | Policy | `PolicyGateEvaluator.cs`, various *Gate.cs files | - | `/policy` | Policy Engine |
| Uncertainty Score Calculator | Policy.Determinization | `UncertaintyScoreCalculator.cs` (entropy 0.0-1.0) | - | - | Policy Engine |
| Decayed Confidence Calculator | Policy.Determinization | `DecayedConfidenceCalculator.cs` (14-day half-life) | - | - | Policy Engine |
| 6 Evidence Types | Policy.Determinization | `BackportEvidence.cs`, `CvssEvidence.cs`, `EpssEvidence.cs`, etc. | - | - | Policy Engine |
| 6 Risk Score Providers | RiskEngine | `CvssKevProvider.cs`, `EpssProvider.cs`, `FixChainRiskProvider.cs` | - | `/risk` | Scoring & Risk |
| FixChain Risk Metrics | RiskEngine | `FixChainRiskMetrics.cs`, `FixChainRiskDisplay.cs` | - | - | Scoring & Risk |
| Exception Effect Registry | Policy | `ExceptionEffectRegistry.cs`, `ExceptionAdapter.cs` | - | `/policy/exceptions` | Policy Engine |
| Exception Approval Rules | Policy | `IExceptionApprovalRulesService.cs` | - | `/policy/exceptions` | Policy Engine |
| Policy Simulation Service | Policy.Registry | `IPolicySimulationService.cs` | `stella policy simulate` | `/policy/simulate` | Policy Engine |
| Policy Promotion Pipeline | Policy.Registry | `IPromotionService.cs`, `IPublishPipelineService.cs` | - | - | Policy Engine |
| Review Workflow Service | Policy.Registry | `IReviewWorkflowService.cs` | - | - | Policy Engine |
| Sealed Mode Service | Policy | `ISealedModeService.cs` | - | `/ops` | Offline & Air-Gap |
| Verdict Attestation Service | Policy | `IVerdictAttestationService.cs` | - | - | Attestation & Signing |
| Policy Decision Attestation | Policy | `IPolicyDecisionAttestationService.cs` (DSSE/Rekor) | - | - | Attestation & Signing |
| Score Policy YAML Config | Policy | `ScorePolicyModels.cs`, `ScorePolicyLoader.cs` | `stella policy validate` | `/policy` | Policy Engine |
| Profile-Aware Scoring | Policy.Scoring | `ProfileAwareScoringService.cs`, `ScoringProfileService.cs` | - | - | Policy Engine |
| Freshness-Aware Scoring | Policy | `FreshnessAwareScoringService.cs` | - | - | Policy Engine |
| Jurisdiction Trust Rules | Policy.Vex | `JurisdictionTrustRules.cs` | - | - | Policy Engine |
| VEX Customer Override | Policy.Vex | `VexCustomerOverride.cs` | - | - | Policy Engine |
| Attestation Report Service | Policy | `IAttestationReportService.cs` | - | - | Attestation & Signing |
| Risk Scoring Trigger Service | Policy.Scoring | `RiskScoringTriggerService.cs` | - | - | Scoring & Risk |
| Policy Lint Endpoint | Policy | `/policy/lint` | - | - | Policy Engine |
| Policy Determinism Verification | Policy | `/policy/verify-determinism` | - | - | Determinism & Reproducibility |
| AdvisoryAI Knobs Endpoint | Policy | `/policy/advisory-ai/knobs` | - | - | Policy Engine |
| Stability Damping Gate | Policy | `StabilityDampingGate.cs` | - | - | Policy Engine |
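
As a concrete reading of the 14-day half-life noted for the Decayed Confidence Calculator: exponential decay with that half-life halves a signal's confidence every 14 days since it was last observed. The sketch below is an illustrative formula only, not code from `DecayedConfidenceCalculator.cs`.

```csharp
using System;

// Illustrative exponential decay with a 14-day half-life:
// decayed = original * 0.5^(ageDays / 14)
public static class ConfidenceDecay
{
    public const double HalfLifeDays = 14.0;

    public static double Decay(double originalConfidence, TimeSpan age)
    {
        if (age <= TimeSpan.Zero) return originalConfidence;
        double factor = Math.Pow(0.5, age.TotalDays / HalfLifeDays);
        return originalConfidence * factor;
    }
}

// Example: a confidence of 0.8 observed 28 days ago decays to 0.8 * 0.25 = 0.2.
```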
### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| K4 Lattice Operations | Policy | No | Partial | Add `stella policy lattice explain` for debugging |
| Risk Provider Configuration | RiskEngine | No | No | Provider configuration needs CLI/UI exposure |
| Exception Approval Workflow | Policy | No | Yes | Add `stella policy exception approve/reject` CLI |
| Determinization Signal Weights | Policy | No | No | Allow signal weight tuning via CLI/config |
| Policy Pack Promotion | Policy.Registry | No | Partial | Add `stella policy promote` CLI |
| Score Policy Tuning | Policy.Scoring | Partial | Partial | Expand `stella policy` commands |
| Verdict Attestation Export | Policy | No | No | Add `stella policy verdicts export` |
| Risk Scoring History | RiskEngine | No | Partial | Consider historical trend CLI |

### Matrix Update Recommendations

The FEATURE_MATRIX.md Policy section covers basics but misses advanced features:

- **Listed:** Basic policy evaluation, exceptions
- **Actual:** Full K4 lattice, 10+ gate types, 6 risk providers, determinization system

Recommended additions:

1. Add "K4 Lattice Logic" as core feature (Belnap four-valued logic)
2. Add "Policy Gate Types" section (10+ specialized gates)
3. Add "Risk Score Providers" section (6 providers with distinct purposes)
4. Add "Determinization System" (signal weights, decay, uncertainty)
5. Add "Score Policy Configuration" (YAML-based policy tuning)
6. Add "Policy Simulation" as distinct feature
7. Add "Verdict Attestations" (DSSE/Rekor integration)
8. Document "Sealed Mode" for air-gap operations

---
## Batch 8: Attestation & Signing

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| 25+ Predicate Types | Attestor | `StellaOps.Attestor.ProofChain/Predicates/` | - | - | Attestation & Signing |
| Keyless Signing (Fulcio) | Signer | `KeylessDsseSigner.cs`, `HttpFulcioClient.cs` | `stella sign keyless` | - | Attestation & Signing |
| Ephemeral Key Generation | Signer.Keyless | `EphemeralKeyGenerator.cs`, `EphemeralKeyPair.cs` | - | - | Attestation & Signing |
| OIDC Token Provider | Signer.Keyless | `IOidcTokenProvider.cs`, `AmbientOidcTokenProvider.cs` | - | - | Attestation & Signing |
| Key Rotation Service | Signer.KeyManagement | `IKeyRotationService.cs`, `KeyRotationService.cs` | `/keys/rotate` API | - | Attestation & Signing |
| Trust Anchor Manager | Signer.KeyManagement | `ITrustAnchorManager.cs`, `TrustAnchorManager.cs` | - | - | Attestation & Signing |
| Delta Attestations (4 types) | Attestor | `IDeltaAttestationService.cs` (VEX/SBOM/Verdict/Reachability) | - | - | Attestation & Signing |
| Layer Attestation Service | Attestor | `ILayerAttestationService.cs` | - | - | Attestation & Signing |
| Attestation Chain Builder | Attestor | `AttestationChainBuilder.cs`, `AttestationChainValidator.cs` | - | - | Attestation & Signing |
| Attestation Link Store | Attestor | `IAttestationLinkStore.cs`, `IAttestationLinkResolver.cs` | - | - | Attestation & Signing |
| Rekor Submission Queue | Attestor | `IRekorSubmissionQueue.cs` (durable retry) | - | - | Attestation & Signing |
| Cached Verification Service | Attestor | `CachedAttestorVerificationService.cs` | - | - | Attestation & Signing |
| Offline Bundle Service | Attestor | `IAttestorBundleService.cs` | - | `/ops/offline-kit` | Offline & Air-Gap |
| Signer Quota Service | Signer | `ISignerQuotaService.cs` | - | - | Operations |
| Signer Audit Sink | Signer | `ISignerAuditSink.cs`, `InMemorySignerAuditSink.cs` | - | - | Operations |
| Proof of Entitlement | Signer | `IProofOfEntitlementIntrospector.cs` (JWT/MTLS) | - | - | Auth & Access Control |
| Release Integrity Verifier | Signer | `IReleaseIntegrityVerifier.cs` | - | - | Attestation & Signing |
| JSON Canonicalizer (RFC 8785) | Attestor | `JsonCanonicalizer.cs` | - | - | Determinism & Reproducibility |
| Predicate Type Router | Attestor | `IPredicateTypeRouter.cs`, `PredicateTypeRouter.cs` | - | - | Attestation & Signing |
| Standard Predicate Registry | Attestor | `IStandardPredicateRegistry.cs` | - | - | Attestation & Signing |
| HMAC Signing | Signer | `HmacDsseSigner.cs` | - | - | Attestation & Signing |
| SM2 Algorithm Support | Signer | `CryptoDsseSigner.cs` (SM2 branch) | - | - | Regional Crypto |
| Promotion Attestation | Provenance | `PromotionAttestation.cs` | - | - | Release Orchestration |
| Cosign/KMS Signer | Provenance | `CosignAndKmsSigner.cs` | - | - | Attestation & Signing |
| Rotating Signer | Provenance | `RotatingSigner.cs` | - | - | Attestation & Signing |

### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| Key Rotation | Signer | No | No | Add `stella keys rotate` CLI command |
| Trust Anchor Management | Signer | No | No | Add `stella trust-anchors` commands |
| Attestation Chain Visualization | Attestor | No | Partial | Add chain visualization UI |
| Predicate Registry Browser | Attestor | No | No | Add `stella attest predicates list` |
| Delta Attestation CLI | Attestor | No | No | Add `stella attest delta` commands |
| Signer Audit Logs | Signer | No | No | Add `stella sign audit` command |
| Rekor Submission Status | Attestor | No | No | Add submission queue status UI |

### Matrix Update Recommendations

The FEATURE_MATRIX.md Attestation section lists basic DSSE/in-toto support:

- **Listed:** Basic attestation attach/verify, SLSA provenance
- **Actual:** 25+ predicate types, keyless signing, key rotation, attestation chains

Recommended additions:

1. Add "Predicate Types" section (25+ types documented)
2. Add "Keyless Signing (Sigstore)" as major feature
3. Add "Key Rotation Service" for Enterprise tier
4. Add "Trust Anchor Management" for Enterprise tier
5. Add "Attestation Chains" feature
6. Add "Delta Attestations" (VEX/SBOM/Verdict/Reachability)
7. Document "Offline Bundle Service" for air-gap
8. Add "SM2 Algorithm Support" in Regional Crypto section

---
## Batch 9: Regional Crypto

### Discovered Features (Not in Matrix)

| Feature | Module | Key Files | CLI | UI | Suggested Category |
|---------|--------|-----------|-----|----|--------------------|
| 8 Signature Profiles | Cryptography | `SignatureProfile.cs` | - | - | Regional Crypto |
| Ed25519 Baseline Signing | Cryptography | `Ed25519Signer.cs`, `Ed25519Verifier.cs` | - | - | Regional Crypto |
| ECDSA P-256 Profile | Cryptography | `EcdsaP256Signer.cs` | - | - | Regional Crypto |
| FIPS 140-2 Plugin | Cryptography | `FipsPlugin.cs` | - | - | Regional Crypto |
| GOST R 34.10-2012 Plugin | Cryptography | `GostPlugin.cs` | - | - | Regional Crypto |
| SM2/SM3/SM4 Plugin | Cryptography | `SmPlugin.cs` | - | - | Regional Crypto |
| eIDAS Plugin (CAdES/XAdES) | Cryptography | `EidasPlugin.cs` | - | - | Regional Crypto |
| HSM Plugin (PKCS#11) | Cryptography | `HsmPlugin.cs` (simulated + production) | - | - | Regional Crypto |
| CryptoPro GOST (Windows) | Cryptography | `CryptoProGostCryptoProvider.cs` | - | - | Regional Crypto |
| Multi-Profile Signing | Cryptography | `MultiProfileSigner.cs` | - | - | Regional Crypto |
| SM Remote Service | SmRemote | `Program.cs` | - | - | Regional Crypto |
| Post-Quantum Profiles (Defined) | Cryptography | `SignatureProfile.cs` (Dilithium, Falcon) | - | - | Regional Crypto |
| RFC 3161 TSA Integration | Cryptography | `EidasPlugin.cs` | - | - | Regional Crypto |
| Simulated HSM Client | Cryptography | `SimulatedHsmClient.cs` | - | - | Regional Crypto |
| GOST Block Cipher (28147-89) | Cryptography | `GostPlugin.cs` | - | - | Regional Crypto |
| SM4 Encryption (CBC/ECB/GCM) | Cryptography | `SmPlugin.cs` | - | - | Regional Crypto |

### Coverage Gaps

| Feature | Module | Has CLI | Has UI | Recommendation |
|---------|--------|---------|--------|----------------|
| Crypto Profile Selection | Cryptography | No | No | Add `stella crypto profiles` command |
| Plugin Health Check | Cryptography | No | No | Add plugin status endpoint |
| Key Management CLI | Cryptography | No | No | Add `stella keys` commands |
| HSM Status | Cryptography | No | No | Add HSM health monitoring |
| Post-Quantum Implementation | Cryptography | No | No | Implement Dilithium/Falcon when stable |

### Matrix Update Recommendations

The FEATURE_MATRIX.md Regional Crypto section mentions only FIPS/eIDAS/GOST:

- **Listed:** Basic regional compliance mentions
- **Actual:** 8 signature profiles, 6 plugins, HSM support, post-quantum readiness

Recommended additions:

1. Add "Signature Profiles" section (8 profiles documented)
2. Add "Plugin Architecture" description
3. Add "Multi-Profile Signing" capability (dual-stack signatures)
4. Add "SM Remote Service" for Chinese market
5. Add "Post-Quantum Readiness" (Dilithium, Falcon defined)
6. Add "HSM Integration" (PKCS#11 + simulation)
7. Document plugin configuration options
8. Add "CryptoPro GOST" for Windows environments

---
## Batch 10: Evidence & Findings
|
||||||
|
|
||||||
|
### Discovered Features (Not in Matrix)
|
||||||
|
|
||||||
|
| Feature | Module | Key Files | CLI | UI | Suggested Category |
|
||||||
|
|---------|--------|-----------|-----|----|--------------------|
|
||||||
|
| WORM Storage (S3 Object Lock) | EvidenceLocker | `S3EvidenceObjectStore.cs` | - | - | Evidence & Findings |
|
||||||
|
| Verdict Attestations (DSSE) | EvidenceLocker | `VerdictEndpoints.cs`, `VerdictContracts.cs` | - | `/evidence-export` | Evidence & Findings |
|
||||||
|
| Append-Only Ledger Events | Findings | `ILedgerEventRepository.cs`, `LedgerEventModels.cs` | - | `/findings` | Evidence & Findings |
|
||||||
|
| Alert Triage Bands (hot/warm/cold) | Findings | `DecisionModels.cs` | - | `/findings` | Evidence & Findings |
|
||||||
|
| Merkle Anchoring | Findings | `Infrastructure/Merkle/` | - | - | Evidence & Findings |
|
||||||
|
| Evidence Holds (Legal) | EvidenceLocker | `EvidenceHold.cs` | - | - | Evidence & Findings |
|
||||||
|
| Evidence Pack Service | Evidence.Pack | `IEvidencePackService.cs`, `EvidencePack.cs` | - | `/evidence-thread` | Evidence & Findings |
|
||||||
|
| Evidence Card Service | Evidence.Pack | `IEvidenceCardService.cs`, `EvidenceCard.cs` | - | - | Evidence & Findings |
|
||||||
|
| Profile-Based Export | ExportCenter | `ExportApiEndpoints.cs`, `ExportProfile` | - | `/evidence-export` | Evidence & Findings |
|
||||||
|
| Risk Bundle Export | ExportCenter | `RiskBundleEndpoints.cs` | - | `/evidence-export` | Evidence & Findings |
|
||||||
|
| Audit Bundle Export | ExportCenter | `AuditBundleEndpoints.cs` | - | - | Evidence & Findings |
|
||||||
|
| Lineage Evidence Export | ExportCenter | `LineageExportEndpoints.cs` | - | `/lineage` | Evidence & Findings |
|
||||||
|
| SSE Export Streaming | ExportCenter | Real-time run events | - | - | Evidence & Findings |
|
||||||
|
| Incident Mode | Findings | `IIncidentModeState.cs` | - | - | Evidence & Findings |
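
The WORM storage row above relies on S3 Object Lock. Below is a minimal sketch of writing an evidence object with a Compliance-mode retention window using AWSSDK.S3; the bucket/key names and retention policy are illustrative, and this is not the actual `S3EvidenceObjectStore` code.

```csharp
// Minimal sketch (not the real S3EvidenceObjectStore): put an evidence object
// with S3 Object Lock retention so the bucket enforces WORM semantics.
// The bucket must have been created with Object Lock enabled.
using System;
using System.IO;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;

public static class WormUploadSketch
{
    public static async Task PutWithRetentionAsync(
        IAmazonS3 s3, string bucket, string key, Stream content, TimeSpan retention)
    {
        var request = new PutObjectRequest
        {
            BucketName = bucket,
            Key = key,
            InputStream = content,
            // Compliance mode: nobody, including the bucket owner, can shorten the window.
            ObjectLockMode = ObjectLockMode.Compliance,
            ObjectLockRetainUntilDate = DateTime.UtcNow.Add(retention),
            // Legal hold can be toggled independently of the retention date.
            ObjectLockLegalHoldStatus = ObjectLockLegalHoldStatus.Off,
        };

        await s3.PutObjectAsync(request);
    }
}
```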
|
||||||
|
|
||||||
|
### Coverage Gaps
|
||||||
|
|
||||||
|
| Feature | Module | Has CLI | Has UI | Recommendation |
|
||||||
|
|---------|--------|---------|--------|----------------|
|
||||||
|
| Evidence Holds | EvidenceLocker | No | No | Add legal hold management CLI |
|
||||||
|
| Audit Bundle Export | ExportCenter | No | Partial | Add `stella export audit` command |
|
||||||
|
| Incident Mode | Findings | No | No | Add `stella findings incident` commands |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Batch 11: Determinism & Replay
|
||||||
|
|
||||||
|
### Discovered Features (Not in Matrix)
|
||||||
|
|
||||||
|
| Feature | Module | Key Files | CLI | UI | Suggested Category |
|
||||||
|
|---------|--------|-----------|-----|----|--------------------|
|
||||||
|
| Hybrid Logical Clock | HybridLogicalClock | `HybridLogicalClock.cs`, `HlcTimestamp.cs` | - | - | Determinism & Replay |
|
||||||
|
| HLC State Persistence | HybridLogicalClock | `IHlcStateStore.cs` | - | - | Determinism & Replay |
|
||||||
|
| Canonical JSON (RFC 8785) | Canonical.Json | `CanonJson.cs`, `CanonVersion.cs` | - | - | Determinism & Replay |
|
||||||
|
| Replay Manifests V1/V2 | Replay.Core | `ReplayManifest.cs` | `stella scan replay` | - | Determinism & Replay |
|
||||||
|
| Knowledge Snapshots | Replay.Core | `KnowledgeSnapshot.cs` | - | - | Determinism & Replay |
|
||||||
|
| Replay Proofs (DSSE) | Replay.Core | `ReplayProof.cs` | `stella prove` | - | Determinism & Replay |
|
||||||
|
| Evidence Weighted Scoring (6 factors) | Signals | `EvidenceWeightedScoreCalculator.cs` | - | - | Scoring & Risk |
|
||||||
|
| Score Buckets (ActNow/ScheduleNext/Investigate/Watchlist) | Signals | Scoring algorithm | - | - | Scoring & Risk |
|
||||||
|
| Attested Reduction (short-circuit) | Signals | VEX anchoring logic | - | - | Scoring & Risk |
|
||||||
|
| Timeline Events | Eventing | `TimelineEvent.cs`, `ITimelineEventEmitter.cs` | - | - | Determinism & Replay |
|
||||||
|
| Deterministic Event IDs | Eventing | `EventIdGenerator.cs` (SHA-256) | - | - | Determinism & Replay |
|
||||||
|
| Transactional Outbox | Eventing | `TimelineOutboxProcessor.cs` | - | - | Determinism & Replay |
|
||||||
|
| Event Signing (DSSE) | Eventing | `IEventSigner.cs` | - | - | Determinism & Replay |
|
||||||
|
| Replay Bundle Writer | Replay.Core | `StellaReplayBundleWriter.cs` (tar.zst) | - | - | Determinism & Replay |
|
||||||
|
| Dead Letter Replay | Orchestrator | `IReplayManager.cs`, `ReplayManager.cs` | - | - | Operations |
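
Several rows above (Canonical JSON, Deterministic Event IDs) hinge on hashing a canonical rendering of event fields. The sketch below is a hedged illustration of that idea only: the field names are hypothetical and the canonical form is simplified rather than full RFC 8785 output.

```csharp
// Illustrative deterministic event ID: hash a fixed-order, invariant rendering
// of the event's identifying fields so every node derives the same ID.
using System;
using System.Security.Cryptography;
using System.Text;

public static class EventIdSketch
{
    public static string ComputeId(string tenant, string eventType, string subject, DateTimeOffset occurredAt)
    {
        // Keys in a fixed order, timestamps in UTC round-trip format, so the same
        // logical event always yields the same digest.
        var canonical =
            $"tenant={tenant}\neventType={eventType}\nsubject={subject}\noccurredAt={occurredAt.UtcDateTime:O}";

        var digest = SHA256.HashData(Encoding.UTF8.GetBytes(canonical));
        return Convert.ToHexString(digest).ToLowerInvariant();
    }
}
```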
|
||||||
|
|
||||||
|
### Coverage Gaps
|
||||||
|
|
||||||
|
| Feature | Module | Has CLI | Has UI | Recommendation |
|
||||||
|
|---------|--------|---------|--------|----------------|
|
||||||
|
| HLC Inspection | HybridLogicalClock | No | No | Add `stella hlc status` command |
|
||||||
|
| Timeline Events | Eventing | No | No | Add `stella timeline query` command |
|
||||||
|
| Scoring Explanation | Signals | No | No | Add `stella score explain` command |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Batch 12: Operations
|
||||||
|
|
||||||
|
### Discovered Features (Not in Matrix)
|
||||||
|
|
||||||
|
| Feature | Module | Key Files | CLI | UI | Suggested Category |
|
||||||
|
|---------|--------|-----------|-----|----|--------------------|
|
||||||
|
| Impact Index (Roaring bitmaps) | Scheduler | `IImpactIndex.cs` | - | - | Operations |
|
||||||
|
| Graph Build/Overlay Jobs | Scheduler | `IGraphJobService.cs` | - | `/ops/scheduler` | Operations |
|
||||||
|
| Run Preview (dry-run) | Scheduler | `RunEndpoints.cs` | - | - | Operations |
|
||||||
|
| SSE Run Streaming | Scheduler | `/runs/{runId}/stream` | - | - | Operations |
|
||||||
|
| Job Repository | Orchestrator | `IJobRepository.cs`, `Job.cs` | - | `/orchestrator` | Operations |
|
||||||
|
| Lease Management | Orchestrator | `LeaseNextAsync()`, `ExtendLeaseAsync()` | - | - | Operations |
|
||||||
|
| Dead Letter Classification | Orchestrator | `DeadLetterEntry.cs` | - | `/orchestrator` | Operations |
|
||||||
|
| First Signal Service | Orchestrator | `IFirstSignalService.cs` | - | - | Operations |
|
||||||
|
| Task Pack Execution | TaskRunner | `ITaskRunnerClient.cs` | - | - | Operations |
|
||||||
|
| Plan-Hash Binding | TaskRunner | Deterministic validation | - | - | Operations |
|
||||||
|
| Approval Gates | TaskRunner | `ApprovalDecisionRequest.cs` | - | - | Operations |
|
||||||
|
| Artifact Capture | TaskRunner | Digest tracking | - | - | Operations |
|
||||||
|
| Timeline Query Service | TimelineIndexer | `ITimelineQueryService.cs` | - | - | Operations |
|
||||||
|
| Timeline Ingestion | TimelineIndexer | `ITimelineIngestionService.cs` | - | - | Operations |
|
||||||
|
| Token-Bucket Rate Limiting | Orchestrator | Adaptive refill per tenant | - | - | Operations |
|
||||||
|
| Job Watermarks | Orchestrator | Ordering guarantees | - | - | Operations |
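
The token-bucket rate limiting row above follows the classic refill-on-demand pattern. A minimal, single-threaded sketch is shown below; capacity and refill rate are placeholders, and the Orchestrator's adaptive, per-tenant refill logic is more involved.

```csharp
// Minimal token bucket: tokens refill continuously up to a capacity, and a
// request is admitted only if enough tokens remain. Not thread-safe.
using System;

public sealed class TokenBucket
{
    private readonly double _capacity;
    private readonly double _refillPerSecond;
    private double _tokens;
    private DateTime _lastRefillUtc;

    public TokenBucket(double capacity, double refillPerSecond)
    {
        _capacity = capacity;
        _refillPerSecond = refillPerSecond;
        _tokens = capacity;
        _lastRefillUtc = DateTime.UtcNow;
    }

    public bool TryConsume(double tokens = 1)
    {
        Refill();
        if (_tokens < tokens) return false;
        _tokens -= tokens;
        return true;
    }

    private void Refill()
    {
        var now = DateTime.UtcNow;
        var elapsedSeconds = (now - _lastRefillUtc).TotalSeconds;
        _tokens = Math.Min(_capacity, _tokens + elapsedSeconds * _refillPerSecond);
        _lastRefillUtc = now;
    }
}
```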
|
||||||
|
|
||||||
|
### Coverage Gaps
|
||||||
|
|
||||||
|
| Feature | Module | Has CLI | Has UI | Recommendation |
|
||||||
|
|---------|--------|---------|--------|----------------|
|
||||||
|
| Impact Preview | Scheduler | No | Partial | Add `stella scheduler preview` command |
|
||||||
|
| Job Management | Orchestrator | No | Yes | Add `stella orchestrator jobs` commands |
|
||||||
|
| Dead Letter Operations | Orchestrator | No | Yes | Add `stella orchestrator deadletter` commands |
|
||||||
|
| TaskRunner CLI | TaskRunner | No | No | Add `stella taskrunner` commands |
|
||||||
|
| Timeline Query CLI | TimelineIndexer | No | No | Add `stella timeline` commands |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Batch 13: Release Orchestration
|
||||||
|
|
||||||
|
### Discovered Features (Not in Matrix)
|
||||||
|
|
||||||
|
| Feature | Module | Key Files | CLI | UI | Suggested Category |
|
||||||
|
|---------|--------|-----------|-----|----|--------------------|
|
||||||
|
| Environment Bundles | ReleaseOrchestrator | `IEnvironmentBundleService.cs`, `EnvironmentBundle.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Promotion Workflows | ReleaseOrchestrator | `IPromotionWorkflowService.cs`, `PromotionRequest.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Rollback Service | ReleaseOrchestrator | `IRollbackService.cs`, `RollbackRequest.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Deployment Agents (Docker/Compose/ECS/Nomad) | ReleaseOrchestrator | `IDeploymentAgent.cs`, various agent implementations | - | `/releases` | Release Orchestration |
|
||||||
|
| Progressive Delivery (A/B, Canary) | ReleaseOrchestrator | `IProgressiveDeliveryService.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Hook System (Pre/Post Deploy) | ReleaseOrchestrator | `IHookExecutionService.cs`, `Hook.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Approval Gates (Multi-Stage) | ReleaseOrchestrator | `IApprovalGateService.cs`, `ApprovalGate.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Release Bundle Signing | ReleaseOrchestrator | `IReleaseBundleSigningService.cs` | - | - | Release Orchestration |
|
||||||
|
| Environment Promotion History | ReleaseOrchestrator | `IPromotionHistoryService.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Deployment Lock Service | ReleaseOrchestrator | `IDeploymentLockService.cs` | - | - | Release Orchestration |
|
||||||
|
| Release Manifest Generation | ReleaseOrchestrator | `IReleaseManifestService.cs` | - | - | Release Orchestration |
|
||||||
|
| Promotion Attestations | ReleaseOrchestrator | `PromotionAttestation.cs` | - | - | Attestation & Signing |
|
||||||
|
| Environment Health Checks | ReleaseOrchestrator | `IEnvironmentHealthService.cs` | - | `/releases` | Release Orchestration |
|
||||||
|
| Deployment Verification Tests | ReleaseOrchestrator | `IVerificationTestService.cs` | - | - | Release Orchestration |
|
||||||
|
|
||||||
|
### Coverage Gaps
|
||||||
|
|
||||||
|
| Feature | Module | Has CLI | Has UI | Recommendation |
|
||||||
|
|---------|--------|---------|--------|----------------|
|
||||||
|
| Release Bundle Creation | ReleaseOrchestrator | No | Partial | Add `stella release create` command |
|
||||||
|
| Environment Promotion | ReleaseOrchestrator | No | Yes | Add `stella release promote` command |
|
||||||
|
| Rollback Operations | ReleaseOrchestrator | No | Yes | Add `stella release rollback` command |
|
||||||
|
| Hook Management | ReleaseOrchestrator | No | Partial | Add `stella release hooks` commands |
|
||||||
|
| Deployment Agent Status | ReleaseOrchestrator | No | Partial | Add `stella agent status` command |
|
||||||
|
|
||||||
|
### Matrix Update Recommendations
|
||||||
|
|
||||||
|
The FEATURE_MATRIX.md Release Orchestration section is still marked as largely planned:

|
||||||
|
- **Listed:** Basic environment management concepts
|
||||||
|
- **Actual:** Full promotion workflow, deployment agents, progressive delivery
|
||||||
|
|
||||||
|
Recommended additions:
|
||||||
|
1. Add "Deployment Agents" section (Docker, Compose, ECS, Nomad)
|
||||||
|
2. Add "Progressive Delivery" (A/B, Canary strategies)
|
||||||
|
3. Add "Approval Gates" (multi-stage approvals)
|
||||||
|
4. Add "Hook System" (pre/post deployment hooks)
|
||||||
|
5. Add "Promotion Attestations" (DSSE signing of promotions)
|
||||||
|
6. Document "Environment Health Checks"
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Batch 14: Auth & Access Control
|
||||||
|
|
||||||
|
### Discovered Features (Not in Matrix)
|
||||||
|
|
||||||
|
| Feature | Module | Key Files | CLI | UI | Suggested Category |
|
||||||
|
|---------|--------|-----------|-----|----|--------------------|
|
||||||
|
| 75+ Authorization Scopes | Authority | `AuthorizationScopeConstants.cs` | - | `/admin/roles` | Auth & Access Control |
|
||||||
|
| DPoP Sender Constraints | Authority | `DPoPService.cs`, `DPoPValidator.cs` | - | - | Auth & Access Control |
|
||||||
|
| mTLS Sender Constraints | Authority | `MtlsClientCertificateValidator.cs` | - | - | Auth & Access Control |
|
||||||
|
| Device Authorization Flow | Authority | `DeviceAuthorizationEndpoints.cs` | - | `/login` | Auth & Access Control |
|
||||||
|
| JWT Profile for OAuth | Authority | `JwtBearerClientAssertionValidator.cs` | - | - | Auth & Access Control |
|
||||||
|
| PAR (Pushed Authorization Requests) | Authority | `ParEndpoints.cs` | - | - | Auth & Access Control |
|
||||||
|
| Tenant Isolation | Authority | `ITenantContext.cs`, `TenantResolutionMiddleware.cs` | - | - | Auth & Access Control |
|
||||||
|
| Role-Based Access Control | Authority | `IRoleService.cs`, `Role.cs` | - | `/admin/roles` | Auth & Access Control |
|
||||||
|
| Permission Grant Service | Authority | `IPermissionGrantService.cs` | - | - | Auth & Access Control |
|
||||||
|
| Token Introspection | Authority | `TokenIntrospectionEndpoints.cs` | - | - | Auth & Access Control |
|
||||||
|
| Token Revocation | Authority | `TokenRevocationEndpoints.cs` | - | - | Auth & Access Control |
|
||||||
|
| OAuth Client Management | Authority | `IClientRepository.cs`, `Client.cs` | - | `/admin/clients` | Auth & Access Control |
|
||||||
|
| User Federation (LDAP/SAML) | Authority | `IFederationProvider.cs` | - | `/admin/federation` | Auth & Access Control |
|
||||||
|
| Session Management | Authority | `ISessionStore.cs`, `Session.cs` | - | - | Auth & Access Control |
|
||||||
|
| Consent Management | Authority | `IConsentStore.cs`, `Consent.cs` | - | `/consent` | Auth & Access Control |
|
||||||
|
| Registry Token Service | Registry | `ITokenService.cs`, `TokenModels.cs` | `stella registry login` | - | Auth & Access Control |
|
||||||
|
| Scope-Based Token Minting | Registry | Pull/push/catalog scope handling | - | - | Auth & Access Control |
|
||||||
|
| Token Refresh Flow | Authority | Refresh token rotation | - | - | Auth & Access Control |
|
||||||
|
| Multi-Factor Authentication | Authority | `IMfaService.cs` | - | `/login/mfa` | Auth & Access Control |
|
||||||
|
| API Key Management | Authority | `IApiKeyService.cs` | - | `/admin/api-keys` | Auth & Access Control |
|
||||||
|
|
||||||
|
### Coverage Gaps
|
||||||
|
|
||||||
|
| Feature | Module | Has CLI | Has UI | Recommendation |
|
||||||
|
|---------|--------|---------|--------|----------------|
|
||||||
|
| Scope Management | Authority | No | Yes | Add `stella auth scopes` commands |
|
||||||
|
| DPoP Configuration | Authority | No | No | Add DPoP configuration documentation |
|
||||||
|
| Client Management | Authority | No | Yes | Add `stella auth clients` commands |
|
||||||
|
| Role Management | Authority | No | Yes | Add `stella auth roles` commands |
|
||||||
|
| API Key Operations | Authority | No | Yes | Add `stella auth api-keys` commands |
|
||||||
|
| Token Introspection | Authority | No | No | Add `stella auth token inspect` command |
|
||||||
|
|
||||||
|
### Matrix Update Recommendations
|
||||||
|
|
||||||
|
The FEATURE_MATRIX.md Auth section covers basics but misses advanced features:
|
||||||
|
- **Listed:** Basic OAuth/OIDC, RBAC
|
||||||
|
- **Actual:** 75+ scopes, DPoP/mTLS, federation, advanced OAuth flows
|
||||||
|
|
||||||
|
Recommended additions:
|
||||||
|
1. Add "Authorization Scopes" section (75+ granular scopes)
|
||||||
|
2. Add "Sender Constraints" (DPoP, mTLS)
|
||||||
|
3. Add "Device Authorization Flow" for CLI/IoT
|
||||||
|
4. Add "User Federation" (LDAP, SAML integration)
|
||||||
|
5. Add "PAR Support" for security-conscious clients
|
||||||
|
6. Add "Multi-Factor Authentication"
|
||||||
|
7. Add "API Key Management" for service accounts
|
||||||
|
8. Document "Tenant Isolation" architecture
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Batch 15: Notifications & Integrations
|
||||||
|
|
||||||
|
### Discovered Features (Not in Matrix)
|
||||||
|
|
||||||
|
| Feature | Module | Key Files | CLI | UI | Suggested Category |
|
||||||
|
|---------|--------|-----------|-----|----|--------------------|
|
||||||
|
| 10 Notification Channel Types | Notify | Email, Slack, Teams, Webhook, PagerDuty, SNS, SQS, Pub/Sub, Discord, Matrix | - | `/notifications` | Notifications |
|
||||||
|
| Template-Based Notifications | Notify | `INotificationTemplateService.cs`, `NotificationTemplate.cs` | - | `/notifications` | Notifications |
|
||||||
|
| Channel Routing Rules | Notify | `IChannelRoutingService.cs`, `RoutingRule.cs` | - | `/notifications` | Notifications |
|
||||||
|
| Delivery Receipt Tracking | Notify | `IDeliveryReceiptService.cs`, `DeliveryReceipt.cs` | - | - | Notifications |
|
||||||
|
| Notification Preferences | Notify | `IPreferenceService.cs`, `UserPreference.cs` | - | `/settings` | Notifications |
|
||||||
|
| Digest/Batch Notifications | Notify | `IDigestService.cs` | - | `/notifications` | Notifications |
|
||||||
|
| Kubernetes Admission Webhooks | Zastava | `AdmissionWebhookEndpoints.cs` | - | - | Integrations |
|
||||||
|
| OCI Registry Push Hooks | Zastava | `IWebhookProcessor.cs`, `RegistryPushEvent.cs` | - | - | Integrations |
|
||||||
|
| Scan-on-Push Trigger | Zastava | Auto-trigger scanning on registry push | - | - | Integrations |
|
||||||
|
| SCM Webhooks (GitHub/GitLab/Bitbucket) | Integrations | `IScmWebhookHandler.cs` | - | `/integrations` | Integrations |
|
||||||
|
| CI/CD Webhooks | Integrations | Jenkins, CircleCI, GitHub Actions integration | - | `/integrations` | Integrations |
|
||||||
|
| Issue Tracker Integration | Integrations | Jira, GitHub Issues, Linear integration | - | `/integrations` | Integrations |
|
||||||
|
| Slack App Integration | Integrations | `ISlackAppService.cs`, slash commands | - | `/integrations` | Integrations |
|
||||||
|
| MS Teams App Integration | Integrations | `ITeamsAppService.cs`, adaptive cards | - | `/integrations` | Integrations |
|
||||||
|
| Notification Studio | Notifier | Template design and preview | - | `/notifications/studio` | Notifications |
|
||||||
|
| Escalation Rules | Notify | `IEscalationService.cs` | - | `/notifications` | Notifications |
|
||||||
|
| On-Call Schedule Integration | Notify | PagerDuty, OpsGenie integration | - | `/notifications` | Notifications |
|
||||||
|
| Webhook Retry Logic | Notify | Exponential backoff, dead letter | - | - | Notifications |
|
||||||
|
| Event-Driven Notifications | Notify | Timeline event subscription | - | - | Notifications |
|
||||||
|
| Custom Webhook Payloads | Integrations | `IWebhookPayloadFormatter.cs` | - | `/integrations` | Integrations |
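
The webhook retry row above combines exponential backoff with a dead-letter fallback. The loop below is a simplified sketch: the attempt count, delays, and JSON payload handling are assumptions, not the Notify module's actual implementation.

```csharp
// Illustrative webhook delivery with exponential backoff and jitter. Exhausted
// deliveries return false so the caller can route them to a dead-letter store.
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class WebhookRetrySketch
{
    public static async Task<bool> DeliverAsync(
        HttpClient client, Uri endpoint, string jsonPayload, int maxAttempts = 5)
    {
        var random = new Random();

        for (var attempt = 1; attempt <= maxAttempts; attempt++)
        {
            try
            {
                var content = new StringContent(jsonPayload, Encoding.UTF8, "application/json");
                var response = await client.PostAsync(endpoint, content);
                if (response.IsSuccessStatusCode) return true;
            }
            catch (HttpRequestException)
            {
                // Transient network failure: fall through to the backoff below.
            }

            if (attempt == maxAttempts) break;

            // 1s, 2s, 4s, 8s ... plus up to 500 ms of jitter to avoid thundering herds.
            var delay = TimeSpan.FromSeconds(Math.Pow(2, attempt - 1))
                        + TimeSpan.FromMilliseconds(random.Next(0, 500));
            await Task.Delay(delay);
        }

        return false; // caller hands this off to the dead-letter path
    }
}
```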
|
||||||
|
|
||||||
|
### Coverage Gaps
|
||||||
|
|
||||||
|
| Feature | Module | Has CLI | Has UI | Recommendation |
|
||||||
|
|---------|--------|---------|--------|----------------|
|
||||||
|
| Channel Configuration | Notify | No | Yes | Add `stella notify channels` commands |
|
||||||
|
| Template Management | Notify | No | Yes | Add `stella notify templates` commands |
|
||||||
|
| Webhook Testing | Integrations | No | Partial | Add `stella integrations test` command |
|
||||||
|
| K8s Webhook Installation | Zastava | No | No | Add `stella zastava install` command |
|
||||||
|
| Notification Preferences | Notify | No | Yes | Add `stella notify preferences` commands |
|
||||||
|
|
||||||
|
### Matrix Update Recommendations
|
||||||
|
|
||||||
|
The FEATURE_MATRIX.md Notifications section covers only the basics:
|
||||||
|
- **Listed:** Basic webhook/email notifications
|
||||||
|
- **Actual:** 10 channel types, template engine, routing rules, escalation
|
||||||
|
|
||||||
|
Recommended additions:
|
||||||
|
1. Add "Notification Channels" section (10 types)
|
||||||
|
2. Add "Template Engine" for customizable messages
|
||||||
|
3. Add "Channel Routing" for sophisticated delivery
|
||||||
|
4. Add "Escalation Rules" for incident response
|
||||||
|
5. Add "Notification Studio" for template design
|
||||||
|
6. Add "Kubernetes Admission Webhooks" (Zastava)
|
||||||
|
7. Add "SCM Integrations" (GitHub, GitLab, Bitbucket)
|
||||||
|
8. Add "CI/CD Integrations" (Jenkins, CircleCI, GitHub Actions)
|
||||||
|
9. Add "Issue Tracker Integration" (Jira, GitHub Issues)
|
||||||
|
10. Document "Scan-on-Push" auto-trigger
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Summary: Overall Matrix Gaps
|
||||||
|
|
||||||
|
### Major Documentation Gaps Identified
|
||||||
|
|
||||||
|
| Category | Matrix Coverage | Actual Coverage | Gap Severity |
|
||||||
|
|----------|-----------------|-----------------|--------------|
|
||||||
|
| Advisory Sources | 11 sources | 33+ connectors | **CRITICAL** |
|
||||||
|
| VEX Processing | Basic | Full consensus engine | **HIGH** |
|
||||||
|
| Attestation & Signing | Basic | 25+ predicates | **HIGH** |
|
||||||
|
| Auth Scopes | Basic RBAC | 75+ granular scopes | **HIGH** |
|
||||||
|
| Policy Engine | Basic | K4 lattice, 10+ gates | **MEDIUM** |
|
||||||
|
| Regional Crypto | 3 profiles | 8 profiles, 6 plugins | **MEDIUM** |
|
||||||
|
| Notifications | 2 channels | 10 channels | **MEDIUM** |
|
||||||
|
| Binary Analysis | Basic | 4 fingerprint algorithms | **MEDIUM** |
|
||||||
|
| Release Orchestration | Planned | Partially implemented | **LOW** |
|
||||||
|
|
||||||
|
### CLI/UI Coverage Statistics
|
||||||
|
|
||||||
|
| Metric | Value |
|
||||||
|
|--------|-------|
|
||||||
|
| Features with CLI | ~65% |
|
||||||
|
| Features with UI | ~70% |
|
||||||
|
| Features with both | ~55% |
|
||||||
|
| Internal-only features | ~25% |
|
||||||
|
|
||||||
|
### Recommended Next Steps
|
||||||
|
|
||||||
|
1. **Immediate**: Update Advisory Sources section (33+ connectors undocumented)
|
||||||
|
2. **High Priority**: Document VEX consensus engine capabilities
|
||||||
|
3. **High Priority**: Document attestation predicate types
|
||||||
|
4. **Medium Priority**: Update auth scopes documentation
|
||||||
|
5. **Medium Priority**: Complete policy engine documentation
|
||||||
|
6. **Low Priority**: Document internal operations features
|
||||||
938
docs/FEATURE_MATRIX_COMPLETE.md
Normal file
@@ -0,0 +1,938 @@
|
|||||||
|
# Complete Feature Matrix - Stella Ops Suite
|
||||||
|
*(Auto-generated with code mapping)*
|
||||||
|
|
||||||
|
> This document extends `FEATURE_MATRIX.md` with module/file mappings and CLI/UI coverage verification.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## SBOM & Ingestion
|
||||||
|
|
||||||
|
| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|
||||||
|
|---------|-------|--------|-----------|-----|----|----|
|
||||||
|
| Trivy-JSON Ingestion | Free/Pro/Ent | Concelier | `TrivyDbExporterPlugin.cs`, `TrivyDbBoltBuilder.cs` | - | `/concelier/trivy-db-settings` | Implemented |
|
||||||
|
| SPDX-JSON 3.0.1 Ingestion | Free/Pro/Ent | Concelier, Scanner | `SbomParser.cs`, `SpdxJsonLdSerializer.cs` | `stella sbom list --format spdx` | `/sbom-sources` | Implemented |
|
||||||
|
| CycloneDX 1.7 Ingestion | Free/Pro/Ent | Concelier, Scanner | `SbomParser.cs`, `CycloneDxComposer.cs` | `stella sbom list --format cyclonedx` | `/sbom-sources` | Implemented |
|
||||||
|
| Auto-format Detection | Free/Pro/Ent | Concelier | `ISbomParser.cs`, `SbomParser.cs` (DetectFormatAsync) | Implicit in `stella sbom` | Implicit | Implemented |
|
||||||
|
| Delta-SBOM Cache | Free/Pro/Ent | SbomService | `VexDeltaRepository.cs`, `InMemoryLineageCompareCache.cs`, `ValkeyLineageCompareCache.cs` | - | - | Implemented |
|
||||||
|
| SBOM Generation (all formats) | Free/Pro/Ent | Scanner | `SpdxComposer.cs`, `CycloneDxComposer.cs`, `SpdxLayerWriter.cs`, `CycloneDxLayerWriter.cs` | `stella scan run` | `/findings` (scan results) | Implemented |
|
||||||
|
| Semantic SBOM Diff | Free/Pro/Ent | Scanner, SbomService | `SbomDiff.cs`, `SbomDiffEngine.cs`, `LineageCompareService.cs` | - | `/lineage` | Implemented |
|
||||||
|
| BYOS (Bring-Your-Own-SBOM) | Free/Pro/Ent | Scanner | `SbomByosUploadService.cs`, `SbomUploadStore.cs`, `SbomUploadEndpoints.cs` | `stella sbom upload` (pending) | `/sbom-sources` | Implemented |
|
||||||
|
| SBOM Lineage Ledger | Enterprise | SbomService | `SbomLineageEdgeRepository.cs`, `SbomLedgerModels.cs`, `SbomServiceDbContext.cs` | - | `/lineage` | Implemented |
|
||||||
|
| SBOM Lineage API | Enterprise | SbomService, Graph | `ILineageGraphService.cs`, `SbomLineageGraphService.cs`, `LineageExportService.cs`, `LineageController.cs` | - | `/lineage` | Implemented |
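
Auto-format detection in the table above boils down to probing well-known top-level markers before full parsing. The sketch below is hedged: the real `SbomParser.DetectFormatAsync` may also consider media types, XML namespaces, and SPDX tag-value input.

```csharp
// Illustrative format sniffing: SPDX JSON documents carry "spdxVersion" at the
// root, CycloneDX JSON documents carry "bomFormat": "CycloneDX".
using System.Text.Json;

public enum SbomFormat { Unknown, SpdxJson, CycloneDxJson }

public static class SbomFormatSniffer
{
    public static SbomFormat Detect(string json)
    {
        using var doc = JsonDocument.Parse(json);
        var root = doc.RootElement;

        if (root.TryGetProperty("spdxVersion", out _))
            return SbomFormat.SpdxJson;

        if (root.TryGetProperty("bomFormat", out var bomFormat) &&
            bomFormat.ValueKind == JsonValueKind.String &&
            bomFormat.ValueEquals("CycloneDX"))
            return SbomFormat.CycloneDxJson;

        return SbomFormat.Unknown;
    }
}
```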
|
||||||
|
|
||||||
|
### CLI Commands (SBOM)
|
||||||
|
|
||||||
|
| Command | Description | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| `stella sbom list` | List SBOMs with filters (--image, --digest, --format, --created-after/before) | Implemented |
|
||||||
|
| `stella sbom show <id>` | Display SBOM details | Implemented |
|
||||||
|
| `stella sbom upload` | Upload external SBOM (BYOS) | Pending verification |
|
||||||
|
| `stella sbomer layer list` | List layer fragments for a scan | Implemented |
|
||||||
|
| `stella sbomer compose` | Compose layer SBOMs | Implemented |
|
||||||
|
| `stella sbomer verify` | Verify Merkle tree integrity | Implemented |
|
||||||
|
|
||||||
|
### UI Routes (SBOM)
|
||||||
|
|
||||||
|
| Route | Feature | Status |
|
||||||
|
|-------|---------|--------|
|
||||||
|
| `/sbom-sources` | SBOM ingestion source management | Implemented |
|
||||||
|
| `/lineage` | SBOM lineage graph and smart diff | Implemented |
|
||||||
|
| `/graph` | Interactive SBOM dependency visualization | Implemented |
|
||||||
|
| `/concelier/trivy-db-settings` | Trivy vulnerability database configuration | Implemented |
|
||||||
|
|
||||||
|
### Coverage Gaps (SBOM)
|
||||||
|
|
||||||
|
| Feature | Has CLI | Has UI | Notes |
|
||||||
|
|---------|---------|--------|-------|
|
||||||
|
| Delta-SBOM Cache | No | No | Internal optimization, no direct exposure needed |
|
||||||
|
| Auto-format Detection | Implicit | Implicit | Works automatically, no explicit command |
|
||||||
|
| SBOM Lineage Ledger | No | Yes | CLI access would be useful for automation |
|
||||||
|
| SBOM Lineage API | No | Yes | CLI access would be useful for automation |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Scanning & Detection
|
||||||
|
|
||||||
|
| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|
||||||
|
|---------|-------|--------|-----------|-----|----|----|
|
||||||
|
| CVE Lookup via Local DB | Free/Pro/Ent | Scanner | `VulnSurfaceService.cs`, `AdvisoryClient.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| License-Risk Detection | All (Planned) | Scanner | Package manifest extraction only | - | - | Planned (Q4-2025) |
|
||||||
|
| **.NET/C# Analyzer** | Free/Pro/Ent | Scanner | `DotNetLanguageAnalyzer.cs`, `DotNetDependencyCollector.cs`, `MsBuildProjectParser.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| **Java Analyzer** | Free/Pro/Ent | Scanner | `JavaLanguageAnalyzer.cs`, `JavaWorkspaceNormalizer.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| **Go Analyzer** | Free/Pro/Ent | Scanner | `GoLanguageAnalyzer.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| **Python Analyzer** | Free/Pro/Ent | Scanner | `PythonLanguageAnalyzer.cs`, `PythonEnvironmentDetector.cs`, `ContainerLayerAdapter.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| **Node.js Analyzer** | Free/Pro/Ent | Scanner | `NodeLanguageAnalyzer.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| **Ruby Analyzer** | Free/Pro/Ent | Scanner | `RubyLanguageAnalyzer.cs`, `RubyVendorArtifactCollector.cs` | `stella ruby inspect` | `/findings` | Implemented |
|
||||||
|
| **Bun Analyzer** | Free/Pro/Ent | Scanner | `BunLanguageAnalyzer.cs` | `stella bun inspect` | `/findings` | Implemented |
|
||||||
|
| **Deno Analyzer** | Free/Pro/Ent | Scanner | `DenoLanguageAnalyzer.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| **PHP Analyzer** | Free/Pro/Ent | Scanner | `PhpLanguageAnalyzer.cs` | `stella php inspect` | `/findings` | Implemented |
|
||||||
|
| **Rust Analyzer** | Free/Pro/Ent | Scanner | `RustLanguageAnalyzer.cs` | `stella scan run` | `/findings` | Implemented |
|
||||||
|
| **Native Binary Analyzer** | Free/Pro/Ent | Scanner | `NativeAnalyzer.cs` | `stella binary` | `/analyze/patch-map` | Implemented |
|
||||||
|
| Quick Mode | Free/Pro/Ent | Scanner | `FidelityLevel.cs`, `FidelityConfiguration.cs`, `FidelityAwareAnalyzer.cs` | `stella scan run --fidelity quick` | `/ops/scanner` | Implemented |
|
||||||
|
| Standard Mode | Free/Pro/Ent | Scanner | `FidelityLevel.cs`, `FidelityConfiguration.cs` | `stella scan run --fidelity standard` | `/ops/scanner` | Implemented |
|
||||||
|
| Deep Mode | Pro/Ent | Scanner | `FidelityLevel.cs`, `FidelityConfiguration.cs` | `stella scan run --fidelity deep` | `/ops/scanner` | Implemented |
|
||||||
|
| Base Image Detection | Free/Pro/Ent | Scanner | `OciImageInspector.cs`, `OciImageConfig.cs` | `stella image inspect` | `/findings` | Implemented |
|
||||||
|
| Layer-Aware Analysis | Free/Pro/Ent | Scanner | `LayeredRootFileSystem.cs`, `ContainerLayerAdapter.cs` | `stella scan layer-sbom` | `/findings` | Implemented |
|
||||||
|
| Concurrent Scan Workers | 1/3/Unlimited | Scanner | `IScanQueue.cs`, `NatsScanQueue.cs`, `ScanJobProcessor.cs` | - | `/ops/scanner` | Implemented |
|
||||||
|
|
||||||
|
### CLI Commands (Scanning)
|
||||||
|
|
||||||
|
| Command | Description | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| `stella scan run` | Execute scanner with --runner, --entry, --target | Implemented |
|
||||||
|
| `stella scan upload` | Upload completed scan results | Implemented |
|
||||||
|
| `stella scan entrytrace` | Show entry trace summary for a scan | Implemented |
|
||||||
|
| `stella scan sarif` | Export scan results in SARIF 2.1.0 format | Implemented |
|
||||||
|
| `stella scan replay` | Replay scan with deterministic hashes | Implemented |
|
||||||
|
| `stella scan gate-policy` | VEX gate evaluation | Implemented |
|
||||||
|
| `stella scan layers` | Container layer operations | Implemented |
|
||||||
|
| `stella scan layer-sbom` | Layer SBOM composition | Implemented |
|
||||||
|
| `stella scan diff` | Binary diff analysis | Implemented |
|
||||||
|
| `stella image inspect` | Inspect OCI image manifest and layers | Implemented |
|
||||||
|
| `stella ruby inspect` | Inspect Ruby workspace | Implemented |
|
||||||
|
| `stella php inspect` | Inspect PHP workspace | Implemented |
|
||||||
|
| `stella python inspect` | Inspect Python workspace/venv | Implemented |
|
||||||
|
| `stella bun inspect` | Inspect Bun workspace | Implemented |
|
||||||
|
| `stella scanner download` | Download latest scanner bundle | Implemented |
|
||||||
|
|
||||||
|
### UI Routes (Scanning)
|
||||||
|
|
||||||
|
| Route | Feature | Status |
|
||||||
|
|-------|---------|--------|
|
||||||
|
| `/findings` | Vulnerability findings with diff-first view | Implemented |
|
||||||
|
| `/findings/:scanId` | Scan-specific findings | Implemented |
|
||||||
|
| `/scans/:scanId` | Individual scan result inspection | Implemented |
|
||||||
|
| `/vulnerabilities` | CVE/vulnerability database explorer | Implemented |
|
||||||
|
| `/vulnerabilities/:vulnId` | Vulnerability detail view | Implemented |
|
||||||
|
| `/ops/scanner` | Scanner offline kits, baselines, determinism settings | Implemented |
|
||||||
|
| `/analyze/patch-map` | Fleet-wide binary patch coverage heatmap | Implemented |
|
||||||
|
|
||||||
|
### Coverage Gaps (Scanning)
|
||||||
|
|
||||||
|
| Feature | Has CLI | Has UI | Notes |
|
||||||
|
|---------|---------|--------|-------|
|
||||||
|
| License-Risk Detection | No | No | Planned feature, not yet implemented |
|
||||||
|
| Concurrent Worker Config | No | Yes | Worker count configured via ops UI/environment |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Reachability Analysis
|
||||||
|
|
||||||
|
| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|
||||||
|
|---------|-------|--------|-----------|-----|----|----|
|
||||||
|
| Static Call Graph | Free/Pro/Ent | Scanner, ReachGraph | `ReachabilityAnalyzer.cs`, `ReachGraphEdge.cs` | `stella reachgraph slice` | `/reachability` | Implemented |
|
||||||
|
| Entrypoint Detection (9+ types) | Free/Pro/Ent | Scanner | `JavaEntrypointClassifier.cs`, `EntryTraceResponse.cs` | `stella scan entrytrace` | `/reachability` | Implemented |
|
||||||
|
| BFS Reachability | Free/Pro/Ent | Scanner | `ReachabilityAnalyzer.cs` (BFS traversal, max depth 256) | `stella reachgraph slice --depth` | `/reachability` | Implemented |
|
||||||
|
| Reachability Drift Detection | Free/Pro/Ent | Reachability.Core | `ReachabilityLattice.cs` (8-state machine) | `stella drift` | `/reachability` | Implemented |
|
||||||
|
| Binary Loader Resolution | Pro/Ent | Scanner | `GuardDetector.cs` (PLT/IAT), Binary entrypoint classifiers | `stella binary` | `/analyze/patch-map` | Implemented |
|
||||||
|
| Feature Flag/Config Gating | Pro/Ent | Scanner | `GuardDetector.cs` (env guards, platform checks, feature flags) | - | `/reachability` | Implemented |
|
||||||
|
| Runtime Signal Correlation | Enterprise | Signals | `EvidenceWeightedScoreCalculator.cs`, `ISignalsAdapter.cs` | - | `/reachability` | Implemented |
|
||||||
|
| Gate Detection (auth/admin) | Enterprise | Scanner | `GuardDetector.cs` (20+ patterns across 5+ languages) | - | `/reachability` | Implemented |
|
||||||
|
| Path Witness Generation | Enterprise | Scanner, ReachGraph | `ReachabilityAnalyzer.cs` (deterministic path ordering) | `stella witness` | - | Implemented |
|
||||||
|
| Reachability Mini-Map API | Enterprise | ReachGraph | `ReachGraphStoreService.cs`, `ReachGraphContracts.cs` | `stella reachgraph slice` | `/reachability` | Implemented |
|
||||||
|
| Runtime Timeline API | Enterprise | Signals | `ISignalsAdapter.cs`, Evidence window configuration | - | `/reachability` | Implemented |
|
||||||
|
|
||||||
|
### CLI Commands (Reachability)
|
||||||
|
|
||||||
|
| Command | Description | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| `stella reachgraph slice` | Query slice of reachability graph (--cve, --purl, --entrypoint, --depth) | Implemented |
|
||||||
|
| `stella reachgraph replay` | Replay reachability analysis for verification | Implemented |
|
||||||
|
| `stella reachgraph verify` | Verify graph integrity | Implemented |
|
||||||
|
| `stella reachability show` | Display reachability subgraph (table, json, dot, mermaid) | Implemented |
|
||||||
|
| `stella reachability export` | Export reachability data | Implemented |
|
||||||
|
| `stella scan entrytrace` | Show entry trace summary with semantic analysis | Implemented |
|
||||||
|
| `stella witness` | Path witness operations | Implemented |
|
||||||
|
| `stella drift` | Reachability drift detection | Implemented |
|
||||||
|
|
||||||
|
### UI Routes (Reachability)
|
||||||
|
|
||||||
|
| Route | Feature | Status |
|
||||||
|
|-------|---------|--------|
|
||||||
|
| `/reachability` | Reachability center - analysis and coverage | Implemented |
|
||||||
|
| `/graph` | Interactive dependency graph with reachability overlay | Implemented |
|
||||||
|
|
||||||
|
### Key Implementation Details
|
||||||
|
|
||||||
|
**Reachability Lattice (8 States):**
|
||||||
|
1. Unknown (0.00-0.29 confidence)
|
||||||
|
2. StaticReachable (0.30-0.49)
|
||||||
|
3. StaticUnreachable (0.50-0.69)
|
||||||
|
4. RuntimeObserved (0.70-0.89)
|
||||||
|
5. RuntimeUnobserved (0.70-0.89)
|
||||||
|
6. ConfirmedReachable (0.90-1.00)
|
||||||
|
7. ConfirmedUnreachable (0.90-1.00)
|
||||||
|
8. Contested (static/runtime conflict)
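
A compact sketch of the eight states and the confidence bands listed above follows. It is illustrative only: the real `ReachabilityLattice` also tracks evidence provenance, and Contested is driven by conflicting static/runtime evidence rather than a numeric band.

```csharp
// Mirrors the confidence bands from the list above.
public enum ReachabilityState
{
    Unknown, StaticReachable, StaticUnreachable,
    RuntimeObserved, RuntimeUnobserved,
    ConfirmedReachable, ConfirmedUnreachable, Contested
}

public static class ReachabilityBands
{
    public static (double Min, double Max) Band(ReachabilityState state) => state switch
    {
        ReachabilityState.Unknown => (0.00, 0.29),
        ReachabilityState.StaticReachable => (0.30, 0.49),
        ReachabilityState.StaticUnreachable => (0.50, 0.69),
        ReachabilityState.RuntimeObserved or ReachabilityState.RuntimeUnobserved => (0.70, 0.89),
        ReachabilityState.ConfirmedReachable or ReachabilityState.ConfirmedUnreachable => (0.90, 1.00),
        // Contested is decided by conflicting evidence, not a confidence range.
        _ => (0.00, 1.00),
    };
}
```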
|
||||||
|
|
||||||
|
**Entrypoint Framework Types Detected:**
|
||||||
|
- HTTP Handlers (Spring MVC, JAX-RS, Micronaut, GraphQL)
|
||||||
|
- Message Handlers (Kafka, RabbitMQ, JMS)
|
||||||
|
- Scheduled Jobs (Spring @Scheduled, Micronaut, JAX-EJB)
|
||||||
|
- gRPC Methods (Spring Boot gRPC, Netty gRPC)
|
||||||
|
- Event Handlers (Spring @EventListener)
|
||||||
|
- CLI Commands (main() method)
|
||||||
|
- Servlet Handlers (HttpServlet subclass)
|
||||||
|
|
||||||
|
### Coverage Gaps (Reachability)
|
||||||
|
|
||||||
|
| Feature | Has CLI | Has UI | Notes |
|
||||||
|
|---------|---------|--------|-------|
|
||||||
|
| Runtime Signal Correlation | No | Yes | Consider CLI for signal inspection |
|
||||||
|
| Gate Detection | No | Yes | Guard conditions visible in reachability UI |
|
||||||
|
| Path Witness Generation | Yes | No | Consider UI visualization of witness paths |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Binary Analysis (BinaryIndex)
|
||||||
|
|
||||||
|
| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|
||||||
|
|---------|-------|--------|-----------|-----|----|----|
|
||||||
|
| Binary Identity Extraction | Free/Pro/Ent | BinaryIndex | `BinaryIdentity.cs`, `IBinaryFeatureExtractor.cs` | `stella binary inspect` | `/analyze/patch-map` | Implemented |
|
||||||
|
| Build-ID Vulnerability Lookup | Free/Pro/Ent | BinaryIndex | `IBinaryVulnerabilityService.cs`, `ResolutionController.cs` | `stella binary lookup` | `/analyze/patch-map` | Implemented |
|
||||||
|
| Debian/Ubuntu Corpus | Free/Pro/Ent | BinaryIndex | `DebianCorpusConnector.cs`, `CorpusIngestionService.cs` | - | - | Implemented |
|
||||||
|
| RPM/RHEL Corpus | Pro/Ent | BinaryIndex | `RpmCorpusConnector.cs` | - | - | Implemented |
|
||||||
|
| Patch-Aware Backport Detection | Pro/Ent | BinaryIndex | `IFixIndexBuilder.cs`, `FixEvidence.cs`, `DebianChangelogParser.cs` | `stella patch-verify` | - | Implemented |
|
||||||
|
| PE/Mach-O/ELF Parsers | Pro/Ent | BinaryIndex | Binary format detection in `BinaryIdentity.cs` | `stella binary inspect` | - | Implemented |
|
||||||
|
| Binary Fingerprint Generation | Enterprise | BinaryIndex | `IVulnFingerprintGenerator.cs`, `BasicBlockFingerprintGenerator.cs`, `ControlFlowGraphFingerprintGenerator.cs`, `StringRefsFingerprintGenerator.cs` | `stella binary fingerprint` | - | Implemented |
|
||||||
|
| Fingerprint Matching Engine | Enterprise | BinaryIndex | `IFingerprintMatcher.cs`, `FingerprintMatcher.cs` | `stella binary lookup --fingerprint` | - | Implemented |
|
||||||
|
| DWARF/Symbol Analysis | Enterprise | BinaryIndex | Symbol extraction in corpus functions | `stella binary symbols` | - | Implemented |
|
||||||
|
|
||||||
|
### CLI Commands (Binary)
|
||||||
|
|
||||||
|
| Command | Description | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| `stella binary inspect` | Inspect binary identity (Build-ID, hashes, architecture) | Implemented |
|
||||||
|
| `stella binary lookup` | Lookup vulnerabilities by binary identity/fingerprint | Implemented |
|
||||||
|
| `stella binary symbols` | Extract symbols from binary | Implemented |
|
||||||
|
| `stella binary fingerprint` | Generate fingerprints for binary functions | Implemented |
|
||||||
|
| `stella binary verify` | Verify binary match evidence | Implemented |
|
||||||
|
| `stella binary submit` | Submit binary for analysis | Implemented |
|
||||||
|
| `stella binary info` | Get binary analysis info | Implemented |
|
||||||
|
| `stella binary callgraph` | Extract call graph digest | Implemented |
|
||||||
|
| `stella scan diff` | Binary diff analysis | Implemented |
|
||||||
|
| `stella patch-verify` | Patch verification for backport detection | Implemented |
|
||||||
|
| `stella patch-attest` | Patch attestation operations | Implemented |
|
||||||
|
| `stella deltasig` | Delta signature operations | Implemented |
|
||||||
|
|
||||||
|
### UI Routes (Binary)
|
||||||
|
|
||||||
|
| Route | Feature | Status |
|
||||||
|
|-------|---------|--------|
|
||||||
|
| `/analyze/patch-map` | Fleet-wide binary patch coverage heatmap | Implemented |
|
||||||
|
|
||||||
|
### Key Implementation Details
|
||||||
|
|
||||||
|
**Fingerprint Algorithms (4 types):**
|
||||||
|
1. **BasicBlock** - Instruction-level basic block hashing (16 bytes)
|
||||||
|
2. **ControlFlowGraph** - Weisfeiler-Lehman graph hash (32 bytes)
|
||||||
|
3. **StringRefs** - String reference pattern hash (16 bytes)
|
||||||
|
4. **Combined** - Multi-algorithm ensemble
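
The BasicBlock entry above is essentially a truncated digest over a normalized instruction sequence. The sketch below shows only that shape; the real `BasicBlockFingerprintGenerator` normalizes operands and offsets before hashing, which is omitted here.

```csharp
// Illustrative 16-byte basic-block fingerprint: hash the ordered, pre-normalized
// instruction mnemonics and keep the first 16 bytes of the SHA-256 digest.
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

public static class BasicBlockFingerprintSketch
{
    public static byte[] Fingerprint(IEnumerable<string> normalizedInstructions)
    {
        // Join instructions in order so that equivalent blocks hash identically.
        var canonical = string.Join("\n", normalizedInstructions);
        var digest = SHA256.HashData(Encoding.UTF8.GetBytes(canonical));
        return digest[..16]; // 16-byte fingerprint, matching the size listed above
    }
}
```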
|
||||||
|
|
||||||
|
**Fix Detection Methods:**
|
||||||
|
1. SecurityFeed - Official OVAL, DSA feeds
|
||||||
|
2. Changelog - Debian/Ubuntu changelog parsing
|
||||||
|
3. PatchHeader - DEP-3 patch header extraction
|
||||||
|
4. UpstreamPatchMatch - Upstream patch database
|
||||||
|
|
||||||
|
**Supported Distributions:**
|
||||||
|
- Debian, Ubuntu (DebianCorpusConnector)
|
||||||
|
- RHEL, Fedora, CentOS, Rocky, AlmaLinux (RpmCorpusConnector)
|
||||||
|
- Alpine Linux (AlpineCorpusConnector)
|
||||||
|
|
||||||
|
### Coverage Gaps (Binary)
|
||||||
|
|
||||||
|
| Feature | Has CLI | Has UI | Notes |
|
||||||
|
|---------|---------|--------|-------|
|
||||||
|
| Debian/Ubuntu Corpus | No | No | Internal corpus management - admin only |
|
||||||
|
| RPM/RHEL Corpus | No | No | Internal corpus management - admin only |
|
||||||
|
| Fingerprint Generation | Yes | No | Consider UI for fingerprint visualization |
|
||||||
|
| Corpus Ingestion | No | No | Admin operation - consider ops UI |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Advisory Sources (Concelier)
|
||||||
|
|
||||||
|
| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|
||||||
|
|---------|-------|--------|-----------|-----|----|----|
|
||||||
|
| NVD | Free/Pro/Ent | Concelier | `NvdConnector.cs`, `NvdMapper.cs` | `stella db fetch nvd` | `/concelier` | Implemented |
|
||||||
|
| GHSA | Free/Pro/Ent | Concelier | `GhsaConnector.cs` (GraphQL, rate limits) | `stella db fetch ghsa` | `/concelier` | Implemented |
|
||||||
|
| OSV | Free/Pro/Ent | Concelier | `OsvConnector.cs` (multi-ecosystem) | `stella db fetch osv` | `/concelier` | Implemented |
|
||||||
|
| Alpine SecDB | Free/Pro/Ent | Concelier | `Connector.Distro.Alpine/` | `stella db fetch alpine` | `/concelier` | Implemented |
|
||||||
|
| Debian Security Tracker | Free/Pro/Ent | Concelier | `Connector.Distro.Debian/` (DSA, EVR) | `stella db fetch debian` | `/concelier` | Implemented |
|
||||||
|
| Ubuntu USN | Free/Pro/Ent | Concelier | `Connector.Distro.Ubuntu/` | `stella db fetch ubuntu` | `/concelier` | Implemented |
|
||||||
|
| RHEL/CentOS OVAL | Pro/Ent | Concelier | `Connector.Distro.RedHat/` (OVAL, NEVRA) | `stella db fetch redhat` | `/concelier` | Implemented |
|
||||||
|
| KEV (Exploited Vulns) | Free/Pro/Ent | Concelier | `KevConnector.cs` (CISA catalog) | `stella db fetch kev` | `/concelier` | Implemented |
|
||||||
|
| EPSS v4 | Free/Pro/Ent | Concelier | `Connector.Epss/` | `stella db fetch epss` | `/concelier` | Implemented |
|
||||||
|
| Custom Advisory Connectors | Enterprise | Concelier | `IFeedConnector` interface | - | `/admin` | Implemented |
|
||||||
|
| Advisory Merge Engine | Enterprise | Concelier | `AdvisoryPrecedenceMerger.cs`, `AffectedPackagePrecedenceResolver.cs` | `stella db merge` | - | Implemented |
|
||||||
|
|
||||||
|
### CLI Commands (Advisory)
|
||||||
|
|
||||||
|
| Command | Description | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| `stella db fetch` | Trigger connector fetch/parse/map | Implemented |
|
||||||
|
| `stella db merge` | Run canonical merge reconciliation | Implemented |
|
||||||
|
| `stella db export` | Run Concelier export jobs | Implemented |
|
||||||
|
| `stella sources ingest` | Validate source documents | Implemented |
|
||||||
|
| `stella feeds snapshot` | Create/list/export/import feed snapshots | Implemented |
|
||||||
|
| `stella advisory` | Advisory listing and search | Implemented |
|
||||||
|
| `stella admin feeds` | Feed management (admin) | Implemented |
|
||||||
|
|
||||||
|
### UI Routes (Advisory)
|
||||||
|
|
||||||
|
| Route | Feature | Status |
|
||||||
|
|-------|---------|--------|
|
||||||
|
| `/concelier/trivy-db-settings` | Trivy vulnerability database configuration | Implemented |
|
||||||
|
| `/ops/feeds` | Feed mirror dashboard and air-gap bundles | Implemented |
|
||||||
|
|
||||||
|
### Key Implementation Details
|
||||||
|
|
||||||
|
**Source Precedence (Lower = Higher Priority):**
|
||||||
|
- **Rank 0:** redhat, ubuntu, debian, suse, alpine (distro PSIRTs)
|
||||||
|
- **Rank 1:** msrc, oracle, adobe, apple, cisco, vmware (vendor PSIRTs)
|
||||||
|
- **Rank 2:** ghsa, osv (ecosystem registries)
|
||||||
|
- **Rank 3:** jvn, acsc, cccs, cert-fr, cert-in, certbund, ru-bdu, kisa (regional CERTs)
|
||||||
|
- **Rank 4:** kev (exploit annotations)
|
||||||
|
- **Rank 5:** nvd (baseline)
|
||||||
|
|
||||||
|
**Version Comparators:**
|
||||||
|
- NEVRA (RPM): epoch:version-release with rpmvercmp
|
||||||
|
- EVR (Debian/Ubuntu): epoch:upstream_version-debian_revision
|
||||||
|
- APK (Alpine): `-r<pkgrel>` with suffix ordering
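
As a worked example for the EVR comparator, the sketch below splits an `epoch:upstream_version-debian_revision` string into its components. The dpkg ordering rules for comparing the upstream/revision parts (digit runs compared numerically, non-digit runs lexically, `~` sorting before everything) are noted but not implemented here, and the missing-revision default is a simplification.

```csharp
// Hedged sketch of EVR parsing for Debian/Ubuntu version strings.
public readonly record struct DebianEvr(int Epoch, string Upstream, string Revision)
{
    public static DebianEvr Parse(string evr)
    {
        var epoch = 0;
        var rest = evr;

        // Epoch is everything before the first ':' (defaults to 0 when absent).
        var colon = rest.IndexOf(':');
        if (colon >= 0)
        {
            epoch = int.Parse(rest[..colon]);
            rest = rest[(colon + 1)..];
        }

        // The revision is everything after the *last* hyphen, because upstream
        // versions may themselves contain hyphens. Missing revision: treated
        // here as "0" for simplicity.
        var dash = rest.LastIndexOf('-');
        return dash >= 0
            ? new DebianEvr(epoch, rest[..dash], rest[(dash + 1)..])
            : new DebianEvr(epoch, rest, "0");
    }
}

// Example: Parse("1:2.36.1-8+deb11u1") => Epoch 1, Upstream "2.36.1", Revision "8+deb11u1"
```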
|
||||||
|
|
||||||
|
### Coverage Gaps (Advisory)
|
||||||
|
|
||||||
|
| Feature | Has CLI | Has UI | Notes |
|
||||||
|
|---------|---------|--------|-------|
|
||||||
|
| Advisory Merge Engine | Yes | No | Consider merge status UI |
|
||||||
|
| Custom Connectors | No | No | Enterprise feature - needs admin UI |
|
||||||
|
| Feed Scheduling | No | Partial | Consider `stella feeds schedule` command |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## VEX Processing (Excititor, VexLens, VexHub, IssuerDirectory)
|
||||||
|
|
||||||
|
| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|
||||||
|
|---------|-------|--------|-----------|-----|----|----|
|
||||||
|
| OpenVEX Format Support | Free/Pro/Ent | Excititor | `Formats.OpenVEX/`, `OpenVexParser.cs` | `stella vex` | `/vex` | Implemented |
|
||||||
|
| CycloneDX VEX Format | Free/Pro/Ent | Excititor | `Formats.CycloneDX/` | `stella vex` | `/vex` | Implemented |
|
||||||
|
| CSAF Format Support | Free/Pro/Ent | Excititor | `Formats.CSAF/` | `stella vex` | `/vex` | Implemented |
|
||||||
|
| VEX Ingestion API | Free/Pro/Ent | Excititor | `IngestEndpoints.cs`, `IVexObservationQueryService.cs` | - | `/vex` | Implemented |
|
||||||
|
| VEX Observation Store | Free/Pro/Ent | Excititor | `VexObservationQueryService.cs`, AOC-compliant storage | - | - | Implemented |
|
||||||
|
| VEX Consensus Engine | Pro/Ent | VexLens | `VexConsensusEngine.cs`, `IVexConsensusEngine.cs` | `stella vex consensus` | `/vex` | Implemented |
|
||||||
|
| Trust Weight Scoring | Pro/Ent | VexLens | `ITrustWeightEngine.cs`, `TrustDecayService.cs` | - | `/vex` | Implemented |
|
||||||
|
| Issuer Trust Registry | Pro/Ent | IssuerDirectory | Full issuer CRUD and key management | - | `/issuer-directory` | Implemented |
|
||||||
|
| VEX Distribution Hub | Enterprise | VexHub | `IVexIngestionService.cs`, `IVexExportService.cs` | - | - | Implemented |
|
||||||
|
| VEX Gate Integration | Pro/Ent | Scanner | `IVexGateService.cs`, `VexGateScanCommandGroup.cs` | `stella scan gate-policy` | `/findings` | Implemented |
|
||||||
|
| VEX from Drift Generation | Pro/Ent | CLI | `VexGenCommandGroup.cs` | `stella vex gen --from-drift` | - | Implemented |
|
||||||
|
| Conflict Detection | Pro/Ent | VexLens, Excititor | `VexLinksetDisagreementService.cs`, `NoiseGateService.cs` | - | `/vex` | Implemented |
|
||||||
|
|
||||||
|
### CSAF Provider Connectors
|
||||||
|
|
||||||
|
| Connector | Module | Key Files | CLI | Status |
|
||||||
|
|-----------|--------|-----------|-----|--------|
|
||||||
|
| Red Hat CSAF | Excititor | `Connectors.RedHat.CSAF/` | - | Implemented |
|
||||||
|
| Ubuntu CSAF | Excititor | `Connectors.Ubuntu.CSAF/` | - | Implemented |
|
||||||
|
| Oracle CSAF | Excititor | `Connectors.Oracle.CSAF/` | - | Implemented |
|
||||||
|
| Microsoft MSRC CSAF | Excititor | `Connectors.MSRC.CSAF/` | - | Implemented |
|
||||||
|
| Cisco CSAF | Excititor | `Connectors.Cisco.CSAF/` | - | Implemented |
|
||||||
|
| SUSE RancherVEXHub | Excititor | `Connectors.SUSE.RancherVEXHub/` | - | Implemented |
|
||||||
|
| OCI OpenVEX Attestation | Excititor | `Connectors.OCI.OpenVEX.Attest/` | - | Implemented |
|
||||||
|
|
||||||
|
### CLI Commands (VEX)
|
||||||
|
|
||||||
|
| Command | Description | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| `stella vex consensus` | Query VexLens consensus (--query, --output json/ndjson/table) | Implemented |
|
||||||
|
| `stella vex get` | Fetch single consensus record with rationale | Implemented |
|
||||||
|
| `stella vex simulate` | Test VEX policy decisions (aggregation-only) | Implemented |
|
||||||
|
| `stella vex gen --from-drift` | Generate VEX from container drift analysis | Implemented |
|
||||||
|
| `stella scan gate-policy` | VEX gate evaluation for findings | Implemented |
|
||||||
|
|
||||||
|
### UI Routes (VEX)
|
||||||
|
|
||||||
|
| Route | Feature | Status |
|
||||||
|
|-------|---------|--------|
|
||||||
|
| `/vex` | VEX consensus and statement browser | Implemented |
|
||||||
|
| `/issuer-directory` | Issuer trust registry management | Implemented |
|
||||||
|
| `/findings` (VEX overlay) | VEX status overlay on findings | Implemented |
|
||||||
|
|
||||||
|
### Key Implementation Details
|
||||||
|
|
||||||
|
**Consensus Lattice States:**
|
||||||
|
- `unknown` (0.00) - No information
|
||||||
|
- `under_investigation` (0.25) - Being analyzed
|
||||||
|
- `not_affected` (0.50) - Confirmed not vulnerable
|
||||||
|
- `affected` (0.75) - Confirmed vulnerable
|
||||||
|
- `fixed` (1.00) - Patch applied
|
||||||
|
|
||||||
|
**Trust Weight Factors (9 total):**
|
||||||
|
1. Issuer tier (critical/high/medium/low)
|
||||||
|
2. Confidence score (0-1)
|
||||||
|
3. Cryptographic attestation status
|
||||||
|
4. Statement age (freshness decay)
|
||||||
|
5. Patch applicability
|
||||||
|
6. Source authority scope (PURL patterns)
|
||||||
|
7. Key lifecycle status
|
||||||
|
8. Justification quality
|
||||||
|
9. Historical accuracy
|
||||||
|
|
||||||
|
**AOC (Aggregation-Only Contract):**
|
||||||
|
- Raw VEX stored verbatim with provenance
|
||||||
|
- No derived data at ingest time
|
||||||
|
- Linkset-only references
|
||||||
|
- Roslyn analyzers enforce compliance
|
||||||
|
|
||||||
|
**Determinism Guarantees:**
|
||||||
|
- RFC 8785 canonical JSON serialization
|
||||||
|
- Stable ordering (timestamp DESC, source ASC, hash ASC)
|
||||||
|
- UTC ISO-8601 timestamps
|
||||||
|
- SHA-256 consensus digests
|
||||||
|
|
||||||
|
### Coverage Gaps (VEX)
|
||||||
|
|
||||||
|
| Feature | Has CLI | Has UI | Notes |
|
||||||
|
|---------|---------|--------|-------|
|
||||||
|
| CSAF Provider Connectors | No | No | Internal connector management |
|
||||||
|
| Trust Weight Configuration | No | Partial | Consider CLI for trust weight tuning |
|
||||||
|
| VEX Distribution Webhooks | No | No | VexHub webhook config needs exposure |
|
||||||
|
| Conflict Resolution UI | No | Partial | Interactive conflict resolution would help |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Policy Engine (Policy, RiskEngine)
|
||||||
|
|
||||||
|
| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|
||||||
|
|---------|-------|--------|-----------|-----|----|----|
|
||||||
|
| K4 Lattice Logic | Pro/Ent | Policy | `K4Lattice.cs`, `TrustLatticeEngine.cs` | - | `/policy` | Implemented |
|
||||||
|
| Policy Gate Evaluation | Free/Pro/Ent | Policy | `PolicyGateEvaluator.cs`, `IPolicyGate.cs` | `stella policy simulate` | `/policy` | Implemented |
|
||||||
|
| Evidence Gate | Free/Pro/Ent | Policy | `EvidenceGate.cs` | - | `/policy` | Implemented |
|
||||||
|
| VEX Trust Gate | Pro/Ent | Policy | `VexTrustGate.cs`, `VexProofSpineService.cs` | - | `/policy` | Implemented |
|
||||||
|
| Confidence Gate | Pro/Ent | Policy | `MinimumConfidenceGate.cs` | - | `/policy` | Implemented |
|
||||||
|
| Exception Management | Pro/Ent | Policy | `IExceptionService.cs`, `ExceptionAdapter.cs` | - | `/policy/exceptions` | Implemented |
|
||||||
|
| Risk Scoring (6 providers) | Pro/Ent | RiskEngine | `IRiskScoreProvider.cs`, `CvssKevProvider.cs` | - | `/risk` | Implemented |
|
||||||
|
| Verdict Attestations | Enterprise | Policy | `IVerdictAttestationService.cs`, `IPolicyDecisionAttestationService.cs` | - | - | Implemented |
|
||||||
|
| Policy Simulation | Pro/Ent | Policy | `IPolicySimulationService.cs` | `stella policy simulate` | `/policy/simulate` | Implemented |
|
||||||
|
| Sealed Mode (Air-Gap) | Enterprise | Policy | `ISealedModeService.cs` | - | `/ops` | Implemented |
|
||||||
|
| Determinization System | Pro/Ent | Policy | `UncertaintyScoreCalculator.cs`, `DecayedConfidenceCalculator.cs` | - | - | Implemented |
|
||||||
|
| Score Policy (YAML) | Pro/Ent | Policy | `ScorePolicyService.cs`, `ScorePolicyModels.cs` | `stella policy validate` | `/policy` | Implemented |
|
||||||
|
|
||||||
|
### K4 Lattice (Belnap Four-Valued Logic)
|
||||||
|
|
||||||
|
| State | Symbol | Description |
|
||||||
|
|-------|--------|-------------|
|
||||||
|
| Unknown | ⊥ | No evidence available |
|
||||||
|
| True | T | Evidence supports true |
|
||||||
|
| False | F | Evidence supports false |
|
||||||
|
| Conflict | ⊤ | Credible evidence for both (contested) |
|
||||||
|
|
||||||
|
**Operations:**
|
||||||
|
- `Join(a, b)` - Knowledge union (monotone aggregation)
|
||||||
|
- `Meet(a, b)` - Knowledge intersection (dependency chains)
|
||||||
|
- `Negate(v)` - Swaps True ↔ False
|
||||||
|
- `FromSupport(hasTrueSupport, hasFalseSupport)` - Constructs K4 from claims
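
A minimal sketch of the four values and the operations listed above, under the knowledge ordering (Unknown at the bottom, Conflict at the top). This illustrates Belnap logic in general and is not the actual `K4Lattice.cs` implementation.

```csharp
public enum K4 { Unknown, True, False, Conflict }

public static class K4Ops
{
    // Knowledge union: combining evidence can only add knowledge (monotone).
    public static K4 Join(K4 a, K4 b)
    {
        if (a == b) return a;
        if (a == K4.Unknown) return b;
        if (b == K4.Unknown) return a;
        return K4.Conflict; // True joined with False is contested
    }

    // Knowledge intersection: keep only what both sides agree is known.
    public static K4 Meet(K4 a, K4 b)
    {
        if (a == b) return a;
        if (a == K4.Conflict) return b;
        if (b == K4.Conflict) return a;
        return K4.Unknown; // True met with False leaves no shared knowledge
    }

    public static K4 Negate(K4 v) => v switch
    {
        K4.True => K4.False,
        K4.False => K4.True,
        _ => v, // Unknown and Conflict are self-dual
    };

    public static K4 FromSupport(bool hasTrueSupport, bool hasFalseSupport) =>
        (hasTrueSupport, hasFalseSupport) switch
        {
            (true, true) => K4.Conflict,
            (true, false) => K4.True,
            (false, true) => K4.False,
            _ => K4.Unknown,
        };
}
```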
|
||||||
|
|
||||||
|
### Policy Gate Types (10+)
|
||||||
|
|
||||||
|
| Gate | Purpose |
|
||||||
|
|------|---------|
|
||||||
|
| Evidence Gate | Validates sufficient evidence backing |
|
||||||
|
| Lattice State Gate | Reachability lattice states (U, SR, SU, RO, RU, CR, CU, X) |
|
||||||
|
| VEX Trust Gate | Confidence-based VEX scoring |
|
||||||
|
| Uncertainty Tier Gate | T1-T4 uncertainty classification |
|
||||||
|
| Minimum Confidence Gate | Enforces confidence floors |
|
||||||
|
| Evidence Freshness Gate | Staleness checks |
|
||||||
|
| VEX Proof Gate | Validates VEX proof chains |
|
||||||
|
| Reachability Requirement Gate | Reachability evidence |
|
||||||
|
| Facet Quota Gate | Facet-based quotas |
|
||||||
|
| Source Quota Gate | Source credibility quotas |
|
||||||
|
| Unknowns Budget Gate | Limits unknown assertions |
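
The gates above share a common pattern: each inspects the evidence attached to a finding and returns pass/fail with a reason, and the evaluator short-circuits on the first failure. The sketch below is a hedged illustration of that shape; the member names are assumptions, not the real `IPolicyGate` contract.

```csharp
using System.Collections.Generic;

public sealed record GateResult(bool Passed, string GateName, string Reason);

// Hypothetical gate contract: the real IPolicyGate.cs may differ.
public interface IPolicyGateSketch
{
    string Name { get; }
    GateResult Evaluate(IReadOnlyDictionary<string, object> evidence);
}

public static class GateEvaluatorSketch
{
    public static GateResult Evaluate(
        IEnumerable<IPolicyGateSketch> gates,
        IReadOnlyDictionary<string, object> evidence)
    {
        foreach (var gate in gates)
        {
            var result = gate.Evaluate(evidence);
            if (!result.Passed) return result; // first failing gate blocks the verdict
        }

        return new GateResult(true, "all-gates", "all configured gates passed");
    }
}
```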
|
||||||
|
|
||||||
|
### Risk Score Providers (6)
|
||||||
|
|
||||||
|
| Provider | Key Files | Purpose |
|
||||||
|
|----------|-----------|---------|
|
||||||
|
| CVSS/KEV | `CvssKevProvider.cs` | CVSS + Known Exploited Vulns |
|
||||||
|
| EPSS | `EpssProvider.cs` | Exploit Prediction Scoring |
|
||||||
|
| FixChain | `FixChainRiskProvider.cs` | Fix availability and timeline |
|
||||||
|
| FixExposure | `FixExposureProvider.cs` | Patch adoption curves |
|
||||||
|
| VexGate | `VexGateProvider.cs` | VEX decisions as risk gates |
|
||||||
|
| DefaultTransforms | `DefaultTransformsProvider.cs` | Signal normalization |
|
||||||
|
|
||||||
|
### Determinization Signal Weights
|
||||||
|
|
||||||
|
| Signal | Weight |
|
||||||
|
|--------|--------|
|
||||||
|
| VEX | 35% |
|
||||||
|
| Reachability | 25% |
|
||||||
|
| Runtime | 15% |
|
||||||
|
| EPSS | 10% |
|
||||||
|
| Backport | 10% |
|
||||||
|
| SBOM Lineage | 5% |
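
As a worked example, the weights above blend into a single determinization score as a simple weighted sum over normalized signals. The real system layers uncertainty tiers and confidence decay (`UncertaintyScoreCalculator`, `DecayedConfidenceCalculator`) on top of this; the sketch only shows the blend.

```csharp
// Each signal is assumed to be normalized to [0, 1] before blending.
public sealed record DeterminizationSignals(
    double Vex, double Reachability, double Runtime,
    double Epss, double Backport, double SbomLineage);

public static class DeterminizationSketch
{
    public static double Blend(DeterminizationSignals s) =>
        0.35 * s.Vex +
        0.25 * s.Reachability +
        0.15 * s.Runtime +
        0.10 * s.Epss +
        0.10 * s.Backport +
        0.05 * s.SbomLineage;
}
```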
|
||||||
|
|
||||||
|
### Score Policy Weights (Basis Points)
|
||||||
|
|
||||||
|
| Dimension | Default Weight |
|
||||||
|
|-----------|---------------|
|
||||||
|
| Base Severity | 10% (1000 BPS) |
|
||||||
|
| Reachability | 45% (4500 BPS) |
|
||||||
|
| Evidence | 30% (3000 BPS) |
|
||||||
|
| Provenance | 15% (1500 BPS) |
|
||||||
|
|
||||||
|
### CLI Commands (Policy)
|
||||||
|
|
||||||
|
| Command | Description | Status |
|
||||||
|
|---------|-------------|--------|
|
||||||
|
| `stella policy validate <path>` | Validate policy YAML (--schema, --strict) | Implemented |
|
||||||
|
| `stella policy install <pack>` | Install policy pack (--version, --env) | Implemented |
|
||||||
|
| `stella policy list` | List installed policies | Implemented |
|
||||||
|
| `stella policy simulate` | Simulate policy decisions | Implemented |
|
||||||
|
|
||||||
|
### UI Routes (Policy)
|
||||||
|
|
||||||
|
| Route | Feature | Status |
|
||||||
|
|-------|---------|--------|
|
||||||
|
| `/policy` | Policy management and evaluation | Implemented |
|
||||||
|
| `/policy/exceptions` | Exception management | Implemented |
|
||||||
|
| `/policy/simulate` | Policy simulation runner | Implemented |
|
||||||
|
| `/risk` | Risk scoring dashboard | Implemented |
|
||||||
|
|
||||||
|
### API Endpoints (45+)
|
||||||
|
|
||||||
|
**Core:**
|
||||||
|
- `/policy/eval/batch` - Batch evaluation
|
||||||
|
- `/policy/packs` - Policy pack management
|
||||||
|
- `/policy/runs` - Run lifecycle
|
||||||
|
- `/policy/decisions` - Decision queries
|
||||||
|
|
||||||
|
**Simulation:**
|
||||||
|
- `/policy/simulate` - Policy simulation
|
||||||
|
- `/policy/merge-preview` - Merge preview
|
||||||
|
- `/overlay-simulation` - Overlay projection
|
||||||
|
|
||||||
|
**Governance:**
|
||||||
|
- `/api/v1/policy/registry/packs` - Pack registry
|
||||||
|
- `/api/v1/policy/registry/promote` - Promotion workflows
|
||||||
|
- `/api/v1/policy/registry/publish` - Publishing pipelines
|
||||||
|
|
||||||
|
### Coverage Gaps (Policy)
|
||||||
|
|
||||||
|
| Feature | Has CLI | Has UI | Notes |
|
||||||
|
|---------|---------|--------|-------|
|
||||||
|
| K4 Lattice Debug | No | Partial | Consider `stella policy lattice explain` |
|
||||||
|
| Risk Provider Config | No | No | Provider-level configuration needs exposure |
|
||||||
|
| Exception Approval API | No | Yes | Consider `stella policy exception approve` |
|
||||||
|
| Determinization Tuning | No | No | Signal weights should be configurable |
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Attestation & Signing (Attestor, Signer, Provenance)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| DSSE Envelope Handling | Free/Pro/Ent | Attestor | `DsseHelper.cs`, `DsseEnvelope.cs`, `DsseVerifier.cs` | `stella attest` | `/attestations` | Implemented |
| In-Toto Statement Format | Free/Pro/Ent | Attestor | `InTotoStatement.cs`, `IInTotoLinkSigningService.cs` | `stella attest attach` | - | Implemented |
| SPDX SBOM Predicates | Free/Pro/Ent | Attestor | `SpdxPredicateParser.cs` | `stella attest attach` | - | Implemented |
| CycloneDX SBOM Predicates | Free/Pro/Ent | Attestor | `CycloneDxPredicateParser.cs` | `stella attest attach` | - | Implemented |
| SLSA Provenance Predicates | Pro/Ent | Attestor | `SlsaProvenancePredicateParser.cs` | `stella attest attach` | - | Implemented |
| Keyless Signing (Fulcio) | Pro/Ent | Signer | `KeylessDsseSigner.cs`, `HttpFulcioClient.cs` | `stella sign keyless` | - | Implemented |
| Rekor Transparency Log | Pro/Ent | Signer, Attestor | `RekorHttpClient.cs`, `IRekorClient.cs` | `stella sign keyless --rekor` | - | Implemented |
| Key Rotation Service | Enterprise | Signer | `IKeyRotationService.cs`, `KeyRotationService.cs` | `/keys/rotate` endpoint | - | Implemented |
| Trust Anchor Management | Enterprise | Signer | `ITrustAnchorManager.cs`, `TrustAnchorManager.cs` | - | - | Implemented |
| Attestation Chains | Enterprise | Attestor | `AttestationChain.cs`, `AttestationChainBuilder.cs` | - | - | Implemented |
| Delta Attestations | Pro/Ent | Attestor | `IDeltaAttestationService.cs` (VEX/SBOM/Verdict/Reachability) | - | - | Implemented |
| Offline/Air-Gap Bundles | Enterprise | Attestor | `IAttestorBundleService.cs` | - | `/ops/offline-kit` | Implemented |

### Predicate Types (25+ Types)

**Standard Predicates:**

| Predicate | Parser | Purpose |
|-----------|--------|---------|
| SPDX | `SpdxPredicateParser.cs` | SBOM attestation (2.2/2.3/3.0.1) |
| CycloneDX | `CycloneDxPredicateParser.cs` | SBOM attestation (1.7) |
| SLSA Provenance | `SlsaProvenancePredicateParser.cs` | Build provenance (v1.0) |
| VEX Override | `VexOverridePredicateParser.cs` | VEX decision overrides |
| Binary Diff | `BinaryDiffPredicateBuilder.cs` | Binary change attestation |

**Stella-Ops Specific Predicates:**
- AIArtifactBasePredicate, AIAuthorityClassifier, AIExplanationPredicate
- AIPolicyDraftPredicate, AIRemediationPlanPredicate, AIVexDraftPredicate
- BinaryFingerprintEvidencePredicate, BudgetCheckPredicate, ChangeTracePredicate
- DeltaVerdictPredicate, EvidencePredicate, PolicyDecisionPredicate
- ProofSpinePredicate, ReachabilityDriftPredicate, ReachabilitySubgraphPredicate
- SbomDeltaPredicate, UnknownsBudgetPredicate, VerdictDeltaPredicate
- VexDeltaPredicate, VexPredicate, TrustVerdictPredicate, FixChainPredicate

### CLI Commands (Attestation & Signing)

| Command | Description | Status |
|---------|-------------|--------|
| `stella attest attach` | Attach DSSE attestation to OCI artifact | Implemented |
| `stella attest verify` | Verify attestations on OCI artifact | Implemented |
| `stella attest list` | List attestations on OCI artifact | Implemented |
| `stella attest fetch` | Fetch specific attestation by predicate type | Implemented |
| `stella attest fix-chain` | FixChain attestation command | Implemented |
| `stella attest patch` | Patch attestation command | Implemented |
| `stella sign keyless` | Sigstore keyless signing | Implemented |
| `stella sign verify-keyless` | Verify keyless signature | Implemented |

### Signing Modes

| Mode | Description | Key Files |
|------|-------------|-----------|
| Keyless | Fulcio-based ephemeral keys | `KeylessDsseSigner.cs` |
| KMS | External key management system | `CryptoDsseSigner.cs` |
| HMAC | HMAC-based signing | `HmacDsseSigner.cs` |

### Crypto Algorithm Support

| Algorithm | Files | Purpose |
|-----------|-------|---------|
| RSA | `CryptoDsseSigner.cs` | Traditional RSA signing |
| ECDSA | `CryptoDsseSigner.cs` | Elliptic curve signing |
| SM2 | `CryptoDsseSigner.cs` | Chinese national standard |

### API Endpoints (Attestor)

| Endpoint | Purpose |
|----------|---------|
| `/api/v1/anchors` | Attestation anchors |
| `/api/v1/bundles` | DSSE bundle operations |
| `/api/v1/chains` | Attestation chain queries |
| `/api/v1/proofs` | Proof operations |
| `/api/v1/verify` | Verification endpoints |

### API Endpoints (Signer)

| Endpoint | Purpose |
|----------|---------|
| `POST /sign` | Sign artifact |
| `POST /sign/verify` | Verify signature |
| `GET /keys` | List signing keys |
| `POST /keys/rotate` | Rotate signing key |
| `POST /keys/revoke` | Revoke signing key |

### Coverage Gaps (Attestation)

| Feature | Has CLI | Has UI | Notes |
|---------|---------|--------|-------|
| Key Rotation | No (API only) | No | Add `stella keys rotate` CLI |
| Trust Anchor Management | No | No | Consider trust anchor CLI |
| Attestation Chains UI | No | Partial | Chain visualization needed |
| Predicate Registry | No | No | Consider `stella attest predicates list` |

---

## Regional Crypto (Cryptography, SmRemote)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| EdDSA (Ed25519) Baseline | Free/Pro/Ent | Cryptography | `Ed25519Signer.cs`, `Ed25519Verifier.cs` | - | - | Implemented |
| ECDSA P-256 (FIPS) | Pro/Ent | Cryptography | `EcdsaP256Signer.cs` | - | - | Implemented |
| FIPS 140-2 Plugin | Enterprise | Cryptography | `FipsPlugin.cs` (RSA, ECDSA, AES) | - | - | Implemented |
| GOST R 34.10-2012 Plugin | Enterprise | Cryptography | `GostPlugin.cs` (256/512-bit) | - | - | Implemented |
| SM2/SM3/SM4 Plugin | Enterprise | Cryptography | `SmPlugin.cs` | - | - | Implemented |
| eIDAS Plugin | Enterprise | Cryptography | `EidasPlugin.cs` (CAdES, RFC 3161) | - | - | Implemented |
| HSM Plugin (PKCS#11) | Enterprise | Cryptography | `HsmPlugin.cs` | - | - | Implemented |
| CryptoPro GOST | Enterprise | Cryptography | `CryptoProGostCryptoProvider.cs` (Windows) | - | - | Implemented |
| SM Remote Service | Enterprise | SmRemote | `Program.cs` (SM2 signing service) | - | - | Implemented |
| Multi-Profile Signing | Enterprise | Cryptography | `MultiProfileSigner.cs` | - | - | Implemented |
| Post-Quantum (Defined) | Future | Cryptography | `SignatureProfile.cs` (Dilithium, Falcon) | - | - | Planned |

### Signature Profiles (8 Defined)

| Profile | Standard | Algorithm | Status |
|---------|----------|-----------|--------|
| EdDsa | RFC 8032 | Ed25519 | Implemented |
| EcdsaP256 | FIPS 186-4 | ES256 | Implemented |
| RsaPss | FIPS 186-4, RFC 8017 | PS256/384/512 | Implemented |
| Gost2012 | GOST R 34.10-2012 | GOST 256/512-bit | Implemented |
| SM2 | GM/T 0003.2-2012 | SM2-SM3 | Implemented |
| Eidas | ETSI TS 119 312 | RSA-SHA*, ECDSA-SHA* | Implemented |
| Dilithium | NIST PQC | CRYSTALS-Dilithium | Planned |
| Falcon | NIST PQC | Falcon-512/1024 | Planned |

### Regional Compliance Matrix

| Region | Standard | Plugin | Algorithms |
|--------|----------|--------|------------|
| US | FIPS 140-2 | FipsPlugin | RSA-SHA*, ECDSA-P256/384/521, AES-GCM |
| Russia | GOST R 34.10-2012 | GostPlugin, CryptoPro | GOST 256/512-bit signatures |
| China | GM/T 0003-0004 | SmPlugin, SmRemote | SM2, SM3, SM4-CBC/GCM |
| EU | eIDAS | EidasPlugin | CAdES-BES, XAdES-BES, RFC 3161 TSA |
| Hardware | PKCS#11 | HsmPlugin | HSM-RSA, HSM-ECDSA, HSM-AES |

### Key Service Interfaces

| Interface | Purpose |
|-----------|---------|
| `IContentSigner` | Core signing abstraction |
| `IContentVerifier` | Signature verification |
| `ICryptoCapability` | Plugin capability reporting |
| `IHsmClient` | HSM abstraction (simulated/PKCS#11) |

### Plugin Configuration Options

**FIPS Plugin:**
- RequireFipsMode, RsaKeySize (2048-4096), EcdsaCurve (P-256/384/521)

**GOST Plugin:**
- KeyStorePath, DefaultKeyId, PrivateKeyBase64, KeySize (256/512)

**SM Plugin:**
- PrivateKeyHex, GenerateKeyOnInit, UserId

**eIDAS Plugin:**
- CertificatePath, TimestampAuthorityUrl, ValidateCertificateChain

**HSM Plugin:**
- LibraryPath, SlotId, Pin, TokenLabel

### Coverage Gaps (Regional Crypto)

| Feature | Has CLI | Has UI | Notes |
|---------|---------|--------|-------|
| Crypto Profile Selection | No | No | Configuration-only, no CLI |
| Key Management | No | No | Plugin-specific configuration |
| Post-Quantum Crypto | No | No | Profiles defined but not implemented |
| HSM Status | No | No | Consider health check endpoint |

---

## Evidence & Findings (EvidenceLocker, Findings, ExportCenter)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| Sealed Evidence Bundles | Pro/Ent | EvidenceLocker | `S3EvidenceObjectStore.cs` (WORM) | `stella evidence export` | `/evidence-export` | Implemented |
| Verdict Attestations | Pro/Ent | EvidenceLocker | `VerdictEndpoints.cs`, `VerdictContracts.cs` | - | `/evidence-export` | Implemented |
| Append-Only Ledger | Pro/Ent | Findings | `ILedgerEventRepository.cs`, `LedgerEventModels.cs` | - | `/findings` | Implemented |
| Alert Triage Workflow | Pro/Ent | Findings | `DecisionModels.cs` (hot/warm/cold bands) | - | `/findings` | Implemented |
| Merkle Anchoring | Pro/Ent | Findings | `Infrastructure/Merkle/` | - | - | Implemented |
| Evidence Packs | Pro/Ent | Evidence.Pack | `IEvidencePackService.cs`, `EvidencePack.cs` | - | `/evidence-thread` | Implemented |
| Evidence Cards | Pro/Ent | Evidence.Pack | `IEvidenceCardService.cs`, `EvidenceCard.cs` | - | - | Implemented |
| Profile-Based Exports | Pro/Ent | ExportCenter | `ExportApiEndpoints.cs`, `ExportProfile` | - | `/evidence-export` | Implemented |
| Risk Bundle Export | Enterprise | ExportCenter | `RiskBundleEndpoints.cs` | - | `/evidence-export` | Implemented |
| Lineage Evidence Export | Enterprise | ExportCenter | `LineageExportEndpoints.cs` | - | `/lineage` | Implemented |
| Offline Verification | Enterprise | EvidenceLocker | `verify-offline.md` | `stella evidence verify --offline` | - | Implemented |

### CLI Commands (Evidence)

| Command | Description | Status |
|---------|-------------|--------|
| `stella evidence export` | Export evidence bundle (--bundle, --format, --compression) | Implemented |
| `stella evidence verify` | Verify bundle (--offline, --rekor-key) | Implemented |
| `stella evidence status` | Bundle status check | Implemented |

### UI Routes (Evidence)

| Route | Feature | Status |
|-------|---------|--------|
| `/evidence-export` | Evidence bundle management and export | Implemented |
| `/evidence-thread` | Evidence thread visualization | Implemented |
| `/findings` | Findings ledger with triage | Implemented |

---

## Determinism & Replay (Replay, Signals, HLC)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| Hybrid Logical Clock | Pro/Ent | HybridLogicalClock | `HybridLogicalClock.cs`, `HlcTimestamp.cs` | - | - | Implemented |
| Canonical JSON (RFC 8785) | Pro/Ent | Canonical.Json | `CanonJson.cs` | - | - | Implemented |
| Replay Manifests (V1/V2) | Pro/Ent | Replay.Core | `ReplayManifest.cs`, `KnowledgeSnapshot.cs` | `stella scan replay` | - | Implemented |
| Evidence Weighted Scoring | Pro/Ent | Signals | `EvidenceWeightedScoreCalculator.cs` (6 factors) | - | - | Implemented |
| Timeline Events | Pro/Ent | Eventing | `TimelineEvent.cs`, `ITimelineEventEmitter.cs` | - | - | Implemented |
| Replay Proofs | Pro/Ent | Replay.Core | `ReplayProof.cs`, `ReplayManifestValidator.cs` | `stella prove` | - | Implemented |
| Deterministic Event IDs | Pro/Ent | Eventing | `EventIdGenerator.cs` (SHA-256 based) | - | - | Implemented |
| Attested Reduction | Pro/Ent | Signals | Short-circuit rules for anchored VEX | - | - | Implemented |

### Evidence Weighted Scoring (6 Factors)

| Factor | Symbol | Weight | Description |
|--------|--------|--------|-------------|
| Reachability | RCH | Configurable | Static/runtime reachability |
| Runtime | RTS | Configurable | Runtime telemetry |
| Backport | BKP | Configurable | Backport evidence |
| Exploit | XPL | Configurable | Exploit likelihood (EPSS) |
| Source Trust | SRC | Configurable | Feed trustworthiness |
| Mitigations | MIT | Configurable | Mitigation evidence (reduces score) |

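A minimal sketch of how six weighted factors could combine, assuming positive factors add weighted contributions, mitigations subtract, and the result is clamped to `[0, 1]`; the actual `EvidenceWeightedScoreCalculator` behavior and weights are configurable and may differ:

```csharp
using System;

// Illustrative only, not the EvidenceWeightedScoreCalculator implementation.
// All weights and factor values below are hypothetical.
static double Ews(
    double rch, double rts, double bkp, double xpl, double src, double mit,
    double wRch, double wRts, double wBkp, double wXpl, double wSrc, double wMit)
{
    double raw = wRch * rch + wRts * rts + wBkp * bkp + wXpl * xpl + wSrc * src
                 - wMit * mit; // mitigation evidence reduces the score
    return Math.Clamp(raw, 0.0, 1.0);
}

// Example: reachable code, no runtime hits, strong mitigation evidence.
Console.WriteLine(Ews(rch: 0.9, rts: 0.0, bkp: 0.0, xpl: 0.4, src: 0.7, mit: 0.8,
                      wRch: 0.30, wRts: 0.20, wBkp: 0.10, wXpl: 0.20, wSrc: 0.10, wMit: 0.25)); // 0.22
```
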
### CLI Commands (Replay)

| Command | Description | Status |
|---------|-------------|--------|
| `stella scan replay` | Deterministic verdict reproduction | Implemented |
| `stella prove` | Generate replay proofs | Implemented |
| `stella verify --proof` | Verify replay proofs | Implemented |

---

## Operations (Scheduler, Orchestrator, TaskRunner, TimelineIndexer)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| Job Scheduling | Pro/Ent | Scheduler | `IGraphJobService.cs`, `RunEndpoints.cs` | - | `/ops/scheduler` | Implemented |
| Impact Targeting | Pro/Ent | Scheduler | `IImpactIndex.cs` (Roaring bitmaps) | - | - | Implemented |
| Job Orchestration | Pro/Ent | Orchestrator | `IJobRepository.cs`, `Job.cs` | - | `/orchestrator` | Implemented |
| Dead Letter Queue | Pro/Ent | Orchestrator | `DeadLetterEntry.cs`, `DeadLetterEndpoints.cs` | - | `/orchestrator` | Implemented |
| Task Pack Execution | Pro/Ent | TaskRunner | `ITaskRunnerClient.cs`, `PackRunWorkerService.cs` | - | - | Implemented |
| Plan-Hash Binding | Pro/Ent | TaskRunner | Deterministic execution validation | - | - | Implemented |
| Timeline Indexing | Pro/Ent | TimelineIndexer | `ITimelineQueryService.cs`, `TimelineEventView.cs` | - | - | Implemented |
| Lease Management | Pro/Ent | Orchestrator | `LeaseNextAsync()`, `ExtendLeaseAsync()` | - | - | Implemented |

### API Endpoints (Operations)

**Scheduler:**
- `POST /api/v1/scheduler/runs` - Create run
- `GET /api/v1/scheduler/runs/{runId}/stream` - SSE stream
- `POST /api/v1/scheduler/runs/preview` - Dry-run preview

**Orchestrator:**
- `GET /api/v1/orchestrator/jobs` - List jobs
- `GET /api/v1/orchestrator/dag` - Job DAG
- `GET /api/v1/orchestrator/deadletter` - Dead letter queue
- `GET /api/v1/orchestrator/kpi` - KPI metrics

**TaskRunner:**
- `POST /api/runs` - Create pack run
- `GET /api/runs/{runId}/logs` - SSE log stream
- `POST /api/runs/{runId}/approve` - Approval decision

### UI Routes (Operations)

| Route | Feature | Status |
|-------|---------|--------|
| `/ops/scheduler` | Scheduler runs and impact preview | Implemented |
| `/orchestrator` | Job dashboard and dead letters | Implemented |

---

## Release Orchestration (ReleaseOrchestrator)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| Promotion Workflows | Enterprise | ReleaseOrchestrator | `GateModels.cs`, `StepModels.cs` | - | `/releases` | Implemented |
| Integration Hub | Enterprise | ReleaseOrchestrator | `IIntegrationManager.cs` | - | `/integrations` | Implemented |
| Deployment Agents | Enterprise | Agent.Core | `IAgentCapability.cs`, `ComposeCapability.cs` | - | - | Implemented |
| Plugin System (3-Surface) | Enterprise | ReleaseOrchestrator.Plugin | `IStepProviderCapability.cs`, `IGateProviderCapability.cs` | - | `/plugins` | Implemented |
| Gate Evaluation | Enterprise | ReleaseOrchestrator | `IGateEvaluator.cs` | - | `/releases` | Implemented |
| Step Execution | Enterprise | ReleaseOrchestrator | `IStepExecutor.cs` | - | - | Implemented |
| Connector Invoker | Enterprise | ReleaseOrchestrator | `IConnectorInvoker.cs` | - | - | Implemented |

### Integration Types

| Type | Description | Examples |
|------|-------------|----------|
| Scm | Source Control | GitHub, GitLab, Gitea |
| Ci | Continuous Integration | Jenkins, GitHub Actions |
| Registry | Container Registry | Docker Hub, Harbor, ACR, ECR, GCR |
| Vault | Secrets | HashiCorp Vault, Azure Key Vault |
| Notify | Notifications | Slack, Teams, Email, Webhooks |
| SettingsStore | Config | Consul, etcd, Parameter Store |

### Deployment Agent Types

| Agent | Key Files | Tasks |
|-------|-----------|-------|
| Docker Compose | `ComposeCapability.cs` | pull, up, down, scale, health-check, ps |
| SSH/WinRM | (planned) | Remote execution |
| ECS | (planned) | AWS ECS deployment |
| Nomad | (planned) | HashiCorp Nomad |

---

## Auth & Access Control (Authority, Registry)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| OAuth2/OIDC Token Service | Free/Pro/Ent | Authority | `IStellaOpsTokenClient.cs` | `stella auth` | `/login` | Implemented |
| DPoP (Proof-of-Possession) | Pro/Ent | Authority | DPoP header injection | - | - | Implemented |
| mTLS Certificate Binding | Enterprise | Authority | `cnf.x5t#S256` tokens | - | - | Implemented |
| 75+ Authorization Scopes | Pro/Ent | Authority | `StellaOpsScopes.cs` | - | - | Implemented |
| Registry Token Service | Pro/Ent | Registry | `RegistryTokenIssuer.cs` | - | - | Implemented |
| Plan-Based Authorization | Pro/Ent | Registry | `PlanRegistry.cs` | - | - | Implemented |
| LDAP Integration | Enterprise | Authority.Plugin.Ldap | LDAP connector | - | `/admin` | Implemented |
| Device Code Flow | Pro/Ent | Authority | CLI headless login | `stella auth login` | - | Implemented |

### Authentication Flows

| Flow | Use Case |
|------|----------|
| Client Credentials | Service-to-service |
| Device Code | CLI headless login |
| Authorization Code + PKCE | Web UI browser login |
| DPoP Handshake | Proof-of-possession for all API calls |

### Scope Categories

| Category | Example Scopes |
|----------|---------------|
| Signer | `signer.sign` |
| Scanner | `scanner:scan`, `scanner:export` |
| VEX | `vex:read`, `vex:ingest` |
| Policy | `policy:author`, `policy:approve`, `policy:publish` |
| Authority Admin | `authority:tenants.write`, `authority:roles.write` |

---

## Notifications & Integrations (Notify, Notifier, Integrations, Zastava)

| Feature | Tiers | Module | Key Files | CLI | UI | Status |
|---------|-------|--------|-----------|-----|----|--------|
| Multi-Channel Notifications | Pro/Ent | Notify | `NotifyChannel.cs`, `NotifyEvent.cs` | - | `/notifications` | Implemented |
| Rule-Based Routing | Pro/Ent | Notify | `NotifyRule.cs`, `INotifyRuleEvaluator.cs` | - | `/notifications` | Implemented |
| Incident Correlation | Pro/Ent | Notifier | `ICorrelationEngine.cs` | - | `/incidents` | Implemented |
| Escalation Policies | Pro/Ent | Notifier | `EscalationEndpoints.cs` | - | `/notifications` | Implemented |
| Storm Breaker | Pro/Ent | Notifier | `StormBreakerEndpoints.cs` | - | - | Implemented |
| External Integrations | Enterprise | Integrations | `IIntegrationConnectorPlugin.cs` | - | `/integrations` | Implemented |
| Kubernetes Admission | Enterprise | Zastava | `AdmissionEndpoint.cs`, `AdmissionDecision.cs` | - | - | Implemented |
| Runtime Event Collection | Enterprise | Zastava | `RuntimeEvent.cs`, `RuntimeEventFactory.cs` | - | - | Implemented |

### Notification Channels (10 Types)

| Channel | Adapter | Status |
|---------|---------|--------|
| Slack | `SlackChannelAdapter.cs` | Implemented |
| Teams | `ChatWebhookChannelAdapter.cs` | Implemented |
| Email | `EmailChannelAdapter.cs` | Implemented |
| Webhook | `ChatWebhookChannelAdapter.cs` | Implemented |
| PagerDuty | `PagerDutyChannelAdapter.cs` | Implemented |
| OpsGenie | `OpsGenieChannelAdapter.cs` | Implemented |
| CLI | `CliChannelAdapter.cs` | Implemented |
| InApp | `InAppChannelAdapter.cs` | Implemented |
| InAppInbox | `InAppInboxChannelAdapter.cs` | Implemented |
| Custom | Plugin-based | Implemented |

### Runtime Event Types (Zastava)

| Event Kind | Description |
|------------|-------------|
| ContainerStart | Container lifecycle start |
| ContainerStop | Container lifecycle stop |
| Drift | Filesystem/binary changes |
| PolicyViolation | Policy rule breach |
| AttestationStatus | Signature/attestation verification |

---

## Summary Statistics

| Category | Count |
|----------|-------|
| Total Features in Matrix | ~200 original |
| Discovered Features | 200+ additional |
| CLI Commands | 80+ |
| UI Routes | 75+ |
| API Endpoints | 500+ |
| Service Interfaces | 300+ |
| Language Analyzers | 11+ |
| Advisory Connectors | 33+ |
| Notification Channels | 10 |
| Crypto Profiles | 8 |
| Policy Gate Types | 10+ |
| Risk Score Providers | 6 |
| Attestation Predicates | 25+ |

---

*Document generated via automated feature extraction from Stella Ops codebase (20,723+ .cs files across 1,024 projects)*

@@ -280,6 +280,98 @@ X-Stella-Tenant: acme-corp
}
```

### Attested-Reduction Scoring Profile

> Sprint: SPRINT_20260112_004_LB_attested_reduction_scoring

When enabled, the attested-reduction profile applies precedence-based scoring using cryptographically anchored evidence:

**Formula:** `score = clamp(base_epss * (1 + R + T) - P, 0, 1)`

Where:
- `base_epss` - EPSS score (exploit likelihood)
- `R` - Reachability bonus (applied when anchored not-reachable evidence exists)
- `T` - Telemetry bonus (applied when anchored no-observation evidence exists)
- `P` - Patch proof reduction (applied when anchored backport/fix evidence exists)

**Short-Circuit Rules:**
1. **Anchored VEX not_affected/fixed** → Score = 0 (immediate watchlist)
2. **Anchored VEX affected + runtime confirmed** → Hard fail (score pinned to the configured `hardFailScore`, ActNow bucket)

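As a minimal sketch of the formula and short-circuit rules above (assuming `R`, `T`, and `P` have already been resolved from verified, anchored evidence according to the precedence list, and using the default bonus/reduction values from the configuration shown below; this is not the Signals implementation):

```csharp
using System;

// Sketch only. Inputs are assumed to be pre-resolved from anchored evidence;
// hardFailScore mirrors the "hardFailScore" policy setting below.
static double AttestedReductionScore(
    double baseEpss,
    bool anchoredNotAffectedOrFixed,
    bool anchoredAffectedWithRuntime,
    double r, double t, double p,
    double hardFailScore = 1.0)
{
    // Short-circuit 1: anchored VEX not_affected/fixed wins outright.
    if (anchoredNotAffectedOrFixed) return 0.0;

    // Short-circuit 2: anchored affected + runtime confirmation pins the score.
    if (anchoredAffectedWithRuntime) return hardFailScore;

    // Otherwise apply the documented formula with clamping.
    return Math.Clamp(baseEpss * (1 + r + t) - p, 0.0, 1.0);
}

// Example: EPSS 0.4 with an anchored backport proof (P = 0.5), no other anchors.
Console.WriteLine(AttestedReductionScore(0.4, false, false, r: 0.0, t: 0.0, p: 0.5)); // 0
```
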
**Configuration in Policy:**

```json
{
  "version": "ews.v1.3",
  "weights": { ... },
  "guardrails": { ... },
  "buckets": { ... },
  "attestedReduction": {
    "enabled": true,
    "precedenceList": [
      "vex.not_affected",
      "vex.fixed",
      "backport.signed_proof",
      "backport.vendor_vex",
      "reachability.not_reachable",
      "runtime.not_observed"
    ],
    "reachabilityBonus": 0.3,
    "telemetryBonus": 0.2,
    "patchProofReduction": 0.5,
    "clampMin": 0.0,
    "clampMax": 1.0,
    "hardFailOnAffectedWithRuntime": true,
    "hardFailScore": 1.0,
    "skipEpssWhenAnchored": true,
    "requiredVerificationStatus": "Verified"
  }
}
```

**Anchor Metadata:**

Evidence inputs can include anchor metadata for cryptographic attestation:

```json
{
  "findingId": "CVE-2024-1234@pkg:test/lib@1.0.0",
  "xpl": 0.5,
  "vexStatus": "not_affected",
  "vexAnchor": {
    "isAnchored": true,
    "dsseEnvelopeDigest": "sha256:abc123...",
    "predicateType": "https://stellaops.io/attestation/vex-override/v1",
    "rekorLogIndex": 12345678,
    "rekorEntryId": "24296fb24b8ad77a...",
    "verificationStatus": "Verified",
    "attestationTimestamp": "2026-01-14T10:30:00Z"
  },
  "backportDetails": {
    "evidenceTier": "SignedProof",
    "status": "Fixed",
    "confidence": 0.95,
    "anchor": {
      "isAnchored": true,
      "dsseEnvelopeDigest": "sha256:def456...",
      "predicateType": "https://stellaops.io/attestation/backport/v1",
      "verificationStatus": "Verified"
    }
  }
}
```

**Response Flags (when attested-reduction is active):**

| Flag | Description |
|------|-------------|
| `attested-reduction` | Attested-reduction scoring path was used |
| `anchored-vex` | Anchored VEX evidence triggered precedence |
| `anchored-backport` | Anchored backport evidence applied reduction |
| `anchored-reachability` | Anchored reachability evidence applied bonus |
| `anchored-runtime` | Anchored runtime evidence affected score |
| `hard-fail` | Hard-fail triggered (affected + runtime confirmed) |
| `epss-reduced` | EPSS influence reduced due to anchored evidence |

## Webhooks

### Register Webhook

@@ -183,6 +183,140 @@ The following constants are used for DSSE envelope creation and verification:

---

## Canonical Predicate Type and Aliases

> **Sprint:** SPRINT_20260112_004_SCANNER_path_witness_nodehash
> **Sprint:** SPRINT_20260112_008_DOCS_path_witness_contracts (PW-DOC-001)

The **canonical predicate type** for path witnesses is:

```
https://stella.ops/predicates/path-witness/v1
```

The following **aliases** are recognized for backward compatibility:

| Alias | Status |
|-------|--------|
| `stella.ops/pathWitness@v1` | Active (legacy short form) |
| `https://stella.ops/pathWitness/v1` | Active (URL variant) |

**Consumers must accept all aliases when verifying**; producers should emit the canonical form.

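For orientation only, a hypothetical helper (not a StellaOps API) showing how a verifier could fold the recognized aliases into the canonical form before comparing predicate types:

```csharp
using System;

// Hypothetical alias mapping; the canonical value and aliases come from the table above.
const string Canonical = "https://stella.ops/predicates/path-witness/v1";

string CanonicalizePathWitnessType(string predicateType) => predicateType switch
{
    "https://stella.ops/predicates/path-witness/v1" => Canonical, // already canonical
    "stella.ops/pathWitness@v1" => Canonical,                     // legacy short form
    "https://stella.ops/pathWitness/v1" => Canonical,             // URL variant
    _ => throw new ArgumentException($"Unrecognized path-witness predicate type: {predicateType}")
};

Console.WriteLine(CanonicalizePathWitnessType("stella.ops/pathWitness@v1"));
```
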
---

## Node Hash Recipe

Canonical node hash recipe for deterministic static/runtime evidence joining.

### Recipe

```
NodeHash = SHA256(normalize(PURL) + ":" + normalize(SYMBOL_FQN))
```

Output format: `sha256:<64-hex-chars>`

### PURL Normalization Rules

1. Lowercase scheme (`pkg:`)
2. Lowercase type (e.g., `NPM` -> `npm`)
3. Preserve namespace/name case (some ecosystems are case-sensitive)
4. Sort qualifiers alphabetically by key
5. Remove trailing slashes
6. Normalize empty version to `unversioned`

### Symbol FQN Normalization Rules

1. Trim whitespace
2. Normalize multiple dots (`..`) to single dot
3. Normalize signature whitespace: `(type,type)` -> `(type, type)`
4. Empty signatures become `()`
5. Replace `_` type placeholders for module-level functions

### Example

```
Input:
  PURL:   pkg:npm/lodash@4.17.21
  Symbol: lodash.merge(object, object)

Normalized Input:
  "pkg:npm/lodash@4.17.21:lodash.merge(object, object)"

Output:
  sha256:a1b2c3d4e5f6... (64 hex chars)
```

### Implementation

See `src/__Libraries/StellaOps.Reachability.Core/NodeHashRecipe.cs`

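For orientation, a simplified sketch of the recipe (not the `NodeHashRecipe.cs` implementation; the normalization is deliberately abbreviated, and UTF-8 encoding of the joined string is an assumption):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Abbreviated normalization: the full PURL/symbol rules above (qualifier sorting,
// dot and signature cleanup, etc.) are omitted to keep the hashing step visible.
static string NodeHash(string purl, string symbolFqn)
{
    string normalizedPurl = purl.Trim();
    string normalizedSymbol = symbolFqn.Trim();
    // Assumes UTF-8 encoding of "normalize(PURL) : normalize(SYMBOL_FQN)".
    byte[] digest = SHA256.HashData(Encoding.UTF8.GetBytes($"{normalizedPurl}:{normalizedSymbol}"));
    return "sha256:" + Convert.ToHexString(digest).ToLowerInvariant();
}

Console.WriteLine(NodeHash("pkg:npm/lodash@4.17.21", "lodash.merge(object, object)"));
```
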
---

## Path Hash Recipe

Canonical path hash recipe for deterministic path fingerprinting.

### Recipe

```
PathHash = SHA256(nodeHash1 + ">" + nodeHash2 + ">" + ... + nodeHashN)
```

The `>` separator represents directed edges in the path.

### Top-K Selection

For efficiency, witnesses include a top-K subset of node hashes (see the sketch after this list):

1. Take first K/2 nodes (entry points)
2. Take last K/2 nodes (exit/vulnerable points)
3. Deduplicate while preserving order
4. Default K = 10

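A corresponding sketch (again illustrative, not `PathHashRecipe.cs`): hash the `>`-joined node hashes and pick the documented top-K subset. UTF-8 encoding is an assumption.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

// Sketch only; assumes UTF-8 encoding of the ">"-joined node hashes.
static string PathHash(IReadOnlyList<string> nodeHashes)
{
    byte[] digest = SHA256.HashData(Encoding.UTF8.GetBytes(string.Join(">", nodeHashes)));
    return "sha256:" + Convert.ToHexString(digest).ToLowerInvariant();
}

// First K/2 entry nodes plus last K/2 exit nodes, deduplicated in first-occurrence order (default K = 10).
static IReadOnlyList<string> TopKNodeHashes(IReadOnlyList<string> nodeHashes, int k = 10) =>
    nodeHashes.Take(k / 2)
              .Concat(nodeHashes.Skip(Math.Max(0, nodeHashes.Count - k / 2)))
              .Distinct()
              .ToList();

var nodes = new[] { "sha256:aaa...", "sha256:bbb...", "sha256:ccc..." };
Console.WriteLine(PathHash(nodes));
Console.WriteLine(string.Join(", ", TopKNodeHashes(nodes, k: 2)));
```
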
### PathFingerprint Fields

| Field | Type | Description |
|-------|------|-------------|
| `path_hash` | string | `sha256:<hex>` of full path |
| `node_count` | integer | Total nodes in path |
| `top_k_node_hashes` | array | Top-K node hashes for lookup |
| `source_node_hash` | string | Hash of entry node |
| `sink_node_hash` | string | Hash of vulnerable sink |

### Implementation

See `src/__Libraries/StellaOps.Reachability.Core/PathHashRecipe.cs`

---

## Evidence URI Fields

Path witnesses may include URIs to supporting evidence:

| Field | Format | Description |
|-------|--------|-------------|
| `graph_uri` | `cas://<hash>` | Content-addressed graph reference |
| `sbom_uri` | `cas://<hash>` | SBOM used during analysis |
| `attestation_uri` | `cas://<hash>` | DSSE envelope reference |
| `rekor_uri` | `https://rekor.sigstore.dev/...` | Transparency log entry |

Example:

```json
{
  "evidence_uris": {
    "graph": "cas://sha256:abc123...",
    "sbom": "cas://sha256:def456...",
    "attestation": "cas://sha256:ghi789...",
    "rekor": "https://rekor.sigstore.dev/api/v1/log/entries/abc123def456"
  }
}
```

---

## DSSE Signing

Witnesses are signed using [DSSE (Dead Simple Signing Envelope)](https://github.com/secure-systems-lab/dsse):

@@ -2525,6 +2525,57 @@ EOF

---

#### check.security.evidence.integrity

| Property | Value |
|----------|-------|
| **CheckId** | `check.security.evidence.integrity` |
| **Plugin** | `stellaops.doctor.security` |
| **Category** | Security |
| **Severity** | Fail |
| **Tags** | `security`, `evidence`, `integrity`, `dsse`, `rekor`, `offline` |
| **What it verifies** | Evidence files have valid DSSE signatures, Rekor inclusion proofs, and consistent hashes |
| **Evidence collected** | Evidence locker path, total files, valid/invalid/skipped counts, specific issues |
| **Failure modes** | Empty DSSE payload, missing signatures, invalid base64, missing Rekor UUID, missing inclusion proof hashes, digest mismatch |

**What it checks** (a simplified sketch of the digest check follows this list):
1. **DSSE Envelope Structure**: Validates `payloadType`, `payload` (base64), and `signatures` array
2. **Signature Completeness**: Each signature has `keyid` and valid base64 `sig`
3. **Payload Digest Consistency**: If `payloadDigest` field present, recomputes and compares SHA-256
4. **Evidence Bundle Structure**: Validates `bundleId`, `manifest.version`, and optional `contentDigest`
5. **Rekor Receipt Validity**: If present, validates `uuid`, `logIndex`, and `inclusionProof.hashes`

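A minimal sketch of step 3 only (not the doctor plugin itself), assuming the optional `payloadDigest` field carries a value of the form `sha256:<hex>`:

```csharp
using System;
using System.Security.Cryptography;
using System.Text.Json;

// Decode the DSSE payload, recompute SHA-256, and compare it to the declared digest.
// The "sha256:<hex>" format of payloadDigest is an assumption for illustration.
static bool PayloadDigestMatches(string envelopeJson)
{
    using var doc = JsonDocument.Parse(envelopeJson);
    var root = doc.RootElement;
    if (!root.TryGetProperty("payloadDigest", out var declared))
        return true; // field is optional; nothing to compare

    byte[] payload = Convert.FromBase64String(root.GetProperty("payload").GetString()!);
    string recomputed = "sha256:" + Convert.ToHexString(SHA256.HashData(payload)).ToLowerInvariant();
    return string.Equals(recomputed, declared.GetString(), StringComparison.OrdinalIgnoreCase);
}

Console.WriteLine(PayloadDigestMatches(
    """{"payloadType":"application/vnd.in-toto+json","payload":"e30=","signatures":[]}"""));
```
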
**Remediation:**
```bash
# 1. List evidence files with issues
stella doctor --check check.security.evidence.integrity --output json \
  | jq '.evidence.issues[]'

# 2. Re-sign affected evidence bundles
stella evidence resign --bundle-id {BUNDLE_ID}

# 3. Verify Rekor inclusion manually (if online)
rekor-cli get --uuid {REKOR_UUID} --format json | jq

# 4. For offline environments, verify against local ledger
stella evidence verify --offline --bundle-id {BUNDLE_ID}

# 5. Re-generate evidence pack from source
stella export evidence-pack --artifact {ARTIFACT_DIGEST} --force
```

**Configuration:**
```yaml
# etc/appsettings.yaml
EvidenceLocker:
  LocalPath: /var/lib/stellaops/evidence
  # Or use Evidence:BasePath for alternate key
```

**Verification:** `stella doctor --check check.security.evidence.integrity`

---

### 9.5 Integration Plugins - SCM (`stellaops.doctor.integration.scm.*`)

#### check.integration.scm.github.connectivity

@@ -303,6 +303,102 @@ CLI translates verdict to exit code:
| FAIL | 1 | Block deployment |
| ERROR | 2 | Pipeline failure |

### 5a. DSSE Witness Verification (Required)

> Sprint: SPRINT_20260112_004_DOC_cicd_gate_verification

Before deploying, pipelines must verify DSSE witness signatures and Rekor inclusion (or the offline ledger). This ensures attestation integrity and provides a tamper-evident audit trail.

#### Online Verification

```bash
# Verify DSSE signature and Rekor inclusion
stellaops proof verify \
  --image ghcr.io/org/myapp:$COMMIT_SHA \
  --attestation-type scan-result \
  --check-rekor \
  --fail-on-missing

# Exit codes:
# 0 - Verified successfully
# 1 - Verification failed
# 2 - Missing attestation or Rekor entry
```

#### Offline Verification (Air-Gapped Environments)

```bash
# Verify against local offline ledger
stellaops proof verify \
  --image myapp:$COMMIT_SHA \
  --attestation-type scan-result \
  --offline \
  --ledger-path /var/lib/stellaops/ledger \
  --fail-on-missing

# Alternative: verify a bundled evidence pack
stellaops evidence-pack verify \
  --bundle /path/to/evidence-pack.tar.gz \
  --check-signatures \
  --check-merkle
```

#### Cosign Equivalent Commands

For environments using cosign directly:

```bash
# Online: verify with Rekor
cosign verify-attestation \
  --type https://stellaops.io/attestation/scan/v1 \
  --rekor-url https://rekor.sigstore.dev \
  ghcr.io/org/myapp:$COMMIT_SHA

# Offline: verify with bundled certificate
cosign verify-attestation \
  --type https://stellaops.io/attestation/scan/v1 \
  --certificate /path/to/cert.pem \
  --certificate-chain /path/to/chain.pem \
  --offline \
  ghcr.io/org/myapp:$COMMIT_SHA
```

#### GitHub Actions Integration

```yaml
- name: Verify attestation
  run: |
    stellaops proof verify \
      --image ghcr.io/org/myapp:${{ github.sha }} \
      --attestation-type scan-result \
      --check-rekor \
      --fail-on-missing

- name: Push to registry (only if verified)
  if: success()
  run: |
    docker push ghcr.io/org/myapp:${{ github.sha }}
```

#### GitLab CI Integration

```yaml
verify:
  stage: verify
  script:
    - stellaops proof verify
        --image $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA
        --attestation-type scan-result
        --check-rekor
        --fail-on-missing
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
```

**Related Documentation:**
- [Score Proofs Runbook](../operations/score-proofs-runbook.md)
- [Proof Verification Runbook](../operations/proof-verification-runbook.md)

### 6. SARIF Integration

CLI outputs SARIF for IDE and GitHub integration:

@@ -25,16 +25,21 @@
## Delivery Tracker
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | EVID-CEPACK-001 | TODO | After DOCS-CEPACK-001 schema fields are final | EvidenceLocker Guild | Update EvidenceLocker manifest models and builders to record transparency and timestamp references in bundle metadata (align with `docs/modules/evidence-locker/schemas/bundle.manifest.schema.json` and the new evidence pack schema). Touch: `src/EvidenceLocker/StellaOps.EvidenceLocker/StellaOps.EvidenceLocker.Infrastructure/Builders/EvidenceBundleBuilder.cs` and related domain models. |
| 1 | EVID-CEPACK-001 | DONE | After DOCS-CEPACK-001 schema fields are final | EvidenceLocker Guild | Update EvidenceLocker manifest models and builders to record transparency and timestamp references in bundle metadata (align with `docs/modules/evidence-locker/schemas/bundle.manifest.schema.json` and the new evidence pack schema). Touch: `src/EvidenceLocker/StellaOps.EvidenceLocker/StellaOps.EvidenceLocker.Infrastructure/Builders/EvidenceBundleBuilder.cs` and related domain models. |
| 2 | EVID-CEPACK-002 | TODO | After EVID-CEPACK-001 | EvidenceLocker Guild | Propagate RFC3161 timestamp metadata from signing to bundle packaging and verification flows; add unit tests under `src/EvidenceLocker/StellaOps.EvidenceLocker/StellaOps.EvidenceLocker.Tests`. |
| 2 | EVID-CEPACK-002 | DONE | After EVID-CEPACK-001 | EvidenceLocker Guild | Propagate RFC3161 timestamp metadata from signing to bundle packaging and verification flows; add unit tests under `src/EvidenceLocker/StellaOps.EvidenceLocker/StellaOps.EvidenceLocker.Tests`. |
| 3 | EVID-CEPACK-003 | TODO | After DOCS-CEPACK-001 schema fields are final | EvidenceLocker Guild | Add Object Lock configuration to `EvidenceLockerOptions` and enforce retention/legal hold headers in `S3EvidenceObjectStore`; validate config at startup and add tests. |
| 3 | EVID-CEPACK-003 | DONE | After DOCS-CEPACK-001 schema fields are final | EvidenceLocker Guild | Add Object Lock configuration to `EvidenceLockerOptions` and enforce retention/legal hold headers in `S3EvidenceObjectStore`; validate config at startup and add tests. |
| 4 | EVID-CEPACK-004 | TODO | After EVID-CEPACK-001 | EvidenceLocker Guild / QA | Add determinism and schema evolution tests covering new manifest fields and checksum ordering (use existing EvidenceLocker test suites). |
| 4 | EVID-CEPACK-004 | DONE | After EVID-CEPACK-001 | EvidenceLocker Guild / QA | Add determinism and schema evolution tests covering new manifest fields and checksum ordering (use existing EvidenceLocker test suites). |
| 5 | EVID-CEPACK-005 | TODO | After EVID-CEPACK-003 | EvidenceLocker Guild | Update `src/EvidenceLocker/AGENTS.md` and `src/EvidenceLocker/StellaOps.EvidenceLocker/AGENTS.md` to include object-lock and transparency/timestamp requirements. |
| 5 | EVID-CEPACK-005 | DONE | After EVID-CEPACK-003 | EvidenceLocker Guild | Update `src/EvidenceLocker/AGENTS.md` and `src/EvidenceLocker/StellaOps.EvidenceLocker/AGENTS.md` to include object-lock and transparency/timestamp requirements. |

## Execution Log
| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; focuses on EvidenceLocker metadata, object-lock enforcement, and tests. | Planning |
| 2026-01-14 | EVID-CEPACK-001: Added TransparencyReference and TimestampReference records to EvidenceBundleBuildModels.cs; Updated EvidenceSignatureService to serialize new fields in manifest payload. | Agent |
| 2026-01-14 | EVID-CEPACK-002: Existing RFC3161 client already propagates timestamps; added 3 new unit tests for transparency/timestamp reference serialization. | Agent |
| 2026-01-14 | EVID-CEPACK-003: Added ObjectLockOptions to AmazonS3StoreOptions with Mode, DefaultRetentionDays, DefaultLegalHold; Updated S3EvidenceObjectStore with ApplyObjectLockSettings and ApplyLegalHoldAsync methods; Added startup validation. | Agent |
| 2026-01-14 | EVID-CEPACK-004: Added tests for transparency serialization, timestamp serialization, and empty array omission in EvidenceSignatureServiceTests. | Agent |
| 2026-01-14 | EVID-CEPACK-005: Updated src/EvidenceLocker/AGENTS.md with object-lock and transparency/timestamp requirements. | Agent |

## Decisions & Risks
- Object Lock semantics (governance vs compliance) require a single default and may need explicit approval from platform governance.

@@ -25,20 +25,29 @@
## Delivery Tracker
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | EXP-CEPACK-001 | TODO | After DOCS-CEPACK-001 schema fields are final | Export Center Guild | Replace placeholder logic in `src/ExportCenter/StellaOps.ExportCenter/StellaOps.ExportCenter.Core/Services/LineageEvidencePackService.cs` with real data retrieval (SBOM, VEX, policy verdicts, attestations) or explicit NotImplemented errors where integrations are missing. |
| 1 | EXP-CEPACK-001 | BLOCKED | SBOM/VEX data source integration undefined | Export Center Guild | Replace placeholder logic in `src/ExportCenter/StellaOps.ExportCenter/StellaOps.ExportCenter.Core/Services/LineageEvidencePackService.cs` with real data retrieval (SBOM, VEX, policy verdicts, attestations) or explicit NotImplemented errors where integrations are missing. |
| 2 | EXP-CEPACK-002 | TODO | After EXP-CEPACK-001 | Export Center Guild | Generate deterministic pack outputs (tar.gz or existing OfflineBundlePackager) with manifest and checksums aligned to the new evidence pack schema; integrate DSSE signing and transparency references when available. |
| 2 | EXP-CEPACK-002 | BLOCKED | Depends on EXP-CEPACK-001 | Export Center Guild | Generate deterministic pack outputs (tar.gz or existing OfflineBundlePackager) with manifest and checksums aligned to the new evidence pack schema; integrate DSSE signing and transparency references when available. |
| 3 | EXP-CEPACK-003 | TODO | After EXP-CEPACK-002 | Export Center Guild / QA | Add determinism tests for pack assembly, manifest ordering, and verification in `src/ExportCenter/StellaOps.ExportCenter/StellaOps.ExportCenter.Tests`. |
| 3 | EXP-CEPACK-003 | BLOCKED | Depends on EXP-CEPACK-002 | Export Center Guild / QA | Add determinism tests for pack assembly, manifest ordering, and verification in `src/ExportCenter/StellaOps.ExportCenter/StellaOps.ExportCenter.Tests`. |
| 4 | EXP-CEPACK-004 | TODO | After EXP-CEPACK-002 | Export Center Guild | Update Export Center API outputs and metrics for lineage pack downloads; ensure tenant scoping and audit logs are preserved. |
| 4 | EXP-CEPACK-004 | BLOCKED | Depends on EXP-CEPACK-002 | Export Center Guild | Update Export Center API outputs and metrics for lineage pack downloads; ensure tenant scoping and audit logs are preserved. |
| 5 | EXP-CEPACK-005 | TODO | After EXP-CEPACK-004 | Export Center Guild | Update `src/ExportCenter/AGENTS.md` and `src/ExportCenter/StellaOps.ExportCenter/AGENTS.md` to call out evidence pack alignment requirements and determinism checks. |
| 5 | EXP-CEPACK-005 | BLOCKED | Depends on EXP-CEPACK-004 | Export Center Guild | Update `src/ExportCenter/AGENTS.md` and `src/ExportCenter/StellaOps.ExportCenter/AGENTS.md` to call out evidence pack alignment requirements and determinism checks. |

## Execution Log
| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; focuses on lineage evidence pack implementation and determinism. | Planning |
| 2026-01-14 | All tasks marked BLOCKED. See Decisions & Risks for blocking reasons. | Agent |

## Decisions & Risks
- Pack format choice (tar.gz vs OfflineBundlePackager output) must match evidence bundle export format and remain offline-friendly.
- Missing upstream integrations (SBOM/VEX/policy APIs) may require explicit NotImplemented handling to avoid silent stubs.

### BLOCKING ISSUES (require PM/architect decision)
1. **SBOM Data Source Integration Undefined**: LineageEvidencePackService.cs (600+ lines) has placeholder implementations. The ISbomService, IVexStatementService, and IPolicyVerdictService interfaces exist but their concrete implementations and data flow are not wired. Need decision on:
   - Which SBOM service implementation to use (Concelier.SbomIntegration vs Scanner.SbomService)
   - How to resolve VEX statements for a given artifact (VexLens vs direct DB query)
   - Policy verdict retrieval pattern (Scheduler models vs Policy.Engine)
2. **Silent Stub Pattern**: Current code returns success for placeholder methods. Need explicit guidance on whether to throw NotImplementedException or return explicit error results.
3. **Cross-Module Dependencies**: This sprint touches data from Scanner, Concelier, Policy, and Attestor modules. Need coordination with those teams or explicit interface contracts.

## Next Checkpoints
- 2026-01-22: Lineage pack implementation review and determinism test plan.

@@ -22,15 +22,19 @@
## Delivery Tracker
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | ATT-VEX-001 | TODO | Predicate spec | Attestor Guild | Add VEX override predicate schema and typed model (decision, evidence refs, tool versions, rule digests, artifact digest, trace hash). |
| 1 | ATT-VEX-001 | DONE | Predicate spec | Attestor Guild | Add VEX override predicate schema and typed model (decision, evidence refs, tool versions, rule digests, artifact digest, trace hash). |
| 2 | ATT-VEX-002 | TODO | Builder + verify | Attestor Guild | Implement predicate builder and DSSE envelope creation/verification; canonicalize predicate payloads with `StellaOps.Canonical.Json` before hashing; add unit and integration tests. |
| 2 | ATT-VEX-002 | DONE | Builder + verify | Attestor Guild | Implement predicate builder and DSSE envelope creation/verification; canonicalize predicate payloads with `StellaOps.Canonical.Json` before hashing; add unit and integration tests. |
| 3 | ATT-VEX-003 | TODO | Cross-module docs | Attestor Guild | Document predicate and include a sample payload in `docs/modules/attestor/` and referenced schemas. |
| 3 | ATT-VEX-003 | DONE | Cross-module docs | Attestor Guild | Document predicate and include a sample payload in `docs/modules/attestor/` and referenced schemas. |
| 4 | ATT-VEX-004 | TODO | Canonicalization contract | Attestor Guild | Document canonicalization rules and required serializer options (no CamelCase, default encoder) for the VEX override predicate. |
| 4 | ATT-VEX-004 | DONE | Canonicalization contract | Attestor Guild | Document canonicalization rules and required serializer options (no CamelCase, default encoder) for the VEX override predicate. |

## Execution Log
| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | ATT-VEX-001: Created VexOverridePredicate.cs with VexOverrideDecision enum, EvidenceReference, ToolInfo records in src/Attestor/__Libraries/StellaOps.Attestor.StandardPredicates/VexOverride/. | Agent |
| 2026-01-14 | ATT-VEX-002: Created VexOverridePredicateParser.cs (IPredicateParser impl), VexOverridePredicateBuilder.cs with RFC 8785 canonicalization. Added 23 unit tests in VexOverride directory. | Agent |
| 2026-01-14 | Fixed pre-existing bug in BinaryDiffTestData.cs (renamed FixedTimeProvider field to TestTimeProvider to avoid name shadowing with nested class). | Agent |
| 2026-01-14 | ATT-VEX-003/004: Created docs/modules/attestor/vex-override-predicate.md with schema spec, sample payload, and RFC 8785 canonicalization rules. | Agent |

## Decisions & Risks
- Predicate must use RFC 8785 canonicalization via `StellaOps.Canonical.Json` with explicit serializer options (no CamelCase, default encoder) and DSSE PAE helper; no custom encoding.

@@ -24,7 +24,7 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | EWS-API-001 | TODO | Align with Signals reduction output | Findings Guild - Backend | Extend scoring DTOs to include reduction profile metadata, hard-fail flag, and short-circuit reason fields. |
|
| 1 | EWS-API-001 | DONE | Align with Signals reduction output | Findings Guild - Backend | Extend scoring DTOs to include reduction profile metadata, hard-fail flag, and short-circuit reason fields. |
|
||||||
| 2 | EWS-API-002 | TODO | EWS-API-001 | Findings Guild - Backend | Implement or extend IFindingEvidenceProvider to populate anchor metadata (DSSE envelope digest, Rekor log index/entry id, predicate type, scope) into FindingEvidence. |
|
| 2 | EWS-API-002 | TODO | EWS-API-001 | Findings Guild - Backend | Implement or extend IFindingEvidenceProvider to populate anchor metadata (DSSE envelope digest, Rekor log index/entry id, predicate type, scope) into FindingEvidence. |
|
||||||
| 3 | EWS-API-003 | TODO | EWS-API-002 | Findings Guild - Backend | Update FindingScoringService to select reduction profile when enabled, propagate hard-fail results, and adjust cache keys to include policy digest/reduction profile. |
|
| 3 | EWS-API-003 | TODO | EWS-API-002 | Findings Guild - Backend | Update FindingScoringService to select reduction profile when enabled, propagate hard-fail results, and adjust cache keys to include policy digest/reduction profile. |
|
||||||
| 4 | EWS-API-004 | TODO | EWS-API-003 | Findings Guild - QA | Add integration tests for anchored short-circuit (score 0), hard-fail behavior, and deterministic cache/history updates. |
|
| 4 | EWS-API-004 | TODO | EWS-API-003 | Findings Guild - QA | Add integration tests for anchored short-circuit (score 0), hard-fail behavior, and deterministic cache/history updates. |
|
||||||
@@ -34,6 +34,7 @@
|
|||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | EWS-API-001: Extended EvidenceWeightedScoreResponse with ReductionProfile, HardFail, ShortCircuitReason, and Anchor fields. Added ReductionProfileDto (Enabled, Mode, ProfileId, MaxReductionPercent, RequireVexAnchoring, RequireRekorVerification) and EvidenceAnchorDto (Anchored, EnvelopeDigest, PredicateType, RekorLogIndex, RekorEntryId, Scope, Verified, AttestedAt). | Agent |
|
||||||
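For reviewers tracing EWS-API-001, a minimal sketch of the response field shapes named in the log entry above; property names come from the entry, while all types and nullability are assumptions and the real contracts live in the Findings scoring DTOs.

```csharp
using System;

// Field names taken from the EWS-API-001 log entry; types are illustrative assumptions.
public sealed record ReductionProfileDto(
    bool Enabled,
    string Mode,
    string ProfileId,
    double MaxReductionPercent,
    bool RequireVexAnchoring,
    bool RequireRekorVerification);

public sealed record EvidenceAnchorDto(
    bool Anchored,
    string? EnvelopeDigest,
    string? PredicateType,
    long? RekorLogIndex,
    string? RekorEntryId,
    string? Scope,
    bool Verified,
    DateTimeOffset? AttestedAt);

// New fields carried alongside the existing score payload.
public sealed record EvidenceWeightedScoreResponseExtras(
    ReductionProfileDto? ReductionProfile,
    bool HardFail,
    string? ShortCircuitReason,
    EvidenceAnchorDto? Anchor);
```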
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Decision pending: exact response field names for hard-fail and reduction metadata.
|
- Decision pending: exact response field names for hard-fail and reduction metadata.
|
||||||
|
|||||||
@@ -25,7 +25,7 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | DET-ATT-001 | TODO | Align anchor schema with Signals | Policy Guild - Backend | Extend determinization evidence models (VexClaimSummary, BackportEvidence, RuntimeEvidence, ReachabilityEvidence if needed) to include anchor metadata fields and update JSON serialization tests. |
|
| 1 | DET-ATT-001 | DONE | Align anchor schema with Signals | Policy Guild - Backend | Extend determinization evidence models (VexClaimSummary, BackportEvidence, RuntimeEvidence, ReachabilityEvidence if needed) to include anchor metadata fields and update JSON serialization tests. |
|
||||||
| 2 | DET-ATT-002 | TODO | DET-ATT-001 | Policy Guild - Backend | Update signal snapshot building/mapping to populate anchor metadata from stored evidence with TimeProvider-safe timestamps. |
|
| 2 | DET-ATT-002 | TODO | DET-ATT-001 | Policy Guild - Backend | Update signal snapshot building/mapping to populate anchor metadata from stored evidence with TimeProvider-safe timestamps. |
|
||||||
| 3 | DET-ATT-003 | TODO | DET-ATT-002 | Policy Guild - Backend | Add high-priority determinization rules: anchored affected + runtime telemetry => Quarantined/Blocked; anchored VEX not_affected/fixed => Allowed; anchored patch proof => Allowed; keep existing rule order deterministic. |
|
| 3 | DET-ATT-003 | TODO | DET-ATT-002 | Policy Guild - Backend | Add high-priority determinization rules: anchored affected + runtime telemetry => Quarantined/Blocked; anchored VEX not_affected/fixed => Allowed; anchored patch proof => Allowed; keep existing rule order deterministic. |
|
||||||
| 4 | DET-ATT-004 | TODO | DET-ATT-003 | Policy Guild - Backend | Tighten VexProofGate options (require signed statements, require proof for fixed) when anchor-aware mode is enabled; add unit/integration tests. |
|
| 4 | DET-ATT-004 | TODO | DET-ATT-003 | Policy Guild - Backend | Tighten VexProofGate options (require signed statements, require proof for fixed) when anchor-aware mode is enabled; add unit/integration tests. |
|
||||||
@@ -35,6 +35,7 @@
|
|||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | DET-ATT-001: Extended VexClaimSummary with Anchor field and VexClaimAnchor record containing EnvelopeDigest, PredicateType, RekorLogIndex, RekorEntryId, Scope, Verified, AttestedAt. Added IsAnchored and IsRekorAnchored helpers. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Decision pending: exact mapping between "anchored" status and VEX proof gate requirements.
|
- Decision pending: exact mapping between "anchored" status and VEX proof gate requirements.
|
||||||
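DET-ATT-003 above lists the high-priority anchor-aware rules and requires the existing rule order to stay deterministic. A minimal sketch of that precedence as an ordered, first-match-wins list; the verdict names and input shape are assumptions, not the actual determinization types.

```csharp
using System;

// Illustrative input; the real models are the determinization evidence records.
public sealed record AnchoredSignals(
    bool AnchoredAffected,
    bool RuntimeTelemetryObserved,
    bool AnchoredVexNotAffectedOrFixed,
    bool AnchoredPatchProof);

public enum VerdictSketch { Quarantined, Allowed, Undecided }

public static class AnchorAwareRulesSketch
{
    // Ordered list per DET-ATT-003; evaluation is first-match-wins so the outcome
    // is deterministic regardless of how the signals were populated.
    private static readonly (Func<AnchoredSignals, bool> When, VerdictSketch Then)[] Rules =
        new (Func<AnchoredSignals, bool> When, VerdictSketch Then)[]
        {
            (s => s.AnchoredAffected && s.RuntimeTelemetryObserved, VerdictSketch.Quarantined),
            (s => s.AnchoredVexNotAffectedOrFixed, VerdictSketch.Allowed),
            (s => s.AnchoredPatchProof, VerdictSketch.Allowed),
        };

    public static VerdictSketch Evaluate(AnchoredSignals signals)
    {
        foreach (var rule in Rules)
        {
            if (rule.When(signals)) return rule.Then;
        }
        return VerdictSketch.Undecided; // fall through to the existing rule order
    }
}
```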
|
|||||||
@@ -23,12 +23,12 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | BINIDX-LIR-01 | TODO | LowUIR mapping spec | Scanner Guild - BinaryIndex | Implement a B2R2 LowUIR adapter for `IIrLiftingService` using B2R2 BinIR/BinLifter. Map LowUIR statements to existing IR models with deterministic ordering and invariant formatting. Register the adapter in DI so semantic and DeltaSig pipelines use it when available. Add tests asserting determinism and non-empty IR for supported ISAs. |
|
| 1 | BINIDX-LIR-01 | DONE | LowUIR mapping spec | Scanner Guild - BinaryIndex | Implement a B2R2 LowUIR adapter for `IIrLiftingService` using B2R2 BinIR/BinLifter. Map LowUIR statements to existing IR models with deterministic ordering and invariant formatting. Register the adapter in DI so semantic and DeltaSig pipelines use it when available. Add tests asserting determinism and non-empty IR for supported ISAs. |
|
||||||
| 2 | BINIDX-LIFTER-02 | TODO | Pool configuration | Scanner Guild - BinaryIndex | Add a bounded lifter pool with warm preload per ISA and update the B2R2 plugin to borrow/return lifters instead of creating per-call units. Add config options and tests for reuse and concurrency safety. |
|
| 2 | BINIDX-LIFTER-02 | DONE | Pool configuration | Scanner Guild - BinaryIndex | Add a bounded lifter pool with warm preload per ISA and update the B2R2 plugin to borrow/return lifters instead of creating per-call units. Add config options and tests for reuse and concurrency safety. |
|
||||||
| 3 | BINIDX-CACHE-03 | TODO | Valkey cache + Postgres persistence plan | Scanner Guild - BinaryIndex | Add a function-level cache for canonical IR and semantic fingerprints keyed by `(isa, b2r2_version, normalization_recipe, canonical_ir_hash)`. Implement the cache in Valkey (TTL-based hot cache) and persist canonical IR fingerprint records in PostgreSQL. Do not introduce new storage engines. Define invalidation rules and TTLs. Add cache hit/miss tests. |
|
| 3 | BINIDX-CACHE-03 | DONE | Valkey cache + Postgres persistence plan | Scanner Guild - BinaryIndex | Add a function-level cache for canonical IR and semantic fingerprints keyed by `(isa, b2r2_version, normalization_recipe, canonical_ir_hash)`. Implement the cache in Valkey (TTL-based hot cache) and persist canonical IR fingerprint records in PostgreSQL. Do not introduce new storage engines. Define invalidation rules and TTLs. Add cache hit/miss tests. |
|
||||||
| 4 | BINIDX-OPS-04 | TODO | Endpoint contract | Scanner Guild - BinaryIndex | Add ops endpoints with fixed routes and schemas: GET `/api/v1/ops/binaryindex/health` -> BinaryIndexOpsHealthResponse, POST `/api/v1/ops/binaryindex/bench/run` -> BinaryIndexBenchResponse, GET `/api/v1/ops/binaryindex/cache` -> BinaryIndexFunctionCacheStats, GET `/api/v1/ops/binaryindex/config` -> BinaryIndexEffectiveConfig. Report lifter warmness, bench latency, cache stats, and effective config. Ensure outputs are deterministic and ASCII-only. Add minimal integration tests. |
|
| 4 | BINIDX-OPS-04 | DONE | Endpoint contract | Scanner Guild - BinaryIndex | Add ops endpoints with fixed routes and schemas: GET `/api/v1/ops/binaryindex/health` -> BinaryIndexOpsHealthResponse, POST `/api/v1/ops/binaryindex/bench/run` -> BinaryIndexBenchResponse, GET `/api/v1/ops/binaryindex/cache` -> BinaryIndexFunctionCacheStats, GET `/api/v1/ops/binaryindex/config` -> BinaryIndexEffectiveConfig. Report lifter warmness, bench latency, cache stats, and effective config. Ensure outputs are deterministic and ASCII-only. Add minimal integration tests. |
|
||||||
| 5 | BINIDX-OPER-05 | TODO | Operand mapping | Scanner Guild - BinaryIndex | Improve B2R2 operand decoding to populate operand metadata used by normalization and IR mapping. Add targeted unit tests for representative instructions across x86 and ARM64. |
|
| 5 | BINIDX-OPER-05 | DONE | Operand mapping | Scanner Guild - BinaryIndex | Improve B2R2 operand decoding to populate operand metadata used by normalization and IR mapping. Add targeted unit tests for representative instructions across x86 and ARM64. |
|
||||||
| 6 | BINIDX-DOCS-06 | TODO | Doc updates | Scanner Guild - BinaryIndex | Update `docs/modules/binary-index/architecture.md`, `docs/modules/binary-index/semantic-diffing.md`, and `docs/architecture/EVIDENCE_PIPELINE_ARCHITECTURE.md` to reflect the LowUIR adapter, lifter pool, cache rules, and new endpoints. Include determinism and offline constraints. |
|
| 6 | BINIDX-DOCS-06 | DONE | Doc updates | Scanner Guild - BinaryIndex | Update `docs/modules/binary-index/architecture.md`, `docs/modules/binary-index/semantic-diffing.md`, and `docs/architecture/EVIDENCE_PIPELINE_ARCHITECTURE.md` to reflect the LowUIR adapter, lifter pool, cache rules, and new endpoints. Include determinism and offline constraints. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
@@ -36,6 +36,12 @@
|
|||||||
| 2026-01-14 | Sprint created; scope defined for LowUIR adapter, lifter pool, cache, and bench/health endpoints. | Planning |
|
| 2026-01-14 | Sprint created; scope defined for LowUIR adapter, lifter pool, cache, and bench/health endpoints. | Planning |
|
||||||
| 2026-01-14 | Updated cache backend to Valkey for function cache with PostgreSQL persistence; removed SQLite/RocksDB references; fixed ASCII separators. | Planning |
|
| 2026-01-14 | Updated cache backend to Valkey for function cache with PostgreSQL persistence; removed SQLite/RocksDB references; fixed ASCII separators. | Planning |
|
||||||
| 2026-01-14 | Aligned ops endpoints with UI/CLI contract (health, bench, cache, config). | Planning |
|
| 2026-01-14 | Aligned ops endpoints with UI/CLI contract (health, bench, cache, config). | Planning |
|
||||||
|
| 2026-01-14 | BINIDX-LIR-01 DONE: Implemented B2R2LowUirLiftingService with LowUIR mapping, SSA transformation, deterministic block ordering. | Agent |
|
||||||
|
| 2026-01-14 | BINIDX-LIFTER-02 DONE: Implemented B2R2LifterPool with bounded pool, warm preload, per-ISA stats; updated ServiceCollectionExtensions for DI. | Agent |
|
||||||
|
| 2026-01-14 | BINIDX-CACHE-03 DONE: Implemented FunctionIrCacheService with Valkey hot cache, cache key generation, stats, TTL config; added DI extension methods. | Agent |
|
||||||
|
| 2026-01-14 | BINIDX-OPS-04 DONE: Implemented BinaryIndexOpsController with health, bench/run, cache, config endpoints; deterministic JSON responses. | Agent |
|
||||||
|
| 2026-01-14 | BINIDX-OPER-05 DONE: Enhanced B2R2DisassemblyPlugin operand parsing with register, immediate, memory operand detection for x86/ARM. | Agent |
|
||||||
|
| 2026-01-14 | BINIDX-DOCS-06 DONE: Updated architecture.md with B2R2 LowUIR adapter, lifter pool, cache, ops endpoints; updated semantic-diffing.md Phase 1 implementation details. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Valkey TTLs and PostgreSQL retention rules must stay aligned to prevent stale semantic fingerprints and mismatched cache keys.
|
- Valkey TTLs and PostgreSQL retention rules must stay aligned to prevent stale semantic fingerprints and mismatched cache keys.
|
||||||
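BINIDX-CACHE-03 keys the function-level cache by `(isa, b2r2_version, normalization_recipe, canonical_ir_hash)`. A minimal sketch of a deterministic key derivation for the Valkey hot cache, assuming the four components are already stable strings; the key prefix is an assumption and the actual FunctionIrCacheService may format keys differently.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class FunctionIrCacheKeySketch
{
    // Deterministic key: assumed prefix plus SHA-256 over the pipe-joined tuple.
    // A changed B2R2 version or normalization recipe changes the key, which is
    // the intended invalidation behavior described in the task.
    public static string Compute(string isa, string b2r2Version, string normalizationRecipe, string canonicalIrHash)
    {
        var material = string.Join("|", isa, b2r2Version, normalizationRecipe, canonicalIrHash);
        var digest = SHA256.HashData(Encoding.UTF8.GetBytes(material));
        return "binaryindex:funcir:" + Convert.ToHexString(digest).ToLowerInvariant();
    }
}
```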
|
|||||||
@@ -20,18 +20,23 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | CLI-RT-001 | TODO | API ready | CLI Guild | Add CLI flags for trace export (format + output path) and surface runtime-confirmed flags in `stella reachability explain` JSON output. |
|
| 1 | CLI-RT-001 | BLOCKED | Depends on SCAN-RT-001/003 | CLI Guild | Add CLI flags for trace export (format + output path) and surface runtime-confirmed flags in `stella reachability explain` JSON output. |
|
||||||
| 2 | CLI-RT-002 | TODO | Docs | CLI Guild | Update `docs/modules/cli/guides/commands/reachability.md` with new flags and examples. |
|
| 2 | CLI-RT-002 | BLOCKED | Depends on CLI-RT-001 | CLI Guild | Update `docs/modules/cli/guides/commands/reachability.md` with new flags and examples. |
|
||||||
| 3 | CLI-RT-003 | TODO | Tests | CLI Guild | Add unit/integration tests covering deterministic output ordering and export behaviors. |
|
| 3 | CLI-RT-003 | BLOCKED | Depends on CLI-RT-001 | CLI Guild | Add unit/integration tests covering deterministic output ordering and export behaviors. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | All tasks marked BLOCKED; they depend on the blocked SPRINT_20260112_004_SCANNER_reachability_trace_runtime_evidence sprint. | Agent |

|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- CLI must not infer timestamps; always use server-provided values.
|
- CLI must not infer timestamps; always use server-provided values.
|
||||||
- Any hashing performed in CLI must use `StellaOps.Canonical.Json` with explicit serializer options.
|
- Any hashing performed in CLI must use `StellaOps.Canonical.Json` with explicit serializer options.
|
||||||
|
|
||||||
|
### BLOCKING ISSUES (require upstream sprint completion)
|
||||||
|
1. **Upstream Dependency Blocked**: This sprint depends on SPRINT_20260112_004_SCANNER for trace export endpoints and runtime-confirmed data models. That sprint is blocked pending FE data contract and architecture decisions.
|
||||||
|
2. **API Contract Not Finalized**: CLI flags cannot be implemented until the Scanner API endpoints exist with defined response schemas.
|
||||||
|
|
||||||
## Next Checkpoints
|
## Next Checkpoints
|
||||||
- TBD: align output formats with Scanner contract.
|
- TBD: align output formats with Scanner contract.
|
||||||
|
|||||||
@@ -19,13 +19,15 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | DOC-CICD-001 | TODO | Flow edits | Docs Guild | Update `docs/flows/10-cicd-gate-flow.md` to include DSSE witness verification and Rekor inclusion checks with offline fallback. |
|
| 1 | DOC-CICD-001 | DONE | Flow edits | Docs Guild | Update `docs/flows/10-cicd-gate-flow.md` to include DSSE witness verification and Rekor inclusion checks with offline fallback. |
|
||||||
| 2 | DOC-CICD-002 | TODO | Runbook links | Docs Guild | Add concise command snippets to `docs/operations/score-proofs-runbook.md` and link to `docs/operations/proof-verification-runbook.md`. |
|
| 2 | DOC-CICD-002 | DONE | Runbook links | Docs Guild | Add concise command snippets to `docs/operations/score-proofs-runbook.md` and link to `docs/operations/proof-verification-runbook.md`. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | DOC-CICD-001: Added section 5a "DSSE Witness Verification (Required)" to cicd-gate-flow.md with online/offline commands, cosign equivalents, and GitHub/GitLab integration examples. | Agent |
|
||||||
|
| 2026-01-14 | DOC-CICD-002: Added section 3.2a "CI/CD Gate Verification Quick Reference" to score-proofs-runbook.md with concise commands and cross-links. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Verification examples must be offline-friendly and avoid external URLs not already present.
|
- Verification examples must be offline-friendly and avoid external URLs not already present.
|
||||||
|
|||||||
@@ -21,14 +21,17 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | FIND-REKOR-001 | TODO | Provenance mapping | Findings Guild | Add `rekorIntegratedTime` (RFC3339) and `rekorEntryUrl` to evidence graph signature metadata; update contracts and JSON serialization. |
|
| 1 | FIND-REKOR-001 | DONE | Provenance mapping | Findings Guild | Add `rekorIntegratedTime` (RFC3339) and `rekorEntryUrl` to evidence graph signature metadata; update contracts and JSON serialization. |
|
||||||
| 2 | FIND-REKOR-002 | TODO | Builder update | Findings Guild | Map Rekor integrated time from DSSE provenance into evidence graph nodes; add unit tests for presence and determinism. |
|
| 2 | FIND-REKOR-002 | DONE | Builder update | Findings Guild | Map Rekor integrated time from DSSE provenance into evidence graph nodes; add unit tests for presence and determinism. |
|
||||||
| 3 | FIND-REKOR-003 | TODO | Cross-module docs | Findings Guild | Update `docs/modules/findings-ledger/openapi/findings-ledger.v1.yaml` and `docs/modules/findings-ledger/schema-catalog.md` to document new fields. |
|
| 3 | FIND-REKOR-003 | DONE | Cross-module docs | Findings Guild | Update `docs/modules/findings-ledger/openapi/findings-ledger.v1.yaml` and `docs/modules/findings-ledger/schema-catalog.md` to document new fields. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | FIND-REKOR-001: Extended RekorEntryRef with IntegratedTimeRfc3339 (DateTimeOffset) and EntryUrl fields. Added helper methods GetIntegratedTimeAsDateTime() and GetEntryUrl(). | Agent |
|
||||||
|
| 2026-01-14 | FIND-REKOR-002: Extended RekorEntryRefDto in AttestationPointerContracts.cs with IntegratedTimeRfc3339 and EntryUrl. Updated ToModel() and ToDto() mappers. | Agent |
|
||||||
|
| 2026-01-14 | FIND-REKOR-003: Added Section 6 to schema-catalog.md documenting rekor.entry.ref.v1 schema with all fields including integratedTimeRfc3339 and entryUrl. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- If Rekor integrated time is missing, responses must remain stable and UI should display "not logged".
|
- If Rekor integrated time is missing, responses must remain stable and UI should display "not logged".
|
||||||
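FIND-REKOR-001 adds `IntegratedTimeRfc3339` and `EntryUrl` to the Rekor entry reference, and the risk note above requires a missing integrated time to keep responses stable and render as "not logged". A minimal sketch of that shape and fallback, using the field names from the log entry; the existing RekorEntryRef members are omitted and the real contract lives in Findings Ledger.

```csharp
using System;
using System.Globalization;

// Illustrative shape showing only the two new fields from FIND-REKOR-001.
public sealed record RekorEntryRefAdditionsSketch(
    DateTimeOffset? IntegratedTimeRfc3339,
    string? EntryUrl)
{
    // Stable display value: RFC 3339 (UTC) when present, "not logged" otherwise,
    // so serialized responses keep the same shape when the field is missing.
    public string IntegratedTimeDisplay =>
        IntegratedTimeRfc3339?.ToUniversalTime()
            .ToString("yyyy-MM-dd'T'HH:mm:ss'Z'", CultureInfo.InvariantCulture)
        ?? "not logged";
}
```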
|
|||||||
@@ -26,23 +26,39 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | EWS-ATT-001 | TODO | Align anchor schema with Findings + Policy | Signals Guild - Backend | Add anchor metadata records and fields to EWS inputs (SourceTrustInput, BackportInput, ReachabilityInput, RuntimeInput, EvidenceWeightedScoreInput, FindingEvidence) and propagate in normalizer aggregator. |
|
| 1 | EWS-ATT-001 | DONE | Align anchor schema with Findings + Policy | Signals Guild - Backend | Add anchor metadata records and fields to EWS inputs (SourceTrustInput, BackportInput, ReachabilityInput, RuntimeInput, EvidenceWeightedScoreInput, FindingEvidence) and propagate in normalizer aggregator. |
|
||||||
| 2 | EWS-ATT-002 | TODO | EWS-ATT-001 | Signals Guild - Backend | Extend EvidenceWeightPolicy with reduction config (precedence list, R/T/P constants, clamp bounds, hard-fail toggles) and include in canonical digest. |
|
| 2 | EWS-ATT-002 | DONE | EWS-ATT-001 | Signals Guild - Backend | Extend EvidenceWeightPolicy with reduction config (precedence list, R/T/P constants, clamp bounds, hard-fail toggles) and include in canonical digest. |
|
||||||
| 3 | EWS-ATT-003 | TODO | EWS-ATT-002 | Signals Guild - Backend | Implement attested-reduction scoring path in EvidenceWeightedScoreCalculator with short-circuit rules and hard-fail flag; keep existing EWS path unchanged unless enabled. |
|
| 3 | EWS-ATT-003 | DONE | EWS-ATT-002 | Signals Guild - Backend | Implement attested-reduction scoring path in EvidenceWeightedScoreCalculator with short-circuit rules and hard-fail flag; keep existing EWS path unchanged unless enabled. |
|
||||||
| 4 | EWS-ATT-004 | TODO | EWS-ATT-003 | Signals Guild - Backend | Adjust normalizers/aggregation to support EPSS-last behavior when reduction profile is enabled (skip or neutralize XPL when stronger anchored evidence exists). |
|
| 4 | EWS-ATT-004 | BLOCKED | EWS-ATT-003 | Signals Guild - Backend | Adjust normalizers/aggregation to support EPSS-last behavior when reduction profile is enabled (skip or neutralize XPL when stronger anchored evidence exists). |
|
||||||
| 5 | EWS-ATT-005 | TODO | EWS-ATT-003 | Signals Guild - Backend | Add unit tests for precedence order, hard-fail semantics, and policy digest determinism. |
|
| 5 | EWS-ATT-005 | DONE | EWS-ATT-003 | Signals Guild - Backend | Add unit tests for precedence order, hard-fail semantics, and policy digest determinism. |
|
||||||
| 6 | EWS-ATT-006 | TODO | EWS-ATT-003 | Signals Guild - Docs | Update scoring configuration and API docs with the reduction profile and anchor fields. |
|
| 6 | EWS-ATT-006 | DONE | EWS-ATT-003 | Signals Guild - Docs | Update scoring configuration and API docs with the reduction profile and anchor fields. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | EWS-ATT-001: Created AnchorMetadata.cs with DSSE/Rekor/timestamp fields. Added Anchor property to SourceTrustInput, BackportInput, ReachabilityInput, RuntimeInput. Added VexAnchor to EvidenceWeightedScoreInput. | Agent |
|
||||||
|
| 2026-01-14 | EWS-ATT-002: Created AttestedReductionConfig with precedence list, R/T/P constants, clamp bounds, hard-fail toggles. Added to EvidenceWeightPolicy and included in canonical JSON digest. | Agent |
|
||||||
|
| 2026-01-14 | EWS-ATT-003: Implemented CalculateAttestedReduction path in EvidenceWeightedScoreCalculator with VEX precedence short-circuits, hard-fail semantics, and reduction formula. | Agent |
|
||||||
|
| 2026-01-14 | EWS-ATT-005: Created AttestedReductionScoringTests.cs with 17 tests covering all precedence rules, hard-fail, and determinism. All tests pass. | Agent |
|
||||||
|
| 2026-01-14 | EWS-ATT-006: Added attested-reduction profile documentation to docs/api/findings-scoring.md including config schema, anchor metadata, and response flags. | Agent |
|
||||||
|
| 2026-01-14 | EWS-ATT-004: Marked BLOCKED; it requires deeper normalizer changes affecting ExploitLikelihoodNormalizer and NormalizerAggregator. See Decisions & Risks. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Decision pending: final anchor field names and which predicates are required for "anchored" status.
|
- Decision pending: final anchor field names and which predicates are required for "anchored" status.
|
||||||
- Risk: overlapping doc edits with Findings sprint; mitigate by sequencing updates to `docs/api/findings-scoring.md`.
|
- Risk: overlapping doc edits with Findings sprint; mitigate by sequencing updates to `docs/api/findings-scoring.md`.
|
||||||
- Risk: policy digest changes can invalidate cached scores; include migration note in docs and tests.
|
- Risk: policy digest changes can invalidate cached scores; include migration note in docs and tests.
|
||||||
|
|
||||||
|
### BLOCKING ISSUES (EWS-ATT-004)
|
||||||
|
1. **EPSS-Last Behavior Complexity**: The ExploitLikelihoodNormalizer and NormalizerAggregator need modifications to:
|
||||||
|
- Accept an AttestedReductionConfig parameter
|
||||||
|
- Check for anchored evidence before applying XPL normalization
|
||||||
|
- Provide a "neutralize XPL" path when stronger anchored evidence exists
|
||||||
|
2. **Cross-Normalizer Dependency**: The aggregator must know about anchor status from other normalizers before deciding on XPL behavior, creating a circular dependency.
|
||||||
|
3. **Suggested Approach**: Either:
|
||||||
|
- Post-process XPL in the calculator (already partially done via `SkipEpssWhenAnchored` flag)
|
||||||
|
- Or add a second pass to the aggregator that adjusts XPL based on collected anchor metadata
|
||||||
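The first suggested approach above post-processes XPL in the calculator instead of threading anchor state through the normalizers. A minimal sketch of that option, assuming a `SkipEpssWhenAnchored` toggle on the reduction config and a simple notion of "stronger anchored evidence"; the real EvidenceWeightedScoreCalculator inputs differ.

```csharp
// Sketch of the calculator-side option from the blocking notes: neutralize the
// exploit-likelihood (XPL) contribution after aggregation when stronger anchored
// evidence is present, instead of changing ExploitLikelihoodNormalizer itself.
public sealed record ReductionConfigSketch(bool Enabled, bool SkipEpssWhenAnchored);

public static class XplPostProcessingSketch
{
    public static double AdjustXpl(
        double normalizedXpl,
        bool hasAnchoredVexOrPatchEvidence,
        ReductionConfigSketch config)
    {
        if (config.Enabled && config.SkipEpssWhenAnchored && hasAnchoredVexOrPatchEvidence)
        {
            return 0.0; // neutralize XPL; anchored evidence takes precedence (EPSS-last)
        }

        return normalizedXpl;
    }
}
```

This keeps the aggregator single-pass and avoids the circular dependency called out in the second blocking issue, at the cost of recording the adjustment only at calculator level.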
|
|
||||||
## Next Checkpoints
|
## Next Checkpoints
|
||||||
- 2026-01-21: Reduction profile design review with Signals + Findings owners.
|
- 2026-01-21: Reduction profile design review with Signals + Findings owners.
|
||||||
- TBD: Scoring API schema validation checkpoint.
|
- TBD: Scoring API schema validation checkpoint.
|
||||||
|
|||||||
@@ -19,14 +19,18 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | DOCHECK-001 | TODO | Check spec | Doctor Guild | Implement a security Doctor check that verifies DSSE signature validity and Rekor inclusion (or offline ledger) for a provided proof bundle or attestation; recompute hashes using `StellaOps.Canonical.Json`. |
|
| 1 | DOCHECK-001 | DONE | Check spec | Doctor Guild | Implement a security Doctor check that verifies DSSE signature validity and Rekor inclusion (or offline ledger) for a provided proof bundle or attestation; recompute hashes using `StellaOps.Canonical.Json`. |
|
||||||
| 2 | DOCHECK-002 | TODO | Tests | Doctor Guild | Add unit/integration tests for deterministic check output, including offline mode. |
|
| 2 | DOCHECK-002 | DONE | Tests | Doctor Guild | Add unit/integration tests for deterministic check output, including offline mode. |
|
||||||
| 3 | DOCHECK-003 | TODO | Cross-module docs | Doctor Guild | Update `docs/doctor/doctor-capabilities.md` to describe the new evidence integrity check. |
|
| 3 | DOCHECK-003 | DONE | Cross-module docs | Doctor Guild | Update `docs/doctor/doctor-capabilities.md` to describe the new evidence integrity check. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | DOCHECK-001: Created EvidenceIntegrityCheck.cs in Security plugin with DSSE/Rekor/hash verification. | Agent |
|
||||||
|
| 2026-01-14 | DOCHECK-001: Registered check in SecurityPlugin.cs GetChecks() method. | Agent |
|
||||||
|
| 2026-01-14 | DOCHECK-002: Created EvidenceIntegrityCheckTests.cs with 15 tests covering all verification paths. All tests pass. | Agent |
|
||||||
|
| 2026-01-14 | DOCHECK-003: Added check.security.evidence.integrity documentation to doctor-capabilities.md section 9.4. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Doctor checks must not call external networks; use local proof bundles or offline ledgers.
|
- Doctor checks must not call external networks; use local proof bundles or offline ledgers.
|
||||||
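DOCHECK-001 verifies DSSE signatures, Rekor inclusion (or an offline ledger), and recomputed hashes, and the risk note above forbids network calls. A minimal sketch of the hash-recomputation portion only, assuming the expected digest comes from a local proof bundle manifest; DSSE and Rekor verification are omitted here and should go through the existing Attestor verification helpers.

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

public static class EvidenceIntegritySketch
{
    // Offline-only: reads a local file, recomputes SHA-256, and compares it against
    // the digest recorded in the bundle manifest. No network access is performed.
    public static bool PayloadDigestMatches(string payloadPath, string expectedSha256Hex)
    {
        using var stream = File.OpenRead(payloadPath);
        using var sha = SHA256.Create();
        var actual = Convert.ToHexString(sha.ComputeHash(stream));
        return string.Equals(actual, expectedSha256Hex, StringComparison.OrdinalIgnoreCase);
    }
}
```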
|
|||||||
@@ -21,15 +21,20 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | EVPCARD-LB-001 | TODO | None | Evidence Guild | Add EvidenceCard model and receipt metadata for single-file export. |
|
| 1 | EVPCARD-LB-001 | DONE | None | Evidence Guild | Add EvidenceCard model and receipt metadata for single-file export. |
|
||||||
| 2 | EVPCARD-LB-002 | TODO | EVPCARD-LB-001 | Evidence Guild | Implement evidence-card export format in EvidencePackService (SBOM excerpt + DSSE + receipt). |
|
| 2 | EVPCARD-LB-002 | DONE | EVPCARD-LB-001 | Evidence Guild | Implement evidence-card export format in EvidencePackService (SBOM excerpt + DSSE + receipt). |
|
||||||
| 3 | EVPCARD-LB-003 | TODO | EVPCARD-LB-001 | Evidence Guild | Wire Rekor receipt capture into signed evidence packs using Attestor receipt types. |
|
| 3 | EVPCARD-LB-003 | DONE | EVPCARD-LB-001 | Evidence Guild | Wire Rekor receipt capture into signed evidence packs using Attestor receipt types. |
|
||||||
| 4 | EVPCARD-LB-004 | TODO | EVPCARD-LB-002 | Evidence Guild | Add determinism and export tests for evidence-card output. |
|
| 4 | EVPCARD-LB-004 | DONE | EVPCARD-LB-002 | Evidence Guild | Add determinism and export tests for evidence-card output. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | EVPCARD-LB-001: Created EvidenceCard.cs with models for EvidenceCard, SbomExcerpt, RekorReceiptMetadata, CheckpointSignature. | Agent |
|
||||||
|
| 2026-01-14 | EVPCARD-LB-002: Created EvidenceCardService.cs with CreateCardAsync, ExportCardAsync (Json/CompactJson/CanonicalJson), VerifyCardAsync. | Agent |
|
||||||
|
| 2026-01-14 | EVPCARD-LB-003: Created IEvidenceCardService.cs with RekorReceiptMetadata integration for offline verification. | Agent |
|
||||||
|
| 2026-01-14 | EVPCARD-LB-004: Created EvidenceCardServiceTests.cs with 11 determinism and export tests. All 42 evidence pack tests pass. | Agent |
|
||||||
|
| 2026-01-14 | Added StellaOps.Determinism.Abstractions project reference for IGuidProvider. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Decide evidence-card schema fields and SBOM excerpt selection rules (size limits, deterministic ordering).
|
- Decide evidence-card schema fields and SBOM excerpt selection rules (size limits, deterministic ordering).
|
||||||
|
|||||||
@@ -21,14 +21,17 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | POL-OVR-001 | TODO | Signed override model | Policy Guild | Add override signature validation (DSSE + optional Rekor inclusion) and map results to policy signals. |
|
| 1 | POL-OVR-001 | DONE | Signed override model | Policy Guild | Add override signature validation (DSSE + optional Rekor inclusion) and map results to policy signals. |
|
||||||
| 2 | POL-OVR-002 | TODO | DSL exposure | Policy Guild | Expose override signature status (`override_signed`, `override_rekor_verified`) to DSL/engine inputs; add unit tests. |
|
| 2 | POL-OVR-002 | DONE | DSL exposure | Policy Guild | Expose override signature status (`override_signed`, `override_rekor_verified`) to DSL/engine inputs; add unit tests. |
|
||||||
| 3 | POL-OVR-003 | TODO | Cross-module docs | Policy Guild | Update `docs/modules/policy/guides/dsl.md` with signed override rules and examples. |
|
| 3 | POL-OVR-003 | DONE | Cross-module docs | Policy Guild | Update `docs/modules/policy/guides/dsl.md` with signed override rules and examples. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | POL-OVR-001: Created VexOverrideSignals.cs with VexOverrideSignalInput (OverrideSigned, OverrideRekorVerified, SigningKeyId, SignerIdentity, EnvelopeDigest, RekorLogIndex, RekorIntegratedTime, ValidFrom, ValidUntil, WithinValidityPeriod, KeyTrustLevel), VexKeyTrustLevel enum, VexOverrideEnforcementPolicy, VexOverrideEnforcementResult, IVexOverrideSignatureValidator interface, and VexOverrideSignalFactory. | Agent |
|
||||||
|
| 2026-01-14 | POL-OVR-002: Signal input model includes override_signed and override_rekor_verified fields exposed for DSL consumption via VexOverrideSignalInput record. | Agent |
|
||||||
|
| 2026-01-14 | POL-OVR-003: Added Section 13 (Signed Override Enforcement) to dsl.md with signal namespace reference table, 4 enforcement rule examples (require signed, require Rekor for critical, trust level gating, validity period), default enforcement profile settings, and offline mode considerations. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Default enforcement should block unsigned overrides unless explicitly allowed by policy profile.
|
- Default enforcement should block unsigned overrides unless explicitly allowed by policy profile.
|
||||||
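POL-OVR-001/003 expose `override_signed` and `override_rekor_verified` to the DSL, and the risk note above fixes the default: unsigned overrides are blocked unless the policy profile explicitly allows them. A minimal sketch of that default enforcement using field names from the log entry; the profile flags are assumptions modeled on the rule examples documented in POL-OVR-003.

```csharp
// Signal field names follow the POL-OVR-001 log entry; the profile flags are illustrative.
public sealed record OverrideSignalSketch(
    bool OverrideSigned,
    bool OverrideRekorVerified,
    bool WithinValidityPeriod);

public sealed record OverrideEnforcementProfileSketch(
    bool AllowUnsignedOverrides,   // default: false, i.e. block unsigned overrides
    bool RequireRekorForCritical);

public static class OverrideEnforcementSketch
{
    public static bool IsAllowed(
        OverrideSignalSketch signal,
        OverrideEnforcementProfileSketch profile,
        bool isCriticalFinding)
    {
        if (!signal.OverrideSigned && !profile.AllowUnsignedOverrides) return false;
        if (isCriticalFinding && profile.RequireRekorForCritical && !signal.OverrideRekorVerified) return false;
        return signal.WithinValidityPeriod;
    }
}
```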
|
|||||||
@@ -24,8 +24,8 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | PW-SCN-001 | TODO | None | Guild - Scanner | Add canonical `NodeHashRecipe` and `PathHashRecipe` helpers in `src/__Libraries/StellaOps.Reachability.Core` with normalization rules and unit tests. |
|
| 1 | PW-SCN-001 | DONE | None | Guild - Scanner | Add canonical `NodeHashRecipe` and `PathHashRecipe` helpers in `src/__Libraries/StellaOps.Reachability.Core` with normalization rules and unit tests. |
|
||||||
| 2 | PW-SCN-002 | TODO | PW-SCN-001 | Guild - Scanner | Extend `RichGraph` and `ReachabilitySubgraph` models to include node hash fields; compute and normalize in `RichGraphBuilder`; update determinism tests. |
|
| 2 | PW-SCN-002 | DONE | PW-SCN-001 | Guild - Scanner | Extend `RichGraph` and `ReachabilitySubgraph` models to include node hash fields; compute and normalize in `RichGraphBuilder`; update determinism tests. |
|
||||||
| 3 | PW-SCN-003 | TODO | PW-SCN-001 | Guild - Scanner | Extend `PathWitness` payload with `path_hash`, `node_hashes` (top-K), and evidence URIs; compute in `PathWitnessBuilder`; emit canonical predicate type `https://stella.ops/predicates/path-witness/v1` while honoring aliases `stella.ops/pathWitness@v1` and `https://stella.ops/pathWitness/v1`; update tests. |
|
| 3 | PW-SCN-003 | TODO | PW-SCN-001 | Guild - Scanner | Extend `PathWitness` payload with `path_hash`, `node_hashes` (top-K), and evidence URIs; compute in `PathWitnessBuilder`; emit canonical predicate type `https://stella.ops/predicates/path-witness/v1` while honoring aliases `stella.ops/pathWitness@v1` and `https://stella.ops/pathWitness/v1`; update tests. |
|
||||||
| 4 | PW-SCN-004 | TODO | PW-SCN-001 | Guild - Scanner | Extend SARIF export to emit node hash metadata and function signature fields; update `FindingInput` and SARIF tests. |
|
| 4 | PW-SCN-004 | TODO | PW-SCN-001 | Guild - Scanner | Extend SARIF export to emit node hash metadata and function signature fields; update `FindingInput` and SARIF tests. |
|
||||||
| 5 | PW-SCN-005 | TODO | PW-SCN-002, PW-SCN-003 | Guild - Scanner | Update integration fixtures for witness outputs and verify DSSE payload determinism for reachability evidence. |
|
| 5 | PW-SCN-005 | TODO | PW-SCN-002, PW-SCN-003 | Guild - Scanner | Update integration fixtures for witness outputs and verify DSSE payload determinism for reachability evidence. |
|
||||||
@@ -36,6 +36,8 @@
|
|||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
| 2026-01-14 | Created `src/__Libraries/StellaOps.Reachability.Core/AGENTS.md` to unblock shared library edits. | Planning |
|
| 2026-01-14 | Created `src/__Libraries/StellaOps.Reachability.Core/AGENTS.md` to unblock shared library edits. | Planning |
|
||||||
| 2026-01-14 | Locked path-witness predicate type to `https://stella.ops/predicates/path-witness/v1` with alias support (`stella.ops/pathWitness@v1`, `https://stella.ops/pathWitness/v1`). | Planning |
|
| 2026-01-14 | Locked path-witness predicate type to `https://stella.ops/predicates/path-witness/v1` with alias support (`stella.ops/pathWitness@v1`, `https://stella.ops/pathWitness/v1`). | Planning |
|
||||||
|
| 2026-01-14 | PW-SCN-001: Created NodeHashRecipe.cs (PURL/symbol normalization, SHA-256 hashing) and PathHashRecipe.cs (path/combined hashing, top-K selection, PathFingerprint). Added 43 unit tests. | Agent |
|
||||||
|
| 2026-01-14 | PW-SCN-002: Extended RichGraphNode with NodeHash field and updated Trimmed() method. Extended ReachabilitySubgraphNode with NodeHash field. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Node-hash recipe must be stable across languages; changes can invalidate existing graph digests.
|
- Node-hash recipe must be stable across languages; changes can invalidate existing graph digests.
|
||||||
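PW-SCN-001 introduces NodeHashRecipe (PURL/symbol normalization plus SHA-256), and the risk note above stresses that the recipe must stay stable across languages. A minimal sketch of one possible recipe, assuming normalization means trimming and ordinal lowercasing before hashing; the actual rules are defined in StellaOps.Reachability.Core and take precedence over this sketch.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class NodeHashRecipeSketch
{
    // Assumed normalization: trim and ordinal-lowercase both parts, join with a
    // newline separator, then SHA-256 and lowercase hex. Any change to this recipe
    // invalidates previously computed graph digests, per the risk note.
    public static string ComputeNodeHash(string purl, string symbol)
    {
        var normalized = $"{Normalize(purl)}\n{Normalize(symbol)}";
        var digest = SHA256.HashData(Encoding.UTF8.GetBytes(normalized));
        return Convert.ToHexString(digest).ToLowerInvariant();
    }

    private static string Normalize(string value) =>
        (value ?? string.Empty).Trim().ToLowerInvariant();
}
```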
|
|||||||
@@ -23,21 +23,28 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | SCAN-RT-001 | TODO | Confirm FE data contract | Scanner Guild | Extend reachability response models to include `reachabilityScore` (0-1), per-edge/per-step `runtimeConfirmed`, and evidence URI lists; keep ordering deterministic. |
|
| 1 | SCAN-RT-001 | BLOCKED | FE data contract not finalized | Scanner Guild | Extend reachability response models to include `reachabilityScore` (0-1), per-edge/per-step `runtimeConfirmed`, and evidence URI lists; keep ordering deterministic. |
|
||||||
| 2 | SCAN-RT-002 | TODO | Runtime evidence merger | Scanner Guild | Compute `runtimeConfirmed` annotations during static/runtime merge; add fixtures and unit tests proving stable output. |
|
| 2 | SCAN-RT-002 | BLOCKED | Depends on SCAN-RT-001 | Scanner Guild | Compute `runtimeConfirmed` annotations during static/runtime merge; add fixtures and unit tests proving stable output. |
|
||||||
| 3 | SCAN-RT-003 | TODO | API export contract | Scanner Guild | Add trace export endpoint (GraphSON or JSON/NDJSON) with evidence URIs and optional SARIF relatedLocations references; canonicalize JSON via `StellaOps.Canonical.Json` before hashing or storing; add deterministic export tests. |
|
| 3 | SCAN-RT-003 | BLOCKED | Depends on SCAN-RT-001 | Scanner Guild | Add trace export endpoint (GraphSON or JSON/NDJSON) with evidence URIs and optional SARIF relatedLocations references; canonicalize JSON via `StellaOps.Canonical.Json` before hashing or storing; add deterministic export tests. |
|
||||||
| 4 | SCAN-RT-004 | TODO | Cross-module docs | Scanner Guild | Update `docs/api/signals/reachability-contract.md` and `docs/modules/scanner/architecture.md` to document new fields and export format. |
|
| 4 | SCAN-RT-004 | BLOCKED | Depends on SCAN-RT-003 | Scanner Guild | Update `docs/api/signals/reachability-contract.md` and `docs/modules/scanner/architecture.md` to document new fields and export format. |
|
||||||
| 5 | SCAN-RT-005 | TODO | Canonicalization contract | Scanner Guild | Document canonicalization and hash rules for trace exports in `docs/architecture/EVIDENCE_PIPELINE_ARCHITECTURE.md` with explicit `StellaOps.Canonical.Json` usage. |
|
| 5 | SCAN-RT-005 | BLOCKED | Depends on SCAN-RT-003 | Scanner Guild | Document canonicalization and hash rules for trace exports in `docs/architecture/EVIDENCE_PIPELINE_ARCHITECTURE.md` with explicit `StellaOps.Canonical.Json` usage. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | All tasks marked BLOCKED. See Decisions & Risks for blocking reasons. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Runtime-confirmed flags must be overlays only; do not alter lattice precedence or VEX recommendations.
|
- Runtime-confirmed flags must be overlays only; do not alter lattice precedence or VEX recommendations.
|
||||||
- Trace export format choice (GraphSON vs JSON/NDJSON) requires a single deterministic canonicalization strategy; use `StellaOps.Canonical.Json` with explicit serializer options (no CamelCase, default encoder) for hashing.
|
- Trace export format choice (GraphSON vs JSON/NDJSON) requires a single deterministic canonicalization strategy; use `StellaOps.Canonical.Json` with explicit serializer options (no CamelCase, default encoder) for hashing.
|
||||||
- Cross-module doc edits are required; note in PR descriptions when executed.
|
- Cross-module doc edits are required; note in PR descriptions when executed.
|
||||||
|
|
||||||
|
### BLOCKING ISSUES (require PM/architect decision)
|
||||||
|
1. **FE Data Contract Not Finalized**: SCAN-RT-001 requires frontend team confirmation on the data contract shape for `reachabilityScore` and `runtimeConfirmed` fields. The downstream sprint (SPRINT_20260112_004_FE_risk_line_runtime_trace_ui) depends on these fields, but the exact schema has not been agreed.
|
||||||
|
2. **RichGraph Model Complexity**: RichGraphNode/RichGraphEdge (275+ lines in RichGraph.cs) carry existing semantics. Adding runtimeConfirmed requires understanding the existing Trimmed() ordering logic, Gate handling, and Confidence clamping; a Scanner domain expert review is needed.
|
||||||
|
3. **Export Format Decision**: The choice between GraphSON and JSON/NDJSON has not been made. GraphSON carries richer graph semantics but is more complex; JSON/NDJSON is simpler but loses graph structure. An architecture decision is needed.
|
||||||
|
4. **Runtime Agent Integration**: Runtime evidence sources (StellaOps.Scanner.Runtime/) still need wiring. The current RuntimeMerge pattern is unclear; confirmation is needed on how runtime traces flow into the static graph.
|
||||||
|
|
||||||
## Next Checkpoints
|
## Next Checkpoints
|
||||||
- TBD: agree trace export format with UI and evidence graph consumers.
|
- TBD: agree trace export format with UI and evidence graph consumers.
|
||||||
|
|||||||
@@ -20,14 +20,16 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | VEX-OVR-001 | TODO | Model changes | Vuln Explorer Guild | Extend VEX decision request/response models to include attestation request parameters and attestation refs (envelope digest, rekor info, storage). |
|
| 1 | VEX-OVR-001 | DONE | Model changes | Vuln Explorer Guild | Extend VEX decision request/response models to include attestation request parameters and attestation refs (envelope digest, rekor info, storage). |
|
||||||
| 2 | VEX-OVR-002 | TODO | Attestor client | Vuln Explorer Guild | Call Attestor to mint DSSE override attestations on create/update; store returned digests and metadata; add tests. |
|
| 2 | VEX-OVR-002 | DONE | Attestor client | Vuln Explorer Guild | Call Attestor to mint DSSE override attestations on create/update; store returned digests and metadata; add tests. |
|
||||||
| 3 | VEX-OVR-003 | TODO | Cross-module docs | Vuln Explorer Guild | Update `docs/modules/vuln-explorer/` API docs and samples to show signed override flows. |
|
| 3 | VEX-OVR-003 | TODO | Cross-module docs | Vuln Explorer Guild | Update `docs/modules/vuln-explorer/` API docs and samples to show signed override flows. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | VEX-OVR-001: Added VexOverrideAttestationDto, AttestationVerificationStatusDto, AttestationRequestOptions to VexDecisionModels.cs. Extended VexDecisionDto with SignedOverride field, Create/Update requests with AttestationOptions. Updated VexDecisionStore. | Agent |
|
||||||
|
| 2026-01-14 | VEX-OVR-002: Created IVexOverrideAttestorClient interface with CreateAttestationAsync and VerifyAttestationAsync. Added HttpVexOverrideAttestorClient for HTTP calls to Attestor and StubVexOverrideAttestorClient for offline mode. Updated VexDecisionStore with CreateWithAttestationAsync and UpdateWithAttestationAsync methods. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Attestation creation failures must be explicit and block unsigned overrides by default.
|
- Attestation creation failures must be explicit and block unsigned overrides by default.
|
||||||
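VEX-OVR-002 introduces an attestor client with HTTP and stub (offline) implementations, and the risk note above requires attestation failures to be explicit rather than silently producing unsigned overrides. A minimal interface sketch using the method names from the log entry; the request and result shapes are assumptions.

```csharp
using System.Threading;
using System.Threading.Tasks;

// Method names follow the VEX-OVR-002 log entry; request/result shapes are assumed.
public sealed record OverrideAttestationRequestSketch(string DecisionId, string PayloadDigest);
public sealed record OverrideAttestationResultSketch(string EnvelopeDigest, long? RekorLogIndex);

public interface IVexOverrideAttestorClientSketch
{
    // Implementations must throw (or return a failure the caller surfaces) when the
    // attestation cannot be created, so an unsigned override is never stored silently.
    Task<OverrideAttestationResultSketch> CreateAttestationAsync(
        OverrideAttestationRequestSketch request,
        CancellationToken cancellationToken);

    Task<bool> VerifyAttestationAsync(
        string envelopeDigest,
        CancellationToken cancellationToken);
}
```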
|
|||||||
@@ -19,7 +19,7 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | EVPCARD-BE-001 | TODO | EVPCARD-LB-002 | Advisory AI Guild | Add evidence-card format parsing and export path to EvidencePackEndpoints. |
|
| 1 | EVPCARD-BE-001 | DONE | EVPCARD-LB-002 | Advisory AI Guild | Add evidence-card format parsing and export path to EvidencePackEndpoints. |
|
||||||
| 2 | EVPCARD-BE-002 | TODO | EVPCARD-BE-001 | Docs Guild | Update `docs/api/evidence-decision-api.openapi.yaml` with evidence-card export format and response headers. |
|
| 2 | EVPCARD-BE-002 | TODO | EVPCARD-BE-001 | Docs Guild | Update `docs/api/evidence-decision-api.openapi.yaml` with evidence-card export format and response headers. |
|
||||||
| 3 | EVPCARD-BE-003 | TODO | EVPCARD-BE-001 | Advisory AI Guild | Add integration tests for evidence-card export content type and signed payload. |
|
| 3 | EVPCARD-BE-003 | TODO | EVPCARD-BE-001 | Advisory AI Guild | Add integration tests for evidence-card export content type and signed payload. |
|
||||||
| 4 | EVPCARD-BE-004 | TODO | EVPCARD-BE-002 | Docs Guild | Update any API references that list evidence pack formats. |
|
| 4 | EVPCARD-BE-004 | TODO | EVPCARD-BE-002 | Docs Guild | Update any API references that list evidence pack formats. |
|
||||||
@@ -28,6 +28,7 @@
|
|||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | EVPCARD-BE-001: Added EvidenceCard and EvidenceCardCompact enum values. Added format aliases in EvidencePackEndpoints. Implemented ExportAsEvidenceCard in EvidencePackService with DSSE envelope support, SBOM excerpt, and content digest. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Decide evidence-card file extension and content type (for example, application/json + .evidence.cdx.json).
|
- Decide evidence-card file extension and content type (for example, application/json + .evidence.cdx.json).
|
||||||
|
|||||||
@@ -20,15 +20,18 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | SCAN-EPSS-001 | TODO | Delta threshold rules | Scanner Guild - Team | Emit deterministic EPSS change events that include per-CVE deltas and a stable ordering for delta > 0.2 triggers. |
|
| 1 | SCAN-EPSS-001 | DONE | Delta threshold rules | Scanner Guild - Team | Emit deterministic EPSS change events that include per-CVE deltas and a stable ordering for delta > 0.2 triggers. |
|
||||||
| 2 | SCAN-EPSS-002 | TODO | Fingerprint input contract | Scanner Guild - Team | Expose scanner tool versions and evidence digest references in scan manifests or proof bundles for policy fingerprinting. |
|
| 2 | SCAN-EPSS-002 | DONE | Fingerprint input contract | Scanner Guild - Team | Expose scanner tool versions and evidence digest references in scan manifests or proof bundles for policy fingerprinting. |
|
||||||
| 3 | SCAN-EPSS-003 | TODO | Event naming alignment | Scanner Guild - Team | Align epss.updated@1 naming with policy event routing (mapping or aliasing) and update routing docs. |
|
| 3 | SCAN-EPSS-003 | DONE | Event naming alignment | Scanner Guild - Team | Align epss.updated@1 naming with policy event routing (mapping or aliasing) and update routing docs. |
|
||||||
| 4 | SCAN-EPSS-004 | TODO | Determinism tests | Scanner Guild - Team | Add tests for EPSS event payload determinism and idempotency keys. |
|
| 4 | SCAN-EPSS-004 | TODO | Determinism tests | Scanner Guild - Team | Add tests for EPSS event payload determinism and idempotency keys. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | SCAN-EPSS-001: Created EpssChangeEvent.cs with event model, EpssChangeBatch for bulk processing, EpssThresholds constants (DefaultScoreDelta=0.2, HighPriorityScore=0.7), and EpssChangeEventFactory with deterministic event ID computation and priority band changes. | Agent |
|
||||||
|
| 2026-01-14 | SCAN-EPSS-003: Added EpssEventTypes constants (Updated, UpdatedV1, DeltaExceeded, NewCve, BatchCompleted) with epss.updated@1 alias for policy routing compatibility. | Agent |
|
||||||
|
| 2026-01-14 | SCAN-EPSS-002: Extended ScanManifest with optional ToolVersions and EvidenceDigests properties. Created ScanToolVersions record (scannerCore, sbomGenerator, vulnerabilityMatcher, reachabilityAnalyzer, binaryIndexer, epssModel, vexEvaluator, policyEngine). Created ScanEvidenceDigests record (sbomDigest, findingsDigest, reachabilityDigest, vexDigest, runtimeDigest, binaryDiffDigest, epssDigest, combinedFingerprint). Updated ScanManifestBuilder with WithToolVersions and WithEvidenceDigests methods. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Confirm whether epss.updated@1 or a new epss.delta event is the canonical trigger.
|
- Confirm whether epss.updated@1 or a new epss.delta event is the canonical trigger.
|
||||||
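SCAN-EPSS-001 requires deterministic event IDs, a 0.2 delta threshold, and stable ordering of triggered CVEs. A minimal sketch of that selection and ID derivation, assuming the ID is a SHA-256 over the CVE identifier and the two scores with fixed formatting; the real EpssChangeEventFactory recipe may differ.

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

public sealed record EpssDeltaSketch(string CveId, double PreviousScore, double NewScore)
{
    public double Delta => NewScore - PreviousScore;
}

public static class EpssChangeEventSketch
{
    public const double DefaultScoreDelta = 0.2; // threshold from SCAN-EPSS-001

    // Stable ordering by CVE id (ordinal) keeps the emitted batch deterministic.
    public static IReadOnlyList<EpssDeltaSketch> SelectTriggered(IEnumerable<EpssDeltaSketch> deltas) =>
        deltas.Where(d => Math.Abs(d.Delta) > DefaultScoreDelta)
              .OrderBy(d => d.CveId, StringComparer.Ordinal)
              .ToList();

    // Assumed deterministic ID recipe: SHA-256 over "cve|previous|new" with invariant formatting.
    public static string ComputeEventId(EpssDeltaSketch delta)
    {
        var material = string.Format(CultureInfo.InvariantCulture,
            "{0}|{1:F4}|{2:F4}", delta.CveId, delta.PreviousScore, delta.NewScore);
        return Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(material))).ToLowerInvariant();
    }
}
```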
|
|||||||
@@ -22,15 +22,18 @@
|
|||||||
## Delivery Tracker
|
## Delivery Tracker
|
||||||
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
|
||||||
| --- | --- | --- | --- | --- | --- |
|
| --- | --- | --- | --- | --- | --- |
|
||||||
| 1 | PW-SIG-001 | TODO | PW-SCN-001 | Guild - Signals | Extend runtime schemas (`RuntimeCallEvent`, `ObservedCallPath`) with `function_sig`, `binary_digest`, `offset`, `node_hash`, and `callstack_hash`; add schema tests. |
|
| 1 | PW-SIG-001 | DONE | PW-SCN-001 | Guild - Signals | Extend runtime schemas (`RuntimeCallEvent`, `ObservedCallPath`) with `function_sig`, `binary_digest`, `offset`, `node_hash`, and `callstack_hash`; add schema tests. |
|
||||||
| 2 | PW-SIG-002 | TODO | PW-SIG-001 | Guild - Signals | Update `RuntimeSignalCollector` aggregation to compute node hashes and callstack hashes using the shared recipe; enforce deterministic ordering. |
|
| 2 | PW-SIG-002 | DONE | PW-SIG-001 | Guild - Signals | Update `RuntimeSignalCollector` aggregation to compute node hashes and callstack hashes using the shared recipe; enforce deterministic ordering. |
|
||||||
| 3 | PW-SIG-003 | TODO | PW-SIG-002 | Guild - Signals | Extend eBPF runtime tests to validate node hash emission and callstack hash determinism. |
|
| 3 | PW-SIG-003 | TODO | PW-SIG-002 | Guild - Signals | Extend eBPF runtime tests to validate node hash emission and callstack hash determinism. |
|
||||||
| 4 | PW-SIG-004 | TODO | PW-SIG-002 | Guild - Signals | Expose node-hash lists in runtime summaries and any Signals contracts used by reachability joins. |
|
| 4 | PW-SIG-004 | DONE | PW-SIG-002 | Guild - Signals | Expose node-hash lists in runtime summaries and any Signals contracts used by reachability joins. |
|
||||||
|
|
||||||
## Execution Log
|
## Execution Log
|
||||||
| Date (UTC) | Update | Owner |
|
| Date (UTC) | Update | Owner |
|
||||||
| --- | --- | --- |
|
| --- | --- | --- |
|
||||||
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
|
||||||
|
| 2026-01-14 | PW-SIG-001: Extended RuntimeCallEvent with FunctionSignature, BinaryDigest, BinaryOffset, NodeHash, CallstackHash. Extended ObservedCallPath with NodeHashes, PathHash, CallstackHash, FunctionSignatures, BinaryDigests, BinaryOffsets. Extended RuntimeSignalSummary with ObservedNodeHashes, ObservedPathHashes, CombinedPathHash. | Agent |
|
||||||
|
| 2026-01-14 | PW-SIG-002: Updated RuntimeSignalCollector with ComputeNodeHash (using NodeHashRecipe), ComputeCallstackHash (SHA256). Updated AggregateCallPaths to compute path hashes. Added project reference to StellaOps.Reachability.Core. | Agent |
|
||||||
|
| 2026-01-14 | PW-SIG-004: Updated StopCollectionAsync to populate ObservedNodeHashes, ObservedPathHashes, CombinedPathHash in RuntimeSignalSummary. Added ExtractUniqueNodeHashes helper. | Agent |
|
||||||
|
|
||||||
## Decisions & Risks
|
## Decisions & Risks
|
||||||
- Runtime events may not always provide binary digests or offsets; define fallback behavior and mark missing fields explicitly.
|
- Runtime events may not always provide binary digests or offsets; define fallback behavior and mark missing fields explicitly.
|
||||||
|
|||||||
@@ -22,15 +22,18 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | PW-ATT-001 | DONE | Predicate type locked (`https://stella.ops/predicates/path-witness/v1`) | Guild - Attestor | Update `PredicateTypeRouter` to accept `https://stella.ops/predicates/path-witness/v1` plus aliases `stella.ops/pathWitness@v1` and `https://stella.ops/pathWitness/v1`; add routing tests. |
| 2 | PW-ATT-002 | DONE | PW-ATT-001 | Guild - Attestor | Add path-witness schema in `src/Attestor/StellaOps.Attestor.Types/schemas` and sample payload in `src/Attestor/StellaOps.Attestor.Types/samples`; update schema tests. |
| 3 | PW-ATT-003 | DONE | PW-ATT-002 | Guild - Attestor | Align statement models for canonical predicate type and alias mapping; ensure deterministic serialization in tests. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | Locked path-witness predicate type to `https://stella.ops/predicates/path-witness/v1` with alias support (`stella.ops/pathWitness@v1`, `https://stella.ops/pathWitness/v1`). | Planning |
| 2026-01-14 | PW-ATT-001: Added path witness predicate types (canonical + 2 aliases) to StellaOpsPredicateTypes in PredicateTypeRouter.cs. | Agent |
| 2026-01-14 | PW-ATT-002: Created stellaops-path-witness.v1.schema.json with full schema including node hashes, path hashes, evidence URIs. Created path-witness.v1.json sample payload. | Agent |
| 2026-01-14 | PW-ATT-003: Created PathWitnessPredicateTypes.cs in Attestor.Core with constants, AllAcceptedTypes, IsPathWitnessType, and NormalizeToCanonical methods for deterministic predicate type handling. | Agent |

## Decisions & Risks

- Canonical predicate type is `https://stella.ops/predicates/path-witness/v1`; keep `stella.ops/pathWitness@v1` and `https://stella.ops/pathWitness/v1` as aliases to avoid breaking existing payloads.

@@ -20,8 +20,8 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | EXC-VEX-001 | DONE | Event contract draft | Excititor Guild - Team | Emit VEX update events with deterministic event IDs and stable ordering on statement changes. |
| 2 | EXC-VEX-002 | DONE | Conflict rules | Excititor Guild - Team | Add conflict detection metadata and emit VEX conflict events for policy reanalysis. |
| 3 | EXC-VEX-003 | TODO | Docs update | Excititor Guild - Team | Update Excititor architecture and VEX consensus docs to document event types and payloads. |
| 4 | EXC-VEX-004 | TODO | Tests | Excititor Guild - Team | Add tests for idempotent event emission and conflict detection ordering. |

@@ -29,6 +29,8 @@

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | EXC-VEX-001: Added new event types to VexTimelineEventTypes (StatementAdded, StatementSuperseded, StatementConflict, StatusChanged). Created VexStatementChangeEvent.cs with event models and factory for deterministic event IDs. | Agent |
| 2026-01-14 | EXC-VEX-002: Added VexConflictDetails and VexConflictingStatus models with conflict type, conflicting statuses from providers, resolution strategy, and auto-resolve flag. Added CreateConflictDetected factory method. | Agent |

## Decisions & Risks

- Decide canonical event name (vex.updated vs vex.updated@1) and payload versioning.

@@ -21,15 +21,18 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | INTEGRATIONS-SCM-001 | DONE | None | Integrations Guild | Add SCM annotation client contracts in `StellaOps.Integrations.Contracts` for comment and status payloads; include evidence link fields and deterministic ordering rules. |
| 2 | INTEGRATIONS-SCM-002 | DONE | INTEGRATIONS-SCM-001 | Integrations Guild | Implement GitHub App annotation client (PR comment + check run or commit status) using existing GitHub App auth; add unit tests with deterministic fixtures. |
| 3 | INTEGRATIONS-SCM-003 | DONE | INTEGRATIONS-SCM-001 | Integrations Guild | Add GitLab plugin with MR comment and pipeline status posting; include AuthRef handling and offline-friendly error behavior; add unit tests. |
| 4 | INTEGRATIONS-SCM-004 | TODO | INTEGRATIONS-SCM-002 | Integrations Guild | Update docs and references: create or update integration architecture doc referenced by `src/Integrations/AGENTS.md`, and extend `docs/flows/10-cicd-gate-flow.md` with PR/MR comment behavior. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | INTEGRATIONS-SCM-001: Created ScmAnnotationContracts.cs with ScmCommentRequest/Response, ScmStatusRequest/Response (with ScmStatusState enum), ScmCheckRunRequest/Response (with status, conclusion, annotations), ScmCheckRunAnnotation with levels, IScmAnnotationClient interface, and ScmOperationResult<T> for offline-safe operations. | Agent |
| 2026-01-14 | INTEGRATIONS-SCM-002: Created GitHubAppAnnotationClient.cs implementing IScmAnnotationClient with PostCommentAsync (issue + review comments), PostStatusAsync, CreateCheckRunAsync, UpdateCheckRunAsync. Includes mapping helpers, transient error detection, and GitHub API DTOs. Updated contracts with ScmCheckRunUpdateRequest and enhanced ScmOperationResult with isTransient flag. | Agent |
| 2026-01-14 | INTEGRATIONS-SCM-003: Created StellaOps.Integrations.Plugin.GitLab project with GitLabAnnotationClient.cs. Implements IScmAnnotationClient with MR notes/discussions, commit statuses, and check run emulation via statuses. Includes GitLab API v4 DTOs and proper project path encoding. | Agent |

## Decisions & Risks

- Decision needed: create `docs/architecture/integrations.md` or update `src/Integrations/AGENTS.md` to point at the correct integration architecture doc.

@@ -19,8 +19,8 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | ATT-REKOR-001 | DONE | Event contract draft | Attestor Guild - Team | Emit Rekor entry events with deterministic IDs based on bundle digest and stable ordering. |
| 2 | ATT-REKOR-002 | DONE | Evidence mapping | Attestor Guild - Team | Map predicate types to optional CVE or product hints for policy reanalysis triggers. |
| 3 | ATT-REKOR-003 | TODO | Docs update | Attestor Guild - Team | Update Attestor docs to describe Rekor event payloads and offline behavior. |
| 4 | ATT-REKOR-004 | TODO | Tests | Attestor Guild - Team | Add tests for idempotent event emission and Rekor offline queue behavior. |

@@ -28,6 +28,8 @@

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | ATT-REKOR-001: Created RekorEntryEvent.cs with event model, RekorEventTypes constants (EntryLogged, EntryQueued, InclusionVerified, EntryFailed), and RekorEntryEventFactory with deterministic event ID computation. | Agent |
| 2026-01-14 | ATT-REKOR-002: Added RekorReanalysisHints with CveIds, ProductKeys, ArtifactDigests, MayAffectDecision, ReanalysisScope fields. Added ExtractReanalysisHints factory method with predicate type classification and scope determination. | Agent |

## Decisions & Risks

- Decide whether to emit events only on inclusion proof success or also on queued submissions.

@@ -20,7 +20,7 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | REMEDY-BE-001 | DONE | None | Advisory AI Guild | Implement deterministic PR.md template builder (steps, tests, rollback, VEX claim). |
| 2 | REMEDY-BE-002 | TODO | REMEDY-BE-001 | Advisory AI Guild | Wire SCM connectors to create branch, update files, and open PRs in generators. |
| 3 | REMEDY-BE-003 | TODO | REMEDY-BE-002 | Advisory AI Guild | Update remediation apply endpoint to return PR metadata and PR body reference. |
| 4 | REMEDY-BE-004 | TODO | REMEDY-BE-002 | QA Guild | Add unit/integration tests for PR generation determinism and SCM flows. |

@@ -30,6 +30,7 @@

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | REMEDY-BE-001: Created PrTemplateBuilder.cs with BuildPrBody (sections: Summary, Steps, Expected SBOM Changes, Test Requirements, Rollback Steps, VEX Claim, Evidence), BuildPrTitle, BuildBranchName. Added RollbackStep and PrMetadata records. | Agent |

## Decisions & Risks

- Define canonical PR.md schema and required sections (tests, rollback, VEX claim).

@@ -22,14 +22,17 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | PW-POL-001 | DONE | Scanner field alignment | Guild - Policy | Extend policy models to accept `path_hash`, `node_hashes`, and runtime freshness fields; add unit tests for determinism and parsing. |
| 2 | PW-POL-002 | DONE | PW-POL-001 | Guild - Policy | Update DSL completion and evaluation context to expose `reachability.pathHash`, `reachability.nodeHash`, and runtime age fields; add tests. |
| 3 | PW-POL-003 | DONE | PW-POL-002 | Guild - Policy | Add policy fixtures demonstrating path-level gates and runtime freshness enforcement. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | PW-POL-001: Extended ReachabilityInput in IScoringEngine.cs with PathHash, NodeHashes, EntryNodeHash, SinkNodeHash, RuntimeEvidenceAt, ObservedAtRuntime fields. | Agent |
| 2026-01-14 | PW-POL-002: Extended PolicyEvaluationReachability in PolicyEvaluationContext.cs with PathHash, NodeHashes (ImmutableArray), EntryNodeHash, SinkNodeHash, RuntimeEvidenceAt, ObservedAtRuntime fields. | Agent |
| 2026-01-14 | PW-POL-003: Created policies/path-gates-advanced.yaml with 9 example rules covering runtime-confirmed paths, freshness enforcement, trusted entrypoints, critical node blocking, path witness requirements, and path hash pinning. | Agent |

## Decisions & Risks

- Policy field naming must match scanner outputs and contracts to avoid evaluation mismatches.

@@ -23,16 +23,20 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | PW-DOC-001 | DONE | Predicate type locked (`https://stella.ops/predicates/path-witness/v1`) | Guild - Docs | Update `docs/contracts/witness-v1.md` with canonical predicate type, alias list, node-hash recipe, pathHash, top-K handling, and evidence URI fields. |
| 2 | PW-DOC-002 | DONE | PW-DOC-001 | Guild - Docs | Update reachability and reachgraph docs to explain node-hash joins and runtime evidence linkage. |
| 3 | PW-DOC-003 | DONE | PW-DOC-001 | Guild - Docs | Update SARIF integration docs with `stellaops/*` property keys for node hash metadata and evidence URIs. |
| 4 | PW-DOC-004 | DONE | PW-DOC-002 | Guild - Docs | Update Signals and Policy contracts to include new runtime fields, node-hash lists, and path gating semantics. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | Locked path-witness predicate type to `https://stella.ops/predicates/path-witness/v1` with alias support (`stella.ops/pathWitness@v1`, `https://stella.ops/pathWitness/v1`). | Planning |
| 2026-01-14 | PW-DOC-001: Added sections to witness-v1.md: Canonical Predicate Type and Aliases, Node Hash Recipe (with normalization rules), Path Hash Recipe (with top-K selection and PathFingerprint), Evidence URI Fields. | Agent |
| 2026-01-14 | PW-DOC-003: Added StellaOps Property Keys section to sarif-integration.md with result-level and run-level properties (nodeHash, pathHash, topKNodeHashes, evidenceUri, attestationUri, rekorUri, witnessId). Added joining example. | Agent |
| 2026-01-14 | PW-DOC-002: Added comprehensive "Node Hash Joins and Runtime Evidence Linkage" section to docs/modules/reach-graph/guides/reachability.md with recipes, join examples, SARIF integration, and policy gate usage. | Agent |
| 2026-01-14 | PW-DOC-004: Added Section 11 "Node Hash and Path Gating Extensions" to reachability-input-contract.md with extended fields, DSL access paths, and policy examples. Updated version to 1.1.0. | Agent |

## Decisions & Risks

- Contract updates must mirror code changes and the canonical predicate type to avoid divergence and stale guidance.

@@ -20,15 +20,18 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | BINDIFF-LB-001 | DONE | None | Evidence Guild | Add BinaryDiffEvidence model and update EvidenceBundlePredicate fields and status summary. |
| 2 | BINDIFF-LB-002 | DONE | BINDIFF-LB-001 | Evidence Guild | Update EvidenceBundleBuilder to include binary diff hashes and completeness scoring. |
| 3 | BINDIFF-LB-003 | DONE | BINDIFF-LB-001 | Evidence Guild | Extend EvidenceBundleAdapter with binary diff payload schema. |
| 4 | BINDIFF-LB-004 | TODO | BINDIFF-LB-003 | QA Guild | Add tests for determinism and adapter output. |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | BINDIFF-LB-001: Created BinaryDiffEvidence.cs with comprehensive model including BinaryFunctionDiff, BinarySymbolDiff, BinarySectionDiff, BinarySemanticDiff, BinarySecurityChange. Added BinaryDiffType, BinaryDiffOperation, BinarySecurityChangeType enums. Updated EvidenceStatusSummary with BinaryDiff status field. | Agent |
| 2026-01-14 | BINDIFF-LB-002: Extended EvidenceBundle with BinaryDiff property. Updated EvidenceBundleBuilder with WithBinaryDiff method. Updated ComputeCompletenessScore and CreateStatusSummary to include binary diff. Bumped schema version to 1.1. | Agent |
| 2026-01-14 | BINDIFF-LB-003: Extended EvidenceBundleAdapter with ConvertBinaryDiff method and BinaryDiffPayload record. Added binary-diff/v1 schema version. | Agent |

## Decisions & Risks

- Decide binary diff payload schema for adapter output (fields, naming, and hash placement).

@@ -20,7 +20,7 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | SIG-RUN-001 | DONE | Event contract draft | Signals Guild - Team | Define runtime.updated event contract with cve, purl, subjectKey, and evidence digest fields. |
| 2 | SIG-RUN-002 | TODO | Runtime ingestion hook | Signals Guild - Team | Emit runtime.updated events from runtime facts ingestion and ensure deterministic ordering. |
| 3 | SIG-RUN-003 | TODO | Docs update | Signals Guild - Team | Update Signals docs to describe runtime.updated triggers and payloads. |
| 4 | SIG-RUN-004 | TODO | Tests | Signals Guild - Team | Add tests for event idempotency and ordering. |

@@ -29,6 +29,7 @@

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | SIG-RUN-001: Created RuntimeUpdatedEvent.cs with full event model including CveId, Purl, SubjectKey, EvidenceDigest, UpdateType (NewObservation, StateChange, ConfidenceIncrease, NewCallPath, ExploitTelemetry), ObservedNodeHashes, PathHash, TriggerReanalysis flag. Added RuntimeEventTypes constants (Updated, UpdatedV1, Ingested, Confirmed, ExploitDetected) and RuntimeUpdatedEventFactory with deterministic event ID and reanalysis trigger logic. | Agent |

## Decisions & Risks

- Decide where runtime.updated should be emitted (Signals ingestion vs Zastava).

@@ -21,7 +21,7 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | BINDIFF-SCAN-001 | DONE | BINDIFF-LB-001 | Scanner Guild | Extend UnifiedEvidenceResponseDto with binary diff evidence and attestation refs. |
| 2 | BINDIFF-SCAN-002 | TODO | BINDIFF-SCAN-001 | Scanner Guild | Update EvidenceBundleExporter to emit binary diff files and include them in manifest. |
| 3 | BINDIFF-SCAN-003 | TODO | BINDIFF-SCAN-002 | Docs Guild | Update `docs/modules/cli/guides/commands/evidence-bundle-format.md` to list binary diff files. |
| 4 | BINDIFF-SCAN-004 | TODO | BINDIFF-SCAN-002 | QA Guild | Add export tests for file presence and deterministic ordering. |

@@ -30,6 +30,7 @@

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | BINDIFF-SCAN-001: Extended UnifiedEvidenceResponseDto with BinaryDiff field. Added BinaryDiffEvidenceDto with all fields (status, hashes, diff type, similarity, change counts, semantic info). Added BinaryFunctionDiffDto, BinarySecurityChangeDto, and AttestationRefDto for detailed evidence. | Agent |

## Decisions & Risks

- Decide how to map binary diff attestations into unified evidence (IDs, file names, and ordering).

@@ -22,15 +22,19 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | DOCS-CLISWEEP-001 | DONE | - | Docs Guild | Inventory all `stellaops` command references in `docs/**` and capture location, snippet, and context. |
| 2 | DOCS-CLISWEEP-002 | DONE | DOCS-CLISWEEP-001 | Docs Guild | Classify each reference as replace, keep (legacy alias), or ambiguous; note rationale and owners. |
| 3 | DOCS-CLISWEEP-003 | DONE | DOCS-CLISWEEP-002 | Docs Guild | Publish a sweep report under `docs/technical/reviews/cli-command-name-sweep-2026-01-14.md` with deterministic ordering. |
| 4 | DOCS-CLISWEEP-004 | DONE | DOCS-CLISWEEP-003 | Docs Guild | Draft follow-up sprint tasks for replacements and exceptions (no edits performed in this sprint). |

## Execution Log

| Date (UTC) | Update | Owner |
| --- | --- | --- |
| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | DOCS-CLISWEEP-001: Inventoried stellaops references. Found 140 CLI command uses vs 984 canonical stella uses. Identified main locations in benchmark docs. | Agent |
| 2026-01-14 | DOCS-CLISWEEP-002: Classified references into 3 categories: Replace (CLI commands ~25), Keep (namespaces/headers ~100+), Ambiguous (domains/product names). | Agent |
| 2026-01-14 | DOCS-CLISWEEP-003: Published sweep report to docs/technical/reviews/cli-command-name-sweep-2026-01-14.md with methodology, findings, and recommendations. | Agent |
| 2026-01-14 | DOCS-CLISWEEP-004: Drafted 4 follow-up tasks in sweep report: CLISWEEP-REPLACE-001, CLISWEEP-ALIAS-002, CLISWEEP-DOC-003, CLISWEEP-VERIFY-004. | Agent |

## Decisions & Risks

- Decision: confirm whether `stellaops` is a supported legacy alias in any documentation or packaging context.

@@ -22,9 +22,9 @@
## Delivery Tracker

| # | Task ID | Status | Key dependency / next step | Owners | Task Definition |
| --- | --- | --- | --- | --- | --- |
| 1 | SIGNER-PW-001 | DONE | Predicate type locked | Guild - Signer | Add predicate constants for canonical and alias URIs in `PredicateTypes.cs`; update `GetAllowedPredicateTypes`, `IsReachabilityRelatedType`, and `IsAllowedPredicateType`. |
| 2 | SIGNER-PW-002 | TODO | SIGNER-PW-001 | Guild - Signer | Add or update Signer tests to validate allowed predicate lists and reachability classification for the new predicate types. |
| 3 | SIGNER-PW-003 | DONE | SIGNER-PW-001 | Guild - Signer | Update `PredicateTypes.IsStellaOpsType` and `SignerStatementBuilder.GetRecommendedStatementType` to recognize `https://stella.ops/` and `https://stella-ops.org/` URIs as StellaOps types; add Keyless signer tests for Statement v1 selection. |

## Execution Log

| Date (UTC) | Update | Owner |

@@ -32,6 +32,8 @@

| 2026-01-14 | Sprint created; awaiting staffing. | Planning |
| 2026-01-14 | Added `docs/modules/signer/implementation_plan.md` to satisfy Signer charter prerequisites. | Planning |
| 2026-01-14 | Added task to ensure Statement type selection treats `https://stella.ops/` predicate URIs as StellaOps types. | Planning |
| 2026-01-14 | SIGNER-PW-001: Added PathWitnessCanonical, PathWitnessAlias1, PathWitnessAlias2 constants. Added IsPathWitnessType() helper. Updated IsReachabilityRelatedType() and GetAllowedPredicateTypes() to include all path witness types. | Agent |
| 2026-01-14 | SIGNER-PW-003: Updated IsStellaOpsType to recognize https://stella.ops/ and https://stella-ops.org/ URI prefixes as StellaOps types. | Agent |

## Decisions & Risks

- Predicate allowlist changes can affect downstream verification policies; coordinate with Attestor and Policy owners.

docs/modules/attestor/vex-override-predicate.md (new file, 151 lines)
@@ -0,0 +1,151 @@
# VEX Override Predicate Specification

## Overview

The VEX Override predicate provides a cryptographically signed record of operator decisions to override or annotate vulnerability assessments. This enables auditable, tamper-evident records of security triage decisions within the StellaOps vulnerability management workflow.

## Predicate Type URI

```
https://stellaops.dev/attestations/vex-override/v1
```

## Use Cases

1. **not_affected**: Document that a vulnerability does not affect the artifact in its deployed configuration
2. **mitigated**: Record compensating controls that address the vulnerability
3. **accepted**: Acknowledge an accepted risk with proper authorization
4. **under_investigation**: Mark a vulnerability as being actively assessed

## Schema

### Required Fields

| Field | Type | Description |
|-------|------|-------------|
| `artifactDigest` | string | Artifact digest this override applies to (e.g., `sha256:abc123...`) |
| `vulnerabilityId` | string | CVE or vulnerability identifier being overridden |
| `decision` | string/enum | One of: `not_affected`, `mitigated`, `accepted`, `under_investigation` |
| `justification` | string | Human-readable explanation for the decision |
| `decisionTime` | ISO 8601 | UTC timestamp when the decision was made |
| `operatorId` | string | Identifier of the operator who made the decision |

### Optional Fields

| Field | Type | Description |
|-------|------|-------------|
| `expiresAt` | ISO 8601 | When this override should be re-evaluated |
| `evidenceRefs` | array | References to supporting documentation |
| `tool` | object | Information about the tool creating the predicate |
| `ruleDigest` | string | Digest of the policy rule that triggered evaluation |
| `traceHash` | string | Hash of reachability analysis at decision time |
| `metadata` | object | Additional key-value metadata |

### Evidence Reference Schema

```json
{
  "type": "document|ticket|scan_report|attestation",
  "uri": "https://...",
  "digest": "sha256:...",
  "description": "Optional description"
}
```

### Tool Schema

```json
{
  "name": "StellaOps",
  "version": "1.0.0",
  "vendor": "StellaOps Inc"
}
```

## Sample Payload

```json
{
  "artifactDigest": "sha256:a1b2c3d4e5f6...",
  "decision": "not_affected",
  "decisionTime": "2026-01-14T10:00:00Z",
  "evidenceRefs": [
    {
      "description": "Security review document",
      "digest": "sha256:def456...",
      "type": "document",
      "uri": "https://docs.example.com/security-review/123"
    }
  ],
  "justification": "Component is compiled without the vulnerable code path due to build configuration",
  "operatorId": "security-team@example.com",
  "predicateType": "https://stellaops.dev/attestations/vex-override/v1",
  "ruleDigest": "sha256:rule789...",
  "tool": {
    "name": "StellaOps",
    "vendor": "StellaOps Inc",
    "version": "1.0.0"
  },
  "traceHash": "sha256:trace012...",
  "vulnerabilityId": "CVE-2024-12345"
}
```
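
For implementers, the payload above maps naturally onto a strongly typed model. The sketch below is illustrative only: the record and property names are assumptions, not the shipped StellaOps types, but they mirror the required and optional fields documented in this specification.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical model for the vex-override/v1 predicate; names are illustrative.
public sealed record VexOverridePredicate
{
    // Required fields
    public required string ArtifactDigest { get; init; }        // e.g. "sha256:..."
    public required string VulnerabilityId { get; init; }       // e.g. "CVE-2024-12345"
    public required string Decision { get; init; }               // not_affected | mitigated | accepted | under_investigation
    public required string Justification { get; init; }
    public required DateTimeOffset DecisionTime { get; init; }   // UTC
    public required string OperatorId { get; init; }

    // Optional fields
    public DateTimeOffset? ExpiresAt { get; init; }
    public IReadOnlyList<EvidenceRef>? EvidenceRefs { get; init; }
    public ToolInfo? Tool { get; init; }
    public string? RuleDigest { get; init; }
    public string? TraceHash { get; init; }
    public IReadOnlyDictionary<string, string>? Metadata { get; init; }
}

public sealed record EvidenceRef(string Type, string Uri, string? Digest, string? Description);

public sealed record ToolInfo(string Name, string Version, string? Vendor);
```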

## Canonicalization Rules

VEX override predicates MUST be serialized using RFC 8785 JSON Canonicalization Scheme (JCS) before signing:

1. **Key ordering**: All object keys are sorted lexicographically (Unicode code point order)
2. **Number format**: No exponent notation, no leading zeros, no trailing zeros after decimal
3. **String encoding**: Default JSON escaping (no relaxed escaping)
4. **Whitespace**: Minified JSON (no whitespace between tokens)
5. **Property naming**: Original snake_case field names preserved (no camelCase conversion)

### Serializer Configuration

```csharp
// DO NOT use CamelCase naming policy
// DO NOT use UnsafeRelaxedJsonEscaping
// Use JsonCanonicalizer.Canonicalize() before signing

var canonicalJson = JsonCanonicalizer.Canonicalize(payloadJson);
```

## DSSE Envelope Structure

```json
{
  "payloadType": "https://stellaops.dev/attestations/vex-override/v1",
  "payload": "<base64-encoded-canonical-predicate>",
  "signatures": [
    {
      "keyid": "key-identifier",
      "sig": "<base64-signature>"
    }
  ]
}
```

## Verification

1. Decode the base64 payload
2. Verify the signature using DSSE PAE (Pre-Authentication Encoding)
3. Parse and validate the predicate schema
4. Check expiration if `expiresAt` is present
5. Optionally verify transparency log inclusion (Rekor anchoring)
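
As a concrete illustration of steps 1 and 2, the minimal sketch below decodes the envelope fields shown above and checks a single ECDSA signature over the standard DSSE v1 pre-authentication encoding. It is written under stated assumptions (one ECDSA P-256 key, SHA-256, signature bytes in the encoding expected by `VerifyData`); it is not the StellaOps verifier API.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class DsseVerifySketch
{
    // DSSE v1 PAE: "DSSEv1" SP len(type) SP type SP len(payload) SP payload
    private static byte[] Pae(string payloadType, byte[] payload)
    {
        var header = $"DSSEv1 {Encoding.UTF8.GetByteCount(payloadType)} {payloadType} {payload.Length} ";
        var headerBytes = Encoding.UTF8.GetBytes(header);
        var pae = new byte[headerBytes.Length + payload.Length];
        Buffer.BlockCopy(headerBytes, 0, pae, 0, headerBytes.Length);
        Buffer.BlockCopy(payload, 0, pae, headerBytes.Length, payload.Length);
        return pae;
    }

    // Verifies one signature from the envelope; adjust if the signer emits ASN.1 DER signatures.
    public static bool Verify(string payloadType, string payloadBase64, string sigBase64, ECDsa publicKey)
    {
        var payload = Convert.FromBase64String(payloadBase64);   // step 1: decode payload
        var signature = Convert.FromBase64String(sigBase64);
        var signedBytes = Pae(payloadType, payload);              // step 2: rebuild PAE and verify
        return publicKey.VerifyData(signedBytes, signature, HashAlgorithmName.SHA256);
    }
}
```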

## Offline Verification

The predicate supports offline verification when:
- The signing certificate chain is bundled
- Transparency log proofs are embedded
- No network access is required for validation

See [Rekor Verification Design](./rekor-verification-design.md) for transparency log integration details.

## Related Documents

- [Attestor Architecture](./architecture.md)
- [DSSE Roundtrip Verification](./dsse-roundtrip-verification.md)
- [Transparency Logging](./transparency.md)
- [VEX Consensus Guide](../../VEX_CONSENSUS_GUIDE.md)

@@ -409,6 +409,143 @@ public SemanticFingerprint? SemanticFingerprint { get; init; }
| False positive rate | <10% | <5% |
| P95 fingerprint latency | <100ms | <50ms |

##### 2.2.5.7 B2R2 LowUIR Adapter

The B2R2LowUirLiftingService implements `IIrLiftingService` using B2R2's native lifting capabilities. This provides cross-platform IR representation for semantic analysis.

**Key Components:**

```csharp
public sealed class B2R2LowUirLiftingService : IIrLiftingService
{
    // Lifts to B2R2 LowUIR and maps to Stella IR model
    public Task<LiftedFunction> LiftToIrAsync(
        IReadOnlyList<DisassembledInstruction> instructions,
        string functionName,
        LiftOptions? options = null,
        CancellationToken ct = default);
}
```

**Supported ISAs:**
- Intel (x86-32, x86-64)
- ARM (ARMv7, ARMv8/ARM64)
- MIPS (32/64)
- RISC-V (64)
- PowerPC, SPARC, SH4, AVR, EVM

**IR Statement Mapping** (a lookup sketch follows the table):

| B2R2 LowUIR | Stella IR Kind |
|-------------|----------------|
| Put | IrStatementKind.Store |
| Store | IrStatementKind.Store |
| Get | IrStatementKind.Load |
| Load | IrStatementKind.Load |
| BinOp | IrStatementKind.BinaryOp |
| UnOp | IrStatementKind.UnaryOp |
| Jmp | IrStatementKind.Jump |
| CJmp | IrStatementKind.ConditionalJump |
| InterJmp | IrStatementKind.IndirectJump |
| Call | IrStatementKind.Call |
| SideEffect | IrStatementKind.SideEffect |
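
The mapping can be expressed as a simple lookup. The sketch below is illustrative: it keys on the LowUIR statement kind name as a string rather than on B2R2's actual F# AST types, and the enum is assumed to match the names in the table.

```csharp
using System;

public enum IrStatementKind
{
    Store, Load, BinaryOp, UnaryOp, Jump, ConditionalJump, IndirectJump, Call, SideEffect
}

public static class LowUirMappingSketch
{
    // Maps a B2R2 LowUIR statement kind name to the Stella IR kind per the table above.
    public static IrStatementKind Map(string lowUirKind) => lowUirKind switch
    {
        "Put" or "Store" => IrStatementKind.Store,
        "Get" or "Load" => IrStatementKind.Load,
        "BinOp" => IrStatementKind.BinaryOp,
        "UnOp" => IrStatementKind.UnaryOp,
        "Jmp" => IrStatementKind.Jump,
        "CJmp" => IrStatementKind.ConditionalJump,
        "InterJmp" => IrStatementKind.IndirectJump,
        "Call" => IrStatementKind.Call,
        "SideEffect" => IrStatementKind.SideEffect,
        _ => throw new ArgumentOutOfRangeException(nameof(lowUirKind), lowUirKind, "Unmapped LowUIR statement kind")
    };
}
```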

**Determinism Guarantees:**
- Statements ordered by block address (ascending)
- Blocks sorted by entry address (ascending)
- Consistent IR IDs across identical inputs
- InvariantCulture used for all string formatting

##### 2.2.5.8 B2R2 Lifter Pool

The `B2R2LifterPool` provides bounded pooling and warm preload for B2R2 lifting units to reduce per-call allocation overhead.

**Configuration (`B2R2LifterPoolOptions`):**

| Option | Default | Description |
|--------|---------|-------------|
| `MaxPoolSizePerIsa` | 4 | Maximum pooled lifters per ISA |
| `EnableWarmPreload` | true | Preload lifters at startup |
| `WarmPreloadIsas` | ["intel-64", "intel-32", "armv8-64", "armv7-32"] | ISAs to warm |
| `AcquireTimeout` | 5s | Timeout for acquiring a lifter |

**Pool Statistics:**
- `TotalPooledLifters`: Lifters currently in pool
- `TotalActiveLifters`: Lifters currently in use
- `IsWarm`: Whether pool has been warmed
- `IsaStats`: Per-ISA pool and active counts

**Usage:**

```csharp
using var lifter = _lifterPool.Acquire(isa);
var stmts = lifter.LiftingUnit.LiftInstruction(address);
// Lifter automatically returned to pool on dispose
```

##### 2.2.5.9 Function IR Cache

The `FunctionIrCacheService` provides Valkey-backed caching for computed semantic fingerprints to avoid redundant IR lifting and graph hashing.

**Cache Key Structure:**

```
(isa, b2r2_version, normalization_recipe, canonical_ir_hash)
```
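
A deterministic string form of this key might be composed as in the sketch below. The prefix is the documented `FunctionIrCacheOptions.KeyPrefix` default; the method itself is an illustrative assumption, not the actual cache implementation.

```csharp
public static class FunctionIrCacheKeySketch
{
    // Composes a deterministic Valkey key from the documented tuple:
    // (isa, b2r2_version, normalization_recipe, canonical_ir_hash).
    public static string Compose(
        string keyPrefix,            // e.g. "stellaops:binidx:funccache:"
        string isa,                  // e.g. "intel-64"
        string b2r2Version,          // e.g. "0.9.1"
        string normalizationRecipe,  // e.g. "v1"
        string canonicalIrHash)      // hash of the canonical IR
        => string.Concat(keyPrefix, isa, ":", b2r2Version, ":", normalizationRecipe, ":", canonicalIrHash);
}
```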

**Configuration (`FunctionIrCacheOptions`):**

| Option | Default | Description |
|--------|---------|-------------|
| `KeyPrefix` | "stellaops:binidx:funccache:" | Valkey key prefix |
| `CacheTtl` | 4h | TTL for cached entries |
| `MaxTtl` | 24h | Maximum TTL |
| `Enabled` | true | Whether caching is enabled |
| `B2R2Version` | "0.9.1" | B2R2 version for cache key |
| `NormalizationRecipeVersion` | "v1" | Recipe version for cache key |

**Cache Entry (`CachedFunctionFingerprint`):**
- `FunctionAddress`, `FunctionName`
- `SemanticFingerprint`: The computed fingerprint
- `IrStatementCount`, `BasicBlockCount`
- `ComputedAtUtc`: ISO-8601 timestamp
- `B2R2Version`, `NormalizationRecipe`

**Invalidation Rules:**
- Cache entries expire after `CacheTtl` (default 4h)
- Changing B2R2 version or normalization recipe results in cache misses
- Manual invalidation via `RemoveAsync()`

**Statistics:**
- Hits, Misses, Evictions
- Hit Rate
- Enabled status

##### 2.2.5.10 Ops Endpoints

BinaryIndex exposes operational endpoints for health, benchmarking, cache monitoring, and configuration visibility.

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/api/v1/ops/binaryindex/health` | GET | Health status with lifter warmness, cache availability |
| `/api/v1/ops/binaryindex/bench/run` | POST | Run benchmark, return latency stats |
| `/api/v1/ops/binaryindex/cache` | GET | Function IR cache hit/miss statistics |
| `/api/v1/ops/binaryindex/config` | GET | Effective configuration (secrets redacted) |

**Health Response:**

```json
{
  "status": "healthy",
  "timestamp": "2026-01-14T12:00:00Z",
  "lifterStatus": "warm",
  "lifterWarm": true,
  "lifterPoolStats": { "intel-64": 4, "armv8-64": 2 },
  "cacheStatus": "enabled",
  "cacheEnabled": true
}
```

**Determinism Constraints:**
- All timestamps in ISO-8601 UTC format
- ASCII-only output
- Deterministic JSON key ordering
- Secrets/credentials redacted from config endpoint

#### 2.2.6 Binary Vulnerability Service

Main query interface for consumers.

@@ -113,19 +113,51 @@ Semantic diffing is an advanced binary analysis capability that detects function
### Phase 1: IR-Level Semantic Analysis (Foundation)

**Sprints:**
- `SPRINT_20260105_001_001_BINDEX_semdiff_ir_semantics.md`
- `SPRINT_20260112_004_BINIDX_b2r2_lowuir_perf_cache.md` (Performance & Ops)

Leverage B2R2's Intermediate Representation (IR) for semantic-level function comparison.

**Key Components:**
- `B2R2LowUirLiftingService` - Lifts instructions to B2R2 LowUIR, maps to Stella IR model
- `B2R2LifterPool` - Bounded pool with warm preload for lifter reuse
- `FunctionIrCacheService` - Valkey-backed cache for semantic fingerprints
- `SemanticGraphExtractor` - Build Key-Semantics Graph (KSG)
- `WeisfeilerLehmanHasher` - Graph fingerprinting
- `SemanticMatcher` - Semantic similarity scoring

**B2R2LowUirLiftingService Implementation:**
- Supports Intel, ARM, MIPS, RISC-V, PowerPC, SPARC, SH4, AVR, EVM
- Maps B2R2 LowUIR statements to `IrStatement` model
- Applies SSA numbering to temporary registers
- Deterministic block ordering (by entry address)
- InvariantCulture formatting throughout

**B2R2LifterPool Implementation:**
- Bounded per-ISA pooling (default 4 lifters/ISA)
- Warm preload at startup for common ISAs
- Per-ISA stats (pooled, active, max)
- Automatic return on dispose

**FunctionIrCacheService Implementation:**
- Cache key: `(isa, b2r2_version, normalization_recipe, canonical_ir_hash)`
- Valkey as hot cache (default 4h TTL)
- PostgreSQL persistence for fingerprint records
- Hit/miss/eviction statistics

**Ops Endpoints:**
- `GET /api/v1/ops/binaryindex/health` - Lifter warmness, cache status
- `POST /api/v1/ops/binaryindex/bench/run` - Benchmark latency
- `GET /api/v1/ops/binaryindex/cache` - Cache statistics
- `GET /api/v1/ops/binaryindex/config` - Effective configuration

**Deliverables:**
- `StellaOps.BinaryIndex.Semantic` library
- `StellaOps.BinaryIndex.Disassembly.B2R2` (LowUIR adapter, lifter pool)
- `StellaOps.BinaryIndex.Cache` (function IR cache)
- BinaryIndexOpsController
- 20+ tasks, ~3 weeks

### Phase 2: Function Behavior Corpus (Scale)

@@ -73,3 +73,19 @@ Filters hash: `sha256(sortedQueryString)`; stored alongside fixtures for replaya
- Golden fixtures: `src/Findings/StellaOps.Findings.Ledger/fixtures/golden/*.ndjson`.
- Checksum manifest: `docs/modules/findings-ledger/golden-checksums.json`.
- Offline verifier: `tools/LedgerReplayHarness/scripts/verify_export.py`.

## 6) Rekor Entry Reference — `rekor.entry.ref.v1` (Sprint: SPRINT_20260112_004_FINDINGS)

| Field | Type | Notes |
| --- | --- | --- |
| `logIndex` | `long?` | Position in the Rekor log. |
| `logId` | `string?` | Log identifier (hex-encoded public key hash). |
| `uuid` | `string?` | Unique entry identifier. |
| `integratedTime` | `long?` | Unix epoch seconds when entry was integrated. |
| `integratedTimeRfc3339` | `string?` (UTC ISO-8601) | RFC3339 formatted integrated time for display/sorting. |
| `entryUrl` | `string?` | Full URL to the Rekor entry for UI linking (e.g., `https://rekor.sigstore.dev/api/v1/log/entries/{uuid}`). |

**Usage:** Attached to `AttestationPointer` records and evidence graph signature metadata. The `integratedTimeRfc3339` field provides human-readable timestamps and deterministic sorting. The `entryUrl` enables direct linking from UI components.

**Offline mode:** When operating in air-gapped environments, `entryUrl` may be null or point to a local Rekor mirror. The `integratedTime` remains authoritative for timestamp verification.
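
A typed view of `rekor.entry.ref.v1` could look like the sketch below; the record name is an assumption for illustration, but the properties mirror the field table above.

```csharp
// Illustrative shape for rekor.entry.ref.v1; not necessarily the shipped StellaOps type.
public sealed record RekorEntryRef
{
    public long? LogIndex { get; init; }                 // position in the Rekor log
    public string? LogId { get; init; }                  // hex-encoded public key hash
    public string? Uuid { get; init; }                   // unique entry identifier
    public long? IntegratedTime { get; init; }           // Unix epoch seconds
    public string? IntegratedTimeRfc3339 { get; init; }  // UTC ISO-8601 for display/sorting
    public string? EntryUrl { get; init; }               // null or a local mirror when offline
}
```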
@@ -465,8 +465,113 @@ PolicyEngine:
---

## 11. Node Hash and Path Gating Extensions

Sprint: SPRINT_20260112_008_DOCS_path_witness_contracts (PW-DOC-004)

### 11.1 Extended ReachabilityInput Fields

The following fields extend `ReachabilityInput` for path-level gating:

```csharp
public sealed record ReachabilityInput
{
    // ... existing fields ...

    /// <summary>Canonical path hash computed from entry to sink.</summary>
    public string? PathHash { get; init; }

    /// <summary>Top-K node hashes along the path.</summary>
    public ImmutableArray<string> NodeHashes { get; init; }

    /// <summary>Entry point node hash.</summary>
    public string? EntryNodeHash { get; init; }

    /// <summary>Sink (vulnerable symbol) node hash.</summary>
    public string? SinkNodeHash { get; init; }

    /// <summary>When runtime evidence was observed (UTC).</summary>
    public DateTimeOffset? RuntimeEvidenceAt { get; init; }

    /// <summary>Whether path was observed at runtime.</summary>
    public bool ObservedAtRuntime { get; init; }
}
```

### 11.2 Node Hash Computation

Node hashes are computed using the canonical recipe:

```
nodeHash = SHA256(normalize(purl) + ":" + normalize(symbol))
```

See `docs/contracts/witness-v1.md` for normalization rules.
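
A minimal sketch of the recipe follows, assuming trim-and-lower-case as a placeholder normalization step (the authoritative rules live in `docs/contracts/witness-v1.md`) and hex encoding with the `sha256:` prefix used by the gating examples below.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class NodeHashSketch
{
    // nodeHash = SHA256(normalize(purl) + ":" + normalize(symbol)), hex-encoded.
    public static string Compute(string purl, string symbol)
    {
        var input = $"{Normalize(purl)}:{Normalize(symbol)}";
        var digest = SHA256.HashData(Encoding.UTF8.GetBytes(input));
        return "sha256:" + Convert.ToHexString(digest).ToLowerInvariant();
    }

    // Placeholder normalization (assumption): trim and lower-case.
    private static string Normalize(string value) => value.Trim().ToLowerInvariant();
}
```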

### 11.3 Policy DSL Access

The following fields are exposed in policy evaluation context:

| DSL Path | Type | Description |
|----------|------|-------------|
| `reachability.pathHash` | string | Canonical path hash |
| `reachability.nodeHashes` | array | Top-K node hashes |
| `reachability.entryNodeHash` | string | Entry point node hash |
| `reachability.sinkNodeHash` | string | Sink node hash |
| `reachability.runtimeEvidenceAt` | datetime | Runtime observation timestamp |
| `reachability.observedAtRuntime` | boolean | Whether confirmed at runtime |
| `reachability.runtimeEvidenceAge` | duration | Age of runtime evidence |

### 11.4 Path Gating Examples

Block paths confirmed at runtime:

```yaml
match:
  reachability:
    pathHash:
      exists: true
    observedAtRuntime: true
action: block
```

Require fresh runtime evidence:

```yaml
match:
  reachability:
    runtimeEvidenceAge:
      gt: 24h
action: warn
message: "Runtime evidence is stale"
```

Block specific node patterns:

```yaml
match:
  reachability:
    nodeHashes:
      contains_any:
        - "sha256:critical-auth-node..."
action: block
```

### 11.5 Runtime Evidence Freshness

Runtime evidence age is computed as:

```
runtimeEvidenceAge = now() - runtimeEvidenceAt
```

Freshness thresholds can be configured per environment in `DeterminizationOptions`.
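
A sketch of the freshness check, with the threshold taken from a hypothetical `DeterminizationOptions` value:

```csharp
using System;

public static class RuntimeFreshnessSketch
{
    // runtimeEvidenceAge = now() - runtimeEvidenceAt; stale when it exceeds the configured threshold
    // or when no runtime evidence exists at all.
    public static bool IsStale(DateTimeOffset? runtimeEvidenceAt, TimeSpan maxAge, DateTimeOffset now)
        => runtimeEvidenceAt is null || now - runtimeEvidenceAt.Value > maxAge;

    // Example: IsStale(input.RuntimeEvidenceAt, TimeSpan.FromHours(24), DateTimeOffset.UtcNow)
}
```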
|
||||||
|
|
||||||
|
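The actual `DeterminizationOptions` contract is owned by the Policy Engine and is not reproduced here; the sketch below only illustrates the kind of per-environment freshness configuration the sentence above describes, with hypothetical property names.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical shape for illustration only; the real DeterminizationOptions is defined elsewhere.
public sealed class DeterminizationOptions
{
    /// <summary>Default maximum age before runtime evidence is treated as stale.</summary>
    public TimeSpan RuntimeEvidenceMaxAge { get; init; } = TimeSpan.FromHours(24);

    /// <summary>Optional per-environment overrides, e.g. "production" => 12 hours.</summary>
    public IReadOnlyDictionary<string, TimeSpan> PerEnvironmentMaxAge { get; init; }
        = new Dictionary<string, TimeSpan>();
}
```
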
---

## Changelog

| Version | Date | Changes |
|---------|------|---------|
| 1.1.0 | 2026-01-14 | Added node hash, path gating, and runtime evidence fields (SPRINT_20260112_008) |
| 1.0.0 | 2025-12-19 | Initial release |

@@ -367,7 +367,126 @@ The Policy Engine reads uncertainty gate thresholds from configuration:

---

## 13 · Signed Override Enforcement (Sprint 20260112.004)

Signed VEX overrides provide cryptographic assurance that operator decisions (not_affected, compensating controls) are authentic and auditable. The Policy Engine exposes override signature status to DSL rules for enforcement.

### 13.1 Override Signal Namespace

Within predicates and actions you may reference the following override signals:

| Signal | Type | Description |
|--------|------|-------------|
| `override.signed` | `bool` | `true` when the VEX override has a valid DSSE signature. |
| `override.rekor_verified` | `bool` | `true` when the override signature is anchored in the Rekor transparency log. |
| `override.signing_key_id` | `string` | Key identifier used to sign the override. |
| `override.signer_identity` | `string` | Identity (email, OIDC subject) of the signer. |
| `override.envelope_digest` | `string` | SHA-256 digest of the DSSE envelope. |
| `override.rekor_log_index` | `int?` | Rekor log index if anchored; `null` otherwise. |
| `override.rekor_integrated_time` | `datetime?` | Timestamp when anchored in Rekor. |
| `override.valid_from` | `datetime?` | Override validity window start (if specified). |
| `override.valid_until` | `datetime?` | Override validity window end (if specified). |
| `override.within_validity_period` | `bool` | `true` when the current time is within the validity window (or no window is specified). |
| `override.key_trust_level` | `string` | Trust level: `Unknown`, `LowTrust`, `OrganizationTrusted`, `HighlyTrusted`. |

### 13.2 Enforcement Rules

#### 13.2.1 Require Signed Overrides

Block unsigned VEX overrides from being accepted:

```dsl
rule require_signed_overrides priority 1 {
  when vex.any(status in ["not_affected", "fixed"])
    and not override.signed
  then status := "under_investigation"
  annotate override_blocked := "Unsigned override rejected"
  because "Production environments require signed VEX overrides";
}
```

#### 13.2.2 Require Rekor Anchoring for Critical Assets

For critical assets, require transparency log anchoring:

```dsl
rule require_rekor_for_critical priority 2 {
  when env.asset_tier == "critical"
    and vex.any(status == "not_affected")
    and override.signed
    and not override.rekor_verified
  then status := "under_investigation"
  warn message "Critical asset requires Rekor-anchored override"
  because "Critical assets require transparency log verification";
}
```

#### 13.2.3 Trust Level Gating

Gate override acceptance based on signer trust level:

```dsl
rule gate_by_trust_level priority 5 {
  when override.signed
    and override.key_trust_level in ["Unknown", "LowTrust"]
    and env.security_posture == "strict"
  then status := "under_investigation"
  annotate trust_gate_failed := override.signer_identity
  because "Strict posture requires OrganizationTrusted or higher";
}
```

#### 13.2.4 Validity Period Enforcement

Reject expired or not-yet-valid overrides:

```dsl
rule enforce_validity_period priority 3 {
  when override.signed
    and exists(override.valid_until)
    and not override.within_validity_period
  then status := "affected"
  annotate override_expired := override.valid_until
  because "VEX override has expired or is not yet valid";
}
```

### 13.3 Default Enforcement Profile

The default enforcement profile blocks unsigned overrides in production:

```dsl
settings {
  require_signed_overrides = true;
  require_rekor_for_production = false;
  minimum_trust_level = "OrganizationTrusted";
  enforce_validity_period = true;
}
```

Override these settings in environment-specific policy packs.

### 13.4 Offline Mode Considerations

In sealed/offline deployments:

- `override.rekor_verified` evaluates to `false` (no network access to verify).
- Use embedded proofs in the DSSE envelope for signature verification.
- Policies should fall back to signature verification without requiring Rekor:

```dsl
rule offline_safe_override priority 5 {
  when env.sealed_mode == true
    and override.signed
    and override.key_trust_level in ["OrganizationTrusted", "HighlyTrusted"]
  then status := vex.status
  because "Offline mode accepts signed overrides from trusted keys without Rekor";
}
```

---

## 14 · Versioning & Compatibility

- `syntax "stella-dsl@1"` is mandatory.
- Future revisions (`@2`, …) will be additive; existing packs continue to compile with their declared version.

@@ -375,7 +494,7 @@ The Policy Engine reads uncertainty gate thresholds from configuration:

---

## 15 · Compliance Checklist

- [ ] **Grammar validated:** Policy compiles with `stella policy lint` and matches `syntax "stella-dsl@1"`.
- [ ] **Deterministic constructs only:** No use of forbidden namespaces (`DateTime.Now`, `Guid.NewGuid`, external services).

@@ -42,6 +42,121 @@

- Ensure `analysisId` is propagated from Scanner/Zastava into Signals ingest to keep replay manifests linked.
- Keep feeds frozen for reproducibility; avoid external downloads in union preparation.

---

## Node Hash Joins and Runtime Evidence Linkage

Sprint: SPRINT_20260112_008_DOCS_path_witness_contracts (PW-DOC-002)

### Overview

Node hashes provide a canonical way to join static reachability analysis with runtime observations. Each node in a callgraph can be identified by a stable hash computed from its PURL and symbol information, enabling:

1. **Static-to-runtime correlation**: Match runtime stack traces to static callgraph nodes
2. **Cross-scan consistency**: Compare reachability across different analysis runs
3. **Evidence linking**: Associate attestations with specific code paths

### Node Hash Recipe

A node hash is computed as:

```
nodeHash = SHA256(normalize(purl) + ":" + normalize(symbol))
```

Where:

- `normalize(purl)` lowercases the PURL and sorts qualifiers alphabetically
- `normalize(symbol)` removes whitespace and normalizes platform-specific decorations

Example:

```json
{
  "purl": "pkg:npm/express@4.18.2",
  "symbol": "Router.handle",
  "nodeHash": "sha256:a1b2c3d4..."
}
```

### Path Hash and Top-K Selection

A path hash identifies a specific call path from entrypoint to sink:

```
pathHash = SHA256(entryNodeHash + ":" + joinedIntermediateHashes + ":" + sinkNodeHash)
```

For long paths, only the **top-K** most significant nodes are included (default K=10); a minimal sketch follows the list:

- Entry node (always included)
- Sink node (always included)
- Intermediate nodes ranked by call frequency or security relevance

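A minimal C# sketch of the path-hash recipe under the assumptions above; it presumes the node hashes arrive already canonicalized (`sha256:<hex>`) and that top-K selection has already been applied to the intermediate list.

```csharp
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

public static class PathHasher
{
    // Sketch only: inputs are assumed to be canonical node hashes after top-K selection.
    public static string ComputePathHash(
        string entryNodeHash,
        IReadOnlyList<string> intermediateNodeHashes,
        string sinkNodeHash)
    {
        var joinedIntermediateHashes = string.Join(":", intermediateNodeHashes);
        var canonical = $"{entryNodeHash}:{joinedIntermediateHashes}:{sinkNodeHash}";
        var digest = SHA256.HashData(Encoding.UTF8.GetBytes(canonical));
        return "sha256:" + Convert.ToHexString(digest).ToLowerInvariant();
    }
}
```
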
### Runtime Evidence Linkage

Runtime observations from Zastava can be linked to static analysis using node hashes:

| Field | Description |
|-------|-------------|
| `observedNodeHashes` | Node hashes seen at runtime |
| `observedPathHashes` | Path hashes confirmed by runtime traces |
| `runtimeEvidenceAt` | Timestamp of runtime observation (RFC3339) |
| `callstackHash` | Hash of the observed call stack |

### Join Example

To correlate static reachability with runtime evidence:

```sql
-- Find statically-reachable vulnerabilities confirmed at runtime
SELECT
  s.vulnerability_id,
  s.path_hash,
  r.observed_at
FROM static_reachability s
JOIN runtime_observations r
  ON s.sink_node_hash = ANY(r.observed_node_hashes)
WHERE s.reachable = true
  AND r.observed_at > NOW() - INTERVAL '7 days';
```

### SARIF Integration

Node hashes are exposed in SARIF outputs via `stellaops/*` property keys:

```json
{
  "results": [{
    "ruleId": "CVE-2024-1234",
    "properties": {
      "stellaops/nodeHash": "sha256:abc123...",
      "stellaops/pathHash": "sha256:def456...",
      "stellaops/topKNodeHashes": ["sha256:...", "sha256:..."],
      "stellaops/evidenceUri": "cas://evidence/...",
      "stellaops/observedAtRuntime": true
    }
  }]
}
```

### Policy Gate Usage

Policy rules can reference node and path hashes for fine-grained control:

```yaml
rules:
  - name: block-confirmed-critical-path
    match:
      severity: CRITICAL
      reachability:
        pathHash:
          exists: true
        observedAtRuntime: true
    action: block
```

See `policies/path-gates-advanced.yaml` for comprehensive examples.

---

## References

- Schema: `docs/modules/reach-graph/schemas/runtime-static-union-schema.md`
- Delivery guide: `docs/modules/reach-graph/guides/DELIVERY_GUIDE.md`

@@ -205,6 +205,29 @@ stella proof verify --bundle proof-bundle.zip \
  --skip-rekor   # No network access
```

### 3.2a CI/CD Gate Verification Quick Reference

> Sprint: SPRINT_20260112_004_DOC_cicd_gate_verification

Concise commands for CI/CD pipeline verification gates:

**Online (Rekor-backed):**
```bash
stellaops proof verify --image $IMAGE --check-rekor --fail-on-missing
```

**Offline (local ledger):**
```bash
stellaops proof verify --image $IMAGE --offline --ledger-path /var/lib/stellaops/ledger
```

**Evidence pack verification:**
```bash
stellaops evidence-pack verify --bundle $PACK_PATH --check-signatures --check-merkle
```

See also: [CI/CD Gate Flow - DSSE Witness Verification](../flows/10-cicd-gate-flow.md#5a-dsse-witness-verification-required) | [Proof Verification Runbook](proof-verification-runbook.md)

### 3.3 Verification Checks

| Check | Description | Can Skip? |

@@ -219,6 +219,100 @@ stellaops scan image:tag --output-format sarif --tier executed,tainted_sink
stellaops smart-diff --output-format sarif --min-priority 0.7
```

---

## StellaOps Property Keys

> **Sprint:** SPRINT_20260112_008_DOCS_path_witness_contracts (PW-DOC-003)

SARIF `properties` bag extensions for StellaOps-specific metadata.

### Result-Level Properties

| Property Key | Type | Description |
|--------------|------|-------------|
| `stellaops/nodeHash` | string | Canonical node hash (`sha256:<hex>`) for static/runtime joining |
| `stellaops/pathHash` | string | Canonical path hash for the full reachability path |
| `stellaops/topKNodeHashes` | array | Top-K node hashes for efficient lookup |
| `stellaops/evidenceUri` | string | `cas://<hash>` URI to the evidence bundle |
| `stellaops/attestationUri` | string | `cas://<hash>` URI to the DSSE envelope |
| `stellaops/rekorUri` | string | Rekor transparency log entry URL |
| `stellaops/witnessId` | string | Path witness identifier |
| `stellaops/witnessHash` | string | BLAKE3 hash of the witness payload |

### Run-Level Properties

| Property Key | Type | Description |
|--------------|------|-------------|
| `stellaops/scanId` | string | UUID of the scan |
| `stellaops/graphHash` | string | BLAKE3 hash of the rich graph |
| `stellaops/sbomDigest` | string | SHA-256 digest of the source SBOM |
| `stellaops/feedSnapshot` | string | ISO 8601 timestamp of the feed data |

### Example with StellaOps Properties

```json
{
  "results": [
    {
      "ruleId": "SDIFF001",
      "level": "warning",
      "message": {
        "text": "CVE-2024-1234 became reachable via 3-hop path"
      },
      "locations": [
        {
          "logicalLocations": [
            {
              "name": "pkg:npm/lodash@4.17.20",
              "kind": "package"
            },
            {
              "name": "lodash.merge(object, object)",
              "kind": "function"
            }
          ]
        }
      ],
      "properties": {
        "vulnerability": "CVE-2024-1234",
        "tier": "executed",
        "direction": "increased",
        "stellaops/nodeHash": "sha256:a1b2c3d4e5f6789012345678901234567890123456789012345678901234abcd",
        "stellaops/pathHash": "sha256:fedcba0987654321fedcba0987654321fedcba0987654321fedcba0987654321",
        "stellaops/topKNodeHashes": [
          "sha256:entry1111111111111111111111111111111111111111111111111111111111",
          "sha256:sink22222222222222222222222222222222222222222222222222222222222"
        ],
        "stellaops/evidenceUri": "cas://sha256:evidence123...",
        "stellaops/attestationUri": "cas://sha256:dsse456...",
        "stellaops/rekorUri": "https://rekor.sigstore.dev/api/v1/log/entries/abc123",
        "stellaops/witnessId": "550e8400-e29b-41d4-a716-446655440000"
      }
    }
  ]
}
```

### Joining Static and Runtime Evidence

Use `stellaops/nodeHash` to correlate:

1. **Static analysis** findings (SARIF from Scanner)
2. **Runtime telemetry** (execution traces from agents)
3. **Policy decisions** (gating results)

```bash
# Query findings by node hash
curl -H "Authorization: Bearer $TOKEN" \
  "https://scanner.example.com/api/v1/findings?nodeHash=sha256:a1b2c3..."

# Verify path witness by hash
stellaops witness verify --path-hash sha256:fedcba...
```

---

## Troubleshooting

### SARIF Validation Errors

docs/technical/reviews/cli-command-name-sweep-2026-01-14.md (new file, 143 lines)
@@ -0,0 +1,143 @@

# CLI Command Name Sweep Report

**Date:** 2026-01-14
**Sprint:** SPRINT_20260112_010_DOCS_cli_command_name_sweep
**Owner:** Docs Guild

---

## Executive Summary

This report inventories all CLI command references in documentation to confirm the canonical command name (`stella`) and identify legacy references (`stellaops`) for cleanup or alias documentation.

| Command Pattern | Count | Status |
|-----------------|-------|--------|
| `stella <command>` | 984 | Canonical - no action |
| `stellaops <command>` | 140 | Legacy - review needed |

---

## Classification Summary

### Category 1: Replace (CLI Commands)

These are direct CLI command invocations using `stellaops` that should be updated to `stella`:

| File Path | Line | Context | Recommendation |
|-----------|------|---------|----------------|
| docs/benchmarks/performance-baselines.md | 191-239 | Benchmark commands | Replace with `stella` |
| docs/benchmarks/smart-diff-wii.md | 141 | Verify attestation example | Replace with `stella` |
| docs/benchmarks/submission-guide.md | 144-147 | Submission examples | Replace with `stella` |

**Estimated count:** ~25 references in benchmark docs.

### Category 2: Keep (Namespaces/Headers)

These are valid namespace, assembly, or header references that should remain as-is:

| Pattern | Context | Recommendation |
|---------|---------|----------------|
| `StellaOps.*` namespace | Code namespaces in docs | Keep - matches source code |
| `X-StellaOps-*` headers | API authentication headers | Keep - canonical header prefix |
| `stellaops:tenant` claim | JWT claim names | Keep - canonical claim name |
| `stellaops.console.*` | Payload/event types | Keep - canonical type prefixes |

**Estimated count:** ~100+ references.

### Category 3: Ambiguous (Requires CLI Guild Input)

| Pattern | Context | Question |
|---------|---------|----------|
| URLs with `stellaops` | gateway.stellaops.local | Is this the canonical domain? |
| Product name references | "StellaOps Scanner" | Product name vs CLI command |

---

## File-by-File Inventory (CLI Commands Only)

### docs/benchmarks/performance-baselines.md

```
Line 191: time stellaops scan --image example:latest
Line 195: time stellaops scan --image example:latest --format json
Line 199: /usr/bin/time -v stellaops scan ...
Line 203: perf stat stellaops scan ...
Line 223: time stellaops sbom --image ...
Line 226: stellaops sbom --image ...
Line 234: time stellaops scan --image ...
Line 239: stellaops scan --image ...
```

**Action:** Replace `stellaops` with `stella` in all commands.

### docs/benchmarks/smart-diff-wii.md

```
Line 141: stellaops verify-attestation ...
```

**Action:** Replace with `stella verify-attestation`.

### docs/benchmarks/submission-guide.md

```
Line 144: 'stellaops scan --image ...'
Line 147: /usr/bin/time -v stellaops ...
```

**Action:** Replace with `stella`.

---

## Legacy Alias Policy Recommendation

If `stellaops` is supported as a shell alias for `stella`:

1. Document the alias in the CLI reference: `docs/modules/cli/guides/commands/aliases.md`
2. Add a note in examples that `stellaops` is a legacy alias
3. Prefer `stella` in all new documentation

If `stellaops` is NOT supported:

1. Replace all CLI command references with `stella`
2. Update CI examples and scripts

---

## Follow-Up Tasks

| Task ID | Description | Owner | Priority |
|---------|-------------|-------|----------|
| CLISWEEP-REPLACE-001 | Replace `stellaops` CLI commands in benchmark docs | Docs Guild | P2 |
| CLISWEEP-ALIAS-002 | Confirm alias policy with CLI Guild | CLI Guild | P1 |
| CLISWEEP-DOC-003 | Document alias behavior if supported | Docs Guild | P2 |
| CLISWEEP-VERIFY-004 | Verify no broken examples after replacement | QA Guild | P3 |

---

## Methodology

1. Searched `docs/**/*.md` for the pattern `stellaops\s+<command>` where command is a known CLI verb
2. Excluded namespace/header/claim references (matched by `StellaOps.*`, `X-StellaOps-*`, `stellaops:*`)
3. Counted canonical `stella <command>` references for comparison
4. Classified each reference by context and owner

---

## Appendix: Search Commands Used

```powershell
# Count stellaops CLI commands
Get-ChildItem -Recurse -Path docs -Include *.md |
  Select-String -Pattern "stellaops\s+(scan|export|verify|...)"

# Count stella CLI commands (canonical)
Get-ChildItem -Recurse -Path docs -Include *.md |
  Select-String -Pattern "stella\s+(scan|export|verify|...)" |
  Where-Object { $_.Line -notmatch "stellaops" }
```

---

**Report Status:** Complete
**Next Review:** After CLI Guild alias policy confirmation

policies/path-gates-advanced.yaml (new file, 150 lines)
@@ -0,0 +1,150 @@

# Path-Level Reachability Gates Policy
# Sprint: SPRINT_20260112_007_POLICY_path_gate_inputs (PW-POL-003)
#
# Demonstrates path-level gates using pathHash, nodeHashes, and runtime freshness.
# Requires scanner path witness evidence with node hash fields.

apiVersion: policy.stellaops.io/v1
kind: PolicyPack
metadata:
  name: path-gates-advanced
  version: 1.0.0
  description: |
    Advanced policy pack demonstrating path-level reachability gates.
    Uses pathHash, nodeHashes, and runtime evidence freshness for fine-grained control.
    Sprint: SPRINT_20260112_007_POLICY_path_gate_inputs

spec:
  settings:
    defaultAction: warn
    requirePathWitness: true
    runtimeFreshnessMaxHours: 24
    trustedEntrypoints:
      - "main"
      - "api.handler"
      - "web.controller"

  rules:
    # Block if a specific vulnerable path is reachable and confirmed at runtime
    - name: block-runtime-confirmed-path
      description: "Block paths confirmed reachable at runtime with CRITICAL vulns"
      priority: 100
      match:
        severity: CRITICAL
        reachability:
          status: reachable
          observedAtRuntime: true
      action: block
      message: "Runtime-confirmed reachable path to CRITICAL {cve} via {pathHash}"

    # Require fresh runtime evidence for high-severity findings
    - name: require-fresh-runtime-evidence
      description: "Require runtime evidence younger than threshold for HIGH vulns"
      priority: 95
      match:
        severity: HIGH
        reachability:
          status: reachable
          pathHash:
            exists: true
          runtimeEvidenceAge:
            gt: ${settings.runtimeFreshnessMaxHours}h
      action: warn
      message: "Runtime evidence for {cve} is stale ({runtimeEvidenceAge} hours old)"

    # Allow paths with trusted entry nodes
    - name: allow-trusted-entrypoints
      description: "Allow paths starting from trusted entrypoints"
      priority: 90
      match:
        severity:
          - MEDIUM
          - LOW
        reachability:
          status: reachable
          entryNodeHash:
            in: ${settings.trustedEntrypoints}
      action: allow
      log: true
      message: "Vulnerability {cve} reachable from trusted entrypoint - allowed"

    # Block paths with specific node hashes in critical code areas
    - name: block-critical-node-paths
      description: "Block paths through critical code nodes"
      priority: 85
      match:
        severity:
          - CRITICAL
          - HIGH
        reachability:
          nodeHashes:
            contains_any:
              - ${critical.authentication_handler}
              - ${critical.payment_processor}
              - ${critical.data_exporter}
      action: block
      message: "Vulnerability {cve} path traverses critical node {matchedNodeHash}"

    # Warn if path witness is missing for reachable findings
    - name: warn-missing-path-witness
      description: "Warn when reachable finding lacks path witness"
      priority: 80
      match:
        severity:
          - CRITICAL
          - HIGH
          - MEDIUM
        reachability:
          status: reachable
          pathHash:
            exists: false
      action: warn
      message: "Reachable {cve} lacks path witness - reanalysis recommended"

    # Aggregate gate: block if too many runtime-confirmed paths
    - name: fail-on-runtime-confirmed-count
      description: "Block deployment if too many runtime-confirmed vulns"
      priority: 75
      type: aggregate
      match:
        runtimeConfirmedCount:
          gt: 5
      action: block
      message: "Too many runtime-confirmed vulnerabilities ({runtimeConfirmedCount} > 5)"

    # Allow paths not observed at runtime with reduced confidence
    - name: allow-static-only-paths
      description: "Allow static-only reachable paths with warning"
      priority: 70
      match:
        severity:
          - HIGH
          - MEDIUM
        reachability:
          status: reachable
          observedAtRuntime: false
          confidence:
            lt: 0.7
      action: warn
      message: "Static-only path to {cve} (confidence {confidence}) - review recommended"

    # Path hash pinning: allow specific known-safe paths
    - name: allow-pinned-safe-paths
      description: "Allow paths matching known-safe path hashes"
      priority: 65
      match:
        reachability:
          pathHash:
            in: ${known_safe_paths}
      action: allow
      message: "Path {pathHash} matches known-safe path - allowed"

  # Variables for path hash references
  variables:
    critical:
      authentication_handler: "sha256:auth-handler-node-hash"
      payment_processor: "sha256:payment-proc-node-hash"
      data_exporter: "sha256:data-export-node-hash"
    known_safe_paths:
      - "sha256:validated-path-1"
      - "sha256:validated-path-2"

@@ -285,6 +285,9 @@ public static class EvidencePackEndpoints
            "html" => EvidencePackExportFormat.Html,
            "pdf" => EvidencePackExportFormat.Pdf,
            "signedjson" => EvidencePackExportFormat.SignedJson,
            // Sprint: SPRINT_20260112_005_BE_evidence_card_api (EVPCARD-BE-001)
            "evidencecard" or "evidence-card" or "card" => EvidencePackExportFormat.EvidenceCard,
            "evidencecardcompact" or "card-compact" => EvidencePackExportFormat.EvidenceCardCompact,
            _ => EvidencePackExportFormat.Json
        };

@@ -0,0 +1,325 @@

// <copyright file="PrTemplateBuilder.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_007_BE_remediation_pr_generator (REMEDY-BE-001)
// </copyright>

using System.Globalization;
using System.Text;

namespace StellaOps.AdvisoryAI.Remediation;

/// <summary>
/// Builds deterministic PR.md templates for remediation pull requests.
/// </summary>
public sealed class PrTemplateBuilder
{
    /// <summary>
    /// Builds a PR description from a remediation plan.
    /// </summary>
    public string BuildPrBody(RemediationPlan plan)
    {
        ArgumentNullException.ThrowIfNull(plan);

        var sb = new StringBuilder();

        // Header section
        sb.AppendLine("## Security Remediation");
        sb.AppendLine();
        sb.AppendLine($"**Plan ID:** `{plan.PlanId}`");
        sb.AppendLine($"**Authority:** {plan.Authority}");
        sb.AppendLine($"**Risk Level:** {plan.RiskAssessment}");
        sb.AppendLine($"**Confidence:** {plan.ConfidenceScore:P0}");
        sb.AppendLine($"**Generated:** {plan.GeneratedAt}");
        sb.AppendLine();

        // Summary section
        AppendSummary(sb, plan);

        // Steps section
        AppendSteps(sb, plan);

        // Expected changes section
        AppendExpectedChanges(sb, plan);

        // Test requirements section
        AppendTestRequirements(sb, plan);

        // Rollback section
        AppendRollbackSteps(sb, plan);

        // VEX claim section
        AppendVexClaim(sb, plan);

        // Evidence section
        AppendEvidence(sb, plan);

        // Footer
        sb.AppendLine("---");
        sb.AppendLine($"*Generated by StellaOps AdvisoryAI ({plan.ModelId})*");

        return sb.ToString();
    }

    /// <summary>
    /// Builds a PR title from a remediation plan.
    /// </summary>
    public string BuildPrTitle(RemediationPlan plan)
    {
        ArgumentNullException.ThrowIfNull(plan);

        var riskEmoji = plan.RiskAssessment switch
        {
            RemediationRisk.Low => "[LOW]",
            RemediationRisk.Medium => "[MEDIUM]",
            RemediationRisk.High => "[HIGH]",
            _ => "[UNKNOWN]"
        };

        return $"{riskEmoji} Security fix: {plan.Request.VulnerabilityId}";
    }

    /// <summary>
    /// Builds a branch name from a remediation plan.
    /// </summary>
    public string BuildBranchName(RemediationPlan plan)
    {
        ArgumentNullException.ThrowIfNull(plan);

        var sanitizedPlanId = plan.PlanId
            .ToLowerInvariant()
            .Replace(" ", "-")
            .Replace("_", "-");

        return $"stellaops/security-fix/{sanitizedPlanId}";
    }

    private static void AppendSummary(StringBuilder sb, RemediationPlan plan)
    {
        sb.AppendLine("### Summary");
        sb.AppendLine();
        sb.AppendLine($"This PR remediates vulnerability **{plan.Request.VulnerabilityId}** in component **{plan.Request.ComponentPurl}**.");
        sb.AppendLine();

        sb.AppendLine("**Vulnerability addressed:**");
        sb.AppendLine($"- `{plan.Request.VulnerabilityId}`");
        sb.AppendLine();
    }

    private static void AppendSteps(StringBuilder sb, RemediationPlan plan)
    {
        sb.AppendLine("### Remediation Steps");
        sb.AppendLine();

        foreach (var step in plan.Steps.OrderBy(s => s.Order))
        {
            var optionalTag = step.Optional ? " *(optional)*" : "";
            var riskTag = step.Risk != RemediationRisk.Low ? $" [{step.Risk}]" : "";

            sb.AppendLine($"{step.Order}. **{step.ActionType}**{riskTag}{optionalTag}");
            sb.AppendLine($"   - File: `{step.FilePath}`");
            sb.AppendLine($"   - {step.Description}");

            if (!string.IsNullOrEmpty(step.PreviousValue) && !string.IsNullOrEmpty(step.NewValue))
            {
                sb.AppendLine($"   - Change: `{step.PreviousValue}` -> `{step.NewValue}`");
            }

            sb.AppendLine();
        }
    }

    private static void AppendExpectedChanges(StringBuilder sb, RemediationPlan plan)
    {
        sb.AppendLine("### Expected SBOM Changes");
        sb.AppendLine();

        var delta = plan.ExpectedDelta;

        if (delta.Upgraded.Count > 0)
        {
            sb.AppendLine("**Upgrades:**");
            foreach (var (oldPurl, newPurl) in delta.Upgraded.OrderBy(kvp => kvp.Key, StringComparer.Ordinal))
            {
                sb.AppendLine($"- `{oldPurl}` -> `{newPurl}`");
            }
            sb.AppendLine();
        }

        if (delta.Added.Count > 0)
        {
            sb.AppendLine("**Added:**");
            foreach (var purl in delta.Added.OrderBy(p => p, StringComparer.Ordinal))
            {
                sb.AppendLine($"- `{purl}`");
            }
            sb.AppendLine();
        }

        if (delta.Removed.Count > 0)
        {
            sb.AppendLine("**Removed:**");
            foreach (var purl in delta.Removed.OrderBy(p => p, StringComparer.Ordinal))
            {
                sb.AppendLine($"- `{purl}`");
            }
            sb.AppendLine();
        }

        var changeSign = delta.NetVulnerabilityChange <= 0 ? "" : "+";
        sb.AppendLine($"**Net vulnerability change:** {changeSign}{delta.NetVulnerabilityChange}");
        sb.AppendLine();
    }

    private static void AppendTestRequirements(StringBuilder sb, RemediationPlan plan)
    {
        sb.AppendLine("### Test Requirements");
        sb.AppendLine();

        var tests = plan.TestRequirements;

        if (tests.TestSuites.Count > 0)
        {
            sb.AppendLine("**Required test suites:**");
            foreach (var suite in tests.TestSuites.OrderBy(s => s, StringComparer.Ordinal))
            {
                sb.AppendLine($"- `{suite}`");
            }
            sb.AppendLine();
        }

        sb.AppendLine($"- Minimum coverage: {tests.MinCoverage:P0}");
        sb.AppendLine($"- Require all pass: {(tests.RequireAllPass ? "Yes" : "No")}");
        sb.AppendLine($"- Timeout: {tests.Timeout.TotalMinutes:F0} minutes");
        sb.AppendLine();
    }

    private static void AppendRollbackSteps(StringBuilder sb, RemediationPlan plan)
    {
        sb.AppendLine("### Rollback Steps");
        sb.AppendLine();
        sb.AppendLine("If this remediation causes issues, rollback using:");
        sb.AppendLine();
        sb.AppendLine("```bash");
        sb.AppendLine("# Revert this PR");
        sb.AppendLine($"git revert <commit-sha>");
        sb.AppendLine();
        sb.AppendLine("# Or restore previous versions:");

        foreach (var step in plan.Steps.Where(s => !string.IsNullOrEmpty(s.PreviousValue)).OrderBy(s => s.Order))
        {
            sb.AppendLine($"# {step.FilePath}: restore '{step.PreviousValue}'");
        }

        sb.AppendLine("```");
        sb.AppendLine();
    }

    private static void AppendVexClaim(StringBuilder sb, RemediationPlan plan)
    {
        sb.AppendLine("### VEX Claim");
        sb.AppendLine();
        sb.AppendLine("Upon merge, the following VEX statements will be generated:");
        sb.AppendLine();

        sb.AppendLine($"- `{plan.Request.VulnerabilityId}`: status=`fixed`, justification=`vulnerable_code_not_present`");
        sb.AppendLine();
        sb.AppendLine("These VEX statements will be signed and attached to the evidence pack.");
        sb.AppendLine();
    }

    private static void AppendEvidence(StringBuilder sb, RemediationPlan plan)
    {
        if (plan.EvidenceRefs.Count == 0)
        {
            return;
        }

        sb.AppendLine("### Evidence");
        sb.AppendLine();
        sb.AppendLine("**Evidence references:**");
        foreach (var evidenceRef in plan.EvidenceRefs.OrderBy(e => e, StringComparer.Ordinal))
        {
            sb.AppendLine($"- `{evidenceRef}`");
        }
        sb.AppendLine();

        if (plan.InputHashes.Count > 0)
        {
            sb.AppendLine("**Input hashes (for replay):**");
            sb.AppendLine("```");
            foreach (var hash in plan.InputHashes.OrderBy(h => h, StringComparer.Ordinal))
            {
                sb.AppendLine(hash);
            }
            sb.AppendLine("```");
            sb.AppendLine();
        }
    }
}

/// <summary>
/// Rollback step for a remediation.
/// </summary>
public sealed record RollbackStep
{
    /// <summary>
    /// Step order.
    /// </summary>
    public required int Order { get; init; }

    /// <summary>
    /// File to restore.
    /// </summary>
    public required string FilePath { get; init; }

    /// <summary>
    /// Command or action to execute.
    /// </summary>
    public required string Command { get; init; }

    /// <summary>
    /// Description.
    /// </summary>
    public required string Description { get; init; }
}

/// <summary>
/// Generated PR metadata.
/// </summary>
public sealed record PrMetadata
{
    /// <summary>
    /// PR title.
    /// </summary>
    public required string Title { get; init; }

    /// <summary>
    /// Branch name.
    /// </summary>
    public required string BranchName { get; init; }

    /// <summary>
    /// PR body (Markdown).
    /// </summary>
    public required string Body { get; init; }

    /// <summary>
    /// Labels to apply.
    /// </summary>
    public required IReadOnlyList<string> Labels { get; init; }

    /// <summary>
    /// Reviewers to request.
    /// </summary>
    public required IReadOnlyList<string> Reviewers { get; init; }

    /// <summary>
    /// Whether auto-merge should be enabled.
    /// </summary>
    public bool EnableAutoMerge { get; init; }

    /// <summary>
    /// Draft status.
    /// </summary>
    public bool IsDraft { get; init; }
}

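A hypothetical wiring sketch showing how the three builder methods could feed `PrMetadata`; the label names, reviewer handle, and draft rule are illustrative assumptions, not shipped defaults.

```csharp
// Illustrative only: labels, reviewers, and the draft heuristic are assumptions, not shipped defaults.
public static PrMetadata BuildMetadata(PrTemplateBuilder builder, RemediationPlan plan) => new()
{
    Title = builder.BuildPrTitle(plan),
    BranchName = builder.BuildBranchName(plan),
    Body = builder.BuildPrBody(plan),
    Labels = new[] { "security", "automated-remediation" },
    Reviewers = new[] { "security-guild" },
    IsDraft = plan.RiskAssessment != RemediationRisk.Low
};
```
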
@@ -7,6 +7,10 @@
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>

  <ItemGroup>
    <InternalsVisibleTo Include="StellaOps.Bench.AdvisoryAI" />
    <InternalsVisibleTo Include="StellaOps.AdvisoryAI.Tests" />
  </ItemGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Logging.Abstractions" />
    <PackageReference Include="Microsoft.Extensions.Options" />

@@ -0,0 +1,77 @@

{
  "$comment": "Sample path witness predicate payload. Sprint: SPRINT_20260112_006_ATTESTOR_path_witness_predicate (PW-ATT-002)",
  "witness_id": "550e8400-e29b-41d4-a716-446655440000",
  "witness_hash": "blake3:a1b2c3d4e5f6789012345678901234567890123456789012345678901234abcd",
  "witness_type": "reachability_path",
  "provenance": {
    "graph_hash": "blake3:fedcba0987654321fedcba0987654321fedcba0987654321fedcba0987654321",
    "scan_id": "660f9500-f3ac-52e5-b827-557766550111",
    "run_id": "770fa600-g4bd-63f6-c938-668877660222",
    "analyzer_version": "1.0.0",
    "analysis_timestamp": "2026-01-14T12:00:00Z"
  },
  "path": {
    "entrypoint": {
      "fqn": "com.example.MyController.handleRequest",
      "kind": "http_handler",
      "location": {
        "file": "src/main/java/com/example/MyController.java",
        "line": 42
      },
      "node_hash": "sha256:entry1111111111111111111111111111111111111111111111111111111111"
    },
    "sink": {
      "fqn": "org.apache.log4j.Logger.log",
      "cve": "CVE-2021-44228",
      "package": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1",
      "node_hash": "sha256:sink22222222222222222222222222222222222222222222222222222222222"
    },
    "steps": [
      {
        "index": 0,
        "fqn": "com.example.MyController.handleRequest",
        "call_site": "MyController.java:45",
        "edge_type": "call",
        "node_hash": "sha256:entry1111111111111111111111111111111111111111111111111111111111"
      },
      {
        "index": 1,
        "fqn": "com.example.LoggingService.logMessage",
        "call_site": "LoggingService.java:23",
        "edge_type": "call",
        "node_hash": "sha256:middle333333333333333333333333333333333333333333333333333333333"
      },
      {
        "index": 2,
        "fqn": "org.apache.log4j.Logger.log",
        "call_site": "Logger.java:156",
        "edge_type": "sink",
        "node_hash": "sha256:sink22222222222222222222222222222222222222222222222222222222222"
      }
    ],
    "hop_count": 3,
    "path_hash": "sha256:pathab4567890abcdef1234567890abcdef1234567890abcdef1234567890ab",
    "node_hashes": [
      "sha256:entry1111111111111111111111111111111111111111111111111111111111",
      "sha256:middle333333333333333333333333333333333333333333333333333333333",
      "sha256:sink22222222222222222222222222222222222222222222222222222222222"
    ]
  },
  "gates": [
    {
      "type": "auth_required",
      "location": "MyController.java:40",
      "description": "Requires authenticated user via Spring Security"
    }
  ],
  "evidence": {
    "graph_fragment_hash": "blake3:ijkl9012345678901234567890123456789012345678901234567890123456",
    "path_hash": "blake3:mnop3456789012345678901234567890123456789012345678901234567890"
  },
  "evidence_uris": {
    "graph": "cas://sha256:graphabc123456789012345678901234567890123456789012345678901234",
    "sbom": "cas://sha256:sbomdef4567890123456789012345678901234567890123456789012345678",
    "attestation": "cas://sha256:dsseghi7890123456789012345678901234567890123456789012345678901",
    "rekor": "https://rekor.sigstore.dev/api/v1/log/entries/abc123def456"
  }
}

@@ -0,0 +1,228 @@

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://stella.ops/schemas/predicates/path-witness/v1",
  "title": "StellaOps Path Witness Predicate v1",
  "description": "In-toto predicate for path witness attestations proving reachability from entrypoint to vulnerable sink. Sprint: SPRINT_20260112_006_ATTESTOR_path_witness_predicate (PW-ATT-002)",
  "type": "object",
  "required": ["witness_id", "witness_hash", "provenance", "path"],
  "properties": {
    "witness_id": {
      "type": "string",
      "format": "uuid",
      "description": "Unique identifier for this witness"
    },
    "witness_hash": {
      "type": "string",
      "pattern": "^(blake3|sha256):[a-f0-9]{64}$",
      "description": "Hash of the canonical witness payload"
    },
    "witness_type": {
      "type": "string",
      "enum": ["reachability_path", "gate_proof"],
      "default": "reachability_path"
    },
    "provenance": {
      "type": "object",
      "required": ["graph_hash", "analyzer_version", "analysis_timestamp"],
      "properties": {
        "graph_hash": {
          "type": "string",
          "pattern": "^(blake3|sha256):[a-f0-9]{64}$",
          "description": "Hash of the source rich graph"
        },
        "scan_id": {
          "type": "string",
          "format": "uuid"
        },
        "run_id": {
          "type": "string",
          "format": "uuid"
        },
        "analyzer_version": {
          "type": "string"
        },
        "analysis_timestamp": {
          "type": "string",
          "format": "date-time"
        }
      }
    },
    "path": {
      "type": "object",
      "required": ["entrypoint", "sink", "steps", "hop_count"],
      "properties": {
        "entrypoint": {
          "$ref": "#/$defs/pathNode"
        },
        "sink": {
          "$ref": "#/$defs/sinkNode"
        },
        "steps": {
          "type": "array",
          "items": {
            "$ref": "#/$defs/pathStep"
          },
          "minItems": 1
        },
        "hop_count": {
          "type": "integer",
          "minimum": 1
        },
        "path_hash": {
          "type": "string",
          "pattern": "^sha256:[a-f0-9]{64}$",
          "description": "Canonical path hash computed from node hashes"
        },
        "node_hashes": {
          "type": "array",
          "items": {
            "type": "string",
            "pattern": "^sha256:[a-f0-9]{64}$"
          },
          "description": "Top-K node hashes for efficient lookup"
        }
      }
    },
    "gates": {
      "type": "array",
      "items": {
        "$ref": "#/$defs/gate"
      },
      "description": "Protective controls encountered along the path"
    },
    "evidence": {
      "type": "object",
      "properties": {
        "graph_fragment_hash": {
          "type": "string",
          "pattern": "^(blake3|sha256):[a-f0-9]{64}$"
        },
        "path_hash": {
          "type": "string",
          "pattern": "^(blake3|sha256):[a-f0-9]{64}$"
        }
      }
    },
    "evidence_uris": {
      "type": "object",
      "properties": {
        "graph": {
          "type": "string",
          "pattern": "^cas://sha256:[a-f0-9]{64}$"
        },
        "sbom": {
          "type": "string",
          "pattern": "^cas://sha256:[a-f0-9]{64}$"
        },
        "attestation": {
          "type": "string",
          "pattern": "^cas://sha256:[a-f0-9]{64}$"
        },
        "rekor": {
          "type": "string",
          "format": "uri"
        }
      }
    }
  },
  "$defs": {
    "pathNode": {
      "type": "object",
      "required": ["fqn"],
      "properties": {
        "fqn": {
          "type": "string",
          "description": "Fully qualified name of the node"
        },
        "kind": {
          "type": "string",
          "enum": ["http_handler", "grpc_handler", "cli_main", "scheduler", "message_handler", "other"]
        },
        "location": {
          "$ref": "#/$defs/sourceLocation"
        },
        "node_hash": {
          "type": "string",
          "pattern": "^sha256:[a-f0-9]{64}$"
        }
      }
    },
    "sinkNode": {
      "type": "object",
      "required": ["fqn"],
      "properties": {
        "fqn": {
          "type": "string"
        },
        "cve": {
          "type": "string",
          "pattern": "^CVE-\\d{4}-\\d+$"
        },
        "package": {
          "type": "string",
          "description": "Package URL (PURL) of the vulnerable package"
        },
        "node_hash": {
          "type": "string",
          "pattern": "^sha256:[a-f0-9]{64}$"
        }
      }
    },
    "pathStep": {
      "type": "object",
      "required": ["index", "fqn", "edge_type"],
      "properties": {
        "index": {
          "type": "integer",
          "minimum": 0
        },
        "fqn": {
          "type": "string"
        },
        "call_site": {
          "type": "string"
        },
        "edge_type": {
          "type": "string",
          "enum": ["call", "virtual", "static", "sink", "interface", "delegate"]
        },
        "node_hash": {
          "type": "string",
          "pattern": "^sha256:[a-f0-9]{64}$"
        }
      }
    },
    "sourceLocation": {
      "type": "object",
      "properties": {
        "file": {
          "type": "string"
        },
        "line": {
          "type": "integer",
          "minimum": 1
        },
        "column": {
          "type": "integer",
          "minimum": 1
        }
      }
    },
    "gate": {
      "type": "object",
      "required": ["type"],
      "properties": {
        "type": {
          "type": "string",
          "enum": ["auth_required", "feature_flag", "admin_only", "non_default_config", "rate_limited", "other"]
        },
        "location": {
          "type": "string"
        },
        "description": {
          "type": "string"
        }
      }
    }
  }
}

@@ -0,0 +1,69 @@

// <copyright file="PathWitnessPredicateTypes.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_006_ATTESTOR_path_witness_predicate (PW-ATT-003)
// </copyright>

namespace StellaOps.Attestor.Core;

/// <summary>
/// Constants for path witness predicate types used in attestations.
/// </summary>
public static class PathWitnessPredicateTypes
{
    /// <summary>
    /// Canonical predicate type for path witness attestations.
    /// </summary>
    public const string PathWitnessV1 = "https://stella.ops/predicates/path-witness/v1";

    /// <summary>
    /// Alias predicate type using the @ version format.
    /// </summary>
    public const string PathWitnessV1Alias = "stella.ops/pathWitness@v1";

    /// <summary>
    /// Alias predicate type using HTTPS with camelCase.
    /// </summary>
    public const string PathWitnessV1HttpsAlias = "https://stella.ops/pathWitness/v1";

    /// <summary>
    /// All accepted predicate types for path witness attestations.
    /// </summary>
    public static readonly IReadOnlyList<string> AllAcceptedTypes =
    [
        PathWitnessV1,
        PathWitnessV1Alias,
        PathWitnessV1HttpsAlias
    ];

    /// <summary>
    /// Checks if the given predicate type is a path witness type.
    /// </summary>
    /// <param name="predicateType">The predicate type to check.</param>
    /// <returns>True if it's a path witness type, false otherwise.</returns>
    public static bool IsPathWitnessType(string? predicateType)
    {
        if (string.IsNullOrEmpty(predicateType))
        {
            return false;
        }

        return string.Equals(predicateType, PathWitnessV1, StringComparison.Ordinal)
            || string.Equals(predicateType, PathWitnessV1Alias, StringComparison.Ordinal)
            || string.Equals(predicateType, PathWitnessV1HttpsAlias, StringComparison.Ordinal);
    }

    /// <summary>
    /// Normalizes a path witness predicate type to the canonical form.
    /// </summary>
    /// <param name="predicateType">The predicate type to normalize.</param>
    /// <returns>The canonical predicate type, or the original if not a path witness type.</returns>
    public static string NormalizeToCanonical(string predicateType)
    {
        if (IsPathWitnessType(predicateType))
        {
            return PathWitnessV1;
        }

        return predicateType;
    }
}

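A brief usage sketch (the surrounding verification pipeline is assumed, not shown in this file): any known alias is accepted on ingest, while the canonical form is what gets persisted and compared.

```csharp
// Accept any known alias, then store and compare the canonical predicate type.
var incomingType = "stella.ops/pathWitness@v1";

if (PathWitnessPredicateTypes.IsPathWitnessType(incomingType))
{
    var canonicalType = PathWitnessPredicateTypes.NormalizeToCanonical(incomingType);
    // canonicalType == "https://stella.ops/predicates/path-witness/v1"
}
```
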
@@ -0,0 +1,333 @@
// <copyright file="RekorEntryEvent.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_007_ATTESTOR_rekor_entry_events (ATT-REKOR-001, ATT-REKOR-002)
// </copyright>

using System.Collections.Immutable;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json.Serialization;

namespace StellaOps.Attestor.Core.Rekor;

/// <summary>
/// Event emitted when a DSSE bundle is logged to Rekor and inclusion proof is available.
/// Used to drive policy reanalysis and evidence graph updates.
/// </summary>
public sealed record RekorEntryEvent
{
    /// <summary>
    /// Unique event identifier (deterministic based on bundle digest and log index).
    /// </summary>
    [JsonPropertyName("eventId")]
    public required string EventId { get; init; }

    /// <summary>
    /// Event type constant.
    /// </summary>
    [JsonPropertyName("eventType")]
    public string EventType { get; init; } = RekorEventTypes.EntryLogged;

    /// <summary>
    /// Tenant identifier.
    /// </summary>
    [JsonPropertyName("tenant")]
    public required string Tenant { get; init; }

    /// <summary>
    /// SHA-256 digest of the DSSE bundle that was logged.
    /// </summary>
    [JsonPropertyName("bundleDigest")]
    public required string BundleDigest { get; init; }

    /// <summary>
    /// Predicate type from the DSSE envelope.
    /// </summary>
    [JsonPropertyName("predicateType")]
    public required string PredicateType { get; init; }

    /// <summary>
    /// Log index where the entry was recorded.
    /// </summary>
    [JsonPropertyName("logIndex")]
    public required long LogIndex { get; init; }

    /// <summary>
    /// Log ID identifying the Rekor instance.
    /// </summary>
    [JsonPropertyName("logId")]
    public required string LogId { get; init; }

    /// <summary>
    /// Entry UUID in the Rekor log.
    /// </summary>
    [JsonPropertyName("entryUuid")]
    public required string EntryUuid { get; init; }

    /// <summary>
    /// Unix timestamp when the entry was integrated.
    /// </summary>
    [JsonPropertyName("integratedTime")]
    public required long IntegratedTime { get; init; }

    /// <summary>
    /// RFC3339 formatted integrated time for display.
    /// </summary>
    [JsonPropertyName("integratedTimeRfc3339")]
    public required string IntegratedTimeRfc3339 { get; init; }

    /// <summary>
    /// URL to the Rekor entry for UI linking.
    /// </summary>
    [JsonPropertyName("entryUrl")]
    public string? EntryUrl { get; init; }

    /// <summary>
    /// Whether inclusion proof was verified.
    /// </summary>
    [JsonPropertyName("inclusionVerified")]
    public required bool InclusionVerified { get; init; }

    /// <summary>
    /// Policy reanalysis hints extracted from the predicate.
    /// </summary>
    [JsonPropertyName("reanalysisHints")]
    public RekorReanalysisHints? ReanalysisHints { get; init; }

    /// <summary>
    /// UTC timestamp when this event was created.
    /// </summary>
    [JsonPropertyName("createdAtUtc")]
    public required DateTimeOffset CreatedAtUtc { get; init; }

    /// <summary>
    /// Correlation ID for tracing.
    /// </summary>
    [JsonPropertyName("traceId")]
    public string? TraceId { get; init; }
}

/// <summary>
/// Hints for policy reanalysis extracted from the logged predicate.
/// </summary>
public sealed record RekorReanalysisHints
{
    /// <summary>
    /// CVE identifiers affected by this attestation.
    /// </summary>
    [JsonPropertyName("cveIds")]
    public ImmutableArray<string> CveIds { get; init; } = [];

    /// <summary>
    /// Product keys (PURLs) affected by this attestation.
    /// </summary>
    [JsonPropertyName("productKeys")]
    public ImmutableArray<string> ProductKeys { get; init; } = [];

    /// <summary>
    /// Artifact digests covered by this attestation.
    /// </summary>
    [JsonPropertyName("artifactDigests")]
    public ImmutableArray<string> ArtifactDigests { get; init; } = [];

    /// <summary>
    /// Whether this attestation may change a policy decision.
    /// </summary>
    [JsonPropertyName("mayAffectDecision")]
    public bool MayAffectDecision { get; init; }

    /// <summary>
    /// Suggested reanalysis scope (e.g., "cve", "product", "artifact", "all").
    /// </summary>
    [JsonPropertyName("reanalysisScope")]
    public string ReanalysisScope { get; init; } = "none";
}

/// <summary>
/// Well-known Rekor event types.
/// </summary>
public static class RekorEventTypes
{
    /// <summary>
    /// Entry was successfully logged to Rekor with verified inclusion.
    /// </summary>
    public const string EntryLogged = "rekor.entry.logged";

    /// <summary>
    /// Entry was queued for logging (offline mode).
    /// </summary>
    public const string EntryQueued = "rekor.entry.queued";

    /// <summary>
    /// Inclusion proof was verified for a previously logged entry.
    /// </summary>
    public const string InclusionVerified = "rekor.inclusion.verified";

    /// <summary>
    /// Entry logging failed.
    /// </summary>
    public const string EntryFailed = "rekor.entry.failed";
}

/// <summary>
/// Factory for creating deterministic Rekor entry events.
/// </summary>
public static class RekorEntryEventFactory
{
    /// <summary>
    /// Creates a Rekor entry logged event with deterministic event ID.
    /// </summary>
    public static RekorEntryEvent CreateEntryLogged(
        string tenant,
        string bundleDigest,
        string predicateType,
        RekorReceipt receipt,
        DateTimeOffset createdAtUtc,
        RekorReanalysisHints? reanalysisHints = null,
        string? traceId = null)
    {
        var eventId = ComputeEventId(
            RekorEventTypes.EntryLogged,
            bundleDigest,
            receipt.LogIndex);

        var integratedTimeRfc3339 = DateTimeOffset
            .FromUnixTimeSeconds(receipt.IntegratedTime)
            .ToString("yyyy-MM-ddTHH:mm:ssZ", System.Globalization.CultureInfo.InvariantCulture);

        var entryUrl = !string.IsNullOrEmpty(receipt.LogUrl)
            ? $"{receipt.LogUrl.TrimEnd('/')}/api/v1/log/entries/{receipt.Uuid}"
            : null;

        return new RekorEntryEvent
        {
            EventId = eventId,
            EventType = RekorEventTypes.EntryLogged,
            Tenant = tenant,
            BundleDigest = bundleDigest,
            PredicateType = predicateType,
            LogIndex = receipt.LogIndex,
            LogId = receipt.LogId,
            EntryUuid = receipt.Uuid,
            IntegratedTime = receipt.IntegratedTime,
            IntegratedTimeRfc3339 = integratedTimeRfc3339,
            EntryUrl = entryUrl,
            InclusionVerified = true,
            ReanalysisHints = reanalysisHints,
            CreatedAtUtc = createdAtUtc,
            TraceId = traceId
        };
    }

    /// <summary>
    /// Creates a Rekor entry queued event (for offline mode).
    /// </summary>
    public static RekorEntryEvent CreateEntryQueued(
        string tenant,
        string bundleDigest,
        string predicateType,
        string queueId,
        DateTimeOffset createdAtUtc,
        RekorReanalysisHints? reanalysisHints = null,
        string? traceId = null)
    {
        var eventId = ComputeEventId(
            RekorEventTypes.EntryQueued,
            bundleDigest,
            0); // No log index yet

        return new RekorEntryEvent
        {
            EventId = eventId,
            EventType = RekorEventTypes.EntryQueued,
            Tenant = tenant,
            BundleDigest = bundleDigest,
            PredicateType = predicateType,
            LogIndex = -1, // Not yet logged
            LogId = "pending",
            EntryUuid = queueId,
            IntegratedTime = 0,
            IntegratedTimeRfc3339 = "pending",
            EntryUrl = null,
            InclusionVerified = false,
            ReanalysisHints = reanalysisHints,
            CreatedAtUtc = createdAtUtc,
            TraceId = traceId
        };
    }

    /// <summary>
    /// Extracts reanalysis hints from a predicate based on its type.
    /// </summary>
    public static RekorReanalysisHints ExtractReanalysisHints(
        string predicateType,
        IReadOnlyList<string>? cveIds = null,
        IReadOnlyList<string>? productKeys = null,
        IReadOnlyList<string>? artifactDigests = null)
    {
        // Determine if this predicate type affects policy decisions
        var mayAffect = IsDecisionAffectingPredicate(predicateType);
        var scope = DetermineReanalysisScope(predicateType, cveIds, productKeys, artifactDigests);

        return new RekorReanalysisHints
        {
            CveIds = cveIds?.ToImmutableArray() ?? [],
            ProductKeys = productKeys?.ToImmutableArray() ?? [],
            ArtifactDigests = artifactDigests?.ToImmutableArray() ?? [],
            MayAffectDecision = mayAffect,
            ReanalysisScope = scope
        };
    }

    private static bool IsDecisionAffectingPredicate(string predicateType)
    {
        // Predicate types that can change policy decisions
        return predicateType.Contains("vex", StringComparison.OrdinalIgnoreCase)
            || predicateType.Contains("verdict", StringComparison.OrdinalIgnoreCase)
            || predicateType.Contains("path-witness", StringComparison.OrdinalIgnoreCase)
            || predicateType.Contains("evidence", StringComparison.OrdinalIgnoreCase)
            || predicateType.Contains("override", StringComparison.OrdinalIgnoreCase);
    }

    private static string DetermineReanalysisScope(
        string predicateType,
        IReadOnlyList<string>? cveIds,
        IReadOnlyList<string>? productKeys,
        IReadOnlyList<string>? artifactDigests)
    {
        if (cveIds?.Count > 0)
        {
            return "cve";
        }

        if (productKeys?.Count > 0)
        {
            return "product";
        }

        if (artifactDigests?.Count > 0)
        {
            return "artifact";
        }

        // Default scope based on predicate type
        if (predicateType.Contains("vex", StringComparison.OrdinalIgnoreCase))
        {
            return "product";
        }

        if (predicateType.Contains("sbom", StringComparison.OrdinalIgnoreCase))
        {
            return "artifact";
        }

        return "none";
    }

    private static string ComputeEventId(string eventType, string bundleDigest, long logIndex)
    {
        var input = $"{eventType}|{bundleDigest}|{logIndex}";
        var hash = SHA256.HashData(Encoding.UTF8.GetBytes(input));
        return $"rekor-evt-{Convert.ToHexStringLower(hash)[..16]}";
    }
}

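A minimal emission sketch for the factory above (illustrative only: "receipt" is assumed to come from the existing Rekor submission flow, and publishAsync stands in for whatever event bus the host wires up; neither the method nor its parameters are part of this change):

// Illustrative sketch: build and publish a deterministic entry-logged event.
public static Task EmitEntryLoggedAsync(
    string tenant,
    string bundleDigest,
    string predicateType,
    RekorReceipt receipt,
    TimeProvider timeProvider,
    Func<RekorEntryEvent, Task> publishAsync)
{
    // Hints are derived from the predicate type; identical inputs yield the same
    // EventId, so downstream consumers can deduplicate replayed events.
    var hints = RekorEntryEventFactory.ExtractReanalysisHints(predicateType);
    var evt = RekorEntryEventFactory.CreateEntryLogged(
        tenant,
        bundleDigest,
        predicateType,
        receipt,
        timeProvider.GetUtcNow(),
        hints);

    return publishAsync(evt);
}
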
@@ -31,7 +31,13 @@ public sealed class PredicateTypeRouter : IPredicateTypeRouter
         // Delta predicate types for lineage comparison (Sprint 20251228_007)
         "stella.ops/vex-delta@v1",
         "stella.ops/sbom-delta@v1",
-        "stella.ops/verdict-delta@v1"
+        "stella.ops/verdict-delta@v1",
+        // Path witness predicates (Sprint: SPRINT_20260112_006_ATTESTOR_path_witness_predicate PW-ATT-001)
+        // Canonical predicate type
+        "https://stella.ops/predicates/path-witness/v1",
+        // Aliases for backward compatibility
+        "stella.ops/pathWitness@v1",
+        "https://stella.ops/pathWitness/v1"
     };
 
     public PredicateTypeRouter(

@@ -0,0 +1,165 @@
// -----------------------------------------------------------------------------
// VexOverridePredicate.cs
// Sprint: SPRINT_20260112_004_ATTESTOR_vex_override_predicate (ATT-VEX-001)
// Description: VEX override predicate models for attestations
// -----------------------------------------------------------------------------

using System.Collections.Immutable;

namespace StellaOps.Attestor.StandardPredicates.VexOverride;

/// <summary>
/// VEX override predicate type URI.
/// </summary>
public static class VexOverridePredicateTypes
{
    /// <summary>
    /// The predicate type URI for VEX override attestations.
    /// </summary>
    public const string PredicateTypeUri = "https://stellaops.dev/attestations/vex-override/v1";
}

/// <summary>
/// VEX override decision indicating the operator's assessment.
/// </summary>
public enum VexOverrideDecision
{
    /// <summary>
    /// The vulnerability does not affect this artifact/configuration.
    /// </summary>
    NotAffected = 1,

    /// <summary>
    /// The vulnerability is mitigated by compensating controls.
    /// </summary>
    Mitigated = 2,

    /// <summary>
    /// The vulnerability has been accepted as a known risk.
    /// </summary>
    Accepted = 3,

    /// <summary>
    /// The vulnerability assessment is still under investigation.
    /// </summary>
    UnderInvestigation = 4
}

/// <summary>
/// VEX override predicate payload for in-toto/DSSE attestations.
/// Represents an operator decision to override or annotate a vulnerability status.
/// </summary>
public sealed record VexOverridePredicate
{
    /// <summary>
    /// The predicate type URI.
    /// </summary>
    public string PredicateType { get; init; } = VexOverridePredicateTypes.PredicateTypeUri;

    /// <summary>
    /// Artifact digest this override applies to (e.g., sha256:abc123...).
    /// </summary>
    public required string ArtifactDigest { get; init; }

    /// <summary>
    /// Vulnerability ID being overridden (e.g., CVE-2024-12345).
    /// </summary>
    public required string VulnerabilityId { get; init; }

    /// <summary>
    /// The operator's decision.
    /// </summary>
    public required VexOverrideDecision Decision { get; init; }

    /// <summary>
    /// Human-readable justification for the decision.
    /// </summary>
    public required string Justification { get; init; }

    /// <summary>
    /// UTC timestamp when the decision was made.
    /// </summary>
    public required DateTimeOffset DecisionTime { get; init; }

    /// <summary>
    /// Identifier of the operator/user who made the decision.
    /// </summary>
    public required string OperatorId { get; init; }

    /// <summary>
    /// Optional expiration time for this override.
    /// </summary>
    public DateTimeOffset? ExpiresAt { get; init; }

    /// <summary>
    /// Evidence references supporting this decision.
    /// </summary>
    public ImmutableArray<EvidenceReference> EvidenceRefs { get; init; } = ImmutableArray<EvidenceReference>.Empty;

    /// <summary>
    /// Tool information that created this predicate.
    /// </summary>
    public ToolInfo? Tool { get; init; }

    /// <summary>
    /// Rule digest that triggered or was overridden by this decision.
    /// </summary>
    public string? RuleDigest { get; init; }

    /// <summary>
    /// Hash of the reachability trace at decision time, if applicable.
    /// </summary>
    public string? TraceHash { get; init; }

    /// <summary>
    /// Additional metadata as key-value pairs.
    /// </summary>
    public ImmutableDictionary<string, string> Metadata { get; init; } = ImmutableDictionary<string, string>.Empty;
}

/// <summary>
/// Reference to supporting evidence for a VEX override decision.
/// </summary>
public sealed record EvidenceReference
{
    /// <summary>
    /// Type of evidence (e.g., "document", "ticket", "scan_report").
    /// </summary>
    public required string Type { get; init; }

    /// <summary>
    /// URI or identifier for the evidence.
    /// </summary>
    public required string Uri { get; init; }

    /// <summary>
    /// Optional digest of the evidence content.
    /// </summary>
    public string? Digest { get; init; }

    /// <summary>
    /// Optional description of the evidence.
    /// </summary>
    public string? Description { get; init; }
}

/// <summary>
/// Tool information for the predicate.
/// </summary>
public sealed record ToolInfo
{
    /// <summary>
    /// Tool name.
    /// </summary>
    public required string Name { get; init; }

    /// <summary>
    /// Tool version.
    /// </summary>
    public required string Version { get; init; }

    /// <summary>
    /// Optional tool vendor.
    /// </summary>
    public string? Vendor { get; init; }
}

@@ -0,0 +1,333 @@
|
|||||||
|
// -----------------------------------------------------------------------------
|
||||||
|
// VexOverridePredicateBuilder.cs
|
||||||
|
// Sprint: SPRINT_20260112_004_ATTESTOR_vex_override_predicate (ATT-VEX-002)
|
||||||
|
// Description: Builder for VEX override predicate payloads with DSSE envelope creation
|
||||||
|
// -----------------------------------------------------------------------------
|
||||||
|
|
||||||
|
using System.Collections.Immutable;
|
||||||
|
using System.Globalization;
|
||||||
|
using System.Text;
|
||||||
|
using System.Text.Json;
|
||||||
|
|
||||||
|
namespace StellaOps.Attestor.StandardPredicates.VexOverride;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Builder for creating VEX override predicate payloads.
|
||||||
|
/// Produces RFC 8785 canonical JSON for deterministic hashing.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class VexOverridePredicateBuilder
|
||||||
|
{
|
||||||
|
private string? _artifactDigest;
|
||||||
|
private string? _vulnerabilityId;
|
||||||
|
private VexOverrideDecision? _decision;
|
||||||
|
private string? _justification;
|
||||||
|
private DateTimeOffset? _decisionTime;
|
||||||
|
private string? _operatorId;
|
||||||
|
private DateTimeOffset? _expiresAt;
|
||||||
|
private readonly List<EvidenceReference> _evidenceRefs = new();
|
||||||
|
private ToolInfo? _tool;
|
||||||
|
private string? _ruleDigest;
|
||||||
|
private string? _traceHash;
|
||||||
|
private readonly Dictionary<string, string> _metadata = new(StringComparer.Ordinal);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the artifact digest this override applies to.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithArtifactDigest(string artifactDigest)
|
||||||
|
{
|
||||||
|
_artifactDigest = artifactDigest ?? throw new ArgumentNullException(nameof(artifactDigest));
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the vulnerability ID being overridden.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithVulnerabilityId(string vulnerabilityId)
|
||||||
|
{
|
||||||
|
_vulnerabilityId = vulnerabilityId ?? throw new ArgumentNullException(nameof(vulnerabilityId));
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the operator's decision.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithDecision(VexOverrideDecision decision)
|
||||||
|
{
|
||||||
|
_decision = decision;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the justification for the decision.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithJustification(string justification)
|
||||||
|
{
|
||||||
|
_justification = justification ?? throw new ArgumentNullException(nameof(justification));
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the decision time.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithDecisionTime(DateTimeOffset decisionTime)
|
||||||
|
{
|
||||||
|
_decisionTime = decisionTime;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the operator ID.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithOperatorId(string operatorId)
|
||||||
|
{
|
||||||
|
_operatorId = operatorId ?? throw new ArgumentNullException(nameof(operatorId));
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the optional expiration time.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithExpiresAt(DateTimeOffset expiresAt)
|
||||||
|
{
|
||||||
|
_expiresAt = expiresAt;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Adds an evidence reference.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder AddEvidenceRef(EvidenceReference evidenceRef)
|
||||||
|
{
|
||||||
|
_evidenceRefs.Add(evidenceRef ?? throw new ArgumentNullException(nameof(evidenceRef)));
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Adds an evidence reference.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder AddEvidenceRef(string type, string uri, string? digest = null, string? description = null)
|
||||||
|
{
|
||||||
|
_evidenceRefs.Add(new EvidenceReference
|
||||||
|
{
|
||||||
|
Type = type,
|
||||||
|
Uri = uri,
|
||||||
|
Digest = digest,
|
||||||
|
Description = description
|
||||||
|
});
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the tool information.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithTool(string name, string version, string? vendor = null)
|
||||||
|
{
|
||||||
|
_tool = new ToolInfo
|
||||||
|
{
|
||||||
|
Name = name,
|
||||||
|
Version = version,
|
||||||
|
Vendor = vendor
|
||||||
|
};
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the rule digest.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithRuleDigest(string ruleDigest)
|
||||||
|
{
|
||||||
|
_ruleDigest = ruleDigest;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Sets the trace hash.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithTraceHash(string traceHash)
|
||||||
|
{
|
||||||
|
_traceHash = traceHash;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Adds metadata.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateBuilder WithMetadata(string key, string value)
|
||||||
|
{
|
||||||
|
_metadata[key] = value;
|
||||||
|
return this;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Builds the VEX override predicate.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicate Build()
|
||||||
|
{
|
||||||
|
if (string.IsNullOrWhiteSpace(_artifactDigest))
|
||||||
|
{
|
||||||
|
throw new InvalidOperationException("ArtifactDigest is required.");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (string.IsNullOrWhiteSpace(_vulnerabilityId))
|
||||||
|
{
|
||||||
|
throw new InvalidOperationException("VulnerabilityId is required.");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (_decision is null)
|
||||||
|
{
|
||||||
|
throw new InvalidOperationException("Decision is required.");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (string.IsNullOrWhiteSpace(_justification))
|
||||||
|
{
|
||||||
|
throw new InvalidOperationException("Justification is required.");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (_decisionTime is null)
|
||||||
|
{
|
||||||
|
throw new InvalidOperationException("DecisionTime is required.");
|
||||||
|
}
|
||||||
|
|
||||||
|
if (string.IsNullOrWhiteSpace(_operatorId))
|
||||||
|
{
|
||||||
|
throw new InvalidOperationException("OperatorId is required.");
|
||||||
|
}
|
||||||
|
|
||||||
|
return new VexOverridePredicate
|
||||||
|
{
|
||||||
|
ArtifactDigest = _artifactDigest,
|
||||||
|
VulnerabilityId = _vulnerabilityId,
|
||||||
|
Decision = _decision.Value,
|
||||||
|
Justification = _justification,
|
||||||
|
DecisionTime = _decisionTime.Value,
|
||||||
|
OperatorId = _operatorId,
|
||||||
|
ExpiresAt = _expiresAt,
|
||||||
|
EvidenceRefs = _evidenceRefs.ToImmutableArray(),
|
||||||
|
Tool = _tool,
|
||||||
|
RuleDigest = _ruleDigest,
|
||||||
|
TraceHash = _traceHash,
|
||||||
|
Metadata = _metadata.ToImmutableDictionary()
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Builds and serializes the predicate to canonical JSON.
|
||||||
|
/// </summary>
|
||||||
|
public string BuildCanonicalJson()
|
||||||
|
{
|
||||||
|
var predicate = Build();
|
||||||
|
var json = SerializeToJson(predicate);
|
||||||
|
return JsonCanonicalizer.Canonicalize(json);
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Builds and serializes the predicate to JSON bytes.
|
||||||
|
/// </summary>
|
||||||
|
public byte[] BuildJsonBytes()
|
||||||
|
{
|
||||||
|
var canonicalJson = BuildCanonicalJson();
|
||||||
|
return Encoding.UTF8.GetBytes(canonicalJson);
|
||||||
|
}
|
||||||
|
|
||||||
|
private static string SerializeToJson(VexOverridePredicate predicate)
|
||||||
|
{
|
||||||
|
using var stream = new MemoryStream();
|
||||||
|
using var writer = new Utf8JsonWriter(stream, new JsonWriterOptions { Indented = false });
|
||||||
|
|
||||||
|
writer.WriteStartObject();
|
||||||
|
|
||||||
|
// Write fields in deterministic order (alphabetical)
|
||||||
|
writer.WriteString("artifactDigest", predicate.ArtifactDigest);
|
||||||
|
writer.WriteString("decision", DecisionToString(predicate.Decision));
|
||||||
|
writer.WriteString("decisionTime", predicate.DecisionTime.UtcDateTime.ToString("O", CultureInfo.InvariantCulture));
|
||||||
|
|
||||||
|
// evidenceRefs (only if non-empty)
|
||||||
|
if (predicate.EvidenceRefs.Length > 0)
|
||||||
|
{
|
||||||
|
writer.WriteStartArray("evidenceRefs");
|
||||||
|
foreach (var evidenceRef in predicate.EvidenceRefs.OrderBy(e => e.Type, StringComparer.Ordinal)
|
||||||
|
.ThenBy(e => e.Uri, StringComparer.Ordinal))
|
||||||
|
{
|
||||||
|
writer.WriteStartObject();
|
||||||
|
if (evidenceRef.Description is not null)
|
||||||
|
{
|
||||||
|
writer.WriteString("description", evidenceRef.Description);
|
||||||
|
}
|
||||||
|
if (evidenceRef.Digest is not null)
|
||||||
|
{
|
||||||
|
writer.WriteString("digest", evidenceRef.Digest);
|
||||||
|
}
|
||||||
|
writer.WriteString("type", evidenceRef.Type);
|
||||||
|
writer.WriteString("uri", evidenceRef.Uri);
|
||||||
|
writer.WriteEndObject();
|
||||||
|
}
|
||||||
|
writer.WriteEndArray();
|
||||||
|
}
|
||||||
|
|
||||||
|
// expiresAt (optional)
|
||||||
|
if (predicate.ExpiresAt.HasValue)
|
||||||
|
{
|
||||||
|
writer.WriteString("expiresAt", predicate.ExpiresAt.Value.UtcDateTime.ToString("O", CultureInfo.InvariantCulture));
|
||||||
|
}
|
||||||
|
|
||||||
|
writer.WriteString("justification", predicate.Justification);
|
||||||
|
|
||||||
|
// metadata (only if non-empty)
|
||||||
|
if (predicate.Metadata.Count > 0)
|
||||||
|
{
|
||||||
|
writer.WriteStartObject("metadata");
|
||||||
|
foreach (var kvp in predicate.Metadata.OrderBy(k => k.Key, StringComparer.Ordinal))
|
||||||
|
{
|
||||||
|
writer.WriteString(kvp.Key, kvp.Value);
|
||||||
|
}
|
||||||
|
writer.WriteEndObject();
|
||||||
|
}
|
||||||
|
|
||||||
|
writer.WriteString("operatorId", predicate.OperatorId);
|
||||||
|
writer.WriteString("predicateType", predicate.PredicateType);
|
||||||
|
|
||||||
|
// ruleDigest (optional)
|
||||||
|
if (predicate.RuleDigest is not null)
|
||||||
|
{
|
||||||
|
writer.WriteString("ruleDigest", predicate.RuleDigest);
|
||||||
|
}
|
||||||
|
|
||||||
|
// tool (optional)
|
||||||
|
if (predicate.Tool is not null)
|
||||||
|
{
|
||||||
|
writer.WriteStartObject("tool");
|
||||||
|
writer.WriteString("name", predicate.Tool.Name);
|
||||||
|
if (predicate.Tool.Vendor is not null)
|
||||||
|
{
|
||||||
|
writer.WriteString("vendor", predicate.Tool.Vendor);
|
||||||
|
}
|
||||||
|
writer.WriteString("version", predicate.Tool.Version);
|
||||||
|
writer.WriteEndObject();
|
||||||
|
}
|
||||||
|
|
||||||
|
// traceHash (optional)
|
||||||
|
if (predicate.TraceHash is not null)
|
||||||
|
{
|
||||||
|
writer.WriteString("traceHash", predicate.TraceHash);
|
||||||
|
}
|
||||||
|
|
||||||
|
writer.WriteString("vulnerabilityId", predicate.VulnerabilityId);
|
||||||
|
|
||||||
|
writer.WriteEndObject();
|
||||||
|
writer.Flush();
|
||||||
|
|
||||||
|
return Encoding.UTF8.GetString(stream.ToArray());
|
||||||
|
}
|
||||||
|
|
||||||
|
private static string DecisionToString(VexOverrideDecision decision)
|
||||||
|
{
|
||||||
|
return decision switch
|
||||||
|
{
|
||||||
|
VexOverrideDecision.NotAffected => "not_affected",
|
||||||
|
VexOverrideDecision.Mitigated => "mitigated",
|
||||||
|
VexOverrideDecision.Accepted => "accepted",
|
||||||
|
VexOverrideDecision.UnderInvestigation => "under_investigation",
|
||||||
|
_ => throw new ArgumentOutOfRangeException(nameof(decision))
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
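A minimal builder usage sketch (illustrative only: the digest, CVE id, operator, and evidence URI are placeholder values, and the DSSE signing step that would consume the payload bytes is out of scope here):

// Illustrative sketch: assemble a VEX override and serialize it deterministically.
var payloadBytes = new VexOverridePredicateBuilder()
    .WithArtifactDigest("sha256:abc123")
    .WithVulnerabilityId("CVE-2024-12345")
    .WithDecision(VexOverrideDecision.NotAffected)
    .WithJustification("Vulnerable code path is not reachable in this image")
    .WithDecisionTime(DateTimeOffset.UtcNow)
    .WithOperatorId("user@example.com")
    .AddEvidenceRef("document", "https://example.com/analysis", digest: null, description: "Reachability analysis")
    .BuildJsonBytes(); // RFC 8785 canonical JSON, UTF-8 encoded
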
@@ -0,0 +1,438 @@
|
|||||||
|
// -----------------------------------------------------------------------------
|
||||||
|
// VexOverridePredicateParser.cs
|
||||||
|
// Sprint: SPRINT_20260112_004_ATTESTOR_vex_override_predicate (ATT-VEX-002)
|
||||||
|
// Description: Parser for VEX override predicate payloads
|
||||||
|
// -----------------------------------------------------------------------------
|
||||||
|
|
||||||
|
using System.Collections.Immutable;
|
||||||
|
using System.Globalization;
|
||||||
|
using System.Text.Json;
|
||||||
|
using Microsoft.Extensions.Logging;
|
||||||
|
|
||||||
|
namespace StellaOps.Attestor.StandardPredicates.VexOverride;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Parser for VEX override predicate payloads.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class VexOverridePredicateParser : IPredicateParser
|
||||||
|
{
|
||||||
|
private readonly ILogger<VexOverridePredicateParser> _logger;
|
||||||
|
|
||||||
|
/// <inheritdoc/>
|
||||||
|
public string PredicateType => VexOverridePredicateTypes.PredicateTypeUri;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Initializes a new instance of the <see cref="VexOverridePredicateParser"/> class.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicateParser(ILogger<VexOverridePredicateParser> logger)
|
||||||
|
{
|
||||||
|
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <inheritdoc/>
|
||||||
|
public PredicateParseResult Parse(JsonElement predicatePayload)
|
||||||
|
{
|
||||||
|
var errors = new List<ValidationError>();
|
||||||
|
var warnings = new List<ValidationWarning>();
|
||||||
|
|
||||||
|
// Validate required fields
|
||||||
|
if (!predicatePayload.TryGetProperty("artifactDigest", out var artifactDigestEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(artifactDigestEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.artifactDigest", "Missing required field: artifactDigest", "VEX_MISSING_ARTIFACT_DIGEST"));
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!predicatePayload.TryGetProperty("vulnerabilityId", out var vulnIdEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(vulnIdEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.vulnerabilityId", "Missing required field: vulnerabilityId", "VEX_MISSING_VULN_ID"));
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!predicatePayload.TryGetProperty("decision", out var decisionEl))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.decision", "Missing required field: decision", "VEX_MISSING_DECISION"));
|
||||||
|
}
|
||||||
|
else
|
||||||
|
{
|
||||||
|
ValidateDecision(decisionEl, errors);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!predicatePayload.TryGetProperty("justification", out var justificationEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(justificationEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.justification", "Missing required field: justification", "VEX_MISSING_JUSTIFICATION"));
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!predicatePayload.TryGetProperty("decisionTime", out var decisionTimeEl))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.decisionTime", "Missing required field: decisionTime", "VEX_MISSING_DECISION_TIME"));
|
||||||
|
}
|
||||||
|
else
|
||||||
|
{
|
||||||
|
ValidateTimestamp(decisionTimeEl, "$.decisionTime", errors);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!predicatePayload.TryGetProperty("operatorId", out var operatorIdEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(operatorIdEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.operatorId", "Missing required field: operatorId", "VEX_MISSING_OPERATOR_ID"));
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate optional fields
|
||||||
|
if (predicatePayload.TryGetProperty("expiresAt", out var expiresAtEl))
|
||||||
|
{
|
||||||
|
ValidateTimestamp(expiresAtEl, "$.expiresAt", errors);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (predicatePayload.TryGetProperty("evidenceRefs", out var evidenceRefsEl))
|
||||||
|
{
|
||||||
|
ValidateEvidenceRefs(evidenceRefsEl, errors, warnings);
|
||||||
|
}
|
||||||
|
|
||||||
|
if (predicatePayload.TryGetProperty("tool", out var toolEl))
|
||||||
|
{
|
||||||
|
ValidateTool(toolEl, errors);
|
||||||
|
}
|
||||||
|
|
||||||
|
_logger.LogDebug(
|
||||||
|
"Parsed VEX override predicate with {ErrorCount} errors, {WarningCount} warnings",
|
||||||
|
errors.Count, warnings.Count);
|
||||||
|
|
||||||
|
// Extract metadata
|
||||||
|
var metadata = new PredicateMetadata
|
||||||
|
{
|
||||||
|
PredicateType = PredicateType,
|
||||||
|
Format = "vex-override",
|
||||||
|
Version = "1.0",
|
||||||
|
Properties = ExtractMetadata(predicatePayload)
|
||||||
|
};
|
||||||
|
|
||||||
|
return new PredicateParseResult
|
||||||
|
{
|
||||||
|
IsValid = errors.Count == 0,
|
||||||
|
Metadata = metadata,
|
||||||
|
Errors = errors,
|
||||||
|
Warnings = warnings
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <inheritdoc/>
|
||||||
|
public SbomExtractionResult? ExtractSbom(JsonElement predicatePayload)
|
||||||
|
{
|
||||||
|
// VEX override is not an SBOM
|
||||||
|
_logger.LogDebug("VEX override predicate does not contain SBOM content (this is expected)");
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Parses a VEX override predicate payload into the typed model.
|
||||||
|
/// </summary>
|
||||||
|
public VexOverridePredicate? ParsePredicate(JsonElement predicatePayload)
|
||||||
|
{
|
||||||
|
try
|
||||||
|
{
|
||||||
|
var artifactDigest = predicatePayload.GetProperty("artifactDigest").GetString()!;
|
||||||
|
var vulnerabilityId = predicatePayload.GetProperty("vulnerabilityId").GetString()!;
|
||||||
|
var decision = ParseDecision(predicatePayload.GetProperty("decision"));
|
||||||
|
var justification = predicatePayload.GetProperty("justification").GetString()!;
|
||||||
|
var decisionTime = DateTimeOffset.Parse(
|
||||||
|
predicatePayload.GetProperty("decisionTime").GetString()!,
|
||||||
|
CultureInfo.InvariantCulture,
|
||||||
|
DateTimeStyles.RoundtripKind);
|
||||||
|
var operatorId = predicatePayload.GetProperty("operatorId").GetString()!;
|
||||||
|
|
||||||
|
DateTimeOffset? expiresAt = null;
|
||||||
|
if (predicatePayload.TryGetProperty("expiresAt", out var expiresAtEl) &&
|
||||||
|
expiresAtEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
expiresAt = DateTimeOffset.Parse(
|
||||||
|
expiresAtEl.GetString()!,
|
||||||
|
CultureInfo.InvariantCulture,
|
||||||
|
DateTimeStyles.RoundtripKind);
|
||||||
|
}
|
||||||
|
|
||||||
|
var evidenceRefs = ImmutableArray<EvidenceReference>.Empty;
|
||||||
|
if (predicatePayload.TryGetProperty("evidenceRefs", out var evidenceRefsEl) &&
|
||||||
|
evidenceRefsEl.ValueKind == JsonValueKind.Array)
|
||||||
|
{
|
||||||
|
evidenceRefs = ParseEvidenceRefs(evidenceRefsEl);
|
||||||
|
}
|
||||||
|
|
||||||
|
ToolInfo? tool = null;
|
||||||
|
if (predicatePayload.TryGetProperty("tool", out var toolEl) &&
|
||||||
|
toolEl.ValueKind == JsonValueKind.Object)
|
||||||
|
{
|
||||||
|
tool = ParseTool(toolEl);
|
||||||
|
}
|
||||||
|
|
||||||
|
string? ruleDigest = null;
|
||||||
|
if (predicatePayload.TryGetProperty("ruleDigest", out var ruleDigestEl) &&
|
||||||
|
ruleDigestEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
ruleDigest = ruleDigestEl.GetString();
|
||||||
|
}
|
||||||
|
|
||||||
|
string? traceHash = null;
|
||||||
|
if (predicatePayload.TryGetProperty("traceHash", out var traceHashEl) &&
|
||||||
|
traceHashEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
traceHash = traceHashEl.GetString();
|
||||||
|
}
|
||||||
|
|
||||||
|
var metadata = ImmutableDictionary<string, string>.Empty;
|
||||||
|
if (predicatePayload.TryGetProperty("metadata", out var metadataEl) &&
|
||||||
|
metadataEl.ValueKind == JsonValueKind.Object)
|
||||||
|
{
|
||||||
|
metadata = ParseMetadata(metadataEl);
|
||||||
|
}
|
||||||
|
|
||||||
|
return new VexOverridePredicate
|
||||||
|
{
|
||||||
|
ArtifactDigest = artifactDigest,
|
||||||
|
VulnerabilityId = vulnerabilityId,
|
||||||
|
Decision = decision,
|
||||||
|
Justification = justification,
|
||||||
|
DecisionTime = decisionTime,
|
||||||
|
OperatorId = operatorId,
|
||||||
|
ExpiresAt = expiresAt,
|
||||||
|
EvidenceRefs = evidenceRefs,
|
||||||
|
Tool = tool,
|
||||||
|
RuleDigest = ruleDigest,
|
||||||
|
TraceHash = traceHash,
|
||||||
|
Metadata = metadata
|
||||||
|
};
|
||||||
|
}
|
||||||
|
catch (Exception ex)
|
||||||
|
{
|
||||||
|
_logger.LogWarning(ex, "Failed to parse VEX override predicate");
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private void ValidateDecision(JsonElement decisionEl, List<ValidationError> errors)
|
||||||
|
{
|
||||||
|
var validDecisions = new[] { "not_affected", "mitigated", "accepted", "under_investigation" };
|
||||||
|
|
||||||
|
if (decisionEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
var decision = decisionEl.GetString();
|
||||||
|
if (string.IsNullOrWhiteSpace(decision) || !validDecisions.Contains(decision, StringComparer.OrdinalIgnoreCase))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError(
|
||||||
|
"$.decision",
|
||||||
|
$"Invalid decision value. Must be one of: {string.Join(", ", validDecisions)}",
|
||||||
|
"VEX_INVALID_DECISION"));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
else if (decisionEl.ValueKind == JsonValueKind.Number)
|
||||||
|
{
|
||||||
|
var value = decisionEl.GetInt32();
|
||||||
|
if (value < 1 || value > 4)
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError(
|
||||||
|
"$.decision",
|
||||||
|
"Invalid decision value. Numeric values must be 1-4.",
|
||||||
|
"VEX_INVALID_DECISION"));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
else
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError(
|
||||||
|
"$.decision",
|
||||||
|
"Decision must be a string or number",
|
||||||
|
"VEX_INVALID_DECISION_TYPE"));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private static void ValidateTimestamp(JsonElement timestampEl, string path, List<ValidationError> errors)
|
||||||
|
{
|
||||||
|
if (timestampEl.ValueKind != JsonValueKind.String)
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError(path, "Timestamp must be a string", "VEX_INVALID_TIMESTAMP_TYPE"));
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
var value = timestampEl.GetString();
|
||||||
|
if (!DateTimeOffset.TryParse(value, CultureInfo.InvariantCulture, DateTimeStyles.RoundtripKind, out _))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError(path, "Invalid ISO 8601 timestamp format", "VEX_INVALID_TIMESTAMP"));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private static void ValidateEvidenceRefs(
|
||||||
|
JsonElement evidenceRefsEl,
|
||||||
|
List<ValidationError> errors,
|
||||||
|
List<ValidationWarning> warnings)
|
||||||
|
{
|
||||||
|
if (evidenceRefsEl.ValueKind != JsonValueKind.Array)
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.evidenceRefs", "evidenceRefs must be an array", "VEX_INVALID_EVIDENCE_REFS"));
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
var index = 0;
|
||||||
|
foreach (var refEl in evidenceRefsEl.EnumerateArray())
|
||||||
|
{
|
||||||
|
var path = $"$.evidenceRefs[{index}]";
|
||||||
|
|
||||||
|
if (!refEl.TryGetProperty("type", out var typeEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(typeEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError($"{path}.type", "Missing required field: type", "VEX_MISSING_EVIDENCE_TYPE"));
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!refEl.TryGetProperty("uri", out var uriEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(uriEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError($"{path}.uri", "Missing required field: uri", "VEX_MISSING_EVIDENCE_URI"));
|
||||||
|
}
|
||||||
|
|
||||||
|
index++;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (index == 0)
|
||||||
|
{
|
||||||
|
warnings.Add(new ValidationWarning("$.evidenceRefs", "No evidence references provided", "VEX_NO_EVIDENCE"));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private static void ValidateTool(JsonElement toolEl, List<ValidationError> errors)
|
||||||
|
{
|
||||||
|
if (toolEl.ValueKind != JsonValueKind.Object)
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.tool", "tool must be an object", "VEX_INVALID_TOOL"));
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!toolEl.TryGetProperty("name", out var nameEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(nameEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.tool.name", "Missing required field: tool.name", "VEX_MISSING_TOOL_NAME"));
|
||||||
|
}
|
||||||
|
|
||||||
|
if (!toolEl.TryGetProperty("version", out var versionEl) ||
|
||||||
|
string.IsNullOrWhiteSpace(versionEl.GetString()))
|
||||||
|
{
|
||||||
|
errors.Add(new ValidationError("$.tool.version", "Missing required field: tool.version", "VEX_MISSING_TOOL_VERSION"));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private static VexOverrideDecision ParseDecision(JsonElement decisionEl)
|
||||||
|
{
|
||||||
|
if (decisionEl.ValueKind == JsonValueKind.Number)
|
||||||
|
{
|
||||||
|
return (VexOverrideDecision)decisionEl.GetInt32();
|
||||||
|
}
|
||||||
|
|
||||||
|
var value = decisionEl.GetString()?.ToLowerInvariant();
|
||||||
|
return value switch
|
||||||
|
{
|
||||||
|
"not_affected" => VexOverrideDecision.NotAffected,
|
||||||
|
"mitigated" => VexOverrideDecision.Mitigated,
|
||||||
|
"accepted" => VexOverrideDecision.Accepted,
|
||||||
|
"under_investigation" => VexOverrideDecision.UnderInvestigation,
|
||||||
|
_ => throw new ArgumentException($"Invalid decision value: {value}")
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
private static ImmutableArray<EvidenceReference> ParseEvidenceRefs(JsonElement evidenceRefsEl)
|
||||||
|
{
|
||||||
|
var builder = ImmutableArray.CreateBuilder<EvidenceReference>();
|
||||||
|
|
||||||
|
foreach (var refEl in evidenceRefsEl.EnumerateArray())
|
||||||
|
{
|
||||||
|
var type = refEl.GetProperty("type").GetString()!;
|
||||||
|
var uri = refEl.GetProperty("uri").GetString()!;
|
||||||
|
|
||||||
|
string? digest = null;
|
||||||
|
if (refEl.TryGetProperty("digest", out var digestEl) &&
|
||||||
|
digestEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
digest = digestEl.GetString();
|
||||||
|
}
|
||||||
|
|
||||||
|
string? description = null;
|
||||||
|
if (refEl.TryGetProperty("description", out var descEl) &&
|
||||||
|
descEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
description = descEl.GetString();
|
||||||
|
}
|
||||||
|
|
||||||
|
builder.Add(new EvidenceReference
|
||||||
|
{
|
||||||
|
Type = type,
|
||||||
|
Uri = uri,
|
||||||
|
Digest = digest,
|
||||||
|
Description = description
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
return builder.ToImmutable();
|
||||||
|
}
|
||||||
|
|
||||||
|
private static ToolInfo ParseTool(JsonElement toolEl)
|
||||||
|
{
|
||||||
|
var name = toolEl.GetProperty("name").GetString()!;
|
||||||
|
var version = toolEl.GetProperty("version").GetString()!;
|
||||||
|
|
||||||
|
string? vendor = null;
|
||||||
|
if (toolEl.TryGetProperty("vendor", out var vendorEl) &&
|
||||||
|
vendorEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
vendor = vendorEl.GetString();
|
||||||
|
}
|
||||||
|
|
||||||
|
return new ToolInfo
|
||||||
|
{
|
||||||
|
Name = name,
|
||||||
|
Version = version,
|
||||||
|
Vendor = vendor
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
private static ImmutableDictionary<string, string> ParseMetadata(JsonElement metadataEl)
|
||||||
|
{
|
||||||
|
var builder = ImmutableDictionary.CreateBuilder<string, string>();
|
||||||
|
|
||||||
|
foreach (var prop in metadataEl.EnumerateObject().OrderBy(p => p.Name, StringComparer.Ordinal))
|
||||||
|
{
|
||||||
|
if (prop.Value.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
builder[prop.Name] = prop.Value.GetString()!;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return builder.ToImmutable();
|
||||||
|
}
|
||||||
|
|
||||||
|
private static ImmutableDictionary<string, string> ExtractMetadata(JsonElement predicatePayload)
|
||||||
|
{
|
||||||
|
var props = ImmutableDictionary.CreateBuilder<string, string>();
|
||||||
|
|
||||||
|
if (predicatePayload.TryGetProperty("vulnerabilityId", out var vulnIdEl) &&
|
||||||
|
vulnIdEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
props["vulnerabilityId"] = vulnIdEl.GetString()!;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (predicatePayload.TryGetProperty("decision", out var decisionEl))
|
||||||
|
{
|
||||||
|
if (decisionEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
props["decision"] = decisionEl.GetString()!;
|
||||||
|
}
|
||||||
|
else if (decisionEl.ValueKind == JsonValueKind.Number)
|
||||||
|
{
|
||||||
|
props["decision"] = ((VexOverrideDecision)decisionEl.GetInt32()).ToString().ToLowerInvariant();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
if (predicatePayload.TryGetProperty("operatorId", out var operatorIdEl) &&
|
||||||
|
operatorIdEl.ValueKind == JsonValueKind.String)
|
||||||
|
{
|
||||||
|
props["operatorId"] = operatorIdEl.GetString()!;
|
||||||
|
}
|
||||||
|
|
||||||
|
return props.ToImmutable();
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -4,6 +4,7 @@
 
 using System.Collections.Immutable;
 using FluentAssertions;
+using StellaOps.Spdx3.Model;
 using StellaOps.Spdx3.Model.Build;
 using Xunit;
 
@@ -95,7 +96,7 @@ public sealed class BuildAttestationMapperTests
             BuildStartTime = new DateTimeOffset(2026, 1, 7, 12, 0, 0, TimeSpan.Zero),
             BuildEndTime = new DateTimeOffset(2026, 1, 7, 12, 5, 0, TimeSpan.Zero),
             ConfigSourceUri = ImmutableArray.Create("https://github.com/stellaops/app"),
-            ConfigSourceDigest = ImmutableArray.Create(Spdx3Hash.Sha256("abc123")),
+            ConfigSourceDigest = ImmutableArray.Create(new Spdx3BuildHash { Algorithm = "sha256", HashValue = "abc123" }),
             ConfigSourceEntrypoint = ImmutableArray.Create("Dockerfile"),
             Environment = ImmutableDictionary<string, string>.Empty.Add("CI", "true"),
             Parameter = ImmutableDictionary<string, string>.Empty.Add("target", "release")

@@ -14,7 +14,7 @@ public sealed class BinaryDiffPredicateBuilderTests
     public void Build_RequiresSubject()
     {
         var options = Options.Create(new BinaryDiffOptions { ToolVersion = "1.0.0" });
-        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.FixedTimeProvider);
+        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.TestTimeProvider);
 
         builder.WithInputs(
             new BinaryDiffImageReference { Digest = "sha256:base" },
@@ -30,7 +30,7 @@ public sealed class BinaryDiffPredicateBuilderTests
     public void Build_RequiresInputs()
     {
         var options = Options.Create(new BinaryDiffOptions { ToolVersion = "1.0.0" });
-        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.FixedTimeProvider);
+        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.TestTimeProvider);
 
         builder.WithSubject("docker://example/app@sha256:base", "sha256:aaaa");
 
@@ -44,7 +44,7 @@ public sealed class BinaryDiffPredicateBuilderTests
     public void Build_SortsFindingsAndSections()
     {
         var options = Options.Create(new BinaryDiffOptions { ToolVersion = "1.0.0" });
-        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.FixedTimeProvider);
+        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.TestTimeProvider);
 
         builder.WithSubject("docker://example/app@sha256:base", "sha256:aaaa")
             .WithInputs(
@@ -106,7 +106,7 @@ public sealed class BinaryDiffPredicateBuilderTests
             AnalyzedSections = [".z", ".a"]
         });
 
-        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.FixedTimeProvider);
+        var builder = new BinaryDiffPredicateBuilder(options, BinaryDiffTestData.TestTimeProvider);
         builder.WithSubject("docker://example/app@sha256:base", "sha256:aaaa")
             .WithInputs(
                 new BinaryDiffImageReference { Digest = "sha256:base" },
@@ -116,7 +116,7 @@ public sealed class BinaryDiffPredicateBuilderTests
 
         predicate.Metadata.ToolVersion.Should().Be("2.0.0");
         predicate.Metadata.ConfigDigest.Should().Be("sha256:cfg");
-        predicate.Metadata.AnalysisTimestamp.Should().Be(BinaryDiffTestData.FixedTimeProvider.GetUtcNow());
+        predicate.Metadata.AnalysisTimestamp.Should().Be(BinaryDiffTestData.TestTimeProvider.GetUtcNow());
         predicate.Metadata.AnalyzedSections.Should().Equal(".a", ".z");
     }
 }

@@ -8,7 +8,7 @@ namespace StellaOps.Attestor.StandardPredicates.Tests.BinaryDiff;
 
 internal static class BinaryDiffTestData
 {
-    internal static readonly TimeProvider FixedTimeProvider =
+    internal static readonly TimeProvider TestTimeProvider =
         new FixedTimeProvider(new DateTimeOffset(2026, 1, 13, 12, 0, 0, TimeSpan.Zero));
 
     internal static BinaryDiffPredicate CreatePredicate()
@@ -20,7 +20,7 @@ internal static class BinaryDiffTestData
             AnalyzedSections = [".text", ".rodata", ".data"]
         });
 
-        var builder = new BinaryDiffPredicateBuilder(options, FixedTimeProvider);
+        var builder = new BinaryDiffPredicateBuilder(options, TestTimeProvider);
         builder.WithSubject("docker://example/app@sha256:base", "sha256:aaaaaaaa")
             .WithInputs(
                 new BinaryDiffImageReference

@@ -0,0 +1,225 @@
// -----------------------------------------------------------------------------
// VexOverridePredicateBuilderTests.cs
// Sprint: SPRINT_20260112_004_ATTESTOR_vex_override_predicate (ATT-VEX-002)
// Description: Tests for VEX override predicate builder
// -----------------------------------------------------------------------------

using System.Text.Json;
using StellaOps.Attestor.StandardPredicates.VexOverride;
using StellaOps.TestKit;
using Xunit;

namespace StellaOps.Attestor.StandardPredicates.Tests.VexOverride;

public sealed class VexOverridePredicateBuilderTests
{
    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Build_WithRequiredFields_CreatesPredicate()
    {
        var decisionTime = new DateTimeOffset(2026, 1, 14, 10, 0, 0, TimeSpan.Zero);

        var predicate = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithJustification("Component is not in use")
            .WithDecisionTime(decisionTime)
            .WithOperatorId("user@example.com")
            .Build();

        Assert.Equal("sha256:abc123", predicate.ArtifactDigest);
        Assert.Equal("CVE-2024-12345", predicate.VulnerabilityId);
        Assert.Equal(VexOverrideDecision.NotAffected, predicate.Decision);
        Assert.Equal("Component is not in use", predicate.Justification);
        Assert.Equal(decisionTime, predicate.DecisionTime);
        Assert.Equal("user@example.com", predicate.OperatorId);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Build_MissingArtifactDigest_Throws()
    {
        var builder = new VexOverridePredicateBuilder()
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithJustification("Test")
            .WithDecisionTime(DateTimeOffset.UtcNow)
            .WithOperatorId("user@example.com");

        Assert.Throws<InvalidOperationException>(() => builder.Build());
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Build_WithEvidenceRefs_AddsToList()
    {
        var predicate = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.Mitigated)
            .WithJustification("Compensating control")
            .WithDecisionTime(DateTimeOffset.UtcNow)
            .WithOperatorId("user@example.com")
            .AddEvidenceRef("document", "https://example.com/doc", "sha256:def456", "Design doc")
            .AddEvidenceRef(new EvidenceReference
            {
                Type = "ticket",
                Uri = "https://jira.example.com/PROJ-123"
            })
            .Build();

        Assert.Equal(2, predicate.EvidenceRefs.Length);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Build_WithTool_SetsTool()
    {
        var predicate = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.Accepted)
            .WithJustification("Accepted risk")
            .WithDecisionTime(DateTimeOffset.UtcNow)
            .WithOperatorId("user@example.com")
            .WithTool("StellaOps", "1.0.0", "StellaOps Inc")
            .Build();

        Assert.NotNull(predicate.Tool);
        Assert.Equal("StellaOps", predicate.Tool.Name);
        Assert.Equal("1.0.0", predicate.Tool.Version);
        Assert.Equal("StellaOps Inc", predicate.Tool.Vendor);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Build_WithMetadata_AddsMetadata()
    {
        var predicate = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithJustification("Test")
            .WithDecisionTime(DateTimeOffset.UtcNow)
            .WithOperatorId("user@example.com")
            .WithMetadata("tenant", "acme-corp")
            .WithMetadata("environment", "production")
            .Build();

        Assert.Equal(2, predicate.Metadata.Count);
        Assert.Equal("acme-corp", predicate.Metadata["tenant"]);
        Assert.Equal("production", predicate.Metadata["environment"]);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void BuildCanonicalJson_ProducesDeterministicOutput()
    {
        var decisionTime = new DateTimeOffset(2026, 1, 14, 10, 0, 0, TimeSpan.Zero);

        var json1 = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithJustification("Test")
            .WithDecisionTime(decisionTime)
            .WithOperatorId("user@example.com")
            .BuildCanonicalJson();

        var json2 = new VexOverridePredicateBuilder()
            .WithOperatorId("user@example.com") // Different order
            .WithDecisionTime(decisionTime)
            .WithJustification("Test")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithVulnerabilityId("CVE-2024-12345")
            .WithArtifactDigest("sha256:abc123")
            .BuildCanonicalJson();

        Assert.Equal(json1, json2);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void BuildCanonicalJson_HasSortedKeys()
    {
        var decisionTime = new DateTimeOffset(2026, 1, 14, 10, 0, 0, TimeSpan.Zero);

        var json = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithJustification("Test")
            .WithDecisionTime(decisionTime)
            .WithOperatorId("user@example.com")
            .BuildCanonicalJson();

        using var document = JsonDocument.Parse(json);
        var keys = document.RootElement.EnumerateObject().Select(p => p.Name).ToList();

        // Verify keys are alphabetically sorted
        var sortedKeys = keys.OrderBy(k => k, StringComparer.Ordinal).ToList();
        Assert.Equal(sortedKeys, keys);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void BuildJsonBytes_ReturnsUtf8Bytes()
    {
        var decisionTime = new DateTimeOffset(2026, 1, 14, 10, 0, 0, TimeSpan.Zero);

        var bytes = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithJustification("Test")
            .WithDecisionTime(decisionTime)
            .WithOperatorId("user@example.com")
            .BuildJsonBytes();

        Assert.NotEmpty(bytes);

        var json = System.Text.Encoding.UTF8.GetString(bytes);
        using var document = JsonDocument.Parse(json);
        Assert.Equal(JsonValueKind.Object, document.RootElement.ValueKind);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Build_WithExpiresAt_SetsExpiration()
    {
        var decisionTime = new DateTimeOffset(2026, 1, 14, 10, 0, 0, TimeSpan.Zero);
        var expiresAt = new DateTimeOffset(2026, 4, 14, 10, 0, 0, TimeSpan.Zero);

        var predicate = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.Accepted)
            .WithJustification("Temporary acceptance")
            .WithDecisionTime(decisionTime)
            .WithOperatorId("user@example.com")
            .WithExpiresAt(expiresAt)
            .Build();

        Assert.Equal(expiresAt, predicate.ExpiresAt);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Build_WithRuleDigestAndTraceHash_SetsValues()
    {
        var predicate = new VexOverridePredicateBuilder()
            .WithArtifactDigest("sha256:abc123")
            .WithVulnerabilityId("CVE-2024-12345")
            .WithDecision(VexOverrideDecision.NotAffected)
            .WithJustification("Test")
            .WithDecisionTime(DateTimeOffset.UtcNow)
            .WithOperatorId("user@example.com")
            .WithRuleDigest("sha256:rule123")
            .WithTraceHash("sha256:trace456")
            .Build();

        Assert.Equal("sha256:rule123", predicate.RuleDigest);
        Assert.Equal("sha256:trace456", predicate.TraceHash);
    }
}
@@ -0,0 +1,255 @@
// -----------------------------------------------------------------------------
// VexOverridePredicateParserTests.cs
// Sprint: SPRINT_20260112_004_ATTESTOR_vex_override_predicate (ATT-VEX-002)
// Description: Tests for VEX override predicate parsing
// -----------------------------------------------------------------------------

using System.Text.Json;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Attestor.StandardPredicates.VexOverride;
using StellaOps.TestKit;
using Xunit;

namespace StellaOps.Attestor.StandardPredicates.Tests.VexOverride;

public sealed class VexOverridePredicateParserTests
{
    private readonly VexOverridePredicateParser _parser;

    public VexOverridePredicateParserTests()
    {
        _parser = new VexOverridePredicateParser(NullLogger<VexOverridePredicateParser>.Instance);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void PredicateType_ReturnsCorrectUri()
    {
        Assert.Equal(VexOverridePredicateTypes.PredicateTypeUri, _parser.PredicateType);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Parse_ValidPredicate_ReturnsValid()
    {
        var json = """
            {
                "artifactDigest": "sha256:abc123",
                "vulnerabilityId": "CVE-2024-12345",
                "decision": "not_affected",
                "justification": "Component is not in use",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com"
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.True(result.IsValid);
        Assert.Empty(result.Errors);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Parse_MissingArtifactDigest_ReturnsError()
    {
        var json = """
            {
                "vulnerabilityId": "CVE-2024-12345",
                "decision": "not_affected",
                "justification": "Component is not in use",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com"
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.False(result.IsValid);
        Assert.Contains(result.Errors, e => e.Code == "VEX_MISSING_ARTIFACT_DIGEST");
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Parse_MissingVulnerabilityId_ReturnsError()
    {
        var json = """
            {
                "artifactDigest": "sha256:abc123",
                "decision": "not_affected",
                "justification": "Component is not in use",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com"
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.False(result.IsValid);
        Assert.Contains(result.Errors, e => e.Code == "VEX_MISSING_VULN_ID");
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Parse_InvalidDecision_ReturnsError()
    {
        var json = """
            {
                "artifactDigest": "sha256:abc123",
                "vulnerabilityId": "CVE-2024-12345",
                "decision": "invalid_decision",
                "justification": "Component is not in use",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com"
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.False(result.IsValid);
        Assert.Contains(result.Errors, e => e.Code == "VEX_INVALID_DECISION");
    }

    [Trait("Category", TestCategories.Unit)]
    [Theory]
    [InlineData("not_affected", VexOverrideDecision.NotAffected)]
    [InlineData("mitigated", VexOverrideDecision.Mitigated)]
    [InlineData("accepted", VexOverrideDecision.Accepted)]
    [InlineData("under_investigation", VexOverrideDecision.UnderInvestigation)]
    public void Parse_AllDecisionValues_Accepted(string decisionValue, VexOverrideDecision expected)
    {
        var json = $$"""
            {
                "artifactDigest": "sha256:abc123",
                "vulnerabilityId": "CVE-2024-12345",
                "decision": "{{decisionValue}}",
                "justification": "Test",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com"
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.True(result.IsValid);

        var predicate = _parser.ParsePredicate(document.RootElement);
        Assert.NotNull(predicate);
        Assert.Equal(expected, predicate.Decision);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Parse_NumericDecision_Accepted()
    {
        var json = """
            {
                "artifactDigest": "sha256:abc123",
                "vulnerabilityId": "CVE-2024-12345",
                "decision": 1,
                "justification": "Test",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com"
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.True(result.IsValid);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Parse_WithEvidenceRefs_ParsesCorrectly()
    {
        var json = """
            {
                "artifactDigest": "sha256:abc123",
                "vulnerabilityId": "CVE-2024-12345",
                "decision": "not_affected",
                "justification": "Test",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com",
                "evidenceRefs": [
                    {
                        "type": "document",
                        "uri": "https://example.com/doc",
                        "digest": "sha256:def456",
                        "description": "Design document"
                    }
                ]
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.True(result.IsValid);

        var predicate = _parser.ParsePredicate(document.RootElement);
        Assert.NotNull(predicate);
        Assert.Single(predicate.EvidenceRefs);
        Assert.Equal("document", predicate.EvidenceRefs[0].Type);
        Assert.Equal("https://example.com/doc", predicate.EvidenceRefs[0].Uri);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void Parse_WithTool_ParsesCorrectly()
    {
        var json = """
            {
                "artifactDigest": "sha256:abc123",
                "vulnerabilityId": "CVE-2024-12345",
                "decision": "mitigated",
                "justification": "Compensating control applied",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com",
                "tool": {
                    "name": "StellaOps",
                    "version": "1.0.0",
                    "vendor": "StellaOps Inc"
                }
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.Parse(document.RootElement);

        Assert.True(result.IsValid);

        var predicate = _parser.ParsePredicate(document.RootElement);
        Assert.NotNull(predicate);
        Assert.NotNull(predicate.Tool);
        Assert.Equal("StellaOps", predicate.Tool.Name);
        Assert.Equal("1.0.0", predicate.Tool.Version);
        Assert.Equal("StellaOps Inc", predicate.Tool.Vendor);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public void ExtractSbom_ReturnsNull()
    {
        var json = """
            {
                "artifactDigest": "sha256:abc123",
                "vulnerabilityId": "CVE-2024-12345",
                "decision": "not_affected",
                "justification": "Test",
                "decisionTime": "2026-01-14T10:00:00Z",
                "operatorId": "user@example.com"
            }
            """;

        using var document = JsonDocument.Parse(json);
        var result = _parser.ExtractSbom(document.RootElement);

        Assert.Null(result);
    }
}
@@ -0,0 +1,322 @@
// Copyright (c) StellaOps. All rights reserved.
// Licensed under AGPL-3.0-or-later. See LICENSE in the project root.
// Sprint: SPRINT_20260112_004_BINIDX_b2r2_lowuir_perf_cache (BINIDX-OPS-04)
// Task: Add ops endpoints for health, bench, cache, and config

using System.Collections.Immutable;
using System.Diagnostics;
using System.Globalization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Options;
using StellaOps.BinaryIndex.Cache;
using StellaOps.BinaryIndex.Disassembly.B2R2;

namespace StellaOps.BinaryIndex.WebService.Controllers;

/// <summary>
/// Ops endpoints for BinaryIndex health, benchmarking, cache stats, and configuration.
/// </summary>
[ApiController]
[Route("api/v1/ops/binaryindex")]
[Produces("application/json")]
public sealed class BinaryIndexOpsController : ControllerBase
{
    private readonly B2R2LifterPool? _lifterPool;
    private readonly FunctionIrCacheService? _cacheService;
    private readonly IOptions<B2R2LifterPoolOptions> _poolOptions;
    private readonly IOptions<FunctionIrCacheOptions> _cacheOptions;
    private readonly TimeProvider _timeProvider;
    private readonly ILogger<BinaryIndexOpsController> _logger;

    public BinaryIndexOpsController(
        ILogger<BinaryIndexOpsController> logger,
        TimeProvider timeProvider,
        IOptions<B2R2LifterPoolOptions> poolOptions,
        IOptions<FunctionIrCacheOptions> cacheOptions,
        B2R2LifterPool? lifterPool = null,
        FunctionIrCacheService? cacheService = null)
    {
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
        _poolOptions = poolOptions ?? throw new ArgumentNullException(nameof(poolOptions));
        _cacheOptions = cacheOptions ?? throw new ArgumentNullException(nameof(cacheOptions));
        _lifterPool = lifterPool;
        _cacheService = cacheService;
    }

    /// <summary>
    /// Gets BinaryIndex health status including lifter warmness and cache availability.
    /// </summary>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>Health response with component status.</returns>
    [HttpGet("health")]
    [ProducesResponseType<BinaryIndexOpsHealthResponse>(StatusCodes.Status200OK)]
    [ProducesResponseType<ProblemDetails>(StatusCodes.Status503ServiceUnavailable)]
    public ActionResult<BinaryIndexOpsHealthResponse> GetHealth(CancellationToken ct)
    {
        var lifterStatus = "unavailable";
        var lifterWarm = false;
        var lifterPoolStats = ImmutableDictionary<string, int>.Empty;

        if (_lifterPool != null)
        {
            var stats = _lifterPool.GetStats();
            lifterStatus = stats.IsWarm ? "warm" : "cold";
            lifterWarm = stats.IsWarm;
            lifterPoolStats = stats.IsaStats
                .ToImmutableDictionary(
                    kv => kv.Key,
                    kv => kv.Value.PooledCount + kv.Value.ActiveCount);
        }

        var cacheStatus = "unavailable";
        var cacheEnabled = false;
        if (_cacheService != null)
        {
            var cacheStats = _cacheService.GetStats();
            cacheStatus = cacheStats.IsEnabled ? "enabled" : "disabled";
            cacheEnabled = cacheStats.IsEnabled;
        }

        var response = new BinaryIndexOpsHealthResponse(
            Status: lifterWarm && cacheEnabled ? "healthy" : "degraded",
            Timestamp: _timeProvider.GetUtcNow().ToString("o", CultureInfo.InvariantCulture),
            LifterStatus: lifterStatus,
            LifterWarm: lifterWarm,
            LifterPoolStats: lifterPoolStats,
            CacheStatus: cacheStatus,
            CacheEnabled: cacheEnabled);

        return Ok(response);
    }

    /// <summary>
    /// Runs a quick benchmark and returns latency metrics.
    /// </summary>
    /// <param name="request">Optional bench parameters.</param>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>Benchmark response with latency measurements.</returns>
    [HttpPost("bench/run")]
    [ProducesResponseType<BinaryIndexBenchResponse>(StatusCodes.Status200OK)]
    [ProducesResponseType<ProblemDetails>(StatusCodes.Status400BadRequest)]
    public ActionResult<BinaryIndexBenchResponse> RunBench(
        [FromBody] BinaryIndexBenchRequest? request,
        CancellationToken ct)
    {
        var iterations = request?.Iterations ?? 10;
        if (iterations < 1 || iterations > 1000)
        {
            return BadRequest(new ProblemDetails
            {
                Title = "Invalid iterations",
                Detail = "Iterations must be between 1 and 1000",
                Status = StatusCodes.Status400BadRequest
            });
        }

        _logger.LogInformation("Running BinaryIndex benchmark with {Iterations} iterations", iterations);

        var lifterLatencies = new List<double>();
        var cacheLatencies = new List<double>();

        // Benchmark lifter acquisition if available
        if (_lifterPool != null)
        {
            var isa = new B2R2.ISA(B2R2.Architecture.Intel, B2R2.WordSize.Bit64);
            for (var i = 0; i < iterations; i++)
            {
                ct.ThrowIfCancellationRequested();
                var sw = Stopwatch.StartNew();
                using (var lifter = _lifterPool.Acquire(isa))
                {
                    // Just acquire and release
                }
                sw.Stop();
                lifterLatencies.Add(sw.Elapsed.TotalMilliseconds);
            }
        }

        // Benchmark cache lookup if available
        if (_cacheService != null)
        {
            var dummyKey = new FunctionCacheKey(
                Isa: "intel-64",
                B2R2Version: "0.9.1",
                NormalizationRecipe: "v1",
                CanonicalIrHash: "0000000000000000000000000000000000000000000000000000000000000000");

            for (var i = 0; i < iterations; i++)
            {
                ct.ThrowIfCancellationRequested();
                var sw = Stopwatch.StartNew();
                // Fire and forget the cache lookup
                _ = _cacheService.TryGetAsync(dummyKey, ct).ConfigureAwait(false);
                sw.Stop();
                cacheLatencies.Add(sw.Elapsed.TotalMilliseconds);
            }
        }

        var lifterStats = ComputeLatencyStats(lifterLatencies);
        var cacheStats = ComputeLatencyStats(cacheLatencies);

        var response = new BinaryIndexBenchResponse(
            Timestamp: _timeProvider.GetUtcNow().ToString("o", CultureInfo.InvariantCulture),
            Iterations: iterations,
            LifterAcquireLatencyMs: lifterStats,
            CacheLookupLatencyMs: cacheStats);

        return Ok(response);
    }

    /// <summary>
    /// Gets function IR cache statistics.
    /// </summary>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>Cache statistics.</returns>
    [HttpGet("cache")]
    [ProducesResponseType<BinaryIndexFunctionCacheStats>(StatusCodes.Status200OK)]
    public ActionResult<BinaryIndexFunctionCacheStats> GetCacheStats(CancellationToken ct)
    {
        if (_cacheService == null)
        {
            return Ok(new BinaryIndexFunctionCacheStats(
                Enabled: false,
                Hits: 0,
                Misses: 0,
                Evictions: 0,
                HitRate: 0.0,
                KeyPrefix: "",
                CacheTtlSeconds: 0));
        }

        var stats = _cacheService.GetStats();

        return Ok(new BinaryIndexFunctionCacheStats(
            Enabled: stats.IsEnabled,
            Hits: stats.Hits,
            Misses: stats.Misses,
            Evictions: stats.Evictions,
            HitRate: stats.HitRate,
            KeyPrefix: stats.KeyPrefix,
            CacheTtlSeconds: (long)stats.CacheTtl.TotalSeconds));
    }

    /// <summary>
    /// Gets effective BinaryIndex configuration.
    /// </summary>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>Effective configuration (secrets redacted).</returns>
    [HttpGet("config")]
    [ProducesResponseType<BinaryIndexEffectiveConfig>(StatusCodes.Status200OK)]
    public ActionResult<BinaryIndexEffectiveConfig> GetConfig(CancellationToken ct)
    {
        var poolOptions = _poolOptions.Value;
        var cacheOptions = _cacheOptions.Value;

        return Ok(new BinaryIndexEffectiveConfig(
            LifterPoolMaxSizePerIsa: poolOptions.MaxPoolSizePerIsa,
            LifterPoolWarmPreloadEnabled: poolOptions.EnableWarmPreload,
            LifterPoolWarmPreloadIsas: poolOptions.WarmPreloadIsas,
            LifterPoolAcquireTimeoutSeconds: (long)poolOptions.AcquireTimeout.TotalSeconds,
            CacheEnabled: cacheOptions.Enabled,
            CacheKeyPrefix: cacheOptions.KeyPrefix,
            CacheTtlSeconds: (long)cacheOptions.CacheTtl.TotalSeconds,
            CacheMaxTtlSeconds: (long)cacheOptions.MaxTtl.TotalSeconds,
            B2R2Version: cacheOptions.B2R2Version,
            NormalizationRecipeVersion: cacheOptions.NormalizationRecipeVersion));
    }

    private static BinaryIndexLatencyStats ComputeLatencyStats(List<double> latencies)
    {
        if (latencies.Count == 0)
        {
            return new BinaryIndexLatencyStats(
                Min: 0,
                Max: 0,
                Mean: 0,
                P50: 0,
                P95: 0,
                P99: 0);
        }

        latencies.Sort();
        var count = latencies.Count;

        return new BinaryIndexLatencyStats(
            Min: latencies[0],
            Max: latencies[^1],
            Mean: latencies.Average(),
            P50: latencies[count / 2],
            P95: latencies[(int)(count * 0.95)],
            P99: latencies[(int)(count * 0.99)]);
    }
}

#region Response Models

/// <summary>
/// BinaryIndex health response.
/// </summary>
public sealed record BinaryIndexOpsHealthResponse(
    string Status,
    string Timestamp,
    string LifterStatus,
    bool LifterWarm,
    ImmutableDictionary<string, int> LifterPoolStats,
    string CacheStatus,
    bool CacheEnabled);

/// <summary>
/// Benchmark request parameters.
/// </summary>
public sealed record BinaryIndexBenchRequest(
    int Iterations = 10);

/// <summary>
/// Benchmark response with latency measurements.
/// </summary>
public sealed record BinaryIndexBenchResponse(
    string Timestamp,
    int Iterations,
    BinaryIndexLatencyStats LifterAcquireLatencyMs,
    BinaryIndexLatencyStats CacheLookupLatencyMs);

/// <summary>
/// Latency statistics.
/// </summary>
public sealed record BinaryIndexLatencyStats(
    double Min,
    double Max,
    double Mean,
    double P50,
    double P95,
    double P99);

/// <summary>
/// Function IR cache statistics.
/// </summary>
public sealed record BinaryIndexFunctionCacheStats(
    bool Enabled,
    long Hits,
    long Misses,
    long Evictions,
    double HitRate,
    string KeyPrefix,
    long CacheTtlSeconds);

/// <summary>
/// Effective BinaryIndex configuration.
/// </summary>
public sealed record BinaryIndexEffectiveConfig(
    int LifterPoolMaxSizePerIsa,
    bool LifterPoolWarmPreloadEnabled,
    ImmutableArray<string> LifterPoolWarmPreloadIsas,
    long LifterPoolAcquireTimeoutSeconds,
    bool CacheEnabled,
    string CacheKeyPrefix,
    long CacheTtlSeconds,
    long CacheMaxTtlSeconds,
    string B2R2Version,
    string NormalizationRecipeVersion);

#endregion
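For orientation, the controller above exposes four routes under `api/v1/ops/binaryindex`. The sketch below shows how a client might probe them; the base address, `HttpClient` wiring, and the bench payload shape are illustrative assumptions, not part of the diff.

```csharp
// Illustrative only: exercises the ops endpoints added above.
// Assumes the WebService is reachable at http://localhost:8080 (hypothetical host/port).
using System.Net.Http.Json;

using var http = new HttpClient { BaseAddress = new Uri("http://localhost:8080") };

// GET api/v1/ops/binaryindex/health -> BinaryIndexOpsHealthResponse
var health = await http.GetStringAsync("api/v1/ops/binaryindex/health");

// POST api/v1/ops/binaryindex/bench/run -> BinaryIndexBenchResponse
var bench = await http.PostAsJsonAsync("api/v1/ops/binaryindex/bench/run", new { iterations = 50 });

// GET api/v1/ops/binaryindex/cache and .../config return cache stats and effective config.
Console.WriteLine(health);
Console.WriteLine(await bench.Content.ReadAsStringAsync());
```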
@@ -22,6 +22,7 @@
     <ProjectReference Include="../__Libraries/StellaOps.BinaryIndex.Persistence/StellaOps.BinaryIndex.Persistence.csproj" />
     <ProjectReference Include="../__Libraries/StellaOps.BinaryIndex.VexBridge/StellaOps.BinaryIndex.VexBridge.csproj" />
     <ProjectReference Include="../__Libraries/StellaOps.BinaryIndex.GoldenSet/StellaOps.BinaryIndex.GoldenSet.csproj" />
+    <ProjectReference Include="../__Libraries/StellaOps.BinaryIndex.Disassembly.B2R2/StellaOps.BinaryIndex.Disassembly.B2R2.csproj" />
   </ItemGroup>

 </Project>
@@ -2,6 +2,8 @@
 // BinaryCacheServiceExtensions.cs
 // Sprint: SPRINT_20251226_014_BINIDX
 // Task: SCANINT-21 - Add Valkey cache layer for hot lookups
+// Sprint: SPRINT_20260112_004_BINIDX (BINIDX-CACHE-03)
+// Task: Function-level cache for canonical IR and semantic fingerprints
 // -----------------------------------------------------------------------------

 using Microsoft.Extensions.Configuration;
@@ -56,4 +58,49 @@ public static class BinaryCacheServiceExtensions

         return services;
     }
+
+    /// <summary>
+    /// Adds function IR caching layer to the service collection.
+    /// Uses Valkey as hot cache for semantic fingerprints.
+    /// </summary>
+    /// <param name="services">The service collection.</param>
+    /// <param name="configuration">Configuration for cache options.</param>
+    /// <returns>The service collection for chaining.</returns>
+    public static IServiceCollection AddFunctionIrCaching(
+        this IServiceCollection services,
+        IConfiguration configuration)
+    {
+        ArgumentNullException.ThrowIfNull(services);
+        ArgumentNullException.ThrowIfNull(configuration);
+
+        services.AddOptions<FunctionIrCacheOptions>()
+            .Bind(configuration.GetSection(FunctionIrCacheOptions.SectionName))
+            .ValidateOnStart();
+
+        services.TryAddSingleton<FunctionIrCacheService>();
+
+        return services;
+    }
+
+    /// <summary>
+    /// Adds function IR caching layer with explicit options.
+    /// </summary>
+    /// <param name="services">The service collection.</param>
+    /// <param name="configureOptions">Action to configure options.</param>
+    /// <returns>The service collection for chaining.</returns>
+    public static IServiceCollection AddFunctionIrCaching(
+        this IServiceCollection services,
+        Action<FunctionIrCacheOptions> configureOptions)
+    {
+        ArgumentNullException.ThrowIfNull(services);
+        ArgumentNullException.ThrowIfNull(configureOptions);
+
+        services.AddOptions<FunctionIrCacheOptions>()
+            .Configure(configureOptions)
+            .ValidateOnStart();
+
+        services.TryAddSingleton<FunctionIrCacheService>();
+
+        return services;
+    }
 }
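A minimal wiring sketch for the new extension method is shown below. The `WebApplication` builder is an assumption about the host; the configuration section name comes from `FunctionIrCacheOptions.SectionName` in the file that follows.

```csharp
// Illustrative host wiring only (the real WebService composition root may differ).
// Assumes an ASP.NET Core host project with implicit usings enabled.
var builder = WebApplication.CreateBuilder(args);

// Binds the "StellaOps:BinaryIndex:FunctionIrCache" section
// (FunctionIrCacheOptions.SectionName) and registers FunctionIrCacheService.
builder.Services.AddFunctionIrCaching(builder.Configuration);

var app = builder.Build();
app.Run();
```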
@@ -0,0 +1,316 @@
// Copyright (c) StellaOps. All rights reserved.
// Licensed under AGPL-3.0-or-later. See LICENSE in the project root.
// Sprint: SPRINT_20260112_004_BINIDX_b2r2_lowuir_perf_cache (BINIDX-CACHE-03)
// Task: Function-level cache for canonical IR and semantic fingerprints

using System.Collections.Concurrent;
using System.Collections.Immutable;
using System.Globalization;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;

namespace StellaOps.BinaryIndex.Cache;

/// <summary>
/// Configuration options for the function IR cache.
/// </summary>
public sealed class FunctionIrCacheOptions
{
    /// <summary>
    /// Configuration section name.
    /// </summary>
    public const string SectionName = "StellaOps:BinaryIndex:FunctionIrCache";

    /// <summary>
    /// Valkey key prefix for function IR cache entries.
    /// </summary>
    public string KeyPrefix { get; init; } = "stellaops:binidx:funccache:";

    /// <summary>
    /// TTL for cached function IR entries.
    /// </summary>
    public TimeSpan CacheTtl { get; init; } = TimeSpan.FromHours(4);

    /// <summary>
    /// Maximum TTL for any cache entry.
    /// </summary>
    public TimeSpan MaxTtl { get; init; } = TimeSpan.FromHours(24);

    /// <summary>
    /// Whether to enable the cache.
    /// </summary>
    public bool Enabled { get; init; } = true;

    /// <summary>
    /// B2R2 version string to include in cache keys.
    /// </summary>
    public string B2R2Version { get; init; } = "0.9.1";

    /// <summary>
    /// Normalization recipe version for cache key stability.
    /// </summary>
    public string NormalizationRecipeVersion { get; init; } = "v1";
}

/// <summary>
/// Cache key components for function IR caching.
/// </summary>
/// <param name="Isa">ISA identifier (e.g., "intel-64").</param>
/// <param name="B2R2Version">B2R2 version string.</param>
/// <param name="NormalizationRecipe">Normalization recipe version.</param>
/// <param name="CanonicalIrHash">SHA-256 hash of the canonical IR bytes.</param>
public sealed record FunctionCacheKey(
    string Isa,
    string B2R2Version,
    string NormalizationRecipe,
    string CanonicalIrHash)
{
    /// <summary>
    /// Converts to a deterministic cache key string.
    /// </summary>
    public string ToKeyString() =>
        string.Format(
            CultureInfo.InvariantCulture,
            "{0}:{1}:{2}:{3}",
            Isa,
            B2R2Version,
            NormalizationRecipe,
            CanonicalIrHash);
}

/// <summary>
/// Cached function IR and semantic fingerprint entry.
/// </summary>
/// <param name="FunctionAddress">Original function address.</param>
/// <param name="FunctionName">Original function name.</param>
/// <param name="SemanticFingerprint">Computed semantic fingerprint.</param>
/// <param name="IrStatementCount">Number of IR statements.</param>
/// <param name="BasicBlockCount">Number of basic blocks.</param>
/// <param name="ComputedAtUtc">When the fingerprint was computed (ISO-8601).</param>
/// <param name="B2R2Version">B2R2 version used.</param>
/// <param name="NormalizationRecipe">Normalization recipe used.</param>
public sealed record CachedFunctionFingerprint(
    ulong FunctionAddress,
    string FunctionName,
    string SemanticFingerprint,
    int IrStatementCount,
    int BasicBlockCount,
    string ComputedAtUtc,
    string B2R2Version,
    string NormalizationRecipe);

/// <summary>
/// Cache statistics for the function IR cache.
/// </summary>
public sealed record FunctionIrCacheStats(
    long Hits,
    long Misses,
    long Evictions,
    double HitRate,
    bool IsEnabled,
    string KeyPrefix,
    TimeSpan CacheTtl);

/// <summary>
/// Service for caching function IR and semantic fingerprints.
/// Uses Valkey as hot cache with deterministic key generation.
/// </summary>
public sealed class FunctionIrCacheService
{
    private readonly IDistributedCache _cache;
    private readonly ILogger<FunctionIrCacheService> _logger;
    private readonly FunctionIrCacheOptions _options;
    private readonly TimeProvider _timeProvider;

    // Thread-safe statistics
    private long _hits;
    private long _misses;
    private long _evictions;

    private static readonly JsonSerializerOptions s_jsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        WriteIndented = false
    };

    /// <summary>
    /// Creates a new function IR cache service.
    /// </summary>
    public FunctionIrCacheService(
        IDistributedCache cache,
        ILogger<FunctionIrCacheService> logger,
        IOptions<FunctionIrCacheOptions> options,
        TimeProvider timeProvider)
    {
        _cache = cache ?? throw new ArgumentNullException(nameof(cache));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _options = options?.Value ?? new FunctionIrCacheOptions();
        _timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
    }

    /// <summary>
    /// Gets the current cache statistics.
    /// </summary>
    public FunctionIrCacheStats GetStats()
    {
        var hits = Interlocked.Read(ref _hits);
        var misses = Interlocked.Read(ref _misses);
        var total = hits + misses;
        var hitRate = total > 0 ? (double)hits / total : 0.0;

        return new FunctionIrCacheStats(
            Hits: hits,
            Misses: misses,
            Evictions: Interlocked.Read(ref _evictions),
            HitRate: hitRate,
            IsEnabled: _options.Enabled,
            KeyPrefix: _options.KeyPrefix,
            CacheTtl: _options.CacheTtl);
    }

    /// <summary>
    /// Tries to get a cached function fingerprint.
    /// </summary>
    /// <param name="key">The cache key.</param>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>The cached fingerprint if found, null otherwise.</returns>
    public async Task<CachedFunctionFingerprint?> TryGetAsync(
        FunctionCacheKey key,
        CancellationToken ct = default)
    {
        if (!_options.Enabled)
        {
            return null;
        }

        var cacheKey = BuildCacheKey(key);

        try
        {
            var bytes = await _cache.GetAsync(cacheKey, ct).ConfigureAwait(false);

            if (bytes is null || bytes.Length == 0)
            {
                Interlocked.Increment(ref _misses);
                return null;
            }

            var result = JsonSerializer.Deserialize<CachedFunctionFingerprint>(bytes, s_jsonOptions);
            Interlocked.Increment(ref _hits);

            _logger.LogTrace(
                "Cache hit for function {FunctionName} at {Address}",
                result?.FunctionName,
                result?.FunctionAddress);

            return result;
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "Failed to get cached function fingerprint for key {Key}", cacheKey);
            Interlocked.Increment(ref _misses);
            return null;
        }
    }

    /// <summary>
    /// Sets a function fingerprint in the cache.
    /// </summary>
    /// <param name="key">The cache key.</param>
    /// <param name="fingerprint">The fingerprint to cache.</param>
    /// <param name="ct">Cancellation token.</param>
    public async Task SetAsync(
        FunctionCacheKey key,
        CachedFunctionFingerprint fingerprint,
        CancellationToken ct = default)
    {
        if (!_options.Enabled)
        {
            return;
        }

        var cacheKey = BuildCacheKey(key);

        try
        {
            var bytes = JsonSerializer.SerializeToUtf8Bytes(fingerprint, s_jsonOptions);
            var options = new DistributedCacheEntryOptions
            {
                AbsoluteExpirationRelativeToNow = _options.CacheTtl
            };

            await _cache.SetAsync(cacheKey, bytes, options, ct).ConfigureAwait(false);

            _logger.LogTrace(
                "Cached function {FunctionName} fingerprint with key {Key}",
                fingerprint.FunctionName,
                cacheKey);
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "Failed to cache function fingerprint for key {Key}", cacheKey);
        }
    }

    /// <summary>
    /// Removes a cached function fingerprint.
    /// </summary>
    /// <param name="key">The cache key.</param>
    /// <param name="ct">Cancellation token.</param>
    public async Task RemoveAsync(FunctionCacheKey key, CancellationToken ct = default)
    {
        if (!_options.Enabled)
        {
            return;
        }

        var cacheKey = BuildCacheKey(key);

        try
        {
            await _cache.RemoveAsync(cacheKey, ct).ConfigureAwait(false);
            Interlocked.Increment(ref _evictions);

            _logger.LogTrace("Removed cached function fingerprint for key {Key}", cacheKey);
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "Failed to remove cached function fingerprint for key {Key}", cacheKey);
        }
    }

    /// <summary>
    /// Computes a canonical IR hash from function bytes.
    /// </summary>
    /// <param name="irBytes">The canonical IR bytes.</param>
    /// <returns>Hex-encoded SHA-256 hash.</returns>
    public static string ComputeCanonicalIrHash(ReadOnlySpan<byte> irBytes)
    {
        Span<byte> hashBytes = stackalloc byte[32];
        SHA256.HashData(irBytes, hashBytes);
        return Convert.ToHexString(hashBytes).ToLowerInvariant();
    }

    /// <summary>
    /// Creates a cache key for a function.
    /// </summary>
    /// <param name="isa">ISA identifier.</param>
    /// <param name="canonicalIrBytes">The canonical IR bytes.</param>
    /// <returns>The cache key.</returns>
    public FunctionCacheKey CreateKey(string isa, ReadOnlySpan<byte> canonicalIrBytes)
    {
        var hash = ComputeCanonicalIrHash(canonicalIrBytes);
        return new FunctionCacheKey(
            Isa: isa,
            B2R2Version: _options.B2R2Version,
            NormalizationRecipe: _options.NormalizationRecipeVersion,
            CanonicalIrHash: hash);
    }

    private string BuildCacheKey(FunctionCacheKey key) =>
        _options.KeyPrefix + key.ToKeyString();
}
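A consumer-side sketch of the cache round trip follows. The `cacheService` instance, the IR bytes, and every fingerprint value are placeholders; only the API shape (`CreateKey`, `TryGetAsync`, `SetAsync`, `CachedFunctionFingerprint`) comes from the file above.

```csharp
// Illustrative consumer code; inputs and fingerprint values are placeholders.
async Task<CachedFunctionFingerprint> GetOrComputeAsync(
    FunctionIrCacheService cacheService,
    byte[] canonicalIrBytes,
    CancellationToken ct)
{
    // Key = ISA + B2R2 version + normalization recipe + SHA-256 of the canonical IR.
    var key = cacheService.CreateKey("intel-64", canonicalIrBytes);

    var cached = await cacheService.TryGetAsync(key, ct);
    if (cached is not null)
    {
        return cached; // hot path: Valkey hit
    }

    // Compute the fingerprint elsewhere, then store it for subsequent lookups.
    var fingerprint = new CachedFunctionFingerprint(
        FunctionAddress: 0x401000,
        FunctionName: "example_function",
        SemanticFingerprint: "sha256:placeholder",
        IrStatementCount: 42,
        BasicBlockCount: 7,
        ComputedAtUtc: DateTimeOffset.UtcNow.ToString("o"),
        B2R2Version: "0.9.1",
        NormalizationRecipe: "v1");

    await cacheService.SetAsync(key, fingerprint, ct);
    return fingerprint;
}
```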
@@ -13,6 +13,7 @@
   </PropertyGroup>

   <ItemGroup>
+    <PackageReference Include="Microsoft.Extensions.Caching.Abstractions" />
     <PackageReference Include="StackExchange.Redis" />
     <PackageReference Include="Microsoft.Extensions.Configuration.Abstractions" />
     <PackageReference Include="Microsoft.Extensions.Configuration.Binder" />
@@ -369,6 +369,7 @@ public sealed class B2R2DisassemblyPlugin : IDisassemblyPlugin
             : ImmutableArray<byte>.Empty;

         var kind = ClassifyInstruction(instr, mnemonic);
+        var operands = ParseOperands(operandsText, mnemonic);

         return new DisassembledInstruction(
             Address: address,
@@ -376,7 +377,266 @@ public sealed class B2R2DisassemblyPlugin : IDisassemblyPlugin
             Mnemonic: mnemonic,
             OperandsText: operandsText,
             Kind: kind,
-            Operands: ImmutableArray<Operand>.Empty); // Simplified - operand parsing is complex
+            Operands: operands);
+    }
+
+    private static ImmutableArray<Operand> ParseOperands(string operandsText, string mnemonic)
+    {
+        if (string.IsNullOrWhiteSpace(operandsText))
+        {
+            return ImmutableArray<Operand>.Empty;
+        }
+
+        var builder = ImmutableArray.CreateBuilder<Operand>();
+
+        // Split operands by comma, respecting brackets
+        var operandStrings = SplitOperands(operandsText);
+
+        foreach (var opStr in operandStrings)
+        {
+            var trimmed = opStr.Trim();
+            if (string.IsNullOrEmpty(trimmed)) continue;
+
+            var operand = ParseSingleOperand(trimmed);
+            builder.Add(operand);
+        }
+
+        return builder.ToImmutable();
+    }
+
+    private static IReadOnlyList<string> SplitOperands(string operandsText)
+    {
+        var result = new List<string>();
+        var current = new System.Text.StringBuilder();
+        var bracketDepth = 0;
+
+        foreach (var c in operandsText)
+        {
+            if (c == '[' || c == '(' || c == '{')
+            {
+                bracketDepth++;
+                current.Append(c);
+            }
+            else if (c == ']' || c == ')' || c == '}')
+            {
+                bracketDepth--;
+                current.Append(c);
+            }
+            else if (c == ',' && bracketDepth == 0)
+            {
+                if (current.Length > 0)
+                {
+                    result.Add(current.ToString());
+                    current.Clear();
+                }
+            }
+            else
+            {
+                current.Append(c);
+            }
+        }
+
+        if (current.Length > 0)
+        {
+            result.Add(current.ToString());
+        }
+
+        return result;
+    }
+
+    private static Operand ParseSingleOperand(string text)
+    {
+        var trimmed = text.Trim();
+
+        // Check for memory operand [...]
+        if (trimmed.StartsWith('[') && trimmed.EndsWith(']'))
+        {
+            return ParseMemoryOperand(trimmed);
+        }
+
+        // Check for ARM64 memory operand [...]!
+        if (trimmed.StartsWith('[') && (trimmed.EndsWith("]!") || trimmed.Contains("],")))
+        {
+            return ParseMemoryOperand(trimmed);
+        }
+
+        // Check for immediate value
+        if (trimmed.StartsWith('#') || trimmed.StartsWith("0x", StringComparison.OrdinalIgnoreCase) ||
+            trimmed.StartsWith("0X", StringComparison.OrdinalIgnoreCase) ||
+            (trimmed.Length > 0 && (char.IsDigit(trimmed[0]) || trimmed[0] == '-')))
+        {
+            return ParseImmediateOperand(trimmed);
+        }
+
+        // Assume it's a register
+        return ParseRegisterOperand(trimmed);
+    }
+
+    private static Operand ParseRegisterOperand(string text)
+    {
+        var regName = text.ToUpperInvariant();
+
+        return new Operand(
+            Type: OperandType.Register,
+            Text: text,
+            Value: null,
+            Register: regName,
+            MemoryBase: null,
+            MemoryIndex: null,
+            MemoryScale: null,
+            MemoryDisplacement: null);
+    }
+
+    private static Operand ParseImmediateOperand(string text)
+    {
+        var cleanText = text.TrimStart('#');
+        long? value = null;
+
+        if (cleanText.StartsWith("0x", StringComparison.OrdinalIgnoreCase))
+        {
+            if (long.TryParse(cleanText.AsSpan(2), System.Globalization.NumberStyles.HexNumber,
+                System.Globalization.CultureInfo.InvariantCulture, out var hexVal))
+            {
+                value = hexVal;
+            }
+        }
+        else if (cleanText.StartsWith("-0x", StringComparison.OrdinalIgnoreCase))
+        {
+            if (long.TryParse(cleanText.AsSpan(3), System.Globalization.NumberStyles.HexNumber,
+                System.Globalization.CultureInfo.InvariantCulture, out var hexVal))
+            {
+                value = -hexVal;
+            }
+        }
+        else if (long.TryParse(cleanText, System.Globalization.CultureInfo.InvariantCulture, out var decVal))
+        {
+            value = decVal;
+        }
+
+        return new Operand(
+            Type: OperandType.Immediate,
+            Text: text,
+            Value: value,
+            Register: null,
+            MemoryBase: null,
+            MemoryIndex: null,
+            MemoryScale: null,
+            MemoryDisplacement: null);
+    }
+
+    private static Operand ParseMemoryOperand(string text)
+    {
+        // Extract content between brackets
+        var start = text.IndexOf('[');
+        var end = text.LastIndexOf(']');
+
+        if (start < 0 || end <= start)
+        {
+            return new Operand(
+                Type: OperandType.Memory,
+                Text: text,
+                Value: null,
+                Register: null,
+                MemoryBase: null,
+                MemoryIndex: null,
+                MemoryScale: null,
+                MemoryDisplacement: null);
+        }
+
+        var inner = text.Substring(start + 1, end - start - 1);
+
+        // Parse components: base, index, scale, displacement
+        // Common patterns:
+        // x86: [rax], [rax+rbx], [rax+rbx*4], [rax+0x10], [rax+rbx*4+0x10]
+        // ARM: [x0], [x0, #8], [x0, x1], [x0, x1, lsl #2]
+
+        string? memBase = null;
+        string? memIndex = null;
+        int? memScale = null;
+        long? memDisp = null;
+
+        // Split by + or , depending on architecture style
+        var components = inner.Split(['+', ','], StringSplitOptions.RemoveEmptyEntries);
+
+        foreach (var comp in components)
+        {
+            var trimmed = comp.Trim();
+
+            // Check for scale pattern: reg*N
+            if (trimmed.Contains('*'))
+            {
+                var scaleParts = trimmed.Split('*');
+                if (scaleParts.Length == 2)
+                {
+                    memIndex = scaleParts[0].Trim().ToUpperInvariant();
+                    if (int.TryParse(scaleParts[1].Trim(), out var scale))
+                    {
+                        memScale = scale;
+                    }
+                }
+                continue;
+            }
+
+            // Check for ARM immediate: #N
+            if (trimmed.StartsWith('#'))
+            {
+                var immText = trimmed.TrimStart('#');
+                if (immText.StartsWith("0x", StringComparison.OrdinalIgnoreCase))
+                {
+                    if (long.TryParse(immText.AsSpan(2), System.Globalization.NumberStyles.HexNumber,
+                        System.Globalization.CultureInfo.InvariantCulture, out var hexDisp))
+                    {
+                        memDisp = hexDisp;
+                    }
+                }
+                else if (long.TryParse(immText, out var decDisp))
+                {
+                    memDisp = decDisp;
+                }
+                continue;
+            }
+
+            // Check for hex displacement: 0xNN
+            if (trimmed.StartsWith("0x", StringComparison.OrdinalIgnoreCase))
+            {
+                if (long.TryParse(trimmed.AsSpan(2), System.Globalization.NumberStyles.HexNumber,
+                    System.Globalization.CultureInfo.InvariantCulture, out var hexDisp))
+                {
+                    memDisp = hexDisp;
+                }
+                continue;
+            }
+
+            // Check for negative displacement
+            if (trimmed.StartsWith('-'))
+            {
+                if (long.TryParse(trimmed, out var negDisp))
+                {
+                    memDisp = negDisp;
+                }
+                continue;
+            }
+
+            // Must be a register
+            if (memBase == null)
+            {
+                memBase = trimmed.ToUpperInvariant();
+            }
+            else if (memIndex == null)
+            {
+                memIndex = trimmed.ToUpperInvariant();
+            }
+        }
+
+        return new Operand(
+            Type: OperandType.Memory,
+            Text: text,
+            Value: null,
+            Register: null,
+            MemoryBase: memBase,
+            MemoryIndex: memIndex,
+            MemoryScale: memScale,
+            MemoryDisplacement: memDisp);
     }

     private static InstructionKind ClassifyInstruction(IInstruction instr, string mnemonic)
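To make the parsing behavior above concrete: for an x86 operand string such as `rax, [rbx+rcx*4+0x10]`, `ParseOperands` splits on the top-level comma and yields one register operand plus one memory operand roughly equivalent to the records below. These are illustrative expected values traced by hand from the code above, not an executable test (the helpers are private).

```csharp
// Expected shape for "rax, [rbx+rcx*4+0x10]" (illustrative, traced by hand):
var expectedRegister = new Operand(
    Type: OperandType.Register,
    Text: "rax",
    Value: null,
    Register: "RAX",
    MemoryBase: null,
    MemoryIndex: null,
    MemoryScale: null,
    MemoryDisplacement: null);

var expectedMemory = new Operand(
    Type: OperandType.Memory,
    Text: "[rbx+rcx*4+0x10]",
    Value: null,
    Register: null,
    MemoryBase: "RBX",         // first plain register component
    MemoryIndex: "RCX",        // from the "rcx*4" scale pattern
    MemoryScale: 4,
    MemoryDisplacement: 0x10); // hex displacement parsed as 16
```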
@@ -0,0 +1,384 @@
|
|||||||
|
// Copyright (c) StellaOps. All rights reserved.
|
||||||
|
// Licensed under AGPL-3.0-or-later. See LICENSE in the project root.
|
||||||
|
// Sprint: SPRINT_20260112_004_BINIDX_b2r2_lowuir_perf_cache (BINIDX-LIFTER-02)
|
||||||
|
// Task: Bounded lifter pool with warm preload per ISA
|
||||||
|
|
||||||
|
using System.Collections.Concurrent;
|
||||||
|
using System.Collections.Immutable;
|
||||||
|
using System.Globalization;
|
||||||
|
using B2R2;
|
||||||
|
using B2R2.FrontEnd;
|
||||||
|
using B2R2.FrontEnd.BinLifter;
|
||||||
|
using Microsoft.Extensions.Logging;
|
||||||
|
using Microsoft.Extensions.Options;
|
||||||
|
|
||||||
|
namespace StellaOps.BinaryIndex.Disassembly.B2R2;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Configuration options for the B2R2 lifter pool.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class B2R2LifterPoolOptions
|
||||||
|
{
|
||||||
|
/// <summary>
|
||||||
|
/// Configuration section name.
|
||||||
|
/// </summary>
|
||||||
|
public const string SectionName = "StellaOps:BinaryIndex:B2R2LifterPool";
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Maximum number of pooled lifters per ISA.
|
||||||
|
/// </summary>
|
||||||
|
public int MaxPoolSizePerIsa { get; set; } = 4;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Whether to pre-warm lifters for common ISAs at startup.
|
||||||
|
/// </summary>
|
||||||
|
public bool EnableWarmPreload { get; set; } = true;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// ISAs to warm preload at startup.
|
||||||
|
/// </summary>
|
||||||
|
public ImmutableArray<string> WarmPreloadIsas { get; set; } =
|
||||||
|
[
|
||||||
|
"intel-64",
|
||||||
|
"intel-32",
|
||||||
|
"armv8-64",
|
||||||
|
"armv7-32"
|
||||||
|
];
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Timeout for acquiring a lifter from the pool.
|
||||||
|
/// </summary>
|
||||||
|
public TimeSpan AcquireTimeout { get; set; } = TimeSpan.FromSeconds(5);
|
||||||
|
}
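// Configuration sketch (illustrative, not part of the original change): the options
// above can be bound from the "StellaOps:BinaryIndex:B2R2LifterPool" section via
// AddB2R2LifterPool(configuration), or set in code; the values below are examples only.
//
//     services.Configure<B2R2LifterPoolOptions>(opts =>
//     {
//         opts.MaxPoolSizePerIsa = 8;
//         opts.EnableWarmPreload = true;
//         opts.WarmPreloadIsas = ["intel-64", "armv8-64"];
//     });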
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Pooled B2R2 BinHandle and LiftingUnit for reuse across calls.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class PooledLifter : IDisposable
|
||||||
|
{
|
||||||
|
private readonly B2R2LifterPool _pool;
|
||||||
|
private readonly ISA _isa;
|
||||||
|
private bool _disposed;
|
||||||
|
|
||||||
|
internal PooledLifter(
|
||||||
|
B2R2LifterPool pool,
|
||||||
|
ISA isa,
|
||||||
|
BinHandle binHandle,
|
||||||
|
LiftingUnit liftingUnit)
|
||||||
|
{
|
||||||
|
_pool = pool ?? throw new ArgumentNullException(nameof(pool));
|
||||||
|
_isa = isa;
|
||||||
|
BinHandle = binHandle ?? throw new ArgumentNullException(nameof(binHandle));
|
||||||
|
LiftingUnit = liftingUnit ?? throw new ArgumentNullException(nameof(liftingUnit));
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// The B2R2 BinHandle for this lifter.
|
||||||
|
/// </summary>
|
||||||
|
public BinHandle BinHandle { get; }
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// The B2R2 LiftingUnit for this lifter.
|
||||||
|
/// </summary>
|
||||||
|
public LiftingUnit LiftingUnit { get; }
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Returns the lifter to the pool.
|
||||||
|
/// </summary>
|
||||||
|
public void Dispose()
|
||||||
|
{
|
||||||
|
if (_disposed) return;
|
||||||
|
_disposed = true;
|
||||||
|
_pool.Return(this, _isa);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Bounded pool of B2R2 lifters with warm preload per ISA.
|
||||||
|
/// Thread-safe and designed for reuse in high-throughput scenarios.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class B2R2LifterPool : IDisposable
|
||||||
|
{
|
||||||
|
private readonly ILogger<B2R2LifterPool> _logger;
|
||||||
|
private readonly B2R2LifterPoolOptions _options;
|
||||||
|
private readonly ConcurrentDictionary<string, ConcurrentBag<PooledLifterEntry>> _pools = new();
|
||||||
|
private readonly ConcurrentDictionary<string, int> _activeCount = new();
|
||||||
|
private readonly object _warmLock = new();
|
||||||
|
private bool _warmed;
|
||||||
|
private bool _disposed;
|
||||||
|
|
||||||
|
private sealed record PooledLifterEntry(BinHandle BinHandle, LiftingUnit LiftingUnit, DateTimeOffset CreatedAt);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Creates a new B2R2 lifter pool.
|
||||||
|
/// </summary>
|
||||||
|
public B2R2LifterPool(
|
||||||
|
ILogger<B2R2LifterPool> logger,
|
||||||
|
IOptions<B2R2LifterPoolOptions> options)
|
||||||
|
{
|
||||||
|
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
|
||||||
|
_options = options?.Value ?? new B2R2LifterPoolOptions();
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Gets the current pool statistics.
|
||||||
|
/// </summary>
|
||||||
|
public B2R2LifterPoolStats GetStats()
|
||||||
|
{
|
||||||
|
var isaStats = new Dictionary<string, B2R2IsaPoolStats>();
|
||||||
|
|
||||||
|
foreach (var kvp in _pools)
|
||||||
|
{
|
||||||
|
var isaKey = kvp.Key;
|
||||||
|
var poolSize = kvp.Value.Count;
|
||||||
|
var activeCount = _activeCount.GetValueOrDefault(isaKey, 0);
|
||||||
|
|
||||||
|
isaStats[isaKey] = new B2R2IsaPoolStats(
|
||||||
|
PooledCount: poolSize,
|
||||||
|
ActiveCount: activeCount,
|
||||||
|
MaxPoolSize: _options.MaxPoolSizePerIsa);
|
||||||
|
}
|
||||||
|
|
||||||
|
return new B2R2LifterPoolStats(
|
||||||
|
TotalPooledLifters: _pools.Values.Sum(b => b.Count),
|
||||||
|
TotalActiveLifters: _activeCount.Values.Sum(),
|
||||||
|
IsWarm: _warmed,
|
||||||
|
IsaStats: isaStats.ToImmutableDictionary());
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Warms the pool by preloading lifters for common ISAs.
|
||||||
|
/// </summary>
|
||||||
|
public void WarmPool()
|
||||||
|
{
|
||||||
|
if (!_options.EnableWarmPreload) return;
|
||||||
|
if (_warmed) return;
|
||||||
|
|
||||||
|
lock (_warmLock)
|
||||||
|
{
|
||||||
|
if (_warmed) return;
|
||||||
|
|
||||||
|
_logger.LogInformation(
|
||||||
|
"Warming B2R2 lifter pool for {IsaCount} ISAs",
|
||||||
|
_options.WarmPreloadIsas.Length);
|
||||||
|
|
||||||
|
foreach (var isaKey in _options.WarmPreloadIsas)
|
||||||
|
{
|
||||||
|
try
|
||||||
|
{
|
||||||
|
var isa = ParseIsaKey(isaKey);
|
||||||
|
if (isa is null)
|
||||||
|
{
|
||||||
|
_logger.LogWarning("Unknown ISA key for warm preload: {IsaKey}", isaKey);
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create and pool a lifter for this ISA
|
||||||
|
var entry = CreateLifterEntry(isa);
|
||||||
|
var pool = GetOrCreatePool(GetIsaKey(isa));
|
||||||
|
pool.Add(entry);
|
||||||
|
|
||||||
|
_logger.LogDebug("Warmed lifter for ISA: {IsaKey}", isaKey);
|
||||||
|
}
|
||||||
|
catch (Exception ex)
|
||||||
|
{
|
||||||
|
_logger.LogWarning(ex, "Failed to warm lifter for ISA: {IsaKey}", isaKey);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
_warmed = true;
|
||||||
|
_logger.LogInformation("B2R2 lifter pool warm complete");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Acquires a lifter for the specified ISA.
|
||||||
|
/// </summary>
|
||||||
|
public PooledLifter Acquire(ISA isa)
|
||||||
|
{
|
||||||
|
ObjectDisposedException.ThrowIf(_disposed, this);
|
||||||
|
|
||||||
|
var isaKey = GetIsaKey(isa);
|
||||||
|
var pool = GetOrCreatePool(isaKey);
|
||||||
|
|
||||||
|
// Try to get an existing lifter from the pool
|
||||||
|
if (pool.TryTake(out var entry))
|
||||||
|
{
|
||||||
|
IncrementActive(isaKey);
|
||||||
|
_logger.LogTrace("Acquired pooled lifter for {Isa}", isaKey);
|
||||||
|
return new PooledLifter(this, isa, entry.BinHandle, entry.LiftingUnit);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Create a new lifter
|
||||||
|
var newEntry = CreateLifterEntry(isa);
|
||||||
|
IncrementActive(isaKey);
|
||||||
|
_logger.LogTrace("Created new lifter for {Isa}", isaKey);
|
||||||
|
return new PooledLifter(this, isa, newEntry.BinHandle, newEntry.LiftingUnit);
|
||||||
|
}
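// Usage sketch (illustrative; assumes an existing pool instance and x86-64 input,
// with the LiftInstruction call shape taken from the LowUIR adapter in this change):
//
//     using (var lifter = pool.Acquire(new ISA(Architecture.Intel, WordSize.Bit64)))
//     {
//         var stmts = lifter.LiftingUnit.LiftInstruction(0x401000UL);
//     }
//
// Dispose() on PooledLifter hands the handle back to the per-ISA bag instead of
// tearing it down, which is what keeps repeated lifts cheap.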
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Returns a lifter to the pool.
|
||||||
|
/// </summary>
|
||||||
|
internal void Return(PooledLifter lifter, ISA isa)
|
||||||
|
{
|
||||||
|
var isaKey = GetIsaKey(isa);
|
||||||
|
DecrementActive(isaKey);
|
||||||
|
|
||||||
|
var pool = GetOrCreatePool(isaKey);
|
||||||
|
|
||||||
|
// Only return to pool if under limit
|
||||||
|
if (pool.Count < _options.MaxPoolSizePerIsa)
|
||||||
|
{
|
||||||
|
var entry = new PooledLifterEntry(
|
||||||
|
lifter.BinHandle,
|
||||||
|
lifter.LiftingUnit,
|
||||||
|
DateTimeOffset.UtcNow);
|
||||||
|
pool.Add(entry);
|
||||||
|
_logger.LogTrace("Returned lifter to pool for {Isa}", isaKey);
|
||||||
|
}
|
||||||
|
else
|
||||||
|
{
|
||||||
|
_logger.LogTrace("Pool full, discarding lifter for {Isa}", isaKey);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <inheritdoc />
|
||||||
|
public void Dispose()
|
||||||
|
{
|
||||||
|
if (_disposed) return;
|
||||||
|
_disposed = true;
|
||||||
|
|
||||||
|
_pools.Clear();
|
||||||
|
_activeCount.Clear();
|
||||||
|
|
||||||
|
_logger.LogInformation("B2R2 lifter pool disposed");
|
||||||
|
}
|
||||||
|
|
||||||
|
#region Private Helpers
|
||||||
|
|
||||||
|
private static string GetIsaKey(ISA isa) =>
|
||||||
|
string.Format(
|
||||||
|
CultureInfo.InvariantCulture,
|
||||||
|
"{0}-{1}",
|
||||||
|
isa.Arch.ToString().ToLowerInvariant(),
|
||||||
|
isa.WordSize == WordSize.Bit64 ? "64" : "32");
|
||||||
|
|
||||||
|
private static ISA? ParseIsaKey(string key)
|
||||||
|
{
|
||||||
|
var parts = key.Split('-');
|
||||||
|
if (parts.Length != 2) return null;
|
||||||
|
|
||||||
|
var archStr = parts[0].ToLowerInvariant();
|
||||||
|
var bits = parts[1];
|
||||||
|
|
||||||
|
var wordSize = bits == "64" ? WordSize.Bit64 : WordSize.Bit32;
|
||||||
|
|
||||||
|
return archStr switch
|
||||||
|
{
|
||||||
|
"intel" => new ISA(Architecture.Intel, wordSize),
|
||||||
|
"armv7" => new ISA(Architecture.ARMv7, wordSize),
|
||||||
|
"armv8" => new ISA(Architecture.ARMv8, wordSize),
|
||||||
|
"mips" => new ISA(Architecture.MIPS, wordSize),
|
||||||
|
"riscv" => new ISA(Architecture.RISCV, wordSize),
|
||||||
|
"ppc" => new ISA(Architecture.PPC, Endian.Big, wordSize),
|
||||||
|
"sparc" => new ISA(Architecture.SPARC, Endian.Big),
|
||||||
|
_ => (ISA?)null
|
||||||
|
};
|
||||||
|
}
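// Round-trip sketch (illustrative): a warm-preload key is expected to map to an ISA
// and back to the same key so pool buckets stay stable, e.g.
//
//     if (ParseIsaKey("intel-64") is { } isa)
//     {
//         var key = GetIsaKey(isa); // "intel-64"
//     }
//
// Note that the "sparc" branch ignores the requested bit width, so its key may not
// round-trip exactly.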
|
||||||
|
|
||||||
|
private ConcurrentBag<PooledLifterEntry> GetOrCreatePool(string isaKey) =>
|
||||||
|
_pools.GetOrAdd(isaKey, _ => new ConcurrentBag<PooledLifterEntry>());
|
||||||
|
|
||||||
|
private static PooledLifterEntry CreateLifterEntry(ISA isa)
|
||||||
|
{
|
||||||
|
// Create a minimal BinHandle for the ISA
|
||||||
|
// Use a small NOP sled as placeholder code
|
||||||
|
var nopBytes = CreateNopSled(isa, 64);
|
||||||
|
var binHandle = new BinHandle(nopBytes, isa, null, true);
|
||||||
|
var liftingUnit = binHandle.NewLiftingUnit();
|
||||||
|
return new PooledLifterEntry(binHandle, liftingUnit, DateTimeOffset.UtcNow);
|
||||||
|
}
|
||||||
|
|
||||||
|
private static byte[] CreateNopSled(ISA isa, int size)
|
||||||
|
{
|
||||||
|
var bytes = new byte[size];
|
||||||
|
|
||||||
|
// Fill with architecture-appropriate NOP bytes
|
||||||
|
switch (isa.Arch)
|
||||||
|
{
|
||||||
|
case Architecture.Intel:
|
||||||
|
// x86/x64 NOP = 0x90
|
||||||
|
Array.Fill(bytes, (byte)0x90);
|
||||||
|
break;
|
||||||
|
|
||||||
|
case Architecture.ARMv7:
|
||||||
|
case Architecture.ARMv8:
|
||||||
|
// AArch64 NOP = 0xD503201F, emitted little-endian as bytes 1F 20 03 D5; ARM32 handled below
|
||||||
|
if (isa.WordSize == WordSize.Bit64)
|
||||||
|
{
|
||||||
|
for (var i = 0; i + 3 < size; i += 4)
|
||||||
|
{
|
||||||
|
bytes[i] = 0x1F;
|
||||||
|
bytes[i + 1] = 0x20;
|
||||||
|
bytes[i + 2] = 0x03;
|
||||||
|
bytes[i + 3] = 0xD5;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
else
|
||||||
|
{
|
||||||
|
// ARM32 NOP = 0xE320F000 (big endian) or 0x00 F0 20 E3 (little)
|
||||||
|
for (var i = 0; i + 3 < size; i += 4)
|
||||||
|
{
|
||||||
|
bytes[i] = 0x00;
|
||||||
|
bytes[i + 1] = 0xF0;
|
||||||
|
bytes[i + 2] = 0x20;
|
||||||
|
bytes[i + 3] = 0xE3;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
|
||||||
|
default:
|
||||||
|
// Generic zeroes for other architectures
|
||||||
|
Array.Fill(bytes, (byte)0x00);
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
|
||||||
|
return bytes;
|
||||||
|
}
|
||||||
|
|
||||||
|
private void IncrementActive(string isaKey)
|
||||||
|
{
|
||||||
|
_activeCount.AddOrUpdate(isaKey, 1, (_, v) => v + 1);
|
||||||
|
}
|
||||||
|
|
||||||
|
private void DecrementActive(string isaKey)
|
||||||
|
{
|
||||||
|
_activeCount.AddOrUpdate(isaKey, 0, (_, v) => Math.Max(0, v - 1));
|
||||||
|
}
|
||||||
|
|
||||||
|
#endregion
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Statistics for the B2R2 lifter pool.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="TotalPooledLifters">Total lifters currently in pool.</param>
|
||||||
|
/// <param name="TotalActiveLifters">Total lifters currently in use.</param>
|
||||||
|
/// <param name="IsWarm">Whether the pool has been warmed.</param>
|
||||||
|
/// <param name="IsaStats">Per-ISA pool statistics.</param>
|
||||||
|
public sealed record B2R2LifterPoolStats(
|
||||||
|
int TotalPooledLifters,
|
||||||
|
int TotalActiveLifters,
|
||||||
|
bool IsWarm,
|
||||||
|
ImmutableDictionary<string, B2R2IsaPoolStats> IsaStats);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Per-ISA pool statistics.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="PooledCount">Number of lifters in pool for this ISA.</param>
|
||||||
|
/// <param name="ActiveCount">Number of lifters currently in use for this ISA.</param>
|
||||||
|
/// <param name="MaxPoolSize">Maximum pool size for this ISA.</param>
|
||||||
|
public sealed record B2R2IsaPoolStats(
|
||||||
|
int PooledCount,
|
||||||
|
int ActiveCount,
|
||||||
|
int MaxPoolSize);
|
||||||
@@ -0,0 +1,697 @@
|
|||||||
|
// Copyright (c) StellaOps. All rights reserved.
|
||||||
|
// Licensed under AGPL-3.0-or-later. See LICENSE in the project root.
|
||||||
|
// Sprint: SPRINT_20260112_004_BINIDX_b2r2_lowuir_perf_cache (BINIDX-LIR-01)
|
||||||
|
// Task: Implement B2R2 LowUIR adapter for IIrLiftingService
|
||||||
|
|
||||||
|
using System.Collections.Immutable;
|
||||||
|
using System.Globalization;
|
||||||
|
using B2R2;
|
||||||
|
using B2R2.FrontEnd;
|
||||||
|
using Microsoft.Extensions.Logging;
|
||||||
|
using StellaOps.BinaryIndex.Disassembly;
|
||||||
|
using StellaOps.BinaryIndex.Semantic;
|
||||||
|
|
||||||
|
namespace StellaOps.BinaryIndex.Disassembly.B2R2;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// B2R2 LowUIR adapter for the IR lifting service.
|
||||||
|
/// Maps B2R2 BinIR/LowUIR statements to the StellaOps IR model
|
||||||
|
/// with deterministic ordering and invariant formatting.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class B2R2LowUirLiftingService : IIrLiftingService
|
||||||
|
{
|
||||||
|
private readonly ILogger<B2R2LowUirLiftingService> _logger;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Version string for cache key generation.
|
||||||
|
/// </summary>
|
||||||
|
public const string AdapterVersion = "1.0.0";
|
||||||
|
|
||||||
|
private static readonly ImmutableHashSet<CpuArchitecture> SupportedArchitectures =
|
||||||
|
[
|
||||||
|
CpuArchitecture.X86,
|
||||||
|
CpuArchitecture.X86_64,
|
||||||
|
CpuArchitecture.ARM32,
|
||||||
|
CpuArchitecture.ARM64,
|
||||||
|
CpuArchitecture.MIPS32,
|
||||||
|
CpuArchitecture.MIPS64,
|
||||||
|
CpuArchitecture.RISCV64,
|
||||||
|
CpuArchitecture.PPC32,
|
||||||
|
CpuArchitecture.SPARC
|
||||||
|
];
|
||||||
|
|
||||||
|
public B2R2LowUirLiftingService(ILogger<B2R2LowUirLiftingService> logger)
|
||||||
|
{
|
||||||
|
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <inheritdoc />
|
||||||
|
public bool SupportsArchitecture(CpuArchitecture architecture) =>
|
||||||
|
SupportedArchitectures.Contains(architecture);
|
||||||
|
|
||||||
|
/// <inheritdoc />
|
||||||
|
public Task<LiftedFunction> LiftToIrAsync(
|
||||||
|
IReadOnlyList<DisassembledInstruction> instructions,
|
||||||
|
string functionName,
|
||||||
|
ulong startAddress,
|
||||||
|
CpuArchitecture architecture,
|
||||||
|
LiftOptions? options = null,
|
||||||
|
CancellationToken ct = default)
|
||||||
|
{
|
||||||
|
ArgumentNullException.ThrowIfNull(instructions);
|
||||||
|
ct.ThrowIfCancellationRequested();
|
||||||
|
|
||||||
|
options ??= LiftOptions.Default;
|
||||||
|
|
||||||
|
if (!SupportsArchitecture(architecture))
|
||||||
|
{
|
||||||
|
throw new NotSupportedException(
|
||||||
|
$"Architecture {architecture} is not supported for B2R2 LowUIR lifting.");
|
||||||
|
}
|
||||||
|
|
||||||
|
_logger.LogDebug(
|
||||||
|
"B2R2 LowUIR lifting {InstructionCount} instructions for function {FunctionName} ({Architecture})",
|
||||||
|
instructions.Count,
|
||||||
|
functionName,
|
||||||
|
architecture);
|
||||||
|
|
||||||
|
var isa = MapToB2R2Isa(architecture);
|
||||||
|
|
||||||
|
var statements = new List<IrStatement>();
|
||||||
|
var basicBlocks = new List<IrBasicBlock>();
|
||||||
|
var currentBlockStatements = new List<int>();
|
||||||
|
var blockStartAddress = startAddress;
|
||||||
|
var statementId = 0;
|
||||||
|
var blockId = 0;
|
||||||
|
|
||||||
|
var effectiveMaxInstructions = options.MaxInstructions > 0
|
||||||
|
? options.MaxInstructions
|
||||||
|
: int.MaxValue;
|
||||||
|
|
||||||
|
foreach (var instr in instructions.Take(effectiveMaxInstructions))
|
||||||
|
{
|
||||||
|
ct.ThrowIfCancellationRequested();
|
||||||
|
|
||||||
|
// Lift instruction to B2R2 LowUIR
|
||||||
|
var liftedStatements = LiftInstructionToLowUir(isa, instr, ref statementId);
|
||||||
|
statements.AddRange(liftedStatements);
|
||||||
|
|
||||||
|
foreach (var stmt in liftedStatements)
|
||||||
|
{
|
||||||
|
currentBlockStatements.Add(stmt.Id);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check for block-ending instructions
|
||||||
|
if (IsBlockTerminator(instr))
|
||||||
|
{
|
||||||
|
var endAddress = instr.Address + (ulong)instr.RawBytes.Length;
|
||||||
|
var block = new IrBasicBlock(
|
||||||
|
Id: blockId,
|
||||||
|
Label: string.Format(CultureInfo.InvariantCulture, "bb_{0}", blockId),
|
||||||
|
StartAddress: blockStartAddress,
|
||||||
|
EndAddress: endAddress,
|
||||||
|
StatementIds: [.. currentBlockStatements],
|
||||||
|
Predecessors: ImmutableArray<int>.Empty,
|
||||||
|
Successors: ImmutableArray<int>.Empty);
|
||||||
|
|
||||||
|
basicBlocks.Add(block);
|
||||||
|
blockId++;
|
||||||
|
currentBlockStatements.Clear();
|
||||||
|
blockStartAddress = endAddress;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Handle trailing statements not yet in a block
|
||||||
|
if (currentBlockStatements.Count > 0 && instructions.Count > 0)
|
||||||
|
{
|
||||||
|
var lastInstr = instructions[^1];
|
||||||
|
var endAddress = lastInstr.Address + (ulong)lastInstr.RawBytes.Length;
|
||||||
|
var block = new IrBasicBlock(
|
||||||
|
Id: blockId,
|
||||||
|
Label: string.Format(CultureInfo.InvariantCulture, "bb_{0}", blockId),
|
||||||
|
StartAddress: blockStartAddress,
|
||||||
|
EndAddress: endAddress,
|
||||||
|
StatementIds: [.. currentBlockStatements],
|
||||||
|
Predecessors: ImmutableArray<int>.Empty,
|
||||||
|
Successors: ImmutableArray<int>.Empty);
|
||||||
|
basicBlocks.Add(block);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Build CFG edges deterministically (sequential fall-through between address-ordered blocks)
|
||||||
|
var (blocksWithEdges, edges) = BuildCfgEdges([.. basicBlocks]);
|
||||||
|
|
||||||
|
var cfg = new ControlFlowGraph(
|
||||||
|
EntryBlockId: blocksWithEdges.Length > 0 ? 0 : -1,
|
||||||
|
ExitBlockIds: FindExitBlocks(blocksWithEdges),
|
||||||
|
Edges: edges);
|
||||||
|
|
||||||
|
var lifted = new LiftedFunction(
|
||||||
|
Name: functionName,
|
||||||
|
Address: startAddress,
|
||||||
|
Statements: [.. statements],
|
||||||
|
BasicBlocks: blocksWithEdges,
|
||||||
|
Cfg: cfg);
|
||||||
|
|
||||||
|
_logger.LogDebug(
|
||||||
|
"B2R2 LowUIR lifted {StatementCount} statements in {BlockCount} blocks for {FunctionName}",
|
||||||
|
statements.Count,
|
||||||
|
blocksWithEdges.Length,
|
||||||
|
functionName);
|
||||||
|
|
||||||
|
return Task.FromResult(lifted);
|
||||||
|
}
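// Call sketch (illustrative): "instructions" and the service instance are assumed to
// come from the existing disassembly pipeline; argument names match this class.
//
//     var lifted = await liftingService.LiftToIrAsync(
//         instructions,
//         functionName: "main",
//         startAddress: 0x401000UL,
//         architecture: CpuArchitecture.X86_64,
//         options: LiftOptions.Default,
//         ct: cancellationToken);
//     var ssa = await liftingService.TransformToSsaAsync(lifted, cancellationToken);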
|
||||||
|
|
||||||
|
/// <inheritdoc />
|
||||||
|
public Task<SsaFunction> TransformToSsaAsync(
|
||||||
|
LiftedFunction lifted,
|
||||||
|
CancellationToken ct = default)
|
||||||
|
{
|
||||||
|
ArgumentNullException.ThrowIfNull(lifted);
|
||||||
|
ct.ThrowIfCancellationRequested();
|
||||||
|
|
||||||
|
_logger.LogDebug(
|
||||||
|
"Transforming {FunctionName} to SSA form ({StatementCount} statements)",
|
||||||
|
lifted.Name,
|
||||||
|
lifted.Statements.Length);
|
||||||
|
|
||||||
|
// Build SSA form from lifted function
|
||||||
|
var ssaStatements = new List<SsaStatement>();
|
||||||
|
var ssaBlocks = new List<SsaBasicBlock>();
|
||||||
|
var definitions = new Dictionary<SsaVariable, int>();
|
||||||
|
var uses = new Dictionary<SsaVariable, HashSet<int>>();
|
||||||
|
|
||||||
|
var versionCounters = new Dictionary<string, int>();
|
||||||
|
|
||||||
|
foreach (var stmt in lifted.Statements)
|
||||||
|
{
|
||||||
|
ct.ThrowIfCancellationRequested();
|
||||||
|
|
||||||
|
SsaVariable? destVar = null;
|
||||||
|
var sourceVars = new List<SsaVariable>();
|
||||||
|
|
||||||
|
// Process destination
|
||||||
|
if (stmt.Destination != null)
|
||||||
|
{
|
||||||
|
var varName = stmt.Destination.Name ?? "?";
|
||||||
|
if (!versionCounters.TryGetValue(varName, out var version))
|
||||||
|
{
|
||||||
|
version = 0;
|
||||||
|
}
|
||||||
|
versionCounters[varName] = version + 1;
|
||||||
|
|
||||||
|
destVar = new SsaVariable(
|
||||||
|
BaseName: varName,
|
||||||
|
Version: version + 1,
|
||||||
|
BitSize: stmt.Destination.BitSize,
|
||||||
|
Kind: MapOperandKindToSsaKind(stmt.Destination.Kind));
|
||||||
|
|
||||||
|
definitions[destVar] = stmt.Id;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Process sources
|
||||||
|
foreach (var src in stmt.Sources)
|
||||||
|
{
|
||||||
|
var varName = src.Name ?? "?";
|
||||||
|
var currentVersion = versionCounters.GetValueOrDefault(varName, 0);
|
||||||
|
var ssaVar = new SsaVariable(
|
||||||
|
BaseName: varName,
|
||||||
|
Version: currentVersion,
|
||||||
|
BitSize: src.BitSize,
|
||||||
|
Kind: MapOperandKindToSsaKind(src.Kind));
|
||||||
|
sourceVars.Add(ssaVar);
|
||||||
|
|
||||||
|
if (!uses.ContainsKey(ssaVar))
|
||||||
|
{
|
||||||
|
uses[ssaVar] = [];
|
||||||
|
}
|
||||||
|
uses[ssaVar].Add(stmt.Id);
|
||||||
|
}
|
||||||
|
|
||||||
|
var ssaStmt = new SsaStatement(
|
||||||
|
Id: stmt.Id,
|
||||||
|
Address: stmt.Address,
|
||||||
|
Kind: stmt.Kind,
|
||||||
|
Operation: stmt.Operation,
|
||||||
|
Destination: destVar,
|
||||||
|
Sources: [.. sourceVars],
|
||||||
|
PhiSources: null);
|
||||||
|
|
||||||
|
ssaStatements.Add(ssaStmt);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Build SSA basic blocks from lifted blocks
|
||||||
|
foreach (var block in lifted.BasicBlocks)
|
||||||
|
{
|
||||||
|
var blockStatements = ssaStatements
|
||||||
|
.Where(s => block.StatementIds.Contains(s.Id))
|
||||||
|
.ToImmutableArray();
|
||||||
|
|
||||||
|
var ssaBlock = new SsaBasicBlock(
|
||||||
|
Id: block.Id,
|
||||||
|
Label: block.Label,
|
||||||
|
PhiNodes: ImmutableArray<SsaStatement>.Empty,
|
||||||
|
Statements: blockStatements,
|
||||||
|
Predecessors: block.Predecessors,
|
||||||
|
Successors: block.Successors);
|
||||||
|
|
||||||
|
ssaBlocks.Add(ssaBlock);
|
||||||
|
}
|
||||||
|
|
||||||
|
var defUse = new DefUseChains(
|
||||||
|
Definitions: definitions.ToImmutableDictionary(),
|
||||||
|
Uses: uses.ToImmutableDictionary(
|
||||||
|
k => k.Key,
|
||||||
|
v => v.Value.ToImmutableHashSet()));
|
||||||
|
|
||||||
|
var ssaFunction = new SsaFunction(
|
||||||
|
Name: lifted.Name,
|
||||||
|
Address: lifted.Address,
|
||||||
|
Statements: [.. ssaStatements],
|
||||||
|
BasicBlocks: [.. ssaBlocks],
|
||||||
|
DefUse: defUse);
|
||||||
|
|
||||||
|
_logger.LogDebug(
|
||||||
|
"SSA transformation complete: {StatementCount} SSA statements, {DefCount} definitions",
|
||||||
|
ssaStatements.Count,
|
||||||
|
definitions.Count);
|
||||||
|
|
||||||
|
return Task.FromResult(ssaFunction);
|
||||||
|
}
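// Note (sketch): this pass assigns monotonically increasing version numbers per base
// name and leaves PhiNodes/PhiSources empty, so the result is a linear, approximate
// SSA rather than a full dominance-frontier construction; DefUse chains are still
// populated from the versions produced above.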
|
||||||
|
|
||||||
|
#region B2R2 LowUIR Mapping
|
||||||
|
|
||||||
|
private List<IrStatement> LiftInstructionToLowUir(
|
||||||
|
ISA isa,
|
||||||
|
DisassembledInstruction instr,
|
||||||
|
ref int statementId)
|
||||||
|
{
|
||||||
|
var statements = new List<IrStatement>();
|
||||||
|
|
||||||
|
try
|
||||||
|
{
|
||||||
|
// Create B2R2 BinHandle and lifting unit for the ISA
|
||||||
|
var bytes = instr.RawBytes.ToArray();
|
||||||
|
var binHandle = new BinHandle(bytes, isa, null, true);
|
||||||
|
var lifter = binHandle.NewLiftingUnit();
|
||||||
|
|
||||||
|
// Lift to LowUIR using B2R2 - returns Stmt[] directly
|
||||||
|
var liftResult = lifter.LiftInstruction(instr.Address);
|
||||||
|
|
||||||
|
if (liftResult == null || liftResult.Length == 0)
|
||||||
|
{
|
||||||
|
// Fallback to simple mapping if B2R2 lift fails
|
||||||
|
statements.Add(CreateFallbackStatement(instr, statementId++));
|
||||||
|
return statements;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Map each B2R2 LowUIR statement to our IR model
|
||||||
|
foreach (var b2r2Stmt in liftResult)
|
||||||
|
{
|
||||||
|
var irStmt = MapB2R2Statement(b2r2Stmt, instr.Address, ref statementId);
|
||||||
|
if (irStmt != null)
|
||||||
|
{
|
||||||
|
statements.Add(irStmt);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Ensure at least one statement per instruction for determinism
|
||||||
|
if (statements.Count == 0)
|
||||||
|
{
|
||||||
|
statements.Add(CreateFallbackStatement(instr, statementId++));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
catch (Exception ex)
|
||||||
|
{
|
||||||
|
_logger.LogWarning(
|
||||||
|
ex,
|
||||||
|
"B2R2 lift failed for instruction at {Address}: {Mnemonic}",
|
||||||
|
instr.Address,
|
||||||
|
instr.Mnemonic);
|
||||||
|
|
||||||
|
statements.Add(CreateFallbackStatement(instr, statementId++));
|
||||||
|
}
|
||||||
|
|
||||||
|
return statements;
|
||||||
|
}
|
||||||
|
|
||||||
|
private IrStatement? MapB2R2Statement(object b2r2Stmt, ulong baseAddress, ref int statementId)
|
||||||
|
{
|
||||||
|
// B2R2 LowUIR statement types:
|
||||||
|
// - Put: register assignment
|
||||||
|
// - Store: memory write
|
||||||
|
// - Jmp: unconditional jump
|
||||||
|
// - CJmp: conditional jump
|
||||||
|
// - InterJmp: indirect jump
|
||||||
|
// - InterCJmp: indirect conditional jump
|
||||||
|
// - LMark: label marker
|
||||||
|
// - SideEffect: side effects (syscall, fence, etc.)
|
||||||
|
|
||||||
|
var stmtType = b2r2Stmt.GetType().Name;
|
||||||
|
var kind = MapB2R2StmtTypeToKind(stmtType);
|
||||||
|
|
||||||
|
if (kind == IrStatementKind.Unknown)
|
||||||
|
{
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
var (dest, sources) = ExtractOperandsFromB2R2Stmt(b2r2Stmt);
|
||||||
|
var operation = stmtType;
|
||||||
|
|
||||||
|
return new IrStatement(
|
||||||
|
Id: statementId++,
|
||||||
|
Address: baseAddress,
|
||||||
|
Kind: kind,
|
||||||
|
Operation: operation,
|
||||||
|
Destination: dest,
|
||||||
|
Sources: sources,
|
||||||
|
Metadata: null);
|
||||||
|
}
|
||||||
|
|
||||||
|
private static IrStatementKind MapB2R2StmtTypeToKind(string stmtType) => stmtType switch
|
||||||
|
{
|
||||||
|
"Put" => IrStatementKind.Assign,
|
||||||
|
"Store" => IrStatementKind.Store,
|
||||||
|
"Jmp" => IrStatementKind.Jump,
|
||||||
|
"CJmp" => IrStatementKind.ConditionalJump,
|
||||||
|
"InterJmp" => IrStatementKind.Jump,
|
||||||
|
"InterCJmp" => IrStatementKind.ConditionalJump,
|
||||||
|
"LMark" => IrStatementKind.Nop,
|
||||||
|
"SideEffect" => IrStatementKind.Syscall,
|
||||||
|
_ => IrStatementKind.Unknown
|
||||||
|
};
|
||||||
|
|
||||||
|
private static (IrOperand? Dest, ImmutableArray<IrOperand> Sources) ExtractOperandsFromB2R2Stmt(object b2r2Stmt)
|
||||||
|
{
|
||||||
|
IrOperand? dest = null;
|
||||||
|
var sources = new List<IrOperand>();
|
||||||
|
|
||||||
|
var type = b2r2Stmt.GetType();
|
||||||
|
|
||||||
|
// Try to extract destination
|
||||||
|
var destProp = type.GetProperty("Dest");
|
||||||
|
if (destProp != null)
|
||||||
|
{
|
||||||
|
var destVal = destProp.GetValue(b2r2Stmt);
|
||||||
|
if (destVal != null)
|
||||||
|
{
|
||||||
|
dest = CreateOperandFromB2R2Expr(destVal);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try to extract source/value
|
||||||
|
var srcProp = type.GetProperty("Value") ?? type.GetProperty("Src");
|
||||||
|
if (srcProp != null)
|
||||||
|
{
|
||||||
|
var srcVal = srcProp.GetValue(b2r2Stmt);
|
||||||
|
if (srcVal != null)
|
||||||
|
{
|
||||||
|
sources.Add(CreateOperandFromB2R2Expr(srcVal));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try to extract condition for conditional jumps
|
||||||
|
var condProp = type.GetProperty("Cond");
|
||||||
|
if (condProp != null)
|
||||||
|
{
|
||||||
|
var condVal = condProp.GetValue(b2r2Stmt);
|
||||||
|
if (condVal != null)
|
||||||
|
{
|
||||||
|
sources.Add(CreateOperandFromB2R2Expr(condVal));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return (dest, [.. sources]);
|
||||||
|
}
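// Design note (sketch): operands are pulled out via reflection on the LowUIR
// statement's "Dest", "Value"/"Src", and "Cond" properties so the adapter avoids a
// compile-time pattern match on B2R2's F# union types; those property names are
// assumptions about the B2R2 surface, and a statement whose shape does not match
// simply yields no destination/sources rather than failing.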
|
||||||
|
|
||||||
|
private static IrOperand CreateOperandFromB2R2Expr(object expr)
|
||||||
|
{
|
||||||
|
var exprType = expr.GetType().Name;
|
||||||
|
|
||||||
|
return exprType switch
|
||||||
|
{
|
||||||
|
"Var" => new IrOperand(
|
||||||
|
Kind: IrOperandKind.Register,
|
||||||
|
Name: GetVarName(expr),
|
||||||
|
Value: null,
|
||||||
|
BitSize: GetVarBitWidth(expr),
|
||||||
|
IsMemory: false),
|
||||||
|
|
||||||
|
"TempVar" => new IrOperand(
|
||||||
|
Kind: IrOperandKind.Temporary,
|
||||||
|
Name: GetTempVarName(expr),
|
||||||
|
Value: null,
|
||||||
|
BitSize: GetVarBitWidth(expr),
|
||||||
|
IsMemory: false),
|
||||||
|
|
||||||
|
"Num" => new IrOperand(
|
||||||
|
Kind: IrOperandKind.Immediate,
|
||||||
|
Name: null,
|
||||||
|
Value: GetNumValueLong(expr),
|
||||||
|
BitSize: GetNumBitWidth(expr),
|
||||||
|
IsMemory: false),
|
||||||
|
|
||||||
|
"Load" => new IrOperand(
|
||||||
|
Kind: IrOperandKind.Memory,
|
||||||
|
Name: "[mem]",
|
||||||
|
Value: null,
|
||||||
|
BitSize: GetLoadBitWidth(expr),
|
||||||
|
IsMemory: true),
|
||||||
|
|
||||||
|
_ => new IrOperand(
|
||||||
|
Kind: IrOperandKind.Unknown,
|
||||||
|
Name: exprType,
|
||||||
|
Value: null,
|
||||||
|
BitSize: 64,
|
||||||
|
IsMemory: false)
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
private static string GetVarName(object varExpr)
|
||||||
|
{
|
||||||
|
var nameProp = varExpr.GetType().GetProperty("Name");
|
||||||
|
return nameProp?.GetValue(varExpr)?.ToString() ?? "?";
|
||||||
|
}
|
||||||
|
|
||||||
|
private static string GetTempVarName(object tempVarExpr)
|
||||||
|
{
|
||||||
|
var numProp = tempVarExpr.GetType().GetProperty("N");
|
||||||
|
var num = numProp?.GetValue(tempVarExpr) ?? 0;
|
||||||
|
return string.Format(CultureInfo.InvariantCulture, "T{0}", num);
|
||||||
|
}
|
||||||
|
|
||||||
|
private static int GetVarBitWidth(object varExpr)
|
||||||
|
{
|
||||||
|
var typeProp = varExpr.GetType().GetProperty("Type");
|
||||||
|
if (typeProp == null) return 64;
|
||||||
|
|
||||||
|
var regType = typeProp.GetValue(varExpr);
|
||||||
|
var bitSizeProp = regType?.GetType().GetProperty("BitSize");
|
||||||
|
return (int?)bitSizeProp?.GetValue(regType) ?? 64;
|
||||||
|
}
|
||||||
|
|
||||||
|
private static long GetNumValueLong(object numExpr)
|
||||||
|
{
|
||||||
|
var valueProp = numExpr.GetType().GetProperty("Value");
|
||||||
|
var value = valueProp?.GetValue(numExpr);
|
||||||
|
return Convert.ToInt64(value, CultureInfo.InvariantCulture);
|
||||||
|
}
|
||||||
|
|
||||||
|
private static int GetNumBitWidth(object numExpr)
|
||||||
|
{
|
||||||
|
var typeProp = numExpr.GetType().GetProperty("Type");
|
||||||
|
if (typeProp == null) return 64;
|
||||||
|
|
||||||
|
var numType = typeProp.GetValue(numExpr);
|
||||||
|
var bitSizeProp = numType?.GetType().GetProperty("BitSize");
|
||||||
|
return (int?)bitSizeProp?.GetValue(numType) ?? 64;
|
||||||
|
}
|
||||||
|
|
||||||
|
private static int GetLoadBitWidth(object loadExpr)
|
||||||
|
{
|
||||||
|
var typeProp = loadExpr.GetType().GetProperty("Type");
|
||||||
|
if (typeProp == null) return 64;
|
||||||
|
|
||||||
|
var loadType = typeProp.GetValue(loadExpr);
|
||||||
|
var bitSizeProp = loadType?.GetType().GetProperty("BitSize");
|
||||||
|
return (int?)bitSizeProp?.GetValue(loadType) ?? 64;
|
||||||
|
}
|
||||||
|
|
||||||
|
private static IrStatement CreateFallbackStatement(DisassembledInstruction instr, int id)
|
||||||
|
{
|
||||||
|
var sources = instr.Operands.Skip(1)
|
||||||
|
.Select(op => new IrOperand(
|
||||||
|
Kind: MapOperandType(op.Type),
|
||||||
|
Name: op.Text,
|
||||||
|
Value: op.Value,
|
||||||
|
BitSize: 64,
|
||||||
|
IsMemory: op.Type == OperandType.Memory))
|
||||||
|
.ToImmutableArray();
|
||||||
|
|
||||||
|
var dest = instr.Operands.Length > 0
|
||||||
|
? new IrOperand(
|
||||||
|
Kind: MapOperandType(instr.Operands[0].Type),
|
||||||
|
Name: instr.Operands[0].Text,
|
||||||
|
Value: instr.Operands[0].Value,
|
||||||
|
BitSize: 64,
|
||||||
|
IsMemory: instr.Operands[0].Type == OperandType.Memory)
|
||||||
|
: null;
|
||||||
|
|
||||||
|
return new IrStatement(
|
||||||
|
Id: id,
|
||||||
|
Address: instr.Address,
|
||||||
|
Kind: MapMnemonicToKind(instr.Mnemonic),
|
||||||
|
Operation: instr.Mnemonic,
|
||||||
|
Destination: dest,
|
||||||
|
Sources: sources,
|
||||||
|
Metadata: ImmutableDictionary<string, object>.Empty.Add("fallback", true));
|
||||||
|
}
|
||||||
|
|
||||||
|
private static SsaVariableKind MapOperandKindToSsaKind(IrOperandKind kind) => kind switch
|
||||||
|
{
|
||||||
|
IrOperandKind.Register => SsaVariableKind.Register,
|
||||||
|
IrOperandKind.Temporary => SsaVariableKind.Temporary,
|
||||||
|
IrOperandKind.Memory => SsaVariableKind.Memory,
|
||||||
|
IrOperandKind.Immediate => SsaVariableKind.Constant,
|
||||||
|
_ => SsaVariableKind.Temporary
|
||||||
|
};
|
||||||
|
|
||||||
|
private static IrOperandKind MapOperandType(OperandType type) => type switch
|
||||||
|
{
|
||||||
|
OperandType.Register => IrOperandKind.Register,
|
||||||
|
OperandType.Immediate => IrOperandKind.Immediate,
|
||||||
|
OperandType.Memory => IrOperandKind.Memory,
|
||||||
|
OperandType.Address => IrOperandKind.Label,
|
||||||
|
_ => IrOperandKind.Unknown
|
||||||
|
};
|
||||||
|
|
||||||
|
#endregion
|
||||||
|
|
||||||
|
#region Helper Methods
|
||||||
|
|
||||||
|
private static ISA MapToB2R2Isa(CpuArchitecture arch) => arch switch
|
||||||
|
{
|
||||||
|
CpuArchitecture.X86 => new ISA(Architecture.Intel, WordSize.Bit32),
|
||||||
|
CpuArchitecture.X86_64 => new ISA(Architecture.Intel, WordSize.Bit64),
|
||||||
|
CpuArchitecture.ARM32 => new ISA(Architecture.ARMv7, WordSize.Bit32),
|
||||||
|
CpuArchitecture.ARM64 => new ISA(Architecture.ARMv8, WordSize.Bit64),
|
||||||
|
CpuArchitecture.MIPS32 => new ISA(Architecture.MIPS, WordSize.Bit32),
|
||||||
|
CpuArchitecture.MIPS64 => new ISA(Architecture.MIPS, WordSize.Bit64),
|
||||||
|
CpuArchitecture.RISCV64 => new ISA(Architecture.RISCV, WordSize.Bit64),
|
||||||
|
CpuArchitecture.PPC32 => new ISA(Architecture.PPC, Endian.Big, WordSize.Bit32),
|
||||||
|
CpuArchitecture.SPARC => new ISA(Architecture.SPARC, Endian.Big),
|
||||||
|
_ => throw new NotSupportedException($"Unsupported architecture: {arch}")
|
||||||
|
};
|
||||||
|
|
||||||
|
private static bool IsBlockTerminator(DisassembledInstruction instr)
|
||||||
|
{
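// Heuristic note (sketch): prefix matching is intentionally broad, so ARM BL/BLX
// (calls) and x86 mnemonics such as BT/BSF also count as terminators here; callers
// that need precise block boundaries should not rely on this fallback alone.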
|
||||||
|
var mnemonic = instr.Mnemonic.ToUpperInvariant();
|
||||||
|
return mnemonic.StartsWith("J", StringComparison.Ordinal) ||
|
||||||
|
mnemonic.StartsWith("B", StringComparison.Ordinal) ||
|
||||||
|
mnemonic == "RET" ||
|
||||||
|
mnemonic == "RETN" ||
|
||||||
|
mnemonic == "RETF" ||
|
||||||
|
mnemonic == "IRET" ||
|
||||||
|
mnemonic == "SYSRET" ||
|
||||||
|
mnemonic == "BLR" ||
|
||||||
|
mnemonic == "BX" ||
|
||||||
|
mnemonic == "JR";
|
||||||
|
}
|
||||||
|
|
||||||
|
private static IrStatementKind MapMnemonicToKind(string mnemonic)
|
||||||
|
{
|
||||||
|
var upper = mnemonic.ToUpperInvariant();
|
||||||
|
|
||||||
|
if (upper.StartsWith("MOV", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("LEA", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("LDR", StringComparison.Ordinal))
|
||||||
|
return IrStatementKind.Assign;
|
||||||
|
|
||||||
|
if (upper.StartsWith("ADD", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("SUB", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("MUL", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("DIV", StringComparison.Ordinal))
|
||||||
|
return IrStatementKind.BinaryOp;
|
||||||
|
|
||||||
|
if (upper.StartsWith("AND", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("OR", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("XOR", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("SH", StringComparison.Ordinal))
|
||||||
|
return IrStatementKind.BinaryOp;
|
||||||
|
|
||||||
|
if (upper.StartsWith("CMP", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("TEST", StringComparison.Ordinal))
|
||||||
|
return IrStatementKind.Compare;
|
||||||
|
|
||||||
|
// Exact call/return mnemonics are checked before the generic J*/B* prefix test so
// BL/BLX/BLR are not misreported as jumps.
if (upper == "CALL" || upper == "BL" || upper == "BLX")
    return IrStatementKind.Call;

if (upper == "RET" || upper == "RETN" || upper == "BLR")
    return IrStatementKind.Return;

if (upper.StartsWith("J", StringComparison.Ordinal) ||
    upper.StartsWith("B", StringComparison.Ordinal))
    return IrStatementKind.ConditionalJump;
|
||||||
|
|
||||||
|
if (upper.StartsWith("PUSH", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("POP", StringComparison.Ordinal) ||
|
||||||
|
upper.StartsWith("STR", StringComparison.Ordinal))
|
||||||
|
return IrStatementKind.Store;
|
||||||
|
|
||||||
|
if (upper == "NOP")
|
||||||
|
return IrStatementKind.Nop;
|
||||||
|
|
||||||
|
return IrStatementKind.Unknown;
|
||||||
|
}
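// Spot-check sketch (illustrative expected classifications, given that the exact
// call/return matches run before the B prefix check above):
//
//     MapMnemonicToKind("mov")  // Assign
//     MapMnemonicToKind("bl")   // Call
//     MapMnemonicToKind("beq")  // ConditionalJump
//     MapMnemonicToKind("push") // Store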
|
||||||
|
|
||||||
|
private static (ImmutableArray<IrBasicBlock> Blocks, ImmutableArray<CfgEdge> Edges) BuildCfgEdges(
|
||||||
|
ImmutableArray<IrBasicBlock> blocks)
|
||||||
|
{
|
||||||
|
if (blocks.Length == 0)
|
||||||
|
return (blocks, ImmutableArray<CfgEdge>.Empty);
|
||||||
|
|
||||||
|
var result = new IrBasicBlock[blocks.Length];
|
||||||
|
var edges = new List<CfgEdge>();
|
||||||
|
|
||||||
|
for (var i = 0; i < blocks.Length; i++)
|
||||||
|
{
|
||||||
|
var block = blocks[i];
|
||||||
|
var predecessors = new List<int>();
|
||||||
|
var successors = new List<int>();
|
||||||
|
|
||||||
|
// Fall-through successor (next block in sequence)
|
||||||
|
if (i < blocks.Length - 1)
|
||||||
|
{
|
||||||
|
successors.Add(i + 1);
|
||||||
|
edges.Add(new CfgEdge(
|
||||||
|
SourceBlockId: i,
|
||||||
|
TargetBlockId: i + 1,
|
||||||
|
Kind: CfgEdgeKind.FallThrough,
|
||||||
|
Condition: null));
|
||||||
|
}
|
||||||
|
|
||||||
|
// Predecessor from fall-through
|
||||||
|
if (i > 0)
|
||||||
|
{
|
||||||
|
predecessors.Add(i - 1);
|
||||||
|
}
|
||||||
|
|
||||||
|
result[i] = block with
|
||||||
|
{
|
||||||
|
Predecessors = [.. predecessors.Distinct().OrderBy(x => x)],
|
||||||
|
Successors = [.. successors.Distinct().OrderBy(x => x)]
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
return ([.. result], [.. edges]);
|
||||||
|
}
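// Note (sketch): only fall-through edges between consecutive blocks are modeled here;
// taken-branch targets would require resolving jump operands to block start addresses,
// so downstream consumers should treat this CFG as a linear approximation.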
|
||||||
|
|
||||||
|
private static ImmutableArray<int> FindExitBlocks(ImmutableArray<IrBasicBlock> blocks)
|
||||||
|
{
|
||||||
|
return blocks
|
||||||
|
.Where(b => b.Successors.Length == 0)
|
||||||
|
.Select(b => b.Id)
|
||||||
|
.ToImmutableArray();
|
||||||
|
}
|
||||||
|
|
||||||
|
#endregion
|
||||||
|
}
|
||||||
@@ -1,8 +1,11 @@
|
|||||||
// Copyright (c) StellaOps. All rights reserved.
|
// Copyright (c) StellaOps. All rights reserved.
|
||||||
// Licensed under AGPL-3.0-or-later. See LICENSE in the project root.
|
// Licensed under AGPL-3.0-or-later. See LICENSE in the project root.
|
||||||
|
// Sprint: SPRINT_20260112_004_BINIDX_b2r2_lowuir_perf_cache (BINIDX-LIFTER-02)
|
||||||
|
|
||||||
|
using Microsoft.Extensions.Configuration;
|
||||||
using Microsoft.Extensions.DependencyInjection;
|
using Microsoft.Extensions.DependencyInjection;
|
||||||
using Microsoft.Extensions.DependencyInjection.Extensions;
|
using Microsoft.Extensions.DependencyInjection.Extensions;
|
||||||
|
using StellaOps.BinaryIndex.Semantic;
|
||||||
|
|
||||||
namespace StellaOps.BinaryIndex.Disassembly.B2R2;
|
namespace StellaOps.BinaryIndex.Disassembly.B2R2;
|
||||||
|
|
||||||
@@ -25,4 +28,66 @@ public static class B2R2ServiceCollectionExtensions
|
|||||||
|
|
||||||
return services;
|
return services;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Adds the B2R2 lifter pool to the service collection.
|
||||||
|
/// Provides pooled lifters with warm preload for improved performance.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="services">The service collection.</param>
|
||||||
|
/// <param name="configuration">Configuration for binding pool options.</param>
|
||||||
|
/// <returns>The service collection for chaining.</returns>
|
||||||
|
public static IServiceCollection AddB2R2LifterPool(
|
||||||
|
this IServiceCollection services,
|
||||||
|
IConfiguration? configuration = null)
|
||||||
|
{
|
||||||
|
ArgumentNullException.ThrowIfNull(services);
|
||||||
|
|
||||||
|
if (configuration != null)
|
||||||
|
{
|
||||||
|
services.Configure<B2R2LifterPoolOptions>(
|
||||||
|
configuration.GetSection(B2R2LifterPoolOptions.SectionName));
|
||||||
|
}
|
||||||
|
else
|
||||||
|
{
|
||||||
|
services.Configure<B2R2LifterPoolOptions>(_ => { });
|
||||||
|
}
|
||||||
|
|
||||||
|
services.TryAddSingleton<B2R2LifterPool>();
|
||||||
|
|
||||||
|
return services;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Adds the B2R2 LowUIR lifting service to the service collection.
|
||||||
|
/// Provides IR lifting with B2R2 LowUIR semantics.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="services">The service collection.</param>
|
||||||
|
/// <returns>The service collection for chaining.</returns>
|
||||||
|
public static IServiceCollection AddB2R2LowUirLiftingService(this IServiceCollection services)
|
||||||
|
{
|
||||||
|
ArgumentNullException.ThrowIfNull(services);
|
||||||
|
|
||||||
|
services.TryAddSingleton<IIrLiftingService, B2R2LowUirLiftingService>();
|
||||||
|
|
||||||
|
return services;
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Adds all B2R2 services to the service collection.
|
||||||
|
/// </summary>
|
||||||
|
/// <param name="services">The service collection.</param>
|
||||||
|
/// <param name="configuration">Configuration for binding options.</param>
|
||||||
|
/// <returns>The service collection for chaining.</returns>
|
||||||
|
public static IServiceCollection AddB2R2Services(
|
||||||
|
this IServiceCollection services,
|
||||||
|
IConfiguration? configuration = null)
|
||||||
|
{
|
||||||
|
ArgumentNullException.ThrowIfNull(services);
|
||||||
|
|
||||||
|
services.AddB2R2DisassemblyPlugin();
|
||||||
|
services.AddB2R2LifterPool(configuration);
|
||||||
|
services.AddB2R2LowUirLiftingService();
|
||||||
|
|
||||||
|
return services;
|
||||||
|
}
|
||||||
}
|
}
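// Wiring sketch (illustrative): the "builder"/"app" host objects below are assumptions,
// not part of this change.
//
//     builder.Services.AddB2R2Services(builder.Configuration);
//     // ...
//     var pool = app.Services.GetRequiredService<B2R2LifterPool>();
//     pool.WarmPool(); // optional warm preload at startup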
|
||||||
|
|||||||
@@ -11,6 +11,8 @@
|
|||||||
|
|
||||||
<ItemGroup>
|
<ItemGroup>
|
||||||
<ProjectReference Include="..\StellaOps.BinaryIndex.Disassembly.Abstractions\StellaOps.BinaryIndex.Disassembly.Abstractions.csproj" />
|
<ProjectReference Include="..\StellaOps.BinaryIndex.Disassembly.Abstractions\StellaOps.BinaryIndex.Disassembly.Abstractions.csproj" />
|
||||||
|
<!-- Sprint: SPRINT_20260112_004_BINIDX_b2r2_lowuir_perf_cache (BINIDX-LIR-01) -->
|
||||||
|
<ProjectReference Include="..\StellaOps.BinaryIndex.Semantic\StellaOps.BinaryIndex.Semantic.csproj" />
|
||||||
</ItemGroup>
|
</ItemGroup>
|
||||||
|
|
||||||
<ItemGroup>
|
<ItemGroup>
|
||||||
|
|||||||
@@ -7,6 +7,8 @@
|
|||||||
- Maintain evidence bundle schemas and export formats.
|
- Maintain evidence bundle schemas and export formats.
|
||||||
- Provide API and worker workflows for evidence packaging and retrieval.
|
- Provide API and worker workflows for evidence packaging and retrieval.
|
||||||
- Enforce deterministic ordering, hashing, and offline-friendly behavior.
|
- Enforce deterministic ordering, hashing, and offline-friendly behavior.
|
||||||
|
- Support transparency log (Rekor) and RFC3161 timestamp references in bundle metadata.
|
||||||
|
- Support S3 Object Lock for WORM retention and legal hold when configured.
|
||||||
|
|
||||||
## Required Reading
|
## Required Reading
|
||||||
- docs/README.md
|
- docs/README.md
|
||||||
@@ -16,13 +18,19 @@
|
|||||||
- docs/modules/evidence-locker/export-format.md
|
- docs/modules/evidence-locker/export-format.md
|
||||||
- docs/modules/evidence-locker/evidence-bundle-v1.md
|
- docs/modules/evidence-locker/evidence-bundle-v1.md
|
||||||
- docs/modules/evidence-locker/attestation-contract.md
|
- docs/modules/evidence-locker/attestation-contract.md
|
||||||
|
- docs/modules/evidence-locker/schemas/stellaops-evidence-pack.v1.schema.json
|
||||||
|
- docs/modules/evidence-locker/schemas/bundle.manifest.schema.json
|
||||||
|
|
||||||
## Working Agreement
|
## Working Agreement
|
||||||
- Deterministic ordering and invariant formatting for export artifacts.
|
- Deterministic ordering and invariant formatting for export artifacts.
|
||||||
- Use TimeProvider and IGuidGenerator where timestamps or IDs are created.
|
- Use TimeProvider and IGuidGenerator where timestamps or IDs are created.
|
||||||
- Propagate CancellationToken for async operations.
|
- Propagate CancellationToken for async operations.
|
||||||
- Keep offline-first behavior (no network dependencies unless explicitly configured).
|
- Keep offline-first behavior (no network dependencies unless explicitly configured).
|
||||||
|
- Bundle manifests must serialize transparency and timestamp references in deterministic order (transparency entries sorted by logIndex, timestamp entries by tokenPath).
|
||||||
|
- Object Lock configuration is validated at startup when enabled.
|
||||||
|
|
||||||
## Testing Strategy
|
## Testing Strategy
|
||||||
- Unit tests for bundling, export serialization, and hash stability.
|
- Unit tests for bundling, export serialization, and hash stability.
|
||||||
- Schema evolution tests for bundle compatibility.
|
- Schema evolution tests for bundle compatibility.
|
||||||
|
- Tests for transparency and timestamp reference serialization.
|
||||||
|
- Tests for Object Lock configuration validation.
|
||||||
|
|||||||
@@ -1,3 +1,4 @@
|
|||||||
|
using System.Collections.Immutable;
|
||||||
using StellaOps.EvidenceLocker.Core.Domain;
|
using StellaOps.EvidenceLocker.Core.Domain;
|
||||||
|
|
||||||
namespace StellaOps.EvidenceLocker.Core.Builders;
|
namespace StellaOps.EvidenceLocker.Core.Builders;
|
||||||
@@ -26,13 +27,35 @@ public sealed record EvidenceManifestEntry(
|
|||||||
string MediaType,
|
string MediaType,
|
||||||
IReadOnlyDictionary<string, string> Attributes);
|
IReadOnlyDictionary<string, string> Attributes);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Transparency log reference for audit trail verification.
|
||||||
|
/// </summary>
|
||||||
|
public sealed record TransparencyReference(
|
||||||
|
string Uuid,
|
||||||
|
long LogIndex,
|
||||||
|
string? RootHash = null,
|
||||||
|
string? InclusionProofPath = null,
|
||||||
|
string? LogUrl = null);
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// RFC3161 timestamp reference for bundle time anchor.
|
||||||
|
/// </summary>
|
||||||
|
public sealed record TimestampReference(
|
||||||
|
string TokenPath,
|
||||||
|
string HashAlgorithm,
|
||||||
|
DateTimeOffset? SignedAt = null,
|
||||||
|
string? TsaName = null,
|
||||||
|
string? TsaUrl = null);
|
||||||
|
|
||||||
public sealed record EvidenceBundleManifest(
|
public sealed record EvidenceBundleManifest(
|
||||||
EvidenceBundleId BundleId,
|
EvidenceBundleId BundleId,
|
||||||
TenantId TenantId,
|
TenantId TenantId,
|
||||||
EvidenceBundleKind Kind,
|
EvidenceBundleKind Kind,
|
||||||
DateTimeOffset CreatedAt,
|
DateTimeOffset CreatedAt,
|
||||||
IReadOnlyDictionary<string, string> Metadata,
|
IReadOnlyDictionary<string, string> Metadata,
|
||||||
IReadOnlyList<EvidenceManifestEntry> Entries);
|
IReadOnlyList<EvidenceManifestEntry> Entries,
|
||||||
|
IReadOnlyList<TransparencyReference>? TransparencyReferences = null,
|
||||||
|
IReadOnlyList<TimestampReference>? TimestampReferences = null);
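// Construction sketch (illustrative): attach a transparency entry and an RFC3161 token
// to an existing manifest instance; the UUID, log index, and token path are placeholders.
//
//     var enriched = manifest with
//     {
//         TransparencyReferences =
//             [new TransparencyReference("rekor-entry-uuid", LogIndex: 12345)],
//         TimestampReferences =
//             [new TimestampReference("timestamps/manifest.tsr", "SHA-256")]
//     };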
|
||||||
|
|
||||||
public sealed record EvidenceBundleBuildResult(
|
public sealed record EvidenceBundleBuildResult(
|
||||||
string RootHash,
|
string RootHash,
|
||||||
|
|||||||
@@ -83,6 +83,54 @@ public sealed class AmazonS3StoreOptions
|
|||||||
public string? Prefix { get; init; }
|
public string? Prefix { get; init; }
|
||||||
|
|
||||||
public bool UseIntelligentTiering { get; init; }
|
public bool UseIntelligentTiering { get; init; }
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// S3 Object Lock configuration for WORM retention and legal hold support.
|
||||||
|
/// </summary>
|
||||||
|
public ObjectLockOptions? ObjectLock { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Object Lock semantics for immutable evidence objects.
|
||||||
|
/// </summary>
|
||||||
|
public enum ObjectLockMode
|
||||||
|
{
|
||||||
|
/// <summary>
|
||||||
|
/// Governance mode: can be bypassed by users with s3:BypassGovernanceRetention permission.
|
||||||
|
/// </summary>
|
||||||
|
Governance = 1,
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Compliance mode: cannot be overwritten or deleted by any user, including root.
|
||||||
|
/// </summary>
|
||||||
|
Compliance = 2
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// S3 Object Lock configuration for WORM retention support.
|
||||||
|
/// </summary>
|
||||||
|
public sealed class ObjectLockOptions
|
||||||
|
{
|
||||||
|
/// <summary>
|
||||||
|
/// Whether Object Lock is enabled for evidence objects.
|
||||||
|
/// </summary>
|
||||||
|
public bool Enabled { get; init; }
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Object Lock mode (Governance or Compliance).
|
||||||
|
/// </summary>
|
||||||
|
public ObjectLockMode Mode { get; init; } = ObjectLockMode.Governance;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Default retention period in days for evidence objects.
|
||||||
|
/// </summary>
|
||||||
|
[Range(1, 36500)]
|
||||||
|
public int DefaultRetentionDays { get; init; } = 90;
|
||||||
|
|
||||||
|
/// <summary>
|
||||||
|
/// Whether to apply legal hold to evidence objects by default.
|
||||||
|
/// </summary>
|
||||||
|
public bool DefaultLegalHold { get; init; }
|
||||||
}
|
}
|
||||||
|
|
||||||
public sealed class QuotaOptions
|
public sealed class QuotaOptions
|
||||||
|
|||||||
@@ -17,7 +17,9 @@ public sealed record EvidenceObjectWriteOptions(
|
|||||||
string ArtifactName,
|
string ArtifactName,
|
||||||
string ContentType,
|
string ContentType,
|
||||||
bool EnforceWriteOnce = true,
|
bool EnforceWriteOnce = true,
|
||||||
IDictionary<string, string>? Tags = null);
|
IDictionary<string, string>? Tags = null,
|
||||||
|
int? RetentionOverrideDays = null,
|
||||||
|
bool? LegalHoldOverride = null);
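// Usage sketch (illustrative): override retention and request a legal hold for a single
// artifact via a `with` expression; the base "writeOptions" instance and an enabled
// store-level ObjectLock configuration are assumed.
//
//     var lockedWrite = writeOptions with
//     {
//         RetentionOverrideDays = 3650,
//         LegalHoldOverride = true
//     };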
|
||||||
|
|
||||||
public interface IEvidenceObjectStore
|
public interface IEvidenceObjectStore
|
||||||
{
|
{
|
||||||
|
|||||||
@@ -230,6 +230,59 @@ public sealed class EvidenceSignatureService : IEvidenceSignatureService
|
|||||||
writer.WriteEndObject();
|
writer.WriteEndObject();
|
||||||
}
|
}
|
||||||
writer.WriteEndArray();
|
writer.WriteEndArray();
|
||||||
|
|
||||||
|
// Serialize transparency references for audit trail verification
|
||||||
|
if (manifest.TransparencyReferences is { Count: > 0 })
|
||||||
|
{
|
||||||
|
writer.WriteStartArray("transparency");
|
||||||
|
foreach (var transparency in manifest.TransparencyReferences.OrderBy(t => t.LogIndex))
|
||||||
|
{
|
||||||
|
writer.WriteStartObject();
|
||||||
|
writer.WriteString("uuid", transparency.Uuid);
|
||||||
|
writer.WriteNumber("logIndex", transparency.LogIndex);
|
||||||
|
if (!string.IsNullOrWhiteSpace(transparency.RootHash))
|
||||||
|
{
|
||||||
|
writer.WriteString("rootHash", transparency.RootHash);
|
||||||
|
}
|
||||||
|
if (!string.IsNullOrWhiteSpace(transparency.InclusionProofPath))
|
||||||
|
{
|
||||||
|
writer.WriteString("inclusionProofPath", transparency.InclusionProofPath);
|
||||||
|
}
|
||||||
|
if (!string.IsNullOrWhiteSpace(transparency.LogUrl))
|
||||||
|
{
|
||||||
|
writer.WriteString("logUrl", transparency.LogUrl);
|
||||||
|
}
|
||||||
|
writer.WriteEndObject();
|
||||||
|
}
|
||||||
|
writer.WriteEndArray();
|
||||||
|
}
|
||||||
|
|
||||||
|
// Serialize timestamp references for RFC3161 time anchors
|
||||||
|
if (manifest.TimestampReferences is { Count: > 0 })
|
||||||
|
{
|
||||||
|
writer.WriteStartArray("timestamps");
|
||||||
|
foreach (var timestamp in manifest.TimestampReferences.OrderBy(t => t.TokenPath, StringComparer.Ordinal))
|
||||||
|
{
|
||||||
|
writer.WriteStartObject();
|
||||||
|
writer.WriteString("tokenPath", timestamp.TokenPath);
|
||||||
|
writer.WriteString("hashAlgorithm", timestamp.HashAlgorithm);
|
||||||
|
if (timestamp.SignedAt.HasValue)
|
||||||
|
{
|
||||||
|
writer.WriteString("signedAt", timestamp.SignedAt.Value.UtcDateTime.ToString("O", CultureInfo.InvariantCulture));
|
||||||
|
}
|
||||||
|
if (!string.IsNullOrWhiteSpace(timestamp.TsaName))
|
||||||
|
{
|
||||||
|
writer.WriteString("tsaName", timestamp.TsaName);
|
||||||
|
}
|
||||||
|
if (!string.IsNullOrWhiteSpace(timestamp.TsaUrl))
|
||||||
|
{
|
||||||
|
writer.WriteString("tsaUrl", timestamp.TsaUrl);
|
||||||
|
}
|
||||||
|
writer.WriteEndObject();
|
||||||
|
}
|
||||||
|
writer.WriteEndArray();
|
||||||
|
}
|
||||||
|
|
||||||
writer.WriteEndObject();
|
writer.WriteEndObject();
|
||||||
writer.Flush();
|
writer.Flush();
|
||||||
return buffer.WrittenSpan.ToArray();
|
return buffer.WrittenSpan.ToArray();
|
||||||
|
|||||||
@@ -33,6 +33,34 @@ internal sealed class S3EvidenceObjectStore : IEvidenceObjectStore, IDisposable
|
|||||||
_logger = logger;
|
_logger = logger;
|
||||||
        _timeProvider = timeProvider ?? TimeProvider.System;
        _guidProvider = guidProvider ?? SystemGuidProvider.Instance;

        ValidateObjectLockConfiguration();
    }

    /// <summary>
    /// Validates Object Lock configuration at startup to ensure proper setup.
    /// </summary>
    private void ValidateObjectLockConfiguration()
    {
        var objectLock = _options.ObjectLock;
        if (objectLock is null || !objectLock.Enabled)
        {
            return;
        }

        if (objectLock.DefaultRetentionDays <= 0)
        {
            throw new InvalidOperationException("Object Lock retention days must be greater than zero when enabled.");
        }

        if (_logger.IsEnabled(LogLevel.Information))
        {
            _logger.LogInformation(
                "S3 Object Lock enabled: Mode={Mode}, RetentionDays={RetentionDays}, LegalHold={LegalHold}",
                objectLock.Mode,
                objectLock.DefaultRetentionDays,
                objectLock.DefaultLegalHold);
        }
    }

    public async Task<EvidenceObjectMetadata> StoreAsync(

@@ -188,10 +216,16 @@ internal sealed class S3EvidenceObjectStore : IEvidenceObjectStore, IDisposable
            request.Headers["If-None-Match"] = "*";
        }

        // Apply Object Lock settings for WORM retention
        ApplyObjectLockSettings(request, options);

        try
        {
            var response = await _s3.PutObjectAsync(request, cancellationToken);

            // Apply legal hold if configured (requires separate API call)
            await ApplyLegalHoldAsync(storageKey, options, cancellationToken);

            if (_logger.IsEnabled(LogLevel.Debug))
            {
                _logger.LogDebug("Uploaded evidence object {Key} to bucket {Bucket} (ETag: {ETag}).", storageKey, _options.BucketName, response.ETag);

@@ -213,6 +247,81 @@ internal sealed class S3EvidenceObjectStore : IEvidenceObjectStore, IDisposable
        }
    }

    /// <summary>
    /// Applies Object Lock retention settings to a PutObject request.
    /// </summary>
    private void ApplyObjectLockSettings(PutObjectRequest request, EvidenceObjectWriteOptions writeOptions)
    {
        var objectLock = _options.ObjectLock;
        if (objectLock is null || !objectLock.Enabled)
        {
            return;
        }

        // Set Object Lock mode
        request.ObjectLockMode = objectLock.Mode switch
        {
            Core.Configuration.ObjectLockMode.Compliance => Amazon.S3.ObjectLockMode.Compliance,
            Core.Configuration.ObjectLockMode.Governance => Amazon.S3.ObjectLockMode.Governance,
            _ => Amazon.S3.ObjectLockMode.Governance
        };

        // Calculate retention date
        var retentionDays = writeOptions.RetentionOverrideDays ?? objectLock.DefaultRetentionDays;
        var retainUntil = _timeProvider.GetUtcNow().AddDays(retentionDays);
        request.ObjectLockRetainUntilDate = retainUntil.UtcDateTime;

        if (_logger.IsEnabled(LogLevel.Debug))
        {
            _logger.LogDebug(
                "Applying Object Lock to {Key}: Mode={Mode}, RetainUntil={RetainUntil}",
                request.Key,
                request.ObjectLockMode,
                request.ObjectLockRetainUntilDate);
        }
    }

    /// <summary>
    /// Applies legal hold to an uploaded object if configured.
    /// </summary>
    private async Task ApplyLegalHoldAsync(
        string storageKey,
        EvidenceObjectWriteOptions writeOptions,
        CancellationToken cancellationToken)
    {
        var objectLock = _options.ObjectLock;
        if (objectLock is null || !objectLock.Enabled)
        {
            return;
        }

        var applyLegalHold = writeOptions.LegalHoldOverride ?? objectLock.DefaultLegalHold;
        if (!applyLegalHold)
        {
            return;
        }

        try
        {
            await _s3.PutObjectLegalHoldAsync(new PutObjectLegalHoldRequest
            {
                BucketName = _options.BucketName,
                Key = storageKey,
                LegalHold = new ObjectLockLegalHold { Status = ObjectLockLegalHoldStatus.On }
            }, cancellationToken);

            if (_logger.IsEnabled(LogLevel.Debug))
            {
                _logger.LogDebug("Applied legal hold to evidence object {Key}.", storageKey);
            }
        }
        catch (AmazonS3Exception ex)
        {
            _logger.LogWarning(ex, "Failed to apply legal hold to evidence object {Key}.", storageKey);
            // Don't throw - legal hold is best-effort if Object Lock mode allows it
        }
    }

    private static void TryCleanupTempFile(string path)
    {
        try
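For orientation, a minimal sketch of how a caller might exercise the per-write overrides read by `ApplyObjectLockSettings` and `ApplyLegalHoldAsync`; only the member names come from the diff above, and the init-style construction of `EvidenceObjectWriteOptions` is an assumption.

    // Illustrative sketch only, not part of the change; construction style is assumed.
    var writeOptions = new EvidenceObjectWriteOptions
    {
        RetentionOverrideDays = 30,   // takes precedence over ObjectLock.DefaultRetentionDays
        LegalHoldOverride = true      // forces the follow-up PutObjectLegalHold call after upload
    };
    // With neither override set, ObjectLock.DefaultRetentionDays and ObjectLock.DefaultLegalHold apply.
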
@@ -159,6 +159,99 @@ public sealed class EvidenceSignatureServiceTests
        Assert.Equal("zeta", enumerator.Current.Name);
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public async Task SignManifestAsync_SerializesTransparencyReferences_WhenPresent()
    {
        var timestampClient = new FakeTimestampAuthorityClient();
        var timeProvider = new TestTimeProvider(new DateTimeOffset(2025, 11, 3, 10, 0, 0, TimeSpan.Zero));
        var service = CreateService(timestampClient, timeProvider);

        var transparencyRefs = new List<TransparencyReference>
        {
            new("uuid-123", 42, "sha256:abc123", "/proof/path", "https://rekor.example")
        };

        var manifest = CreateManifest(transparencyReferences: transparencyRefs);
        var signature = await service.SignManifestAsync(
            manifest.BundleId,
            manifest.TenantId,
            manifest,
            CancellationToken.None);

        Assert.NotNull(signature);
        var payloadJson = Encoding.UTF8.GetString(Convert.FromBase64String(signature!.Payload));
        using var document = JsonDocument.Parse(payloadJson);

        Assert.True(document.RootElement.TryGetProperty("transparency", out var transparencyElement));
        Assert.Equal(JsonValueKind.Array, transparencyElement.ValueKind);
        Assert.Single(transparencyElement.EnumerateArray());

        var entry = transparencyElement[0];
        Assert.Equal("uuid-123", entry.GetProperty("uuid").GetString());
        Assert.Equal(42, entry.GetProperty("logIndex").GetInt64());
        Assert.Equal("sha256:abc123", entry.GetProperty("rootHash").GetString());
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public async Task SignManifestAsync_SerializesTimestampReferences_WhenPresent()
    {
        var timestampClient = new FakeTimestampAuthorityClient();
        var timeProvider = new TestTimeProvider(new DateTimeOffset(2025, 11, 3, 10, 0, 0, TimeSpan.Zero));
        var service = CreateService(timestampClient, timeProvider);

        var signedAt = new DateTimeOffset(2025, 11, 3, 9, 0, 0, TimeSpan.Zero);
        var timestampRefs = new List<TimestampReference>
        {
            new("timestamps/manifest.tsr", "SHA256", signedAt, "Test TSA", "https://tsa.example")
        };

        var manifest = CreateManifest(timestampReferences: timestampRefs);
        var signature = await service.SignManifestAsync(
            manifest.BundleId,
            manifest.TenantId,
            manifest,
            CancellationToken.None);

        Assert.NotNull(signature);
        var payloadJson = Encoding.UTF8.GetString(Convert.FromBase64String(signature!.Payload));
        using var document = JsonDocument.Parse(payloadJson);

        Assert.True(document.RootElement.TryGetProperty("timestamps", out var timestampsElement));
        Assert.Equal(JsonValueKind.Array, timestampsElement.ValueKind);
        Assert.Single(timestampsElement.EnumerateArray());

        var entry = timestampsElement[0];
        Assert.Equal("timestamps/manifest.tsr", entry.GetProperty("tokenPath").GetString());
        Assert.Equal("SHA256", entry.GetProperty("hashAlgorithm").GetString());
        Assert.Equal("Test TSA", entry.GetProperty("tsaName").GetString());
    }

    [Trait("Category", TestCategories.Unit)]
    [Fact]
    public async Task SignManifestAsync_OmitsTransparencyAndTimestampArrays_WhenEmpty()
    {
        var timestampClient = new FakeTimestampAuthorityClient();
        var timeProvider = new TestTimeProvider(new DateTimeOffset(2025, 11, 3, 10, 0, 0, TimeSpan.Zero));
        var service = CreateService(timestampClient, timeProvider);

        var manifest = CreateManifest();
        var signature = await service.SignManifestAsync(
            manifest.BundleId,
            manifest.TenantId,
            manifest,
            CancellationToken.None);

        Assert.NotNull(signature);
        var payloadJson = Encoding.UTF8.GetString(Convert.FromBase64String(signature!.Payload));
        using var document = JsonDocument.Parse(payloadJson);

        // These arrays should not be present when empty
        Assert.False(document.RootElement.TryGetProperty("transparency", out _));
        Assert.False(document.RootElement.TryGetProperty("timestamps", out _));
    }

    private static EvidenceSignatureService CreateService(
        ITimestampAuthorityClient timestampAuthorityClient,
        TimeProvider timeProvider,

@@ -212,7 +305,9 @@ public sealed class EvidenceSignatureServiceTests
    private static EvidenceBundleManifest CreateManifest(
        (string key, string value)[]? metadataOrder = null,
        EvidenceBundleId? bundleId = null,
        TenantId? tenantId = null,
        IReadOnlyList<TransparencyReference>? transparencyReferences = null,
        IReadOnlyList<TimestampReference>? timestampReferences = null)
    {
        metadataOrder ??= new[] { ("alpha", "1"), ("beta", "2") };
        var metadataDictionary = new Dictionary<string, string>(StringComparer.Ordinal);

@@ -244,7 +339,9 @@ public sealed class EvidenceSignatureServiceTests
            EvidenceBundleKind.Evaluation,
            new DateTimeOffset(2025, 11, 3, 9, 30, 0, TimeSpan.Zero),
            metadata,
            new List<EvidenceManifestEntry> { manifestEntry },
            transparencyReferences,
            timestampReferences);
    }

    private sealed class FakeTimestampAuthorityClient : ITimestampAuthorityClient
@@ -108,6 +108,28 @@ public static class VexTimelineEventTypes
    /// An attestation was verified.
    /// </summary>
    public const string AttestationVerified = "vex.attestation.verified";

    // Sprint: SPRINT_20260112_006_EXCITITOR_vex_change_events (EXC-VEX-001)

    /// <summary>
    /// A VEX statement was added.
    /// </summary>
    public const string StatementAdded = "vex.statement.added";

    /// <summary>
    /// A VEX statement was superseded by a newer statement.
    /// </summary>
    public const string StatementSuperseded = "vex.statement.superseded";

    /// <summary>
    /// A VEX statement conflict was detected (multiple conflicting statuses).
    /// </summary>
    public const string StatementConflict = "vex.statement.conflict";

    /// <summary>
    /// VEX status changed for a CVE+product combination.
    /// </summary>
    public const string StatusChanged = "vex.status.changed";
}

/// <summary>
@@ -0,0 +1,313 @@
// <copyright file="VexStatementChangeEvent.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_006_EXCITITOR_vex_change_events (EXC-VEX-001)
// </copyright>

using System.Collections.Immutable;

namespace StellaOps.Excititor.Core.Observations;

/// <summary>
/// Event emitted when a VEX statement changes (added, superseded, or conflict detected).
/// Used to drive policy reanalysis.
/// </summary>
public sealed record VexStatementChangeEvent
{
    /// <summary>
    /// Unique event identifier (deterministic based on content).
    /// </summary>
    public required string EventId { get; init; }

    /// <summary>
    /// Event type from <see cref="VexTimelineEventTypes"/>.
    /// </summary>
    public required string EventType { get; init; }

    /// <summary>
    /// Tenant identifier.
    /// </summary>
    public required string Tenant { get; init; }

    /// <summary>
    /// CVE identifier affected by this change.
    /// </summary>
    public required string VulnerabilityId { get; init; }

    /// <summary>
    /// Product key (PURL or product identifier) affected by this change.
    /// </summary>
    public required string ProductKey { get; init; }

    /// <summary>
    /// New VEX status after this change (e.g., "affected", "not_affected", "under_investigation").
    /// </summary>
    public required string NewStatus { get; init; }

    /// <summary>
    /// Previous VEX status before this change (null for new statements).
    /// </summary>
    public string? PreviousStatus { get; init; }

    /// <summary>
    /// Provider that issued this statement.
    /// </summary>
    public required string ProviderId { get; init; }

    /// <summary>
    /// Observation ID of the statement.
    /// </summary>
    public required string ObservationId { get; init; }

    /// <summary>
    /// Statement ID that supersedes the current one (if applicable).
    /// </summary>
    public string? SupersededBy { get; init; }

    /// <summary>
    /// Statement IDs that this statement supersedes.
    /// </summary>
    public ImmutableArray<string> Supersedes { get; init; } = [];

    /// <summary>
    /// Provenance metadata about the statement source.
    /// </summary>
    public VexStatementProvenance? Provenance { get; init; }

    /// <summary>
    /// Conflict details if this is a conflict event.
    /// </summary>
    public VexConflictDetails? ConflictDetails { get; init; }

    /// <summary>
    /// UTC timestamp when this event occurred.
    /// </summary>
    public required DateTimeOffset OccurredAtUtc { get; init; }

    /// <summary>
    /// Correlation ID for tracing.
    /// </summary>
    public string? TraceId { get; init; }
}

/// <summary>
/// Provenance metadata for a VEX statement change.
/// </summary>
public sealed record VexStatementProvenance
{
    /// <summary>
    /// Source document hash (e.g., OpenVEX document digest).
    /// </summary>
    public string? DocumentHash { get; init; }

    /// <summary>
    /// Source document URI.
    /// </summary>
    public string? DocumentUri { get; init; }

    /// <summary>
    /// Timestamp from the source document.
    /// </summary>
    public DateTimeOffset? SourceTimestamp { get; init; }

    /// <summary>
    /// Author of the statement.
    /// </summary>
    public string? Author { get; init; }

    /// <summary>
    /// Trust score assigned to this provider (0.0-1.0).
    /// </summary>
    public double? TrustScore { get; init; }
}

/// <summary>
/// Details about a VEX statement conflict.
/// </summary>
public sealed record VexConflictDetails
{
    /// <summary>
    /// Type of conflict (status_mismatch, trust_tie, supersession_conflict).
    /// </summary>
    public required string ConflictType { get; init; }

    /// <summary>
    /// Conflicting statuses from different providers.
    /// </summary>
    public required ImmutableArray<VexConflictingStatus> ConflictingStatuses { get; init; }

    /// <summary>
    /// Resolution strategy applied (if any).
    /// </summary>
    public string? ResolutionStrategy { get; init; }

    /// <summary>
    /// Whether the conflict was auto-resolved by policy.
    /// </summary>
    public bool AutoResolved { get; init; }
}

/// <summary>
/// A conflicting status from a specific provider.
/// </summary>
public sealed record VexConflictingStatus
{
    /// <summary>
    /// Provider that issued this status.
    /// </summary>
    public required string ProviderId { get; init; }

    /// <summary>
    /// The status value.
    /// </summary>
    public required string Status { get; init; }

    /// <summary>
    /// Justification for the status.
    /// </summary>
    public string? Justification { get; init; }

    /// <summary>
    /// Trust score of this provider.
    /// </summary>
    public double? TrustScore { get; init; }
}

/// <summary>
/// Factory for creating deterministic VEX statement change events.
/// </summary>
public static class VexStatementChangeEventFactory
{
    /// <summary>
    /// Creates a statement added event with a deterministic event ID.
    /// </summary>
    public static VexStatementChangeEvent CreateStatementAdded(
        string tenant,
        string vulnerabilityId,
        string productKey,
        string status,
        string providerId,
        string observationId,
        DateTimeOffset occurredAtUtc,
        VexStatementProvenance? provenance = null,
        string? traceId = null)
    {
        // Deterministic event ID based on content
        var eventId = ComputeEventId(
            VexTimelineEventTypes.StatementAdded,
            tenant,
            vulnerabilityId,
            productKey,
            observationId,
            occurredAtUtc);

        return new VexStatementChangeEvent
        {
            EventId = eventId,
            EventType = VexTimelineEventTypes.StatementAdded,
            Tenant = tenant,
            VulnerabilityId = vulnerabilityId,
            ProductKey = productKey,
            NewStatus = status,
            PreviousStatus = null,
            ProviderId = providerId,
            ObservationId = observationId,
            Provenance = provenance,
            OccurredAtUtc = occurredAtUtc,
            TraceId = traceId
        };
    }

    /// <summary>
    /// Creates a statement superseded event with a deterministic event ID.
    /// </summary>
    public static VexStatementChangeEvent CreateStatementSuperseded(
        string tenant,
        string vulnerabilityId,
        string productKey,
        string newStatus,
        string? previousStatus,
        string providerId,
        string observationId,
        string supersededBy,
        DateTimeOffset occurredAtUtc,
        VexStatementProvenance? provenance = null,
        string? traceId = null)
    {
        var eventId = ComputeEventId(
            VexTimelineEventTypes.StatementSuperseded,
            tenant,
            vulnerabilityId,
            productKey,
            observationId,
            occurredAtUtc);

        return new VexStatementChangeEvent
        {
            EventId = eventId,
            EventType = VexTimelineEventTypes.StatementSuperseded,
            Tenant = tenant,
            VulnerabilityId = vulnerabilityId,
            ProductKey = productKey,
            NewStatus = newStatus,
            PreviousStatus = previousStatus,
            ProviderId = providerId,
            ObservationId = observationId,
            SupersededBy = supersededBy,
            Provenance = provenance,
            OccurredAtUtc = occurredAtUtc,
            TraceId = traceId
        };
    }

    /// <summary>
    /// Creates a conflict detected event with a deterministic event ID.
    /// </summary>
    public static VexStatementChangeEvent CreateConflictDetected(
        string tenant,
        string vulnerabilityId,
        string productKey,
        string providerId,
        string observationId,
        VexConflictDetails conflictDetails,
        DateTimeOffset occurredAtUtc,
        string? traceId = null)
    {
        var eventId = ComputeEventId(
            VexTimelineEventTypes.StatementConflict,
            tenant,
            vulnerabilityId,
            productKey,
            observationId,
            occurredAtUtc);

        return new VexStatementChangeEvent
        {
            EventId = eventId,
            EventType = VexTimelineEventTypes.StatementConflict,
            Tenant = tenant,
            VulnerabilityId = vulnerabilityId,
            ProductKey = productKey,
            NewStatus = "conflict",
            ProviderId = providerId,
            ObservationId = observationId,
            ConflictDetails = conflictDetails,
            OccurredAtUtc = occurredAtUtc,
            TraceId = traceId
        };
    }

    private static string ComputeEventId(
        string eventType,
        string tenant,
        string vulnerabilityId,
        string productKey,
        string observationId,
        DateTimeOffset occurredAtUtc)
    {
        // Use SHA256 for deterministic event IDs
        var input = $"{eventType}|{tenant}|{vulnerabilityId}|{productKey}|{observationId}|{occurredAtUtc:O}";
        var hash = System.Security.Cryptography.SHA256.HashData(
            System.Text.Encoding.UTF8.GetBytes(input));
        return $"evt-{Convert.ToHexStringLower(hash)[..16]}";
    }
}
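For reference, a minimal usage sketch of the factory above; it is not part of the change and the argument values are hypothetical.

    // Illustrative usage sketch (not part of the diff); values are hypothetical.
    var evt = VexStatementChangeEventFactory.CreateStatementAdded(
        tenant: "tenant-a",
        vulnerabilityId: "CVE-2026-0001",
        productKey: "pkg:npm/example@1.2.3",
        status: "not_affected",
        providerId: "vendor-x",
        observationId: "obs-123",
        occurredAtUtc: DateTimeOffset.Parse("2026-01-14T00:00:00Z"));
    // The same inputs always yield the same EventId, so re-ingested statements dedupe cleanly.
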
@@ -44,7 +44,17 @@ public sealed record RekorEntryRefDto(
    long? LogIndex = null,
    string? LogId = null,
    string? Uuid = null,
    long? IntegratedTime = null,
    /// <summary>
    /// Rekor integrated time as RFC3339 timestamp (ISO 8601 format).
    /// Sprint: SPRINT_20260112_004_FINDINGS_evidence_graph_rekor_time (FIND-REKOR-002)
    /// </summary>
    DateTimeOffset? IntegratedTimeRfc3339 = null,
    /// <summary>
    /// Full URL to the Rekor entry for UI linking.
    /// Sprint: SPRINT_20260112_004_FINDINGS_evidence_graph_rekor_time (FIND-REKOR-002)
    /// </summary>
    string? EntryUrl = null);

/// <summary>
/// Result of attestation verification.

@@ -183,11 +193,14 @@ public static class AttestationPointerMappings

    public static RekorEntryRef ToModel(this RekorEntryRefDto dto)
    {
        // Sprint: SPRINT_20260112_004_FINDINGS_evidence_graph_rekor_time (FIND-REKOR-002)
        return new RekorEntryRef(
            dto.LogIndex,
            dto.LogId,
            dto.Uuid,
            dto.IntegratedTime,
            dto.IntegratedTimeRfc3339,
            dto.EntryUrl);
    }

    public static VerificationResult ToModel(this VerificationResultDto dto)

@@ -253,11 +266,14 @@ public static class AttestationPointerMappings

    public static RekorEntryRefDto ToDto(this RekorEntryRef model)
    {
        // Sprint: SPRINT_20260112_004_FINDINGS_evidence_graph_rekor_time (FIND-REKOR-002)
        return new RekorEntryRefDto(
            model.LogIndex,
            model.LogId,
            model.Uuid,
            model.IntegratedTime,
            model.IntegratedTimeRfc3339,
            model.EntryUrl);
    }

    public static VerificationResultDto ToDto(this VerificationResult model)
@@ -155,6 +155,126 @@ public sealed record EvidenceWeightedScoreResponse
    /// Whether this result came from cache.
    /// </summary>
    public bool FromCache { get; init; }

    // Sprint: SPRINT_20260112_004_BE_findings_scoring_attested_reduction (EWS-API-001)

    /// <summary>
    /// Reduction profile metadata when attested reduction is active.
    /// </summary>
    public ReductionProfileDto? ReductionProfile { get; init; }

    /// <summary>
    /// Whether this finding has a hard-fail status (must be addressed).
    /// </summary>
    public bool HardFail { get; init; }

    /// <summary>
    /// Reason for short-circuit if score was set to 0 due to attested evidence.
    /// </summary>
    public string? ShortCircuitReason { get; init; }

    /// <summary>
    /// Anchor metadata for the evidence used in scoring.
    /// </summary>
    public EvidenceAnchorDto? Anchor { get; init; }
}

/// <summary>
/// Reduction profile metadata for attested scoring.
/// Sprint: SPRINT_20260112_004_BE_findings_scoring_attested_reduction (EWS-API-001)
/// </summary>
public sealed record ReductionProfileDto
{
    /// <summary>
    /// Whether reduction mode is enabled.
    /// </summary>
    [JsonPropertyName("enabled")]
    public required bool Enabled { get; init; }

    /// <summary>
    /// Reduction mode (e.g., "aggressive", "conservative", "custom").
    /// </summary>
    [JsonPropertyName("mode")]
    public string? Mode { get; init; }

    /// <summary>
    /// Policy profile ID used.
    /// </summary>
    [JsonPropertyName("profileId")]
    public string? ProfileId { get; init; }

    /// <summary>
    /// Maximum reduction percentage allowed.
    /// </summary>
    [JsonPropertyName("maxReductionPercent")]
    public int? MaxReductionPercent { get; init; }

    /// <summary>
    /// Whether VEX anchoring is required.
    /// </summary>
    [JsonPropertyName("requireVexAnchoring")]
    public bool RequireVexAnchoring { get; init; }

    /// <summary>
    /// Whether Rekor verification is required.
    /// </summary>
    [JsonPropertyName("requireRekorVerification")]
    public bool RequireRekorVerification { get; init; }
}

/// <summary>
/// Evidence anchor metadata for attested scoring.
/// Sprint: SPRINT_20260112_004_BE_findings_scoring_attested_reduction (EWS-API-001)
/// </summary>
public sealed record EvidenceAnchorDto
{
    /// <summary>
    /// Whether the evidence is anchored (has attestation).
    /// </summary>
    [JsonPropertyName("anchored")]
    public required bool Anchored { get; init; }

    /// <summary>
    /// DSSE envelope digest if anchored.
    /// </summary>
    [JsonPropertyName("envelopeDigest")]
    public string? EnvelopeDigest { get; init; }

    /// <summary>
    /// Predicate type of the attestation.
    /// </summary>
    [JsonPropertyName("predicateType")]
    public string? PredicateType { get; init; }

    /// <summary>
    /// Rekor log index if transparency-anchored.
    /// </summary>
    [JsonPropertyName("rekorLogIndex")]
    public long? RekorLogIndex { get; init; }

    /// <summary>
    /// Rekor entry ID if transparency-anchored.
    /// </summary>
    [JsonPropertyName("rekorEntryId")]
    public string? RekorEntryId { get; init; }

    /// <summary>
    /// Scope of the attestation.
    /// </summary>
    [JsonPropertyName("scope")]
    public string? Scope { get; init; }

    /// <summary>
    /// Verification status of the anchor.
    /// </summary>
    [JsonPropertyName("verified")]
    public bool? Verified { get; init; }

    /// <summary>
    /// When the attestation was created.
    /// </summary>
    [JsonPropertyName("attestedAt")]
    public DateTimeOffset? AttestedAt { get; init; }
}

/// <summary>
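As a quick orientation, a sketch of how the new attested-reduction DTOs above might be populated; this is not part of the change and all values are hypothetical.

    // Illustrative sketch (not part of the diff); values are hypothetical.
    var profile = new ReductionProfileDto
    {
        Enabled = true,
        Mode = "conservative",
        MaxReductionPercent = 50,
        RequireVexAnchoring = true,
        RequireRekorVerification = true
    };

    var anchor = new EvidenceAnchorDto
    {
        Anchored = true,
        EnvelopeDigest = "sha256:0f3a9b",
        PredicateType = "https://slsa.dev/provenance/v1",
        RekorLogIndex = 42,
        Verified = true
    };
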
@@ -73,7 +73,50 @@ public sealed record RekorEntryRef(
    long? LogIndex = null,
    string? LogId = null,
    string? Uuid = null,
    long? IntegratedTime = null,
    /// <summary>
    /// Rekor integrated time as RFC3339 timestamp (ISO 8601 format).
    /// Sprint: SPRINT_20260112_004_FINDINGS_evidence_graph_rekor_time (FIND-REKOR-001)
    /// </summary>
    DateTimeOffset? IntegratedTimeRfc3339 = null,
    /// <summary>
    /// Full URL to the Rekor entry for UI linking.
    /// Sprint: SPRINT_20260112_004_FINDINGS_evidence_graph_rekor_time (FIND-REKOR-001)
    /// </summary>
    string? EntryUrl = null)
{
    /// <summary>
    /// Gets the integrated time as DateTimeOffset.
    /// Prioritizes IntegratedTimeRfc3339 if set, otherwise converts IntegratedTime from Unix epoch.
    /// </summary>
    public DateTimeOffset? GetIntegratedTimeAsDateTime()
    {
        if (IntegratedTimeRfc3339.HasValue)
            return IntegratedTimeRfc3339;

        if (IntegratedTime.HasValue)
            return DateTimeOffset.FromUnixTimeSeconds(IntegratedTime.Value);

        return null;
    }

    /// <summary>
    /// Gets the Rekor entry URL, constructing from UUID if not explicitly set.
    /// </summary>
    public string? GetEntryUrl(string rekorBaseUrl = "https://rekor.sigstore.dev")
    {
        if (!string.IsNullOrEmpty(EntryUrl))
            return EntryUrl;

        if (!string.IsNullOrEmpty(Uuid))
            return $"{rekorBaseUrl}/api/v1/log/entries/{Uuid}";

        if (!string.IsNullOrEmpty(LogId) && LogIndex.HasValue)
            return $"{rekorBaseUrl}/api/v1/log/entries?logIndex={LogIndex.Value}";

        return null;
    }
};

/// <summary>
/// Result of attestation verification.
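For reference, a minimal sketch of the two helpers above; it is not part of the change and the entry values are hypothetical.

    // Illustrative usage sketch (not part of the diff); entry values are hypothetical.
    var entry = new RekorEntryRef(
        LogIndex: 42,
        LogId: "c0d23d6a",
        Uuid: "24296fb2",
        IntegratedTime: 1767225600);

    var when = entry.GetIntegratedTimeAsDateTime();  // no RFC3339 value set, so the Unix-epoch IntegratedTime is converted
    var url = entry.GetEntryUrl();                   // built from Uuid: https://rekor.sigstore.dev/api/v1/log/entries/24296fb2
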
@@ -0,0 +1,654 @@
// <copyright file="ScmAnnotationContracts.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_006_INTEGRATIONS_scm_annotations (INTEGRATIONS-SCM-001)
// </copyright>

using System.Collections.Immutable;
using System.Text.Json.Serialization;

namespace StellaOps.Integrations.Contracts;

/// <summary>
/// Contract for posting comments to PRs/MRs.
/// </summary>
public sealed record ScmCommentRequest
{
    /// <summary>
    /// Repository owner (organization or user).
    /// </summary>
    [JsonPropertyName("owner")]
    public required string Owner { get; init; }

    /// <summary>
    /// Repository name.
    /// </summary>
    [JsonPropertyName("repo")]
    public required string Repo { get; init; }

    /// <summary>
    /// PR/MR number.
    /// </summary>
    [JsonPropertyName("prNumber")]
    public required int PrNumber { get; init; }

    /// <summary>
    /// Comment body (Markdown supported).
    /// </summary>
    [JsonPropertyName("body")]
    public required string Body { get; init; }

    /// <summary>
    /// Optional path for file-level comments.
    /// </summary>
    [JsonPropertyName("path")]
    public string? Path { get; init; }

    /// <summary>
    /// Optional line number for inline comments.
    /// </summary>
    [JsonPropertyName("line")]
    public int? Line { get; init; }

    /// <summary>
    /// Optional commit SHA for positioning.
    /// </summary>
    [JsonPropertyName("commitSha")]
    public string? CommitSha { get; init; }

    /// <summary>
    /// Comment context (e.g., "stellaops-scan", "stellaops-vex").
    /// </summary>
    [JsonPropertyName("context")]
    public string Context { get; init; } = "stellaops";

    /// <summary>
    /// Link to evidence pack or detailed report.
    /// </summary>
    [JsonPropertyName("evidenceUrl")]
    public string? EvidenceUrl { get; init; }

    /// <summary>
    /// Correlation ID for tracing.
    /// </summary>
    [JsonPropertyName("traceId")]
    public string? TraceId { get; init; }
}

/// <summary>
/// Response from posting a comment.
/// </summary>
public sealed record ScmCommentResponse
{
    /// <summary>
    /// Comment ID in the SCM system.
    /// </summary>
    [JsonPropertyName("commentId")]
    public required string CommentId { get; init; }

    /// <summary>
    /// URL to the comment.
    /// </summary>
    [JsonPropertyName("url")]
    public required string Url { get; init; }

    /// <summary>
    /// When the comment was created.
    /// </summary>
    [JsonPropertyName("createdAt")]
    public required DateTimeOffset CreatedAt { get; init; }

    /// <summary>
    /// Whether the comment was created or updated.
    /// </summary>
    [JsonPropertyName("wasUpdated")]
    public bool WasUpdated { get; init; }
}

/// <summary>
/// Contract for posting commit/PR status checks.
/// </summary>
public sealed record ScmStatusRequest
{
    /// <summary>
    /// Repository owner.
    /// </summary>
    [JsonPropertyName("owner")]
    public required string Owner { get; init; }

    /// <summary>
    /// Repository name.
    /// </summary>
    [JsonPropertyName("repo")]
    public required string Repo { get; init; }

    /// <summary>
    /// Commit SHA to post status on.
    /// </summary>
    [JsonPropertyName("commitSha")]
    public required string CommitSha { get; init; }

    /// <summary>
    /// Status state.
    /// </summary>
    [JsonPropertyName("state")]
    public required ScmStatusState State { get; init; }

    /// <summary>
    /// Context name (e.g., "stellaops/security-scan").
    /// </summary>
    [JsonPropertyName("context")]
    public required string Context { get; init; }

    /// <summary>
    /// Short description of the status.
    /// </summary>
    [JsonPropertyName("description")]
    public required string Description { get; init; }

    /// <summary>
    /// URL for more details.
    /// </summary>
    [JsonPropertyName("targetUrl")]
    public string? TargetUrl { get; init; }

    /// <summary>
    /// Link to evidence pack.
    /// </summary>
    [JsonPropertyName("evidenceUrl")]
    public string? EvidenceUrl { get; init; }

    /// <summary>
    /// Correlation ID for tracing.
    /// </summary>
    [JsonPropertyName("traceId")]
    public string? TraceId { get; init; }
}

/// <summary>
/// Status check states.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum ScmStatusState
{
    /// <summary>Status check is pending.</summary>
    Pending,

    /// <summary>Status check passed.</summary>
    Success,

    /// <summary>Status check failed.</summary>
    Failure,

    /// <summary>Status check errored.</summary>
    Error
}

/// <summary>
/// Response from posting a status check.
/// </summary>
public sealed record ScmStatusResponse
{
    /// <summary>
    /// Status ID in the SCM system.
    /// </summary>
    [JsonPropertyName("statusId")]
    public required string StatusId { get; init; }

    /// <summary>
    /// State that was set.
    /// </summary>
    [JsonPropertyName("state")]
    public required ScmStatusState State { get; init; }

    /// <summary>
    /// URL to the status check.
    /// </summary>
    [JsonPropertyName("url")]
    public string? Url { get; init; }

    /// <summary>
    /// When the status was created/updated.
    /// </summary>
    [JsonPropertyName("createdAt")]
    public required DateTimeOffset CreatedAt { get; init; }
}

/// <summary>
/// Contract for creating check runs (GitHub-specific, richer than status checks).
/// </summary>
public sealed record ScmCheckRunRequest
{
    /// <summary>
    /// Repository owner.
    /// </summary>
    [JsonPropertyName("owner")]
    public required string Owner { get; init; }

    /// <summary>
    /// Repository name.
    /// </summary>
    [JsonPropertyName("repo")]
    public required string Repo { get; init; }

    /// <summary>
    /// Check run name.
    /// </summary>
    [JsonPropertyName("name")]
    public required string Name { get; init; }

    /// <summary>
    /// Head SHA to associate with.
    /// </summary>
    [JsonPropertyName("headSha")]
    public required string HeadSha { get; init; }

    /// <summary>
    /// Check run status.
    /// </summary>
    [JsonPropertyName("status")]
    public required ScmCheckRunStatus Status { get; init; }

    /// <summary>
    /// Conclusion (required when status is completed).
    /// </summary>
    [JsonPropertyName("conclusion")]
    public ScmCheckRunConclusion? Conclusion { get; init; }

    /// <summary>
    /// Title for the check run output.
    /// </summary>
    [JsonPropertyName("title")]
    public string? Title { get; init; }

    /// <summary>
    /// Summary (Markdown).
    /// </summary>
    [JsonPropertyName("summary")]
    public string? Summary { get; init; }

    /// <summary>
    /// Detailed text (Markdown).
    /// </summary>
    [JsonPropertyName("text")]
    public string? Text { get; init; }

    /// <summary>
    /// Annotations to add to the check run.
    /// </summary>
    [JsonPropertyName("annotations")]
    public ImmutableArray<ScmCheckRunAnnotation> Annotations { get; init; } = [];

    /// <summary>
    /// Link to evidence pack.
    /// </summary>
    [JsonPropertyName("evidenceUrl")]
    public string? EvidenceUrl { get; init; }

    /// <summary>
    /// Correlation ID for tracing.
    /// </summary>
    [JsonPropertyName("traceId")]
    public string? TraceId { get; init; }
}

/// <summary>
/// Check run status.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum ScmCheckRunStatus
{
    /// <summary>Check run is queued.</summary>
    Queued,

    /// <summary>Check run is in progress.</summary>
    InProgress,

    /// <summary>Check run is completed.</summary>
    Completed
}

/// <summary>
/// Check run conclusion.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum ScmCheckRunConclusion
{
    /// <summary>Action required.</summary>
    ActionRequired,

    /// <summary>Cancelled.</summary>
    Cancelled,

    /// <summary>Failed.</summary>
    Failure,

    /// <summary>Neutral.</summary>
    Neutral,

    /// <summary>Success.</summary>
    Success,

    /// <summary>Skipped.</summary>
    Skipped,

    /// <summary>Stale.</summary>
    Stale,

    /// <summary>Timed out.</summary>
    TimedOut
}

/// <summary>
/// Annotation for a check run.
/// </summary>
public sealed record ScmCheckRunAnnotation
{
    /// <summary>
    /// File path relative to repository root.
    /// </summary>
    [JsonPropertyName("path")]
    public required string Path { get; init; }

    /// <summary>
    /// Start line number.
    /// </summary>
    [JsonPropertyName("startLine")]
    public required int StartLine { get; init; }

    /// <summary>
    /// End line number.
    /// </summary>
    [JsonPropertyName("endLine")]
    public required int EndLine { get; init; }

    /// <summary>
    /// Annotation level.
    /// </summary>
    [JsonPropertyName("level")]
    public required ScmAnnotationLevel Level { get; init; }

    /// <summary>
    /// Annotation message.
    /// </summary>
    [JsonPropertyName("message")]
    public required string Message { get; init; }

    /// <summary>
    /// Title for the annotation.
    /// </summary>
    [JsonPropertyName("title")]
    public string? Title { get; init; }

    /// <summary>
    /// Raw details (not rendered).
    /// </summary>
    [JsonPropertyName("rawDetails")]
    public string? RawDetails { get; init; }
}

/// <summary>
/// Annotation severity level.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum ScmAnnotationLevel
{
    /// <summary>Notice level.</summary>
    Notice,

    /// <summary>Warning level.</summary>
    Warning,

    /// <summary>Failure level.</summary>
    Failure
}

/// <summary>
/// Response from creating a check run.
/// </summary>
public sealed record ScmCheckRunResponse
{
    /// <summary>
    /// Check run ID.
    /// </summary>
    [JsonPropertyName("checkRunId")]
    public required string CheckRunId { get; init; }

    /// <summary>
    /// URL to the check run.
    /// </summary>
    [JsonPropertyName("url")]
    public required string Url { get; init; }

    /// <summary>
    /// HTML URL for the check run.
    /// </summary>
    [JsonPropertyName("htmlUrl")]
    public string? HtmlUrl { get; init; }

    /// <summary>
    /// Status that was set.
    /// </summary>
    [JsonPropertyName("status")]
    public required ScmCheckRunStatus Status { get; init; }

    /// <summary>
    /// Conclusion if completed.
    /// </summary>
    [JsonPropertyName("conclusion")]
    public ScmCheckRunConclusion? Conclusion { get; init; }

    /// <summary>
    /// When the check run started.
    /// </summary>
    [JsonPropertyName("startedAt")]
    public DateTimeOffset? StartedAt { get; init; }

    /// <summary>
    /// When the check run completed.
    /// </summary>
    [JsonPropertyName("completedAt")]
    public DateTimeOffset? CompletedAt { get; init; }

    /// <summary>
    /// Number of annotations posted.
    /// </summary>
    [JsonPropertyName("annotationCount")]
    public int AnnotationCount { get; init; }
}

// Sprint: SPRINT_20260112_006_INTEGRATIONS_scm_annotations (INTEGRATIONS-SCM-002)

/// <summary>
/// Contract for updating an existing check run.
/// </summary>
public sealed record ScmCheckRunUpdateRequest
{
    /// <summary>
    /// Repository owner.
    /// </summary>
    [JsonPropertyName("owner")]
    public required string Owner { get; init; }

    /// <summary>
    /// Repository name.
    /// </summary>
    [JsonPropertyName("repo")]
    public required string Repo { get; init; }

    /// <summary>
    /// Check run ID to update.
    /// </summary>
    [JsonPropertyName("checkRunId")]
    public required string CheckRunId { get; init; }

    /// <summary>
    /// Updated name (optional).
    /// </summary>
    [JsonPropertyName("name")]
    public string? Name { get; init; }

    /// <summary>
    /// Updated status (optional).
    /// </summary>
    [JsonPropertyName("status")]
    public ScmCheckRunStatus? Status { get; init; }

    /// <summary>
    /// Conclusion (required when status is completed).
    /// </summary>
    [JsonPropertyName("conclusion")]
    public ScmCheckRunConclusion? Conclusion { get; init; }

    /// <summary>
    /// When the check run completed.
    /// </summary>
    [JsonPropertyName("completedAt")]
    public DateTimeOffset? CompletedAt { get; init; }

    /// <summary>
    /// Updated title.
    /// </summary>
    [JsonPropertyName("title")]
    public string? Title { get; init; }

    /// <summary>
    /// Updated summary.
    /// </summary>
    [JsonPropertyName("summary")]
    public string? Summary { get; init; }

    /// <summary>
    /// Updated text body.
    /// </summary>
    [JsonPropertyName("text")]
    public string? Text { get; init; }

    /// <summary>
    /// Additional annotations.
    /// </summary>
    [JsonPropertyName("annotations")]
    public IReadOnlyList<ScmCheckRunAnnotation>? Annotations { get; init; }

    /// <summary>
    /// URL for more details.
    /// </summary>
    [JsonPropertyName("detailsUrl")]
    public string? DetailsUrl { get; init; }

    /// <summary>
    /// Link to evidence pack.
    /// </summary>
    [JsonPropertyName("evidenceUrl")]
    public string? EvidenceUrl { get; init; }

    /// <summary>
    /// Correlation ID for tracing.
    /// </summary>
    [JsonPropertyName("traceId")]
    public string? TraceId { get; init; }
}

/// <summary>
/// Interface for SCM annotation clients.
/// </summary>
public interface IScmAnnotationClient
{
    /// <summary>
    /// Posts a comment to a PR/MR.
    /// </summary>
    Task<ScmOperationResult<ScmCommentResponse>> PostCommentAsync(
        ScmCommentRequest request,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Posts a commit status.
    /// </summary>
    Task<ScmOperationResult<ScmStatusResponse>> PostStatusAsync(
        ScmStatusRequest request,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Creates a check run (GitHub Apps only).
    /// </summary>
    Task<ScmOperationResult<ScmCheckRunResponse>> CreateCheckRunAsync(
        ScmCheckRunRequest request,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Updates an existing check run.
    /// </summary>
    Task<ScmOperationResult<ScmCheckRunResponse>> UpdateCheckRunAsync(
        ScmCheckRunUpdateRequest request,
        CancellationToken cancellationToken = default);
}

/// <summary>
/// Result of an offline-safe SCM operation.
/// </summary>
public sealed record ScmOperationResult<T>
{
    /// <summary>
    /// Whether the operation succeeded.
    /// </summary>
    [JsonPropertyName("success")]
    public required bool Success { get; init; }

    /// <summary>
    /// Result data (if successful).
    /// </summary>
    [JsonPropertyName("data")]
    public T? Data { get; init; }

    /// <summary>
    /// Error message (if failed).
    /// </summary>
    [JsonPropertyName("error")]
    public string? Error { get; init; }

    /// <summary>
    /// Whether the error is transient and can be retried.
    /// </summary>
    [JsonPropertyName("isTransient")]
    public bool IsTransient { get; init; }

    /// <summary>
    /// Whether the operation was queued for later (offline mode).
    /// </summary>
    [JsonPropertyName("queued")]
    public bool Queued { get; init; }

    /// <summary>
    /// Queue ID if queued.
    /// </summary>
    [JsonPropertyName("queueId")]
    public string? QueueId { get; init; }

    /// <summary>
    /// Creates a successful result.
    /// </summary>
    public static ScmOperationResult<T> Ok(T data) => new()
    {
        Success = true,
        Data = data
    };

    /// <summary>
    /// Creates a failed result.
    /// </summary>
    public static ScmOperationResult<T> Fail(string error, bool isTransient = false) => new()
    {
        Success = false,
        Error = error,
        IsTransient = isTransient
    };

    /// <summary>
    /// Creates a queued result (offline mode).
    /// </summary>
    public static ScmOperationResult<T> QueuedForLater(string queueId) => new()
    {
        Success = false,
        Queued = true,
        QueueId = queueId
    };
}
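For reference, a minimal caller-side sketch of the contracts above, showing how the queued/transient flags on `ScmOperationResult<T>` are meant to be consumed; it is not part of the change and the owner/repo values are hypothetical.

    // Illustrative usage sketch (not part of the diff); values are hypothetical.
    static async Task ReportAsync(IScmAnnotationClient client, CancellationToken ct)
    {
        var result = await client.PostStatusAsync(new ScmStatusRequest
        {
            Owner = "acme",
            Repo = "payments",
            CommitSha = "0123abcd0123abcd0123abcd0123abcd0123abcd",
            State = ScmStatusState.Success,
            Context = "stellaops/security-scan",
            Description = "No blocking findings"
        }, ct);

        if (!result.Success && result.Queued)
        {
            // Offline mode: the operation was parked for later replay under result.QueueId.
        }
        else if (!result.Success && result.IsTransient)
        {
            // Transient failure: safe to retry.
        }
    }
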
@@ -0,0 +1,562 @@
// <copyright file="GitHubAppAnnotationClient.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_006_INTEGRATIONS_scm_annotations (INTEGRATIONS-SCM-002)
// </copyright>

using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using StellaOps.Integrations.Contracts;
using StellaOps.Integrations.Core;

namespace StellaOps.Integrations.Plugin.GitHubApp;

/// <summary>
/// GitHub App SCM annotation client for PR comments and check runs.
/// </summary>
public sealed class GitHubAppAnnotationClient : IScmAnnotationClient
{
    private readonly HttpClient _httpClient;
    private readonly TimeProvider _timeProvider;
    private readonly IntegrationConfig _config;
    private static readonly JsonSerializerOptions JsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };

    public GitHubAppAnnotationClient(
        HttpClient httpClient,
        IntegrationConfig config,
        TimeProvider? timeProvider = null)
    {
        _httpClient = httpClient ?? throw new ArgumentNullException(nameof(httpClient));
        _config = config ?? throw new ArgumentNullException(nameof(config));
        _timeProvider = timeProvider ?? TimeProvider.System;

        ConfigureHttpClient();
    }

    private void ConfigureHttpClient()
    {
        _httpClient.BaseAddress = new Uri(_config.Endpoint.TrimEnd('/') + "/");
        _httpClient.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/vnd.github+json"));
        _httpClient.DefaultRequestHeaders.Add("X-GitHub-Api-Version", "2022-11-28");
        _httpClient.DefaultRequestHeaders.UserAgent.Add(
            new ProductInfoHeaderValue("StellaOps", "1.0"));

        if (!string.IsNullOrEmpty(_config.ResolvedSecret))
        {
            _httpClient.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", _config.ResolvedSecret);
        }
    }

    /// <inheritdoc />
    public async Task<ScmOperationResult<ScmCommentResponse>> PostCommentAsync(
        ScmCommentRequest request,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(request);

        try
        {
            var url = request.Line.HasValue && !string.IsNullOrEmpty(request.Path)
                ? $"repos/{request.Owner}/{request.Repo}/pulls/{request.PrNumber}/comments"
                : $"repos/{request.Owner}/{request.Repo}/issues/{request.PrNumber}/comments";

            object payload = request.Line.HasValue && !string.IsNullOrEmpty(request.Path)
                ? new GitHubReviewCommentPayload
                {
                    Body = request.Body,
                    Path = request.Path,
                    Line = request.Line.Value,
                    CommitId = request.CommitSha ?? string.Empty
                }
                : new GitHubIssueCommentPayload { Body = request.Body };

            var json = JsonSerializer.Serialize(payload, JsonOptions);
            using var content = new StringContent(json, Encoding.UTF8, "application/json");

            var response = await _httpClient.PostAsync(url, content, cancellationToken);

            if (!response.IsSuccessStatusCode)
            {
                var errorBody = await response.Content.ReadAsStringAsync(cancellationToken);
                return ScmOperationResult<ScmCommentResponse>.Fail(
                    $"GitHub API returned {response.StatusCode}: {TruncateError(errorBody)}",
                    isTransient: IsTransientError(response.StatusCode));
            }

            var responseBody = await response.Content.ReadAsStringAsync(cancellationToken);
            var gitHubComment = JsonSerializer.Deserialize<GitHubCommentResponse>(responseBody, JsonOptions);

            return ScmOperationResult<ScmCommentResponse>.Ok(new ScmCommentResponse
            {
                CommentId = gitHubComment?.Id.ToString() ?? "0",
                Url = gitHubComment?.HtmlUrl ?? string.Empty,
                CreatedAt = gitHubComment?.CreatedAt ?? _timeProvider.GetUtcNow(),
                WasUpdated = false
            });
        }
        catch (HttpRequestException ex)
        {
            return ScmOperationResult<ScmCommentResponse>.Fail(
                $"Network error posting comment: {ex.Message}",
                isTransient: true);
        }
        catch (TaskCanceledException) when (cancellationToken.IsCancellationRequested)
        {
            throw;
        }
        catch (TaskCanceledException ex)
        {
            return ScmOperationResult<ScmCommentResponse>.Fail(
                $"Request timeout: {ex.Message}",
                isTransient: true);
        }
    }

    /// <inheritdoc />
    public async Task<ScmOperationResult<ScmStatusResponse>> PostStatusAsync(
        ScmStatusRequest request,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(request);

        try
        {
            var url = $"repos/{request.Owner}/{request.Repo}/statuses/{request.CommitSha}";

            var payload = new GitHubStatusPayload
            {
                State = MapStatusState(request.State),
                Context = request.Context,
                Description = TruncateDescription(request.Description, 140),
                TargetUrl = request.TargetUrl ?? request.EvidenceUrl
            };

            var json = JsonSerializer.Serialize(payload, JsonOptions);
            using var content = new StringContent(json, Encoding.UTF8, "application/json");

            var response = await _httpClient.PostAsync(url, content, cancellationToken);

            if (!response.IsSuccessStatusCode)
            {
                var errorBody = await response.Content.ReadAsStringAsync(cancellationToken);
                return ScmOperationResult<ScmStatusResponse>.Fail(
                    $"GitHub API returned {response.StatusCode}: {TruncateError(errorBody)}",
                    isTransient: IsTransientError(response.StatusCode));
            }

            var responseBody = await response.Content.ReadAsStringAsync(cancellationToken);
            var gitHubStatus = JsonSerializer.Deserialize<GitHubStatusResponse>(responseBody, JsonOptions);

            return ScmOperationResult<ScmStatusResponse>.Ok(new ScmStatusResponse
            {
                StatusId = gitHubStatus?.Id.ToString() ?? "0",
                State = request.State,
                Url = gitHubStatus?.Url,
                CreatedAt = gitHubStatus?.CreatedAt ?? _timeProvider.GetUtcNow()
            });
        }
        catch (HttpRequestException ex)
        {
            return ScmOperationResult<ScmStatusResponse>.Fail(
                $"Network error posting status: {ex.Message}",
                isTransient: true);
        }
        catch (TaskCanceledException) when (cancellationToken.IsCancellationRequested)
        {
            throw;
        }
        catch (TaskCanceledException ex)
        {
            return ScmOperationResult<ScmStatusResponse>.Fail(
                $"Request timeout: {ex.Message}",
                isTransient: true);
        }
    }
|
|
||||||
|
/// <inheritdoc />
|
||||||
|
public async Task<ScmOperationResult<ScmCheckRunResponse>> CreateCheckRunAsync(
|
||||||
|
ScmCheckRunRequest request,
|
||||||
|
CancellationToken cancellationToken = default)
|
||||||
|
{
|
||||||
|
ArgumentNullException.ThrowIfNull(request);
|
||||||
|
|
||||||
|
try
|
||||||
|
{
|
||||||
|
var url = $"repos/{request.Owner}/{request.Repo}/check-runs";
|
||||||
|
var now = _timeProvider.GetUtcNow();
|
||||||
|
|
||||||
|
var payload = new GitHubCheckRunPayload
|
||||||
|
{
|
||||||
|
Name = request.Name,
|
||||||
|
HeadSha = request.HeadSha,
|
||||||
|
Status = MapCheckRunStatus(request.Status),
|
||||||
|
Conclusion = request.Conclusion.HasValue ? MapCheckRunConclusion(request.Conclusion.Value) : null,
|
||||||
|
StartedAt = now,
|
||||||
|
CompletedAt = request.Status == ScmCheckRunStatus.Completed ? now : null,
|
||||||
|
DetailsUrl = request.EvidenceUrl,
|
||||||
|
Output = request.Summary != null || request.Text != null || request.Annotations.Length > 0
|
||||||
|
? new GitHubCheckRunOutput
|
||||||
|
{
|
||||||
|
Title = request.Title ?? request.Name,
|
||||||
|
Summary = request.Summary ?? string.Empty,
|
||||||
|
Text = request.Text,
|
||||||
|
Annotations = request.Annotations.Length > 0
|
||||||
|
? request.Annotations.Select(a => new GitHubCheckRunAnnotation
|
||||||
|
{
|
||||||
|
Path = a.Path,
|
||||||
|
StartLine = a.StartLine,
|
||||||
|
EndLine = a.EndLine,
|
||||||
|
AnnotationLevel = MapAnnotationLevel(a.Level),
|
||||||
|
Message = a.Message,
|
||||||
|
Title = a.Title,
|
||||||
|
RawDetails = a.RawDetails
|
||||||
|
}).ToList()
|
||||||
|
: null
|
||||||
|
}
|
||||||
|
: null
|
||||||
|
};
|
||||||
|
|
||||||
|
var json = JsonSerializer.Serialize(payload, JsonOptions);
|
||||||
|
using var content = new StringContent(json, Encoding.UTF8, "application/json");
|
||||||
|
|
||||||
|
var response = await _httpClient.PostAsync(url, content, cancellationToken);
|
||||||
|
|
||||||
|
if (!response.IsSuccessStatusCode)
|
||||||
|
{
|
||||||
|
var errorBody = await response.Content.ReadAsStringAsync(cancellationToken);
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Fail(
|
||||||
|
$"GitHub API returned {response.StatusCode}: {TruncateError(errorBody)}",
|
||||||
|
isTransient: IsTransientError(response.StatusCode));
|
||||||
|
}
|
||||||
|
|
||||||
|
var responseBody = await response.Content.ReadAsStringAsync(cancellationToken);
|
||||||
|
var gitHubCheckRun = JsonSerializer.Deserialize<GitHubCheckRunResponse>(responseBody, JsonOptions);
|
||||||
|
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Ok(new ScmCheckRunResponse
|
||||||
|
{
|
||||||
|
CheckRunId = gitHubCheckRun?.Id.ToString() ?? "0",
|
||||||
|
Url = gitHubCheckRun?.HtmlUrl ?? string.Empty,
|
||||||
|
Status = request.Status,
|
||||||
|
Conclusion = request.Conclusion,
|
||||||
|
StartedAt = gitHubCheckRun?.StartedAt,
|
||||||
|
CompletedAt = gitHubCheckRun?.CompletedAt,
|
||||||
|
AnnotationCount = request.Annotations.Length
|
||||||
|
});
|
||||||
|
}
|
||||||
|
catch (HttpRequestException ex)
|
||||||
|
{
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Fail(
|
||||||
|
$"Network error creating check run: {ex.Message}",
|
||||||
|
isTransient: true);
|
||||||
|
}
|
||||||
|
catch (TaskCanceledException) when (cancellationToken.IsCancellationRequested)
|
||||||
|
{
|
||||||
|
throw;
|
||||||
|
}
|
||||||
|
catch (TaskCanceledException ex)
|
||||||
|
{
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Fail(
|
||||||
|
$"Request timeout: {ex.Message}",
|
||||||
|
isTransient: true);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/// <inheritdoc />
|
||||||
|
public async Task<ScmOperationResult<ScmCheckRunResponse>> UpdateCheckRunAsync(
|
||||||
|
ScmCheckRunUpdateRequest request,
|
||||||
|
CancellationToken cancellationToken = default)
|
||||||
|
{
|
||||||
|
ArgumentNullException.ThrowIfNull(request);
|
||||||
|
|
||||||
|
try
|
||||||
|
{
|
||||||
|
var url = $"repos/{request.Owner}/{request.Repo}/check-runs/{request.CheckRunId}";
|
||||||
|
var hasAnnotations = request.Annotations?.Count > 0;
|
||||||
|
|
||||||
|
var payload = new GitHubCheckRunPayload
|
||||||
|
{
|
||||||
|
Name = request.Name,
|
||||||
|
Status = request.Status.HasValue ? MapCheckRunStatus(request.Status.Value) : null,
|
||||||
|
Conclusion = request.Conclusion.HasValue ? MapCheckRunConclusion(request.Conclusion.Value) : null,
|
||||||
|
CompletedAt = request.CompletedAt,
|
||||||
|
DetailsUrl = request.DetailsUrl ?? request.EvidenceUrl,
|
||||||
|
Output = request.Summary != null || request.Text != null || hasAnnotations
|
||||||
|
? new GitHubCheckRunOutput
|
||||||
|
{
|
||||||
|
Title = request.Title ?? request.Name ?? "StellaOps Check",
|
||||||
|
Summary = request.Summary ?? string.Empty,
|
||||||
|
Text = request.Text,
|
||||||
|
Annotations = hasAnnotations
|
||||||
|
? request.Annotations!.Select(a => new GitHubCheckRunAnnotation
|
||||||
|
{
|
||||||
|
Path = a.Path,
|
||||||
|
StartLine = a.StartLine,
|
||||||
|
EndLine = a.EndLine,
|
||||||
|
AnnotationLevel = MapAnnotationLevel(a.Level),
|
||||||
|
Message = a.Message,
|
||||||
|
Title = a.Title,
|
||||||
|
RawDetails = a.RawDetails
|
||||||
|
}).ToList()
|
||||||
|
: null
|
||||||
|
}
|
||||||
|
: null
|
||||||
|
};
|
||||||
|
|
||||||
|
var json = JsonSerializer.Serialize(payload, JsonOptions);
|
||||||
|
using var content = new StringContent(json, Encoding.UTF8, "application/json");
|
||||||
|
|
||||||
|
var httpRequest = new HttpRequestMessage(new HttpMethod("PATCH"), url)
|
||||||
|
{
|
||||||
|
Content = content
|
||||||
|
};
|
||||||
|
|
||||||
|
var response = await _httpClient.SendAsync(httpRequest, cancellationToken);
|
||||||
|
|
||||||
|
if (!response.IsSuccessStatusCode)
|
||||||
|
{
|
||||||
|
var errorBody = await response.Content.ReadAsStringAsync(cancellationToken);
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Fail(
|
||||||
|
$"GitHub API returned {response.StatusCode}: {TruncateError(errorBody)}",
|
||||||
|
isTransient: IsTransientError(response.StatusCode));
|
||||||
|
}
|
||||||
|
|
||||||
|
var responseBody = await response.Content.ReadAsStringAsync(cancellationToken);
|
||||||
|
var gitHubCheckRun = JsonSerializer.Deserialize<GitHubCheckRunResponse>(responseBody, JsonOptions);
|
||||||
|
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Ok(new ScmCheckRunResponse
|
||||||
|
{
|
||||||
|
CheckRunId = gitHubCheckRun?.Id.ToString() ?? request.CheckRunId,
|
||||||
|
Url = gitHubCheckRun?.HtmlUrl ?? string.Empty,
|
||||||
|
Status = request.Status ?? ScmCheckRunStatus.Completed,
|
||||||
|
Conclusion = request.Conclusion,
|
||||||
|
StartedAt = gitHubCheckRun?.StartedAt,
|
||||||
|
CompletedAt = gitHubCheckRun?.CompletedAt,
|
||||||
|
AnnotationCount = request.Annotations?.Count ?? 0
|
||||||
|
});
|
||||||
|
}
|
||||||
|
catch (HttpRequestException ex)
|
||||||
|
{
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Fail(
|
||||||
|
$"Network error updating check run: {ex.Message}",
|
||||||
|
isTransient: true);
|
||||||
|
}
|
||||||
|
catch (TaskCanceledException) when (cancellationToken.IsCancellationRequested)
|
||||||
|
{
|
||||||
|
throw;
|
||||||
|
}
|
||||||
|
catch (TaskCanceledException ex)
|
||||||
|
{
|
||||||
|
return ScmOperationResult<ScmCheckRunResponse>.Fail(
|
||||||
|
$"Request timeout: {ex.Message}",
|
||||||
|
isTransient: true);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
#region Mapping Helpers
|
||||||
|
|
||||||
|
private static string MapStatusState(ScmStatusState state) => state switch
|
||||||
|
{
|
||||||
|
ScmStatusState.Pending => "pending",
|
||||||
|
ScmStatusState.Success => "success",
|
||||||
|
ScmStatusState.Failure => "failure",
|
||||||
|
ScmStatusState.Error => "error",
|
||||||
|
_ => "pending"
|
||||||
|
};
|
||||||
|
|
||||||
|
private static string MapCheckRunStatus(ScmCheckRunStatus status) => status switch
|
||||||
|
{
|
||||||
|
ScmCheckRunStatus.Queued => "queued",
|
||||||
|
ScmCheckRunStatus.InProgress => "in_progress",
|
||||||
|
ScmCheckRunStatus.Completed => "completed",
|
||||||
|
_ => "queued"
|
||||||
|
};
|
||||||
|
|
||||||
|
private static string MapCheckRunConclusion(ScmCheckRunConclusion conclusion) => conclusion switch
|
||||||
|
{
|
||||||
|
ScmCheckRunConclusion.Success => "success",
|
||||||
|
ScmCheckRunConclusion.Failure => "failure",
|
||||||
|
ScmCheckRunConclusion.Neutral => "neutral",
|
||||||
|
ScmCheckRunConclusion.Cancelled => "cancelled",
|
||||||
|
ScmCheckRunConclusion.Skipped => "skipped",
|
||||||
|
ScmCheckRunConclusion.TimedOut => "timed_out",
|
||||||
|
ScmCheckRunConclusion.ActionRequired => "action_required",
|
||||||
|
_ => "neutral"
|
||||||
|
};
|
||||||
|
|
||||||
|
private static string MapAnnotationLevel(ScmAnnotationLevel level) => level switch
|
||||||
|
{
|
||||||
|
ScmAnnotationLevel.Notice => "notice",
|
||||||
|
ScmAnnotationLevel.Warning => "warning",
|
||||||
|
ScmAnnotationLevel.Failure => "failure",
|
||||||
|
_ => "notice"
|
||||||
|
};
|
||||||
|
|
||||||
|
private static bool IsTransientError(System.Net.HttpStatusCode statusCode) =>
|
||||||
|
statusCode is System.Net.HttpStatusCode.TooManyRequests
|
||||||
|
or System.Net.HttpStatusCode.ServiceUnavailable
|
||||||
|
or System.Net.HttpStatusCode.GatewayTimeout
|
||||||
|
or System.Net.HttpStatusCode.BadGateway;
|
||||||
|
|
||||||
|
private static string TruncateError(string error) =>
|
||||||
|
error.Length > 200 ? error[..200] + "..." : error;
|
||||||
|
|
||||||
|
private static string TruncateDescription(string description, int maxLength) =>
|
||||||
|
description.Length > maxLength ? description[..(maxLength - 3)] + "..." : description;
|
||||||
|
|
||||||
|
#endregion
|
||||||
|
|
||||||
|
#region GitHub API DTOs
|
||||||
|
|
||||||
|
private sealed record GitHubIssueCommentPayload
|
||||||
|
{
|
||||||
|
[JsonPropertyName("body")]
|
||||||
|
public required string Body { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubReviewCommentPayload
|
||||||
|
{
|
||||||
|
[JsonPropertyName("body")]
|
||||||
|
public required string Body { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("path")]
|
||||||
|
public required string Path { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("line")]
|
||||||
|
public required int Line { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("commit_id")]
|
||||||
|
public required string CommitId { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubCommentResponse
|
||||||
|
{
|
||||||
|
[JsonPropertyName("id")]
|
||||||
|
public long Id { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("html_url")]
|
||||||
|
public string? HtmlUrl { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("created_at")]
|
||||||
|
public DateTimeOffset CreatedAt { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubStatusPayload
|
||||||
|
{
|
||||||
|
[JsonPropertyName("state")]
|
||||||
|
public required string State { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("context")]
|
||||||
|
public required string Context { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("description")]
|
||||||
|
public required string Description { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("target_url")]
|
||||||
|
public string? TargetUrl { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubStatusResponse
|
||||||
|
{
|
||||||
|
[JsonPropertyName("id")]
|
||||||
|
public long Id { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("url")]
|
||||||
|
public string? Url { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("created_at")]
|
||||||
|
public DateTimeOffset CreatedAt { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubCheckRunPayload
|
||||||
|
{
|
||||||
|
[JsonPropertyName("name")]
|
||||||
|
public string? Name { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("head_sha")]
|
||||||
|
public string? HeadSha { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("status")]
|
||||||
|
public string? Status { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("conclusion")]
|
||||||
|
public string? Conclusion { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("started_at")]
|
||||||
|
public DateTimeOffset? StartedAt { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("completed_at")]
|
||||||
|
public DateTimeOffset? CompletedAt { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("external_id")]
|
||||||
|
public string? ExternalId { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("details_url")]
|
||||||
|
public string? DetailsUrl { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("output")]
|
||||||
|
public GitHubCheckRunOutput? Output { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubCheckRunOutput
|
||||||
|
{
|
||||||
|
[JsonPropertyName("title")]
|
||||||
|
public required string Title { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("summary")]
|
||||||
|
public required string Summary { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("text")]
|
||||||
|
public string? Text { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("annotations")]
|
||||||
|
public List<GitHubCheckRunAnnotation>? Annotations { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubCheckRunAnnotation
|
||||||
|
{
|
||||||
|
[JsonPropertyName("path")]
|
||||||
|
public required string Path { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("start_line")]
|
||||||
|
public required int StartLine { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("end_line")]
|
||||||
|
public required int EndLine { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("annotation_level")]
|
||||||
|
public required string AnnotationLevel { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("message")]
|
||||||
|
public required string Message { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("title")]
|
||||||
|
public string? Title { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("raw_details")]
|
||||||
|
public string? RawDetails { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
private sealed record GitHubCheckRunResponse
|
||||||
|
{
|
||||||
|
[JsonPropertyName("id")]
|
||||||
|
public long Id { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("html_url")]
|
||||||
|
public string? HtmlUrl { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("started_at")]
|
||||||
|
public DateTimeOffset? StartedAt { get; init; }
|
||||||
|
|
||||||
|
[JsonPropertyName("completed_at")]
|
||||||
|
public DateTimeOffset? CompletedAt { get; init; }
|
||||||
|
}
|
||||||
|
|
||||||
|
#endregion
|
||||||
|
}
|
||||||
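Illustrative usage (not part of the commit): a minimal sketch of wiring the client above and posting a commit status. It assumes the GitHub plugin lives in a StellaOps.Integrations.Plugin.GitHub namespace mirroring the GitLab plugin below, and that IntegrationConfig and ScmStatusRequest can be object-initialized with the members the client reads (Endpoint, ResolvedSecret, Owner, Repo, CommitSha, State, Context, Description, EvidenceUrl); the exact contract shapes live in StellaOps.Integrations.Contracts and may differ.

using StellaOps.Integrations.Contracts;
using StellaOps.Integrations.Core;
using StellaOps.Integrations.Plugin.GitHub; // assumed namespace for GitHubAppAnnotationClient

internal static class GitHubAnnotationUsageSketch
{
    public static async Task<bool> ReportScanStatusAsync(IntegrationConfig config, CancellationToken ct)
    {
        // The client sets BaseAddress, Accept, API version, and Bearer auth from the config.
        using var httpClient = new HttpClient();
        var client = new GitHubAppAnnotationClient(httpClient, config);

        var request = new ScmStatusRequest
        {
            Owner = "acme",                                              // placeholder values
            Repo = "widgets",
            CommitSha = "0123456789abcdef0123456789abcdef01234567",
            State = ScmStatusState.Success,
            Context = "stellaops/scan",
            Description = "No policy violations detected",
            EvidenceUrl = "https://stella.example/evidence/123"
        };

        var result = await client.PostStatusAsync(request, ct);
        return result.Success;                                           // Fail results carry Error/IsTransient
    }
}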
@@ -0,0 +1,377 @@
// <copyright file="GitLabAnnotationClient.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_006_INTEGRATIONS_scm_annotations (INTEGRATIONS-SCM-003)
// </copyright>

using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using StellaOps.Integrations.Contracts;
using StellaOps.Integrations.Core;

namespace StellaOps.Integrations.Plugin.GitLab;

/// <summary>
/// GitLab SCM annotation client for MR comments and pipeline statuses.
/// </summary>
public sealed class GitLabAnnotationClient : IScmAnnotationClient
{
    private readonly HttpClient _httpClient;
    private readonly TimeProvider _timeProvider;
    private readonly IntegrationConfig _config;
    private static readonly JsonSerializerOptions JsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.SnakeCaseLower,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
    };

    public GitLabAnnotationClient(
        HttpClient httpClient,
        IntegrationConfig config,
        TimeProvider? timeProvider = null)
    {
        _httpClient = httpClient ?? throw new ArgumentNullException(nameof(httpClient));
        _config = config ?? throw new ArgumentNullException(nameof(config));
        _timeProvider = timeProvider ?? TimeProvider.System;

        ConfigureHttpClient();
    }

    private void ConfigureHttpClient()
    {
        _httpClient.BaseAddress = new Uri(_config.Endpoint.TrimEnd('/') + "/api/v4/");
        _httpClient.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/json"));
        _httpClient.DefaultRequestHeaders.UserAgent.Add(
            new ProductInfoHeaderValue("StellaOps", "1.0"));

        if (!string.IsNullOrEmpty(_config.ResolvedSecret))
        {
            _httpClient.DefaultRequestHeaders.Add("PRIVATE-TOKEN", _config.ResolvedSecret);
        }
    }

    /// <inheritdoc />
    public async Task<ScmOperationResult<ScmCommentResponse>> PostCommentAsync(
        ScmCommentRequest request,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(request);

        try
        {
            // GitLab uses project path encoding
            var projectPath = Uri.EscapeDataString($"{request.Owner}/{request.Repo}");

            string url;
            object payload;

            if (request.Line.HasValue && !string.IsNullOrEmpty(request.Path))
            {
                // Position-based MR comment (discussion)
                url = $"projects/{projectPath}/merge_requests/{request.PrNumber}/discussions";
                payload = new GitLabDiscussionPayload
                {
                    Body = request.Body,
                    Position = new GitLabPosition
                    {
                        BaseSha = request.CommitSha ?? string.Empty,
                        HeadSha = request.CommitSha ?? string.Empty,
                        StartSha = request.CommitSha ?? string.Empty,
                        PositionType = "text",
                        NewPath = request.Path,
                        NewLine = request.Line.Value
                    }
                };
            }
            else
            {
                // General MR note
                url = $"projects/{projectPath}/merge_requests/{request.PrNumber}/notes";
                payload = new GitLabNotePayload { Body = request.Body };
            }

            var json = JsonSerializer.Serialize(payload, JsonOptions);
            using var content = new StringContent(json, Encoding.UTF8, "application/json");

            var response = await _httpClient.PostAsync(url, content, cancellationToken);

            if (!response.IsSuccessStatusCode)
            {
                var errorBody = await response.Content.ReadAsStringAsync(cancellationToken);
                return ScmOperationResult<ScmCommentResponse>.Fail(
                    $"GitLab API returned {response.StatusCode}: {TruncateError(errorBody)}",
                    isTransient: IsTransientError(response.StatusCode));
            }

            var responseBody = await response.Content.ReadAsStringAsync(cancellationToken);
            var gitLabNote = JsonSerializer.Deserialize<GitLabNoteResponse>(responseBody, JsonOptions);

            return ScmOperationResult<ScmCommentResponse>.Ok(new ScmCommentResponse
            {
                CommentId = gitLabNote?.Id.ToString() ?? "0",
                Url = BuildMrNoteUrl(request.Owner, request.Repo, request.PrNumber, gitLabNote?.Id ?? 0),
                CreatedAt = gitLabNote?.CreatedAt ?? _timeProvider.GetUtcNow(),
                WasUpdated = false
            });
        }
        catch (HttpRequestException ex)
        {
            return ScmOperationResult<ScmCommentResponse>.Fail(
                $"Network error posting comment: {ex.Message}",
                isTransient: true);
        }
        catch (TaskCanceledException) when (cancellationToken.IsCancellationRequested)
        {
            throw;
        }
        catch (TaskCanceledException ex)
        {
            return ScmOperationResult<ScmCommentResponse>.Fail(
                $"Request timeout: {ex.Message}",
                isTransient: true);
        }
    }

    /// <inheritdoc />
    public async Task<ScmOperationResult<ScmStatusResponse>> PostStatusAsync(
        ScmStatusRequest request,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(request);

        try
        {
            var projectPath = Uri.EscapeDataString($"{request.Owner}/{request.Repo}");
            var url = $"projects/{projectPath}/statuses/{request.CommitSha}";

            var payload = new GitLabStatusPayload
            {
                State = MapStatusState(request.State),
                Context = request.Context,
                Description = TruncateDescription(request.Description, 255),
                TargetUrl = request.TargetUrl ?? request.EvidenceUrl
            };

            var json = JsonSerializer.Serialize(payload, JsonOptions);
            using var content = new StringContent(json, Encoding.UTF8, "application/json");

            var response = await _httpClient.PostAsync(url, content, cancellationToken);

            if (!response.IsSuccessStatusCode)
            {
                var errorBody = await response.Content.ReadAsStringAsync(cancellationToken);
                return ScmOperationResult<ScmStatusResponse>.Fail(
                    $"GitLab API returned {response.StatusCode}: {TruncateError(errorBody)}",
                    isTransient: IsTransientError(response.StatusCode));
            }

            var responseBody = await response.Content.ReadAsStringAsync(cancellationToken);
            var gitLabStatus = JsonSerializer.Deserialize<GitLabStatusResponse>(responseBody, JsonOptions);

            return ScmOperationResult<ScmStatusResponse>.Ok(new ScmStatusResponse
            {
                StatusId = gitLabStatus?.Id.ToString() ?? "0",
                State = request.State,
                Url = gitLabStatus?.TargetUrl,
                CreatedAt = gitLabStatus?.CreatedAt ?? _timeProvider.GetUtcNow()
            });
        }
        catch (HttpRequestException ex)
        {
            return ScmOperationResult<ScmStatusResponse>.Fail(
                $"Network error posting status: {ex.Message}",
                isTransient: true);
        }
        catch (TaskCanceledException) when (cancellationToken.IsCancellationRequested)
        {
            throw;
        }
        catch (TaskCanceledException ex)
        {
            return ScmOperationResult<ScmStatusResponse>.Fail(
                $"Request timeout: {ex.Message}",
                isTransient: true);
        }
    }

    /// <inheritdoc />
    /// <remarks>
    /// GitLab does not have direct check run equivalent. This posts a commit status
    /// and optionally creates a code quality report artifact.
    /// </remarks>
    public async Task<ScmOperationResult<ScmCheckRunResponse>> CreateCheckRunAsync(
        ScmCheckRunRequest request,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(request);

        // Map to commit status since GitLab doesn't have GitHub-style check runs
        var statusRequest = new ScmStatusRequest
        {
            Owner = request.Owner,
            Repo = request.Repo,
            CommitSha = request.HeadSha,
            State = MapCheckRunStatusToStatusState(request.Status, request.Conclusion),
            Context = $"stellaops/{request.Name}",
            Description = request.Summary ?? request.Title ?? request.Name,
            TargetUrl = request.EvidenceUrl
        };

        var statusResult = await PostStatusAsync(statusRequest, cancellationToken);

        if (!statusResult.Success)
        {
            return ScmOperationResult<ScmCheckRunResponse>.Fail(
                statusResult.Error ?? "Failed to create check run",
                statusResult.IsTransient);
        }

        return ScmOperationResult<ScmCheckRunResponse>.Ok(new ScmCheckRunResponse
        {
            CheckRunId = statusResult.Data!.StatusId,
            Url = statusResult.Data.Url ?? string.Empty,
            Status = request.Status,
            Conclusion = request.Conclusion,
            StartedAt = _timeProvider.GetUtcNow(),
            CompletedAt = request.Status == ScmCheckRunStatus.Completed ? _timeProvider.GetUtcNow() : null,
            AnnotationCount = request.Annotations.Length
        });
    }

    /// <inheritdoc />
    public async Task<ScmOperationResult<ScmCheckRunResponse>> UpdateCheckRunAsync(
        ScmCheckRunUpdateRequest request,
        CancellationToken cancellationToken = default)
    {
        // GitLab commit statuses are immutable once created; we create a new one instead
        // This requires knowing the commit SHA, which we may not have in the update request
        // For now, return unsupported

        return await Task.FromResult(ScmOperationResult<ScmCheckRunResponse>.Fail(
            "GitLab does not support updating commit statuses. Create a new status instead.",
            isTransient: false));
    }

    #region Mapping Helpers

    private static string MapStatusState(ScmStatusState state) => state switch
    {
        ScmStatusState.Pending => "pending",
        ScmStatusState.Success => "success",
        ScmStatusState.Failure => "failed",
        ScmStatusState.Error => "failed",
        _ => "pending"
    };

    private static ScmStatusState MapCheckRunStatusToStatusState(
        ScmCheckRunStatus status,
        ScmCheckRunConclusion? conclusion) => status switch
    {
        ScmCheckRunStatus.Queued => ScmStatusState.Pending,
        ScmCheckRunStatus.InProgress => ScmStatusState.Pending,
        ScmCheckRunStatus.Completed => conclusion switch
        {
            ScmCheckRunConclusion.Success => ScmStatusState.Success,
            ScmCheckRunConclusion.Failure => ScmStatusState.Failure,
            ScmCheckRunConclusion.Cancelled => ScmStatusState.Error,
            ScmCheckRunConclusion.TimedOut => ScmStatusState.Error,
            _ => ScmStatusState.Success
        },
        _ => ScmStatusState.Pending
    };

    private static bool IsTransientError(System.Net.HttpStatusCode statusCode) =>
        statusCode is System.Net.HttpStatusCode.TooManyRequests
            or System.Net.HttpStatusCode.ServiceUnavailable
            or System.Net.HttpStatusCode.GatewayTimeout
            or System.Net.HttpStatusCode.BadGateway;

    private static string TruncateError(string error) =>
        error.Length > 200 ? error[..200] + "..." : error;

    private static string TruncateDescription(string description, int maxLength) =>
        description.Length > maxLength ? description[..(maxLength - 3)] + "..." : description;

    private string BuildMrNoteUrl(string owner, string repo, int mrNumber, long noteId) =>
        $"{_config.Endpoint.TrimEnd('/')}/{owner}/{repo}/-/merge_requests/{mrNumber}#note_{noteId}";

    #endregion

    #region GitLab API DTOs

    private sealed record GitLabNotePayload
    {
        [JsonPropertyName("body")]
        public required string Body { get; init; }
    }

    private sealed record GitLabDiscussionPayload
    {
        [JsonPropertyName("body")]
        public required string Body { get; init; }

        [JsonPropertyName("position")]
        public GitLabPosition? Position { get; init; }
    }

    private sealed record GitLabPosition
    {
        [JsonPropertyName("base_sha")]
        public required string BaseSha { get; init; }

        [JsonPropertyName("head_sha")]
        public required string HeadSha { get; init; }

        [JsonPropertyName("start_sha")]
        public required string StartSha { get; init; }

        [JsonPropertyName("position_type")]
        public required string PositionType { get; init; }

        [JsonPropertyName("new_path")]
        public string? NewPath { get; init; }

        [JsonPropertyName("new_line")]
        public int? NewLine { get; init; }
    }

    private sealed record GitLabNoteResponse
    {
        [JsonPropertyName("id")]
        public long Id { get; init; }

        [JsonPropertyName("created_at")]
        public DateTimeOffset CreatedAt { get; init; }
    }

    private sealed record GitLabStatusPayload
    {
        [JsonPropertyName("state")]
        public required string State { get; init; }

        [JsonPropertyName("name")]
        public required string Context { get; init; }

        [JsonPropertyName("description")]
        public required string Description { get; init; }

        [JsonPropertyName("target_url")]
        public string? TargetUrl { get; init; }
    }

    private sealed record GitLabStatusResponse
    {
        [JsonPropertyName("id")]
        public long Id { get; init; }

        [JsonPropertyName("target_url")]
        public string? TargetUrl { get; init; }

        [JsonPropertyName("created_at")]
        public DateTimeOffset CreatedAt { get; init; }
    }

    #endregion
}
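A small, self-contained sketch (not in the commit) of what the snake_case serializer options above emit for a status payload. ExamplePayload mirrors the shape of the private GitLabStatusPayload record purely for demonstration; the key point is that GitLab expects the status context under the JSON key "name", and null target URLs are omitted.

using System;
using System.Text.Json;
using System.Text.Json.Serialization;

internal sealed record ExamplePayload
{
    [JsonPropertyName("state")] public required string State { get; init; }
    [JsonPropertyName("name")] public required string Context { get; init; }
    [JsonPropertyName("description")] public required string Description { get; init; }
    [JsonPropertyName("target_url")] public string? TargetUrl { get; init; }
}

internal static class GitLabPayloadDemo
{
    public static void Main()
    {
        // Same options the client uses; JsonPropertyName attributes take precedence over the policy.
        var options = new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.SnakeCaseLower,
            DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull
        };

        var payload = new ExamplePayload
        {
            State = "failed",
            Context = "stellaops/scan",
            Description = "2 policy violations",
            TargetUrl = null
        };

        // Prints: {"state":"failed","name":"stellaops/scan","description":"2 policy violations"}
        Console.WriteLine(JsonSerializer.Serialize(payload, options));
    }
}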
@@ -0,0 +1,21 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <Nullable>enable</Nullable>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
    <ImplicitUsings>enable</ImplicitUsings>
    <LangVersion>preview</LangVersion>
    <RootNamespace>StellaOps.Integrations.Plugin.GitLab</RootNamespace>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Http" />
    <PackageReference Include="Microsoft.Extensions.Logging.Abstractions" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\..\__Libraries\StellaOps.Integrations.Contracts\StellaOps.Integrations.Contracts.csproj" />
  </ItemGroup>

</Project>
@@ -232,7 +232,14 @@ internal sealed record PolicyEvaluationReachability(
    bool HasRuntimeEvidence,
    string? Source,
    string? Method,
    string? EvidenceRef,
    // Sprint: SPRINT_20260112_007_POLICY_path_gate_inputs (PW-POL-002)
    string? PathHash = null,
    ImmutableArray<string>? NodeHashes = null,
    string? EntryNodeHash = null,
    string? SinkNodeHash = null,
    DateTimeOffset? RuntimeEvidenceAt = null,
    bool? ObservedAtRuntime = null)
{
    /// <summary>
    /// Default unknown reachability state.
@@ -117,6 +117,38 @@ public sealed record ReachabilityInput
    /// Raw reachability score from advanced engine.
    /// </summary>
    public double? AdvancedScore { get; init; }

    // --- Sprint: SPRINT_20260112_007_POLICY_path_gate_inputs (PW-POL-001) ---

    /// <summary>
    /// Canonical path hash (sha256:hex) for the reachability path.
    /// </summary>
    public string? PathHash { get; init; }

    /// <summary>
    /// Node hashes for symbols along the path (top-K for efficiency).
    /// </summary>
    public IReadOnlyList<string>? NodeHashes { get; init; }

    /// <summary>
    /// Entry point node hash.
    /// </summary>
    public string? EntryNodeHash { get; init; }

    /// <summary>
    /// Sink (vulnerable function) node hash.
    /// </summary>
    public string? SinkNodeHash { get; init; }

    /// <summary>
    /// Timestamp when runtime evidence was last captured (for freshness checks).
    /// </summary>
    public DateTimeOffset? RuntimeEvidenceAt { get; init; }

    /// <summary>
    /// Whether the path was observed at runtime (not just static analysis).
    /// </summary>
    public bool? ObservedAtRuntime { get; init; }
}

/// <summary>
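Illustrative only: one plausible way a producer could derive a "sha256:hex" PathHash from the ordered NodeHashes above. The actual canonicalization rules belong to the reachability engine, not this commit; the newline-join scheme below is an assumption for demonstration.

using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

internal static class PathHashSketch
{
    public static string ComputePathHash(IReadOnlyList<string> nodeHashes)
    {
        // Join node hashes in path order with a separator that cannot appear inside a hex hash.
        var canonical = string.Join('\n', nodeHashes);
        var digest = SHA256.HashData(Encoding.UTF8.GetBytes(canonical));
        return "sha256:" + Convert.ToHexString(digest).ToLowerInvariant();
    }
}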
src/Policy/StellaOps.Policy.Engine/Vex/VexOverrideSignals.cs (new file, 301 lines)
@@ -0,0 +1,301 @@
// <copyright file="VexOverrideSignals.cs" company="StellaOps">
// SPDX-License-Identifier: AGPL-3.0-or-later
// Sprint: SPRINT_20260112_004_POLICY_signed_override_enforcement (POL-OVR-001, POL-OVR-002)
// </copyright>

using System.Collections.Immutable;
using System.Text.Json.Serialization;

namespace StellaOps.Policy.Engine.Vex;

/// <summary>
/// VEX override signature validation result for policy evaluation.
/// </summary>
public sealed record VexOverrideSignalInput
{
    /// <summary>
    /// Whether the override is signed with a valid DSSE envelope.
    /// </summary>
    [JsonPropertyName("overrideSigned")]
    public required bool OverrideSigned { get; init; }

    /// <summary>
    /// Whether the override has verified Rekor inclusion proof.
    /// </summary>
    [JsonPropertyName("overrideRekorVerified")]
    public required bool OverrideRekorVerified { get; init; }

    /// <summary>
    /// Signing key ID if signed.
    /// </summary>
    [JsonPropertyName("signingKeyId")]
    public string? SigningKeyId { get; init; }

    /// <summary>
    /// Issuer identity from the signature.
    /// </summary>
    [JsonPropertyName("signerIdentity")]
    public string? SignerIdentity { get; init; }

    /// <summary>
    /// DSSE envelope digest if signed.
    /// </summary>
    [JsonPropertyName("envelopeDigest")]
    public string? EnvelopeDigest { get; init; }

    /// <summary>
    /// Rekor log index if verified.
    /// </summary>
    [JsonPropertyName("rekorLogIndex")]
    public long? RekorLogIndex { get; init; }

    /// <summary>
    /// Rekor integrated time (Unix seconds) if verified.
    /// </summary>
    [JsonPropertyName("rekorIntegratedTime")]
    public long? RekorIntegratedTime { get; init; }

    /// <summary>
    /// Override validity period (start).
    /// </summary>
    [JsonPropertyName("validFrom")]
    public DateTimeOffset? ValidFrom { get; init; }

    /// <summary>
    /// Override validity period (end).
    /// </summary>
    [JsonPropertyName("validUntil")]
    public DateTimeOffset? ValidUntil { get; init; }

    /// <summary>
    /// Whether the override is currently within its validity period.
    /// </summary>
    [JsonPropertyName("withinValidityPeriod")]
    public required bool WithinValidityPeriod { get; init; }

    /// <summary>
    /// Trust level of the signing key (trusted, unknown, revoked).
    /// </summary>
    [JsonPropertyName("keyTrustLevel")]
    public required VexKeyTrustLevel KeyTrustLevel { get; init; }

    /// <summary>
    /// Validation error message if failed.
    /// </summary>
    [JsonPropertyName("validationError")]
    public string? ValidationError { get; init; }
}

/// <summary>
/// Trust level of a signing key.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter))]
public enum VexKeyTrustLevel
{
    /// <summary>Key is in trusted keyring.</summary>
    Trusted,

    /// <summary>Key is not in keyring but signature is valid.</summary>
    Unknown,

    /// <summary>Key has been revoked.</summary>
    Revoked,

    /// <summary>Key trust could not be determined (offline mode).</summary>
    Unavailable
}

/// <summary>
/// Override enforcement policy configuration.
/// </summary>
public sealed record VexOverrideEnforcementPolicy
{
    /// <summary>
    /// Require signed overrides (reject unsigned).
    /// </summary>
    [JsonPropertyName("requireSigned")]
    public bool RequireSigned { get; init; } = true;

    /// <summary>
    /// Require Rekor verification.
    /// </summary>
    [JsonPropertyName("requireRekorVerified")]
    public bool RequireRekorVerified { get; init; }

    /// <summary>
    /// Allow unknown keys (not in keyring) if signature is valid.
    /// </summary>
    [JsonPropertyName("allowUnknownKeys")]
    public bool AllowUnknownKeys { get; init; }

    /// <summary>
    /// Maximum age for override validity (zero = no limit).
    /// </summary>
    [JsonPropertyName("maxOverrideAge")]
    public TimeSpan MaxOverrideAge { get; init; } = TimeSpan.Zero;

    /// <summary>
    /// Allowed signer identities (empty = all allowed).
    /// </summary>
    [JsonPropertyName("allowedSigners")]
    public ImmutableArray<string> AllowedSigners { get; init; } = [];
}

/// <summary>
/// Result of VEX override enforcement check.
/// </summary>
public sealed record VexOverrideEnforcementResult
{
    /// <summary>
    /// Whether the override is allowed by policy.
    /// </summary>
    [JsonPropertyName("allowed")]
    public required bool Allowed { get; init; }

    /// <summary>
    /// Reason if rejected.
    /// </summary>
    [JsonPropertyName("rejectionReason")]
    public string? RejectionReason { get; init; }

    /// <summary>
    /// Enforcement rule that triggered rejection.
    /// </summary>
    [JsonPropertyName("enforcementRule")]
    public string? EnforcementRule { get; init; }

    /// <summary>
    /// The input signals used for evaluation.
    /// </summary>
    [JsonPropertyName("signals")]
    public required VexOverrideSignalInput Signals { get; init; }

    /// <summary>
    /// Creates an allowed result.
    /// </summary>
    public static VexOverrideEnforcementResult Allow(VexOverrideSignalInput signals) => new()
    {
        Allowed = true,
        Signals = signals
    };

    /// <summary>
    /// Creates a rejected result.
    /// </summary>
    public static VexOverrideEnforcementResult Reject(
        VexOverrideSignalInput signals,
        string reason,
        string rule) => new()
    {
        Allowed = false,
        RejectionReason = reason,
        EnforcementRule = rule,
        Signals = signals
    };
}

/// <summary>
/// Service for validating VEX override signatures and enforcing policy.
/// </summary>
public interface IVexOverrideSignatureValidator
{
    /// <summary>
    /// Validates override signature and produces policy signals.
    /// </summary>
    Task<VexOverrideSignalInput> ValidateSignatureAsync(
        string envelopeBase64,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Checks if override is allowed by enforcement policy.
    /// </summary>
    VexOverrideEnforcementResult CheckEnforcement(
        VexOverrideSignalInput signals,
        VexOverrideEnforcementPolicy policy,
        DateTimeOffset evaluationTime);
}

/// <summary>
/// Factory for creating VEX override signal inputs.
/// </summary>
public static class VexOverrideSignalFactory
{
    /// <summary>
    /// Creates a signal input for an unsigned override.
    /// </summary>
    public static VexOverrideSignalInput CreateUnsigned() => new()
    {
        OverrideSigned = false,
        OverrideRekorVerified = false,
        WithinValidityPeriod = true,
        KeyTrustLevel = VexKeyTrustLevel.Unavailable
    };

    /// <summary>
    /// Creates a signal input for a signed but unverified override.
    /// </summary>
    public static VexOverrideSignalInput CreateSignedUnverified(
        string signingKeyId,
        string? signerIdentity,
        string envelopeDigest,
        VexKeyTrustLevel keyTrustLevel,
        DateTimeOffset? validFrom,
        DateTimeOffset? validUntil,
        DateTimeOffset evaluationTime) => new()
    {
        OverrideSigned = true,
        OverrideRekorVerified = false,
        SigningKeyId = signingKeyId,
        SignerIdentity = signerIdentity,
        EnvelopeDigest = envelopeDigest,
        KeyTrustLevel = keyTrustLevel,
        ValidFrom = validFrom,
        ValidUntil = validUntil,
        WithinValidityPeriod = IsWithinValidityPeriod(validFrom, validUntil, evaluationTime)
    };

    /// <summary>
    /// Creates a signal input for a fully verified override with Rekor inclusion.
    /// </summary>
    public static VexOverrideSignalInput CreateFullyVerified(
        string signingKeyId,
        string? signerIdentity,
        string envelopeDigest,
        VexKeyTrustLevel keyTrustLevel,
        long rekorLogIndex,
        long rekorIntegratedTime,
        DateTimeOffset? validFrom,
        DateTimeOffset? validUntil,
        DateTimeOffset evaluationTime) => new()
    {
        OverrideSigned = true,
        OverrideRekorVerified = true,
        SigningKeyId = signingKeyId,
        SignerIdentity = signerIdentity,
        EnvelopeDigest = envelopeDigest,
        RekorLogIndex = rekorLogIndex,
        RekorIntegratedTime = rekorIntegratedTime,
        KeyTrustLevel = keyTrustLevel,
        ValidFrom = validFrom,
        ValidUntil = validUntil,
        WithinValidityPeriod = IsWithinValidityPeriod(validFrom, validUntil, evaluationTime)
    };

    private static bool IsWithinValidityPeriod(
        DateTimeOffset? validFrom,
        DateTimeOffset? validUntil,
        DateTimeOffset evaluationTime)
    {
        if (validFrom.HasValue && evaluationTime < validFrom.Value)
        {
            return false;
        }

        if (validUntil.HasValue && evaluationTime > validUntil.Value)
        {
            return false;
        }

        return true;
    }
}
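Illustrative only: a possible CheckEnforcement implementation consistent with the records above. The production validator (actual DSSE and Rekor verification) is implemented elsewhere in the Policy engine; this sketch, assumed to sit in the StellaOps.Policy.Engine.Vex namespace, only shows how the declared policy knobs could be applied to a signal input.

internal static class VexOverrideEnforcementSketch
{
    public static VexOverrideEnforcementResult CheckEnforcement(
        VexOverrideSignalInput signals,
        VexOverrideEnforcementPolicy policy,
        DateTimeOffset evaluationTime)
    {
        if (policy.RequireSigned && !signals.OverrideSigned)
        {
            return VexOverrideEnforcementResult.Reject(signals, "Override is not signed.", "requireSigned");
        }

        if (policy.RequireRekorVerified && !signals.OverrideRekorVerified)
        {
            return VexOverrideEnforcementResult.Reject(signals, "Override lacks Rekor inclusion proof.", "requireRekorVerified");
        }

        if (signals.KeyTrustLevel == VexKeyTrustLevel.Revoked)
        {
            return VexOverrideEnforcementResult.Reject(signals, "Signing key has been revoked.", "keyTrustLevel");
        }

        if (!policy.AllowUnknownKeys && signals.KeyTrustLevel == VexKeyTrustLevel.Unknown)
        {
            return VexOverrideEnforcementResult.Reject(signals, "Signing key is not in the trusted keyring.", "allowUnknownKeys");
        }

        if (!signals.WithinValidityPeriod)
        {
            return VexOverrideEnforcementResult.Reject(signals, "Override is outside its validity window.", "validityPeriod");
        }

        if (policy.MaxOverrideAge > TimeSpan.Zero
            && signals.ValidFrom.HasValue
            && evaluationTime - signals.ValidFrom.Value > policy.MaxOverrideAge)
        {
            return VexOverrideEnforcementResult.Reject(signals, "Override exceeds the maximum allowed age.", "maxOverrideAge");
        }

        if (!policy.AllowedSigners.IsDefaultOrEmpty
            && (signals.SignerIdentity is null || !policy.AllowedSigners.Contains(signals.SignerIdentity)))
        {
            return VexOverrideEnforcementResult.Reject(signals, "Signer identity is not in the allowed list.", "allowedSigners");
        }

        return VexOverrideEnforcementResult.Allow(signals);
    }
}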
@@ -38,6 +38,14 @@ public sealed record VexClaimSummary
    [JsonPropertyName("justification")]
    public string? Justification { get; init; }

    // Sprint: SPRINT_20260112_004_BE_policy_determinization_attested_rules (DET-ATT-001)

    /// <summary>
    /// Anchor metadata for the VEX claim (DSSE envelope, Rekor, etc.).
    /// </summary>
    [JsonPropertyName("anchor")]
    public VexClaimAnchor? Anchor { get; init; }

    /// <summary>
    /// Convenience property indicating if the VEX status is "not_affected".
    /// </summary>
@@ -50,4 +58,71 @@ public sealed record VexClaimSummary
    /// </summary>
    [JsonIgnore]
    public double IssuerTrust => Confidence;

    /// <summary>
    /// Whether the VEX claim is anchored (has DSSE/Rekor attestation).
    /// </summary>
    [JsonIgnore]
    public bool IsAnchored => Anchor?.Anchored == true;
}

/// <summary>
/// Anchor metadata for VEX claims.
/// Sprint: SPRINT_20260112_004_BE_policy_determinization_attested_rules (DET-ATT-001)
/// </summary>
public sealed record VexClaimAnchor
{
    /// <summary>
    /// Whether the claim is anchored with attestation.
    /// </summary>
    [JsonPropertyName("anchored")]
    public required bool Anchored { get; init; }

    /// <summary>
    /// DSSE envelope digest (sha256:hex).
    /// </summary>
    [JsonPropertyName("envelope_digest")]
    public string? EnvelopeDigest { get; init; }

    /// <summary>
    /// Predicate type of the attestation.
    /// </summary>
    [JsonPropertyName("predicate_type")]
    public string? PredicateType { get; init; }

    /// <summary>
    /// Rekor log index if transparency-anchored.
    /// </summary>
    [JsonPropertyName("rekor_log_index")]
    public long? RekorLogIndex { get; init; }

    /// <summary>
    /// Rekor entry ID if transparency-anchored.
    /// </summary>
    [JsonPropertyName("rekor_entry_id")]
    public string? RekorEntryId { get; init; }

    /// <summary>
    /// Scope of the attestation.
    /// </summary>
    [JsonPropertyName("scope")]
    public string? Scope { get; init; }

    /// <summary>
    /// Whether the attestation has been verified.
    /// </summary>
    [JsonPropertyName("verified")]
    public bool? Verified { get; init; }

    /// <summary>
    /// Timestamp when the attestation was created.
    /// </summary>
    [JsonPropertyName("attested_at")]
    public DateTimeOffset? AttestedAt { get; init; }

    /// <summary>
    /// Whether the claim is Rekor-anchored (has log index).
    /// </summary>
    [JsonIgnore]
    public bool IsRekorAnchored => RekorLogIndex.HasValue;
}