Align AOC tasks for Excititor and Concelier

This commit is contained in:
master
2025-10-31 18:50:15 +02:00
committed by root
parent 9e6d9fbae8
commit 8da4e12a90
334 changed files with 35528 additions and 34546 deletions

File diff suppressed because it is too large.

@@ -1,32 +1,32 @@
# StellaOps.Cli — Agent Brief
## Mission
- Deliver an offline-capable command-line interface that drives StellaOps back-end operations: scanner distribution, scan execution, result uploads, and Concelier database lifecycle calls (init/resume/export).
- Honour StellaOps principles of determinism, observability, and offline-first behaviour while providing a polished operator experience.
## Role Charter
| Role | Mandate | Collaboration |
| --- | --- | --- |
| **DevEx/CLI** | Own CLI UX, command routing, and configuration model. Ensure commands work with empty/default config and document overrides. | Coordinate with Backend/WebService for API contracts and with Docs for operator workflows. |
| **Ops Integrator** | Maintain integration paths for shell/dotnet/docker tooling. Validate that air-gapped runners can bootstrap required binaries. | Work with Concelier/Agent teams to mirror packaging and signing requirements. |
| **QA** | Provide command-level fixtures, golden outputs, and regression coverage (unit & smoke). Ensure commands respect cancellation and deterministic logging. | Partner with QA guild for shared harnesses and test data. |
## Working Agreements
- Configuration is centralised in `StellaOps.Configuration`; always consume the bootstrapper instead of hand-rolling builders. Env vars (`API_KEY`, `STELLAOPS_BACKEND_URL`, `StellaOps:*`) override JSON/YAML and default to empty values.
- Command verbs (`scanner`, `scan`, `db`, `config`) are wired through System.CommandLine 2.0; keep handlers composable, cancellation-aware, and unit-testable (see the handler sketch after this list).
- `scanner download` must verify digests/signatures, install containers locally (docker load), and log artefact metadata.
- `scan run` must execute the container against a directory, materialise artefacts in `ResultsDirectory`, and auto-upload them on success; `scan upload` is the manual retry path.
- Emit structured console logs (single line, UTC timestamps) and honour offline-first expectations—no hidden network calls.
- Mirror repository guidance: stay within `src/Cli/StellaOps.Cli` unless collaborating via documented handshakes.
- Update `TASKS.md` as states change (TODO → DOING → DONE/BLOCKED) and record added tests/fixtures alongside implementation notes.
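A minimal handler sketch consistent with these agreements, assuming the System.CommandLine 2.0 beta `SetHandler`/`InvocationContext` surface (`ScanRunner` and the option name are illustrative, not actual StellaOps types):

```csharp
using System.CommandLine;
using System.CommandLine.Invocation;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

var directoryOption = new Option<DirectoryInfo>("--directory", "Directory to scan.");
var runCommand = new Command("run", "Execute the scanner container against a directory.")
{
    directoryOption
};

runCommand.SetHandler(async (InvocationContext context) =>
{
    // Cancellation flows from Ctrl+C / host shutdown into every async call.
    var cancellationToken = context.GetCancellationToken();
    var directory = context.ParseResult.GetValueForOption(directoryOption)!;

    // Keep the handler a thin shim; real work lives in an injected, testable service.
    context.ExitCode = await ScanRunner.ExecuteAsync(directory, cancellationToken);
});

var scanCommand = new Command("scan", "Scan operations.") { runCommand };
var root = new RootCommand("StellaOps CLI") { scanCommand };
return await root.InvokeAsync(args);

// Illustrative stand-in for the real scan service.
static class ScanRunner
{
    public static Task<int> ExecuteAsync(DirectoryInfo directory, CancellationToken cancellationToken)
        => Task.FromResult(0);
}
```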
## Reference Materials
- `docs/modules/concelier/ARCHITECTURE.md` for database operations surface area.
- Backend OpenAPI/contract docs (once available) for job triggers and scanner endpoints.
- Existing module AGENTS/TASKS files for style and coordination cues.
- `docs/09_API_CLI_REFERENCE.md` (section 3) for the user-facing synopsis of the CLI verbs and flags.
### Attestor Command Guild
- Owns the `stella attest` verb family (sign, verify, list, fetch) plus key lifecycle helpers (create, import, rotate, revoke).
- Ensures all attestation flows use the official SDK transport, support offline bundles, and surface JSON/table outputs for automation.
- Guards parity with attestor service policies (verification policies, explainability) and keeps fixtures/tests covering file-based and KMS-backed keys.


@@ -1,59 +1,59 @@
# FEEDCONN-CERTCC-02-009 VINCE Detail & Map Reintegration Plan
- **Author:** BE-Conn-CERTCC (current on-call)
- **Date:** 2025-10-11
- **Scope:** Restore VINCE detail parsing and canonical mapping in Concelier without destabilising downstream Merge/Export pipelines.
## 1. Current State Snapshot (2025-10-11)
- ✅ Fetch pipeline, VINCE summary planner, and detail queue are live; documents land with `DocumentStatuses.PendingParse`.
- ✅ DTO aggregate (`CertCcNoteDto`) plus mapper emit vendor-centric `normalizedVersions` (`scheme=certcc.vendor`) and provenance aligned with `src/Concelier/__Libraries/StellaOps.Concelier.Models/PROVENANCE_GUIDELINES.md`.
- ✅ Regression coverage exists for fetch/parse/map flows (`CertCcConnectorSnapshotTests`), but snapshot regeneration is gated on harness refresh (FEEDCONN-CERTCC-02-007) and QA handoff (FEEDCONN-CERTCC-02-008).
- ⚠️ Parse/map jobs are not scheduled; production still operates in fetch-only mode.
- ⚠️ Downstream Merge team is finalising normalized range ingestion per `src/FASTER_MODELING_AND_NORMALIZATION.md`; we must avoid publishing canonical records until they certify compatibility.
## 2. Required Dependencies & Coordinated Tasks
| Dependency | Owner(s) | Blocking Condition | Handshake |
|------------|----------|--------------------|-----------|
| FEEDCONN-CERTCC-02-004 (Canonical mapping & range primitives hardening) | BE-Conn-CERTCC + Models | Ensure mapper emits deterministic `normalizedVersions` array and provenance field masks | Daily sync with Models/Merge leads; share fixture diff before each enablement phase |
| FEEDCONN-CERTCC-02-007 (Connector test harness remediation) | BE-Conn-CERTCC, QA | Restore `AddSourceCommon` harness + canned VINCE fixtures so we can shadow-run parse/map | Required before Phase 1 |
| FEEDCONN-CERTCC-02-008 (Snapshot coverage handoff) | QA | Snapshot refresh process green to surface regressions | Required before Phase 2 |
| FEEDCONN-CERTCC-02-010 (Partial-detail graceful degradation) | BE-Conn-CERTCC | Resiliency for missing VINCE endpoints to avoid job wedging after reintegration | Should land before Phase 2 cutover |
## 3. Phased Rollout Plan
| Phase | Window (UTC) | Actions | Success Signals | Rollback |
|-------|--------------|---------|-----------------|----------|
| **0 Pre-flight validation** | 2025-10-11 → 2025-10-12 | • Finish FEEDCONN-CERTCC-02-007 harness fixes and regenerate fixtures.<br>• Run `dotnet test src/Concelier/__Tests/StellaOps.Concelier.Connector.CertCc.Tests` with `UPDATE_CERTCC_FIXTURES=0` to confirm deterministic baselines.<br>• Generate sample advisory batch (`dotnet test … --filter SnapshotSmoke`) and deliver JSON diff to Merge for schema verification (`normalizedVersions[].scheme == certcc.vendor`, provenance masks populated). | • Harness tests green locally and in CI.<br>• Merge sign-off that sample advisories conform to `FASTER_MODELING_AND_NORMALIZATION.md`. | N/A (no production enablement yet). |
| **1 Shadow parse/map in staging** | Target start 2025-10-13 | • Register `source:cert-cc:parse` and `source:cert-cc:map` jobs, but gate them behind the new config flag `concelier:sources:cert-cc:enableDetailMapping` (default `false`; see the flag-gating sketch below the table).<br>• Deploy (restart required for options rebinding), enable the flag, and point the connector at staging Mongo with an isolated collection (`advisories_certcc_shadow`).<br>• Run the connector for ≥2 cycles; compare advisory counts vs. the fetch-only baseline and validate that `concelier.range.primitives` metrics include `scheme=certcc.vendor`. | • No uncaught exceptions in staging logs.<br>• Shadow advisories match expected vendor counts (±5%).<br>• `certcc.summary.fetch.*` + new `certcc.map.duration.ms` metrics stable. | Disable flag; staging returns to fetch-only. No production impact. |
| **2 Controlled production enablement** | Target start 2025-10-14 | • Redeploy production with flag enabled, start with job concurrency `1`, and reduce `MaxNotesPerFetch` to 5 for first 24h.<br>• Observe metrics dashboards hourly (fetch/map latency, pending queues, Mongo write throughput).<br>• QA to replay latest snapshots and confirm no deterministic drift.<br>• Publish advisory sample (top 10 changed docs) to Merge Slack channel for validation. | • Pending parse/mapping queues drain within expected SLA (<30min).<br>• No increase in merge dedupe anomalies.<br>• Mongo writes stay within 10% of baseline. | Toggle flag off, re-run fetch-only. Clear `pendingMappings` via connector cursor reset if stuck. |
| **3 Full production & cleanup** | Target start 2025-10-15 | • Restore `MaxNotesPerFetch` to the configured default (20).<br>• Remove temporary throttles and leave the flag enabled by default.<br>• Update `README.md` rollout notes; close FEEDCONN-CERTCC-02-009.<br>• Kick off a post-merge audit with Merge to ensure new advisories dedupe with other sources. | • Stable operations for ≥48h, no degradation alerts.<br>• Merge confirms conflict resolver behaviour unchanged. | If a regression is detected, revert to the Phase 2 state or disable the jobs; retain this plan for reuse. |
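For the Phase 1 gate, job registration could be wired roughly as follows. This is a sketch: only the `concelier:sources:cert-cc:enableDetailMapping` key and the parse/map job names come from this plan; the registration delegate and the fetch job name are assumptions.

```csharp
using System;
using Microsoft.Extensions.Configuration;

static class CertCcJobGate
{
    // Sketch: the register delegate stands in for the real job scheduler;
    // the flag key and parse/map job names come from this plan.
    public static void RegisterJobs(IConfiguration configuration, Action<string> register)
    {
        register("source:cert-cc:fetch"); // fetch stays enabled in every phase (name assumed)

        // Defaults to false when the key is absent, i.e. fetch-only mode.
        if (configuration.GetValue("concelier:sources:cert-cc:enableDetailMapping", false))
        {
            register("source:cert-cc:parse");
            register("source:cert-cc:map");
        }
    }
}
```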
## 4. Monitoring & Validation Checklist
- Dashboards: `certcc.*` meters (plan, summary fetch, detail fetch) plus `concelier.range.primitives` with tag `scheme=certcc.vendor` (see the emission sketch after this checklist).
- Logs: ensure Parse/Map jobs emit `correlationId` aligned with fetch events for traceability.
- Data QA: run `src/Tools/dump_advisory` against two VINCE notes (one multi-vendor, one single-vendor) every phase to spot-check normalized versions ordering and provenance.
- Storage: verify Mongo TTL/size for `raw_documents` and `dtos`—detail payload volume increases by ~3× when mapping resumes.
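As a reference for the dashboards item above, emitting the tagged counter might look like this sketch (the meter name is an assumption; the instrument name and the `scheme=certcc.vendor` tag come from this checklist):

```csharp
using System.Collections.Generic;
using System.Diagnostics.Metrics;

static class CertCcMetrics
{
    // Meter name is an assumption; the instrument name and tag come from this checklist.
    private static readonly Meter Meter = new("StellaOps.Concelier.Connector.CertCc");
    private static readonly Counter<long> RangePrimitives =
        Meter.CreateCounter<long>("concelier.range.primitives");

    public static void RecordVendorRange()
        => RangePrimitives.Add(1, new KeyValuePair<string, object?>("scheme", "certcc.vendor"));
}
```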
## 5. Rollback / Contingency Playbook
1. Disable `concelier:sources:cert-cc:enableDetailMapping` flag (and optionally set `MaxNotesPerFetch=0` for a single cycle) to halt new detail ingestion.
2. Run connector once to update cursor; verify `pendingMappings` drains.
3. If advisories already persisted, coordinate with Merge to soft-delete affected `certcc/*` advisories by advisory key hash (no schema rollback required).
4. Re-run Phase 1 shadow validation before retrying.
## 6. Communication Cadence
- Daily check-in with Models/Merge leads (09:30 EDT) to surface normalizedVersions/provenance diffs.
- Post-phase reports in `#concelier-certcc` Slack channel summarising metrics, advisory counts, and outstanding issues.
- Escalate blockers >12h via Runbook SEV-3 path and annotate `TASKS.md`.
## 7. Open Questions / Next Actions
- [ ] Confirm whether Merge requires additional provenance field masks before Phase 2 (waiting on feedback from the 2025-10-11 sample).
- [ ] Decide if CSAF endpoint ingestion (optional) should piggyback on Phase 3 or stay deferred.
- [ ] Validate that FEEDCONN-CERTCC-02-010 coverage handles mixed 200/404 VINCE endpoints during partial outages.
Once the dependencies in Section 2 are cleared and Phase 3 completes, update `src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.CertCc/TASKS.md` and close FEEDCONN-CERTCC-02-009.


@@ -19,23 +19,25 @@ internal sealed class AdvisoryObservationFactory : IAdvisoryObservationFactory
ArgumentNullException.ThrowIfNull(rawDocument);
var source = CreateSource(rawDocument.Source, rawDocument.Upstream);
var upstream = CreateUpstream(rawDocument.Upstream);
var content = CreateContent(rawDocument.Content);
var linkset = CreateLinkset(rawDocument.Identifiers, rawDocument.Linkset);
var rawLinkset = CreateRawLinkset(rawDocument.Identifiers, rawDocument.Linkset);
var attributes = CreateAttributes(rawDocument);
var createdAt = (observedAt ?? rawDocument.Upstream.RetrievedAt).ToUniversalTime();
return new AdvisoryObservation(
observationId: BuildObservationId(rawDocument),
tenant: rawDocument.Tenant,
source: source,
upstream: upstream,
content: content,
linkset: linkset,
rawLinkset: rawLinkset,
createdAt: createdAt,
attributes: attributes);
}
private static AdvisoryObservationSource CreateSource(RawSourceMetadata source, RawUpstreamMetadata upstream)
{
@@ -110,16 +112,64 @@ internal sealed class AdvisoryObservationFactory : IAdvisoryObservationFactory
return JsonNode.Parse(document.RootElement.GetRawText()) ?? JsonNode.Parse("{}")!;
}
private static AdvisoryObservationLinkset CreateLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliases = NormalizeAliases(identifiers, linkset);
var purls = NormalizePackageUrls(linkset.PackageUrls);
var cpes = NormalizeCpes(linkset.Cpes);
var references = NormalizeReferences(linkset.References);
return new AdvisoryObservationLinkset(aliases, purls, cpes, references);
}
private static RawLinkset CreateRawLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliasBuilder = ImmutableArray.CreateBuilder<string>();
if (!string.IsNullOrWhiteSpace(identifiers.PrimaryId))
{
aliasBuilder.Add(identifiers.PrimaryId);
}
if (!identifiers.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in identifiers.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
if (!linkset.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in linkset.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
static ImmutableArray<string> EnsureArray(ImmutableArray<string> values)
=> values.IsDefault ? ImmutableArray<string>.Empty : values;
static ImmutableArray<RawReference> EnsureReferences(ImmutableArray<RawReference> values)
=> values.IsDefault ? ImmutableArray<RawReference>.Empty : values;
return linkset with
{
Aliases = aliasBuilder.ToImmutable(),
PackageUrls = EnsureArray(linkset.PackageUrls),
Cpes = EnsureArray(linkset.Cpes),
References = EnsureReferences(linkset.References),
ReconciledFrom = EnsureArray(linkset.ReconciledFrom),
Notes = linkset.Notes ?? ImmutableDictionary<string, string>.Empty
};
}
private static IEnumerable<string> NormalizeAliases(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase);


@@ -1,6 +1,7 @@
using System.Collections.Immutable;
using System.Text.Json;
using System.Text.Json.Nodes;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Models.Observations;
@@ -12,19 +13,21 @@ public sealed record AdvisoryObservation
AdvisoryObservationSource source,
AdvisoryObservationUpstream upstream,
AdvisoryObservationContent content,
AdvisoryObservationLinkset linkset,
RawLinkset rawLinkset,
DateTimeOffset createdAt,
ImmutableDictionary<string, string>? attributes = null)
{
ObservationId = Validation.EnsureNotNullOrWhiteSpace(observationId, nameof(observationId));
Tenant = Validation.EnsureNotNullOrWhiteSpace(tenant, nameof(tenant)).ToLowerInvariant();
Source = source ?? throw new ArgumentNullException(nameof(source));
Upstream = upstream ?? throw new ArgumentNullException(nameof(upstream));
Content = content ?? throw new ArgumentNullException(nameof(content));
Linkset = linkset ?? throw new ArgumentNullException(nameof(linkset));
RawLinkset = SanitizeRawLinkset(rawLinkset);
CreatedAt = createdAt.ToUniversalTime();
Attributes = NormalizeAttributes(attributes);
}
public string ObservationId { get; }
@@ -34,15 +37,17 @@ public sealed record AdvisoryObservation
public AdvisoryObservationUpstream Upstream { get; }
public AdvisoryObservationContent Content { get; }
public AdvisoryObservationLinkset Linkset { get; }
public RawLinkset RawLinkset { get; }
public DateTimeOffset CreatedAt { get; }
public ImmutableDictionary<string, string> Attributes { get; }
private static ImmutableDictionary<string, string> NormalizeAttributes(ImmutableDictionary<string, string>? attributes)
{
if (attributes is null || attributes.Count == 0)
{
@@ -59,10 +64,58 @@ public sealed record AdvisoryObservation
builder[pair.Key.Trim()] = pair.Value;
}
return builder.ToImmutable();
}
private static RawLinkset SanitizeRawLinkset(RawLinkset? rawLinkset)
{
if (rawLinkset is null)
{
return new RawLinkset();
}
static ImmutableArray<string> SanitizeStrings(ImmutableArray<string> values)
{
if (values.IsDefault)
{
return ImmutableArray<string>.Empty;
}
return values;
}
static ImmutableArray<RawReference> SanitizeReferences(ImmutableArray<RawReference> references)
{
if (references.IsDefault)
{
return ImmutableArray<RawReference>.Empty;
}
return references;
}
static ImmutableDictionary<string, string> SanitizeNotes(ImmutableDictionary<string, string>? notes)
{
if (notes is null || notes.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
return notes;
}
return rawLinkset with
{
Aliases = SanitizeStrings(rawLinkset.Aliases),
PackageUrls = SanitizeStrings(rawLinkset.PackageUrls),
Cpes = SanitizeStrings(rawLinkset.Cpes),
References = SanitizeReferences(rawLinkset.References),
ReconciledFrom = SanitizeStrings(rawLinkset.ReconciledFrom),
Notes = SanitizeNotes(rawLinkset.Notes)
};
}
}
public sealed record AdvisoryObservationSource
{


@@ -1,12 +1,15 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\StellaOps.Concelier.RawModels\StellaOps.Concelier.RawModels.csproj" />
</ItemGroup>
</Project>


@@ -26,6 +26,7 @@ This module owns the persistent shape of Concelier's MongoDB database. Upgrades
| `20251028_advisory_raw_idempotency_index` | Applies compound unique index on `(source.vendor, upstream.upstream_id, upstream.content_hash, tenant)` after verifying no duplicates exist. |
| `20251028_advisory_supersedes_backfill` | Renames legacy `advisory` collection to a read-only backup view and backfills `supersedes` chains across `advisory_raw`. |
| `20251028_advisory_raw_validator` | Applies Aggregation-Only Contract JSON schema validator to the `advisory_raw` collection with configurable enforcement level. |
| `20251104_advisory_observations_raw_linkset` | Backfills `rawLinkset` on `advisory_observations` using stored `advisory_raw` documents so canonical and raw projections co-exist for downstream policy joins. |
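To verify the backfill completed, a count of observations still missing the projection can reuse the migration's own filter (a sketch; the literal collection name and wrapper type are illustrative):

```csharp
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

static class RawLinksetBackfillCheck
{
    // Counts advisory_observations documents still missing rawLinkset; the filter
    // mirrors the migration's own predicate. Expect 0 once the backfill has run.
    public static Task<long> CountMissingAsync(IMongoDatabase database)
    {
        var observations = database.GetCollection<BsonDocument>("advisory_observations");
        var missing = Builders<BsonDocument>.Filter.Exists("rawLinkset", false) |
                      Builders<BsonDocument>.Filter.Type("rawLinkset", BsonType.Null);
        return observations.CountDocumentsAsync(missing);
    }
}
```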
## Operator Runbook


@@ -0,0 +1,442 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using MongoDB.Driver;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Backfills the raw linkset projection on advisory observations so downstream services
/// can rely on both canonical and raw linkset shapes.
/// </summary>
public sealed class EnsureAdvisoryObservationsRawLinksetMigration : IMongoMigration
{
private const string MigrationId = "20251104_advisory_observations_raw_linkset";
private const int BulkBatchSize = 500;
public string Id => MigrationId;
public string Description => "Populate rawLinkset field for advisory observations using stored advisory_raw documents.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var observations = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var rawCollection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
var filter = Builders<BsonDocument>.Filter.Exists("rawLinkset", false) |
Builders<BsonDocument>.Filter.Type("rawLinkset", BsonType.Null);
using var cursor = await observations
.Find(filter)
.ToCursorAsync(cancellationToken)
.ConfigureAwait(false);
var updates = new List<WriteModel<BsonDocument>>(BulkBatchSize);
var missingRawDocuments = new List<string>();
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var observationDocument in cursor.Current)
{
cancellationToken.ThrowIfCancellationRequested();
if (!TryExtractObservationKey(observationDocument, out var key))
{
continue;
}
var rawFilter = Builders<BsonDocument>.Filter.Eq("tenant", key.Tenant) &
Builders<BsonDocument>.Filter.Eq("source.vendor", key.Vendor) &
Builders<BsonDocument>.Filter.Eq("upstream.upstream_id", key.UpstreamId) &
Builders<BsonDocument>.Filter.Eq("upstream.content_hash", key.ContentHash);
var rawDocument = await rawCollection
.Find(rawFilter)
.Sort(Builders<BsonDocument>.Sort.Descending("ingested_at").Descending("_id"))
.Limit(1)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
if (rawDocument is null)
{
missingRawDocuments.Add(key.ToString());
continue;
}
var advisoryRaw = MapToRawDocument(rawDocument);
var rawLinkset = BuildRawLinkset(advisoryRaw.Identifiers, advisoryRaw.Linkset);
var rawLinksetDocument = BuildRawLinksetBson(rawLinkset);
var update = Builders<BsonDocument>.Update.Set("rawLinkset", rawLinksetDocument);
updates.Add(new UpdateOneModel<BsonDocument>(
Builders<BsonDocument>.Filter.Eq("_id", observationDocument["_id"].AsString),
update));
if (updates.Count >= BulkBatchSize)
{
await observations.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
updates.Clear();
}
}
}
if (updates.Count > 0)
{
await observations.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
}
if (missingRawDocuments.Count > 0)
{
throw new InvalidOperationException(
$"Unable to locate advisory_raw documents for {missingRawDocuments.Count} observations: {string.Join(", ", missingRawDocuments.Take(10))}");
}
}
private static bool TryExtractObservationKey(BsonDocument observation, out ObservationKey key)
{
key = default;
if (!observation.TryGetValue("tenant", out var tenantValue) || tenantValue.IsBsonNull)
{
return false;
}
if (!observation.TryGetValue("source", out var sourceValue) || sourceValue is not BsonDocument sourceDocument)
{
return false;
}
if (!observation.TryGetValue("upstream", out var upstreamValue) || upstreamValue is not BsonDocument upstreamDocument)
{
return false;
}
var tenant = tenantValue.AsString;
var vendor = sourceDocument.GetValue("vendor", BsonString.Empty).AsString;
var upstreamId = upstreamDocument.GetValue("upstream_id", BsonString.Empty).AsString;
var contentHash = upstreamDocument.GetValue("contentHash", BsonString.Empty).AsString;
var createdAt = observation.GetValue("createdAt", BsonNull.Value);
key = new ObservationKey(
tenant,
vendor,
upstreamId,
contentHash,
BsonValueToDateTimeOffset(createdAt) ?? DateTimeOffset.UtcNow);
return !string.IsNullOrWhiteSpace(tenant) &&
!string.IsNullOrWhiteSpace(vendor) &&
!string.IsNullOrWhiteSpace(upstreamId) &&
!string.IsNullOrWhiteSpace(contentHash);
}
private static AdvisoryRawDocument MapToRawDocument(BsonDocument document)
{
var tenant = GetRequiredString(document, "tenant");
var source = MapSource(document["source"].AsBsonDocument);
var upstream = MapUpstream(document["upstream"].AsBsonDocument);
var content = MapContent(document["content"].AsBsonDocument);
var identifiers = MapIdentifiers(document["identifiers"].AsBsonDocument);
var linkset = MapLinkset(document["linkset"].AsBsonDocument);
var supersedes = document.GetValue("supersedes", BsonNull.Value);
return new AdvisoryRawDocument(
tenant,
source,
upstream,
content,
identifiers,
linkset,
supersedes.IsBsonNull ? null : supersedes.AsString);
}
private static RawSourceMetadata MapSource(BsonDocument source)
{
return new RawSourceMetadata(
GetRequiredString(source, "vendor"),
GetRequiredString(source, "connector"),
GetRequiredString(source, "version"),
GetOptionalString(source, "stream"));
}
private static RawUpstreamMetadata MapUpstream(BsonDocument upstream)
{
var provenanceBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (upstream.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument)
{
foreach (var element in provenanceValue.AsBsonDocument)
{
provenanceBuilder[element.Name] = BsonValueToString(element.Value);
}
}
var signatureDocument = upstream["signature"].AsBsonDocument;
var signature = new RawSignatureMetadata(
signatureDocument.GetValue("present", BsonBoolean.False).AsBoolean,
signatureDocument.TryGetValue("format", out var format) && !format.IsBsonNull ? format.AsString : null,
signatureDocument.TryGetValue("key_id", out var keyId) && !keyId.IsBsonNull ? keyId.AsString : null,
signatureDocument.TryGetValue("sig", out var sig) && !sig.IsBsonNull ? sig.AsString : null,
signatureDocument.TryGetValue("certificate", out var certificate) && !certificate.IsBsonNull ? certificate.AsString : null,
signatureDocument.TryGetValue("digest", out var digest) && !digest.IsBsonNull ? digest.AsString : null);
return new RawUpstreamMetadata(
GetRequiredString(upstream, "upstream_id"),
upstream.TryGetValue("document_version", out var version) && !version.IsBsonNull ? version.AsString : null,
GetDateTimeOffset(upstream, "retrieved_at", DateTimeOffset.UtcNow),
GetRequiredString(upstream, "content_hash"),
signature,
provenanceBuilder.ToImmutable());
}
private static RawContent MapContent(BsonDocument content)
{
var rawValue = content.GetValue("raw", BsonNull.Value);
string rawJson;
if (rawValue.IsBsonNull)
{
rawJson = "{}";
}
else if (rawValue.IsString)
{
rawJson = rawValue.AsString ?? "{}";
}
else
{
rawJson = rawValue.ToJson(new JsonWriterSettings { OutputMode = JsonOutputMode.RelaxedExtendedJson });
}
using var document = JsonDocument.Parse(string.IsNullOrWhiteSpace(rawJson) ? "{}" : rawJson);
return new RawContent(
GetRequiredString(content, "format"),
content.TryGetValue("spec_version", out var specVersion) && !specVersion.IsBsonNull ? specVersion.AsString : null,
document.RootElement.Clone(),
content.TryGetValue("encoding", out var encoding) && !encoding.IsBsonNull ? encoding.AsString : null);
}
private static RawIdentifiers MapIdentifiers(BsonDocument identifiers)
{
var aliases = identifiers.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
return new RawIdentifiers(
aliases,
GetRequiredString(identifiers, "primary"));
}
private static RawLinkset MapLinkset(BsonDocument linkset)
{
var aliases = linkset.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var purls = linkset.TryGetValue("purls", out var purlsValue) && purlsValue.IsBsonArray
? purlsValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var cpes = linkset.TryGetValue("cpes", out var cpesValue) && cpesValue.IsBsonArray
? cpesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var references = linkset.TryGetValue("references", out var referencesValue) && referencesValue.IsBsonArray
? referencesValue.AsBsonArray
.Where(static value => value.IsBsonDocument)
.Select(value =>
{
var doc = value.AsBsonDocument;
return new RawReference(
GetRequiredString(doc, "type"),
GetRequiredString(doc, "url"),
doc.TryGetValue("source", out var source) && !source.IsBsonNull ? source.AsString : null);
})
.ToImmutableArray()
: ImmutableArray<RawReference>.Empty;
var reconciledFrom = linkset.TryGetValue("reconciled_from", out var reconciledValue) && reconciledValue.IsBsonArray
? reconciledValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var notesBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (linkset.TryGetValue("notes", out var notesValue) && notesValue.IsBsonDocument)
{
foreach (var element in notesValue.AsBsonDocument)
{
notesBuilder[element.Name] = BsonValueToString(element.Value);
}
}
return new RawLinkset
{
Aliases = aliases,
PackageUrls = purls,
Cpes = cpes,
References = references,
ReconciledFrom = reconciledFrom,
Notes = notesBuilder.ToImmutable()
};
}
private static RawLinkset BuildRawLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliasBuilder = ImmutableArray.CreateBuilder<string>();
if (!string.IsNullOrWhiteSpace(identifiers.PrimaryId))
{
aliasBuilder.Add(identifiers.PrimaryId);
}
if (!identifiers.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in identifiers.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
if (!linkset.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in linkset.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
static ImmutableArray<string> EnsureArray(ImmutableArray<string> values)
=> values.IsDefault ? ImmutableArray<string>.Empty : values;
static ImmutableArray<RawReference> EnsureReferences(ImmutableArray<RawReference> values)
=> values.IsDefault ? ImmutableArray<RawReference>.Empty : values;
return linkset with
{
Aliases = aliasBuilder.ToImmutable(),
PackageUrls = EnsureArray(linkset.PackageUrls),
Cpes = EnsureArray(linkset.Cpes),
References = EnsureReferences(linkset.References),
ReconciledFrom = EnsureArray(linkset.ReconciledFrom),
Notes = linkset.Notes ?? ImmutableDictionary<string, string>.Empty
};
}
private static BsonDocument BuildRawLinksetBson(RawLinkset rawLinkset)
{
var references = new BsonArray(rawLinkset.References.Select(reference =>
{
var referenceDocument = new BsonDocument
{
{ "type", reference.Type },
{ "url", reference.Url }
};
if (!string.IsNullOrWhiteSpace(reference.Source))
{
referenceDocument["source"] = reference.Source;
}
return referenceDocument;
}));
var notes = new BsonDocument();
if (rawLinkset.Notes is not null)
{
foreach (var entry in rawLinkset.Notes)
{
notes[entry.Key] = entry.Value;
}
}
return new BsonDocument
{
{ "aliases", new BsonArray(rawLinkset.Aliases) },
{ "purls", new BsonArray(rawLinkset.PackageUrls) },
{ "cpes", new BsonArray(rawLinkset.Cpes) },
{ "references", references },
{ "reconciled_from", new BsonArray(rawLinkset.ReconciledFrom) },
{ "notes", notes }
};
}
private static string GetRequiredString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return string.Empty;
}
return value.IsString ? value.AsString : value.ToString() ?? string.Empty;
}
private static string? GetOptionalString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return null;
}
return value.IsString ? value.AsString : value.ToString();
}
private static string BsonValueToString(BsonValue value)
{
if (value.IsString)
{
return value.AsString ?? string.Empty;
}
if (value.IsBsonNull)
{
return string.Empty;
}
return value.ToString() ?? string.Empty;
}
private static DateTimeOffset GetDateTimeOffset(BsonDocument document, string field, DateTimeOffset fallback)
{
if (!document.TryGetValue(field, out var value) || value.IsBsonNull)
{
return fallback;
}
return BsonValueToDateTimeOffset(value) ?? fallback;
}
private static DateTimeOffset? BsonValueToDateTimeOffset(BsonValue value)
{
return value.BsonType switch
{
BsonType.DateTime => new DateTimeOffset(DateTime.SpecifyKind(value.ToUniversalTime(), DateTimeKind.Utc)),
BsonType.String when DateTimeOffset.TryParse(value.AsString, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed)
=> parsed.ToUniversalTime(),
BsonType.Int64 => DateTimeOffset.FromUnixTimeMilliseconds(value.AsInt64).ToUniversalTime(),
_ => null
};
}
private readonly record struct ObservationKey(
string Tenant,
string Vendor,
string UpstreamId,
string ContentHash,
DateTimeOffset CreatedAt)
{
public override string ToString()
=> $"{Tenant}:{Vendor}:{UpstreamId}:{ContentHash}";
}
}


@@ -24,11 +24,16 @@ public sealed class AdvisoryObservationDocument
public AdvisoryObservationContentDocument Content { get; set; } = new();
[BsonElement("linkset")]
public AdvisoryObservationLinksetDocument Linkset { get; set; } = new();
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; }
= DateTime.UtcNow;
public AdvisoryObservationLinksetDocument Linkset { get; set; } = new();
[BsonElement("rawLinkset")]
[BsonIgnoreIfNull]
public AdvisoryObservationRawLinksetDocument? RawLinkset { get; set; }
= null;
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; }
= DateTime.UtcNow;
[BsonElement("attributes")]
[BsonIgnoreIfNull]
@@ -129,11 +134,11 @@ public sealed class AdvisoryObservationContentDocument
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationLinksetDocument
{
[BsonElement("aliases")]
[BsonIgnoreIfNull]
public List<string>? Aliases { get; set; }
= new();
[BsonElement("purls")]
@@ -153,11 +158,62 @@ public sealed class AdvisoryObservationLinksetDocument
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationReferenceDocument
{
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("url")]
public string Url { get; set; } = string.Empty;
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationRawLinksetDocument
{
[BsonElement("aliases")]
[BsonIgnoreIfNull]
public List<string>? Aliases { get; set; }
= new();
[BsonElement("purls")]
[BsonIgnoreIfNull]
public List<string>? PackageUrls { get; set; }
= new();
[BsonElement("cpes")]
[BsonIgnoreIfNull]
public List<string>? Cpes { get; set; }
= new();
[BsonElement("references")]
[BsonIgnoreIfNull]
public List<AdvisoryObservationRawReferenceDocument>? References { get; set; }
= new();
[BsonElement("reconciled_from")]
[BsonIgnoreIfNull]
public List<string>? ReconciledFrom { get; set; }
= new();
[BsonElement("notes")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Notes { get; set; }
= new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationRawReferenceDocument
{
[BsonElement("type")]
[BsonIgnoreIfNull]
public string? Type { get; set; }
= null;
[BsonElement("url")]
public string Url { get; set; } = string.Empty;
[BsonElement("source")]
[BsonIgnoreIfNull]
public string? Source { get; set; }
= null;
}


@@ -5,7 +5,8 @@ using System.Linq;
using System.Text.Json.Nodes;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models.Observations;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
@@ -22,12 +23,14 @@ internal static class AdvisoryObservationDocumentFactory
var contentMetadata = ToImmutable(document.Content.Metadata);
var upstreamMetadata = ToImmutable(document.Upstream.Metadata);
var rawLinkset = ToRawLinkset(document.RawLinkset);
var observation = new AdvisoryObservation(
document.Id,
document.Tenant,
new AdvisoryObservationSource(
document.Source.Vendor,
document.Source.Stream,
document.Source.Api,
document.Source.CollectorVersion),
new AdvisoryObservationUpstream(
@@ -42,21 +45,22 @@ internal static class AdvisoryObservationDocumentFactory
document.Upstream.Signature.KeyId,
document.Upstream.Signature.Signature),
upstreamMetadata),
new AdvisoryObservationContent(
document.Content.Format,
document.Content.SpecVersion,
rawNode,
contentMetadata),
new AdvisoryObservationLinkset(
document.Linkset.Aliases ?? Enumerable.Empty<string>(),
document.Linkset.Purls ?? Enumerable.Empty<string>(),
document.Linkset.Cpes ?? Enumerable.Empty<string>(),
document.Linkset.References?.Select(reference => new AdvisoryObservationReference(reference.Type, reference.Url))),
rawLinkset,
DateTime.SpecifyKind(document.CreatedAt, DateTimeKind.Utc),
attributes);
return observation;
}
private static JsonNode ParseJsonNode(BsonDocument raw)
{
@@ -87,6 +91,72 @@ internal static class AdvisoryObservationDocumentFactory
builder[pair.Key.Trim()] = pair.Value;
}
return builder.ToImmutable();
}
private static RawLinkset ToRawLinkset(AdvisoryObservationRawLinksetDocument? document)
{
if (document is null)
{
return new RawLinkset();
}
static ImmutableArray<string> ToImmutableStringArray(List<string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableArray<string>.Empty;
}
return values
.Select(static value => value ?? string.Empty)
.ToImmutableArray();
}
static ImmutableArray<RawReference> ToImmutableReferences(List<AdvisoryObservationRawReferenceDocument>? references)
{
if (references is null || references.Count == 0)
{
return ImmutableArray<RawReference>.Empty;
}
return references
.Select(static reference => new RawReference(
reference.Type ?? string.Empty,
reference.Url,
reference.Source))
.ToImmutableArray();
}
static ImmutableDictionary<string, string> ToImmutableDictionary(Dictionary<string, string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
foreach (var pair in values)
{
if (pair.Key is null)
{
continue;
}
builder[pair.Key] = pair.Value;
}
return builder.ToImmutable();
}
return new RawLinkset
{
Aliases = ToImmutableStringArray(document.Aliases),
PackageUrls = ToImmutableStringArray(document.PackageUrls),
Cpes = ToImmutableStringArray(document.Cpes),
References = ToImmutableReferences(document.References),
ReconciledFrom = ToImmutableStringArray(document.ReconciledFrom),
Notes = ToImmutableDictionary(document.Notes)
};
}
}


@@ -109,6 +109,7 @@ public static class ServiceCollectionExtensions
services.AddSingleton<IMongoMigration, EnsureAdvisoryRawIdempotencyIndexMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisorySupersedesBackfillMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryRawValidatorMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryObservationsRawLinksetMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryEventCollectionsMigration>();
services.AddSingleton<IMongoMigration, SemVerStyleBackfillMigration>();


@@ -10,6 +10,7 @@
> Docs alignment (2025-10-26): Rollback guidance added to `docs/deploy/containers.md` §6.
> 2025-10-28: Documented duplicate audit + migration workflow in `docs/deploy/containers.md`, Offline Kit guide, and `MIGRATIONS.md`; published `ops/devops/scripts/check-advisory-raw-duplicates.js` for staging/offline clusters.
> Docs alignment (2025-10-26): Offline kit requirements documented in `docs/deploy/containers.md` §5.
| CONCELIER-STORE-AOC-19-005 `Raw linkset backfill` | TODO (2025-11-04) | Concelier Storage Guild, DevOps Guild | CONCELIER-CORE-AOC-19-004 | Plan and execute advisory_observations `rawLinkset` backfill (online + Offline Kit bundles), supply migration scripts + rehearse rollback. Follow the coordination plan in `docs/dev/raw-linkset-backfill-plan.md`. |
## Policy Engine v2


@@ -34,12 +34,34 @@ public sealed class AdvisoryObservationFactoryTests
Assert.Equal(SampleTimestamp, observation.CreatedAt);
Assert.Equal(new[] { "cve-2025-0001", "ghsa-xxxx-yyyy" }, observation.Linkset.Aliases);
Assert.Equal(new[] { "pkg:npm/left-pad@1.0.0" }, observation.Linkset.Purls);
Assert.Equal(new[] { "cpe:2.3:a:example:product:1.0:*:*:*:*:*:*:*" }, observation.Linkset.Cpes);
var reference = Assert.Single(observation.Linkset.References);
Assert.Equal("advisory", reference.Type);
Assert.Equal("https://example.test/advisory", reference.Url);
}
Assert.Equal(new[] { "pkg:npm/left-pad@1.0.0" }, observation.Linkset.Purls);
Assert.Equal(new[] { "cpe:2.3:a:example:product:1.0:*:*:*:*:*:*:*" }, observation.Linkset.Cpes);
var reference = Assert.Single(observation.Linkset.References);
Assert.Equal("advisory", reference.Type);
Assert.Equal("https://example.test/advisory", reference.Url);
Assert.Equal(
new[] { "GHSA-XXXX-YYYY", " CVE-2025-0001 ", "ghsa-XXXX-YYYY", " CVE-2025-0001 " },
observation.RawLinkset.Aliases);
Assert.Equal(
new[] { "pkg:NPM/left-pad@1.0.0", "pkg:npm/left-pad@1.0.0?foo=bar" },
observation.RawLinkset.PackageUrls);
Assert.Equal(
new[] { "cpe:/a:Example:Product:1.0", "cpe:/a:example:product:1.0" },
observation.RawLinkset.Cpes);
Assert.Collection(
observation.RawLinkset.References,
first =>
{
Assert.Equal("Advisory", first.Type);
Assert.Equal(" https://example.test/advisory ", first.Url);
},
second =>
{
Assert.Equal("ADVISORY", second.Type);
Assert.Equal("https://example.test/advisory", second.Url);
});
}
[Fact]
public void Create_SetsSourceAndUpstreamFields()
@@ -96,13 +118,15 @@ public sealed class AdvisoryObservationFactoryTests
},
supersedes: "tenant-a:vendor-x:previous:sha256:123");
var observation = factory.Create(rawDocument);
Assert.Equal("1.0.0", observation.Attributes["linkset.note.range-introduced"]);
Assert.Equal("1.0.5", observation.Attributes["linkset.note.range-fixed"]);
Assert.Equal("tenant-a:vendor-x:previous:sha256:123", observation.Attributes["supersedes"]);
Assert.Equal("connector-a;connector-b", observation.Attributes["linkset.reconciled_from"]);
Assert.Equal(notes, observation.RawLinkset.Notes);
Assert.Equal(new[] { "connector-a", "connector-b" }, observation.RawLinkset.ReconciledFrom);
}
private static AdvisoryRawDocument BuildRawDocument(
RawSourceMetadata? source = null,


@@ -1,7 +1,9 @@
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json.Nodes;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
using StellaOps.Concelier.RawModels;
using Xunit;
namespace StellaOps.Concelier.Core.Tests.Observations;
@@ -224,18 +226,28 @@ public sealed class AdvisoryObservationQueryServiceTests
contentHash: $"sha256:{observationId}",
signature: DefaultSignature);
var content = new AdvisoryObservationContent("CSAF", "2.0", raw);
var linkset = new AdvisoryObservationLinkset(aliases, purls, cpes, references);
var rawLinkset = new RawLinkset
{
Aliases = aliases.ToImmutableArray(),
PackageUrls = purls.ToImmutableArray(),
Cpes = cpes.ToImmutableArray(),
References = references
.Select(static reference => new RawReference(reference.Type, reference.Url))
.ToImmutableArray()
};
return new AdvisoryObservation(
observationId,
tenant,
DefaultSource,
upstream,
content,
linkset,
rawLinkset,
createdAt);
}
private sealed class InMemoryLookup : IAdvisoryObservationLookup
{


@@ -1,5 +1,6 @@
using System;
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
@@ -66,7 +67,7 @@ public sealed class AdvisoryRawServiceTests
await service.IngestAsync(document, CancellationToken.None);
Assert.NotNull(repository.CapturedDocument);
Assert.True(aliasSeries.SequenceEqual(repository.CapturedDocument!.Identifiers.Aliases));
}
private static AdvisoryRawService CreateService(RecordingRepository repository)


@@ -1,127 +0,0 @@
{
"advisoryKey": "GHSA-aaaa-bbbb-cccc",
"affectedPackages": [
{
"type": "semver",
"identifier": "pkg:npm/example-widget",
"platform": null,
"versionRanges": [
{
"fixedVersion": "2.5.1",
"introducedVersion": null,
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"rangeExpression": ">=0.0.0 <2.5.1",
"rangeKind": "semver"
},
{
"fixedVersion": "3.2.4",
"introducedVersion": "3.0.0",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"rangeExpression": null,
"rangeKind": "semver"
}
],
"normalizedVersions": [],
"statuses": [],
"provenance": [
{
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
}
]
}
],
"aliases": [
"CVE-2024-2222",
"GHSA-aaaa-bbbb-cccc"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [
{
"baseScore": 8.8,
"baseSeverity": "high",
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H",
"version": "3.1"
}
],
"cwes": [],
"description": null,
"exploitKnown": false,
"language": "en",
"modified": "2024-03-04T12:00:00+00:00",
"provenance": [
{
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
}
],
"published": "2024-03-04T00:00:00+00:00",
"references": [
{
"kind": "patch",
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"sourceTag": "ghsa",
"summary": "Patch commit",
"url": "https://github.com/example/widget/commit/abcd1234"
},
{
"kind": "advisory",
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"sourceTag": "ghsa",
"summary": "GitHub Security Advisory",
"url": "https://github.com/example/widget/security/advisories/GHSA-aaaa-bbbb-cccc"
}
],
"severity": "high",
"summary": "A crafted payload can pollute Object.prototype leading to RCE.",
"title": "Prototype pollution in widget.js"
}

View File

@@ -1,124 +1,127 @@
{
"advisoryKey": "GHSA-aaaa-bbbb-cccc",
"affectedPackages": [
{
"type": "semver",
"identifier": "pkg:npm/example-widget",
"platform": null,
"versionRanges": [
{
"fixedVersion": "2.5.1",
"introducedVersion": null,
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"rangeExpression": ">=0.0.0 <2.5.1",
"rangeKind": "semver"
},
{
"fixedVersion": "3.2.4",
"introducedVersion": "3.0.0",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"rangeExpression": null,
"rangeKind": "semver"
}
],
"normalizedVersions": [],
"statuses": [],
"provenance": [
{
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
}
]
}
],
"aliases": [
"CVE-2024-2222",
"GHSA-aaaa-bbbb-cccc"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [
{
"baseScore": 8.8,
"baseSeverity": "high",
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H",
"version": "3.1"
}
],
"cwes": [],
"description": null,
"exploitKnown": false,
"language": "en",
"modified": "2024-03-04T12:00:00+00:00",
"provenance": [
{
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
}
],
"published": "2024-03-04T00:00:00+00:00",
"references": [
{
"kind": "patch",
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"sourceTag": "ghsa",
"summary": "Patch commit",
"url": "https://github.com/example/widget/commit/abcd1234"
},
{
"kind": "advisory",
"provenance": {
"source": "ghsa",
"kind": "map",
"value": "ghsa-aaaa-bbbb-cccc",
"decisionReason": null,
"recordedAt": "2024-03-05T10:00:00+00:00",
"fieldMask": []
},
"sourceTag": "ghsa",
"summary": "GitHub Security Advisory",
"url": "https://github.com/example/widget/security/advisories/GHSA-aaaa-bbbb-cccc"
}
],
"severity": "high",
"summary": "A crafted payload can pollute Object.prototype leading to RCE.",
"title": "Prototype pollution in widget.js"
}

View File

@@ -1,45 +0,0 @@
{
"advisoryKey": "CVE-2023-9999",
"affectedPackages": [],
"aliases": [
"CVE-2023-9999"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [],
"cwes": [],
"description": null,
"exploitKnown": true,
"language": "en",
"modified": "2024-02-09T16:22:00+00:00",
"provenance": [
{
"source": "cisa-kev",
"kind": "annotate",
"value": "kev",
"decisionReason": null,
"recordedAt": "2024-02-10T09:30:00+00:00",
"fieldMask": []
}
],
"published": "2023-11-20T00:00:00+00:00",
"references": [
{
"kind": "kev",
"provenance": {
"source": "cisa-kev",
"kind": "annotate",
"value": "kev",
"decisionReason": null,
"recordedAt": "2024-02-10T09:30:00+00:00",
"fieldMask": []
},
"sourceTag": "cisa",
"summary": "CISA KEV entry",
"url": "https://www.cisa.gov/known-exploited-vulnerabilities-catalog"
}
],
"severity": "critical",
"summary": "Unauthenticated RCE due to unsafe deserialization.",
"title": "Remote code execution in LegacyServer"
}

View File

@@ -1,42 +1,45 @@
"advisoryKey": "CVE-2023-9999",
"affectedPackages": [],
"aliases": [
"CVE-2023-9999"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [],
"cwes": [],
"description": null,
"exploitKnown": true,
"language": "en",
"modified": "2024-02-09T16:22:00+00:00",
"provenance": [
{
"source": "cisa-kev",
"kind": "annotate",
"value": "kev",
"decisionReason": null,
"recordedAt": "2024-02-10T09:30:00+00:00",
"fieldMask": []
}
],
"published": "2023-11-20T00:00:00+00:00",
"references": [
{
"kind": "kev",
"provenance": {
"source": "cisa-kev",
"kind": "annotate",
"value": "kev",
"decisionReason": null,
"recordedAt": "2024-02-10T09:30:00+00:00",
"fieldMask": []
},
"sourceTag": "cisa",
"summary": "CISA KEV entry",
"url": "https://www.cisa.gov/known-exploited-vulnerabilities-catalog"
}
],
"severity": "critical",
"summary": "Unauthenticated RCE due to unsafe deserialization.",
"title": "Remote code execution in LegacyServer"
}

View File

@@ -1,122 +0,0 @@
{
"advisoryKey": "CVE-2024-1234",
"affectedPackages": [
{
"type": "cpe",
"identifier": "cpe:/a:examplecms:examplecms:1.0",
"platform": null,
"versionRanges": [
{
"fixedVersion": "1.0.5",
"introducedVersion": "1.0",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"rangeExpression": null,
"rangeKind": "version"
}
],
"normalizedVersions": [],
"statuses": [
{
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"status": "affected"
}
],
"provenance": [
{
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
}
]
}
],
"aliases": [
"CVE-2024-1234"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [
{
"baseScore": 9.8,
"baseSeverity": "critical",
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
"version": "3.1"
}
],
"cwes": [],
"description": null,
"exploitKnown": false,
"language": "en",
"modified": "2024-07-16T10:35:00+00:00",
"provenance": [
{
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
}
],
"published": "2024-07-15T00:00:00+00:00",
"references": [
{
"kind": "advisory",
"provenance": {
"source": "example",
"kind": "fetch",
"value": "bulletin",
"decisionReason": null,
"recordedAt": "2024-07-14T15:00:00+00:00",
"fieldMask": []
},
"sourceTag": "vendor",
"summary": "Vendor bulletin",
"url": "https://example.org/security/CVE-2024-1234"
},
{
"kind": "advisory",
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"sourceTag": "nvd",
"summary": "NVD entry",
"url": "https://nvd.nist.gov/vuln/detail/CVE-2024-1234"
}
],
"severity": "high",
"summary": "An integer overflow in ExampleCMS allows remote attackers to escalate privileges.",
"title": "Integer overflow in ExampleCMS"
}

View File

@@ -1,119 +1,122 @@
"advisoryKey": "CVE-2024-1234",
"affectedPackages": [
{
"type": "cpe",
"identifier": "cpe:/a:examplecms:examplecms:1.0",
"platform": null,
"versionRanges": [
{
"fixedVersion": "1.0.5",
"introducedVersion": "1.0",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"rangeExpression": null,
"rangeKind": "version"
}
],
"normalizedVersions": [],
"statuses": [
{
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"status": "affected"
}
],
"provenance": [
{
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
}
]
}
],
"aliases": [
"CVE-2024-1234"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [
{
"baseScore": 9.8,
"baseSeverity": "critical",
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
"version": "3.1"
}
],
"cwes": [],
"description": null,
"exploitKnown": false,
"language": "en",
"modified": "2024-07-16T10:35:00+00:00",
"provenance": [
{
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
}
],
"published": "2024-07-15T00:00:00+00:00",
"references": [
{
"kind": "advisory",
"provenance": {
"source": "example",
"kind": "fetch",
"value": "bulletin",
"decisionReason": null,
"recordedAt": "2024-07-14T15:00:00+00:00",
"fieldMask": []
},
"sourceTag": "vendor",
"summary": "Vendor bulletin",
"url": "https://example.org/security/CVE-2024-1234"
},
{
"kind": "advisory",
"provenance": {
"source": "nvd",
"kind": "map",
"value": "cve-2024-1234",
"decisionReason": null,
"recordedAt": "2024-08-01T12:00:00+00:00",
"fieldMask": []
},
"sourceTag": "nvd",
"summary": "NVD entry",
"url": "https://nvd.nist.gov/vuln/detail/CVE-2024-1234"
}
],
"severity": "high",
"summary": "An integer overflow in ExampleCMS allows remote attackers to escalate privileges.",
"title": "Integer overflow in ExampleCMS"
}

View File

@@ -1,125 +0,0 @@
{
"advisoryKey": "RHSA-2024:0252",
"affectedPackages": [
{
"type": "rpm",
"identifier": "kernel-0:4.18.0-553.el8.x86_64",
"platform": "rhel-8",
"versionRanges": [
{
"fixedVersion": null,
"introducedVersion": "0:4.18.0-553.el8",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"rangeExpression": null,
"rangeKind": "nevra"
}
],
"normalizedVersions": [],
"statuses": [
{
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"status": "fixed"
}
],
"provenance": [
{
"source": "redhat",
"kind": "enrich",
"value": "cve-2024-5678",
"decisionReason": null,
"recordedAt": "2024-05-11T09:05:00+00:00",
"fieldMask": []
},
{
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
}
]
}
],
"aliases": [
"CVE-2024-5678",
"RHSA-2024:0252"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [
{
"baseScore": 6.7,
"baseSeverity": "medium",
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:L/AC:L/PR:H/UI:N/S:U/C:H/I:H/A:H",
"version": "3.1"
}
],
"cwes": [],
"description": null,
"exploitKnown": false,
"language": "en",
"modified": "2024-05-11T08:15:00+00:00",
"provenance": [
{
"source": "redhat",
"kind": "enrich",
"value": "cve-2024-5678",
"decisionReason": null,
"recordedAt": "2024-05-11T09:05:00+00:00",
"fieldMask": []
},
{
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
}
],
"published": "2024-05-10T19:28:00+00:00",
"references": [
{
"kind": "advisory",
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"sourceTag": "redhat",
"summary": "Red Hat security advisory",
"url": "https://access.redhat.com/errata/RHSA-2024:0252"
}
],
"severity": "critical",
"summary": "Updates the Red Hat Enterprise Linux kernel to address CVE-2024-5678.",
"title": "Important: kernel security update"
}

View File

@@ -1,122 +1,125 @@
"advisoryKey": "RHSA-2024:0252",
"affectedPackages": [
{
"type": "rpm",
"identifier": "kernel-0:4.18.0-553.el8.x86_64",
"platform": "rhel-8",
"versionRanges": [
{
"fixedVersion": null,
"introducedVersion": "0:4.18.0-553.el8",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"rangeExpression": null,
"rangeKind": "nevra"
}
],
"normalizedVersions": [],
"statuses": [
{
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"status": "fixed"
}
],
"provenance": [
{
"source": "redhat",
"kind": "enrich",
"value": "cve-2024-5678",
"decisionReason": null,
"recordedAt": "2024-05-11T09:05:00+00:00",
"fieldMask": []
},
{
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
}
]
}
],
"aliases": [
"CVE-2024-5678",
"RHSA-2024:0252"
],
"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [
{
"baseScore": 6.7,
"baseSeverity": "medium",
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:L/AC:L/PR:H/UI:N/S:U/C:H/I:H/A:H",
"version": "3.1"
}
],
"cwes": [],
"description": null,
"exploitKnown": false,
"language": "en",
"modified": "2024-05-11T08:15:00+00:00",
"provenance": [
{
"source": "redhat",
"kind": "enrich",
"value": "cve-2024-5678",
"decisionReason": null,
"recordedAt": "2024-05-11T09:05:00+00:00",
"fieldMask": []
},
{
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
}
],
"published": "2024-05-10T19:28:00+00:00",
"references": [
{
"kind": "advisory",
"provenance": {
"source": "redhat",
"kind": "map",
"value": "rhsa-2024:0252",
"decisionReason": null,
"recordedAt": "2024-05-11T09:00:00+00:00",
"fieldMask": []
},
"sourceTag": "redhat",
"summary": "Red Hat security advisory",
"url": "https://access.redhat.com/errata/RHSA-2024:0252"
}
],
"severity": "critical",
"summary": "Updates the Red Hat Enterprise Linux kernel to address CVE-2024-5678.",
"title": "Important: kernel security update"
}

View File

@@ -1,7 +1,8 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Text.Json.Nodes;
using StellaOps.Concelier.Models.Observations;
using StellaOps.Concelier.RawModels;
using Xunit;
namespace StellaOps.Concelier.Models.Tests.Observations;
@@ -33,21 +34,32 @@ public sealed class AdvisoryObservationTests
new AdvisoryObservationReference("advisory", "https://example.com/advisory")
});
var attributes = ImmutableDictionary.CreateRange(new Dictionary<string, string>
{
[" region "] = "emea",
["pipeline"] = "daily"
});
var rawLinkset = new RawLinkset
{
Aliases = ImmutableArray.Create(" Cve-2025-1234 ", "cve-2025-1234"),
PackageUrls = ImmutableArray.Create("pkg:generic/foo@1.0.0"),
Cpes = ImmutableArray.Create("cpe:/a:vendor:product:1"),
References = ImmutableArray.Create(new RawReference("ADVISORY", "https://example.com/advisory")),
ReconciledFrom = ImmutableArray.Create("pointer-1"),
Notes = ImmutableDictionary.CreateRange(new Dictionary<string, string> { ["note"] = "value" })
};
var observation = new AdvisoryObservation(
observationId: " tenant-a:CVE-2025-1234:1 ",
tenant: " Tenant-A ",
source: source,
upstream: upstream,
content: content,
linkset: linkset,
rawLinkset: rawLinkset,
createdAt: DateTimeOffset.Parse("2025-10-01T01:00:06Z"),
attributes: attributes);
Assert.Equal("tenant-a:CVE-2025-1234:1", observation.ObservationId);
Assert.Equal("tenant-a", observation.Tenant);

View File

@@ -0,0 +1,337 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging.Abstractions;
using MongoDB.Bson;
using MongoDB.Bson.Serialization;
using MongoDB.Driver;
using StellaOps.Concelier.RawModels;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Migrations;
using StellaOps.Concelier.Storage.Mongo.Observations;
using StellaOps.Concelier.Storage.Mongo.Raw;
using Xunit;
namespace StellaOps.Concelier.Storage.Mongo.Tests.Migrations;
[Collection("mongo-fixture")]
public sealed class EnsureAdvisoryObservationsRawLinksetMigrationTests
{
private readonly MongoIntegrationFixture _fixture;
public EnsureAdvisoryObservationsRawLinksetMigrationTests(MongoIntegrationFixture fixture)
{
_fixture = fixture;
}
[Fact]
public async Task ApplyAsync_BackfillsRawLinksetFromRawDocument()
{
var databaseName = $"concelier-rawlinkset-{Guid.NewGuid():N}";
var database = _fixture.Client.GetDatabase(databaseName);
await database.CreateCollectionAsync(MongoStorageDefaults.Collections.Migrations);
await database.CreateCollectionAsync(MongoStorageDefaults.Collections.AdvisoryObservations);
try
{
var rawRepository = new MongoAdvisoryRawRepository(
database,
TimeProvider.System,
NullLogger<MongoAdvisoryRawRepository>.Instance);
var rawDocument = RawDocumentFactory.CreateAdvisory(
tenant: "tenant-a",
source: new RawSourceMetadata("Vendor-X", "connector-y", "1.0.0", "stable"),
upstream: new RawUpstreamMetadata(
UpstreamId: "GHSA-2025-0001",
DocumentVersion: "v1",
RetrievedAt: DateTimeOffset.Parse("2025-10-29T12:34:56Z"),
ContentHash: "sha256:abc123",
Signature: new RawSignatureMetadata(true, "dsse", "key1", "sig1"),
Provenance: ImmutableDictionary.CreateRange(new[] { new KeyValuePair<string, string>("api", "https://example.test/api") })),
content: new RawContent(
Format: "OSV",
SpecVersion: "1.0.0",
Raw: ParseJsonElement("""{"id":"GHSA-2025-0001"}"""),
Encoding: null),
identifiers: new RawIdentifiers(
Aliases: ImmutableArray.Create("CVE-2025-0001", "cve-2025-0001"),
PrimaryId: "CVE-2025-0001"),
linkset: new RawLinkset
{
Aliases = ImmutableArray.Create("GHSA-xxxx-yyyy"),
PackageUrls = ImmutableArray.Create("pkg:npm/example@1.0.0"),
Cpes = ImmutableArray.Create("cpe:/a:example:product:1.0"),
References = ImmutableArray.Create(new RawReference("advisory", "https://example.test/advisory", "vendor")),
ReconciledFrom = ImmutableArray.Create("connector-y"),
Notes = ImmutableDictionary.CreateRange(new[] { new KeyValuePair<string, string>("range-fixed", "1.0.1") })
});
await rawRepository.UpsertAsync(rawDocument, CancellationToken.None);
var expectedRawLinkset = BuildRawLinkset(rawDocument.Identifiers, rawDocument.Linkset);
var canonicalAliases = ImmutableArray.Create("cve-2025-0001", "ghsa-xxxx-yyyy");
var canonicalPurls = rawDocument.Linkset.PackageUrls;
var canonicalCpes = rawDocument.Linkset.Cpes;
var canonicalReferences = rawDocument.Linkset.References;
var observationId = "tenant-a:vendor-x:ghsa-2025-0001:sha256-abc123";
var observationBson = BuildObservationDocument(
observationId,
rawDocument,
canonicalAliases,
canonicalPurls,
canonicalCpes,
canonicalReferences,
rawDocument.Upstream.RetrievedAt,
includeRawLinkset: false);
await database
.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations)
.InsertOneAsync(observationBson);
var migration = new EnsureAdvisoryObservationsRawLinksetMigration();
await migration.ApplyAsync(database, CancellationToken.None);
var storedBson = await database
.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations)
.Find(Builders<BsonDocument>.Filter.Eq("_id", observationId))
.FirstOrDefaultAsync();
Assert.NotNull(storedBson);
Assert.True(storedBson.TryGetValue("rawLinkset", out var rawLinksetValue));
var storedDocument = BsonSerializer.Deserialize<AdvisoryObservationDocument>(storedBson);
var storedObservation = AdvisoryObservationDocumentFactory.ToModel(storedDocument);
Assert.True(expectedRawLinkset.Aliases.SequenceEqual(storedObservation.RawLinkset.Aliases, StringComparer.Ordinal));
Assert.True(expectedRawLinkset.PackageUrls.SequenceEqual(storedObservation.RawLinkset.PackageUrls, StringComparer.Ordinal));
Assert.True(expectedRawLinkset.Cpes.SequenceEqual(storedObservation.RawLinkset.Cpes, StringComparer.Ordinal));
Assert.True(expectedRawLinkset.References.SequenceEqual(storedObservation.RawLinkset.References));
Assert.Equal(expectedRawLinkset.Notes, storedObservation.RawLinkset.Notes);
}
finally
{
await _fixture.Client.DropDatabaseAsync(databaseName);
}
}
[Fact]
public async Task ApplyAsync_ThrowsWhenRawDocumentMissing()
{
var databaseName = $"concelier-rawlinkset-missing-{Guid.NewGuid():N}";
var database = _fixture.Client.GetDatabase(databaseName);
await database.CreateCollectionAsync(MongoStorageDefaults.Collections.Migrations);
await database.CreateCollectionAsync(MongoStorageDefaults.Collections.AdvisoryObservations);
try
{
var rawDocument = RawDocumentFactory.CreateAdvisory(
tenant: "tenant-b",
source: new RawSourceMetadata("Vendor-Y", "connector-z", "2.0.0", "stable"),
upstream: new RawUpstreamMetadata(
UpstreamId: "GHSA-9999-0001",
DocumentVersion: "v2",
RetrievedAt: DateTimeOffset.Parse("2025-10-30T00:00:00Z"),
ContentHash: "sha256:def456",
Signature: new RawSignatureMetadata(false),
Provenance: ImmutableDictionary<string, string>.Empty),
content: new RawContent(
Format: "OSV",
SpecVersion: "1.0.0",
Raw: ParseJsonElement("""{"id":"GHSA-9999-0001"}"""),
Encoding: null),
identifiers: new RawIdentifiers(
Aliases: ImmutableArray<string>.Empty,
PrimaryId: "GHSA-9999-0001"),
linkset: new RawLinkset());
var observationId = "tenant-b:vendor-y:ghsa-9999-0001:sha256-def456";
var document = BuildObservationDocument(
observationId,
rawDocument,
ImmutableArray<string>.Empty,
ImmutableArray<string>.Empty,
ImmutableArray<string>.Empty,
ImmutableArray<RawReference>.Empty,
rawDocument.Upstream.RetrievedAt,
includeRawLinkset: false);
await database
.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations)
.InsertOneAsync(document);
var migration = new EnsureAdvisoryObservationsRawLinksetMigration();
await Assert.ThrowsAsync<InvalidOperationException>(
() => migration.ApplyAsync(database, CancellationToken.None));
}
finally
{
await _fixture.Client.DropDatabaseAsync(databaseName);
}
}
private static BsonDocument BuildObservationDocument(
string observationId,
AdvisoryRawDocument rawDocument,
ImmutableArray<string> canonicalAliases,
ImmutableArray<string> canonicalPurls,
ImmutableArray<string> canonicalCpes,
ImmutableArray<RawReference> canonicalReferences,
DateTimeOffset createdAt,
bool includeRawLinkset,
RawLinkset? rawLinkset = null)
{
var sourceDocument = new BsonDocument
{
{ "vendor", rawDocument.Source.Vendor },
{ "stream", string.IsNullOrWhiteSpace(rawDocument.Source.Stream) ? rawDocument.Source.Connector : rawDocument.Source.Stream! },
{ "api", rawDocument.Upstream.Provenance.TryGetValue("api", out var api) ? api : rawDocument.Source.Connector }
};
if (!string.IsNullOrWhiteSpace(rawDocument.Source.ConnectorVersion))
{
sourceDocument["collectorVersion"] = rawDocument.Source.ConnectorVersion;
}
var signatureDocument = new BsonDocument
{
{ "present", rawDocument.Upstream.Signature.Present }
};
if (!string.IsNullOrWhiteSpace(rawDocument.Upstream.Signature.Format))
{
signatureDocument["format"] = rawDocument.Upstream.Signature.Format;
}
if (!string.IsNullOrWhiteSpace(rawDocument.Upstream.Signature.KeyId))
{
signatureDocument["keyId"] = rawDocument.Upstream.Signature.KeyId;
}
if (!string.IsNullOrWhiteSpace(rawDocument.Upstream.Signature.Signature))
{
signatureDocument["signature"] = rawDocument.Upstream.Signature.Signature;
}
var upstreamDocument = new BsonDocument
{
{ "upstream_id", rawDocument.Upstream.UpstreamId },
{ "document_version", rawDocument.Upstream.DocumentVersion },
{ "fetchedAt", rawDocument.Upstream.RetrievedAt.UtcDateTime },
{ "receivedAt", rawDocument.Upstream.RetrievedAt.UtcDateTime },
{ "contentHash", rawDocument.Upstream.ContentHash },
{ "signature", signatureDocument },
{ "metadata", new BsonDocument(rawDocument.Upstream.Provenance) }
};
var contentDocument = new BsonDocument
{
{ "format", rawDocument.Content.Format },
{ "raw", BsonDocument.Parse(rawDocument.Content.Raw.GetRawText()) }
};
if (!string.IsNullOrWhiteSpace(rawDocument.Content.SpecVersion))
{
contentDocument["specVersion"] = rawDocument.Content.SpecVersion;
}
var canonicalLinkset = new BsonDocument
{
{ "aliases", new BsonArray(canonicalAliases) },
{ "purls", new BsonArray(canonicalPurls) },
{ "cpes", new BsonArray(canonicalCpes) },
{ "references", new BsonArray(canonicalReferences.Select(reference => new BsonDocument
{
{ "type", reference.Type },
{ "url", reference.Url }
})) }
};
var document = new BsonDocument
{
{ "_id", observationId },
{ "tenant", rawDocument.Tenant },
{ "source", sourceDocument },
{ "upstream", upstreamDocument },
{ "content", contentDocument },
{ "linkset", canonicalLinkset },
{ "createdAt", createdAt.UtcDateTime },
{ "attributes", new BsonDocument() }
};
if (includeRawLinkset)
{
var actualRawLinkset = rawLinkset ?? throw new ArgumentNullException(nameof(rawLinkset));
document["rawLinkset"] = new BsonDocument
{
{ "aliases", new BsonArray(actualRawLinkset.Aliases) },
{ "purls", new BsonArray(actualRawLinkset.PackageUrls) },
{ "cpes", new BsonArray(actualRawLinkset.Cpes) },
{ "references", new BsonArray(actualRawLinkset.References.Select(reference => new BsonDocument
{
{ "type", reference.Type },
{ "url", reference.Url },
{ "source", reference.Source }
})) },
{ "reconciled_from", new BsonArray(actualRawLinkset.ReconciledFrom) },
{ "notes", new BsonDocument(actualRawLinkset.Notes) }
};
}
return document;
}
private static JsonElement ParseJsonElement(string json)
{
using var document = JsonDocument.Parse(json);
return document.RootElement.Clone();
}
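// Test-side reconstruction of the expected backfill: primary id first, then
// identifier aliases, then linkset aliases; default arrays are normalised to
// empty so the comparisons above stay deterministic.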
private static RawLinkset BuildRawLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliasBuilder = ImmutableArray.CreateBuilder<string>();
if (!string.IsNullOrWhiteSpace(identifiers.PrimaryId))
{
aliasBuilder.Add(identifiers.PrimaryId);
}
if (!identifiers.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in identifiers.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
if (!linkset.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in linkset.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
static ImmutableArray<string> EnsureArray(ImmutableArray<string> values)
=> values.IsDefault ? ImmutableArray<string>.Empty : values;
static ImmutableArray<RawReference> EnsureReferences(ImmutableArray<RawReference> values)
=> values.IsDefault ? ImmutableArray<RawReference>.Empty : values;
return linkset with
{
Aliases = aliasBuilder.ToImmutable(),
PackageUrls = EnsureArray(linkset.PackageUrls),
Cpes = EnsureArray(linkset.Cpes),
References = EnsureReferences(linkset.References),
ReconciledFrom = EnsureArray(linkset.ReconciledFrom),
Notes = linkset.Notes ?? ImmutableDictionary<string, string>.Empty
};
}
}

View File

@@ -37,32 +37,48 @@ public sealed class AdvisoryObservationDocumentFactoryTests
Signature = "signature"
}
},
Content = new AdvisoryObservationContentDocument
{
Format = "CSAF",
SpecVersion = "2.0",
Raw = BsonDocument.Parse("{\"example\":true}")
},
Linkset = new AdvisoryObservationLinksetDocument
{
Aliases = new List<string> { "CVE-2025-1234" },
Purls = new List<string> { "pkg:generic/foo@1.0.0" },
Cpes = new List<string> { "cpe:/a:vendor:product:1" },
References = new List<AdvisoryObservationReferenceDocument>
{
new() { Type = "advisory", Url = "https://example.com" }
}
},
RawLinkset = new AdvisoryObservationRawLinksetDocument
{
Aliases = new List<string> { "CVE-2025-1234", "cve-2025-1234" },
PackageUrls = new List<string> { "pkg:generic/foo@1.0.0" },
Cpes = new List<string> { "cpe:/a:vendor:product:1" },
References = new List<AdvisoryObservationRawReferenceDocument>
{
new() { Type = "Advisory", Url = "https://example.com", Source = "vendor" }
},
ReconciledFrom = new List<string> { "source-a" },
Notes = new Dictionary<string, string> { ["note-key"] = "note-value" }
}
};
var observation = AdvisoryObservationDocumentFactory.ToModel(document);
Assert.Equal("tenant-a:obs-1", observation.ObservationId);
Assert.Equal("tenant-a", observation.Tenant);
Assert.Equal("CVE-2025-1234", observation.Upstream.UpstreamId);
Assert.Contains("pkg:generic/foo@1.0.0", observation.Linkset.Purls);
Assert.Equal("CSAF", observation.Content.Format);
Assert.True(observation.Content.Raw?["example"]?.GetValue<bool>());
Assert.Equal("advisory", observation.Linkset.References[0].Type);
}
}
Assert.Contains("pkg:generic/foo@1.0.0", observation.Linkset.Purls);
Assert.Equal("CSAF", observation.Content.Format);
Assert.True(observation.Content.Raw?["example"]?.GetValue<bool>());
Assert.Equal("advisory", observation.Linkset.References[0].Type);
Assert.Equal(new[] { "CVE-2025-1234", "cve-2025-1234" }, observation.RawLinkset.Aliases);
Assert.Equal("Advisory", observation.RawLinkset.References[0].Type);
Assert.Equal("vendor", observation.RawLinkset.References[0].Source);
Assert.Equal("note-value", observation.RawLinkset.Notes["note-key"]);
}
}

View File

@@ -1,53 +1,53 @@
<Project>
<PropertyGroup>
<ConcelierPluginOutputRoot Condition="'$(ConcelierPluginOutputRoot)' == ''">$(SolutionDir)StellaOps.Concelier.PluginBinaries</ConcelierPluginOutputRoot>
<ConcelierPluginOutputRoot Condition="'$(ConcelierPluginOutputRoot)' == '' and '$(SolutionDir)' == ''">$(MSBuildThisFileDirectory)StellaOps.Concelier.PluginBinaries</ConcelierPluginOutputRoot>
<AuthorityPluginOutputRoot Condition="'$(AuthorityPluginOutputRoot)' == ''">$(SolutionDir)StellaOps.Authority.PluginBinaries</AuthorityPluginOutputRoot>
<AuthorityPluginOutputRoot Condition="'$(AuthorityPluginOutputRoot)' == '' and '$(SolutionDir)' == ''">$(MSBuildThisFileDirectory)StellaOps.Authority.PluginBinaries</AuthorityPluginOutputRoot>
<IsConcelierPlugin Condition="'$(IsConcelierPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Concelier.Connector.'))">true</IsConcelierPlugin>
<IsConcelierPlugin Condition="'$(IsConcelierPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Concelier.Exporter.'))">true</IsConcelierPlugin>
<IsAuthorityPlugin Condition="'$(IsAuthorityPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Authority.Plugin.'))">true</IsAuthorityPlugin>
<NotifyPluginOutputRoot Condition="'$(NotifyPluginOutputRoot)' == '' and '$(SolutionDir)' != ''">$(SolutionDir)plugins\notify</NotifyPluginOutputRoot>
<NotifyPluginOutputRoot Condition="'$(NotifyPluginOutputRoot)' == '' and '$(SolutionDir)' == ''">$([System.IO.Path]::GetFullPath('$(MSBuildThisFileDirectory)..\plugins\notify\'))</NotifyPluginOutputRoot>
<IsNotifyPlugin Condition="'$(IsNotifyPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Notify.Connectors.')) and !$([System.String]::Copy('$(MSBuildProjectName)').EndsWith('.Tests'))">true</IsNotifyPlugin>
<IsNotifyPlugin Condition="'$(IsNotifyPlugin)' == 'true' and $([System.String]::Copy('$(MSBuildProjectName)')) == 'StellaOps.Notify.Connectors.Shared'">false</IsNotifyPlugin>
<ScannerBuildxPluginOutputRoot Condition="'$(ScannerBuildxPluginOutputRoot)' == ''">$([System.IO.Path]::GetFullPath('$(MSBuildThisFileDirectory)..\plugins\scanner\buildx\'))</ScannerBuildxPluginOutputRoot>
<IsScannerBuildxPlugin Condition="'$(IsScannerBuildxPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)')) == 'StellaOps.Scanner.Sbomer.BuildXPlugin'">true</IsScannerBuildxPlugin>
<ScannerOsAnalyzerPluginOutputRoot Condition="'$(ScannerOsAnalyzerPluginOutputRoot)' == ''">$([System.IO.Path]::GetFullPath('$(MSBuildThisFileDirectory)..\plugins\scanner\analyzers\os\'))</ScannerOsAnalyzerPluginOutputRoot>
<IsScannerOsAnalyzerPlugin Condition="'$(IsScannerOsAnalyzerPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Scanner.Analyzers.OS.')) and !$([System.String]::Copy('$(MSBuildProjectName)').EndsWith('.Tests'))">true</IsScannerOsAnalyzerPlugin>
<ScannerLangAnalyzerPluginOutputRoot Condition="'$(ScannerLangAnalyzerPluginOutputRoot)' == ''">$([System.IO.Path]::GetFullPath('$(MSBuildThisFileDirectory)..\plugins\scanner\analyzers\lang\'))</ScannerLangAnalyzerPluginOutputRoot>
<IsScannerLangAnalyzerPlugin Condition="'$(IsScannerLangAnalyzerPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Scanner.Analyzers.Lang.'))">true</IsScannerLangAnalyzerPlugin>
<UseConcelierTestInfra Condition="'$(UseConcelierTestInfra)' == ''">true</UseConcelierTestInfra>
<ConcelierTestingPath Condition="'$(ConcelierTestingPath)' == '' and Exists('$(MSBuildThisFileDirectory)StellaOps.Concelier.Testing\StellaOps.Concelier.Testing.csproj')">$(MSBuildThisFileDirectory)StellaOps.Concelier.Testing\</ConcelierTestingPath>
<ConcelierTestingPath Condition="'$(ConcelierTestingPath)' == '' and Exists('$(MSBuildThisFileDirectory)Concelier\__Libraries\StellaOps.Concelier.Testing\StellaOps.Concelier.Testing.csproj')">$(MSBuildThisFileDirectory)Concelier\__Libraries\StellaOps.Concelier.Testing\</ConcelierTestingPath>
<ConcelierSharedTestsPath Condition="'$(ConcelierSharedTestsPath)' == '' and Exists('$(MSBuildThisFileDirectory)StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs')">$(MSBuildThisFileDirectory)StellaOps.Concelier.Tests.Shared\</ConcelierSharedTestsPath>
<ConcelierSharedTestsPath Condition="'$(ConcelierSharedTestsPath)' == '' and Exists('$(MSBuildThisFileDirectory)Concelier\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs')">$(MSBuildThisFileDirectory)Concelier\StellaOps.Concelier.Tests.Shared\</ConcelierSharedTestsPath>
</PropertyGroup>
<ItemGroup>
<ProjectReference Update="../StellaOps.Plugin/StellaOps.Plugin.csproj">
<Private>false</Private>
<ExcludeAssets>runtime</ExcludeAssets>
</ProjectReference>
</ItemGroup>
<ItemGroup>
<PackageReference Update="MongoDB.Driver" Version="3.5.0" />
<PackageReference Include="SharpCompress" Version="0.41.0" />
</ItemGroup>
<ItemGroup Condition="$([System.String]::Copy('$(MSBuildProjectName)').EndsWith('.Tests')) and '$(UseConcelierTestInfra)' != 'false'">
<PackageReference Include="coverlet.collector" Version="6.0.4" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.14.0" />
<PackageReference Include="Microsoft.AspNetCore.Mvc.Testing" Version="10.0.0-rc.2.25502.107" />
<PackageReference Include="Mongo2Go" Version="4.1.0" />
<PackageReference Include="xunit" Version="2.9.2" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.8.2" />
<PackageReference Include="Microsoft.Extensions.TimeProvider.Testing" Version="9.10.0" />
<Compile Include="$(ConcelierSharedTestsPath)AssemblyInfo.cs" Link="Shared\AssemblyInfo.cs" Condition="'$(ConcelierSharedTestsPath)' != ''" />
<Compile Include="$(ConcelierSharedTestsPath)MongoFixtureCollection.cs" Link="Shared\MongoFixtureCollection.cs" Condition="'$(ConcelierSharedTestsPath)' != ''" />
<ProjectReference Include="$(ConcelierTestingPath)StellaOps.Concelier.Testing.csproj" Condition="'$(ConcelierTestingPath)' != ''" />
<Using Include="StellaOps.Concelier.Testing" />
<Using Include="Xunit" />
</ItemGroup>
</Project>

View File

@@ -1,94 +1,94 @@
# TASKS — Epic 1: Aggregation-Only Contract
> **AOC Reminder:** Excititor WebService publishes raw statements/linksets only; derived precedence/severity belongs to Policy overlays.
| ID | Status | Owner(s) | Depends on | Notes |
|---|---|---|---|---|
| EXCITITOR-WEB-AOC-19-001 `Raw VEX ingestion APIs` | TODO | Excititor WebService Guild | EXCITITOR-CORE-AOC-19-001, EXCITITOR-STORE-AOC-19-001 | Implement `POST /ingest/vex`, `GET /vex/raw*`, and `POST /aoc/verify` endpoints. Enforce Authority scopes, tenant injection, and the guard pipeline so that only immutable VEX facts are persisted (a hedged guard sketch follows this table). |
> Docs alignment (2025-10-26): See AOC reference §45 and authority scopes doc for required tokens/behaviour.
| EXCITITOR-WEB-AOC-19-002 `AOC observability + metrics` | TODO | Excititor WebService Guild, Observability Guild | EXCITITOR-WEB-AOC-19-001 | Export metrics (`ingestion_write_total`, `aoc_violation_total`, signature verification counters) and tracing spans matching Concelier naming. Ensure structured logging includes tenant, source vendor, upstream id, and content hash. |
> Docs alignment (2025-10-26): Metrics/traces/log schema in `docs/observability/observability.md`.
| EXCITITOR-WEB-AOC-19-003 `Guard + schema test harness` | TODO | QA Guild | EXCITITOR-WEB-AOC-19-001 | Add unit/integration tests for schema validation, forbidden field rejection (`ERR_AOC_001/006/007`), and supersedes behavior using CycloneDX-VEX & CSAF fixtures with deterministic expectations. |
> Docs alignment (2025-10-26): Error codes + CLI verification in `docs/modules/cli/guides/cli-reference.md`.
| EXCITITOR-WEB-AOC-19-004 `Batch ingest validation` | TODO | Excititor WebService Guild, QA Guild | EXCITITOR-WEB-AOC-19-003, EXCITITOR-CORE-AOC-19-002 | Build large fixture ingest covering mixed VEX statuses, verifying raw storage parity, metrics, and CLI `aoc verify` compatibility. Document load test/runbook updates. |
> Docs alignment (2025-10-26): Offline/air-gap workflows captured in `docs/deploy/containers.md` §5.
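
A minimal guard sketch in support of EXCITITOR-WEB-AOC-19-001/19-003. All names here (`AocIngestGuard`, `EnsureRawOnly`) are hypothetical, and the mapping of specific `ERR_AOC_*` codes to checks is an assumption; the shipped guard pipeline defines the real contract:

```csharp
using System;
using System.Text.Json;

// Illustration only: rejects derived fields before a raw VEX statement is
// persisted. Type names and the code-to-check mapping are assumptions.
public static class AocIngestGuard
{
    // Fields that belong to Policy overlays, never to raw VEX facts.
    private static readonly string[] ForbiddenFields =
    {
        "severity", "effective_status", "precedence", "merged_from"
    };

    public static void EnsureRawOnly(JsonElement statement)
    {
        foreach (var field in ForbiddenFields)
        {
            if (statement.TryGetProperty(field, out _))
            {
                // Assumed to surface as ERR_AOC_001 (forbidden field present).
                throw new InvalidOperationException(
                    $"ERR_AOC_001: forbidden derived field '{field}' in raw VEX statement.");
            }
        }

        if (!statement.TryGetProperty("tenant", out _))
        {
            // Tenant injection must happen before persistence (code assumed).
            throw new InvalidOperationException("ERR_AOC_006: missing tenant.");
        }
    }
}
```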
## Policy Engine v2
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-POLICY-20-001 `Policy selection endpoints` | TODO | Excititor WebService Guild | WEB-POLICY-20-001, EXCITITOR-CORE-AOC-19-004 | Provide VEX lookup APIs supporting PURL/advisory batching, scope filtering, and tenant enforcement with deterministic ordering + pagination. |
## StellaOps Console (Sprint 23)
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-CONSOLE-23-001 `VEX aggregation views` | TODO | Excititor WebService Guild, BE-Base Platform Guild | EXCITITOR-LNM-21-201, EXCITITOR-LNM-21-202 | Expose `/console/vex` endpoints returning grouped VEX statements per advisory/component with status chips, justification metadata, precedence trace pointers, and tenant-scoped filters for Console explorer. |
| EXCITITOR-CONSOLE-23-002 `Dashboard VEX deltas` | TODO | Excititor WebService Guild | EXCITITOR-CONSOLE-23-001, EXCITITOR-LNM-21-203 | Provide aggregated counts for VEX overrides (new, not_affected, revoked) powering Console dashboard + live status ticker; emit metrics for policy explain integration. |
| EXCITITOR-CONSOLE-23-003 `VEX search helpers` | TODO | Excititor WebService Guild | EXCITITOR-CONSOLE-23-001 | Deliver rapid-lookup endpoints for VEX by advisory/component for Console global search; ensure responses include provenance and precedence context; include caching and RBAC. |
## Graph Explorer v1
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
## Link-Not-Merge v1
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-LNM-21-201 `Observation APIs` | TODO | Excititor WebService Guild, BE-Base Platform Guild | EXCITITOR-LNM-21-001 | Add VEX observation read endpoints with filters, pagination, RBAC, and tenant scoping. |
| EXCITITOR-LNM-21-202 `Linkset APIs` | TODO | Excititor WebService Guild | EXCITITOR-LNM-21-002, EXCITITOR-LNM-21-003 | Implement linkset read/export/evidence endpoints returning correlation/conflict payloads and map errors to `ERR_AGG_*`. |
| EXCITITOR-LNM-21-203 `Event publishing` | TODO | Excititor WebService Guild, Platform Events Guild | EXCITITOR-LNM-21-005 | Publish `vex.linkset.updated` events, document the schema, and ensure idempotent delivery (a payload sketch follows this table). |
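
A hedged payload sketch for EXCITITOR-LNM-21-203; the field names and the idempotency key are assumptions for illustration, not the published schema:

```csharp
using System;
using System.Collections.Immutable;

// Assumed shape of a `vex.linkset.updated` event; the real contract is owned
// by the Platform Events work referenced above.
public sealed record VexLinksetUpdatedEvent(
    string Tenant,                           // tenant scope, injected at publish time
    string LinksetId,                        // stable linkset identifier
    string ContentHash,                      // lets consumers deduplicate for idempotency
    DateTimeOffset UpdatedAt,                // UTC timestamp for deterministic ordering
    ImmutableArray<string> ObservationIds);  // contributing raw observations
```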
## Graph & Vuln Explorer v1
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-GRAPH-24-101 `VEX summary API` | TODO | Excititor WebService Guild | EXCITITOR-GRAPH-24-001 | Provide endpoints delivering VEX status summaries per component/asset for Vuln Explorer integration. |
| EXCITITOR-GRAPH-24-102 `Evidence batch API` | TODO | Excititor WebService Guild | EXCITITOR-LNM-21-201 | Add batch VEX observation retrieval optimized for Graph overlays/tooltips. |
## VEX Lens (Sprint 30)
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-VEXLENS-30-001 `VEX evidence enrichers` | TODO | Excititor WebService Guild, VEX Lens Guild | EXCITITOR-VULN-29-001, VEXLENS-30-005 | Include issuer hints, signatures, and product trees in evidence payloads for VEX Lens; Label: VEX-Lens. |
## Vulnerability Explorer (Sprint 29)
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-VULN-29-001 `VEX key canonicalization` | TODO | Excititor WebService Guild | EXCITITOR-LNM-21-001 | Canonicalize (lossless) VEX advisory/product keys (map to `advisory_key`, capture product scopes); expose original sources in `links[]`; AOC-compliant: no merge, no derived fields, no suppression; backfill existing records. A canonicalization sketch follows this table. |
| EXCITITOR-VULN-29-002 `Evidence retrieval` | TODO | Excititor WebService Guild | EXCITITOR-VULN-29-001, VULN-API-29-003 | Provide `/vuln/evidence/vex/{advisory_key}` returning raw VEX statements filtered by tenant/product scope for Explorer evidence tabs. |
| EXCITITOR-VULN-29-004 `Observability` | TODO | Excititor WebService Guild, Observability Guild | EXCITITOR-VULN-29-001 | Add metrics/logs for VEX normalization, suppression scopes, withdrawn statements; emit events consumed by Vuln Explorer resolver. |
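As referenced in the key-canonicalization row above, a lossless-canonicalization sketch: `advisory_key` is a normalized identifier while `links[]` keeps every original source key verbatim. The CVE-preference rule and record shape are assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var result = VexKeySketch.Canonicalize(new[] { "GHSA-xxxx-yyyy-zzzz", "cve-2025-0001" });
Console.WriteLine($"{result.AdvisoryKey} <- [{string.Join(", ", result.Links)}]");
// CVE-2025-0001 <- [GHSA-xxxx-yyyy-zzzz, cve-2025-0001]

// Hypothetical shape: advisory_key is the normalized identifier, links[] keeps every
// original source key untouched (no merge, no derived fields, no suppression).
record CanonicalAdvisory(string AdvisoryKey, IReadOnlyList<string> Links);

static class VexKeySketch
{
    public static CanonicalAdvisory Canonicalize(IReadOnlyList<string> sourceKeys)
    {
        if (sourceKeys.Count == 0)
        {
            throw new ArgumentException("at least one source key required", nameof(sourceKeys));
        }
        var links = sourceKeys.ToArray(); // originals preserved untouched
        var canonical = links
            .Select(k => k.Trim().ToUpperInvariant())
            .FirstOrDefault(k => k.StartsWith("CVE-", StringComparison.Ordinal));
        return new CanonicalAdvisory(canonical ?? links[0], links);
    }
}
```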
## Advisory AI (Sprint 31)
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-AIAI-31-001 `Justification enrichment` | TODO | Excititor WebService Guild | EXCITITOR-VULN-29-001 | Expose normalized VEX justifications, product trees, and paragraph anchors for Advisory AI conflict explanations. |
| EXCITITOR-AIAI-31-002 `VEX chunk API` | TODO | Excititor WebService Guild | EXCITITOR-AIAI-31-001, VEXLENS-30-006 | Provide `/vex/evidence/chunks` endpoint returning tenant-scoped VEX statements with signature metadata and scope scores for RAG. |
| EXCITITOR-AIAI-31-003 `Telemetry` | TODO | Excititor WebService Guild, Observability Guild | EXCITITOR-AIAI-31-001 | Emit metrics/logs for VEX chunk usage, signature verification failures, and guardrail triggers. |
## Observability & Forensics (Epic 15)
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-WEB-OBS-50-001 `Telemetry adoption` | TODO | Excititor WebService Guild | TELEMETRY-OBS-50-001, EXCITITOR-OBS-50-001 | Adopt telemetry core for VEX APIs, ensure responses include trace IDs & correlation headers, and update structured logging for read endpoints. |
| EXCITITOR-WEB-OBS-51-001 `Observability health endpoints` | TODO | Excititor WebService Guild | EXCITITOR-WEB-OBS-50-001, WEB-OBS-51-001 | Implement `/obs/excititor/health` summarizing ingest/link SLOs, signature failure counts, and conflict trends for Console dashboards. |
| EXCITITOR-WEB-OBS-52-001 `Timeline streaming` | TODO | Excititor WebService Guild | EXCITITOR-WEB-OBS-50-001, TIMELINE-OBS-52-003 | Provide SSE bridge for VEX timeline events with tenant filters, pagination, and guardrails. A minimal SSE sketch follows this table. |
| EXCITITOR-WEB-OBS-53-001 `Evidence APIs` | TODO | Excititor WebService Guild, Evidence Locker Guild | EXCITITOR-OBS-53-001, EVID-OBS-53-003 | Expose `/evidence/vex/*` endpoints that fetch locker bundles, enforce scopes, and surface verification metadata. |
| EXCITITOR-WEB-OBS-54-001 `Attestation APIs` | TODO | Excititor WebService Guild | EXCITITOR-OBS-54-001, PROV-OBS-54-001 | Add `/attestations/vex/*` endpoints returning DSSE verification state, builder identity, and chain-of-custody links. |
| EXCITITOR-WEB-OBS-55-001 `Incident mode toggles` | TODO | Excititor WebService Guild, DevOps Guild | EXCITITOR-OBS-55-001, WEB-OBS-55-001 | Provide incident mode API for VEX pipelines with activation audit logs and retention override previews. |
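The SSE sketch referenced in the timeline-streaming row, as a minimal ASP.NET Core endpoint; the route, query parameter, and heartbeat payload are assumptions.

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var app = WebApplication.Create();

// Hypothetical route/parameters; the real bridge streams from the timeline store
// and enforces tenant scoping and pagination guardrails server-side.
app.MapGet("/obs/excititor/timeline", async (HttpContext ctx, string tenant) =>
{
    ctx.Response.ContentType = "text/event-stream";
    ctx.Response.Headers["Cache-Control"] = "no-cache";
    await ctx.Response.WriteAsync($"event: heartbeat\ndata: {{\"tenant\":\"{tenant}\"}}\n\n");
    await ctx.Response.Body.FlushAsync(ctx.RequestAborted);
});

app.Run();
```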
## Air-Gapped Mode (Epic 16)
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-WEB-AIRGAP-56-001 | TODO | Excititor WebService Guild | AIRGAP-IMP-58-001, EXCITITOR-AIRGAP-56-001 | Support mirror bundle registration via APIs, expose bundle provenance in VEX responses, and block external connectors in sealed mode. |
| EXCITITOR-WEB-AIRGAP-56-002 | TODO | Excititor WebService Guild, AirGap Time Guild | EXCITITOR-WEB-AIRGAP-56-001, AIRGAP-TIME-58-001 | Return VEX staleness metrics and time anchor info in API responses for Console/CLI use. |
| EXCITITOR-WEB-AIRGAP-57-001 | TODO | Excititor WebService Guild, AirGap Policy Guild | AIRGAP-POL-56-001 | Map sealed-mode violations to standardized error payload with remediation guidance. An example envelope follows this table. |
| EXCITITOR-WEB-AIRGAP-58-001 | TODO | Excititor WebService Guild, AirGap Importer Guild | EXCITITOR-WEB-AIRGAP-56-001, TIMELINE-OBS-53-001 | Emit timeline events for VEX bundle imports with bundle ID, scope, and actor metadata. |
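The example envelope referenced in the sealed-mode row; the error code, message, and remediation text are placeholders for whatever AIRGAP-POL-56-001 standardizes.

```csharp
using System;
using System.Text.Json;

var payload = new SealedModeError(
    "ERR_AIRGAP_SEALED",
    "Outbound connector blocked: this instance is running in sealed mode.",
    "Import the latest mirror bundle from the offline kit, then retry.");
Console.WriteLine(JsonSerializer.Serialize(payload));

// Hypothetical envelope; field names are placeholders, not the standardized payload.
record SealedModeError(string Code, string Message, string Remediation);
```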
## SDKs & OpenAPI (Epic 17)
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-WEB-OAS-61-001 | TODO | Excititor WebService Guild | OAS-61-001 | Implement `/.well-known/openapi` discovery endpoint with spec version metadata. |
| EXCITITOR-WEB-OAS-61-002 | TODO | Excititor WebService Guild | APIGOV-61-001 | Standardize error envelope responses and update controller/unit tests. |
| EXCITITOR-WEB-OAS-62-001 | TODO | Excititor WebService Guild | EXCITITOR-OAS-61-002 | Add curated examples for VEX observation/linkset endpoints and ensure the portal displays them. |
| EXCITITOR-WEB-OAS-63-001 | TODO | Excititor WebService Guild, API Governance Guild | APIGOV-63-001 | Emit deprecation headers and update docs for retiring VEX APIs. |
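A minimal sketch of deprecation signalling for retiring routes. `Sunset` follows RFC 8594; the `Deprecation` value format, the route, and the dates are placeholders pending the APIGOV-63-001 profile.

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var app = WebApplication.Create();

// Placeholder route and dates; the actual retirement schedule comes from API governance.
app.MapGet("/vex/legacy", (HttpContext ctx) =>
{
    ctx.Response.Headers["Deprecation"] = "true";
    ctx.Response.Headers["Sunset"] = "Sat, 28 Feb 2026 00:00:00 GMT";
    ctx.Response.Headers["Link"] = "</vex/observations>; rel=\"successor-version\"";
    return Results.Ok(new { deprecated = true });
});

app.Run();
```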

View File

@@ -2,6 +2,6 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-ATTEST-01-003 Verification suite & observability|Team Excititor Attestation|EXCITITOR-ATTEST-01-002|DOING (2025-10-22) Continuing implementation: build `IVexAttestationVerifier`, wire metrics/logging, and add regression tests. Draft plan in `EXCITITOR-ATTEST-01-003-plan.md` (2025-10-19) guides scope; updating with worknotes as progress lands.<br>2025-10-31: Verifier now tolerates duplicate source providers from AOC raw projections, downgrades offline Rekor verification to a degraded result, and enforces trusted signer registry checks with detailed diagnostics/tests.|
> Remark (2025-10-22): Added verifier implementation + metrics/tests; next steps include wiring into WebService/Worker flows and expanding negative-path coverage.

View File

@@ -64,12 +64,12 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
var stopwatch = Stopwatch.StartNew();
var diagnostics = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
var resultLabel = "valid";
var rekorState = "skipped";
var component = request.IsReverify ? "worker" : "webservice";
try
{
if (string.IsNullOrWhiteSpace(request.Envelope))
{
diagnostics["envelope.state"] = "missing";
@@ -173,13 +173,17 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
resultLabel = "degraded";
}
diagnostics["signature.state"] = "present";
if (rekorState is "offline" && resultLabel != "invalid")
{
resultLabel = "degraded";
}
return BuildResult(true);
}
catch (Exception ex)
{
diagnostics["error"] = ex.GetType().Name;
diagnostics["error.message"] = ex.Message;
resultLabel = "error";
_logger.LogError(ex, "Unexpected exception verifying attestation for export {ExportId}", request.Attestation.ExportId);
return BuildResult(false);
@@ -201,16 +205,113 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
{
diagnostics["result"] = resultLabel;
diagnostics["component"] = component;
diagnostics["rekor.state"] = rekorState;
return new VexAttestationVerification(isValid, diagnostics.ToImmutable());
}
}
private static bool TryDeserializeEnvelope(
string envelopeJson,
out DsseEnvelope envelope,
ImmutableDictionary<string, string>.Builder diagnostics)
{
diagnostics["rekor.state"] = rekorState;
return new VexAttestationVerification(isValid, diagnostics.ToImmutable());
}
}
private async ValueTask<bool> VerifySignaturesAsync(
ReadOnlyMemory<byte> payloadBytes,
IReadOnlyList<DsseSignature> signatures,
ImmutableDictionary<string, string>.Builder diagnostics,
CancellationToken cancellationToken)
{
if (signatures is null || signatures.Count == 0)
{
diagnostics["signature.state"] = "missing";
return false;
}
if (_trustedSigners.Count == 0)
{
diagnostics["signature.state"] = "skipped";
diagnostics["signature.reason"] = "trust_not_configured";
return true;
}
if (_cryptoRegistry is null)
{
diagnostics["signature.state"] = "error";
diagnostics["signature.reason"] = "registry_unavailable";
return false;
}
foreach (var signature in signatures)
{
if (string.IsNullOrWhiteSpace(signature.Signature))
{
diagnostics["signature.state"] = "error";
diagnostics["signature.reason"] = "empty_signature";
return false;
}
if (string.IsNullOrWhiteSpace(signature.KeyId))
{
diagnostics["signature.state"] = "error";
diagnostics["signature.reason"] = "missing_key_id";
return false;
}
if (!_trustedSigners.TryGetValue(signature.KeyId, out var signerOptions))
{
diagnostics["signature.state"] = "error";
diagnostics["signature.reason"] = "untrusted_key";
diagnostics["signature.keyId"] = signature.KeyId;
return false;
}
byte[] signatureBytes;
try
{
signatureBytes = Convert.FromBase64String(signature.Signature);
}
catch (FormatException)
{
diagnostics["signature.state"] = "error";
diagnostics["signature.reason"] = "invalid_signature_encoding";
diagnostics["signature.keyId"] = signature.KeyId;
return false;
}
CryptoSignerResolution resolution;
try
{
resolution = _cryptoRegistry.ResolveSigner(
CryptoCapability.Verification,
signerOptions.Algorithm,
new CryptoKeyReference(signerOptions.KeyReference, signerOptions.ProviderHint));
}
catch (Exception ex)
{
diagnostics["signature.state"] = "error";
diagnostics["signature.reason"] = "resolution_failed";
diagnostics["signature.error"] = ex.GetType().Name;
diagnostics["signature.keyId"] = signature.KeyId;
return false;
}
var verified = await resolution.Signer
.VerifyAsync(payloadBytes, signatureBytes, cancellationToken)
.ConfigureAwait(false);
if (!verified)
{
diagnostics["signature.state"] = "error";
diagnostics["signature.reason"] = "verification_failed";
diagnostics["signature.keyId"] = signature.KeyId;
return false;
}
}
diagnostics["signature.state"] = "verified";
return true;
}
private static bool TryDeserializeEnvelope(
string envelopeJson,
out DsseEnvelope envelope,
ImmutableDictionary<string, string>.Builder diagnostics)
{
try
{
envelope = JsonSerializer.Deserialize<DsseEnvelope>(envelopeJson, EnvelopeSerializerOptions)
@@ -471,19 +572,20 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
}
}
private static bool SetEquals(IReadOnlyCollection<string>? left, ImmutableArray<string> right)
{
if (left is null || left.Count == 0)
{
return right.IsDefaultOrEmpty || right.Length == 0;
}
if (right.IsDefaultOrEmpty)
{
return false;
}
var leftSet = new HashSet<string>(left, StringComparer.Ordinal);
var rightSet = new HashSet<string>(right, StringComparer.Ordinal);
return leftSet.SetEquals(rightSet);
}
}
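// Reviewer sketch (not part of the diff): standalone check of the relaxed semantics
// introduced above. A null or empty collection now matches a default/empty array, and
// both sides collapse to sets before comparing, so duplicates no longer defeat equality.
using System;
using System.Collections.Generic;
using System.Collections.Immutable;

Console.WriteLine(SetEqualsSketch(null, default)); // True
Console.WriteLine(SetEqualsSketch(new[] { "b", "a" }, ImmutableArray.Create("a", "b"))); // True
Console.WriteLine(SetEqualsSketch(new[] { "a", "a" }, ImmutableArray.Create("a"))); // True under the new rules
Console.WriteLine(SetEqualsSketch(new[] { "a" }, default)); // False

static bool SetEqualsSketch(IReadOnlyCollection<string>? left, ImmutableArray<string> right)
{
    if (left is null || left.Count == 0)
    {
        return right.IsDefaultOrEmpty || right.Length == 0;
    }
    if (right.IsDefaultOrEmpty)
    {
        return false;
    }
    var leftSet = new HashSet<string>(left, StringComparer.Ordinal);
    var rightSet = new HashSet<string>(right, StringComparer.Ordinal);
    return leftSet.SetEquals(rightSet);
}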

View File

@@ -1,365 +1,365 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Runtime.CompilerServices;
using Microsoft.Extensions.Logging;
using StellaOps.Excititor.Connectors.Abstractions;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Authentication;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Configuration;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Events;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Metadata;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.State;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Connectors.SUSE.RancherVEXHub;
public sealed class RancherHubConnector : VexConnectorBase
{
private const int MaxDigestHistory = 200;
private static readonly VexConnectorDescriptor StaticDescriptor = new(
id: "excititor:suse.rancher",
kind: VexProviderKind.Hub,
displayName: "SUSE Rancher VEX Hub")
{
Tags = ImmutableArray.Create("hub", "suse", "offline"),
};
private readonly RancherHubMetadataLoader _metadataLoader;
private readonly RancherHubEventClient _eventClient;
private readonly RancherHubCheckpointManager _checkpointManager;
private readonly RancherHubTokenProvider _tokenProvider;
private readonly IHttpClientFactory _httpClientFactory;
private readonly IEnumerable<IVexConnectorOptionsValidator<RancherHubConnectorOptions>> _validators;
private RancherHubConnectorOptions? _options;
private RancherHubMetadataResult? _metadata;
public RancherHubConnector(
RancherHubMetadataLoader metadataLoader,
RancherHubEventClient eventClient,
RancherHubCheckpointManager checkpointManager,
RancherHubTokenProvider tokenProvider,
IHttpClientFactory httpClientFactory,
ILogger<RancherHubConnector> logger,
TimeProvider timeProvider,
IEnumerable<IVexConnectorOptionsValidator<RancherHubConnectorOptions>>? validators = null)
: base(StaticDescriptor, logger, timeProvider)
{
_metadataLoader = metadataLoader ?? throw new ArgumentNullException(nameof(metadataLoader));
_eventClient = eventClient ?? throw new ArgumentNullException(nameof(eventClient));
_checkpointManager = checkpointManager ?? throw new ArgumentNullException(nameof(checkpointManager));
_tokenProvider = tokenProvider ?? throw new ArgumentNullException(nameof(tokenProvider));
_httpClientFactory = httpClientFactory ?? throw new ArgumentNullException(nameof(httpClientFactory));
_validators = validators ?? Array.Empty<IVexConnectorOptionsValidator<RancherHubConnectorOptions>>();
}
public override async ValueTask ValidateAsync(VexConnectorSettings settings, CancellationToken cancellationToken)
{
_options = VexConnectorOptionsBinder.Bind(
Descriptor,
settings,
validators: _validators);
_metadata = await _metadataLoader.LoadAsync(_options, cancellationToken).ConfigureAwait(false);
LogConnectorEvent(LogLevel.Information, "validate", "Rancher hub discovery loaded.", new Dictionary<string, object?>
{
["discoveryUri"] = _options.DiscoveryUri.ToString(),
["subscriptionUri"] = _metadata.Metadata.Subscription.EventsUri.ToString(),
["requiresAuth"] = _metadata.Metadata.Subscription.RequiresAuthentication,
["fromOffline"] = _metadata.FromOfflineSnapshot,
});
}
public override async IAsyncEnumerable<VexRawDocument> FetchAsync(VexConnectorContext context, [EnumeratorCancellation] CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
if (_options is null)
{
throw new InvalidOperationException("Connector must be validated before fetch operations.");
}
if (_metadata is null)
{
_metadata = await _metadataLoader.LoadAsync(_options, cancellationToken).ConfigureAwait(false);
}
var checkpoint = await _checkpointManager.LoadAsync(Descriptor.Id, context, cancellationToken).ConfigureAwait(false);
var digestHistory = checkpoint.Digests.ToList();
var dedupeSet = new HashSet<string>(checkpoint.Digests, StringComparer.OrdinalIgnoreCase);
var latestCursor = checkpoint.Cursor;
var latestPublishedAt = checkpoint.LastPublishedAt ?? checkpoint.EffectiveSince;
var stateChanged = false;
LogConnectorEvent(LogLevel.Information, "fetch_start", "Starting Rancher hub event ingestion.", new Dictionary<string, object?>
{
["since"] = checkpoint.EffectiveSince?.ToString("O"),
["cursor"] = checkpoint.Cursor,
["subscriptionUri"] = _metadata.Metadata.Subscription.EventsUri.ToString(),
["offline"] = checkpoint.Cursor is null && _options.PreferOfflineSnapshot,
});
await foreach (var batch in _eventClient.FetchEventBatchesAsync(
_options,
_metadata.Metadata,
checkpoint.Cursor,
checkpoint.EffectiveSince,
_metadata.Metadata.Subscription.Channels,
cancellationToken).ConfigureAwait(false))
{
LogConnectorEvent(LogLevel.Debug, "batch", "Processing Rancher hub batch.", new Dictionary<string, object?>
{
["cursor"] = batch.Cursor,
["nextCursor"] = batch.NextCursor,
["count"] = batch.Events.Length,
["offline"] = batch.FromOfflineSnapshot,
});
if (!string.IsNullOrWhiteSpace(batch.NextCursor) && !string.Equals(batch.NextCursor, latestCursor, StringComparison.Ordinal))
{
latestCursor = batch.NextCursor;
stateChanged = true;
}
else if (string.IsNullOrWhiteSpace(latestCursor) && !string.IsNullOrWhiteSpace(batch.Cursor))
{
latestCursor = batch.Cursor;
}
foreach (var record in batch.Events)
{
cancellationToken.ThrowIfCancellationRequested();
var result = await ProcessEventAsync(record, batch, context, dedupeSet, digestHistory, cancellationToken).ConfigureAwait(false);
if (result.ProcessedDocument is not null)
{
yield return result.ProcessedDocument;
stateChanged = true;
if (result.PublishedAt is { } published && (latestPublishedAt is null || published > latestPublishedAt))
{
latestPublishedAt = published;
}
}
else if (result.Quarantined)
{
stateChanged = true;
}
}
}
var trimmed = TrimHistory(digestHistory);
if (trimmed)
{
stateChanged = true;
}
if (stateChanged || !string.Equals(latestCursor, checkpoint.Cursor, StringComparison.Ordinal) || latestPublishedAt != checkpoint.LastPublishedAt)
{
await _checkpointManager.SaveAsync(
Descriptor.Id,
latestCursor,
latestPublishedAt,
digestHistory.ToImmutableArray(),
cancellationToken).ConfigureAwait(false);
}
}
public override ValueTask<VexClaimBatch> NormalizeAsync(VexRawDocument document, CancellationToken cancellationToken)
=> throw new NotSupportedException("RancherHubConnector relies on format-specific normalizers for CSAF/OpenVEX payloads.");
public RancherHubMetadata? GetCachedMetadata() => _metadata?.Metadata;
private async Task<EventProcessingResult> ProcessEventAsync(
RancherHubEventRecord record,
RancherHubEventBatch batch,
VexConnectorContext context,
HashSet<string> dedupeSet,
List<string> digestHistory,
CancellationToken cancellationToken)
{
var quarantineKey = BuildQuarantineKey(record);
if (dedupeSet.Contains(quarantineKey))
{
return EventProcessingResult.QuarantinedOnly;
}
if (record.DocumentUri is null || string.IsNullOrWhiteSpace(record.Id))
{
await QuarantineAsync(record, batch, "missing documentUri or id", context, cancellationToken).ConfigureAwait(false);
AddQuarantineDigest(quarantineKey, dedupeSet, digestHistory);
return EventProcessingResult.QuarantinedOnly;
}
var client = _httpClientFactory.CreateClient(RancherHubConnectorOptions.HttpClientName);
using var request = await CreateDocumentRequestAsync(record.DocumentUri, cancellationToken).ConfigureAwait(false);
using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead, cancellationToken).ConfigureAwait(false);
if (!response.IsSuccessStatusCode)
{
await QuarantineAsync(record, batch, $"document fetch failed ({(int)response.StatusCode} {response.StatusCode})", context, cancellationToken).ConfigureAwait(false);
AddQuarantineDigest(quarantineKey, dedupeSet, digestHistory);
return EventProcessingResult.QuarantinedOnly;
}
var contentBytes = await response.Content.ReadAsByteArrayAsync(cancellationToken).ConfigureAwait(false);
var publishedAt = record.PublishedAt ?? UtcNow();
var metadata = BuildMetadata(builder => builder
.Add("rancher.event.id", record.Id)
.Add("rancher.event.type", record.Type)
.Add("rancher.event.channel", record.Channel)
.Add("rancher.event.published", publishedAt)
.Add("rancher.event.cursor", batch.NextCursor ?? batch.Cursor)
.Add("rancher.event.offline", batch.FromOfflineSnapshot ? "true" : "false")
.Add("rancher.event.declaredDigest", record.DocumentDigest));
var format = ResolveFormat(record.DocumentFormat);
var document = CreateRawDocument(format, record.DocumentUri, contentBytes, metadata);
if (!string.IsNullOrWhiteSpace(record.DocumentDigest))
{
var declared = NormalizeDigest(record.DocumentDigest);
var computed = NormalizeDigest(document.Digest);
if (!string.Equals(declared, computed, StringComparison.OrdinalIgnoreCase))
{
await QuarantineAsync(record, batch, $"digest mismatch (declared {record.DocumentDigest}, computed {document.Digest})", context, cancellationToken).ConfigureAwait(false);
AddQuarantineDigest(quarantineKey, dedupeSet, digestHistory);
return EventProcessingResult.QuarantinedOnly;
}
}
if (!dedupeSet.Add(document.Digest))
{
return EventProcessingResult.Skipped;
}
digestHistory.Add(document.Digest);
await context.RawSink.StoreAsync(document, cancellationToken).ConfigureAwait(false);
return new EventProcessingResult(document, false, publishedAt);
}
private static bool TrimHistory(List<string> digestHistory)
{
if (digestHistory.Count <= MaxDigestHistory)
{
return false;
}
var excess = digestHistory.Count - MaxDigestHistory;
digestHistory.RemoveRange(0, excess);
return true;
}
private async Task<HttpRequestMessage> CreateDocumentRequestAsync(Uri documentUri, CancellationToken cancellationToken)
{
var request = new HttpRequestMessage(HttpMethod.Get, documentUri);
if (_metadata?.Metadata.Subscription.RequiresAuthentication ?? false)
{
var token = await _tokenProvider.GetAccessTokenAsync(_options!, cancellationToken).ConfigureAwait(false);
if (token is not null)
{
var scheme = string.IsNullOrWhiteSpace(token.TokenType) ? "Bearer" : token.TokenType;
request.Headers.Authorization = new AuthenticationHeaderValue(scheme, token.Value);
}
}
return request;
}
private async Task QuarantineAsync(
RancherHubEventRecord record,
RancherHubEventBatch batch,
string reason,
VexConnectorContext context,
CancellationToken cancellationToken)
{
var metadata = BuildMetadata(builder => builder
.Add("rancher.event.id", record.Id)
.Add("rancher.event.type", record.Type)
.Add("rancher.event.channel", record.Channel)
.Add("rancher.event.quarantine", "true")
.Add("rancher.event.error", reason)
.Add("rancher.event.cursor", batch.NextCursor ?? batch.Cursor)
.Add("rancher.event.offline", batch.FromOfflineSnapshot ? "true" : "false"));
var sourceUri = record.DocumentUri ?? _metadata?.Metadata.Subscription.EventsUri ?? _options!.DiscoveryUri;
var payload = Encoding.UTF8.GetBytes(record.RawJson);
var document = CreateRawDocument(VexDocumentFormat.Csaf, sourceUri, payload, metadata);
await context.RawSink.StoreAsync(document, cancellationToken).ConfigureAwait(false);
LogConnectorEvent(LogLevel.Warning, "quarantine", "Rancher hub event moved to quarantine.", new Dictionary<string, object?>
{
["eventId"] = record.Id ?? "(missing)",
["reason"] = reason,
});
}
private static void AddQuarantineDigest(string key, HashSet<string> dedupeSet, List<string> digestHistory)
{
if (dedupeSet.Add(key))
{
digestHistory.Add(key);
}
}
private static string BuildQuarantineKey(RancherHubEventRecord record)
{
if (!string.IsNullOrWhiteSpace(record.Id))
{
return $"quarantine:{record.Id}";
}
Span<byte> hash = stackalloc byte[32];
var bytes = Encoding.UTF8.GetBytes(record.RawJson);
if (!SHA256.TryHashData(bytes, hash, out _))
{
using var sha = SHA256.Create();
hash = sha.ComputeHash(bytes);
}
return $"quarantine:{Convert.ToHexString(hash).ToLowerInvariant()}";
}
private static string NormalizeDigest(string digest)
{
if (string.IsNullOrWhiteSpace(digest))
{
return digest;
}
var trimmed = digest.Trim();
return trimmed.StartsWith("sha256:", StringComparison.OrdinalIgnoreCase)
? trimmed.ToLowerInvariant()
: $"sha256:{trimmed.ToLowerInvariant()}";
}
private static VexDocumentFormat ResolveFormat(string? format)
{
if (string.IsNullOrWhiteSpace(format))
{
return VexDocumentFormat.Csaf;
}
return format.ToLowerInvariant() switch
{
"csaf" or "csaf_json" or "json" => VexDocumentFormat.Csaf,
"cyclonedx" or "cyclonedx_vex" => VexDocumentFormat.CycloneDx,
"openvex" => VexDocumentFormat.OpenVex,
"oci" or "oci_attestation" or "attestation" => VexDocumentFormat.OciAttestation,
_ => VexDocumentFormat.Csaf,
};
}
private sealed record EventProcessingResult(VexRawDocument? ProcessedDocument, bool Quarantined, DateTimeOffset? PublishedAt)
{
public static EventProcessingResult QuarantinedOnly { get; } = new(null, true, null);
public static EventProcessingResult Skipped { get; } = new(null, false, null);
}
}
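// Reviewer sketch (not part of the connector): the digest-mismatch quarantine path above
// depends on NormalizeDigest treating bare hex and "sha256:"-prefixed digests as the same
// value. Standalone illustration using the identical normalization rule:
using System;

Console.WriteLine(Normalize("ABC123") == Normalize("sha256:abc123")); // True
Console.WriteLine(Normalize("sha256:ABC123"));                        // sha256:abc123

static string Normalize(string digest)
{
    var trimmed = digest.Trim();
    return trimmed.StartsWith("sha256:", StringComparison.OrdinalIgnoreCase)
        ? trimmed.ToLowerInvariant()
        : $"sha256:{trimmed.ToLowerInvariant()}";
}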

View File

@@ -1,242 +1,242 @@


@@ -1,228 +1,228 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Formats.CycloneDX;
/// <summary>
/// Serialises normalized VEX claims into CycloneDX VEX documents with reconciled component references.
/// </summary>
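/// <example>
/// A minimal usage sketch (the <c>request</c> is assumed to be built upstream):
/// <code>
/// var exporter = new CycloneDxExporter();
/// await using var output = File.Create("export.cdx.json");
/// var result = await exporter.SerializeAsync(request, output, CancellationToken.None);
/// // result.Digest is the sha256 content address of the canonical JSON payload.
/// </code>
/// </example>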
public sealed class CycloneDxExporter : IVexExporter
{
public VexExportFormat Format => VexExportFormat.CycloneDx;
public VexContentAddress Digest(VexExportRequest request)
{
ArgumentNullException.ThrowIfNull(request);
var document = BuildDocument(request, out _);
var json = VexCanonicalJsonSerializer.Serialize(document);
return ComputeDigest(json);
}
public async ValueTask<VexExportResult> SerializeAsync(
VexExportRequest request,
Stream output,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(request);
ArgumentNullException.ThrowIfNull(output);
var document = BuildDocument(request, out var metadata);
var json = VexCanonicalJsonSerializer.Serialize(document);
var digest = ComputeDigest(json);
var buffer = Encoding.UTF8.GetBytes(json);
await output.WriteAsync(buffer, 0, buffer.Length, cancellationToken).ConfigureAwait(false);
return new VexExportResult(digest, buffer.LongLength, metadata);
}
private CycloneDxExportDocument BuildDocument(VexExportRequest request, out ImmutableDictionary<string, string> metadata)
{
var signature = VexQuerySignature.FromQuery(request.Query);
var signatureHash = signature.ComputeHash();
var generatedAt = request.GeneratedAt.UtcDateTime.ToString("O", CultureInfo.InvariantCulture);
var reconciliation = CycloneDxComponentReconciler.Reconcile(request.Claims);
var vulnerabilityEntries = BuildVulnerabilities(request.Claims, reconciliation.ComponentRefs);
var missingJustifications = request.Claims
.Where(static claim => claim.Status == VexClaimStatus.NotAffected && claim.Justification is null)
.Select(static claim => FormattableString.Invariant($"{claim.VulnerabilityId}:{claim.Product.Key}"))
.Distinct(StringComparer.Ordinal)
.OrderBy(static key => key, StringComparer.Ordinal)
.ToImmutableArray();
var properties = ImmutableArray.Create(new CycloneDxProperty("stellaops/querySignature", signature.Value));
metadata = BuildMetadata(signature, reconciliation.Diagnostics, generatedAt, vulnerabilityEntries.Length, reconciliation.Components.Length, missingJustifications);
var document = new CycloneDxExportDocument(
BomFormat: "CycloneDX",
SpecVersion: "1.6",
SerialNumber: FormattableString.Invariant($"urn:uuid:{BuildDeterministicGuid(signatureHash.Digest)}"),
Version: 1,
Metadata: new CycloneDxMetadata(generatedAt),
Components: reconciliation.Components,
Vulnerabilities: vulnerabilityEntries,
Properties: properties);
return document;
}
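// Each vulnerability entry carries a bom-ref of "<vulnerabilityId>#<normalized component ref>"
// so the (vulnerability, component) pair stays unique and deterministically sortable.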
private static ImmutableArray<CycloneDxVulnerabilityEntry> BuildVulnerabilities(
ImmutableArray<VexClaim> claims,
ImmutableDictionary<(string VulnerabilityId, string ProductKey), string> componentRefs)
{
var entries = ImmutableArray.CreateBuilder<CycloneDxVulnerabilityEntry>();
foreach (var claim in claims)
{
if (!componentRefs.TryGetValue((claim.VulnerabilityId, claim.Product.Key), out var componentRef))
{
continue;
}
var analysis = new CycloneDxAnalysis(
State: MapStatus(claim.Status),
Justification: claim.Justification?.ToString().ToLowerInvariant(),
Responses: null);
var affects = ImmutableArray.Create(new CycloneDxAffectEntry(componentRef));
var properties = ImmutableArray.Create(
new CycloneDxProperty("stellaops/providerId", claim.ProviderId),
new CycloneDxProperty("stellaops/documentDigest", claim.Document.Digest));
var vulnerabilityId = claim.VulnerabilityId;
var bomRef = FormattableString.Invariant($"{vulnerabilityId}#{Normalize(componentRef)}");
entries.Add(new CycloneDxVulnerabilityEntry(
Id: vulnerabilityId,
BomRef: bomRef,
Description: claim.Detail,
Analysis: analysis,
Affects: affects,
Properties: properties));
}
return entries
.ToImmutable()
.OrderBy(static entry => entry.Id, StringComparer.Ordinal)
.ThenBy(static entry => entry.BomRef, StringComparer.Ordinal)
.ToImmutableArray();
}
private static string Normalize(string value)
{
if (string.IsNullOrWhiteSpace(value))
{
return "component";
}
var builder = new StringBuilder(value.Length);
foreach (var ch in value)
{
builder.Append(char.IsLetterOrDigit(ch) ? char.ToLowerInvariant(ch) : '-');
}
var normalized = builder.ToString().Trim('-');
return string.IsNullOrEmpty(normalized) ? "component" : normalized;
}
private static string MapStatus(VexClaimStatus status)
=> status switch
{
VexClaimStatus.Affected => "affected",
VexClaimStatus.NotAffected => "not_affected",
VexClaimStatus.Fixed => "resolved",
VexClaimStatus.UnderInvestigation => "under_investigation",
_ => "unknown",
};
private static ImmutableDictionary<string, string> BuildMetadata(
VexQuerySignature signature,
ImmutableDictionary<string, string> diagnostics,
string generatedAt,
int vulnerabilityCount,
int componentCount,
ImmutableArray<string> missingJustifications)
{
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
builder["cyclonedx.querySignature"] = signature.Value;
builder["cyclonedx.generatedAt"] = generatedAt;
builder["cyclonedx.vulnerabilityCount"] = vulnerabilityCount.ToString(CultureInfo.InvariantCulture);
builder["cyclonedx.componentCount"] = componentCount.ToString(CultureInfo.InvariantCulture);
foreach (var diagnostic in diagnostics.OrderBy(static pair => pair.Key, StringComparer.Ordinal))
{
builder[$"cyclonedx.{diagnostic.Key}"] = diagnostic.Value;
}
if (!missingJustifications.IsDefaultOrEmpty)
{
builder["policy.justification_missing"] = string.Join(",", missingJustifications);
}
return builder.ToImmutable();
}
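// Derives a stable UUID from the first 16 bytes of the query-signature digest so identical
// queries yield identical serial numbers; the random-GUID fallback only triggers when the
// digest is too short to supply 16 bytes.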
private static string BuildDeterministicGuid(string digest)
{
if (string.IsNullOrWhiteSpace(digest) || digest.Length < 32)
{
return Guid.NewGuid().ToString();
}
var hex = digest[..32];
var bytes = Enumerable.Range(0, hex.Length / 2)
.Select(i => byte.Parse(hex.Substring(i * 2, 2), NumberStyles.HexNumber, CultureInfo.InvariantCulture))
.ToArray();
return new Guid(bytes).ToString();
}
private static VexContentAddress ComputeDigest(string json)
{
var bytes = Encoding.UTF8.GetBytes(json);
Span<byte> hash = stackalloc byte[SHA256.HashSizeInBytes];
SHA256.HashData(bytes, hash);
var digest = Convert.ToHexString(hash).ToLowerInvariant();
return new VexContentAddress("sha256", digest);
}
}
internal sealed record CycloneDxExportDocument(
[property: JsonPropertyName("bomFormat")] string BomFormat,
[property: JsonPropertyName("specVersion")] string SpecVersion,
[property: JsonPropertyName("serialNumber")] string SerialNumber,
[property: JsonPropertyName("version")] int Version,
[property: JsonPropertyName("metadata")] CycloneDxMetadata Metadata,
[property: JsonPropertyName("components")] ImmutableArray<CycloneDxComponentEntry> Components,
[property: JsonPropertyName("vulnerabilities")] ImmutableArray<CycloneDxVulnerabilityEntry> Vulnerabilities,
[property: JsonPropertyName("properties"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<CycloneDxProperty>? Properties);
internal sealed record CycloneDxMetadata(
[property: JsonPropertyName("timestamp")] string Timestamp);
internal sealed record CycloneDxVulnerabilityEntry(
[property: JsonPropertyName("id")] string Id,
[property: JsonPropertyName("bom-ref")] string BomRef,
[property: JsonPropertyName("description"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Description,
[property: JsonPropertyName("analysis")] CycloneDxAnalysis Analysis,
[property: JsonPropertyName("affects")] ImmutableArray<CycloneDxAffectEntry> Affects,
[property: JsonPropertyName("properties")] ImmutableArray<CycloneDxProperty> Properties);
internal sealed record CycloneDxAnalysis(
[property: JsonPropertyName("state")] string State,
[property: JsonPropertyName("justification"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Justification,
[property: JsonPropertyName("response"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<string>? Responses);
internal sealed record CycloneDxAffectEntry(
[property: JsonPropertyName("ref")] string Reference);


@@ -1,217 +1,217 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Formats.OpenVEX;
/// <summary>
/// Serializes merged VEX statements into canonical OpenVEX export documents.
/// </summary>
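/// <example>
/// A minimal usage sketch (assumes a populated <see cref="VexExportRequest"/>):
/// <code>
/// var exporter = new OpenVexExporter();
/// var digest = exporter.Digest(request); // stable content address for the query
/// await using var output = File.Create("export.openvex.json");
/// await exporter.SerializeAsync(request, output, CancellationToken.None);
/// </code>
/// </example>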
public sealed class OpenVexExporter : IVexExporter
{
public VexExportFormat Format => VexExportFormat.OpenVex;
public VexContentAddress Digest(VexExportRequest request)
{
ArgumentNullException.ThrowIfNull(request);
var document = BuildDocument(request, out _);
var json = VexCanonicalJsonSerializer.Serialize(document);
return ComputeDigest(json);
}
public async ValueTask<VexExportResult> SerializeAsync(
VexExportRequest request,
Stream output,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(request);
ArgumentNullException.ThrowIfNull(output);
var document = BuildDocument(request, out var exportMetadata);
var json = VexCanonicalJsonSerializer.Serialize(document);
var digest = ComputeDigest(json);
var buffer = Encoding.UTF8.GetBytes(json);
await output.WriteAsync(buffer, 0, buffer.Length, cancellationToken).ConfigureAwait(false);
return new VexExportResult(digest, buffer.LongLength, exportMetadata);
}
private OpenVexExportDocument BuildDocument(VexExportRequest request, out ImmutableDictionary<string, string> metadata)
{
var mergeResult = OpenVexStatementMerger.Merge(request.Claims);
var signature = VexQuerySignature.FromQuery(request.Query);
var signatureHash = signature.ComputeHash();
var generatedAt = request.GeneratedAt.UtcDateTime.ToString("O", CultureInfo.InvariantCulture);
var sourceProviders = request.Claims
.Select(static claim => claim.ProviderId)
.Distinct(StringComparer.Ordinal)
.OrderBy(static provider => provider, StringComparer.Ordinal)
.ToImmutableArray();
var statements = mergeResult.Statements
.Select(statement => MapStatement(statement))
.ToImmutableArray();
var document = new OpenVexDocumentSection(
Id: FormattableString.Invariant($"openvex:export:{signatureHash.Digest}"),
Author: "StellaOps Excititor",
Version: "1",
Created: generatedAt,
LastUpdated: generatedAt,
Profile: "stellaops-export/v1");
var metadataSection = new OpenVexExportMetadata(
generatedAt,
signature.Value,
sourceProviders,
mergeResult.Diagnostics);
metadata = BuildMetadata(signature, mergeResult, sourceProviders, generatedAt);
return new OpenVexExportDocument(document, statements, metadataSection);
}
private static ImmutableDictionary<string, string> BuildMetadata(
VexQuerySignature signature,
OpenVexMergeResult mergeResult,
ImmutableArray<string> sourceProviders,
string generatedAt)
{
var metadataBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
metadataBuilder["openvex.querySignature"] = signature.Value;
metadataBuilder["openvex.generatedAt"] = generatedAt;
metadataBuilder["openvex.statementCount"] = mergeResult.Statements.Length.ToString(CultureInfo.InvariantCulture);
metadataBuilder["openvex.providerCount"] = sourceProviders.Length.ToString(CultureInfo.InvariantCulture);
var sourceCount = mergeResult.Statements.Sum(static statement => statement.Sources.Length);
metadataBuilder["openvex.sourceCount"] = sourceCount.ToString(CultureInfo.InvariantCulture);
foreach (var diagnostic in mergeResult.Diagnostics.OrderBy(static pair => pair.Key, StringComparer.Ordinal))
{
metadataBuilder[$"openvex.diagnostic.{diagnostic.Key}"] = diagnostic.Value;
}
return metadataBuilder.ToImmutable();
}
private static OpenVexExportStatement MapStatement(OpenVexMergedStatement statement)
{
var products = ImmutableArray.Create(
new OpenVexExportProduct(
Id: statement.Product.Key,
Name: statement.Product.Name ?? statement.Product.Key,
Version: statement.Product.Version,
Purl: statement.Product.Purl,
Cpe: statement.Product.Cpe));
var sources = statement.Sources
.Select(source => new OpenVexExportSource(
Provider: source.ProviderId,
Status: source.Status.ToString().ToLowerInvariant(),
Justification: source.Justification?.ToString().ToLowerInvariant(),
DocumentDigest: source.DocumentDigest,
SourceUri: source.DocumentSource.ToString(),
Detail: source.Detail,
FirstObserved: source.FirstSeen.UtcDateTime.ToString("O", CultureInfo.InvariantCulture),
LastObserved: source.LastSeen.UtcDateTime.ToString("O", CultureInfo.InvariantCulture)))
.ToImmutableArray();
var statementId = FormattableString.Invariant($"{statement.VulnerabilityId}#{NormalizeProductKey(statement.Product.Key)}");
return new OpenVexExportStatement(
Id: statementId,
Vulnerability: statement.VulnerabilityId,
Status: statement.Status.ToString().ToLowerInvariant(),
Justification: statement.Justification?.ToString().ToLowerInvariant(),
Timestamp: statement.FirstObserved.UtcDateTime.ToString("O", CultureInfo.InvariantCulture),
LastUpdated: statement.LastObserved.UtcDateTime.ToString("O", CultureInfo.InvariantCulture),
Products: products,
Statement: statement.Detail,
Sources: sources);
}
private static string NormalizeProductKey(string key)
{
if (string.IsNullOrWhiteSpace(key))
{
return "unknown";
}
var builder = new StringBuilder(key.Length);
foreach (var ch in key)
{
builder.Append(char.IsLetterOrDigit(ch) ? char.ToLowerInvariant(ch) : '-');
}
var normalized = builder.ToString().Trim('-');
return string.IsNullOrEmpty(normalized) ? "unknown" : normalized;
}
private static VexContentAddress ComputeDigest(string json)
{
var bytes = Encoding.UTF8.GetBytes(json);
Span<byte> hash = stackalloc byte[SHA256.HashSizeInBytes];
SHA256.HashData(bytes, hash);
var digest = Convert.ToHexString(hash).ToLowerInvariant();
return new VexContentAddress("sha256", digest);
}
}
internal sealed record OpenVexExportDocument(
OpenVexDocumentSection Document,
ImmutableArray<OpenVexExportStatement> Statements,
OpenVexExportMetadata Metadata);
internal sealed record OpenVexDocumentSection(
[property: JsonPropertyName("@context")] string Context = "https://openvex.dev/ns/v0.2",
[property: JsonPropertyName("id")] string Id = "",
[property: JsonPropertyName("author")] string Author = "",
[property: JsonPropertyName("version")] string Version = "1",
[property: JsonPropertyName("created")] string Created = "",
[property: JsonPropertyName("last_updated")] string LastUpdated = "",
[property: JsonPropertyName("profile")] string Profile = "");
internal sealed record OpenVexExportStatement(
[property: JsonPropertyName("id")] string Id,
[property: JsonPropertyName("vulnerability")] string Vulnerability,
[property: JsonPropertyName("status")] string Status,
[property: JsonPropertyName("justification"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Justification,
[property: JsonPropertyName("timestamp")] string Timestamp,
[property: JsonPropertyName("last_updated")] string LastUpdated,
[property: JsonPropertyName("products")] ImmutableArray<OpenVexExportProduct> Products,
[property: JsonPropertyName("statement"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Statement,
[property: JsonPropertyName("sources")] ImmutableArray<OpenVexExportSource> Sources);
internal sealed record OpenVexExportProduct(
[property: JsonPropertyName("id")] string Id,
[property: JsonPropertyName("name")] string Name,
[property: JsonPropertyName("version"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Version,
[property: JsonPropertyName("purl"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Purl,
[property: JsonPropertyName("cpe"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Cpe);
internal sealed record OpenVexExportSource(
[property: JsonPropertyName("provider")] string Provider,
[property: JsonPropertyName("status")] string Status,
[property: JsonPropertyName("justification"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Justification,
[property: JsonPropertyName("document_digest")] string DocumentDigest,
[property: JsonPropertyName("source_uri")] string SourceUri,
[property: JsonPropertyName("detail"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Detail,
[property: JsonPropertyName("first_observed")] string FirstObserved,
[property: JsonPropertyName("last_observed")] string LastObserved);
internal sealed record OpenVexExportMetadata(
[property: JsonPropertyName("generated_at")] string GeneratedAt,
[property: JsonPropertyName("query_signature")] string QuerySignature,
[property: JsonPropertyName("source_providers")] ImmutableArray<string> SourceProviders,
[property: JsonPropertyName("diagnostics")] ImmutableDictionary<string, string> Diagnostics);


@@ -1,282 +1,282 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Formats.OpenVEX;
/// <summary>
/// Provides deterministic merging utilities for OpenVEX statements derived from normalized VEX claims.
/// </summary>
public static class OpenVexStatementMerger
{
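// Conflict resolution: when providers disagree, the riskiest status wins
// (affected > under_investigation > fixed > not_affected).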
private static readonly ImmutableDictionary<VexClaimStatus, int> StatusRiskPrecedence = new Dictionary<VexClaimStatus, int>
{
[VexClaimStatus.Affected] = 3,
[VexClaimStatus.UnderInvestigation] = 2,
[VexClaimStatus.Fixed] = 1,
[VexClaimStatus.NotAffected] = 0,
}.ToImmutableDictionary();
public static OpenVexMergeResult Merge(IEnumerable<VexClaim> claims)
{
ArgumentNullException.ThrowIfNull(claims);
var statements = new List<OpenVexMergedStatement>();
var diagnostics = new Dictionary<string, SortedSet<string>>(StringComparer.Ordinal);
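// Claims are grouped by (vulnerability, product key) and ordered by provider id and
// document digest so the merged output is independent of input order.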
foreach (var group in claims
.Where(static claim => claim is not null)
.GroupBy(static claim => (claim.VulnerabilityId, claim.Product.Key)))
{
var orderedClaims = group
.OrderBy(static claim => claim.ProviderId, StringComparer.Ordinal)
.ThenBy(static claim => claim.Document.Digest, StringComparer.Ordinal)
.ToImmutableArray();
if (orderedClaims.IsDefaultOrEmpty)
{
continue;
}
var mergedProduct = MergeProduct(orderedClaims);
var sources = BuildSources(orderedClaims);
var firstSeen = orderedClaims.Min(static claim => claim.FirstSeen);
var lastSeen = orderedClaims.Max(static claim => claim.LastSeen);
var statusSet = orderedClaims
.Select(static claim => claim.Status)
.Distinct()
.ToArray();
if (statusSet.Length > 1)
{
AddDiagnostic(
diagnostics,
"openvex.status_conflict",
FormattableString.Invariant($"{group.Key.VulnerabilityId}:{group.Key.Key}={string.Join('|', statusSet.Select(static status => status.ToString().ToLowerInvariant()))}"));
}
var canonicalStatus = SelectCanonicalStatus(statusSet);
var justification = SelectJustification(canonicalStatus, orderedClaims, diagnostics, group.Key);
if (canonicalStatus == VexClaimStatus.NotAffected && justification is null)
{
AddDiagnostic(
diagnostics,
"policy.justification_missing",
FormattableString.Invariant($"{group.Key.VulnerabilityId}:{group.Key.Key}"));
}
var detail = BuildDetail(orderedClaims);
statements.Add(new OpenVexMergedStatement(
group.Key.VulnerabilityId,
mergedProduct,
canonicalStatus,
justification,
detail,
sources,
firstSeen,
lastSeen));
}
var orderedStatements = statements
.OrderBy(static statement => statement.VulnerabilityId, StringComparer.Ordinal)
.ThenBy(static statement => statement.Product.Key, StringComparer.Ordinal)
.ToImmutableArray();
var orderedDiagnostics = diagnostics.Count == 0
? ImmutableDictionary<string, string>.Empty
: diagnostics.ToImmutableDictionary(
static pair => pair.Key,
pair => string.Join(",", pair.Value.OrderBy(static entry => entry, StringComparer.Ordinal)),
StringComparer.Ordinal);
return new OpenVexMergeResult(orderedStatements, orderedDiagnostics);
}
private static VexClaimStatus SelectCanonicalStatus(IReadOnlyCollection<VexClaimStatus> statuses)
{
if (statuses.Count == 0)
{
return VexClaimStatus.UnderInvestigation;
}
return statuses
.OrderByDescending(static status => StatusRiskPrecedence.GetValueOrDefault(status, -1))
.ThenBy(static status => status.ToString(), StringComparer.Ordinal)
.First();
}
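// Justifications are drawn from claims carrying the canonical status (falling back to all
// claims when none match); ties are broken by ordinal name ordering to keep merges deterministic.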
private static VexJustification? SelectJustification(
VexClaimStatus canonicalStatus,
ImmutableArray<VexClaim> claims,
IDictionary<string, SortedSet<string>> diagnostics,
(string Vulnerability, string ProductKey) groupKey)
{
var relevantClaims = claims
.Where(claim => claim.Status == canonicalStatus)
.ToArray();
if (relevantClaims.Length == 0)
{
relevantClaims = claims.ToArray();
}
var justifications = relevantClaims
.Select(static claim => claim.Justification)
.Where(static justification => justification is not null)
.Cast<VexJustification>()
.Distinct()
.ToArray();
if (justifications.Length == 0)
{
return null;
}
if (justifications.Length > 1)
{
AddDiagnostic(
diagnostics,
"openvex.justification_conflict",
FormattableString.Invariant($"{groupKey.Vulnerability}:{groupKey.ProductKey}={string.Join('|', justifications.Select(static justification => justification.ToString().ToLowerInvariant()))}"));
}
return justifications
.OrderBy(static justification => justification.ToString(), StringComparer.Ordinal)
.First();
}
private static string? BuildDetail(ImmutableArray<VexClaim> claims)
{
var details = claims
.Select(static claim => claim.Detail)
.Where(static detail => !string.IsNullOrWhiteSpace(detail))
.Select(static detail => detail!.Trim())
.Distinct(StringComparer.Ordinal)
.ToArray();
if (details.Length == 0)
{
return null;
}
return string.Join("; ", details.OrderBy(static detail => detail, StringComparer.Ordinal));
}
private static ImmutableArray<OpenVexSourceEntry> BuildSources(ImmutableArray<VexClaim> claims)
{
var builder = ImmutableArray.CreateBuilder<OpenVexSourceEntry>(claims.Length);
foreach (var claim in claims)
{
builder.Add(new OpenVexSourceEntry(
claim.ProviderId,
claim.Status,
claim.Justification,
claim.Document.Digest,
claim.Document.SourceUri,
claim.Detail,
claim.FirstSeen,
claim.LastSeen));
}
return builder
.ToImmutable()
.OrderBy(static source => source.ProviderId, StringComparer.Ordinal)
.ThenBy(static source => source.DocumentDigest, StringComparer.Ordinal)
.ToImmutableArray();
}
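// Product fields merge with a longest-value-wins rule for name/version and a
// case-insensitive ordinal-first pick for purl/cpe, keeping output stable across runs.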
private static VexProduct MergeProduct(ImmutableArray<VexClaim> claims)
{
var key = claims[0].Product.Key;
var names = claims
.Select(static claim => claim.Product.Name)
.Where(static name => !string.IsNullOrWhiteSpace(name))
.Select(static name => name!)
.Distinct(StringComparer.Ordinal)
.ToArray();
var versions = claims
.Select(static claim => claim.Product.Version)
.Where(static version => !string.IsNullOrWhiteSpace(version))
.Select(static version => version!)
.Distinct(StringComparer.Ordinal)
.ToArray();
var purls = claims
.Select(static claim => claim.Product.Purl)
.Where(static purl => !string.IsNullOrWhiteSpace(purl))
.Select(static purl => purl!)
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
var cpes = claims
.Select(static claim => claim.Product.Cpe)
.Where(static cpe => !string.IsNullOrWhiteSpace(cpe))
.Select(static cpe => cpe!)
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
var identifiers = claims
.SelectMany(static claim => claim.Product.ComponentIdentifiers)
.Where(static identifier => !string.IsNullOrWhiteSpace(identifier))
.Select(static identifier => identifier!)
.Distinct(StringComparer.OrdinalIgnoreCase)
.OrderBy(static identifier => identifier, StringComparer.OrdinalIgnoreCase)
.ToImmutableArray();
return new VexProduct(
key,
names.Length == 0 ? claims[0].Product.Name : names.OrderByDescending(static name => name.Length).ThenBy(static name => name, StringComparer.Ordinal).First(),
versions.Length == 0 ? claims[0].Product.Version : versions.OrderByDescending(static version => version.Length).ThenBy(static version => version, StringComparer.Ordinal).First(),
purls.Length == 0 ? claims[0].Product.Purl : purls.OrderBy(static purl => purl, StringComparer.OrdinalIgnoreCase).First(),
cpes.Length == 0 ? claims[0].Product.Cpe : cpes.OrderBy(static cpe => cpe, StringComparer.OrdinalIgnoreCase).First(),
identifiers);
}
private static void AddDiagnostic(
IDictionary<string, SortedSet<string>> diagnostics,
string code,
string value)
{
if (!diagnostics.TryGetValue(code, out var entries))
{
entries = new SortedSet<string>(StringComparer.Ordinal);
diagnostics[code] = entries;
}
entries.Add(value);
}
}
public sealed record OpenVexMergeResult(
ImmutableArray<OpenVexMergedStatement> Statements,
ImmutableDictionary<string, string> Diagnostics);
public sealed record OpenVexMergedStatement(
string VulnerabilityId,
VexProduct Product,
VexClaimStatus Status,
VexJustification? Justification,
string? Detail,
ImmutableArray<OpenVexSourceEntry> Sources,
DateTimeOffset FirstObserved,
DateTimeOffset LastObserved);
public sealed record OpenVexSourceEntry(
string ProviderId,
VexClaimStatus Status,
VexJustification? Justification,
string DocumentDigest,
Uri DocumentSource,
string? Detail,
DateTimeOffset FirstSeen,
DateTimeOffset LastSeen)
{
public string DocumentDigest { get; } = string.IsNullOrWhiteSpace(DocumentDigest)
? throw new ArgumentException("Document digest must be provided.", nameof(DocumentDigest))
: DocumentDigest.Trim();
}


@@ -1,87 +1,87 @@
using System;
using System.Collections.Immutable;
using System.Linq;
namespace StellaOps.Excititor.Policy;
public interface IVexPolicyDiagnostics
{
VexPolicyDiagnosticsReport GetDiagnostics();
}
public sealed record VexPolicyDiagnosticsReport(
string Version,
string RevisionId,
string Digest,
int ErrorCount,
int WarningCount,
DateTimeOffset GeneratedAt,
ImmutableArray<VexPolicyIssue> Issues,
ImmutableArray<string> Recommendations,
ImmutableDictionary<string, double> ActiveOverrides);
public sealed class VexPolicyDiagnostics : IVexPolicyDiagnostics
{
private readonly IVexPolicyProvider _policyProvider;
private readonly TimeProvider _timeProvider;
public VexPolicyDiagnostics(
IVexPolicyProvider policyProvider,
TimeProvider? timeProvider = null)
{
_policyProvider = policyProvider ?? throw new ArgumentNullException(nameof(policyProvider));
_timeProvider = timeProvider ?? TimeProvider.System;
}
public VexPolicyDiagnosticsReport GetDiagnostics()
{
var snapshot = _policyProvider.GetSnapshot();
var issues = snapshot.Issues;
var errorCount = issues.Count(static issue => issue.Severity == VexPolicyIssueSeverity.Error);
var warningCount = issues.Count(static issue => issue.Severity == VexPolicyIssueSeverity.Warning);
var overrides = snapshot.ConsensusOptions.ProviderOverrides
.OrderBy(static pair => pair.Key, StringComparer.Ordinal)
.ToImmutableDictionary();
var recommendations = BuildRecommendations(errorCount, warningCount, overrides);
return new VexPolicyDiagnosticsReport(
snapshot.Version,
snapshot.RevisionId,
snapshot.Digest,
errorCount,
warningCount,
_timeProvider.GetUtcNow(),
issues,
recommendations,
overrides);
}
private static ImmutableArray<string> BuildRecommendations(
int errorCount,
int warningCount,
ImmutableDictionary<string, double> overrides)
{
var messages = ImmutableArray.CreateBuilder<string>();
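// Emit guidance in a stable order: errors, warnings, active overrides, then the documentation pointer.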
if (errorCount > 0)
{
messages.Add("Resolve policy errors before running consensus; defaults are used while errors persist.");
}
if (warningCount > 0)
{
messages.Add("Review policy warnings via CLI/Web diagnostics and adjust configuration as needed.");
}
if (overrides.Count > 0)
{
messages.Add($"Provider overrides active for: {string.Join(", ", overrides.Keys)}.");
}
messages.Add("Refer to docs/modules/excititor/architecture.md for policy upgrade and diagnostics guidance.");
return messages.ToImmutable();
}
}

View File

@@ -1,19 +1,19 @@
<?xml version='1.0' encoding='utf-8'?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<UseConcelierTestInfra>false</UseConcelierTestInfra>
</PropertyGroup>
<ItemGroup>
<Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs" />
<Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\MongoFixtureCollection.cs" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="../../__Libraries/StellaOps.Excititor.Attestation/StellaOps.Excititor.Attestation.csproj" />
<ProjectReference Include="../../__Libraries/StellaOps.Excititor.Core/StellaOps.Excititor.Core.csproj" />
</ItemGroup>
</Project>

View File

@@ -1,54 +1,57 @@
using System.Collections.Immutable;
using System.Text;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using Microsoft.IdentityModel.Tokens;
using StellaOps.Cryptography;
using StellaOps.Excititor.Attestation.Dsse;
using StellaOps.Excititor.Attestation.Signing;
using StellaOps.Excititor.Attestation.Transparency;
using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Attestation.Tests;
public sealed class VexAttestationVerifierTests : IDisposable
{
private readonly VexAttestationMetrics _metrics = new();
[Fact]
public async Task VerifyAsync_ReturnsValid_WhenEnvelopeMatches()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("valid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenDigestMismatch()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var tamperedMetadata = new VexAttestationMetadata(
metadata.PredicateType,
metadata.Rekor,
"sha256:deadbeef",
metadata.SignedAt);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, tamperedMetadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("invalid", verification.Diagnostics["result"]);
Assert.Equal("sha256:deadbeef", verification.Diagnostics["metadata.envelopeDigest"]);
}
[Fact]
namespace StellaOps.Excititor.Attestation.Tests;
public sealed class VexAttestationVerifierTests : IDisposable
{
private readonly VexAttestationMetrics _metrics = new();
[Fact]
public async Task VerifyAsync_ReturnsValid_WhenEnvelopeMatches()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("valid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenDigestMismatch()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var tamperedMetadata = new VexAttestationMetadata(
metadata.PredicateType,
metadata.Rekor,
"sha256:deadbeef",
metadata.SignedAt);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, tamperedMetadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("invalid", verification.Diagnostics["result"]);
Assert.Equal("sha256:deadbeef", verification.Diagnostics["metadata.envelopeDigest"]);
}
[Fact]
public async Task VerifyAsync_AllowsOfflineTransparency_WhenConfigured()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: true);
@@ -65,48 +68,122 @@ public sealed class VexAttestationVerifierTests : IDisposable
Assert.True(verification.IsValid);
Assert.Equal("offline", verification.Diagnostics["rekor.state"]);
Assert.Equal("degraded", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenTransparencyRequiredAndMissing()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: false);
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = true;
options.AllowOfflineTransparency = false;
});
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("missing", verification.Diagnostics["rekor.state"]);
Assert.Equal("invalid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenTransparencyUnavailableAndOfflineDisallowed()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: true);
var transparency = new ThrowingTransparencyLogClient();
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = true;
options.AllowOfflineTransparency = false;
}, transparency);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("unreachable", verification.Diagnostics["rekor.state"]);
Assert.Equal("invalid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_HandlesDuplicateSourceProviders()
{
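// Assumption: signing normalises duplicate source providers, so verifying against the deduplicated request should still pass.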
var (request, metadata, envelope) = await CreateSignedAttestationAsync(
includeRekor: false,
sourceProviders: ImmutableArray.Create("provider-a", "provider-a"));
var normalizedRequest = request with { SourceProviders = ImmutableArray.Create("provider-a") };
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(normalizedRequest, metadata, envelope),
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("valid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsValid_WhenTrustedSignerConfigured()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: false);
var registry = new StubCryptoProviderRegistry(success: true);
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = false;
options.RequireSignatureVerification = true;
options.TrustedSigners = ImmutableDictionary<string, VexAttestationVerificationOptions.TrustedSignerOptions>.Empty.Add(
"key",
new VexAttestationVerificationOptions.TrustedSignerOptions
{
Algorithm = StubCryptoProviderRegistry.Algorithm,
KeyReference = StubCryptoProviderRegistry.KeyReference
});
}, transparency: null, registry: registry);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("verified", verification.Diagnostics["signature.state"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenSignatureFailsAndRequired()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: false);
var registry = new StubCryptoProviderRegistry(success: false);
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = false;
options.RequireSignatureVerification = true;
options.TrustedSigners = ImmutableDictionary<string, VexAttestationVerificationOptions.TrustedSignerOptions>.Empty.Add(
"key",
new VexAttestationVerificationOptions.TrustedSignerOptions
{
Algorithm = StubCryptoProviderRegistry.Algorithm,
KeyReference = StubCryptoProviderRegistry.KeyReference
});
}, transparency: null, registry: registry);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("error", verification.Diagnostics["signature.state"]);
Assert.Equal("verification_failed", verification.Diagnostics["signature.reason"]);
}
private async Task<(VexAttestationRequest Request, VexAttestationMetadata Metadata, string Envelope)> CreateSignedAttestationAsync(
bool includeRekor = false,
ImmutableArray<string>? sourceProviders = null)
{
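// Build a DSSE envelope with the fake signer; callers opt into Rekor metadata or custom source providers as needed.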
var signer = new FakeSigner();
var builder = new VexDsseBuilder(signer, NullLogger<VexDsseBuilder>.Instance);
@@ -115,13 +192,14 @@ public sealed class VexAttestationVerifierTests : IDisposable
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var client = new VexAttestationClient(builder, options, NullLogger<VexAttestationClient>.Instance, verifier, transparency);
var providers = sourceProviders ?? ImmutableArray.Create("provider-a");
var request = new VexAttestationRequest(
ExportId: "exports/unit-test",
QuerySignature: new VexQuerySignature("filters"),
Artifact: new VexContentAddress("sha256", "cafebabe"),
Format: VexExportFormat.Json,
CreatedAt: DateTimeOffset.UtcNow,
SourceProviders: providers,
Metadata: ImmutableDictionary<string, string>.Empty);
var response = await client.SignAsync(request, CancellationToken.None);
@@ -129,7 +207,10 @@ public sealed class VexAttestationVerifierTests : IDisposable
return (request, response.Attestation, envelope);
}
private VexAttestationVerifier CreateVerifier(
Action<VexAttestationVerificationOptions>? configureOptions = null,
ITransparencyLogClient? transparency = null,
ICryptoProviderRegistry? registry = null)
{
var options = new VexAttestationVerificationOptions();
configureOptions?.Invoke(options);
@@ -137,7 +218,8 @@ public sealed class VexAttestationVerifierTests : IDisposable
NullLogger<VexAttestationVerifier>.Instance,
transparency,
Options.Create(options),
_metrics,
registry);
}
public void Dispose()
@@ -147,25 +229,93 @@ public sealed class VexAttestationVerifierTests : IDisposable
private sealed class FakeSigner : IVexSigner
{
internal static readonly string SignatureBase64 = Convert.ToBase64String(Encoding.UTF8.GetBytes("signature"));
public ValueTask<VexSignedPayload> SignAsync(ReadOnlyMemory<byte> payload, CancellationToken cancellationToken)
=> ValueTask.FromResult(new VexSignedPayload("signature", "key"));
=> ValueTask.FromResult(new VexSignedPayload(SignatureBase64, "key"));
}
private sealed class FakeTransparencyLogClient : ITransparencyLogClient
{
public ValueTask<TransparencyLogEntry> SubmitAsync(DsseEnvelope envelope, CancellationToken cancellationToken)
=> ValueTask.FromResult(new TransparencyLogEntry(Guid.NewGuid().ToString(), "https://rekor.example/entries/123", "42", null));
public ValueTask<bool> VerifyAsync(string entryLocation, CancellationToken cancellationToken)
=> ValueTask.FromResult(true);
}
private sealed class ThrowingTransparencyLogClient : ITransparencyLogClient
{
public ValueTask<TransparencyLogEntry> SubmitAsync(DsseEnvelope envelope, CancellationToken cancellationToken)
=> throw new NotSupportedException();
public ValueTask<bool> VerifyAsync(string entryLocation, CancellationToken cancellationToken)
=> throw new HttpRequestException("rekor unavailable");
}
private sealed class StubCryptoProviderRegistry : ICryptoProviderRegistry
{
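// Minimal registry that resolves a single stub signer; the `success` flag controls whether verification passes.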
public const string Algorithm = "ed25519";
public const string KeyReference = "stub-key";
private readonly StubCryptoSigner _signer;
private readonly IReadOnlyCollection<ICryptoProvider> _providers = Array.Empty<ICryptoProvider>();
public StubCryptoProviderRegistry(bool success)
{
_signer = new StubCryptoSigner("key", Algorithm, success);
}
public IReadOnlyCollection<ICryptoProvider> Providers => _providers;
public bool TryResolve(string preferredProvider, out ICryptoProvider provider)
{
provider = null!;
return false;
}
public ICryptoProvider ResolveOrThrow(CryptoCapability capability, string algorithmId)
=> throw new NotSupportedException();
public CryptoSignerResolution ResolveSigner(
CryptoCapability capability,
string algorithmId,
CryptoKeyReference keyReference,
string? preferredProvider = null)
{
if (!string.Equals(keyReference.KeyId, _signer.KeyId, StringComparison.Ordinal))
{
throw new InvalidOperationException($"Unknown key '{keyReference.KeyId}'.");
}
return new CryptoSignerResolution(_signer, "stub");
}
}
private sealed class StubCryptoSigner : ICryptoSigner
{
private readonly bool _success;
private readonly byte[] _expectedSignature;
public StubCryptoSigner(string keyId, string algorithmId, bool success)
{
KeyId = keyId;
AlgorithmId = algorithmId;
_success = success;
_expectedSignature = Convert.FromBase64String(FakeSigner.SignatureBase64);
}
public string KeyId { get; }
public string AlgorithmId { get; }
public ValueTask<byte[]> SignAsync(ReadOnlyMemory<byte> data, CancellationToken cancellationToken = default)
=> throw new NotSupportedException();
public ValueTask<bool> VerifyAsync(ReadOnlyMemory<byte> data, ReadOnlyMemory<byte> signature, CancellationToken cancellationToken = default)
=> ValueTask.FromResult(_success && signature.Span.SequenceEqual(_expectedSignature));
public JsonWebKey ExportPublicJsonWebKey()
=> new JsonWebKey();
}
}

View File

@@ -1,22 +1,22 @@
<?xml version='1.0' encoding='utf-8'?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<UseConcelierTestInfra>false</UseConcelierTestInfra>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="FluentAssertions" Version="6.12.0" />
<PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="20.0.28" />
</ItemGroup>
<ItemGroup>
<Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs" />
<Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\MongoFixtureCollection.cs" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="../../__Libraries/StellaOps.Excititor.Connectors.Cisco.CSAF/StellaOps.Excititor.Connectors.Cisco.CSAF.csproj" />
</ItemGroup>
</Project>

View File

@@ -1,429 +1,429 @@
using System.Collections.Immutable;
using System.Globalization;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;
using FluentAssertions;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Excititor.Connectors.Abstractions;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Configuration;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Events;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Metadata;
using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.State;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Storage.Mongo;
using Xunit;
namespace StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Tests.Connectors;
public sealed class RancherHubConnectorTests
{
[Fact]
public async Task FetchAsync_OfflineSnapshot_StoresDocumentAndUpdatesCheckpoint()
{
using var fixture = await ConnectorFixture.CreateAsync();
var sink = new InMemoryRawSink();
var context = fixture.CreateContext(sink);
var documents = await CollectAsync(fixture.Connector.FetchAsync(context, CancellationToken.None));
documents.Should().HaveCount(1);
var document = documents[0];
document.Digest.Should().Be(fixture.ExpectedDocumentDigest);
document.Metadata.Should().ContainKey("rancher.event.id").WhoseValue.Should().Be("evt-1");
document.Metadata.Should().ContainKey("rancher.event.cursor").WhoseValue.Should().Be("cursor-2");
sink.Documents.Should().HaveCount(1);
var state = fixture.StateRepository.State;
state.Should().NotBeNull();
state!.LastUpdated.Should().Be(DateTimeOffset.Parse("2025-10-19T12:00:00Z", CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal));
state.DocumentDigests.Should().Contain(fixture.ExpectedDocumentDigest);
state.DocumentDigests.Should().Contain("checkpoint:cursor-2");
state.DocumentDigests.Count.Should().BeLessOrEqualTo(ConnectorFixture.MaxDigestHistory + 1);
}
[Fact]
public async Task FetchAsync_WhenDocumentDownloadFails_QuarantinesEvent()
{
using var fixture = await ConnectorFixture.CreateAsync();
fixture.Handler.SetRoute(fixture.DocumentUri, () => new HttpResponseMessage(HttpStatusCode.InternalServerError));
var sink = new InMemoryRawSink();
var context = fixture.CreateContext(sink);
var documents = await CollectAsync(fixture.Connector.FetchAsync(context, CancellationToken.None));
documents.Should().BeEmpty();
sink.Documents.Should().HaveCount(1);
var quarantined = sink.Documents[0];
quarantined.Metadata.Should().Contain("rancher.event.quarantine", "true");
quarantined.Metadata.Should().ContainKey("rancher.event.error").WhoseValue.Should().Contain("document fetch failed");
var state = fixture.StateRepository.State;
state.Should().NotBeNull();
state!.DocumentDigests.Should().Contain(d => d.StartsWith("quarantine:", StringComparison.Ordinal));
}
[Fact]
public async Task FetchAsync_ReplayingSnapshot_SkipsDuplicateDocuments()
{
using var fixture = await ConnectorFixture.CreateAsync();
var firstSink = new InMemoryRawSink();
var firstContext = fixture.CreateContext(firstSink);
await CollectAsync(fixture.Connector.FetchAsync(firstContext, CancellationToken.None));
var secondSink = new InMemoryRawSink();
var secondContext = fixture.CreateContext(secondSink);
var secondRunDocuments = await CollectAsync(fixture.Connector.FetchAsync(secondContext, CancellationToken.None));
secondRunDocuments.Should().BeEmpty();
secondSink.Documents.Should().BeEmpty();
var state = fixture.StateRepository.State;
state.Should().NotBeNull();
state!.DocumentDigests.Should().Contain(fixture.ExpectedDocumentDigest);
}
[Fact]
public async Task FetchAsync_TrimsPersistedDigestHistory()
{
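// Seed more digests than the connector retains so the save path must trim the oldest entries.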
var existingDigests = Enumerable.Range(0, ConnectorFixture.MaxDigestHistory + 5)
.Select(i => $"sha256:{i:X32}")
.ToImmutableArray();
var initialState = new VexConnectorState(
"excititor:suse.rancher",
DateTimeOffset.Parse("2025-10-18T00:00:00Z", CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal),
ImmutableArray.CreateBuilder<string>()
.Add("checkpoint:cursor-old")
.AddRange(existingDigests)
.ToImmutable());
using var fixture = await ConnectorFixture.CreateAsync(initialState);
var sink = new InMemoryRawSink();
var context = fixture.CreateContext(sink);
await CollectAsync(fixture.Connector.FetchAsync(context, CancellationToken.None));
var state = fixture.StateRepository.State;
state.Should().NotBeNull();
state!.DocumentDigests.Should().Contain(d => d.StartsWith("checkpoint:", StringComparison.Ordinal));
state.DocumentDigests.Count.Should().Be(ConnectorFixture.MaxDigestHistory + 1);
}
private static async Task<List<VexRawDocument>> CollectAsync(IAsyncEnumerable<VexRawDocument> source)
{
var list = new List<VexRawDocument>();
await foreach (var document in source.ConfigureAwait(false))
{
list.Add(document);
}
return list;
}
#region helpers
private sealed class ConnectorFixture : IDisposable
{
public const int MaxDigestHistory = 200;
private readonly IServiceProvider _serviceProvider;
private readonly TempDirectory _tempDirectory;
private readonly HttpClient _httpClient;
private ConnectorFixture(
RancherHubConnector connector,
InMemoryConnectorStateRepository stateRepository,
RoutingHttpMessageHandler handler,
IServiceProvider serviceProvider,
TempDirectory tempDirectory,
HttpClient httpClient,
Uri documentUri,
string documentDigest)
{
Connector = connector;
StateRepository = stateRepository;
Handler = handler;
_serviceProvider = serviceProvider;
_tempDirectory = tempDirectory;
_httpClient = httpClient;
DocumentUri = documentUri;
ExpectedDocumentDigest = $"sha256:{documentDigest}";
}
public RancherHubConnector Connector { get; }
public InMemoryConnectorStateRepository StateRepository { get; }
public RoutingHttpMessageHandler Handler { get; }
public Uri DocumentUri { get; }
public string ExpectedDocumentDigest { get; }
public VexConnectorContext CreateContext(InMemoryRawSink sink, DateTimeOffset? since = null)
=> new(
since,
VexConnectorSettings.Empty,
sink,
new NoopSignatureVerifier(),
new NoopNormalizerRouter(),
_serviceProvider,
ImmutableDictionary<string, string>.Empty);
public void Dispose()
{
_httpClient.Dispose();
_tempDirectory.Dispose();
}
public static async Task<ConnectorFixture> CreateAsync(VexConnectorState? initialState = null)
{
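// Materialise an offline snapshot on disk: an events file plus a discovery document referencing it with matching SHA-256 checksums.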
var tempDirectory = new TempDirectory();
var documentPayload = "{\"document\":\"payload\"}";
var documentDigest = ComputeSha256Hex(documentPayload);
var documentUri = new Uri("https://hub.test/events/evt-1.json");
var eventsPayload = """
{
"cursor": "cursor-1",
"nextCursor": "cursor-2",
"events": [
{
"id": "evt-1",
"type": "vex.statement.published",
"channel": "rancher/rke2",
"publishedAt": "2025-10-19T12:00:00Z",
"document": {
"uri": "https://hub.test/events/evt-1.json",
"sha256": "DOC_DIGEST",
"format": "csaf"
}
}
]
}
""".Replace("DOC_DIGEST", documentDigest, StringComparison.Ordinal);
var eventsPath = tempDirectory.Combine("events.json");
await File.WriteAllTextAsync(eventsPath, eventsPayload, Encoding.UTF8).ConfigureAwait(false);
var eventsChecksum = ComputeSha256Hex(eventsPayload);
var discoveryPayload = """
{
"hubId": "excititor:suse.rancher",
"title": "SUSE Rancher VEX Hub",
"subscription": {
"eventsUri": "https://hub.test/events",
"checkpointUri": "https://hub.test/checkpoint",
"channels": [ "rancher/rke2" ],
"requiresAuthentication": false
},
"offline": {
"snapshotUri": "EVENTS_URI",
"sha256": "EVENTS_DIGEST"
}
}
"""
.Replace("EVENTS_URI", new Uri(eventsPath).ToString(), StringComparison.Ordinal)
.Replace("EVENTS_DIGEST", eventsChecksum, StringComparison.Ordinal);
var discoveryPath = tempDirectory.Combine("discovery.json");
await File.WriteAllTextAsync(discoveryPath, discoveryPayload, Encoding.UTF8).ConfigureAwait(false);
var handler = new RoutingHttpMessageHandler();
handler.SetRoute(documentUri, () => JsonResponse(documentPayload));
var httpClient = new HttpClient(handler)
{
Timeout = TimeSpan.FromSeconds(10),
};
var httpFactory = new SingletonHttpClientFactory(httpClient);
var memoryCache = new MemoryCache(new MemoryCacheOptions());
var fileSystem = new System.IO.Abstractions.FileSystem();
var tokenProvider = new RancherHubTokenProvider(httpFactory, memoryCache, NullLogger<RancherHubTokenProvider>.Instance);
var metadataLoader = new RancherHubMetadataLoader(httpFactory, memoryCache, tokenProvider, fileSystem, NullLogger<RancherHubMetadataLoader>.Instance);
var eventClient = new RancherHubEventClient(httpFactory, tokenProvider, fileSystem, NullLogger<RancherHubEventClient>.Instance);
var stateRepository = new InMemoryConnectorStateRepository(initialState);
var checkpointManager = new RancherHubCheckpointManager(stateRepository);
var validators = new[] { new RancherHubConnectorOptionsValidator(fileSystem) };
var connector = new RancherHubConnector(
metadataLoader,
eventClient,
checkpointManager,
tokenProvider,
httpFactory,
NullLogger<RancherHubConnector>.Instance,
TimeProvider.System,
validators);
var settingsValues = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.OrdinalIgnoreCase);
settingsValues["DiscoveryUri"] = "https://hub.test/.well-known/rancher-hub.json";
settingsValues["OfflineSnapshotPath"] = discoveryPath;
settingsValues["PreferOfflineSnapshot"] = "true";
var settings = new VexConnectorSettings(settingsValues.ToImmutable());
await connector.ValidateAsync(settings, CancellationToken.None).ConfigureAwait(false);
var services = new ServiceCollection().BuildServiceProvider();
return new ConnectorFixture(
connector,
stateRepository,
handler,
services,
tempDirectory,
httpClient,
documentUri,
documentDigest);
}
private static HttpResponseMessage JsonResponse(string payload)
{
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new StringContent(payload, Encoding.UTF8, "application/json"),
};
return response;
}
}
private sealed class SingletonHttpClientFactory : IHttpClientFactory
{
private readonly HttpClient _client;
public SingletonHttpClientFactory(HttpClient client)
{
_client = client;
}
public HttpClient CreateClient(string name) => _client;
}
private sealed class RoutingHttpMessageHandler : HttpMessageHandler
{
private readonly Dictionary<Uri, Queue<Func<HttpResponseMessage>>> _routes = new();
public void SetRoute(Uri uri, params Func<HttpResponseMessage>[] responders)
{
ArgumentNullException.ThrowIfNull(uri);
if (responders is null || responders.Length == 0)
{
_routes.Remove(uri);
return;
}
_routes[uri] = new Queue<Func<HttpResponseMessage>>(responders);
}
protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
{
if (request.RequestUri is not null &&
_routes.TryGetValue(request.RequestUri, out var queue) &&
queue.Count > 0)
{
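// Drain queued responders in order, replaying the final responder for any subsequent requests.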
var responder = queue.Count > 1 ? queue.Dequeue() : queue.Peek();
var response = responder();
response.RequestMessage = request;
return Task.FromResult(response);
}
return Task.FromResult(new HttpResponseMessage(HttpStatusCode.NotFound)
{
Content = new StringContent($"No response configured for {request.RequestUri}", Encoding.UTF8, "text/plain"),
});
}
}
private sealed class InMemoryConnectorStateRepository : IVexConnectorStateRepository
{
public InMemoryConnectorStateRepository(VexConnectorState? initialState = null)
{
State = initialState;
}
public VexConnectorState? State { get; private set; }
public ValueTask<VexConnectorState?> GetAsync(string connectorId, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
=> ValueTask.FromResult(State);
public ValueTask SaveAsync(VexConnectorState state, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
{
State = state;
return ValueTask.CompletedTask;
}
}
private sealed class InMemoryRawSink : IVexRawDocumentSink
{
public List<VexRawDocument> Documents { get; } = new();
public ValueTask StoreAsync(VexRawDocument document, CancellationToken cancellationToken)
{
Documents.Add(document);
return ValueTask.CompletedTask;
}
}
private sealed class NoopSignatureVerifier : IVexSignatureVerifier
{
public ValueTask<VexSignatureMetadata?> VerifyAsync(VexRawDocument document, CancellationToken cancellationToken)
=> ValueTask.FromResult<VexSignatureMetadata?>(null);
}
private sealed class NoopNormalizerRouter : IVexNormalizerRouter
{
public ValueTask<VexClaimBatch> NormalizeAsync(VexRawDocument document, CancellationToken cancellationToken)
=> ValueTask.FromResult(new VexClaimBatch(document, ImmutableArray<VexClaim>.Empty, ImmutableDictionary<string, string>.Empty));
}
private sealed class TempDirectory : IDisposable
{
private readonly string _path;
public TempDirectory()
{
_path = Path.Combine(Path.GetTempPath(), "stellaops-excititor-tests", Guid.NewGuid().ToString("n"));
Directory.CreateDirectory(_path);
}
public string Combine(string relative) => Path.Combine(_path, relative);
public void Dispose()
{
try
{
if (Directory.Exists(_path))
{
Directory.Delete(_path, recursive: true);
}
}
catch
{
// Best-effort cleanup.
}
}
}
private static string ComputeSha256Hex(string payload)
{
var bytes = Encoding.UTF8.GetBytes(payload);
return ComputeSha256Hex(bytes);
}
private static string ComputeSha256Hex(ReadOnlySpan<byte> payload)
{
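// SHA-256 digests are 32 bytes; hashing into a stack buffer avoids a heap allocation per call.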
Span<byte> buffer = stackalloc byte[32];
SHA256.HashData(payload, buffer);
return Convert.ToHexString(buffer).ToLowerInvariant();
}
#endregion
}

View File

@@ -1,23 +1,23 @@
<?xml version='1.0' encoding='utf-8'?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<UseConcelierTestInfra>false</UseConcelierTestInfra>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../../__Libraries/StellaOps.Excititor.Connectors.SUSE.RancherVEXHub/StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.csproj" />
<ProjectReference Include="../../__Libraries/StellaOps.Excititor.Storage.Mongo/StellaOps.Excititor.Storage.Mongo.csproj" />
</ItemGroup>
<ItemGroup>
<Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs" />
<Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\MongoFixtureCollection.cs" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="FluentAssertions" Version="6.12.0" />
<PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="20.0.28" />
</ItemGroup>
</Project>

View File

@@ -1,169 +1,169 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using Microsoft.Extensions.Time.Testing;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Policy;
using System.Diagnostics.Metrics;
namespace StellaOps.Excititor.Core.Tests;
public class VexPolicyDiagnosticsTests
{
[Fact]
public void GetDiagnostics_ReportsCountsRecommendationsAndOverrides()
{
var overrides = new[]
{
new KeyValuePair<string, double>("provider-a", 0.8),
new KeyValuePair<string, double>("provider-b", 0.6),
};
var snapshot = new VexPolicySnapshot(
"custom/v1",
new VexConsensusPolicyOptions(
version: "custom/v1",
providerOverrides: overrides),
new BaselineVexConsensusPolicy(),
ImmutableArray.Create(
new VexPolicyIssue("sample.error", "Blocking issue.", VexPolicyIssueSeverity.Error),
new VexPolicyIssue("sample.warning", "Non-blocking issue.", VexPolicyIssueSeverity.Warning)),
"rev-test",
"ABCDEF");
var fakeProvider = new FakePolicyProvider(snapshot);
var fakeTime = new FakeTimeProvider(new DateTimeOffset(2025, 10, 16, 17, 0, 0, TimeSpan.Zero));
var diagnostics = new VexPolicyDiagnostics(fakeProvider, fakeTime);
var report = diagnostics.GetDiagnostics();
Assert.Equal("custom/v1", report.Version);
Assert.Equal("rev-test", report.RevisionId);
Assert.Equal("ABCDEF", report.Digest);
Assert.Equal(1, report.ErrorCount);
Assert.Equal(1, report.WarningCount);
Assert.Equal(fakeTime.GetUtcNow(), report.GeneratedAt);
Assert.Collection(report.Issues,
issue => Assert.Equal("sample.error", issue.Code),
issue => Assert.Equal("sample.warning", issue.Code));
Assert.Equal(new[] { "provider-a", "provider-b" }, report.ActiveOverrides.Keys.OrderBy(static key => key, StringComparer.Ordinal));
Assert.Contains(report.Recommendations, message => message.Contains("Resolve policy errors", StringComparison.OrdinalIgnoreCase));
Assert.Contains(report.Recommendations, message => message.Contains("provider-a", StringComparison.OrdinalIgnoreCase));
Assert.Contains(report.Recommendations, message => message.Contains("docs/modules/excititor/architecture.md", StringComparison.OrdinalIgnoreCase));
}
[Fact]
public void GetDiagnostics_WhenNoIssues_StillReturnsDefaultRecommendation()
{
var fakeProvider = new FakePolicyProvider(VexPolicySnapshot.Default);
var fakeTime = new FakeTimeProvider(new DateTimeOffset(2025, 10, 16, 17, 0, 0, TimeSpan.Zero));
var diagnostics = new VexPolicyDiagnostics(fakeProvider, fakeTime);
var report = diagnostics.GetDiagnostics();
Assert.Equal(0, report.ErrorCount);
Assert.Equal(0, report.WarningCount);
Assert.Empty(report.ActiveOverrides);
Assert.Single(report.Recommendations);
}
[Fact]
public void PolicyProvider_ComputesRevisionAndDigest_AndEmitsTelemetry()
{
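// Subscribe to the policy meter so the test can observe reload counts and the revision tag.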
using var listener = new MeterListener();
var reloadMeasurements = 0;
string? lastRevision = null;
listener.InstrumentPublished += (instrument, _) =>
{
if (instrument.Meter.Name == "StellaOps.Excititor.Policy" &&
instrument.Name == "vex.policy.reloads")
{
listener.EnableMeasurementEvents(instrument);
}
};
listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
{
reloadMeasurements++;
foreach (var tag in tags)
{
if (tag.Key is "revision" && tag.Value is string revision)
{
lastRevision = revision;
break;
}
}
});
listener.Start();
var optionsMonitor = new MutableOptionsMonitor<VexPolicyOptions>(new VexPolicyOptions());
var provider = new VexPolicyProvider(optionsMonitor, NullLogger<VexPolicyProvider>.Instance);
var snapshot1 = provider.GetSnapshot();
Assert.Equal("rev-1", snapshot1.RevisionId);
Assert.False(string.IsNullOrWhiteSpace(snapshot1.Digest));
var snapshot2 = provider.GetSnapshot();
Assert.Equal("rev-1", snapshot2.RevisionId);
Assert.Equal(snapshot1.Digest, snapshot2.Digest);
optionsMonitor.Update(new VexPolicyOptions
{
ProviderOverrides = new Dictionary<string, double>
{
["provider-a"] = 0.4
}
});
var snapshot3 = provider.GetSnapshot();
Assert.Equal("rev-2", snapshot3.RevisionId);
Assert.NotEqual(snapshot1.Digest, snapshot3.Digest);
listener.Dispose();
Assert.True(reloadMeasurements >= 2);
Assert.Equal("rev-2", lastRevision);
}
private sealed class FakePolicyProvider : IVexPolicyProvider
{
private readonly VexPolicySnapshot _snapshot;
public FakePolicyProvider(VexPolicySnapshot snapshot)
{
_snapshot = snapshot;
}
public VexPolicySnapshot GetSnapshot() => _snapshot;
}
private sealed class MutableOptionsMonitor<T> : IOptionsMonitor<T>
{
private T _value;
public MutableOptionsMonitor(T value)
{
_value = value;
}
public T CurrentValue => _value;
public T Get(string? name) => _value;
public void Update(T newValue) => _value = newValue;
public IDisposable OnChange(Action<T, string?> listener) => NullDisposable.Instance;
private sealed class NullDisposable : IDisposable
{
public static readonly NullDisposable Instance = new();
public void Dispose()
{
}
}
}
}
{
_value = value;
}
public T CurrentValue => _value;
public T Get(string? name) => _value;
public void Update(T newValue) => _value = newValue;
public IDisposable OnChange(Action<T, string?> listener) => NullDisposable.Instance;
private sealed class NullDisposable : IDisposable
{
public static readonly NullDisposable Instance = new();
public void Dispose()
{
}
}
}
}
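For orientation, here is a minimal host-side sketch of how the provider under test is typically wired up. The DI calls are standard Microsoft.Extensions patterns; everything beyond `VexPolicyOptions`, `IVexPolicyProvider`, and `VexPolicyProvider` is an assumption rather than repo code:

```csharp
// Sketch only: binding and registration details are assumptions, not the shipped host code.
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Excititor.Policy;

var services = new ServiceCollection();
services.AddLogging();                                  // VexPolicyProvider takes an ILogger<T>
services.Configure<VexPolicyOptions>(_ => { });         // bind from configuration in a real host
services.AddSingleton<IVexPolicyProvider, VexPolicyProvider>();

using var serviceProvider = services.BuildServiceProvider();
var snapshot = serviceProvider.GetRequiredService<IVexPolicyProvider>().GetSnapshot();
Console.WriteLine($"revision={snapshot.RevisionId} digest={snapshot.Digest}");
```

Each options reload observed through `IOptionsMonitor<VexPolicyOptions>` bumps the revision (`rev-1`, `rev-2`) and recomputes the digest, which is exactly what the telemetry assertions above verify.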

@@ -1,73 +1,73 @@
using System.Collections.Immutable;
using System.Text.Json;
using FluentAssertions;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Formats.CSAF;
namespace StellaOps.Excititor.Formats.CSAF.Tests;
public sealed class CsafExporterTests
{
[Fact]
public async Task SerializeAsync_WritesDeterministicCsafDocument()
{
var claims = ImmutableArray.Create(
new VexClaim(
"CVE-2025-3000",
"vendor:example",
new VexProduct("pkg:example/app@1.0.0", "Example App", "1.0.0", "pkg:example/app@1.0.0"),
VexClaimStatus.Affected,
new VexClaimDocument(VexDocumentFormat.Csaf, "sha256:doc1", new Uri("https://example.com/csaf/advisory1.json")),
new DateTimeOffset(2025, 10, 10, 0, 0, 0, TimeSpan.Zero),
new DateTimeOffset(2025, 10, 11, 0, 0, 0, TimeSpan.Zero),
detail: "Impact on Example App 1.0.0"),
new VexClaim(
"CVE-2025-3000",
"vendor:example",
new VexProduct("pkg:example/app@1.0.0", "Example App", "1.0.0", "pkg:example/app@1.0.0"),
VexClaimStatus.NotAffected,
new VexClaimDocument(VexDocumentFormat.Csaf, "sha256:doc2", new Uri("https://example.com/csaf/advisory2.json")),
new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero),
new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero),
justification: VexJustification.ComponentNotPresent),
new VexClaim(
"ADVISORY-1",
"vendor:example",
new VexProduct("pkg:example/lib@2.0.0", "Example Lib", "2.0.0"),
VexClaimStatus.NotAffected,
new VexClaimDocument(VexDocumentFormat.Csaf, "sha256:doc3", new Uri("https://example.com/csaf/advisory3.json")),
new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero),
new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero),
justification: null));
var request = new VexExportRequest(
VexQuery.Empty,
ImmutableArray<VexConsensus>.Empty,
claims,
new DateTimeOffset(2025, 10, 13, 0, 0, 0, TimeSpan.Zero));
var exporter = new CsafExporter();
var digest = exporter.Digest(request);
await using var stream = new MemoryStream();
var result = await exporter.SerializeAsync(request, stream, CancellationToken.None);
digest.Should().NotBeNull();
digest.Should().Be(result.Digest);
stream.Position = 0;
using var document = JsonDocument.Parse(stream);
var root = document.RootElement;
root.GetProperty("document").GetProperty("tracking").GetProperty("id").GetString()!.Should().StartWith("stellaops:csaf");
root.GetProperty("product_tree").GetProperty("full_product_names").GetArrayLength().Should().Be(2);
root.GetProperty("vulnerabilities").EnumerateArray().Should().HaveCount(2);
var metadata = root.GetProperty("metadata");
metadata.GetProperty("query_signature").GetString().Should().NotBeNull();
metadata.GetProperty("diagnostics").EnumerateObject().Select(p => p.Name).Should().Contain("policy.justification_missing");
result.Metadata.Should().ContainKey("csaf.vulnerabilityCount");
result.Metadata["csaf.productCount"].Should().Be("2");
}
}

@@ -1,37 +1,37 @@
using System.Collections.Immutable;
using FluentAssertions;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Formats.CycloneDX;
namespace StellaOps.Excititor.Formats.CycloneDX.Tests;
public sealed class CycloneDxComponentReconcilerTests
{
[Fact]
public void Reconcile_AssignsBomRefsAndDiagnostics()
{
var claims = ImmutableArray.Create(
new VexClaim(
"CVE-2025-7000",
"vendor:one",
new VexProduct("pkg:demo/component@1.0.0", "Demo Component", "1.0.0", "pkg:demo/component@1.0.0"),
VexClaimStatus.Affected,
new VexClaimDocument(VexDocumentFormat.CycloneDx, "sha256:doc1", new Uri("https://example.com/vex/1")),
DateTimeOffset.UtcNow,
DateTimeOffset.UtcNow),
new VexClaim(
"CVE-2025-7000",
"vendor:two",
new VexProduct("component-key", "Component Key"),
VexClaimStatus.NotAffected,
new VexClaimDocument(VexDocumentFormat.CycloneDx, "sha256:doc2", new Uri("https://example.com/vex/2")),
DateTimeOffset.UtcNow,
DateTimeOffset.UtcNow));
var result = CycloneDxComponentReconciler.Reconcile(claims);
result.Components.Should().HaveCount(2);
result.ComponentRefs.Should().ContainKey(("CVE-2025-7000", "component-key"));
result.Diagnostics.Keys.Should().Contain("missing_purl");
}
}

@@ -1,47 +1,47 @@
using System.Collections.Immutable;
using System.Text.Json;
using FluentAssertions;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Formats.CycloneDX;
namespace StellaOps.Excititor.Formats.CycloneDX.Tests;
public sealed class CycloneDxExporterTests
{
[Fact]
public async Task SerializeAsync_WritesCycloneDxVexDocument()
{
var claims = ImmutableArray.Create(
new VexClaim(
"CVE-2025-6000",
"vendor:demo",
new VexProduct("pkg:demo/component@1.2.3", "Demo Component", "1.2.3", "pkg:demo/component@1.2.3"),
VexClaimStatus.Fixed,
new VexClaimDocument(VexDocumentFormat.CycloneDx, "sha256:doc1", new Uri("https://example.com/cyclonedx/1")),
new DateTimeOffset(2025, 10, 10, 0, 0, 0, TimeSpan.Zero),
new DateTimeOffset(2025, 10, 11, 0, 0, 0, TimeSpan.Zero),
detail: "Issue resolved in 1.2.3"));
var request = new VexExportRequest(
VexQuery.Empty,
ImmutableArray<VexConsensus>.Empty,
claims,
new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero));
var exporter = new CycloneDxExporter();
await using var stream = new MemoryStream();
var result = await exporter.SerializeAsync(request, stream, CancellationToken.None);
stream.Position = 0;
using var document = JsonDocument.Parse(stream);
var root = document.RootElement;
root.GetProperty("bomFormat").GetString().Should().Be("CycloneDX");
root.GetProperty("components").EnumerateArray().Should().HaveCount(1);
root.GetProperty("vulnerabilities").EnumerateArray().Should().HaveCount(1);
result.Metadata.Should().ContainKey("cyclonedx.vulnerabilityCount");
result.Metadata["cyclonedx.componentCount"].Should().Be("1");
result.Digest.Algorithm.Should().Be("sha256");
}
}

@@ -1,49 +1,49 @@
using System.Collections.Immutable;
using System.Text.Json;
using FluentAssertions;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Formats.OpenVEX;
namespace StellaOps.Excititor.Formats.OpenVEX.Tests;
public sealed class OpenVexExporterTests
{
[Fact]
public async Task SerializeAsync_ProducesCanonicalOpenVexDocument()
{
var claims = ImmutableArray.Create(
new VexClaim(
"CVE-2025-5000",
"vendor:alpha",
new VexProduct("pkg:alpha/app@2.0.0", "Alpha App", "2.0.0", "pkg:alpha/app@2.0.0"),
VexClaimStatus.NotAffected,
new VexClaimDocument(VexDocumentFormat.OpenVex, "sha256:doc1", new Uri("https://example.com/openvex/alpha")),
new DateTimeOffset(2025, 10, 10, 0, 0, 0, TimeSpan.Zero),
new DateTimeOffset(2025, 10, 11, 0, 0, 0, TimeSpan.Zero),
justification: VexJustification.ComponentNotPresent,
detail: "Component not shipped."));
var request = new VexExportRequest(
VexQuery.Empty,
ImmutableArray<VexConsensus>.Empty,
claims,
new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero));
var exporter = new OpenVexExporter();
await using var stream = new MemoryStream();
var result = await exporter.SerializeAsync(request, stream, CancellationToken.None);
stream.Position = 0;
using var document = JsonDocument.Parse(stream);
var root = document.RootElement;
root.GetProperty("document").GetProperty("author").GetString().Should().Be("StellaOps Excititor");
root.GetProperty("statements").GetArrayLength().Should().Be(1);
var statement = root.GetProperty("statements")[0];
statement.GetProperty("status").GetString().Should().Be("not_affected");
statement.GetProperty("products")[0].GetProperty("id").GetString().Should().Be("pkg:alpha/app@2.0.0");
result.Metadata.Should().ContainKey("openvex.statementCount");
result.Metadata["openvex.statementCount"].Should().Be("1");
result.Digest.Algorithm.Should().Be("sha256");
}
}

@@ -1,39 +1,39 @@
using System.Collections.Immutable;
using FluentAssertions;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Formats.OpenVEX;
namespace StellaOps.Excititor.Formats.OpenVEX.Tests;
public sealed class OpenVexStatementMergerTests
{
[Fact]
public void Merge_DetectsConflictsAndSelectsCanonicalStatus()
{
var claims = ImmutableArray.Create(
new VexClaim(
"CVE-2025-4000",
"vendor:one",
new VexProduct("pkg:demo/app@1.0.0", "Demo App", "1.0.0"),
VexClaimStatus.NotAffected,
new VexClaimDocument(VexDocumentFormat.OpenVex, "sha256:doc1", new Uri("https://example.com/openvex/1")),
DateTimeOffset.UtcNow,
DateTimeOffset.UtcNow,
justification: VexJustification.ComponentNotPresent),
new VexClaim(
"CVE-2025-4000",
"vendor:two",
new VexProduct("pkg:demo/app@1.0.0", "Demo App", "1.0.0"),
VexClaimStatus.Affected,
new VexClaimDocument(VexDocumentFormat.OpenVex, "sha256:doc2", new Uri("https://example.com/openvex/2")),
DateTimeOffset.UtcNow,
DateTimeOffset.UtcNow));
var result = OpenVexStatementMerger.Merge(claims);
result.Statements.Should().HaveCount(1);
var statement = result.Statements[0];
statement.Status.Should().Be(VexClaimStatus.Affected);
result.Diagnostics.Should().ContainKey("openvex.status_conflict");
}
}

@@ -1,4 +1,4 @@
# StellaOps.Notify.WebService — Agent Charter
## Mission
Implement Notify control plane per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Worker — Agent Charter
## Mission
Consume events, evaluate rules, and dispatch deliveries per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Email — Agent Charter
## Mission
Implement SMTP connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Slack — Agent Charter
## Mission
Deliver Slack connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Teams — Agent Charter
## Mission
Implement Microsoft Teams connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Webhook — Agent Charter
## Mission
Implement generic webhook connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Engine — Agent Charter
## Mission
Deliver rule evaluation, digest, and rendering logic per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Models — Agent Charter
## Mission
Define Notify DTOs and contracts per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Queue — Agent Charter
## Mission
Provide event & delivery queues for Notify per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Storage.Mongo — Agent Charter
## Mission
Implement Mongo persistence (rules, channels, deliveries, digests, locks, audit) per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,25 +1,25 @@
<?xml version='1.0' encoding='utf-8'?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../../__Libraries/StellaOps.Notify.Models/StellaOps.Notify.Models.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="NJsonSchema" Version="10.9.0" />
<None Include="../../docs/events/samples/*.json">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
<None Include="../../docs/events/*.json">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
<None Include="../../../../docs/modules/notify/resources/samples/*.json">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
</ItemGroup>
</Project>

@@ -1,29 +1,29 @@
<?xml version='1.0' encoding='utf-8'?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<UseConcelierTestInfra>false</UseConcelierTestInfra>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../../__Libraries/StellaOps.Notify.Models/StellaOps.Notify.Models.csproj" />
<ProjectReference Include="../../__Libraries/StellaOps.Notify.Storage.Mongo/StellaOps.Notify.Storage.Mongo.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="coverlet.collector" Version="6.0.4" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.14.0" />
<PackageReference Include="Mongo2Go" Version="3.1.3" />
<PackageReference Include="MongoDB.Bson" Version="3.5.0" />
<PackageReference Include="xunit" Version="2.9.2" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.8.2" />
</ItemGroup>
<ItemGroup>
<None Include="../../../../docs/modules/notify/resources/samples/*.json">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
</ItemGroup>
</Project>

@@ -1,19 +1,19 @@
<?xml version='1.0' encoding='utf-8'?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../../__Libraries/StellaOps.Notify.Models/StellaOps.Notify.Models.csproj" />
<ProjectReference Include="../../StellaOps.Notify.WebService/StellaOps.Notify.WebService.csproj" />
</ItemGroup>
<ItemGroup>
<None Include="../../../../docs/modules/notify/resources/samples/*.json">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
</ItemGroup>
</Project>

@@ -7,6 +7,7 @@
> 2025-10-26: Blocked while bootstrapping DSL parser/evaluator; remaining grammar coverage (profile keywords, condition parsing) and rule evaluation semantics still pending to satisfy acceptance tests.
| POLICY-ENGINE-20-003 | TODO | Policy Guild, Concelier Core Guild, Excititor Core Guild | POLICY-ENGINE-20-001, CONCELIER-POLICY-20-002, EXCITITOR-POLICY-20-002 | Implement selection joiners resolving SBOM↔advisory↔VEX tuples using linksets and PURL equivalence tables, with deterministic batching. | Joiners fetch correct candidate sets in integration tests; batching meets memory targets; explain traces list input provenance. |
> 2025-10-26: Scheduler DTO contracts for runs/diffs/explains available (`src/Scheduler/__Libraries/StellaOps.Scheduler.Models/docs/SCHED-MODELS-20-001-POLICY-RUNS.md`); consume `PolicyRunRequest/Status/DiffSummary` from samples under `samples/api/scheduler/`.
> 2025-10-31: Raw Concelier observations expose `rawLinkset`; update joiners/tests to consume it and align rollout/backfill per `docs/dev/raw-linkset-backfill-plan.md`.
| POLICY-ENGINE-20-004 | TODO | Policy Guild, Platform Storage Guild | POLICY-ENGINE-20-003, CONCELIER-POLICY-20-003, EXCITITOR-POLICY-20-003 | Ship materialization writer that upserts into `effective_finding_{policyId}` with append-only history, tenant scoping, and trace references. | Writes restricted to Policy Engine identity; idempotent upserts proven via tests; collections indexed per design and docs updated. |
| POLICY-ENGINE-20-005 | TODO | Policy Guild, Security Engineering | POLICY-ENGINE-20-002 | Enforce determinism guard banning wall-clock, RNG, and network usage during evaluation via static analysis + runtime sandbox. | Guard blocks forbidden APIs in unit/integration tests; violations emit `ERR_POL_004`; CI analyzer wired. |
| POLICY-ENGINE-20-006 | TODO | Policy Guild, Scheduler Worker Guild | POLICY-ENGINE-20-003, POLICY-ENGINE-20-004, SCHED-WORKER-20-301 | Implement incremental orchestrator reacting to advisory/vex/SBOM change streams and scheduling partial policy re-evaluations. | Change stream listeners enqueue affected tuples with dedupe; orchestrator meets 5 min SLA in perf tests; metrics exposed (`policy_run_seconds`). |

@@ -1,12 +1,12 @@
# StellaOps.Policy — Agent Charter
## Mission
Deliver the policy engine outlined in `docs/modules/scanner/ARCHITECTURE.md` and related prose:
- Define YAML schema (ignore rules, VEX inclusion/exclusion, vendor precedence, license gates).
- Provide policy snapshot storage with revision digests and diagnostics.
- Offer preview APIs to compare policy impacts on existing reports.
## Expectations
- Coordinate with Scanner.WebService, Feedser, Vexer, UI, Notify.
- Maintain deterministic serialization and unit tests for precedence rules.
- Update `TASKS.md` and broadcast contract changes.
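To make the "policy snapshot storage with revision digests" deliverable concrete, a purely illustrative sketch follows; the type and field names are invented for illustration and are not the shipped schema:

```csharp
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;

// Hypothetical shape: the real schema is owned by this module, not this sketch.
public sealed record PolicySnapshotSketch(string RevisionId, string[] IgnoreRules, string[] VendorPrecedence)
{
    // Deterministic revision digest: serialize with a fixed property order, hash with SHA-256.
    public string ComputeDigest()
    {
        var canonicalJson = JsonSerializer.Serialize(this);
        var hash = SHA256.HashData(Encoding.UTF8.GetBytes(canonicalJson));
        return Convert.ToHexString(hash).ToLowerInvariant();
    }
}
```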

@@ -1,12 +1,12 @@
# StellaOps.Scanner.Sbomer.BuildXPlugin — Agent Charter
## Mission
Implement the build-time SBOM generator described in `docs/modules/scanner/ARCHITECTURE.md` and new buildx dossier requirements:
- Provide a deterministic BuildKit/Buildx generator that produces layer SBOM fragments and uploads them to local CAS.
- Emit OCI annotations (+provenance) compatible with Scanner.Emit and Attestor hand-offs.
- Respect restart-time plug-in policy (`plugins/scanner/buildx/` manifests) and keep CI overhead ≤300ms per layer.
## Expectations
- Read architecture + upcoming Buildx addendum before coding.
- Ensure graceful fallback to post-build scan when generator unavailable.
- Provide integration tests with mock BuildKit, and update `TASKS.md` as states change.

@@ -1,243 +1,243 @@
using System.Buffers;
using System.Collections.Concurrent;
using System.Collections.Immutable;
using System.Linq;
using System.Text;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustBinaryClassifier
{
private static readonly ReadOnlyMemory<byte> ElfMagic = new byte[] { 0x7F, (byte)'E', (byte)'L', (byte)'F' };
private static readonly ReadOnlyMemory<byte> SymbolPrefix = new byte[] { (byte)'_', (byte)'Z', (byte)'N' };
private const int ChunkSize = 64 * 1024;
private const int OverlapSize = 48;
private const long MaxBinarySize = 128L * 1024L * 1024L;
private static readonly HashSet<string> StandardCrates = new(StringComparer.Ordinal)
{
"core",
"alloc",
"std",
"panic_unwind",
"panic_abort",
};
private static readonly EnumerationOptions Enumeration = new()
{
MatchCasing = MatchCasing.CaseSensitive,
IgnoreInaccessible = true,
RecurseSubdirectories = true,
AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint,
};
private static readonly ConcurrentDictionary<RustFileCacheKey, ImmutableArray<string>> CandidateCache = new();
public static IReadOnlyList<RustBinaryInfo> Scan(string rootPath, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(rootPath))
{
throw new ArgumentException("Root path is required", nameof(rootPath));
}
var binaries = new List<RustBinaryInfo>();
foreach (var path in Directory.EnumerateFiles(rootPath, "*", Enumeration))
{
cancellationToken.ThrowIfCancellationRequested();
if (!IsEligibleBinary(path))
{
continue;
}
if (!RustFileCacheKey.TryCreate(path, out var key))
{
continue;
}
var candidates = CandidateCache.GetOrAdd(
key,
static (_, state) => ExtractCrateNames(state.Path, state.CancellationToken),
(Path: path, CancellationToken: cancellationToken));
binaries.Add(new RustBinaryInfo(path, candidates));
}
return binaries;
}
private static bool IsEligibleBinary(string path)
{
try
{
var info = new FileInfo(path);
if (!info.Exists || info.Length == 0 || info.Length > MaxBinarySize)
{
return false;
}
using var stream = info.OpenRead();
Span<byte> buffer = stackalloc byte[4];
var read = stream.Read(buffer);
if (read != 4)
{
return false;
}
return buffer.SequenceEqual(ElfMagic.Span);
}
catch (IOException)
{
return false;
}
catch (UnauthorizedAccessException)
{
return false;
}
}
private static ImmutableArray<string> ExtractCrateNames(string path, CancellationToken cancellationToken)
{
var names = new HashSet<string>(StringComparer.Ordinal);
var buffer = ArrayPool<byte>.Shared.Rent(ChunkSize + OverlapSize);
var overlap = new byte[OverlapSize];
var overlapLength = 0;
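// The 48-byte overlap re-appends the tail of the previous chunk before each read so a
// mangled symbol that straddles a 64 KiB chunk boundary is still seen in one span.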
try
{
using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
while (true)
{
cancellationToken.ThrowIfCancellationRequested();
// Copy previous overlap to buffer prefix.
if (overlapLength > 0)
{
Array.Copy(overlap, 0, buffer, 0, overlapLength);
}
var read = stream.Read(buffer, overlapLength, ChunkSize);
if (read <= 0)
{
break;
}
var span = new ReadOnlySpan<byte>(buffer, 0, overlapLength + read);
ScanForSymbols(span, names);
overlapLength = Math.Min(OverlapSize, span.Length);
if (overlapLength > 0)
{
span[^overlapLength..].CopyTo(overlap);
}
}
}
catch (IOException)
{
return ImmutableArray<string>.Empty;
}
catch (UnauthorizedAccessException)
{
return ImmutableArray<string>.Empty;
}
finally
{
ArrayPool<byte>.Shared.Return(buffer);
}
if (names.Count == 0)
{
return ImmutableArray<string>.Empty;
}
var ordered = names
.Where(static name => !string.IsNullOrWhiteSpace(name))
.Select(static name => name.Trim())
.Where(static name => name.Length > 1)
.Where(name => !StandardCrates.Contains(name))
.Distinct(StringComparer.Ordinal)
.OrderBy(static name => name, StringComparer.Ordinal)
.ToImmutableArray();
return ordered;
}
private static void ScanForSymbols(ReadOnlySpan<byte> span, HashSet<string> names)
{
var prefix = SymbolPrefix.Span;
var index = 0;
while (index < span.Length)
{
var slice = span[index..];
var offset = slice.IndexOf(prefix);
if (offset < 0)
{
break;
}
index += offset + prefix.Length;
if (index >= span.Length)
{
break;
}
var remaining = span[index..];
if (!TryParseCrate(remaining, out var crate, out var consumed))
{
index += 1;
continue;
}
if (!string.IsNullOrWhiteSpace(crate))
{
names.Add(crate);
}
index += Math.Max(consumed, 1);
}
}
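// Legacy (pre-v0) Rust symbol mangling is Itanium-style: "_ZN" followed by
// length-prefixed path segments and a trailing "E", e.g. _ZN9my_crate4main17h<hash>E.
// The first length-prefixed segment is the crate name, which TryParseCrate extracts.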
private static bool TryParseCrate(ReadOnlySpan<byte> span, out string? crate, out int consumed)
{
crate = null;
consumed = 0;
var i = 0;
var length = 0;
while (i < span.Length && span[i] is >= (byte)'0' and <= (byte)'9')
{
length = (length * 10) + (span[i] - (byte)'0');
i++;
if (length > 256)
{
return false;
}
}
if (i == 0 || length <= 0 || i + length > span.Length)
{
return false;
}
crate = Encoding.ASCII.GetString(span.Slice(i, length));
consumed = i + length;
return true;
}
}
internal sealed record RustBinaryInfo(string AbsolutePath, ImmutableArray<string> CrateCandidates)
{
public string ComputeSha256()
{
if (RustFileHashCache.TryGetSha256(AbsolutePath, out var sha256) && !string.IsNullOrEmpty(sha256))
{
return sha256;
}
return string.Empty;
}
}
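A short consumption sketch for the classifier above; the root path is hypothetical, and in the analyzer it would come from the scanned layer filesystem:

```csharp
// Sketch: enumerate candidate Rust binaries under a layer root and print crate hints.
var binaries = RustBinaryClassifier.Scan("/mnt/layer/rootfs", CancellationToken.None);
foreach (var binary in binaries)
{
    // CrateCandidates is sorted, de-duplicated, and excludes std/core/alloc crates.
    Console.WriteLine($"{binary.AbsolutePath}: {string.Join(", ", binary.CrateCandidates)}");
}
```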

@@ -1,312 +1,312 @@
using System.Collections.Concurrent;
using System.Collections.Immutable;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustCargoLockParser
{
private static readonly ConcurrentDictionary<RustFileCacheKey, ImmutableArray<RustCargoPackage>> Cache = new();
public static IReadOnlyList<RustCargoPackage> Parse(string path, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(path))
{
throw new ArgumentException("Lock path is required", nameof(path));
}
if (!RustFileCacheKey.TryCreate(path, out var key))
{
return Array.Empty<RustCargoPackage>();
}
var packages = Cache.GetOrAdd(
key,
static (_, state) => ParseInternal(state.Path, state.CancellationToken),
(Path: path, CancellationToken: cancellationToken));
return packages.IsDefaultOrEmpty ? Array.Empty<RustCargoPackage>() : packages;
}
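// Note: ParseInternal below is a deliberately minimal, line-oriented reader for the
// TOML subset Cargo.lock uses ([[package]] tables, string fields, string arrays);
// it is not a general-purpose TOML parser.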
private static ImmutableArray<RustCargoPackage> ParseInternal(string path, CancellationToken cancellationToken)
{
var resultBuilder = ImmutableArray.CreateBuilder<RustCargoPackage>();
using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
using var reader = new StreamReader(stream);
RustCargoPackageBuilder? packageBuilder = null;
string? currentArrayKey = null;
var arrayValues = new List<string>();
while (!reader.EndOfStream)
{
cancellationToken.ThrowIfCancellationRequested();
var line = reader.ReadLine();
if (line is null)
{
break;
}
var trimmed = TrimComments(line.AsSpan());
if (trimmed.Length == 0)
{
continue;
}
if (IsPackageHeader(trimmed))
{
FlushCurrent(packageBuilder, resultBuilder);
packageBuilder = new RustCargoPackageBuilder();
currentArrayKey = null;
arrayValues.Clear();
continue;
}
if (packageBuilder is null)
{
continue;
}
if (currentArrayKey is not null)
{
if (trimmed[0] == ']')
{
packageBuilder.SetArray(currentArrayKey, arrayValues);
currentArrayKey = null;
arrayValues.Clear();
continue;
}
var value = ExtractString(trimmed);
if (!string.IsNullOrEmpty(value))
{
arrayValues.Add(value);
}
continue;
}
if (trimmed[0] == '[')
{
// Entering a new table; finish any pending package and skip section.
FlushCurrent(packageBuilder, resultBuilder);
packageBuilder = null;
continue;
}
var equalsIndex = trimmed.IndexOf('=');
if (equalsIndex < 0)
{
continue;
}
var key = trimmed[..equalsIndex].Trim();
var valuePart = trimmed[(equalsIndex + 1)..].Trim();
if (valuePart.Length == 0)
{
continue;
}
if (valuePart[0] == '[')
{
currentArrayKey = key.ToString();
arrayValues.Clear();
if (valuePart.Length > 1 && valuePart[^1] == ']')
{
var inline = valuePart[1..^1].Trim();
if (inline.Length > 0)
{
foreach (var token in SplitInlineArray(inline.ToString()))
{
var parsedValue = ExtractString(token.AsSpan());
if (!string.IsNullOrEmpty(parsedValue))
{
arrayValues.Add(parsedValue);
}
}
}
packageBuilder.SetArray(currentArrayKey, arrayValues);
currentArrayKey = null;
arrayValues.Clear();
}
continue;
}
var parsed = ExtractString(valuePart);
if (parsed is not null)
{
packageBuilder.SetField(key, parsed);
}
}
if (currentArrayKey is not null && arrayValues.Count > 0)
{
packageBuilder?.SetArray(currentArrayKey, arrayValues);
}
FlushCurrent(packageBuilder, resultBuilder);
return resultBuilder.ToImmutable();
}
private static ReadOnlySpan<char> TrimComments(ReadOnlySpan<char> line)
{
var index = line.IndexOf('#');
if (index >= 0)
{
line = line[..index];
}
return line.Trim();
}
private static bool IsPackageHeader(ReadOnlySpan<char> value)
=> value.SequenceEqual("[[package]]".AsSpan());
private static IEnumerable<string> SplitInlineArray(string value)
{
var start = 0;
var inString = false;
for (var i = 0; i < value.Length; i++)
{
var current = value[i];
if (current == '"')
{
inString = !inString;
}
if (current == ',' && !inString)
{
var item = value.AsSpan(start, i - start).Trim();
if (item.Length > 0)
{
yield return item.ToString();
}
start = i + 1;
}
}
if (start < value.Length)
{
var item = value.AsSpan(start).Trim();
if (item.Length > 0)
{
yield return item.ToString();
}
}
}
private static string? ExtractString(ReadOnlySpan<char> value)
{
if (value.Length == 0)
{
return null;
}
if (value[0] == '"' && value[^1] == '"')
{
var inner = value[1..^1];
return inner.ToString();
}
var trimmed = value.Trim();
return trimmed.Length == 0 ? null : trimmed.ToString();
}
private static void FlushCurrent(RustCargoPackageBuilder? packageBuilder, ImmutableArray<RustCargoPackage>.Builder packages)
{
if (packageBuilder is null || !packageBuilder.HasData)
{
return;
}
if (packageBuilder.TryBuild(out var package))
{
packages.Add(package);
}
}
private sealed class RustCargoPackageBuilder
{
private readonly SortedSet<string> _dependencies = new(StringComparer.Ordinal);
private string? _name;
private string? _version;
private string? _source;
private string? _checksum;
public bool HasData => !string.IsNullOrWhiteSpace(_name);
public void SetField(ReadOnlySpan<char> key, string value)
{
if (key.SequenceEqual("name".AsSpan()))
{
_name ??= value.Trim();
}
else if (key.SequenceEqual("version".AsSpan()))
{
_version ??= value.Trim();
}
else if (key.SequenceEqual("source".AsSpan()))
{
_source ??= value.Trim();
}
else if (key.SequenceEqual("checksum".AsSpan()))
{
_checksum ??= value.Trim();
}
}
public void SetArray(string key, IEnumerable<string> values)
{
if (!string.Equals(key, "dependencies", StringComparison.Ordinal))
{
return;
}
foreach (var entry in values)
{
if (string.IsNullOrWhiteSpace(entry))
{
continue;
}
var normalized = entry.Trim();
if (normalized.Length > 0)
{
_dependencies.Add(normalized);
}
}
}
public bool TryBuild(out RustCargoPackage package)
{
if (string.IsNullOrWhiteSpace(_name))
{
package = null!;
return false;
}
package = new RustCargoPackage(
_name!,
_version ?? string.Empty,
_source,
_checksum,
_dependencies.ToArray());
return true;
}
}
}
internal sealed record RustCargoPackage(
string Name,
string Version,
string? Source,
string? Checksum,
IReadOnlyList<string> Dependencies);
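As a usage sketch, the fragment below exercises the parser end to end; the lockfile text and temp path are illustrative only:

```csharp
// Sketch: parse a minimal Cargo.lock fragment (illustrative data, hypothetical path).
var lockText = """
    [[package]]
    name = "serde"
    version = "1.0.0"
    source = "registry+https://github.com/rust-lang/crates.io-index"
    dependencies = ["serde_derive"]
    """;
var lockPath = Path.Combine(Path.GetTempPath(), "Cargo.lock");
File.WriteAllText(lockPath, lockText);
foreach (var package in RustCargoLockParser.Parse(lockPath, CancellationToken.None))
{
    Console.WriteLine($"{package.Name}@{package.Version} deps=[{string.Join(", ", package.Dependencies)}]");
}
```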

@@ -1,74 +1,74 @@
using System.Security;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
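// A file is treated as unchanged while its normalized full path, byte length, and
// last-write ticks all match; any of the three changing invalidates cached entries.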
internal readonly struct RustFileCacheKey : IEquatable<RustFileCacheKey>
{
private readonly string _normalizedPath;
private readonly long _length;
private readonly long _lastWriteTicks;
private RustFileCacheKey(string normalizedPath, long length, long lastWriteTicks)
{
_normalizedPath = normalizedPath;
_length = length;
_lastWriteTicks = lastWriteTicks;
}
public static bool TryCreate(string path, out RustFileCacheKey key)
{
key = default;
if (string.IsNullOrWhiteSpace(path))
{
return false;
}
try
{
var info = new FileInfo(path);
if (!info.Exists)
{
return false;
}
var normalizedPath = OperatingSystem.IsWindows()
? info.FullName.ToLowerInvariant()
: info.FullName;
key = new RustFileCacheKey(normalizedPath, info.Length, info.LastWriteTimeUtc.Ticks);
return true;
}
catch (IOException)
{
return false;
}
catch (UnauthorizedAccessException)
{
return false;
}
catch (SecurityException)
{
return false;
}
catch (ArgumentException)
{
return false;
}
catch (NotSupportedException)
{
return false;
}
}
public bool Equals(RustFileCacheKey other)
=> _length == other._length
&& _lastWriteTicks == other._lastWriteTicks
&& string.Equals(_normalizedPath, other._normalizedPath, StringComparison.Ordinal);
public override bool Equals(object? obj)
=> obj is RustFileCacheKey other && Equals(other);
public override int GetHashCode()
=> HashCode.Combine(_normalizedPath, _length, _lastWriteTicks);
}

View File

@@ -1,45 +1,45 @@
using System.Collections.Concurrent;
using System.Security;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustFileHashCache
{
private static readonly ConcurrentDictionary<RustFileCacheKey, string> Sha256Cache = new();
public static bool TryGetSha256(string path, out string? sha256)
{
sha256 = null;
if (!RustFileCacheKey.TryCreate(path, out var key))
{
return false;
}
try
{
sha256 = Sha256Cache.GetOrAdd(key, static (_, state) => ComputeSha256(state), path);
return !string.IsNullOrEmpty(sha256);
}
catch (IOException)
{
return false;
}
catch (UnauthorizedAccessException)
{
return false;
}
catch (SecurityException)
{
return false;
}
}
private static string ComputeSha256(string path)
{
using var stream = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read);
using var sha = System.Security.Cryptography.SHA256.Create();
var hash = sha.ComputeHash(stream);
return Convert.ToHexString(hash).ToLowerInvariant();
}
}
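A sketch of the intended call pattern — the path is illustrative; the first call hashes the file, and later calls with unchanged length/mtime metadata are served from the cache:

```csharp
// Illustrative call site (path hypothetical). The digest comes back as
// lower-case hex, matching Convert.ToHexString(...).ToLowerInvariant() above.
if (RustFileHashCache.TryGetSha256("/scan/root/vendor/serde/LICENSE-APACHE", out var sha256))
{
    Console.WriteLine($"sha256:{sha256}");
}
```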

View File

@@ -1,186 +1,186 @@
using System.Collections.Concurrent;
using System.Text.Json;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustFingerprintScanner
{
private static readonly EnumerationOptions Enumeration = new()
{
MatchCasing = MatchCasing.CaseSensitive,
IgnoreInaccessible = true,
RecurseSubdirectories = true,
AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint,
};
private static readonly string FingerprintSegment = $"{Path.DirectorySeparatorChar}.fingerprint{Path.DirectorySeparatorChar}";
private static readonly ConcurrentDictionary<RustFileCacheKey, RustFingerprintRecord?> Cache = new();
public static IReadOnlyList<RustFingerprintRecord> Scan(string rootPath, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(rootPath))
{
throw new ArgumentException("Root path is required", nameof(rootPath));
}
var results = new List<RustFingerprintRecord>();
foreach (var path in Directory.EnumerateFiles(rootPath, "*.json", Enumeration))
{
cancellationToken.ThrowIfCancellationRequested();
if (!path.Contains(FingerprintSegment, StringComparison.Ordinal))
{
continue;
}
if (!RustFileCacheKey.TryCreate(path, out var key))
{
continue;
}
var record = Cache.GetOrAdd(
key,
static (_, state) => ParseFingerprint(state),
path);
if (record is not null)
{
results.Add(record);
}
}
return results;
}
private static RustFingerprintRecord? ParseFingerprint(string path)
{
try
{
using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
using var document = JsonDocument.Parse(stream);
var root = document.RootElement;
var pkgId = TryGetString(root, "pkgid")
?? TryGetString(root, "package_id")
?? TryGetString(root, "packageId");
var (name, version, source) = ParseIdentity(pkgId, path);
if (string.IsNullOrWhiteSpace(name))
{
return null;
}
var profile = TryGetString(root, "profile");
var targetKind = TryGetKind(root);
return new RustFingerprintRecord(
Name: name!,
Version: version,
Source: source,
TargetKind: targetKind,
Profile: profile,
AbsolutePath: path);
}
catch (JsonException)
{
return null;
}
catch (IOException)
{
return null;
}
catch (UnauthorizedAccessException)
{
return null;
}
}
private static (string? Name, string? Version, string? Source) ParseIdentity(string? pkgId, string filePath)
{
if (!string.IsNullOrWhiteSpace(pkgId))
{
var span = pkgId.AsSpan().Trim();
var firstSpace = span.IndexOf(' ');
if (firstSpace > 0 && firstSpace < span.Length - 1)
{
var name = span[..firstSpace].ToString();
var remaining = span[(firstSpace + 1)..].Trim();
var secondSpace = remaining.IndexOf(' ');
if (secondSpace < 0)
{
return (name, remaining.ToString(), null);
}
var version = remaining[..secondSpace].ToString();
var potentialSource = remaining[(secondSpace + 1)..].Trim();
if (potentialSource.Length > 1 && potentialSource[0] == '(' && potentialSource[^1] == ')')
{
potentialSource = potentialSource[1..^1].Trim();
}
var source = potentialSource.Length == 0 ? null : potentialSource.ToString();
return (name, version, source);
}
}
var directory = Path.GetDirectoryName(filePath);
if (string.IsNullOrEmpty(directory))
{
return (null, null, null);
}
var crateDirectory = Path.GetFileName(directory);
if (string.IsNullOrWhiteSpace(crateDirectory))
{
return (null, null, null);
}
var dashIndex = crateDirectory.LastIndexOf('-');
if (dashIndex <= 0)
{
return (crateDirectory, null, null);
}
// Fingerprint directories are named "<crate>-<metadata hash>"; drop the hash suffix.
var maybeName = crateDirectory[..dashIndex];
return (maybeName, null, null);
}
private static string? TryGetKind(JsonElement root)
{
if (root.TryGetProperty("target_kind", out var array) && array.ValueKind == JsonValueKind.Array && array.GetArrayLength() > 0)
{
var first = array[0];
if (first.ValueKind == JsonValueKind.String)
{
return first.GetString();
}
}
if (root.TryGetProperty("target", out var target) && target.ValueKind == JsonValueKind.String)
{
return target.GetString();
}
return null;
}
private static string? TryGetString(JsonElement element, string propertyName)
{
if (element.TryGetProperty(propertyName, out var value) && value.ValueKind == JsonValueKind.String)
{
return value.GetString();
}
return null;
}
}
internal sealed record RustFingerprintRecord(
string Name,
string? Version,
string? Source,
string? TargetKind,
string? Profile,
string AbsolutePath);
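The pkgid strings Cargo emits come in a few shapes, and `ParseIdentity` peels them apart as sketched below; when no pkgid is present it falls back to the `<crate>-<hash>` fingerprint directory name. A record built from the fully-specified case, with illustrative values:

```csharp
// Pkgid "serde 1.0.188 (registry+https://github.com/rust-lang/crates.io-index)"
// => Name "serde", Version "1.0.188", Source "registry+https://...";
// pkgid "serde 1.0.188" leaves Source null; a missing pkgid yields the name only.
var record = new RustFingerprintRecord(
    Name: "serde",
    Version: "1.0.188",
    Source: "registry+https://github.com/rust-lang/crates.io-index",
    TargetKind: "lib",
    Profile: "release",
    AbsolutePath: "/work/target/release/.fingerprint/serde-<hash>/lib-serde.json"); // path illustrative
```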

View File

@@ -1,298 +1,298 @@
using System.Collections.Concurrent;
using System.Collections.Immutable;
using System.Security;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustLicenseScanner
{
private static readonly ConcurrentDictionary<string, RustLicenseIndex> IndexCache = new(StringComparer.Ordinal);
public static RustLicenseIndex GetOrCreate(string rootPath, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(rootPath) || !Directory.Exists(rootPath))
{
return RustLicenseIndex.Empty;
}
var normalizedRoot = NormalizeRoot(rootPath);
return IndexCache.GetOrAdd(
normalizedRoot,
static (_, state) => BuildIndex(state.RootPath, state.CancellationToken),
(RootPath: rootPath, CancellationToken: cancellationToken));
}
private static RustLicenseIndex BuildIndex(string rootPath, CancellationToken cancellationToken)
{
var byName = new Dictionary<string, List<RustLicenseInfo>>(StringComparer.Ordinal);
var enumeration = new EnumerationOptions
{
MatchCasing = MatchCasing.CaseSensitive,
IgnoreInaccessible = true,
RecurseSubdirectories = true,
AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint,
};
foreach (var cargoTomlPath in Directory.EnumerateFiles(rootPath, "Cargo.toml", enumeration))
{
cancellationToken.ThrowIfCancellationRequested();
if (IsUnderTargetDirectory(cargoTomlPath))
{
continue;
}
if (!TryParseCargoToml(rootPath, cargoTomlPath, out var info))
{
continue;
}
var normalizedName = RustCrateBuilder.NormalizeName(info.Name);
if (!byName.TryGetValue(normalizedName, out var entries))
{
entries = new List<RustLicenseInfo>();
byName[normalizedName] = entries;
}
entries.Add(info);
}
foreach (var entry in byName.Values)
{
entry.Sort(static (left, right) =>
{
// Lexicographic compare is sufficient here: Find() prefers an exact version
// match, so this sort only keeps the fallback entry deterministic.
var versionCompare = string.Compare(left.Version, right.Version, StringComparison.OrdinalIgnoreCase);
if (versionCompare != 0)
{
return versionCompare;
}
return string.Compare(left.CargoTomlRelativePath, right.CargoTomlRelativePath, StringComparison.Ordinal);
});
}
return new RustLicenseIndex(byName);
}
private static bool TryParseCargoToml(string rootPath, string cargoTomlPath, out RustLicenseInfo info)
{
info = default!;
try
{
using var stream = new FileStream(cargoTomlPath, FileMode.Open, FileAccess.Read, FileShare.Read);
using var reader = new StreamReader(stream, leaveOpen: false);
string? name = null;
string? version = null;
string? licenseExpression = null;
string? licenseFile = null;
var inPackageSection = false;
while (reader.ReadLine() is { } line)
{
line = StripComment(line).Trim();
if (line.Length == 0)
{
continue;
}
if (line.StartsWith("[", StringComparison.Ordinal))
{
inPackageSection = string.Equals(line, "[package]", StringComparison.OrdinalIgnoreCase);
if (!inPackageSection && line.StartsWith("[dependency", StringComparison.OrdinalIgnoreCase))
{
// Dependency tables follow [package]; no more package metadata ahead.
break;
}
continue;
}
if (!inPackageSection)
{
continue;
}
if (TryParseStringAssignment(line, "name", out var parsedName))
{
name ??= parsedName;
continue;
}
if (TryParseStringAssignment(line, "version", out var parsedVersion))
{
version ??= parsedVersion;
continue;
}
if (TryParseStringAssignment(line, "license", out var parsedLicense))
{
licenseExpression ??= parsedLicense;
continue;
}
if (TryParseStringAssignment(line, "license-file", out var parsedLicenseFile))
{
licenseFile ??= parsedLicenseFile;
continue;
}
}
if (string.IsNullOrWhiteSpace(name))
{
return false;
}
var expressions = ImmutableArray<string>.Empty;
if (!string.IsNullOrWhiteSpace(licenseExpression))
{
expressions = ImmutableArray.Create(licenseExpression!);
}
var files = ImmutableArray<RustLicenseFileReference>.Empty;
if (!string.IsNullOrWhiteSpace(licenseFile))
{
var directory = Path.GetDirectoryName(cargoTomlPath) ?? string.Empty;
var absolute = Path.GetFullPath(Path.Combine(directory, licenseFile!));
if (File.Exists(absolute))
{
var relative = NormalizeRelativePath(rootPath, absolute);
if (RustFileHashCache.TryGetSha256(absolute, out var sha256))
{
files = ImmutableArray.Create(new RustLicenseFileReference(relative, sha256));
}
else
{
files = ImmutableArray.Create(new RustLicenseFileReference(relative, null));
}
}
}
var cargoRelative = NormalizeRelativePath(rootPath, cargoTomlPath);
info = new RustLicenseInfo(
name!.Trim(),
string.IsNullOrWhiteSpace(version) ? null : version!.Trim(),
expressions,
files,
cargoRelative);
return true;
}
catch (IOException)
{
return false;
}
catch (UnauthorizedAccessException)
{
return false;
}
catch (SecurityException)
{
return false;
}
}
private static string NormalizeRoot(string rootPath)
{
var full = Path.GetFullPath(rootPath);
return OperatingSystem.IsWindows()
? full.ToLowerInvariant()
: full;
}
private static bool TryParseStringAssignment(string line, string key, out string? value)
{
value = null;
// A bare prefix match is safe here: when a "license-file" line is probed
// with key "license", the remainder starts with '-' rather than '=', so
// the assignment check below rejects it.
if (!line.StartsWith(key, StringComparison.Ordinal))
{
return false;
}
var remaining = line[key.Length..].TrimStart();
if (remaining.Length == 0 || remaining[0] != '=')
{
return false;
}
remaining = remaining[1..].TrimStart();
if (remaining.Length < 2 || remaining[0] != '"' || remaining[^1] != '"')
{
return false;
}
value = remaining[1..^1];
return true;
}
private static string StripComment(string line)
{
var index = line.IndexOf('#');
return index < 0 ? line : line[..index];
}
private static bool IsUnderTargetDirectory(string path)
{
var segment = $"{Path.DirectorySeparatorChar}target{Path.DirectorySeparatorChar}";
return path.Contains(segment, OperatingSystem.IsWindows() ? StringComparison.OrdinalIgnoreCase : StringComparison.Ordinal);
}
private static string NormalizeRelativePath(string rootPath, string absolutePath)
{
var relative = Path.GetRelativePath(rootPath, absolutePath);
if (string.IsNullOrWhiteSpace(relative) || relative == ".")
{
return ".";
}
return relative.Replace('\\', '/');
}
}
internal sealed class RustLicenseIndex
{
private readonly Dictionary<string, List<RustLicenseInfo>> _byName;
public static readonly RustLicenseIndex Empty = new(new Dictionary<string, List<RustLicenseInfo>>(StringComparer.Ordinal));
public RustLicenseIndex(Dictionary<string, List<RustLicenseInfo>> byName)
{
_byName = byName ?? throw new ArgumentNullException(nameof(byName));
}
public RustLicenseInfo? Find(string crateName, string? version)
{
if (string.IsNullOrWhiteSpace(crateName))
{
return null;
}
var normalized = RustCrateBuilder.NormalizeName(crateName);
if (!_byName.TryGetValue(normalized, out var list) || list.Count == 0)
{
return null;
}
if (!string.IsNullOrWhiteSpace(version))
{
var match = list.FirstOrDefault(entry => string.Equals(entry.Version, version, StringComparison.OrdinalIgnoreCase));
if (match is not null)
{
return match;
}
}
return list[0];
}
}
internal sealed record RustLicenseInfo(
string Name,
string? Version,
ImmutableArray<string> Expressions,
ImmutableArray<RustLicenseFileReference> Files,
string CargoTomlRelativePath);
internal sealed record RustLicenseFileReference(string RelativePath, string? Sha256);
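Putting the index together, a resolution flow for a crate discovered in Cargo.lock looks like this (root path illustrative):

```csharp
// Build (or reuse) the per-root index, then resolve a crate's declared license.
var index = RustLicenseScanner.GetOrCreate("/scan/root", CancellationToken.None);
var license = index.Find("serde", "1.0.188");
if (license is not null)
{
    // Expressions carries the SPDX string from Cargo.toml (e.g. "Apache-2.0");
    // Files carries hashed license-file references when `license-file` was used.
    Console.WriteLine(string.Join(", ", license.Expressions));
}
```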

View File

@@ -1,15 +1,15 @@
# StellaOps.Scanner.Queue — Agent Charter
## Mission
Deliver the scanner job queue backbone defined in `docs/modules/scanner/ARCHITECTURE.md`, providing deterministic, offline-friendly leasing semantics for WebService producers and Worker consumers.
## Responsibilities
- Define queue abstractions with idempotent enqueue tokens, acknowledgement, lease renewal, and claim support.
- Ship first-party adapters for Redis Streams and NATS JetStream, respecting offline deployments and allow-listed hosts.
- Surface health probes, structured diagnostics, and metrics needed by Scanner WebService/Worker.
- Document operational expectations and configuration binding hooks.
## Interfaces & Dependencies
- Consumes shared configuration primitives from `StellaOps.Configuration`.
- Exposes dependency injection extensions for `StellaOps.DependencyInjection`.
- Targets `net10.0` (preview) and aligns with scanner DTOs once `StellaOps.Scanner.Core` lands.
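A minimal sketch of the leasing surface these responsibilities imply — names and shapes are illustrative, not the shipped contract:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

public sealed record ScanJobPayload(string ImageDigest, string Tenant);

public interface IScanQueue
{
    // Idempotent enqueue: re-sending the same token must not create duplicate jobs.
    ValueTask<bool> EnqueueAsync(ScanJobPayload payload, string idempotencyToken, CancellationToken ct);

    // Lease up to batchSize messages; they stay invisible to other consumers until
    // acknowledged or the lease lapses, at which point another worker may claim them.
    ValueTask<IReadOnlyList<IScanQueueLease>> LeaseAsync(
        string consumerId, int batchSize, TimeSpan leaseDuration, CancellationToken ct);
}

public interface IScanQueueLease
{
    ScanJobPayload Payload { get; }
    ValueTask RenewAsync(TimeSpan extension, CancellationToken ct); // keep long scans alive
    ValueTask AcknowledgeAsync(CancellationToken ct);               // done; remove from the stream
}
```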

View File

@@ -1,4 +1,4 @@
[package]
name = "my_app"
version = "0.1.0"
license = "MIT"

View File

@@ -1,16 +1,16 @@
MIT License
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@@ -1,4 +1,4 @@
[package]
name = "serde"
version = "1.0.188"
license = "Apache-2.0"

View File

@@ -1,59 +1,59 @@
using System;
using System.IO;
using System.Linq;
using StellaOps.Scanner.Analyzers.Lang.Rust;
using StellaOps.Scanner.Analyzers.Lang.Tests.Harness;
using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities;
namespace StellaOps.Scanner.Analyzers.Lang.Tests.Rust;
public sealed class RustLanguageAnalyzerTests
{
[Fact]
public async Task SimpleFixtureProducesDeterministicOutputAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "rust", "simple");
var goldenPath = Path.Combine(fixturePath, "expected.json");
var usageHints = new LanguageUsageHints(new[]
{
Path.Combine(fixturePath, "usr/local/bin/my_app")
});
var analyzers = new ILanguageAnalyzer[]
{
new RustLanguageAnalyzer()
};
await LanguageAnalyzerTestHarness.AssertDeterministicAsync(
fixturePath,
goldenPath,
analyzers,
cancellationToken,
usageHints);
}
[Fact]
public async Task AnalyzerIsThreadSafeUnderConcurrencyAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "rust", "simple");
var analyzers = new ILanguageAnalyzer[]
{
new RustLanguageAnalyzer()
};
var workers = Math.Max(Environment.ProcessorCount, 4);
var tasks = Enumerable.Range(0, workers)
.Select(_ => LanguageAnalyzerTestHarness.RunToJsonAsync(fixturePath, analyzers, cancellationToken));
var results = await Task.WhenAll(tasks);
var baseline = results[0];
foreach (var result in results)
{
Assert.Equal(baseline, result);
}
}
}

View File

@@ -1,50 +1,50 @@
{
"eventId": "6d2d1b77-f3c3-4f70-8a9d-6f2d0c8801ab",
"kind": "scanner.event.report.ready",
"version": 1,
"tenant": "tenant-alpha",
"occurredAt": "2025-10-19T12:34:56Z",
"recordedAt": "2025-10-19T12:34:57Z",
"source": "scanner.webservice",
"idempotencyKey": "scanner.event.report.ready:tenant-alpha:report-abc",
"correlationId": "report-abc",
"traceId": "0af7651916cd43dd8448eb211c80319c",
"spanId": "b7ad6b7169203331",
"scope": {
"namespace": "acme/edge",
"repo": "api",
"digest": "sha256:feedface"
},
"attributes": {
"reportId": "report-abc",
"policyRevisionId": "rev-42",
"policyDigest": "digest-123",
"verdict": "blocked"
},
"payload": {
"reportId": "report-abc",
"scanId": "report-abc",
"imageDigest": "sha256:feedface",
"generatedAt": "2025-10-19T12:34:56Z",
"verdict": "fail",
"summary": {
"total": 1,
"blocked": 1,
"warned": 0,
"ignored": 0,
"quieted": 0
},
"delta": {
"newCritical": 1,
"kev": [
"CVE-2024-9999"
]
},
"quietedFindingCount": 0,
"policy": {
"digest": "digest-123",
"revisionId": "rev-42"
},
"links": {
"report": {
"ui": "https://scanner.example/ui/reports/report-abc",
@@ -59,43 +59,43 @@
"api": "https://scanner.example/api/v1/reports/report-abc/attestation"
}
},
"dsse": {
"payloadType": "application/vnd.stellaops.report+json",
"payload": "eyJyZXBvcnRJZCI6InJlcG9ydC1hYmMiLCJpbWFnZURpZ2VzdCI6InNoYTI1NjpmZWVkZmFjZSIsImdlbmVyYXRlZEF0IjoiMjAyNS0xMC0xOVQxMjozNDo1NiswMDowMCIsInZlcmRpY3QiOiJibG9ja2VkIiwicG9saWN5Ijp7InJldmlzaW9uSWQiOiJyZXYtNDIiLCJkaWdlc3QiOiJkaWdlc3QtMTIzIn0sInN1bW1hcnkiOnsidG90YWwiOjEsImJsb2NrZWQiOjEsIndhcm5lZCI6MCwiaWdub3JlZCI6MCwicXVpZXRlZCI6MH0sInZlcmRpY3RzIjpbeyJmaW5kaW5nSWQiOiJmaW5kaW5nLTEiLCJzdGF0dXMiOiJCbG9ja2VkIiwic2NvcmUiOjQ3LjUsInNvdXJjZVRydXN0IjoiTlZEIiwicmVhY2hhYmlsaXR5IjoicnVudGltZSJ9XSwiaXNzdWVzIjpbXX0=",
"signatures": [
{
"keyId": "test-key",
"algorithm": "hs256",
"signature": "signature-value"
}
]
},
"report": {
"reportId": "report-abc",
"generatedAt": "2025-10-19T12:34:56Z",
"imageDigest": "sha256:feedface",
"policy": {
"digest": "digest-123",
"revisionId": "rev-42"
},
"summary": {
"total": 1,
"blocked": 1,
"warned": 0,
"ignored": 0,
"quieted": 0
},
"verdict": "blocked",
"verdicts": [
{
"findingId": "finding-1",
"status": "Blocked",
"score": 47.5,
"sourceTrust": "NVD",
"reachability": "runtime"
}
],
"issues": []
}
}
}
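The `dsse.payload` in these fixtures is plain base64 over the canonical report JSON; a consumer can recover it as sketched below (decode only — signature verification is a separate flow). The file name is hypothetical, standing in for the fixture's payload value:

```csharp
using System;
using System.IO;
using System.Text;
using System.Text.Json;

// Hypothetical: dsse-payload.b64 holds the "dsse.payload" string from the envelope.
var payloadB64 = File.ReadAllText("dsse-payload.b64").Trim();
var reportJson = Encoding.UTF8.GetString(Convert.FromBase64String(payloadB64));
using var report = JsonDocument.Parse(reportJson);
Console.WriteLine(report.RootElement.GetProperty("verdict").GetString()); // "blocked"
```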

View File

@@ -1,56 +1,56 @@
{
"eventId": "08a6de24-4a94-4d14-8432-9d14f36f6da3",
"kind": "scanner.event.scan.completed",
"version": 1,
"tenant": "tenant-alpha",
"occurredAt": "2025-10-19T12:34:56Z",
"recordedAt": "2025-10-19T12:34:57Z",
"source": "scanner.webservice",
"idempotencyKey": "scanner.event.scan.completed:tenant-alpha:report-abc",
"correlationId": "report-abc",
"traceId": "4bf92f3577b34da6a3ce929d0e0e4736",
"scope": {
"namespace": "acme/edge",
"repo": "api",
"digest": "sha256:feedface"
},
"attributes": {
"reportId": "report-abc",
"policyRevisionId": "rev-42",
"policyDigest": "digest-123",
"verdict": "blocked"
},
"payload": {
"reportId": "report-abc",
"scanId": "report-abc",
"imageDigest": "sha256:feedface",
"verdict": "fail",
"summary": {
"total": 1,
"blocked": 1,
"warned": 0,
"ignored": 0,
"quieted": 0
},
"delta": {
"newCritical": 1,
"kev": [
"CVE-2024-9999"
]
},
"policy": {
"digest": "digest-123",
"revisionId": "rev-42"
},
"findings": [
{
"id": "finding-1",
"severity": "Critical",
"cve": "CVE-2024-9999",
"purl": "pkg:docker/acme/edge-api@sha256-feedface",
"reachability": "runtime"
}
],
"links": {
"report": {
"ui": "https://scanner.example/ui/reports/report-abc",
@@ -65,43 +65,43 @@
"api": "https://scanner.example/api/v1/reports/report-abc/attestation"
}
},
"dsse": {
"payloadType": "application/vnd.stellaops.report+json",
"payload": "eyJyZXBvcnRJZCI6InJlcG9ydC1hYmMiLCJpbWFnZURpZ2VzdCI6InNoYTI1NjpmZWVkZmFjZSIsImdlbmVyYXRlZEF0IjoiMjAyNS0xMC0xOVQxMjozNDo1NiswMDowMCIsInZlcmRpY3QiOiJibG9ja2VkIiwicG9saWN5Ijp7InJldmlzaW9uSWQiOiJyZXYtNDIiLCJkaWdlc3QiOiJkaWdlc3QtMTIzIn0sInN1bW1hcnkiOnsidG90YWwiOjEsImJsb2NrZWQiOjEsIndhcm5lZCI6MCwiaWdub3JlZCI6MCwicXVpZXRlZCI6MH0sInZlcmRpY3RzIjpbeyJmaW5kaW5nSWQiOiJmaW5kaW5nLTEiLCJzdGF0dXMiOiJCbG9ja2VkIiwic2NvcmUiOjQ3LjUsInNvdXJjZVRydXN0IjoiTlZEIiwicmVhY2hhYmlsaXR5IjoicnVudGltZSJ9XSwiaXNzdWVzIjpbXX0=",
"signatures": [
{
"keyId": "test-key",
"algorithm": "hs256",
"signature": "signature-value"
}
]
},
"report": {
"reportId": "report-abc",
"generatedAt": "2025-10-19T12:34:56Z",
"imageDigest": "sha256:feedface",
"policy": {
"digest": "digest-123",
"revisionId": "rev-42"
},
"summary": {
"total": 1,
"blocked": 1,
"warned": 0,
"ignored": 0,
"quieted": 0
},
"verdict": "blocked",
"verdicts": [
{
"findingId": "finding-1",
"status": "Blocked",
"score": 47.5,
"sourceTrust": "NVD",
"reachability": "runtime"
}
],
"issues": []
}
}
}

View File

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.WebService — Agent Charter
## Mission
Implement Scheduler control plane per `docs/modules/scheduler/ARCHITECTURE.md`.

View File

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.ImpactIndex — Agent Charter
## Mission
Build the global impact index per `docs/modules/scheduler/ARCHITECTURE.md` (roaring bitmaps, selectors, snapshotting).

View File

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Models — Agent Charter
## Mission
Define Scheduler DTOs (Schedule, Run, ImpactSet, Selector, DeltaSummary) per `docs/modules/scheduler/ARCHITECTURE.md`.

View File

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Queue — Agent Charter
## Mission
Provide queue abstraction (Redis Streams / NATS JetStream) for planner inputs and runner segments per `docs/modules/scheduler/ARCHITECTURE.md`.

View File

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Storage.Mongo — Agent Charter
## Mission
Implement Mongo persistence (schedules, runs, impact cursors, locks, audit) per `docs/modules/scheduler/ARCHITECTURE.md`.

View File

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Worker — Agent Charter
## Mission
Implement Scheduler planners/runners per `docs/modules/scheduler/ARCHITECTURE.md`.

View File

@@ -1,43 +1,43 @@
# SCHED-WORKER-16-205 — Scheduler Worker Observability
_Sprint 16 · Scheduler Worker Guild_
The scheduler worker now exposes first-class metrics covering planner latency,
runner throughput, and backlog health.
## Meter: `StellaOps.Scheduler.Worker`
| Metric | Type | Tags | Description |
| --- | --- | --- | --- |
| `scheduler_planner_runs_total` | Counter | `mode`, `status` | Planner outcomes (`enqueued`, `no_work`, `failed`). |
| `scheduler_planner_latency_seconds` | Histogram | `mode`, `status` | Time between run creation and planner completion. |
| `scheduler_runner_segments_total` | Counter | `mode`, `status` | Runner segments processed (`Completed`, `persist_failed`, `RunMissing`). |
| `scheduler_runner_images_total` | Counter | `mode`, `delta` | Images processed per mode, split by whether a delta was observed. |
| `scheduler_runner_delta_total` | Counter | `mode` | Total new findings observed. |
| `scheduler_runner_delta_critical_total` | Counter | `mode` | Critical findings observed. |
| `scheduler_runner_delta_high_total` | Counter | `mode` | High findings observed. |
| `scheduler_runner_delta_kev_total` | Counter | `mode` | KEV hits surfaced across runner segments. |
| `scheduler_run_duration_seconds` | Histogram | `mode`, `result` | End-to-end run durations (currently recorded for successful completions). |
| `scheduler_runs_active` | Up/down counter | `mode` | Active runs in-flight. |
| `scheduler_runner_backlog` | Observable gauge | `mode`, `scheduleId` | Remaining images awaiting runner processing per schedule. |
## Instrumentation notes
- Planner records latency once a run transitions out of `Planning`. `no_work`
completions emit zero-duration runs without incrementing the active counter.
- Runner updates backlog after every segment and decrements the active counter
when a run reaches `Completed`.
- Delta counters aggregate per severity and KEV hit; they only increment when
`DeltaSummary` reports meaningful changes.
- Metrics are emitted regardless of Notify availability so operators can track
queue pressure even in air-gapped deployments.
## Dashboards & alerts
- **Grafana dashboard:** `docs/modules/scheduler/operations/worker-grafana-dashboard.json`
(import into Prometheus-backed Grafana). Panels mirror the metrics above with
mode filters.
- **Prometheus rules:** `docs/modules/scheduler/operations/worker-prometheus-rules.yaml`
provides planner failure/latency, backlog, and stuck-run alerts.
- **Operations guide:** see `docs/modules/scheduler/operations/worker.md` for
runbook steps, alert context, and dashboard wiring instructions.
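For orientation, wiring one of these instruments through `System.Diagnostics.Metrics` looks roughly like this; tag values are illustrative and the real worker owns the meter's lifetime:

```csharp
using System.Diagnostics;
using System.Diagnostics.Metrics;

var meter = new Meter("StellaOps.Scheduler.Worker");
var plannerRuns = meter.CreateCounter<long>("scheduler_planner_runs_total");
var plannerLatency = meter.CreateHistogram<double>("scheduler_planner_latency_seconds");

// One planner completion: count the outcome and record its latency with the
// same tag set so the two series stay joinable in Grafana.
var tags = new TagList { { "mode", "contentRefresh" }, { "status", "enqueued" } };
plannerRuns.Add(1, tags);
plannerLatency.Record(1.8, tags);
```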

View File

@@ -1,20 +1,20 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Osv/StellaOps.Concelier.Connector.Osv.csproj" />
<ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Ghsa/StellaOps.Concelier.Connector.Ghsa.csproj" />
<ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Nvd/StellaOps.Concelier.Connector.Nvd.csproj" />
<ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Common/StellaOps.Concelier.Connector.Common.csproj" />
<ProjectReference Include="../../Concelier/__Libraries/StellaOps.Concelier.Storage.Mongo/StellaOps.Concelier.Storage.Mongo.csproj" />
<ProjectReference Include="../../Concelier/__Libraries/StellaOps.Concelier.Models/StellaOps.Concelier.Models.csproj" />
<ProjectReference Include="../../Concelier/__Libraries/StellaOps.Concelier.Testing/StellaOps.Concelier.Testing.csproj" />
</ItemGroup>
</Project>

View File

@@ -1,378 +1,378 @@
using System.Linq;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Ghsa;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Ghsa.Internal;
using StellaOps.Concelier.Connector.Osv.Internal;
using StellaOps.Concelier.Connector.Osv;
using StellaOps.Concelier.Connector.Nvd;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
var serializerOptions = new JsonSerializerOptions(JsonSerializerDefaults.Web)
{
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
};
var projectRoot = Path.GetFullPath(Path.Combine(AppContext.BaseDirectory, "..", "..", "..", "..", ".."));
var osvFixturesPath = Path.Combine(projectRoot, "src", "StellaOps.Concelier.Connector.Osv.Tests", "Fixtures");
var ghsaFixturesPath = Path.Combine(projectRoot, "src", "StellaOps.Concelier.Connector.Ghsa.Tests", "Fixtures");
var nvdFixturesPath = Path.Combine(projectRoot, "src", "StellaOps.Concelier.Connector.Nvd.Tests", "Nvd", "Fixtures");
RewriteOsvFixtures(osvFixturesPath);
RewriteSnapshotFixtures(osvFixturesPath);
// The GHSA half of the osv-ghsa parity pair lives alongside the OSV fixtures,
// hence the OSV path here.
RewriteGhsaFixtures(osvFixturesPath);
RewriteCreditParityFixtures(ghsaFixturesPath, nvdFixturesPath);
return;
void RewriteOsvFixtures(string fixturesPath)
{
var rawPath = Path.Combine(fixturesPath, "osv-ghsa.raw-osv.json");
if (!File.Exists(rawPath))
{
Console.WriteLine($"[FixtureUpdater] OSV raw fixture missing: {rawPath}");
return;
}
using var document = JsonDocument.Parse(File.ReadAllText(rawPath));
var advisories = new List<Advisory>();
foreach (var element in document.RootElement.EnumerateArray())
{
var dto = JsonSerializer.Deserialize<OsvVulnerabilityDto>(element.GetRawText(), serializerOptions);
if (dto is null)
{
continue;
}
var ecosystem = dto.Affected?.FirstOrDefault()?.Package?.Ecosystem ?? "unknown";
var uri = new Uri($"https://osv.dev/vulnerability/{dto.Id}");
var documentRecord = new DocumentRecord(
Guid.NewGuid(),
OsvConnectorPlugin.SourceName,
uri.ToString(),
DateTimeOffset.UtcNow,
"fixture-sha",
DocumentStatuses.PendingMap,
"application/json",
null,
new Dictionary<string, string>(StringComparer.Ordinal)
{
["osv.ecosystem"] = ecosystem,
},
null,
DateTimeOffset.UtcNow,
null,
null);
var payload = BsonDocument.Parse(element.GetRawText());
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
documentRecord.Id,
OsvConnectorPlugin.SourceName,
"osv.v1",
payload,
DateTimeOffset.UtcNow);
var advisory = OsvMapper.Map(dto, documentRecord, dtoRecord, ecosystem);
advisories.Add(advisory);
}
advisories.Sort((left, right) => string.Compare(left.AdvisoryKey, right.AdvisoryKey, StringComparison.Ordinal));
var snapshot = SnapshotSerializer.ToSnapshot(advisories);
File.WriteAllText(Path.Combine(fixturesPath, "osv-ghsa.osv.json"), snapshot);
Console.WriteLine($"[FixtureUpdater] Updated {Path.Combine(fixturesPath, "osv-ghsa.osv.json")}");
}
void RewriteSnapshotFixtures(string fixturesPath)
{
var baselinePublished = new DateTimeOffset(2025, 1, 5, 12, 0, 0, TimeSpan.Zero);
var baselineModified = new DateTimeOffset(2025, 1, 8, 6, 30, 0, TimeSpan.Zero);
var baselineFetched = new DateTimeOffset(2025, 1, 8, 7, 0, 0, TimeSpan.Zero);
var cases = new (string Ecosystem, string Purl, string PackageName, string SnapshotFile)[]
{
("npm", "pkg:npm/%40scope%2Fleft-pad", "@scope/left-pad", "osv-npm.snapshot.json"),
("PyPI", "pkg:pypi/requests", "requests", "osv-pypi.snapshot.json"),
};
foreach (var (ecosystem, purl, packageName, snapshotFile) in cases)
{
var dto = new OsvVulnerabilityDto
{
Id = $"OSV-2025-{ecosystem}-0001",
Summary = $"{ecosystem} package vulnerability",
Details = $"Detailed description for {ecosystem} package {packageName}.",
Published = baselinePublished,
Modified = baselineModified,
Aliases = new[] { $"CVE-2025-11{ecosystem.Length}", $"GHSA-{ecosystem.Length}abc-{ecosystem.Length}def-{ecosystem.Length}ghi" },
Related = new[] { $"OSV-RELATED-{ecosystem}-42" },
References = new[]
{
new OsvReferenceDto { Url = $"https://example.com/{ecosystem}/advisory", Type = "ADVISORY" },
new OsvReferenceDto { Url = $"https://example.com/{ecosystem}/fix", Type = "FIX" },
},
Severity = new[]
{
new OsvSeverityDto { Type = "CVSS_V3", Score = "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H" },
},
Affected = new[]
{
new OsvAffectedPackageDto
{
Package = new OsvPackageDto
{
Ecosystem = ecosystem,
Name = packageName,
Purl = purl,
},
Ranges = new[]
{
new OsvRangeDto
{
Type = "SEMVER",
Events = new[]
{
new OsvEventDto { Introduced = "0" },
new OsvEventDto { Fixed = "2.0.0" },
},
},
},
Versions = new[] { "1.0.0", "1.5.0" },
EcosystemSpecific = JsonDocument.Parse("{\"severity\":\"high\"}").RootElement.Clone(),
},
},
DatabaseSpecific = JsonDocument.Parse("{\"source\":\"osv.dev\"}").RootElement.Clone(),
};
var document = new DocumentRecord(
Guid.NewGuid(),
OsvConnectorPlugin.SourceName,
$"https://osv.dev/vulnerability/{dto.Id}",
baselineFetched,
"fixture-sha",
DocumentStatuses.PendingParse,
"application/json",
null,
new Dictionary<string, string>(StringComparer.Ordinal) { ["osv.ecosystem"] = ecosystem },
null,
baselineModified,
null);
var payload = BsonDocument.Parse(JsonSerializer.Serialize(dto, serializerOptions));
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, OsvConnectorPlugin.SourceName, "osv.v1", payload, baselineModified);
var advisory = OsvMapper.Map(dto, document, dtoRecord, ecosystem);
var snapshot = SnapshotSerializer.ToSnapshot(advisory);
File.WriteAllText(Path.Combine(fixturesPath, snapshotFile), snapshot);
Console.WriteLine($"[FixtureUpdater] Updated {Path.Combine(fixturesPath, snapshotFile)}");
}
}
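// Re-maps the raw GHSA capture through GhsaRecordParser/GhsaMapper; entries that fail to
// parse are skipped rather than aborting the whole rewrite.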
void RewriteGhsaFixtures(string fixturesPath)
{
var rawPath = Path.Combine(fixturesPath, "osv-ghsa.raw-ghsa.json");
if (!File.Exists(rawPath))
{
Console.WriteLine($"[FixtureUpdater] GHSA raw fixture missing: {rawPath}");
return;
}
JsonDocument document;
try
{
document = JsonDocument.Parse(File.ReadAllText(rawPath));
}
catch (JsonException ex)
{
Console.WriteLine($"[FixtureUpdater] Failed to parse GHSA raw fixture '{rawPath}': {ex.Message}");
return;
}
using (document)
{
var advisories = new List<Advisory>();
foreach (var element in document.RootElement.EnumerateArray())
{
GhsaRecordDto dto;
try
{
dto = GhsaRecordParser.Parse(Encoding.UTF8.GetBytes(element.GetRawText()));
}
catch (JsonException)
{
continue;
}
var uri = new Uri($"https://github.com/advisories/{dto.GhsaId}");
var documentRecord = new DocumentRecord(
Guid.NewGuid(),
GhsaConnectorPlugin.SourceName,
uri.ToString(),
DateTimeOffset.UtcNow,
"fixture-sha",
DocumentStatuses.PendingMap,
"application/json",
null,
new Dictionary<string, string>(StringComparer.Ordinal),
null,
DateTimeOffset.UtcNow,
null,
null);
var advisory = GhsaMapper.Map(dto, documentRecord, DateTimeOffset.UtcNow);
advisories.Add(advisory);
}
advisories.Sort((left, right) => string.Compare(left.AdvisoryKey, right.AdvisoryKey, StringComparison.Ordinal));
var snapshot = SnapshotSerializer.ToSnapshot(advisories);
File.WriteAllText(Path.Combine(fixturesPath, "osv-ghsa.ghsa.json"), snapshot);
Console.WriteLine($"[FixtureUpdater] Updated {Path.Combine(fixturesPath, "osv-ghsa.ghsa.json")}");
}
}
void RewriteCreditParityFixtures(string ghsaFixturesPath, string nvdFixturesPath)
{
Directory.CreateDirectory(ghsaFixturesPath);
Directory.CreateDirectory(nvdFixturesPath);
var advisoryKeyGhsa = "GHSA-credit-parity";
var advisoryKeyNvd = "CVE-2025-5555";
var recordedAt = new DateTimeOffset(2025, 10, 10, 15, 0, 0, TimeSpan.Zero);
var published = new DateTimeOffset(2025, 10, 9, 18, 30, 0, TimeSpan.Zero);
var modified = new DateTimeOffset(2025, 10, 10, 12, 0, 0, TimeSpan.Zero);
AdvisoryCredit[] CreateCredits(string source) =>
[
CreateCredit("Alice Researcher", "reporter", new[] { "mailto:alice.researcher@example.com" }, source),
CreateCredit("Bob Maintainer", "remediation_developer", new[] { "https://github.com/acme/bob-maintainer" }, source)
];
AdvisoryCredit CreateCredit(string displayName, string role, IReadOnlyList<string> contacts, string source)
{
var provenance = new AdvisoryProvenance(
source,
"credit",
$"{source}:{displayName.ToLowerInvariant().Replace(' ', '-')}",
recordedAt,
new[] { ProvenanceFieldMasks.Credits });
return new AdvisoryCredit(displayName, role, contacts, provenance);
}
AdvisoryReference[] CreateReferences(string sourceName, params (string Url, string Kind)[] entries)
{
if (entries is null || entries.Length == 0)
{
return Array.Empty<AdvisoryReference>();
}
var references = new List<AdvisoryReference>(entries.Length);
foreach (var entry in entries)
{
var provenance = new AdvisoryProvenance(
sourceName,
"reference",
entry.Url,
recordedAt,
new[] { ProvenanceFieldMasks.References });
references.Add(new AdvisoryReference(
entry.Url,
entry.Kind,
sourceTag: null,
summary: null,
provenance));
}
return references.ToArray();
}
Advisory CreateAdvisory(
string sourceName,
string advisoryKey,
IEnumerable<string> aliases,
AdvisoryCredit[] credits,
AdvisoryReference[] references,
string documentValue)
{
var documentProvenance = new AdvisoryProvenance(
sourceName,
"document",
documentValue,
recordedAt,
new[] { ProvenanceFieldMasks.Advisory });
var mappingProvenance = new AdvisoryProvenance(
sourceName,
"mapping",
advisoryKey,
recordedAt,
new[] { ProvenanceFieldMasks.Advisory });
return new Advisory(
advisoryKey,
"Credit parity regression fixture",
"Credit parity regression fixture",
"en",
published,
modified,
"moderate",
exploitKnown: false,
aliases,
credits,
references,
Array.Empty<AffectedPackage>(),
Array.Empty<CvssMetric>(),
new[] { documentProvenance, mappingProvenance });
}
var ghsa = CreateAdvisory(
"ghsa",
advisoryKeyGhsa,
new[] { advisoryKeyGhsa, advisoryKeyNvd },
CreateCredits("ghsa"),
CreateReferences(
"ghsa",
( $"https://github.com/advisories/{advisoryKeyGhsa}", "advisory"),
( "https://example.com/ghsa/patch", "patch")),
$"security/advisories/{advisoryKeyGhsa}");
var osv = CreateAdvisory(
OsvConnectorPlugin.SourceName,
advisoryKeyGhsa,
new[] { advisoryKeyGhsa, advisoryKeyNvd },
CreateCredits(OsvConnectorPlugin.SourceName),
CreateReferences(
OsvConnectorPlugin.SourceName,
( $"https://github.com/advisories/{advisoryKeyGhsa}", "advisory"),
( $"https://osv.dev/vulnerability/{advisoryKeyGhsa}", "advisory")),
$"https://osv.dev/vulnerability/{advisoryKeyGhsa}");
var nvd = CreateAdvisory(
NvdConnectorPlugin.SourceName,
advisoryKeyNvd,
new[] { advisoryKeyNvd, advisoryKeyGhsa },
CreateCredits(NvdConnectorPlugin.SourceName),
CreateReferences(
NvdConnectorPlugin.SourceName,
( $"https://services.nvd.nist.gov/vuln/detail/{advisoryKeyNvd}", "advisory"),
( "https://example.com/nvd/reference", "report")),
$"https://services.nvd.nist.gov/vuln/detail/{advisoryKeyNvd}");
var ghsaSnapshot = SnapshotSerializer.ToSnapshot(ghsa);
var osvSnapshot = SnapshotSerializer.ToSnapshot(osv);
var nvdSnapshot = SnapshotSerializer.ToSnapshot(nvd);
File.WriteAllText(Path.Combine(ghsaFixturesPath, "credit-parity.ghsa.json"), ghsaSnapshot);
File.WriteAllText(Path.Combine(ghsaFixturesPath, "credit-parity.osv.json"), osvSnapshot);
File.WriteAllText(Path.Combine(ghsaFixturesPath, "credit-parity.nvd.json"), nvdSnapshot);
File.WriteAllText(Path.Combine(nvdFixturesPath, "credit-parity.ghsa.json"), ghsaSnapshot);
File.WriteAllText(Path.Combine(nvdFixturesPath, "credit-parity.osv.json"), osvSnapshot);
File.WriteAllText(Path.Combine(nvdFixturesPath, "credit-parity.nvd.json"), nvdSnapshot);
Console.WriteLine($"[FixtureUpdater] Updated credit parity fixtures under {ghsaFixturesPath} and {nvdFixturesPath}");
}

View File

@@ -1,18 +1,18 @@
<?xml version="1.0" encoding="utf-8"?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="10.0.0-rc.2.25502.107" />
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\..\src\StellaOps.Scanner.Analyzers.Lang\StellaOps.Scanner.Analyzers.Lang.csproj" />
<ProjectReference Include="..\..\src\StellaOps.Scanner.Core\StellaOps.Scanner.Core.csproj" />
</ItemGroup>
</Project>

View File

@@ -1,348 +1,348 @@
using System.Collections.Immutable;
using System.Diagnostics;
using System.Reflection;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Scanner.Analyzers.Lang;
using StellaOps.Scanner.Analyzers.Lang.Plugin;
using StellaOps.Scanner.Core.Security;
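// A smoke scenario pairs a fixture directory name with the relative paths the analyzer
// should treat as usage hints for that scenario.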
internal sealed record SmokeScenario(string Name, string[] UsageHintRelatives)
{
public IReadOnlyList<string> ResolveUsageHints(string scenarioRoot)
=> UsageHintRelatives.Select(relative => Path.GetFullPath(Path.Combine(scenarioRoot, relative))).ToArray();
}
internal sealed class SmokeOptions
{
public string RepoRoot { get; set; } = Directory.GetCurrentDirectory();
public string PluginDirectoryName { get; set; } = "StellaOps.Scanner.Analyzers.Lang.Python";
public string FixtureRelativePath { get; set; } = Path.Combine("src", "StellaOps.Scanner.Analyzers.Lang.Python.Tests", "Fixtures", "lang", "python");
public static SmokeOptions Parse(string[] args)
{
var options = new SmokeOptions();
for (var index = 0; index < args.Length; index++)
{
var current = args[index];
switch (current)
{
case "--repo-root":
case "-r":
options.RepoRoot = RequireValue(args, ref index, current);
break;
case "--plugin-directory":
case "-p":
options.PluginDirectoryName = RequireValue(args, ref index, current);
break;
case "--fixture-path":
case "-f":
options.FixtureRelativePath = RequireValue(args, ref index, current);
break;
case "--help":
case "-h":
PrintUsage();
Environment.Exit(0);
break;
default:
throw new ArgumentException($"Unknown argument '{current}'. Use --help for usage.");
}
}
options.RepoRoot = Path.GetFullPath(options.RepoRoot);
return options;
}
private static string RequireValue(string[] args, ref int index, string switchName)
{
if (index + 1 >= args.Length)
{
throw new ArgumentException($"Missing value for '{switchName}'.");
}
index++;
var value = args[index];
if (string.IsNullOrWhiteSpace(value))
{
throw new ArgumentException($"Value for '{switchName}' cannot be empty.");
}
return value;
}
private static void PrintUsage()
{
Console.WriteLine("Language Analyzer Smoke Harness");
Console.WriteLine("Usage: dotnet run --project src/Tools/LanguageAnalyzerSmoke -- [options]");
Console.WriteLine();
Console.WriteLine("Options:");
Console.WriteLine(" -r, --repo-root <path> Repository root (defaults to current working directory)");
Console.WriteLine(" -p, --plugin-directory <name> Analyzer plug-in directory under plugins/scanner/analyzers/lang (defaults to StellaOps.Scanner.Analyzers.Lang.Python)");
Console.WriteLine(" -f, --fixture-path <path> Relative path to fixtures root (defaults to src/Scanner/__Tests/StellaOps.Scanner.Analyzers.Lang.Python.Tests/Fixtures/lang/python)");
Console.WriteLine(" -h, --help Show usage information");
}
}
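// Minimal projection of the plug-in manifest.json — only the fields the smoke harness
// validates are modelled here.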
internal sealed record PluginManifest
{
[JsonPropertyName("schemaVersion")]
public string SchemaVersion { get; init; } = string.Empty;
[JsonPropertyName("id")]
public string Id { get; init; } = string.Empty;
[JsonPropertyName("displayName")]
public string DisplayName { get; init; } = string.Empty;
[JsonPropertyName("version")]
public string Version { get; init; } = string.Empty;
[JsonPropertyName("requiresRestart")]
public bool RequiresRestart { get; init; }
[JsonPropertyName("entryPoint")]
public PluginEntryPoint EntryPoint { get; init; } = new();
[JsonPropertyName("capabilities")]
public IReadOnlyList<string> Capabilities { get; init; } = Array.Empty<string>();
[JsonPropertyName("metadata")]
public IReadOnlyDictionary<string, string> Metadata { get; init; } = ImmutableDictionary<string, string>.Empty;
}
internal sealed record PluginEntryPoint
{
[JsonPropertyName("type")]
public string Type { get; init; } = string.Empty;
[JsonPropertyName("assembly")]
public string Assembly { get; init; } = string.Empty;
[JsonPropertyName("typeName")]
public string TypeName { get; init; } = string.Empty;
}
file static class Program
{
private static readonly SmokeScenario[] PythonScenarios =
{
new("simple-venv", new[] { Path.Combine("bin", "simple-tool") }),
new("pip-cache", new[] { Path.Combine("lib", "python3.11", "site-packages", "cache_pkg-1.2.3.data", "scripts", "cache-tool") }),
new("layered-editable", new[] { Path.Combine("layer1", "usr", "bin", "layered-cli") })
};
public static async Task<int> Main(string[] args)
{
try
{
var options = SmokeOptions.Parse(args);
await RunAsync(options).ConfigureAwait(false);
Console.WriteLine("✅ Python analyzer smoke checks passed");
return 0;
}
catch (Exception ex)
{
Console.Error.WriteLine($"❌ {ex.Message}");
return 1;
}
}
private static async Task RunAsync(SmokeOptions options)
{
ValidateOptions(options);
var pluginRoot = Path.Combine(options.RepoRoot, "plugins", "scanner", "analyzers", "lang", options.PluginDirectoryName);
var manifestPath = Path.Combine(pluginRoot, "manifest.json");
if (!File.Exists(manifestPath))
{
throw new FileNotFoundException($"Plug-in manifest not found at '{manifestPath}'.", manifestPath);
}
using var manifestStream = File.OpenRead(manifestPath);
var manifest = JsonSerializer.Deserialize<PluginManifest>(manifestStream, new JsonSerializerOptions
{
PropertyNameCaseInsensitive = true,
ReadCommentHandling = JsonCommentHandling.Skip
}) ?? throw new InvalidOperationException($"Unable to parse manifest '{manifestPath}'.");
ValidateManifest(manifest, options.PluginDirectoryName);
var pluginAssemblyPath = Path.Combine(pluginRoot, manifest.EntryPoint.Assembly);
if (!File.Exists(pluginAssemblyPath))
{
throw new FileNotFoundException($"Plug-in assembly '{manifest.EntryPoint.Assembly}' not found under '{pluginRoot}'.", pluginAssemblyPath);
}
var sha256 = ComputeSha256(pluginAssemblyPath);
Console.WriteLine($"→ Plug-in assembly SHA-256: {sha256}");
using var serviceProvider = BuildServiceProvider();
var catalog = new LanguageAnalyzerPluginCatalog(new RestartOnlyPluginGuard(), NullLogger<LanguageAnalyzerPluginCatalog>.Instance);
catalog.LoadFromDirectory(pluginRoot, seal: true);
if (catalog.Plugins.Count == 0)
{
throw new InvalidOperationException($"No analyzer plug-ins were loaded from '{pluginRoot}'.");
}
var analyzerSet = catalog.CreateAnalyzers(serviceProvider);
if (analyzerSet.Count == 0)
{
throw new InvalidOperationException("Language analyzer plug-ins reported no analyzers.");
}
var analyzerIds = analyzerSet.Select(analyzer => analyzer.Id).ToArray();
Console.WriteLine($"→ Loaded analyzers: {string.Join(", ", analyzerIds)}");
if (!analyzerIds.Contains("python", StringComparer.OrdinalIgnoreCase))
{
throw new InvalidOperationException("Python analyzer was not created by the plug-in.");
}
var fixtureRoot = Path.GetFullPath(Path.Combine(options.RepoRoot, options.FixtureRelativePath));
if (!Directory.Exists(fixtureRoot))
{
throw new DirectoryNotFoundException($"Fixture directory '{fixtureRoot}' does not exist.");
}
foreach (var scenario in PythonScenarios)
{
await RunScenarioAsync(scenario, fixtureRoot, catalog, serviceProvider).ConfigureAwait(false);
}
}
private static ServiceProvider BuildServiceProvider()
{
var services = new ServiceCollection();
services.AddLogging();
return services.BuildServiceProvider();
}
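// Runs each scenario twice with fresh analyzer instances: the cold run is compared against
// the repository golden snapshot (warning only), while the warm run must match the cold
// output byte-for-byte to prove the analyzer is deterministic.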
private static async Task RunScenarioAsync(SmokeScenario scenario, string fixtureRoot, ILanguageAnalyzerPluginCatalog catalog, IServiceProvider services)
{
var scenarioRoot = Path.Combine(fixtureRoot, scenario.Name);
if (!Directory.Exists(scenarioRoot))
{
throw new DirectoryNotFoundException($"Scenario '{scenario.Name}' directory missing at '{scenarioRoot}'.");
}
var goldenPath = Path.Combine(scenarioRoot, "expected.json");
string? goldenNormalized = null;
if (File.Exists(goldenPath))
{
goldenNormalized = NormalizeJson(await File.ReadAllTextAsync(goldenPath).ConfigureAwait(false));
}
var usageHints = new LanguageUsageHints(scenario.ResolveUsageHints(scenarioRoot));
var context = new LanguageAnalyzerContext(scenarioRoot, TimeProvider.System, usageHints, services);
var coldEngine = new LanguageAnalyzerEngine(catalog.CreateAnalyzers(services));
var coldStopwatch = Stopwatch.StartNew();
var coldResult = await coldEngine.AnalyzeAsync(context, CancellationToken.None).ConfigureAwait(false);
coldStopwatch.Stop();
if (coldResult.Components.Count == 0)
{
throw new InvalidOperationException($"Scenario '{scenario.Name}' produced no components during cold run.");
}
var coldJson = NormalizeJson(coldResult.ToJson(indent: true));
if (goldenNormalized is string expected && !string.Equals(coldJson, expected, StringComparison.Ordinal))
{
Console.WriteLine($"⚠️ Scenario '{scenario.Name}' output deviates from repository golden snapshot.");
}
var warmEngine = new LanguageAnalyzerEngine(catalog.CreateAnalyzers(services));
var warmStopwatch = Stopwatch.StartNew();
var warmResult = await warmEngine.AnalyzeAsync(context, CancellationToken.None).ConfigureAwait(false);
warmStopwatch.Stop();
var warmJson = NormalizeJson(warmResult.ToJson(indent: true));
if (!string.Equals(coldJson, warmJson, StringComparison.Ordinal))
{
throw new InvalidOperationException($"Scenario '{scenario.Name}' produced different outputs between cold and warm runs.");
}
EnsureDurationWithinBudget(scenario.Name, coldStopwatch.Elapsed, warmStopwatch.Elapsed);
Console.WriteLine($"✓ Scenario '{scenario.Name}' — components {coldResult.Components.Count}, cold {coldStopwatch.Elapsed.TotalMilliseconds:F1} ms, warm {warmStopwatch.Elapsed.TotalMilliseconds:F1} ms");
}
private static void EnsureDurationWithinBudget(string scenarioName, TimeSpan coldDuration, TimeSpan warmDuration)
{
var coldBudget = TimeSpan.FromSeconds(30);
var warmBudget = TimeSpan.FromSeconds(5);
if (coldDuration > coldBudget)
{
throw new InvalidOperationException($"Scenario '{scenarioName}' cold run exceeded budget ({coldDuration.TotalSeconds:F2}s > {coldBudget.TotalSeconds:F2}s).");
}
if (warmDuration > warmBudget)
{
throw new InvalidOperationException($"Scenario '{scenarioName}' warm run exceeded budget ({warmDuration.TotalSeconds:F2}s > {warmBudget.TotalSeconds:F2}s).");
}
}
private static string NormalizeJson(string json)
=> json.Replace("\r\n", "\n", StringComparison.Ordinal).TrimEnd();
private static void ValidateOptions(SmokeOptions options)
{
if (!Directory.Exists(options.RepoRoot))
{
throw new DirectoryNotFoundException($"Repository root '{options.RepoRoot}' does not exist.");
}
}
private static void ValidateManifest(PluginManifest manifest, string expectedDirectory)
{
if (!string.Equals(manifest.SchemaVersion, "1.0", StringComparison.Ordinal))
{
throw new InvalidOperationException($"Unexpected manifest schema version '{manifest.SchemaVersion}'.");
}
if (!manifest.RequiresRestart)
{
throw new InvalidOperationException("Language analyzer plug-in must be marked as restart-only.");
}
if (!string.Equals(manifest.EntryPoint.Type, "dotnet", StringComparison.OrdinalIgnoreCase))
{
throw new InvalidOperationException($"Unsupported entry point type '{manifest.EntryPoint.Type}'.");
}
if (!manifest.Capabilities.Contains("python", StringComparer.OrdinalIgnoreCase))
{
throw new InvalidOperationException("Manifest capabilities do not include 'python'.");
}
if (!string.Equals(manifest.EntryPoint.TypeName, "StellaOps.Scanner.Analyzers.Lang.Python.PythonAnalyzerPlugin", StringComparison.Ordinal))
{
throw new InvalidOperationException($"Unexpected entry point type name '{manifest.EntryPoint.TypeName}'.");
}
if (!string.Equals(manifest.Id, "stellaops.analyzer.lang.python", StringComparison.OrdinalIgnoreCase))
{
throw new InvalidOperationException($"Manifest id '{manifest.Id}' does not match expected plug-in id for directory '{expectedDirectory}'.");
}
}
private static string ComputeSha256(string path)
{
using var hash = SHA256.Create();
using var stream = File.OpenRead(path);
var digest = hash.ComputeHash(stream);
var builder = new StringBuilder(digest.Length * 2);
foreach (var b in digest)
{
builder.Append(b.ToString("x2"));
}
return builder.ToString();
}
}

View File

@@ -1,12 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="StackExchange.Redis" Version="2.8.24" />
</ItemGroup>
</Project>

View File

@@ -1,198 +1,198 @@
using System.Globalization;
using System.Net.Http.Headers;
using System.Linq;
using System.Text.Json;
using StackExchange.Redis;
static string RequireEnv(string name)
{
var value = Environment.GetEnvironmentVariable(name);
if (string.IsNullOrWhiteSpace(value))
{
throw new InvalidOperationException($"Environment variable '{name}' is required for Notify smoke validation.");
}
return value;
}
static string? GetField(StreamEntry entry, string fieldName)
{
foreach (var pair in entry.Values)
{
if (string.Equals(pair.Name, fieldName, StringComparison.OrdinalIgnoreCase))
{
return pair.Value.ToString();
}
}
return null;
}
static void Ensure(bool condition, string message)
{
if (!condition)
{
throw new InvalidOperationException(message);
}
}
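// Validation flow: (1) confirm the Redis event stream carries every expected kind inside
// the lookback window, then (2) confirm Notify recorded a non-failed delivery for each of
// those kinds via its deliveries API.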
var redisDsn = RequireEnv("NOTIFY_SMOKE_REDIS_DSN");
var redisStream = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_STREAM");
if (string.IsNullOrWhiteSpace(redisStream))
{
redisStream = "stella.events";
}
var expectedKindsEnv = RequireEnv("NOTIFY_SMOKE_EXPECT_KINDS");
var expectedKinds = expectedKindsEnv
.Split(',', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries)
.Select(kind => kind.ToLowerInvariant())
.Distinct()
.ToArray();
Ensure(expectedKinds.Length > 0, "Expected at least one event kind in NOTIFY_SMOKE_EXPECT_KINDS.");
var lookbackMinutesEnv = RequireEnv("NOTIFY_SMOKE_LOOKBACK_MINUTES");
if (!double.TryParse(lookbackMinutesEnv, NumberStyles.Any, CultureInfo.InvariantCulture, out var lookbackMinutes))
{
throw new InvalidOperationException("NOTIFY_SMOKE_LOOKBACK_MINUTES must be numeric.");
}
Ensure(lookbackMinutes > 0, "NOTIFY_SMOKE_LOOKBACK_MINUTES must be greater than zero.");
var now = DateTimeOffset.UtcNow;
var sinceThreshold = now - TimeSpan.FromMinutes(Math.Max(1, lookbackMinutes));
Console.WriteLine($" Checking Redis stream '{redisStream}' for kinds [{string.Join(", ", expectedKinds)}] within the last {lookbackMinutes:F1} minutes.");
var redisConfig = ConfigurationOptions.Parse(redisDsn);
redisConfig.AbortOnConnectFail = false;
await using var redisConnection = await ConnectionMultiplexer.ConnectAsync(redisConfig);
var database = redisConnection.GetDatabase();
var streamEntries = await database.StreamRangeAsync(redisStream, "-", "+", count: 200);
if (streamEntries.Length > 1)
{
Array.Reverse(streamEntries);
}
Ensure(streamEntries.Length > 0, $"Redis stream '{redisStream}' is empty.");
var recentEntries = new List<StreamEntry>();
foreach (var entry in streamEntries)
{
var timestampText = GetField(entry, "ts");
if (timestampText is null)
{
continue;
}
if (!DateTimeOffset.TryParse(timestampText, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal, out var entryTimestamp))
{
continue;
}
if (entryTimestamp >= sinceThreshold)
{
recentEntries.Add(entry);
}
}
Ensure(recentEntries.Count > 0, $"No Redis events newer than {sinceThreshold:u} located in stream '{redisStream}'.");
var missingKinds = new List<string>();
foreach (var kind in expectedKinds)
{
var match = recentEntries.FirstOrDefault(entry =>
{
var entryKind = GetField(entry, "kind")?.ToLowerInvariant();
return entryKind == kind;
});
if (match.Equals(default(StreamEntry)))
{
missingKinds.Add(kind);
}
}
Ensure(missingKinds.Count == 0, $"Missing expected Redis events for kinds: {string.Join(", ", missingKinds)}");
Console.WriteLine("✅ Redis event stream contains the expected scanner events.");
var notifyBaseUrl = RequireEnv("NOTIFY_SMOKE_NOTIFY_BASEURL").TrimEnd('/');
var notifyToken = RequireEnv("NOTIFY_SMOKE_NOTIFY_TOKEN");
var notifyTenant = RequireEnv("NOTIFY_SMOKE_NOTIFY_TENANT");
var notifyTenantHeader = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_NOTIFY_TENANT_HEADER");
if (string.IsNullOrWhiteSpace(notifyTenantHeader))
{
notifyTenantHeader = "X-StellaOps-Tenant";
}
var notifyTimeoutSeconds = 30;
var notifyTimeoutEnv = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_NOTIFY_TIMEOUT_SECONDS");
if (!string.IsNullOrWhiteSpace(notifyTimeoutEnv) && int.TryParse(notifyTimeoutEnv, NumberStyles.Integer, CultureInfo.InvariantCulture, out var parsedTimeout))
{
notifyTimeoutSeconds = Math.Max(5, parsedTimeout);
}
using var httpClient = new HttpClient
{
Timeout = TimeSpan.FromSeconds(notifyTimeoutSeconds),
};
httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", notifyToken);
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
httpClient.DefaultRequestHeaders.Add(notifyTenantHeader, notifyTenant);
var sinceQuery = Uri.EscapeDataString(sinceThreshold.ToString("O", CultureInfo.InvariantCulture));
var deliveriesUrl = $"{notifyBaseUrl}/api/v1/deliveries?since={sinceQuery}&limit=200";
Console.WriteLine($" Querying Notify deliveries via {deliveriesUrl}.");
using var response = await httpClient.GetAsync(deliveriesUrl);
if (!response.IsSuccessStatusCode)
{
var body = await response.Content.ReadAsStringAsync();
throw new InvalidOperationException($"Notify deliveries request failed with {(int)response.StatusCode} {response.ReasonPhrase}: {body}");
}
var json = await response.Content.ReadAsStringAsync();
if (string.IsNullOrWhiteSpace(json))
{
throw new InvalidOperationException("Notify deliveries response body was empty.");
}
using var document = JsonDocument.Parse(json);
var root = document.RootElement;
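// Notify may return either a bare array or an { "items": [...] } envelope; handle both.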
IEnumerable<JsonElement> EnumerateDeliveries(JsonElement element)
{
return element.ValueKind switch
{
JsonValueKind.Array => element.EnumerateArray(),
JsonValueKind.Object when element.TryGetProperty("items", out var items) && items.ValueKind == JsonValueKind.Array => items.EnumerateArray(),
_ => throw new InvalidOperationException("Notify deliveries response was not an array or did not contain an 'items' collection.")
};
}
var deliveries = EnumerateDeliveries(root).ToArray();
Ensure(deliveries.Length > 0, "Notify deliveries response did not return any records.");
var missingDeliveryKinds = new List<string>();
foreach (var kind in expectedKinds)
{
var found = deliveries.Any(delivery =>
delivery.TryGetProperty("kind", out var kindProperty) &&
kindProperty.GetString()?.Equals(kind, StringComparison.OrdinalIgnoreCase) == true &&
delivery.TryGetProperty("status", out var statusProperty) &&
!string.Equals(statusProperty.GetString(), "failed", StringComparison.OrdinalIgnoreCase));
if (!found)
{
missingDeliveryKinds.Add(kind);
}
}
Ensure(missingDeliveryKinds.Count == 0, $"Notify deliveries missing successful records for kinds: {string.Join(", ", missingDeliveryKinds)}");
Console.WriteLine("✅ Notify deliveries include the expected scanner events.");
Console.WriteLine("🎉 Notify smoke validation completed successfully.");
using System.Globalization;
using System.Net.Http.Headers;
using System.Linq;
using System.Text.Json;
using StackExchange.Redis;
static string RequireEnv(string name)
{
var value = Environment.GetEnvironmentVariable(name);
if (string.IsNullOrWhiteSpace(value))
{
throw new InvalidOperationException($"Environment variable '{name}' is required for Notify smoke validation.");
}
return value;
}
static string? GetField(StreamEntry entry, string fieldName)
{
foreach (var pair in entry.Values)
{
if (string.Equals(pair.Name, fieldName, StringComparison.OrdinalIgnoreCase))
{
return pair.Value.ToString();
}
}
return null;
}
static void Ensure(bool condition, string message)
{
if (!condition)
{
throw new InvalidOperationException(message);
}
}
var redisDsn = RequireEnv("NOTIFY_SMOKE_REDIS_DSN");
var redisStream = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_STREAM");
if (string.IsNullOrWhiteSpace(redisStream))
{
redisStream = "stella.events";
}
var expectedKindsEnv = RequireEnv("NOTIFY_SMOKE_EXPECT_KINDS");
var expectedKinds = expectedKindsEnv
.Split(',', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries)
.Select(kind => kind.ToLowerInvariant())
.Distinct()
.ToArray();
Ensure(expectedKinds.Length > 0, "Expected at least one event kind in NOTIFY_SMOKE_EXPECT_KINDS.");
var lookbackMinutesEnv = RequireEnv("NOTIFY_SMOKE_LOOKBACK_MINUTES");
if (!double.TryParse(lookbackMinutesEnv, NumberStyles.Any, CultureInfo.InvariantCulture, out var lookbackMinutes))
{
throw new InvalidOperationException("NOTIFY_SMOKE_LOOKBACK_MINUTES must be numeric.");
}
Ensure(lookbackMinutes > 0, "NOTIFY_SMOKE_LOOKBACK_MINUTES must be greater than zero.");
var now = DateTimeOffset.UtcNow;
var sinceThreshold = now - TimeSpan.FromMinutes(Math.Max(1, lookbackMinutes));
Console.WriteLine($" Checking Redis stream '{redisStream}' for kinds [{string.Join(", ", expectedKinds)}] within the last {lookbackMinutes:F1} minutes.");
var redisConfig = ConfigurationOptions.Parse(redisDsn);
redisConfig.AbortOnConnectFail = false;
await using var redisConnection = await ConnectionMultiplexer.ConnectAsync(redisConfig);
var database = redisConnection.GetDatabase();
var streamEntries = await database.StreamRangeAsync(redisStream, "-", "+", count: 200);
if (streamEntries.Length > 1)
{
Array.Reverse(streamEntries);
}
Ensure(streamEntries.Length > 0, $"Redis stream '{redisStream}' is empty.");
var recentEntries = new List<StreamEntry>();
foreach (var entry in streamEntries)
{
var timestampText = GetField(entry, "ts");
if (timestampText is null)
{
continue;
}
if (!DateTimeOffset.TryParse(timestampText, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal, out var entryTimestamp))
{
continue;
}
if (entryTimestamp >= sinceThreshold)
{
recentEntries.Add(entry);
}
}
Ensure(recentEntries.Count > 0, $"No Redis events newer than {sinceThreshold:u} located in stream '{redisStream}'.");
var missingKinds = new List<string>();
foreach (var kind in expectedKinds)
{
var match = recentEntries.FirstOrDefault(entry =>
{
var entryKind = GetField(entry, "kind")?.ToLowerInvariant();
return entryKind == kind;
});
if (match.Equals(default(StreamEntry)))
{
missingKinds.Add(kind);
}
}
Ensure(missingKinds.Count == 0, $"Missing expected Redis events for kinds: {string.Join(", ", missingKinds)}");
Console.WriteLine("✅ Redis event stream contains the expected scanner events.");
var notifyBaseUrl = RequireEnv("NOTIFY_SMOKE_NOTIFY_BASEURL").TrimEnd('/');
var notifyToken = RequireEnv("NOTIFY_SMOKE_NOTIFY_TOKEN");
var notifyTenant = RequireEnv("NOTIFY_SMOKE_NOTIFY_TENANT");
var notifyTenantHeader = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_NOTIFY_TENANT_HEADER");
if (string.IsNullOrWhiteSpace(notifyTenantHeader))
{
notifyTenantHeader = "X-StellaOps-Tenant";
}
var notifyTimeoutSeconds = 30;
var notifyTimeoutEnv = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_NOTIFY_TIMEOUT_SECONDS");
if (!string.IsNullOrWhiteSpace(notifyTimeoutEnv) && int.TryParse(notifyTimeoutEnv, NumberStyles.Integer, CultureInfo.InvariantCulture, out var parsedTimeout))
{
notifyTimeoutSeconds = Math.Max(5, parsedTimeout);
}
using var httpClient = new HttpClient
{
Timeout = TimeSpan.FromSeconds(notifyTimeoutSeconds),
};
httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", notifyToken);
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
httpClient.DefaultRequestHeaders.Add(notifyTenantHeader, notifyTenant);
var sinceQuery = Uri.EscapeDataString(sinceThreshold.ToString("O", CultureInfo.InvariantCulture));
var deliveriesUrl = $"{notifyBaseUrl}/api/v1/deliveries?since={sinceQuery}&limit=200";
Console.WriteLine($" Querying Notify deliveries via {deliveriesUrl}.");
using var response = await httpClient.GetAsync(deliveriesUrl);
if (!response.IsSuccessStatusCode)
{
var body = await response.Content.ReadAsStringAsync();
throw new InvalidOperationException($"Notify deliveries request failed with {(int)response.StatusCode} {response.ReasonPhrase}: {body}");
}
var json = await response.Content.ReadAsStringAsync();
if (string.IsNullOrWhiteSpace(json))
{
throw new InvalidOperationException("Notify deliveries response body was empty.");
}
using var document = JsonDocument.Parse(json);
var root = document.RootElement;
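// The deliveries payload may be a bare JSON array or a paged envelope with an 'items' array; handle both.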
IEnumerable<JsonElement> EnumerateDeliveries(JsonElement element)
{
return element.ValueKind switch
{
JsonValueKind.Array => element.EnumerateArray(),
JsonValueKind.Object when element.TryGetProperty("items", out var items) && items.ValueKind == JsonValueKind.Array => items.EnumerateArray(),
_ => throw new InvalidOperationException("Notify deliveries response was not an array or did not contain an 'items' collection.")
};
}
var deliveries = EnumerateDeliveries(root).ToArray();
Ensure(deliveries.Length > 0, "Notify deliveries response did not return any records.");
var missingDeliveryKinds = new List<string>();
foreach (var kind in expectedKinds)
{
var found = deliveries.Any(delivery =>
delivery.TryGetProperty("kind", out var kindProperty) &&
kindProperty.GetString()?.Equals(kind, StringComparison.OrdinalIgnoreCase) == true &&
delivery.TryGetProperty("status", out var statusProperty) &&
!string.Equals(statusProperty.GetString(), "failed", StringComparison.OrdinalIgnoreCase));
if (!found)
{
missingDeliveryKinds.Add(kind);
}
}
Ensure(missingDeliveryKinds.Count == 0, $"Notify deliveries missing successful records for kinds: {string.Join(", ", missingDeliveryKinds)}");
Console.WriteLine("✅ Notify deliveries include the expected scanner events.");
Console.WriteLine("🎉 Notify smoke validation completed successfully.");

@@ -1,14 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\..\src\StellaOps.Policy\StellaOps.Policy.csproj" />
</ItemGroup>
</Project>

@@ -1,56 +1,56 @@
using StellaOps.Policy;
if (args.Length == 0)
{
Console.Error.WriteLine("Usage: policy-dsl-validator [--strict] [--json] <path-or-glob> [<path-or-glob> ...]");
Console.Error.WriteLine("Example: policy-dsl-validator --strict docs/examples/policies");
return 64; // EX_USAGE
}
var inputs = new List<string>();
var strict = false;
var outputJson = false;
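// Minimal hand-rolled option parsing: known switches toggle flags, everything else is treated as an input path or glob.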
foreach (var arg in args)
{
switch (arg)
{
case "--strict":
case "-s":
strict = true;
break;
case "--json":
case "-j":
outputJson = true;
break;
case "--help":
case "-h":
case "-?":
Console.WriteLine("Usage: policy-dsl-validator [--strict] [--json] <path-or-glob> [<path-or-glob> ...]");
Console.WriteLine("Example: policy-dsl-validator --strict docs/examples/policies");
return 0;
default:
inputs.Add(arg);
break;
}
}
if (inputs.Count == 0)
{
Console.Error.WriteLine("No input files or directories provided.");
return 64; // EX_USAGE
}
var options = new PolicyValidationCliOptions
{
Inputs = inputs,
Strict = strict,
OutputJson = outputJson,
};
var cli = new PolicyValidationCli();
var exitCode = await cli.RunAsync(options, CancellationToken.None);
return exitCode;

@@ -1,21 +1,21 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="NJsonSchema" Version="11.5.1" />
<PackageReference Include="NJsonSchema.SystemTextJson" Version="11.5.1" />
<PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\..\StellaOps.Scheduler.Models\StellaOps.Scheduler.Models.csproj" />
</ItemGroup>
</Project>

@@ -1,48 +1,48 @@
using System.Collections.Immutable;
using System.Text.Json;
using System.Text.Json.Serialization;
using NJsonSchema;
using NJsonSchema.Generation;
using NJsonSchema.Generation.SystemTextJson;
using Newtonsoft.Json;
using StellaOps.Scheduler.Models;
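// With no arguments the exporter writes to docs/schemas relative to the build output; one argument overrides the target directory.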
var output = args.Length switch
{
0 => Path.GetFullPath(Path.Combine(AppContext.BaseDirectory, "..", "..", "..", "docs", "schemas")),
1 => Path.GetFullPath(args[0]),
_ => throw new ArgumentException("Usage: dotnet run --project src/Tools/PolicySchemaExporter -- [outputDirectory]")
};
Directory.CreateDirectory(output);
var generatorSettings = new SystemTextJsonSchemaGeneratorSettings
{
SchemaType = SchemaType.JsonSchema,
DefaultReferenceTypeNullHandling = ReferenceTypeNullHandling.NotNull,
SerializerOptions = new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
},
};
var generator = new JsonSchemaGenerator(generatorSettings);
var exports = ImmutableArray.Create(
(FileName: "policy-run-request.schema.json", Type: typeof(PolicyRunRequest)),
(FileName: "policy-run-status.schema.json", Type: typeof(PolicyRunStatus)),
(FileName: "policy-diff-summary.schema.json", Type: typeof(PolicyDiffSummary)),
(FileName: "policy-explain-trace.schema.json", Type: typeof(PolicyExplainTrace))
);
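// Each schema is closed (AllowAdditionalProperties = false) so unexpected fields fail validation downstream.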
foreach (var export in exports)
{
var schema = generator.Generate(export.Type);
schema.Title = export.Type.Name;
schema.AllowAdditionalProperties = false;
var outputPath = Path.Combine(output, export.FileName);
await File.WriteAllTextAsync(outputPath, schema.ToJson(Formatting.Indented) + Environment.NewLine);
Console.WriteLine($"Wrote {outputPath}");
}

@@ -1,14 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\..\src\StellaOps.Policy\StellaOps.Policy.csproj" />
</ItemGroup>
</Project>

@@ -1,291 +1,291 @@
using System.Collections.Immutable;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Policy;
var scenarioRoot = "samples/policy/simulations";
string? outputDir = null;
for (var i = 0; i < args.Length; i++)
{
var arg = args[i];
switch (arg)
{
case "--scenario-root":
case "-r":
if (i + 1 >= args.Length)
{
Console.Error.WriteLine("Missing value for --scenario-root.");
return 64;
}
scenarioRoot = args[++i];
break;
case "--output":
case "-o":
if (i + 1 >= args.Length)
{
Console.Error.WriteLine("Missing value for --output.");
return 64;
}
outputDir = args[++i];
break;
case "--help":
case "-h":
case "-?":
PrintUsage();
return 0;
default:
Console.Error.WriteLine($"Unknown argument '{arg}'.");
PrintUsage();
return 64;
}
}
if (!Directory.Exists(scenarioRoot))
{
Console.Error.WriteLine($"Scenario root '{scenarioRoot}' does not exist.");
return 66;
}
var scenarioFiles = Directory.GetFiles(scenarioRoot, "scenario.json", SearchOption.AllDirectories);
if (scenarioFiles.Length == 0)
{
Console.Error.WriteLine($"No scenario.json files found under '{scenarioRoot}'.");
return 0;
}
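// The null repositories below satisfy PolicySnapshotStore, so previews run entirely in-memory with no persistence backend.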
var loggerFactory = NullLoggerFactory.Instance;
var snapshotStore = new PolicySnapshotStore(
new NullPolicySnapshotRepository(),
new NullPolicyAuditRepository(),
TimeProvider.System,
loggerFactory.CreateLogger<PolicySnapshotStore>());
var previewService = new PolicyPreviewService(snapshotStore, loggerFactory.CreateLogger<PolicyPreviewService>());
var serializerOptions = new JsonSerializerOptions(JsonSerializerDefaults.Web)
{
PropertyNameCaseInsensitive = true,
ReadCommentHandling = JsonCommentHandling.Skip,
};
var summary = new List<ScenarioResult>();
var success = true;
foreach (var scenarioFile in scenarioFiles.OrderBy(static f => f, StringComparer.OrdinalIgnoreCase))
{
var scenarioText = await File.ReadAllTextAsync(scenarioFile);
var scenario = JsonSerializer.Deserialize<PolicySimulationScenario>(scenarioText, serializerOptions);
if (scenario is null)
{
Console.Error.WriteLine($"Failed to deserialize scenario '{scenarioFile}'.");
success = false;
continue;
}
var repoRoot = Directory.GetCurrentDirectory();
var policyPath = Path.Combine(repoRoot, scenario.PolicyPath);
if (!File.Exists(policyPath))
{
Console.Error.WriteLine($"Policy file '{scenario.PolicyPath}' referenced by scenario '{scenario.Name}' does not exist.");
success = false;
continue;
}
var policyContent = await File.ReadAllTextAsync(policyPath);
var policyFormat = PolicySchema.DetectFormat(policyPath);
var findings = scenario.Findings.Select(ToPolicyFinding).ToImmutableArray();
var baseline = scenario.Baseline?.Select(ToPolicyVerdict).ToImmutableArray() ?? ImmutableArray<PolicyVerdict>.Empty;
var request = new PolicyPreviewRequest(
ImageDigest: $"sha256:simulation-{scenario.Name}",
Findings: findings,
BaselineVerdicts: baseline,
SnapshotOverride: null,
ProposedPolicy: new PolicySnapshotContent(
Content: policyContent,
Format: policyFormat,
Actor: "ci",
Source: "ci/simulation-smoke",
Description: $"CI simulation for scenario '{scenario.Name}'"));
var response = await previewService.PreviewAsync(request, CancellationToken.None);
var scenarioResult = EvaluateScenario(scenario, response);
summary.Add(scenarioResult);
if (!scenarioResult.Success)
{
success = false;
}
}
if (outputDir is not null)
{
Directory.CreateDirectory(outputDir);
var summaryPath = Path.Combine(outputDir, "policy-simulation-summary.json");
await File.WriteAllTextAsync(summaryPath, JsonSerializer.Serialize(summary, new JsonSerializerOptions { WriteIndented = true }));
}
return success ? 0 : 1;
static void PrintUsage()
{
Console.WriteLine("Usage: policy-simulation-smoke [--scenario-root <path>] [--output <dir>]");
Console.WriteLine("Example: policy-simulation-smoke --scenario-root samples/policy/simulations --output artifacts/policy-simulations");
}
static PolicyFinding ToPolicyFinding(ScenarioFinding finding)
{
var tags = finding.Tags is null ? ImmutableArray<string>.Empty : ImmutableArray.CreateRange(finding.Tags);
var severity = Enum.Parse<PolicySeverity>(finding.Severity, ignoreCase: true);
return new PolicyFinding(
finding.FindingId,
severity,
finding.Environment,
finding.Source,
finding.Vendor,
finding.License,
finding.Image,
finding.Repository,
finding.Package,
finding.Purl,
finding.Cve,
finding.Path,
finding.LayerDigest,
tags);
}
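// Baseline verdicts populate only the fields the preview diff compares; the remaining knobs stay at neutral defaults.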
static PolicyVerdict ToPolicyVerdict(ScenarioBaseline baseline)
{
var status = Enum.Parse<PolicyVerdictStatus>(baseline.Status, ignoreCase: true);
var inputs = baseline.Inputs?.ToImmutableDictionary(StringComparer.OrdinalIgnoreCase) ?? ImmutableDictionary<string, double>.Empty;
return new PolicyVerdict(
baseline.FindingId,
status,
RuleName: baseline.RuleName,
RuleAction: baseline.RuleAction,
Notes: baseline.Notes,
Score: baseline.Score,
ConfigVersion: baseline.ConfigVersion ?? PolicyScoringConfig.Default.Version,
Inputs: inputs,
QuietedBy: null,
Quiet: false,
UnknownConfidence: null,
ConfidenceBand: null,
UnknownAgeDays: null,
SourceTrust: null,
Reachability: null);
}
static ScenarioResult EvaluateScenario(PolicySimulationScenario scenario, PolicyPreviewResponse response)
{
var result = new ScenarioResult(scenario.Name);
if (!response.Success)
{
result.Failures.Add("Preview failed.");
return result with { Success = false, ChangedCount = response.ChangedCount };
}
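// Index projected diffs by finding id (case-insensitive) so expected statuses can be looked up directly.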
var diffs = response.Diffs.ToDictionary(diff => diff.Projected.FindingId, StringComparer.OrdinalIgnoreCase);
foreach (var expected in scenario.ExpectedDiffs)
{
if (!diffs.TryGetValue(expected.FindingId, out var diff))
{
result.Failures.Add($"Expected finding '{expected.FindingId}' missing from diff.");
continue;
}
var projectedStatus = diff.Projected.Status.ToString();
result.ActualStatuses[expected.FindingId] = projectedStatus;
if (!string.Equals(projectedStatus, expected.Status, StringComparison.OrdinalIgnoreCase))
{
result.Failures.Add($"Finding '{expected.FindingId}' expected status '{expected.Status}' but was '{projectedStatus}'.");
}
}
foreach (var diff in diffs.Values)
{
if (!result.ActualStatuses.ContainsKey(diff.Projected.FindingId))
{
result.ActualStatuses[diff.Projected.FindingId] = diff.Projected.Status.ToString();
}
}
var success = result.Failures.Count == 0;
return result with
{
Success = success,
ChangedCount = response.ChangedCount
};
}
internal sealed record PolicySimulationScenario
{
public string Name { get; init; } = "scenario";
public string PolicyPath { get; init; } = string.Empty;
public List<ScenarioFinding> Findings { get; init; } = new();
public List<ScenarioExpectedDiff> ExpectedDiffs { get; init; } = new();
public List<ScenarioBaseline>? Baseline { get; init; }
}
internal sealed record ScenarioFinding
{
public string FindingId { get; init; } = string.Empty;
public string Severity { get; init; } = "Low";
public string? Environment { get; init; }
public string? Source { get; init; }
public string? Vendor { get; init; }
public string? License { get; init; }
public string? Image { get; init; }
public string? Repository { get; init; }
public string? Package { get; init; }
public string? Purl { get; init; }
public string? Cve { get; init; }
public string? Path { get; init; }
public string? LayerDigest { get; init; }
public string[]? Tags { get; init; }
}
internal sealed record ScenarioExpectedDiff
{
public string FindingId { get; init; } = string.Empty;
public string Status { get; init; } = "Pass";
}
internal sealed record ScenarioBaseline
{
public string FindingId { get; init; } = string.Empty;
public string Status { get; init; } = "Pass";
public string? RuleName { get; init; }
public string? RuleAction { get; init; }
public string? Notes { get; init; }
public double Score { get; init; }
public string? ConfigVersion { get; init; }
public Dictionary<string, double>? Inputs { get; init; }
}
internal sealed record ScenarioResult(string ScenarioName)
{
public bool Success { get; init; } = true;
public int ChangedCount { get; init; }
public List<string> Failures { get; } = new();
public Dictionary<string, string> ActualStatuses { get; } = new(StringComparer.OrdinalIgnoreCase);
}
internal sealed class NullPolicySnapshotRepository : IPolicySnapshotRepository
{
public Task AddAsync(PolicySnapshot snapshot, CancellationToken cancellationToken = default) => Task.CompletedTask;
public Task<PolicySnapshot?> GetLatestAsync(CancellationToken cancellationToken = default) => Task.FromResult<PolicySnapshot?>(null);
public Task<IReadOnlyList<PolicySnapshot>> ListAsync(int limit, CancellationToken cancellationToken = default)
=> Task.FromResult<IReadOnlyList<PolicySnapshot>>(Array.Empty<PolicySnapshot>());
}
internal sealed class NullPolicyAuditRepository : IPolicyAuditRepository
{
public Task AddAsync(PolicyAuditEntry entry, CancellationToken cancellationToken = default) => Task.CompletedTask;
public Task<IReadOnlyList<PolicyAuditEntry>> ListAsync(int limit, CancellationToken cancellationToken = default)
=> Task.FromResult<IReadOnlyList<PolicyAuditEntry>>(Array.Empty<PolicyAuditEntry>());
}

@@ -1,286 +1,286 @@
using Amazon;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Model;
using System.Net.Http.Headers;
var options = MigrationOptions.Parse(args);
if (options is null)
{
MigrationOptions.PrintUsage();
return 1;
}
Console.WriteLine($"RustFS migrator starting (prefix: '{options.Prefix ?? "<all>"}')");
if (options.DryRun)
{
Console.WriteLine("Dry-run enabled. No objects will be written to RustFS.");
}
var s3Config = new AmazonS3Config
{
ForcePathStyle = true,
};
if (!string.IsNullOrWhiteSpace(options.S3ServiceUrl))
{
s3Config.ServiceURL = options.S3ServiceUrl;
s3Config.UseHttp = options.S3ServiceUrl.StartsWith("http://", StringComparison.OrdinalIgnoreCase);
}
if (!string.IsNullOrWhiteSpace(options.S3Region))
{
s3Config.RegionEndpoint = RegionEndpoint.GetBySystemName(options.S3Region);
}
using var s3Client = CreateS3Client(options, s3Config);
using var httpClient = CreateRustFsClient(options);
var listRequest = new ListObjectsV2Request
{
BucketName = options.S3Bucket,
Prefix = options.Prefix,
MaxKeys = 1000,
};
var migrated = 0;
var skipped = 0;
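// Page through the source bucket 1,000 keys at a time; zero-length keys ending in '/' are directory markers and are skipped.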
do
{
var response = await s3Client.ListObjectsV2Async(listRequest).ConfigureAwait(false);
foreach (var entry in response.S3Objects)
{
if (entry.Size == 0 && entry.Key.EndsWith('/'))
{
skipped++;
continue;
}
Console.WriteLine($"Migrating {entry.Key} ({entry.Size} bytes)...");
if (options.DryRun)
{
migrated++;
continue;
}
using var getResponse = await s3Client.GetObjectAsync(new GetObjectRequest
{
BucketName = options.S3Bucket,
Key = entry.Key,
}).ConfigureAwait(false);
await using var memory = new MemoryStream();
await getResponse.ResponseStream.CopyToAsync(memory).ConfigureAwait(false);
memory.Position = 0;
using var request = new HttpRequestMessage(HttpMethod.Put, BuildRustFsUri(options, entry.Key))
{
Content = new ByteArrayContent(memory.ToArray()),
};
request.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/octet-stream");
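// Optional RustFS headers: immutability flag, retention window in seconds, and API-key auth when configured.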
if (options.Immutable)
{
request.Headers.TryAddWithoutValidation("X-RustFS-Immutable", "true");
}
if (options.RetentionSeconds is { } retainSeconds)
{
request.Headers.TryAddWithoutValidation("X-RustFS-Retain-Seconds", retainSeconds.ToString());
}
if (!string.IsNullOrWhiteSpace(options.RustFsApiKeyHeader) && !string.IsNullOrWhiteSpace(options.RustFsApiKey))
{
request.Headers.TryAddWithoutValidation(options.RustFsApiKeyHeader!, options.RustFsApiKey!);
}
using var responseMessage = await httpClient.SendAsync(request).ConfigureAwait(false);
if (!responseMessage.IsSuccessStatusCode)
{
var error = await responseMessage.Content.ReadAsStringAsync().ConfigureAwait(false);
Console.Error.WriteLine($"Failed to upload {entry.Key}: {(int)responseMessage.StatusCode} {responseMessage.ReasonPhrase}\n{error}");
return 2;
}
migrated++;
}
listRequest.ContinuationToken = response.NextContinuationToken;
} while (!string.IsNullOrEmpty(listRequest.ContinuationToken));
Console.WriteLine($"Migration complete. Migrated {migrated} objects. Skipped {skipped} directory markers.");
return 0;
static AmazonS3Client CreateS3Client(MigrationOptions options, AmazonS3Config config)
{
if (!string.IsNullOrWhiteSpace(options.S3AccessKey) && !string.IsNullOrWhiteSpace(options.S3SecretKey))
{
var credentials = new BasicAWSCredentials(options.S3AccessKey, options.S3SecretKey);
return new AmazonS3Client(credentials, config);
}
return new AmazonS3Client(config);
}
static HttpClient CreateRustFsClient(MigrationOptions options)
{
var client = new HttpClient
{
BaseAddress = new Uri(options.RustFsEndpoint, UriKind.Absolute),
Timeout = TimeSpan.FromMinutes(5),
};
if (!string.IsNullOrWhiteSpace(options.RustFsApiKeyHeader) && !string.IsNullOrWhiteSpace(options.RustFsApiKey))
{
client.DefaultRequestHeaders.TryAddWithoutValidation(options.RustFsApiKeyHeader, options.RustFsApiKey);
}
return client;
}
static Uri BuildRustFsUri(MigrationOptions options, string key)
{
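// Escape each path segment individually so '/' separators survive while segment contents stay URL-safe.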
var normalized = string.Join('/', key
.Split('/', StringSplitOptions.RemoveEmptyEntries)
.Select(Uri.EscapeDataString));
var builder = new UriBuilder(options.RustFsEndpoint)
{
Path = $"/api/v1/buckets/{Uri.EscapeDataString(options.RustFsBucket)}/objects/{normalized}",
};
return builder.Uri;
}
internal sealed record MigrationOptions
{
public string S3Bucket { get; init; } = string.Empty;
public string? S3ServiceUrl { get; init; }
public string? S3Region { get; init; }
public string? S3AccessKey { get; init; }
public string? S3SecretKey { get; init; }
public string RustFsEndpoint { get; init; } = string.Empty;
public string RustFsBucket { get; init; } = string.Empty;
public string? RustFsApiKeyHeader { get; init; }
public string? RustFsApiKey { get; init; }
public string? Prefix { get; init; }
public bool Immutable { get; init; }
public int? RetentionSeconds { get; init; }
public bool DryRun { get; init; }
public static MigrationOptions? Parse(string[] args)
{
var builder = new Dictionary<string, string?>(StringComparer.OrdinalIgnoreCase);
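// Boolean switches (--immutable, --dry-run) take no value; every other --flag consumes the next token as its value.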
for (var i = 0; i < args.Length; i++)
{
var key = args[i];
if (key.StartsWith("--", StringComparison.OrdinalIgnoreCase))
{
var normalized = key[2..];
if (string.Equals(normalized, "immutable", StringComparison.OrdinalIgnoreCase) || string.Equals(normalized, "dry-run", StringComparison.OrdinalIgnoreCase))
{
builder[normalized] = "true";
continue;
}
if (i + 1 >= args.Length)
{
Console.Error.WriteLine($"Missing value for argument '{key}'.");
return null;
}
builder[normalized] = args[++i];
}
}
if (!builder.TryGetValue("s3-bucket", out var bucket) || string.IsNullOrWhiteSpace(bucket))
{
Console.Error.WriteLine("--s3-bucket is required.");
return null;
}
if (!builder.TryGetValue("rustfs-endpoint", out var rustFsEndpoint) || string.IsNullOrWhiteSpace(rustFsEndpoint))
{
Console.Error.WriteLine("--rustfs-endpoint is required.");
return null;
}
if (!builder.TryGetValue("rustfs-bucket", out var rustFsBucket) || string.IsNullOrWhiteSpace(rustFsBucket))
{
Console.Error.WriteLine("--rustfs-bucket is required.");
return null;
}
int? retentionSeconds = null;
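// --retain-days accepts fractional days and rounds up to whole seconds for the retention header.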
if (builder.TryGetValue("retain-days", out var retainStr) && !string.IsNullOrWhiteSpace(retainStr))
{
if (double.TryParse(retainStr, out var days) && days > 0)
{
retentionSeconds = (int)Math.Ceiling(days * 24 * 60 * 60);
}
else
{
Console.Error.WriteLine("--retain-days must be a positive number.");
return null;
}
}
return new MigrationOptions
{
S3Bucket = bucket,
S3ServiceUrl = builder.TryGetValue("s3-endpoint", out var s3Endpoint) ? s3Endpoint : null,
S3Region = builder.TryGetValue("s3-region", out var s3Region) ? s3Region : null,
S3AccessKey = builder.TryGetValue("s3-access-key", out var s3AccessKey) ? s3AccessKey : null,
S3SecretKey = builder.TryGetValue("s3-secret-key", out var s3SecretKey) ? s3SecretKey : null,
RustFsEndpoint = rustFsEndpoint!,
RustFsBucket = rustFsBucket!,
RustFsApiKeyHeader = builder.TryGetValue("rustfs-api-key-header", out var apiKeyHeader) ? apiKeyHeader : null,
RustFsApiKey = builder.TryGetValue("rustfs-api-key", out var apiKey) ? apiKey : null,
Prefix = builder.TryGetValue("prefix", out var prefix) ? prefix : null,
Immutable = builder.ContainsKey("immutable"),
RetentionSeconds = retentionSeconds,
DryRun = builder.ContainsKey("dry-run"),
};
}
public static void PrintUsage()
{
Console.WriteLine(@"Usage: dotnet run --project src/Tools/RustFsMigrator -- \
--s3-bucket <name> \
[--s3-endpoint http://minio:9000] \
[--s3-region us-east-1] \
[--s3-access-key key --s3-secret-key secret] \
--rustfs-endpoint http://rustfs:8080 \
--rustfs-bucket scanner-artifacts \
[--rustfs-api-key-header X-API-Key --rustfs-api-key token] \
[--prefix scanner/] \
[--immutable] \
[--retain-days 365] \
[--dry-run]");
}
}

@@ -1,11 +1,11 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="AWSSDK.S3" Version="3.7.305.6" />
</ItemGroup>
</Project>

Some files were not shown because too many files have changed in this diff.