Add unit tests and implementations for MongoDB index models and OpenAPI metadata

- Implemented `MongoIndexModelTests` to verify index models for various stores.
- Created `OpenApiMetadataFactory` with methods to generate OpenAPI metadata.
- Added tests for `OpenApiMetadataFactory` to ensure expected defaults and URL overrides.
- Introduced `ObserverSurfaceSecrets` and `WebhookSurfaceSecrets` for managing secrets.
- Developed `RuntimeSurfaceFsClient` and `WebhookSurfaceFsClient` for manifest retrieval.
- Added dependency injection tests for `SurfaceEnvironmentRegistration` in both Observer and Webhook contexts.
- Implemented tests for secret resolution in `ObserverSurfaceSecretsTests` and `WebhookSurfaceSecretsTests`.
- Created `EnsureLinkNotMergeCollectionsMigrationTests` to validate MongoDB migration logic.
- Added project files for MongoDB tests and NuGet package mirroring.
This commit is contained in:
master
2025-11-17 21:21:56 +02:00
parent d3128aec24
commit 9075bad2d9
146 changed files with 152183 additions and 82 deletions

src/Concelier/AGENTS.md Normal file
View File

@@ -0,0 +1,50 @@
# Concelier · AGENTS Charter (Sprint 0112-0113)
## Module Scope & Working Directory
- Working directory: `src/Concelier/**` (WebService, __Libraries, Storage.Mongo, analyzers, tests, seed-data). Do not edit other modules unless explicitly referenced by this sprint.
- Mission: Link-Not-Merge (LNM) ingestion of advisory observations, correlation into linksets, evidence/export APIs, and deterministic telemetry.
## Roles
- **Backend engineer (ASP.NET Core / Mongo):** connectors, ingestion guards, linkset builder, WebService APIs, storage migrations.
- **Observability/Platform engineer:** OTEL metrics/logs, health/readiness, distributed locks, scheduler safety.
- **QA automation:** Mongo2Go + WebApplicationFactory tests for handlers/jobs; determinism and guardrail regression harnesses.
- **Docs/Schema steward:** keep LNM schemas, API references, and inline provenance docs aligned with behavior.
## Required Reading (must be treated as read before setting DOING)
- `docs/README.md`
- `docs/07_HIGH_LEVEL_ARCHITECTURE.md`
- `docs/modules/platform/architecture-overview.md`
- `docs/modules/concelier/architecture.md`
- `docs/modules/concelier/link-not-merge-schema.md`
- `docs/provenance/inline-dsse.md` (for provenance anchors/DSSE notes)
- Any sprint-specific ADRs/notes linked from `docs/implplan/SPRINT_0112_0001_0001_concelier_i.md` or `SPRINT_0113_0001_0002_concelier_ii.md`.
## Working Agreements
- **Aggregation-Only Contract (AOC):** no derived semantics in ingestion; enforce via `AOCWriteGuard` and analyzers. Raw observations are append-only; linksets carry correlations/conflicts only.
- **Determinism:** use canonical JSON writer; sort collections (fieldType, observationPath, sourceId) for cache keys; UTC ISO-8601 timestamps; stable ordering in exports/events.
- **Offline-first:** avoid new external calls outside allowlisted connectors; feature flags must default safe for air-gapped deployments (`concelier:features:*`).
- **Tenant safety:** every API/job must enforce tenant headers/guards; no cross-tenant leaks.
- **Schema gates:** LNM schema changes require docs + tests; update `link-not-merge-schema.md` and samples together.
- **Cross-module edits:** none without sprint note; if needed, log in sprint Execution Log and Decisions & Risks.
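The determinism agreement can be sketched as a standalone helper. This is an illustrative assumption, not the repo's actual canonical JSON writer: the name `CacheKey.Compute` and the field tuple are hypothetical, and only the ordering discipline is the point.

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

static class CacheKey
{
    // Hypothetical sketch: sort by (fieldType, observationPath, sourceId)
    // with ordinal comparers, join into a canonical string, and hash it.
    // The real module uses its canonical JSON writer; this only demonstrates
    // that the key must be independent of input order.
    public static string Compute((string FieldType, string ObservationPath, string SourceId)[] entries)
    {
        var canonical = string.Join("\n", entries
            .OrderBy(e => e.FieldType, StringComparer.Ordinal)
            .ThenBy(e => e.ObservationPath, StringComparer.Ordinal)
            .ThenBy(e => e.SourceId, StringComparer.Ordinal)
            .Select(e => $"{e.FieldType}|{e.ObservationPath}|{e.SourceId}"));
        return Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(canonical)));
    }
}
```

Two calls with the same entries in any order must yield the same key; that is the invariant the determinism tests assert.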
## Coding & Observability Standards
- Target **.NET 10**; prefer latest C# preview features already enabled in repo.
- Mongo driver ≥ 3.x; canonical BSON/JSON mapping lives in Storage.Mongo.
- Metrics: use `Meter` names under `StellaOps.Concelier.*`; tag `tenant`, `source`, `result` as applicable. Counters/histograms must be documented.
- Logging: structured, no PII; include `tenant`, `source`, `job`, `correlationId` when available.
- Scheduler/locks: one lock per connector/export job; no duplicate runs; honor `CancellationToken`.
## Testing Rules
- Write/maintain tests alongside code:
- Web/API: `StellaOps.Concelier.WebService.Tests` with WebApplicationFactory + Mongo2Go fixtures.
- Core/Linkset/Guards: `StellaOps.Concelier.Core.Tests`.
- Storage: `StellaOps.Concelier.Storage.Mongo.Tests` (use in-memory or Mongo2Go; determinism on ordering/hashes).
- Observability/analyzers: tests in `__Analyzers` or respective test projects.
- Tests must assert determinism (stable ordering/hashes), tenant guards, AOC invariants, and no derived fields in ingestion.
- Prefer seeded fixtures under `seed-data/` for repeatability; avoid network in tests.
## Delivery Discipline
- Update sprint tracker status (`TODO → DOING → DONE/BLOCKED`) when you start/finish/block work; mirror decisions in Execution Log and Decisions & Risks.
- If a design decision is needed, mark the task `BLOCKED` in the sprint doc and record the decision ask—do not pause the codebase.
- When changing contracts (APIs, schemas, telemetry, exports), update corresponding docs and link them from the sprint Decisions & Risks section.


@@ -0,0 +1,50 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using MongoDB.Bson;
namespace StellaOps.Concelier.Core.Linksets;
public sealed record AdvisoryLinkset(
string TenantId,
string Source,
string AdvisoryId,
ImmutableArray<string> ObservationIds,
AdvisoryLinksetNormalized? Normalized,
AdvisoryLinksetProvenance? Provenance,
DateTimeOffset CreatedAt,
string? BuiltByJobId);
public sealed record AdvisoryLinksetNormalized(
IReadOnlyList<string>? Purls,
IReadOnlyList<string>? Versions,
IReadOnlyList<Dictionary<string, object?>>? Ranges,
IReadOnlyList<Dictionary<string, object?>>? Severities)
{
public List<BsonDocument>? RangesToBson()
=> Ranges is null ? null : Ranges.Select(BsonDocumentHelper.FromDictionary).ToList();
public List<BsonDocument>? SeveritiesToBson()
=> Severities is null ? null : Severities.Select(BsonDocumentHelper.FromDictionary).ToList();
}
public sealed record AdvisoryLinksetProvenance(
IReadOnlyList<string>? ObservationHashes,
string? ToolVersion,
string? PolicyHash);
internal static class BsonDocumentHelper
{
public static BsonDocument FromDictionary(Dictionary<string, object?> dictionary)
{
ArgumentNullException.ThrowIfNull(dictionary);
var doc = new BsonDocument();
foreach (var kvp in dictionary)
{
doc[kvp.Key] = kvp.Value is null ? BsonNull.Value : BsonValue.Create(kvp.Value);
}
return doc;
}
}


@@ -0,0 +1,82 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Core.Linksets;
internal sealed class AdvisoryLinksetBackfillService : IAdvisoryLinksetBackfillService
{
private readonly IAdvisoryObservationLookup _observations;
private readonly IAdvisoryLinksetSink _linksetSink;
private readonly TimeProvider _timeProvider;
public AdvisoryLinksetBackfillService(
IAdvisoryObservationLookup observations,
IAdvisoryLinksetSink linksetSink,
TimeProvider timeProvider)
{
_observations = observations ?? throw new ArgumentNullException(nameof(observations));
_linksetSink = linksetSink ?? throw new ArgumentNullException(nameof(linksetSink));
_timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
}
public async Task<int> BackfillTenantAsync(string tenant, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
cancellationToken.ThrowIfCancellationRequested();
var observations = await _observations.ListByTenantAsync(tenant, cancellationToken).ConfigureAwait(false);
if (observations.Count == 0)
{
return 0;
}
var groups = observations.GroupBy(
o => (o.Source.Vendor, o.Upstream.UpstreamId),
new VendorUpstreamComparer());
var count = 0;
var now = _timeProvider.GetUtcNow();
foreach (var group in groups)
{
cancellationToken.ThrowIfCancellationRequested();
var observationIds = group.Select(o => o.ObservationId).Distinct(StringComparer.Ordinal).ToImmutableArray();
var createdAt = group.Max(o => o.CreatedAt);
var normalized = AdvisoryLinksetNormalization.FromPurls(group.SelectMany(o => o.Linkset.Purls));
var linkset = new AdvisoryLinkset(
tenant,
group.Key.Vendor,
group.Key.UpstreamId,
observationIds,
normalized,
null,
createdAt,
null);
await _linksetSink.UpsertAsync(linkset, cancellationToken).ConfigureAwait(false);
count++;
}
return count;
}
}
internal sealed class VendorUpstreamComparer : IEqualityComparer<(string Vendor, string UpstreamId)>
{
public bool Equals((string Vendor, string UpstreamId) x, (string Vendor, string UpstreamId) y)
=> StringComparer.Ordinal.Equals(x.Vendor, y.Vendor)
&& StringComparer.Ordinal.Equals(x.UpstreamId, y.UpstreamId);
public int GetHashCode((string Vendor, string UpstreamId) obj)
{
var hash = new HashCode();
hash.Add(obj.Vendor, StringComparer.Ordinal);
hash.Add(obj.UpstreamId, StringComparer.Ordinal);
return hash.ToHashCode();
}
}
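A standalone sketch of the grouping step, with hypothetical sample data, shows how observations collapse into one linkset per (vendor, upstreamId) pair:

```csharp
using System;
using System.Linq;

var observations = new[]
{
    (Vendor: "ghsa", UpstreamId: "GHSA-aaaa-bbbb", ObservationId: "obs-2"),
    (Vendor: "ghsa", UpstreamId: "GHSA-aaaa-bbbb", ObservationId: "obs-1"),
    (Vendor: "nvd",  UpstreamId: "CVE-2025-0001",  ObservationId: "obs-3"),
};

// Value tuples compare structurally; the service additionally pins ordinal
// string semantics via VendorUpstreamComparer so grouping cannot drift with
// culture settings. Ids are sorted here only for a stable display order.
var linksets = observations
    .GroupBy(o => (o.Vendor, o.UpstreamId))
    .Select(g => (
        Source: g.Key.Vendor,
        AdvisoryId: g.Key.UpstreamId,
        ObservationIds: g.Select(o => o.ObservationId)
            .Distinct(StringComparer.Ordinal)
            .OrderBy(id => id, StringComparer.Ordinal)
            .ToArray()))
    .ToArray();
```

The duplicate-free id arrays are what the backfill service feeds into `AdvisoryLinkset.ObservationIds`.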


@@ -0,0 +1,5 @@
using System;
namespace StellaOps.Concelier.Core.Linksets;
public sealed record AdvisoryLinksetCursor(DateTimeOffset CreatedAt, string AdvisoryId);


@@ -0,0 +1,78 @@
using System;
using System.Collections.Generic;
using System.Linq;
using StellaOps.Concelier.RawModels;
using StellaOps.Concelier.Models;
namespace StellaOps.Concelier.Core.Linksets;
internal static class AdvisoryLinksetNormalization
{
public static AdvisoryLinksetNormalized? FromRawLinkset(RawLinkset linkset)
{
ArgumentNullException.ThrowIfNull(linkset);
return Build(linkset.PackageUrls);
}
public static AdvisoryLinksetNormalized? FromPurls(IEnumerable<string>? purls)
{
if (purls is null)
{
return null;
}
return Build(purls);
}
private static AdvisoryLinksetNormalized? Build(IEnumerable<string> purlValues)
{
var normalizedPurls = NormalizePurls(purlValues);
var versions = ExtractVersions(normalizedPurls);
if (normalizedPurls.Count == 0 && versions.Count == 0)
{
return null;
}
return new AdvisoryLinksetNormalized(normalizedPurls, versions, null, null);
}
private static List<string> NormalizePurls(IEnumerable<string> purls)
{
var distinct = new SortedSet<string>(StringComparer.Ordinal);
foreach (var purl in purls)
{
var normalized = Validation.TrimToNull(purl);
if (normalized is null)
{
continue;
}
distinct.Add(normalized);
}
return distinct.ToList();
}
private static List<string> ExtractVersions(IReadOnlyCollection<string> purls)
{
var versions = new SortedSet<string>(StringComparer.Ordinal);
foreach (var purl in purls)
{
var atIndex = purl.LastIndexOf('@');
if (atIndex < 0 || atIndex >= purl.Length - 1)
{
continue;
}
var version = purl[(atIndex + 1)..];
if (!string.IsNullOrWhiteSpace(version))
{
versions.Add(version);
}
}
return versions.ToList();
}
}
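The version-extraction rule can be exercised standalone. This sketch mirrors the private helper above; note that it only splits on the last `@`, so purl qualifiers after the version (e.g. `?type=jar`) would ride along:

```csharp
using System;
using System.Collections.Generic;

static List<string> ExtractVersions(IEnumerable<string> purls)
{
    // Same rule as the private helper: take the segment after the last '@'
    // (if non-empty), dedupe, and sort ordinally for determinism.
    var versions = new SortedSet<string>(StringComparer.Ordinal);
    foreach (var purl in purls)
    {
        var at = purl.LastIndexOf('@');
        if (at < 0 || at >= purl.Length - 1)
        {
            continue; // no version segment
        }
        var version = purl[(at + 1)..];
        if (!string.IsNullOrWhiteSpace(version))
        {
            versions.Add(version);
        }
    }
    return new List<string>(versions);
}
```

For example, `ExtractVersions(new[] { "pkg:npm/lodash@4.17.21", "pkg:npm/lodash" })` yields `["4.17.21"]`.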


@@ -0,0 +1,10 @@
using System.Collections.Generic;
namespace StellaOps.Concelier.Core.Linksets;
public sealed record AdvisoryLinksetQueryOptions(
string Tenant,
IEnumerable<string>? AdvisoryIds = null,
IEnumerable<string>? Sources = null,
int? Limit = null,
string? Cursor = null);


@@ -0,0 +1,111 @@
using System;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Core.Linksets;
public interface IAdvisoryLinksetQueryService
{
Task<AdvisoryLinksetQueryResult> QueryAsync(AdvisoryLinksetQueryOptions options, CancellationToken cancellationToken);
}
public sealed record AdvisoryLinksetQueryResult(ImmutableArray<AdvisoryLinkset> Linksets, string? NextCursor, bool HasMore);
public sealed record AdvisoryLinksetPage(ImmutableArray<AdvisoryLinkset> Linksets, string? NextCursor, bool HasMore);
public sealed class AdvisoryLinksetQueryService : IAdvisoryLinksetQueryService
{
private const int DefaultLimit = 100;
private const int MaxLimit = 500;
private readonly IAdvisoryLinksetLookup _store;
public AdvisoryLinksetQueryService(IAdvisoryLinksetLookup store)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
}
public async Task<AdvisoryLinksetQueryResult> QueryAsync(AdvisoryLinksetQueryOptions options, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(options);
cancellationToken.ThrowIfCancellationRequested();
var tenant = string.IsNullOrWhiteSpace(options.Tenant)
? throw new ArgumentException("Tenant must be provided.", nameof(options))
: options.Tenant.ToLowerInvariant();
var limit = NormalizeLimit(options.Limit);
var cursor = DecodeCursor(options.Cursor);
var linksets = await _store
.FindByTenantAsync(tenant, options.AdvisoryIds, options.Sources, cursor, limit + 1, cancellationToken)
.ConfigureAwait(false);
var ordered = linksets
.OrderByDescending(ls => ls.CreatedAt)
.ThenBy(ls => ls.AdvisoryId, StringComparer.Ordinal)
.ToImmutableArray();
var hasMore = ordered.Length > limit;
var page = hasMore ? ordered.Take(limit).ToImmutableArray() : ordered;
var nextCursor = hasMore ? EncodeCursor(page[^1]) : null;
return new AdvisoryLinksetQueryResult(page, nextCursor, hasMore);
}
private static int NormalizeLimit(int? requested)
{
if (!requested.HasValue || requested <= 0)
{
return DefaultLimit;
}
return requested.Value > MaxLimit ? MaxLimit : requested.Value;
}
private static AdvisoryLinksetCursor? DecodeCursor(string? cursor)
{
if (string.IsNullOrWhiteSpace(cursor))
{
return null;
}
try
{
var buffer = Convert.FromBase64String(cursor.Trim());
var payload = System.Text.Encoding.UTF8.GetString(buffer);
var separator = payload.IndexOf(':');
if (separator <= 0 || separator >= payload.Length - 1)
{
throw new FormatException("Cursor format invalid.");
}
var ticksText = payload[..separator];
if (!long.TryParse(ticksText, out var ticks))
{
throw new FormatException("Cursor timestamp invalid.");
}
var advisoryId = payload[(separator + 1)..];
if (string.IsNullOrWhiteSpace(advisoryId))
{
throw new FormatException("Cursor advisoryId missing.");
}
return new AdvisoryLinksetCursor(new DateTimeOffset(new DateTime(ticks, DateTimeKind.Utc)), advisoryId);
}
catch (FormatException)
{
throw;
}
catch (Exception ex)
{
throw new FormatException("Cursor is malformed.", ex);
}
}
private static string? EncodeCursor(AdvisoryLinkset linkset)
{
var payload = $"{linkset.CreatedAt.UtcTicks}:{linkset.AdvisoryId}";
return Convert.ToBase64String(System.Text.Encoding.UTF8.GetBytes(payload));
}
}
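The cursor format is `base64(utcTicks + ":" + advisoryId)`. A minimal round-trip, rewritten standalone from the private methods above:

```csharp
using System;
using System.Text;

static string EncodeCursor(DateTimeOffset createdAt, string advisoryId)
    => Convert.ToBase64String(Encoding.UTF8.GetBytes($"{createdAt.UtcTicks}:{advisoryId}"));

static (DateTimeOffset CreatedAt, string AdvisoryId) DecodeCursor(string cursor)
{
    var payload = Encoding.UTF8.GetString(Convert.FromBase64String(cursor));
    // IndexOf takes the FIRST ':' — the ticks prefix never contains one,
    // so advisory ids that themselves contain ':' survive the round trip.
    var sep = payload.IndexOf(':');
    if (sep <= 0 || sep >= payload.Length - 1)
    {
        throw new FormatException("Cursor format invalid.");
    }
    var ticks = long.Parse(payload[..sep]);
    return (new DateTimeOffset(new DateTime(ticks, DateTimeKind.Utc)), payload[(sep + 1)..]);
}
```

Splitting on the first colon is the design choice that keeps colon-bearing advisory ids safe; the service's own decoder does the same.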


@@ -0,0 +1,9 @@
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Core.Linksets;
public interface IAdvisoryLinksetBackfillService
{
Task<int> BackfillTenantAsync(string tenant, CancellationToken cancellationToken);
}


@@ -0,0 +1,9 @@
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Core.Linksets;
public interface IAdvisoryLinksetSink
{
Task UpsertAsync(AdvisoryLinkset linkset, CancellationToken cancellationToken);
}


@@ -0,0 +1,20 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Core.Linksets;
public interface IAdvisoryLinksetStore : IAdvisoryLinksetSink, IAdvisoryLinksetLookup
{
}
public interface IAdvisoryLinksetLookup
{
Task<IReadOnlyList<AdvisoryLinkset>> FindByTenantAsync(
string tenantId,
IEnumerable<string>? advisoryIds,
IEnumerable<string>? sources,
AdvisoryLinksetCursor? cursor,
int limit,
CancellationToken cancellationToken);
}


@@ -0,0 +1,45 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Core.Linksets;
public static class ObservationPipelineServiceCollectionExtensions
{
public static IServiceCollection AddConcelierObservationPipeline(this IServiceCollection services)
{
ArgumentNullException.ThrowIfNull(services);
services.TryAddSingleton<IAdvisoryObservationSink, NullObservationSink>();
services.TryAddSingleton<IAdvisoryLinksetSink, NullLinksetSink>();
services.TryAddSingleton<IAdvisoryLinksetLookup, NullLinksetLookup>();
services.TryAddSingleton<IAdvisoryLinksetBackfillService, AdvisoryLinksetBackfillService>();
return services;
}
private sealed class NullObservationSink : IAdvisoryObservationSink
{
public Task UpsertAsync(Models.Observations.AdvisoryObservation observation, CancellationToken cancellationToken)
=> Task.CompletedTask;
}
private sealed class NullLinksetSink : IAdvisoryLinksetSink
{
public Task UpsertAsync(AdvisoryLinkset linkset, CancellationToken cancellationToken)
=> Task.CompletedTask;
}
private sealed class NullLinksetLookup : IAdvisoryLinksetLookup
{
public Task<IReadOnlyList<AdvisoryLinkset>> FindByTenantAsync(
string tenantId,
IEnumerable<string>? advisoryIds,
IEnumerable<string>? sources,
AdvisoryLinksetCursor? cursor,
int limit,
CancellationToken cancellationToken)
=> Task.FromResult<IReadOnlyList<AdvisoryLinkset>>(Array.Empty<AdvisoryLinkset>());
}
}


@@ -0,0 +1,10 @@
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Core.Observations;
public interface IAdvisoryObservationSink
{
Task UpsertAsync(AdvisoryObservation observation, CancellationToken cancellationToken);
}


@@ -9,7 +9,8 @@ using Microsoft.Extensions.Logging;
using StellaOps.Aoc;
using StellaOps.Ingestion.Telemetry;
using StellaOps.Concelier.Core.Aoc;
using StellaOps.Concelier.Core.Linksets;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.RawModels;
using StellaOps.Concelier.Models;
@@ -19,28 +20,37 @@ internal sealed class AdvisoryRawService : IAdvisoryRawService
{
private static readonly ImmutableArray<string> EmptyArray = ImmutableArray<string>.Empty;
private readonly IAdvisoryRawRepository _repository;
private readonly IAdvisoryRawWriteGuard _writeGuard;
private readonly IAocGuard _aocGuard;
private readonly IAdvisoryLinksetMapper _linksetMapper;
private readonly IAdvisoryObservationFactory _observationFactory;
private readonly IAdvisoryObservationSink _observationSink;
private readonly IAdvisoryLinksetSink _linksetSink;
private readonly TimeProvider _timeProvider;
private readonly ILogger<AdvisoryRawService> _logger;
public AdvisoryRawService(
IAdvisoryRawRepository repository,
IAdvisoryRawWriteGuard writeGuard,
IAocGuard aocGuard,
IAdvisoryLinksetMapper linksetMapper,
IAdvisoryObservationFactory observationFactory,
IAdvisoryObservationSink observationSink,
IAdvisoryLinksetSink linksetSink,
TimeProvider timeProvider,
ILogger<AdvisoryRawService> logger)
{
_repository = repository ?? throw new ArgumentNullException(nameof(repository));
_writeGuard = writeGuard ?? throw new ArgumentNullException(nameof(writeGuard));
_aocGuard = aocGuard ?? throw new ArgumentNullException(nameof(aocGuard));
_linksetMapper = linksetMapper ?? throw new ArgumentNullException(nameof(linksetMapper));
_observationFactory = observationFactory ?? throw new ArgumentNullException(nameof(observationFactory));
_observationSink = observationSink ?? throw new ArgumentNullException(nameof(observationSink));
_linksetSink = linksetSink ?? throw new ArgumentNullException(nameof(linksetSink));
_timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<AdvisoryRawUpsertResult> IngestAsync(AdvisoryRawDocument document, CancellationToken cancellationToken)
{
@@ -102,6 +112,23 @@ internal sealed class AdvisoryRawService : IAdvisoryRawService
var result = await _repository.UpsertAsync(enriched, cancellationToken).ConfigureAwait(false);
IngestionTelemetry.RecordWriteAttempt(tenant, source, result.Inserted ? IngestionTelemetry.ResultOk : IngestionTelemetry.ResultNoop);
// Persist observation + linkset for Link-Not-Merge consumers (idempotent upserts).
var observation = _observationFactory.Create(enriched, _timeProvider.GetUtcNow());
await _observationSink.UpsertAsync(observation, cancellationToken).ConfigureAwait(false);
var normalizedLinkset = AdvisoryLinksetNormalization.FromRawLinkset(enriched.Linkset);
var linkset = new AdvisoryLinkset(
tenant,
source,
enriched.Upstream.UpstreamId,
ImmutableArray.Create(observation.ObservationId),
normalizedLinkset,
null,
_timeProvider.GetUtcNow(),
null);
await _linksetSink.UpsertAsync(linkset, cancellationToken).ConfigureAwait(false);
if (result.Inserted)
{
_logger.LogInformation(


@@ -0,0 +1,87 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
[BsonIgnoreExtraElements]
public sealed class AdvisoryLinksetDocument
{
[BsonId]
public ObjectId Id { get; set; }
= ObjectId.GenerateNewId();
[BsonElement("tenantId")]
public string TenantId { get; set; } = string.Empty;
[BsonElement("source")]
public string Source { get; set; } = string.Empty;
[BsonElement("advisoryId")]
public string AdvisoryId { get; set; } = string.Empty;
[BsonElement("observations")]
public List<string> Observations { get; set; } = new();
[BsonElement("normalized")]
[BsonIgnoreIfNull]
public AdvisoryLinksetNormalizedDocument? Normalized { get; set; }
= null;
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
[BsonElement("builtByJobId")]
[BsonIgnoreIfNull]
public string? BuiltByJobId { get; set; }
= null;
[BsonElement("provenance")]
[BsonIgnoreIfNull]
public AdvisoryLinksetProvenanceDocument? Provenance { get; set; }
= null;
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryLinksetNormalizedDocument
{
[BsonElement("purls")]
[BsonIgnoreIfNull]
public List<string>? Purls { get; set; }
= new();
[BsonElement("versions")]
[BsonIgnoreIfNull]
public List<string>? Versions { get; set; }
= new();
[BsonElement("ranges")]
[BsonIgnoreIfNull]
public List<BsonDocument>? Ranges { get; set; }
= new();
[BsonElement("severities")]
[BsonIgnoreIfNull]
public List<BsonDocument>? Severities { get; set; }
= new();
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryLinksetProvenanceDocument
{
[BsonElement("observationHashes")]
[BsonIgnoreIfNull]
public List<string>? ObservationHashes { get; set; }
= new();
[BsonElement("toolVersion")]
[BsonIgnoreIfNull]
public string? ToolVersion { get; set; }
= null;
[BsonElement("policyHash")]
[BsonIgnoreIfNull]
public string? PolicyHash { get; set; }
= null;
}


@@ -0,0 +1,22 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using CoreLinksets = StellaOps.Concelier.Core.Linksets;
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
internal sealed class AdvisoryLinksetSink : CoreLinksets.IAdvisoryLinksetSink
{
private readonly IAdvisoryLinksetStore _store;
public AdvisoryLinksetSink(IAdvisoryLinksetStore store)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
}
public Task UpsertAsync(CoreLinksets.AdvisoryLinkset linkset, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(linkset);
return _store.UpsertAsync(linkset, cancellationToken);
}
}


@@ -0,0 +1,170 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
using CoreLinksets = StellaOps.Concelier.Core.Linksets;
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
// Internal type kept in storage namespace to avoid name clash with core interface
internal sealed class MongoAdvisoryLinksetStore : CoreLinksets.IAdvisoryLinksetStore, CoreLinksets.IAdvisoryLinksetLookup
{
private readonly IMongoCollection<AdvisoryLinksetDocument> _collection;
public MongoAdvisoryLinksetStore(IMongoCollection<AdvisoryLinksetDocument> collection)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
}
public async Task UpsertAsync(CoreLinksets.AdvisoryLinkset linkset, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(linkset);
var document = MapToDocument(linkset);
var filter = Builders<AdvisoryLinksetDocument>.Filter.And(
Builders<AdvisoryLinksetDocument>.Filter.Eq(d => d.TenantId, linkset.TenantId),
Builders<AdvisoryLinksetDocument>.Filter.Eq(d => d.Source, linkset.Source),
Builders<AdvisoryLinksetDocument>.Filter.Eq(d => d.AdvisoryId, linkset.AdvisoryId));
var options = new ReplaceOptions { IsUpsert = true };
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
public async Task<IReadOnlyList<CoreLinksets.AdvisoryLinkset>> FindByTenantAsync(
string tenantId,
IEnumerable<string>? advisoryIds,
IEnumerable<string>? sources,
CoreLinksets.AdvisoryLinksetCursor? cursor,
int limit,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenantId);
if (limit <= 0)
{
throw new ArgumentOutOfRangeException(nameof(limit));
}
var builder = Builders<AdvisoryLinksetDocument>.Filter;
var filters = new List<FilterDefinition<AdvisoryLinksetDocument>>
{
builder.Eq(d => d.TenantId, tenantId.ToLowerInvariant())
};
if (advisoryIds is not null)
{
var ids = advisoryIds.Where(v => !string.IsNullOrWhiteSpace(v)).ToArray();
if (ids.Length > 0)
{
filters.Add(builder.In(d => d.AdvisoryId, ids));
}
}
if (sources is not null)
{
var srcs = sources.Where(v => !string.IsNullOrWhiteSpace(v)).ToArray();
if (srcs.Length > 0)
{
filters.Add(builder.In(d => d.Source, srcs));
}
}
var filter = builder.And(filters);
var sort = Builders<AdvisoryLinksetDocument>.Sort.Descending(d => d.CreatedAt).Ascending(d => d.AdvisoryId);
var findFilter = filter;
if (cursor is not null)
{
var cursorFilter = builder.Or(
builder.Lt(d => d.CreatedAt, cursor.CreatedAt.UtcDateTime),
builder.And(
builder.Eq(d => d.CreatedAt, cursor.CreatedAt.UtcDateTime),
builder.Gt(d => d.AdvisoryId, cursor.AdvisoryId)));
findFilter = builder.And(findFilter, cursorFilter);
}
var documents = await _collection.Find(findFilter)
.Sort(sort)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(FromDocument).ToArray();
}
private static AdvisoryLinksetDocument MapToDocument(CoreLinksets.AdvisoryLinkset linkset)
{
var doc = new AdvisoryLinksetDocument
{
TenantId = linkset.TenantId,
Source = linkset.Source,
AdvisoryId = linkset.AdvisoryId,
Observations = new List<string>(linkset.ObservationIds),
CreatedAt = linkset.CreatedAt.UtcDateTime,
BuiltByJobId = linkset.BuiltByJobId,
Provenance = linkset.Provenance is null ? null : new AdvisoryLinksetProvenanceDocument
{
ObservationHashes = linkset.Provenance.ObservationHashes is null
? null
: new List<string>(linkset.Provenance.ObservationHashes),
ToolVersion = linkset.Provenance.ToolVersion,
PolicyHash = linkset.Provenance.PolicyHash,
},
Normalized = linkset.Normalized is null ? null : new AdvisoryLinksetNormalizedDocument
{
Purls = linkset.Normalized.Purls is null ? null : new List<string>(linkset.Normalized.Purls),
Versions = linkset.Normalized.Versions is null ? null : new List<string>(linkset.Normalized.Versions),
Ranges = linkset.Normalized.RangesToBson(),
Severities = linkset.Normalized.SeveritiesToBson(),
}
};
return doc;
}
private static CoreLinksets.AdvisoryLinkset FromDocument(AdvisoryLinksetDocument doc)
{
return new CoreLinksets.AdvisoryLinkset(
doc.TenantId,
doc.Source,
doc.AdvisoryId,
doc.Observations.ToImmutableArray(),
doc.Normalized is null ? null : new CoreLinksets.AdvisoryLinksetNormalized(
doc.Normalized.Purls,
doc.Normalized.Versions,
doc.Normalized.Ranges?.Select(ToDictionary).ToList(),
doc.Normalized.Severities?.Select(ToDictionary).ToList()),
doc.Provenance is null ? null : new CoreLinksets.AdvisoryLinksetProvenance(
doc.Provenance.ObservationHashes,
doc.Provenance.ToolVersion,
doc.Provenance.PolicyHash),
DateTime.SpecifyKind(doc.CreatedAt, DateTimeKind.Utc),
doc.BuiltByJobId);
}
private static Dictionary<string, object?> ToDictionary(MongoDB.Bson.BsonDocument bson)
{
var dict = new Dictionary<string, object?>(StringComparer.Ordinal);
foreach (var element in bson.Elements)
{
dict[element.Name] = element.Value switch
{
MongoDB.Bson.BsonString s => s.AsString,
MongoDB.Bson.BsonInt32 i => i.AsInt32,
MongoDB.Bson.BsonInt64 l => l.AsInt64,
MongoDB.Bson.BsonDouble d => d.AsDouble,
MongoDB.Bson.BsonDecimal128 dec => dec.ToDecimal(),
MongoDB.Bson.BsonBoolean b => b.AsBoolean,
MongoDB.Bson.BsonDateTime dt => dt.ToUniversalTime(),
MongoDB.Bson.BsonNull => (object?)null,
MongoDB.Bson.BsonArray arr => arr.Select(v => v.ToString()).ToArray(),
_ => element.Value.ToString()
};
}
return dict;
}
}
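The cursor filter in `FindByTenantAsync` translates to a simple predicate — strictly older `createdAt`, or equal `createdAt` with a lexically later `advisoryId` — matching the descending-`CreatedAt` / ascending-`AdvisoryId` compound sort. An in-memory sketch of that predicate:

```csharp
using System;

static bool AfterCursor(
    DateTime createdAt, string advisoryId,
    DateTime cursorCreatedAt, string cursorAdvisoryId)
    // Mirrors the Mongo filter: Lt(createdAt) OR (Eq(createdAt) AND Gt(advisoryId)).
    => createdAt < cursorCreatedAt
    || (createdAt == cursorCreatedAt
        && string.CompareOrdinal(advisoryId, cursorAdvisoryId) > 0);
```

Keyset pagination of this shape stays correct under concurrent inserts, unlike skip/limit paging.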


@@ -0,0 +1,242 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureLinkNotMergeCollectionsMigration : IMongoMigration
{
public string Id => "20251116_link_not_merge_collections";
public string Description => "Ensure advisory_observations and advisory_linksets collections exist with validators and indexes for Link-Not-Merge";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
await EnsureObservationsAsync(database, cancellationToken).ConfigureAwait(false);
await EnsureLinksetsAsync(database, cancellationToken).ConfigureAwait(false);
}
private static async Task EnsureObservationsAsync(IMongoDatabase database, CancellationToken ct)
{
var collectionName = MongoStorageDefaults.Collections.AdvisoryObservations;
var validator = new BsonDocument("$jsonSchema", BuildObservationSchema());
await EnsureCollectionWithValidatorAsync(database, collectionName, validator, ct).ConfigureAwait(false);
var collection = database.GetCollection<BsonDocument>(collectionName);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(new BsonDocument
{
{"tenantId", 1},
{"source", 1},
{"advisoryId", 1},
{"upstream.fetchedAt", -1},
},
new CreateIndexOptions { Name = "obs_tenant_source_adv_fetchedAt" }),
new(new BsonDocument
{
{"provenance.sourceArtifactSha", 1},
},
new CreateIndexOptions { Name = "obs_prov_sourceArtifactSha_unique", Unique = true }),
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureLinksetsAsync(IMongoDatabase database, CancellationToken ct)
{
var collectionName = MongoStorageDefaults.Collections.AdvisoryLinksets;
var validator = new BsonDocument("$jsonSchema", BuildLinksetSchema());
await EnsureCollectionWithValidatorAsync(database, collectionName, validator, ct).ConfigureAwait(false);
var collection = database.GetCollection<BsonDocument>(collectionName);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(new BsonDocument
{
{"tenantId", 1},
{"advisoryId", 1},
{"source", 1},
},
new CreateIndexOptions { Name = "linkset_tenant_advisory_source", Unique = true }),
new(new BsonDocument { { "observations", 1 } }, new CreateIndexOptions { Name = "linkset_observations" })
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureCollectionWithValidatorAsync(
IMongoDatabase database,
string collectionName,
BsonDocument validator,
CancellationToken ct)
{
var filter = new BsonDocument("name", collectionName);
var existing = await database.ListCollectionsAsync(new ListCollectionsOptions { Filter = filter }, ct)
.ConfigureAwait(false);
var exists = await existing.AnyAsync(ct).ConfigureAwait(false);
if (!exists)
{
var options = new CreateCollectionOptions<BsonDocument>
{
Validator = validator,
ValidationLevel = DocumentValidationLevel.Moderate,
ValidationAction = DocumentValidationAction.Error,
};
await database.CreateCollectionAsync(collectionName, options, ct).ConfigureAwait(false);
}
else
{
var command = new BsonDocument
{
{ "collMod", collectionName },
{ "validator", validator },
{ "validationLevel", "moderate" },
{ "validationAction", "error" },
};
await database.RunCommandAsync<BsonDocument>(command, cancellationToken: ct).ConfigureAwait(false);
}
}
private static BsonDocument BuildObservationSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "_id", "tenantId", "source", "advisoryId", "affected", "provenance", "ingestedAt" } },
{ "properties", new BsonDocument
{
{ "_id", new BsonDocument("bsonType", "string") },
{ "tenantId", new BsonDocument("bsonType", "string") },
{ "source", new BsonDocument("bsonType", "string") },
{ "advisoryId", new BsonDocument("bsonType", "string") },
{ "title", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "summary", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "severities", new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "system", "score" } },
{ "properties", new BsonDocument
{
{ "system", new BsonDocument("bsonType", "string") },
{ "score", new BsonDocument("bsonType", new BsonArray { "double", "int", "long", "decimal" }) },
{ "vector", new BsonDocument("bsonType", new BsonArray { "string", "null" }) }
}
}
}
}
}
},
{ "affected", new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "purl" } },
{ "properties", new BsonDocument
{
{ "purl", new BsonDocument("bsonType", "string") },
{ "package", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "versions", new BsonDocument("bsonType", new BsonArray { "array", "null" }) },
{ "ranges", new BsonDocument("bsonType", new BsonArray { "array", "null" }) },
{ "ecosystem", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "cpe", new BsonDocument("bsonType", new BsonArray { "array", "null" }) },
{ "cpes", new BsonDocument("bsonType", new BsonArray { "array", "null" }) }
}
}
}
}
}
},
{ "references", new BsonDocument
{
{ "bsonType", new BsonArray { "array", "null" } },
{ "items", new BsonDocument("bsonType", "string") }
}
},
{ "weaknesses", new BsonDocument
{
{ "bsonType", new BsonArray { "array", "null" } },
{ "items", new BsonDocument("bsonType", "string") }
}
},
{ "published", new BsonDocument("bsonType", new BsonArray { "date", "null" }) },
{ "modified", new BsonDocument("bsonType", new BsonArray { "date", "null" }) },
{ "provenance", new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "sourceArtifactSha", "fetchedAt" } },
{ "properties", new BsonDocument
{
{ "sourceArtifactSha", new BsonDocument("bsonType", "string") },
{ "fetchedAt", new BsonDocument("bsonType", "date") },
{ "ingestJobId", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "signature", new BsonDocument("bsonType", new BsonArray { "object", "null" }) }
}
}
}
},
{ "ingestedAt", new BsonDocument("bsonType", "date") }
}
}
};
}
private static BsonDocument BuildLinksetSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "_id", "tenantId", "source", "advisoryId", "observations", "createdAt" } },
{ "properties", new BsonDocument
{
{ "_id", new BsonDocument("bsonType", "objectId") },
{ "tenantId", new BsonDocument("bsonType", "string") },
{ "source", new BsonDocument("bsonType", "string") },
{ "advisoryId", new BsonDocument("bsonType", "string") },
{ "observations", new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument("bsonType", "string") }
}
},
{ "normalized", new BsonDocument
{
{ "bsonType", new BsonArray { "object", "null" } },
{ "properties", new BsonDocument
{
{ "purls", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "string") } } },
{ "versions", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "string") } } },
{ "ranges", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "object") } } },
{ "severities", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "object") } } }
}
}
}
},
{ "createdAt", new BsonDocument("bsonType", "date") },
{ "builtByJobId", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "provenance", new BsonDocument
{
{ "bsonType", new BsonArray { "object", "null" } },
{ "properties", new BsonDocument
{
{ "observationHashes", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "string") } } },
{ "toolVersion", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "policyHash", new BsonDocument("bsonType", new BsonArray { "string", "null" }) }
}
}
}
}
}
}
};
}
}
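
A minimal sketch of the `collMod` flow used by `EnsureCollectionWithValidatorAsync` above, for readers unfamiliar with Mongo validator semantics. The collection name and schema here are illustrative, and `db` is an assumed reachable `IMongoDatabase`:

```csharp
// Tighten a $jsonSchema validator on an existing collection via collMod.
var validator = new BsonDocument("$jsonSchema", new BsonDocument
{
    { "bsonType", "object" },
    { "required", new BsonArray { "tenantId" } },
});
var collMod = new BsonDocument
{
    { "collMod", "advisory_observations" }, // illustrative collection name
    { "validator", validator },
    { "validationLevel", "moderate" },      // pre-existing invalid docs are grandfathered
    { "validationAction", "error" },        // new violations are rejected on write
};
await db.RunCommandAsync<BsonDocument>(collMod);
```

With `moderate`, documents that already violated the schema before the `collMod` remain readable and updatable; only inserts and updates to currently valid documents are checked, which is why the migration can run safely against a populated collection.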

View File

@@ -0,0 +1,22 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class AdvisoryObservationSink : IAdvisoryObservationSink
{
private readonly IAdvisoryObservationStore _store;
public AdvisoryObservationSink(IAdvisoryObservationStore store)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
}
public Task UpsertAsync(AdvisoryObservation observation, CancellationToken cancellationToken)
{
return _store.UpsertAsync(observation, cancellationToken);
}
}

View File

@@ -80,6 +80,13 @@ public static class ServiceCollectionExtensions
services.AddSingleton<IAdvisoryEventRepository, MongoAdvisoryEventRepository>();
services.AddSingleton<IAdvisoryEventLog, AdvisoryEventLog>();
services.AddSingleton<IAdvisoryRawRepository, MongoAdvisoryRawRepository>();
services.AddSingleton<StellaOps.Concelier.Storage.Mongo.Linksets.MongoAdvisoryLinksetStore>();
services.AddSingleton<StellaOps.Concelier.Core.Linksets.IAdvisoryLinksetStore>(sp =>
sp.GetRequiredService<StellaOps.Concelier.Storage.Mongo.Linksets.MongoAdvisoryLinksetStore>());
services.AddSingleton<StellaOps.Concelier.Core.Linksets.IAdvisoryLinksetLookup>(sp =>
sp.GetRequiredService<StellaOps.Concelier.Storage.Mongo.Linksets.MongoAdvisoryLinksetStore>());
services.AddSingleton<StellaOps.Concelier.Core.Linksets.IAdvisoryObservationSink, StellaOps.Concelier.Storage.Mongo.Linksets.AdvisoryObservationSink>();
services.AddSingleton<StellaOps.Concelier.Core.Linksets.IAdvisoryLinksetSink, StellaOps.Concelier.Storage.Mongo.Linksets.AdvisoryLinksetSink>();
services.AddSingleton<IExportStateStore, ExportStateStore>();
services.TryAddSingleton<ExportStateManager>();

View File

@@ -0,0 +1,94 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Linksets;
using Xunit;
namespace StellaOps.Concelier.Core.Tests.Linksets;
public sealed class AdvisoryLinksetQueryServiceTests
{
[Fact]
public async Task QueryAsync_ReturnsPagedResults_WithCursor()
{
var linksets = new List<AdvisoryLinkset>
{
new("tenant", "ghsa", "adv-003",
ImmutableArray.Create("obs-003"),
new AdvisoryLinksetNormalized(new[]{"pkg:npm/a"}, new[]{"1.0.0"}, null, null),
null, DateTimeOffset.Parse("2025-11-10T12:00:00Z"), null),
new("tenant", "ghsa", "adv-002",
ImmutableArray.Create("obs-002"),
new AdvisoryLinksetNormalized(new[]{"pkg:npm/b"}, new[]{"2.0.0"}, null, null),
null, DateTimeOffset.Parse("2025-11-09T12:00:00Z"), null),
new("tenant", "ghsa", "adv-001",
ImmutableArray.Create("obs-001"),
new AdvisoryLinksetNormalized(new[]{"pkg:npm/c"}, new[]{"3.0.0"}, null, null),
null, DateTimeOffset.Parse("2025-11-08T12:00:00Z"), null),
};
var lookup = new FakeLinksetLookup(linksets);
var service = new AdvisoryLinksetQueryService(lookup);
var firstPage = await service.QueryAsync(new AdvisoryLinksetQueryOptions("tenant", limit: 2), CancellationToken.None);
Assert.Equal(2, firstPage.Linksets.Length);
Assert.True(firstPage.HasMore);
Assert.False(string.IsNullOrWhiteSpace(firstPage.NextCursor));
Assert.Equal("adv-003", firstPage.Linksets[0].AdvisoryId);
Assert.Equal("pkg:npm/a", firstPage.Linksets[0].Normalized?.Purls?.First());
var secondPage = await service.QueryAsync(new AdvisoryLinksetQueryOptions("tenant", limit: 2, Cursor: firstPage.NextCursor), CancellationToken.None);
Assert.Single(secondPage.Linksets);
Assert.False(secondPage.HasMore);
Assert.Null(secondPage.NextCursor);
Assert.Equal("adv-001", secondPage.Linksets[0].AdvisoryId);
}
[Fact]
public async Task QueryAsync_InvalidCursor_ThrowsFormatException()
{
var lookup = new FakeLinksetLookup(Array.Empty<AdvisoryLinkset>());
var service = new AdvisoryLinksetQueryService(lookup);
await Assert.ThrowsAsync<FormatException>(async () =>
{
await service.QueryAsync(new AdvisoryLinksetQueryOptions("tenant", limit: 1, Cursor: "not-base64"), CancellationToken.None);
});
}
private sealed class FakeLinksetLookup : IAdvisoryLinksetLookup
{
private readonly IReadOnlyList<AdvisoryLinkset> _linksets;
public FakeLinksetLookup(IReadOnlyList<AdvisoryLinkset> linksets)
{
_linksets = linksets;
}
public Task<IReadOnlyList<AdvisoryLinkset>> FindByTenantAsync(
string tenantId,
IEnumerable<string>? advisoryIds,
IEnumerable<string>? sources,
AdvisoryLinksetCursor? cursor,
int limit,
CancellationToken cancellationToken)
{
var ordered = _linksets
.Where(ls => ls.TenantId == tenantId)
.OrderByDescending(ls => ls.CreatedAt)
.ThenBy(ls => ls.AdvisoryId, StringComparer.Ordinal)
.ToList();
if (cursor is not null)
{
ordered = ordered
.Where(ls => ls.CreatedAt < cursor.CreatedAt ||
(ls.CreatedAt == cursor.CreatedAt && string.Compare(ls.AdvisoryId, cursor.AdvisoryId, StringComparison.Ordinal) > 0))
.ToList();
}
return Task.FromResult<IReadOnlyList<AdvisoryLinkset>>(ordered.Take(limit).ToList());
}
}
}
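
The tests above treat `NextCursor` as an opaque base64 token and expect a `FormatException` for malformed input. A hypothetical codec consistent with that behavior (the `ticks:advisoryId` layout is an assumption, not the actual implementation):

```csharp
// Hypothetical cursor codec: base64 of "createdAtTicks:advisoryId".
// Convert.FromBase64String throws FormatException for invalid base64,
// matching the QueryAsync_InvalidCursor_ThrowsFormatException test.
static string EncodeCursor(DateTimeOffset createdAt, string advisoryId)
    => Convert.ToBase64String(Encoding.UTF8.GetBytes(
        $"{createdAt.UtcTicks.ToString(CultureInfo.InvariantCulture)}:{advisoryId}"));

static (DateTimeOffset CreatedAt, string AdvisoryId) DecodeCursor(string cursor)
{
    var raw = Encoding.UTF8.GetString(Convert.FromBase64String(cursor));
    var idx = raw.IndexOf(':');
    if (idx < 0) throw new FormatException("Cursor is malformed.");
    var ticks = long.Parse(raw[..idx], CultureInfo.InvariantCulture);
    return (new DateTimeOffset(ticks, TimeSpan.Zero), raw[(idx + 1)..]);
}
```

Keeping the cursor opaque lets the store change its sort key without breaking clients, since callers only echo the token back.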

View File

@@ -205,6 +205,104 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
Assert.Equal("tenant-a:nvd:alpha:1", secondObservations[0].GetProperty("observationId").GetString());
}
[Fact]
public async Task LinksetsEndpoint_ReturnsNormalizedLinksetsFromIngestion()
{
var tenant = "tenant-linkset-ingest";
using var client = _factory.CreateClient();
client.DefaultRequestHeaders.Add("X-Stella-Tenant", tenant);
var firstIngest = await client.PostAsJsonAsync("/ingest/advisory", BuildAdvisoryIngestRequest("sha256:linkset-1", "GHSA-LINK-001", purls: new[] { "pkg:npm/demo@1.0.0" }));
firstIngest.EnsureSuccessStatusCode();
var secondIngest = await client.PostAsJsonAsync("/ingest/advisory", BuildAdvisoryIngestRequest("sha256:linkset-2", "GHSA-LINK-002", purls: new[] { "pkg:npm/demo@2.0.0" }));
secondIngest.EnsureSuccessStatusCode();
var response = await client.GetAsync("/linksets?tenant=tenant-linkset-ingest&limit=10");
response.EnsureSuccessStatusCode();
var payload = await response.Content.ReadFromJsonAsync<AdvisoryLinksetQueryResponse>();
Assert.NotNull(payload);
Assert.Equal(2, payload!.Linksets.Length);
var linksetAdvisoryIds = payload.Linksets.Select(ls => ls.AdvisoryId).OrderBy(id => id, StringComparer.Ordinal).ToArray();
Assert.Equal(new[] { "GHSA-LINK-001", "GHSA-LINK-002" }, linksetAdvisoryIds);
var allPurls = payload.Linksets.SelectMany(ls => ls.Purls).OrderBy(p => p, StringComparer.Ordinal).ToArray();
Assert.Contains("pkg:npm/demo@1.0.0", allPurls);
Assert.Contains("pkg:npm/demo@2.0.0", allPurls);
var versions = payload.Linksets
.SelectMany(ls => ls.Versions)
.Distinct(StringComparer.Ordinal)
.OrderBy(v => v, StringComparer.Ordinal)
.ToArray();
Assert.Contains("1.0.0", versions);
Assert.Contains("2.0.0", versions);
Assert.False(payload.HasMore);
Assert.True(string.IsNullOrEmpty(payload.NextCursor));
}
[Fact]
public async Task LinksetsEndpoint_SupportsCursorPagination()
{
var tenant = "tenant-linkset-page";
var documents = new[]
{
CreateLinksetDocument(
tenant,
"nvd",
"ADV-002",
new[] { "obs-2" },
new[] { "pkg:npm/demo@2.0.0" },
new[] { "2.0.0" },
new DateTime(2025, 1, 6, 0, 0, 0, DateTimeKind.Utc)),
CreateLinksetDocument(
tenant,
"osv",
"ADV-001",
new[] { "obs-1" },
new[] { "pkg:npm/demo@1.0.0" },
new[] { "1.0.0" },
new DateTime(2025, 1, 5, 0, 0, 0, DateTimeKind.Utc)),
CreateLinksetDocument(
"tenant-other",
"osv",
"ADV-999",
new[] { "obs-x" },
new[] { "pkg:npm/other@1.0.0" },
new[] { "1.0.0" },
new DateTime(2025, 1, 4, 0, 0, 0, DateTimeKind.Utc))
};
await SeedLinksetDocumentsAsync(documents);
using var client = _factory.CreateClient();
var firstResponse = await client.GetAsync($"/linksets?tenant={tenant}&limit=1");
firstResponse.EnsureSuccessStatusCode();
var firstPayload = await firstResponse.Content.ReadFromJsonAsync<AdvisoryLinksetQueryResponse>();
Assert.NotNull(firstPayload);
var first = Assert.Single(firstPayload!.Linksets);
Assert.Equal("ADV-002", first.AdvisoryId);
Assert.Equal(new[] { "pkg:npm/demo@2.0.0" }, first.Purls.ToArray());
Assert.Equal(new[] { "2.0.0" }, first.Versions.ToArray());
Assert.True(firstPayload.HasMore);
Assert.False(string.IsNullOrWhiteSpace(firstPayload.NextCursor));
var secondResponse = await client.GetAsync($"/linksets?tenant={tenant}&limit=1&cursor={Uri.EscapeDataString(firstPayload.NextCursor!)}");
secondResponse.EnsureSuccessStatusCode();
var secondPayload = await secondResponse.Content.ReadFromJsonAsync<AdvisoryLinksetQueryResponse>();
Assert.NotNull(secondPayload);
var second = Assert.Single(secondPayload!.Linksets);
Assert.Equal("ADV-001", second.AdvisoryId);
Assert.Equal(new[] { "pkg:npm/demo@1.0.0" }, second.Purls.ToArray());
Assert.Equal(new[] { "1.0.0" }, second.Versions.ToArray());
Assert.False(secondPayload.HasMore);
Assert.True(string.IsNullOrEmpty(secondPayload.NextCursor));
}
[Fact]
public async Task ObservationsEndpoint_ReturnsBadRequestWhenTenantMissing()
{
@@ -1505,6 +1603,52 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
await SeedAdvisoryRawDocumentsAsync(rawDocuments);
}
private async Task SeedLinksetDocumentsAsync(IEnumerable<AdvisoryLinksetDocument> documents)
{
var client = new MongoClient(_runner.ConnectionString);
var database = client.GetDatabase(MongoStorageDefaults.DefaultDatabaseName);
var collection = database.GetCollection<AdvisoryLinksetDocument>(MongoStorageDefaults.Collections.AdvisoryLinksets);
try
{
await database.DropCollectionAsync(MongoStorageDefaults.Collections.AdvisoryLinksets);
}
catch (MongoCommandException ex) when (ex.CodeName == "NamespaceNotFound" || ex.Message.Contains("ns not found", StringComparison.OrdinalIgnoreCase))
{
// Collection not created yet; safe to ignore.
}
var snapshot = documents?.ToArray() ?? Array.Empty<AdvisoryLinksetDocument>();
if (snapshot.Length > 0)
{
await collection.InsertManyAsync(snapshot);
}
}
private static AdvisoryLinksetDocument CreateLinksetDocument(
string tenant,
string source,
string advisoryId,
IEnumerable<string> observationIds,
IEnumerable<string> purls,
IEnumerable<string> versions,
DateTime createdAtUtc)
{
return new AdvisoryLinksetDocument
{
TenantId = tenant,
Source = source,
AdvisoryId = advisoryId,
Observations = observationIds.ToList(),
CreatedAt = DateTime.SpecifyKind(createdAtUtc, DateTimeKind.Utc),
Normalized = new AdvisoryLinksetNormalizedDocument
{
Purls = purls.ToList(),
Versions = versions.ToList()
}
};
}
private static AdvisoryObservationDocument[] BuildSampleObservationDocuments()
{
return new[]

View File

@@ -0,0 +1,44 @@
using System;
using System.Collections.Generic;
using System.Text.Json.Serialization;
namespace StellaOps.Excititor.WebService.Contracts;
public sealed record VexEvidenceChunkResponse(
[property: JsonPropertyName("observationId")] string ObservationId,
[property: JsonPropertyName("linksetId")] string LinksetId,
[property: JsonPropertyName("vulnerabilityId")] string VulnerabilityId,
[property: JsonPropertyName("productKey")] string ProductKey,
[property: JsonPropertyName("providerId")] string ProviderId,
[property: JsonPropertyName("status")] string Status,
[property: JsonPropertyName("justification")] string? Justification,
[property: JsonPropertyName("detail")] string? Detail,
[property: JsonPropertyName("scopeScore")] double? ScopeScore,
[property: JsonPropertyName("firstSeen")] DateTimeOffset FirstSeen,
[property: JsonPropertyName("lastSeen")] DateTimeOffset LastSeen,
[property: JsonPropertyName("scope")] VexEvidenceChunkScope Scope,
[property: JsonPropertyName("document")] VexEvidenceChunkDocument Document,
[property: JsonPropertyName("signature")] VexEvidenceChunkSignature? Signature,
[property: JsonPropertyName("metadata")] IReadOnlyDictionary<string, string> Metadata);
public sealed record VexEvidenceChunkScope(
[property: JsonPropertyName("key")] string Key,
[property: JsonPropertyName("name")] string? Name,
[property: JsonPropertyName("version")] string? Version,
[property: JsonPropertyName("purl")] string? Purl,
[property: JsonPropertyName("cpe")] string? Cpe,
[property: JsonPropertyName("componentIdentifiers")] IReadOnlyList<string> ComponentIdentifiers);
public sealed record VexEvidenceChunkDocument(
[property: JsonPropertyName("digest")] string Digest,
[property: JsonPropertyName("format")] string Format,
[property: JsonPropertyName("sourceUri")] string SourceUri,
[property: JsonPropertyName("revision")] string? Revision);
public sealed record VexEvidenceChunkSignature(
[property: JsonPropertyName("type")] string Type,
[property: JsonPropertyName("subject")] string? Subject,
[property: JsonPropertyName("issuer")] string? Issuer,
[property: JsonPropertyName("keyId")] string? KeyId,
[property: JsonPropertyName("verifiedAt")] DateTimeOffset? VerifiedAt,
[property: JsonPropertyName("transparencyRef")] string? TransparencyRef);

View File

@@ -4,6 +4,7 @@ using System.Linq;
using System.Collections.Immutable;
using System.Globalization;
using System.Text;
using System.Text.Json;
using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
@@ -51,11 +52,12 @@ services.AddOptions<ExcititorObservabilityOptions>()
services.AddScoped<ExcititorHealthService>();
services.AddExcititorAocGuards();
services.AddVexExportEngine();
services.AddVexExportCacheServices();
services.AddVexAttestation();
services.Configure<VexAttestationClientOptions>(configuration.GetSection("Excititor:Attestation:Client"));
services.Configure<VexAttestationVerificationOptions>(configuration.GetSection("Excititor:Attestation:Verification"));
services.AddVexPolicy();
services.AddSingleton<IVexEvidenceChunkService, VexEvidenceChunkService>();
services.AddRedHatCsafConnector();
services.Configure<MirrorDistributionOptions>(configuration.GetSection(MirrorDistributionOptions.SectionName));
services.AddSingleton<MirrorRateLimiter>();
@@ -515,6 +517,69 @@ app.MapGet("/v1/vex/observations/{vulnerabilityId}/{productKey}", async (
return Results.Json(response);
});
app.MapGet("/v1/vex/evidence/chunks", async (
HttpContext context,
[FromServices] IVexEvidenceChunkService chunkService,
[FromServices] IOptions<VexMongoStorageOptions> storageOptions,
CancellationToken cancellationToken) =>
{
var scopeResult = ScopeAuthorization.RequireScope(context, "vex.read");
if (scopeResult is not null)
{
return scopeResult;
}
if (!TryResolveTenant(context, storageOptions.Value, requireHeader: false, out var tenant, out var tenantError))
{
return tenantError;
}
var vulnerabilityId = context.Request.Query["vulnerabilityId"].FirstOrDefault();
var productKey = context.Request.Query["productKey"].FirstOrDefault();
if (string.IsNullOrWhiteSpace(vulnerabilityId) || string.IsNullOrWhiteSpace(productKey))
{
return ValidationProblem("vulnerabilityId and productKey are required.");
}
var providerFilter = BuildStringFilterSet(context.Request.Query["providerId"]);
var statusFilter = BuildStatusFilter(context.Request.Query["status"]);
var since = ParseSinceTimestamp(context.Request.Query["since"]);
var limit = ResolveLimit(context.Request.Query["limit"], defaultValue: 200, min: 1, max: 500);
var request = new VexEvidenceChunkRequest(
tenant,
vulnerabilityId.Trim(),
productKey.Trim(),
providerFilter,
statusFilter,
since,
limit);
VexEvidenceChunkResult result;
try
{
result = await chunkService.QueryAsync(request, cancellationToken).ConfigureAwait(false);
}
catch (OperationCanceledException)
{
return Results.StatusCode(StatusCodes.Status499ClientClosedRequest);
}
context.Response.Headers["X-Total-Count"] = result.TotalCount.ToString(CultureInfo.InvariantCulture);
context.Response.Headers["X-Truncated"] = result.Truncated ? "true" : "false";
context.Response.ContentType = "application/x-ndjson";
var options = new JsonSerializerOptions(JsonSerializerDefaults.Web);
foreach (var chunk in result.Chunks)
{
var line = JsonSerializer.Serialize(chunk, options);
await context.Response.WriteAsync(line, cancellationToken).ConfigureAwait(false);
await context.Response.WriteAsync("\n", cancellationToken).ConfigureAwait(false);
}
return Results.Empty;
});
app.MapPost("/aoc/verify", async (
HttpContext context,
VexAocVerifyRequest? request,

View File

@@ -0,0 +1,130 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Storage.Mongo;
using StellaOps.Excititor.WebService.Contracts;
namespace StellaOps.Excititor.WebService.Services;
internal interface IVexEvidenceChunkService
{
Task<VexEvidenceChunkResult> QueryAsync(VexEvidenceChunkRequest request, CancellationToken cancellationToken);
}
internal sealed record VexEvidenceChunkRequest(
string Tenant,
string VulnerabilityId,
string ProductKey,
ImmutableHashSet<string> ProviderIds,
ImmutableHashSet<VexClaimStatus> Statuses,
DateTimeOffset? Since,
int Limit);
internal sealed record VexEvidenceChunkResult(
IReadOnlyList<VexEvidenceChunkResponse> Chunks,
bool Truncated,
int TotalCount,
DateTimeOffset GeneratedAtUtc);
internal sealed class VexEvidenceChunkService : IVexEvidenceChunkService
{
private readonly IVexClaimStore _claimStore;
private readonly TimeProvider _timeProvider;
public VexEvidenceChunkService(IVexClaimStore claimStore, TimeProvider timeProvider)
{
_claimStore = claimStore ?? throw new ArgumentNullException(nameof(claimStore));
_timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
}
public async Task<VexEvidenceChunkResult> QueryAsync(VexEvidenceChunkRequest request, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(request);
var claims = await _claimStore
.FindAsync(request.VulnerabilityId, request.ProductKey, request.Since, cancellationToken)
.ConfigureAwait(false);
var filtered = claims
.Where(claim => MatchesProvider(claim, request.ProviderIds))
.Where(claim => MatchesStatus(claim, request.Statuses))
.OrderByDescending(claim => claim.LastSeen)
.ToList();
var total = filtered.Count;
if (filtered.Count > request.Limit)
{
filtered = filtered.Take(request.Limit).ToList();
}
var chunks = filtered
.Select(MapChunk)
.ToList();
return new VexEvidenceChunkResult(
chunks,
total > request.Limit,
total,
_timeProvider.GetUtcNow());
}
private static bool MatchesProvider(VexClaim claim, ImmutableHashSet<string> providers)
=> providers.Count == 0 || providers.Contains(claim.ProviderId, StringComparer.OrdinalIgnoreCase);
private static bool MatchesStatus(VexClaim claim, ImmutableHashSet<VexClaimStatus> statuses)
=> statuses.Count == 0 || statuses.Contains(claim.Status);
private static VexEvidenceChunkResponse MapChunk(VexClaim claim)
{
var observationId = string.Create(CultureInfo.InvariantCulture, $"{claim.ProviderId}:{claim.Document.Digest}");
var linksetId = string.Create(CultureInfo.InvariantCulture, $"{claim.VulnerabilityId}:{claim.Product.Key}");
var scope = new VexEvidenceChunkScope(
claim.Product.Key,
claim.Product.Name,
claim.Product.Version,
claim.Product.Purl,
claim.Product.Cpe,
claim.Product.ComponentIdentifiers);
var document = new VexEvidenceChunkDocument(
claim.Document.Digest,
claim.Document.Format.ToString().ToLowerInvariant(),
claim.Document.SourceUri.ToString(),
claim.Document.Revision);
var signature = claim.Document.Signature is null
? null
: new VexEvidenceChunkSignature(
claim.Document.Signature.Type,
claim.Document.Signature.Subject,
claim.Document.Signature.Issuer,
claim.Document.Signature.KeyId,
claim.Document.Signature.VerifiedAt,
claim.Document.Signature.TransparencyLogReference);
var scopeScore = claim.Confidence?.Score ?? claim.Signals?.Severity?.Score;
return new VexEvidenceChunkResponse(
observationId,
linksetId,
claim.VulnerabilityId,
claim.Product.Key,
claim.ProviderId,
claim.Status.ToString(),
claim.Justification?.ToString(),
claim.Detail,
scopeScore,
claim.FirstSeen,
claim.LastSeen,
scope,
document,
signature,
claim.AdditionalMetadata);
}
}
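
Since the endpoint above streams `application/x-ndjson` (one JSON object per line, terminated by `\n`), a consumer parses line by line rather than buffering the whole body. A sketch, where the base address, query values, and auth handling are assumptions:

```csharp
// Read the /v1/vex/evidence/chunks NDJSON stream one line at a time.
using var http = new HttpClient { BaseAddress = new Uri("https://excititor.example.internal") };
using var stream = await http.GetStreamAsync(
    "/v1/vex/evidence/chunks?vulnerabilityId=CVE-2025-0001&productKey=pkg:npm/demo");
using var reader = new StreamReader(stream);
while (await reader.ReadLineAsync() is { } line)
{
    if (line.Length == 0) continue;
    using var chunk = JsonDocument.Parse(line); // each line is a standalone JSON object
    Console.WriteLine(chunk.RootElement.GetProperty("observationId").GetString());
}
```

Line-delimited JSON keeps memory flat for large evidence sets and lets the server start flushing before the query completes; the `X-Total-Count` and `X-Truncated` headers arrive before the body, so clients can decide early whether to re-query with a narrower filter.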

View File

@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Excititor.Attestation.Tests")]

View File

@@ -0,0 +1,76 @@
using System;
using System.Collections.Immutable;
namespace StellaOps.Excititor.Core.Observations;
/// <summary>
/// Immutable timeline event emitted for ingest/linkset changes with deterministic field ordering.
/// </summary>
public sealed record TimelineEvent
{
public TimelineEvent(
string eventId,
string tenant,
string providerId,
string streamId,
string eventType,
string traceId,
string justificationSummary,
DateTimeOffset createdAt,
string? evidenceHash = null,
string? payloadHash = null,
ImmutableDictionary<string, string>? attributes = null)
{
EventId = Ensure(eventId, nameof(eventId));
Tenant = Ensure(tenant, nameof(tenant)).ToLowerInvariant();
ProviderId = Ensure(providerId, nameof(providerId));
StreamId = Ensure(streamId, nameof(streamId));
EventType = Ensure(eventType, nameof(eventType));
TraceId = Ensure(traceId, nameof(traceId));
JustificationSummary = justificationSummary?.Trim() ?? string.Empty;
EvidenceHash = evidenceHash?.Trim();
PayloadHash = payloadHash?.Trim();
CreatedAt = createdAt;
Attributes = Normalize(attributes);
}
public string EventId { get; }
public string Tenant { get; }
public string ProviderId { get; }
public string StreamId { get; }
public string EventType { get; }
public string TraceId { get; }
public string JustificationSummary { get; }
public string? EvidenceHash { get; }
public string? PayloadHash { get; }
public DateTimeOffset CreatedAt { get; }
public ImmutableDictionary<string, string> Attributes { get; }
private static string Ensure(string value, string name)
{
if (string.IsNullOrWhiteSpace(value))
{
throw new ArgumentException($"{name} cannot be null or whitespace", name);
}
return value.Trim();
}
private static ImmutableDictionary<string, string> Normalize(ImmutableDictionary<string, string>? attributes)
{
if (attributes is null || attributes.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
foreach (var kv in attributes)
{
if (string.IsNullOrWhiteSpace(kv.Key) || kv.Value is null)
{
continue;
}
builder[kv.Key.Trim()] = kv.Value;
}
return builder.ToImmutable();
}
}

View File

@@ -0,0 +1,99 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
namespace StellaOps.Excititor.Core;
/// <summary>
/// Aggregation-only attestation payload describing evidence supplier identity and the observation/linkset it covers.
/// Used by Advisory AI / Policy to chain trust without Excititor interpreting verdicts.
/// </summary>
public sealed record VexAttestationPayload
{
public VexAttestationPayload(
string attestationId,
string supplierId,
string observationId,
string linksetId,
string vulnerabilityId,
string productKey,
string? justificationSummary,
DateTimeOffset issuedAt,
ImmutableDictionary<string, string>? metadata = null)
{
AttestationId = EnsureNotNullOrWhiteSpace(attestationId, nameof(attestationId));
SupplierId = EnsureNotNullOrWhiteSpace(supplierId, nameof(supplierId));
ObservationId = EnsureNotNullOrWhiteSpace(observationId, nameof(observationId));
LinksetId = EnsureNotNullOrWhiteSpace(linksetId, nameof(linksetId));
VulnerabilityId = EnsureNotNullOrWhiteSpace(vulnerabilityId, nameof(vulnerabilityId));
ProductKey = EnsureNotNullOrWhiteSpace(productKey, nameof(productKey));
JustificationSummary = TrimToNull(justificationSummary);
IssuedAt = issuedAt.ToUniversalTime();
Metadata = NormalizeMetadata(metadata);
}
public string AttestationId { get; }
public string SupplierId { get; }
public string ObservationId { get; }
public string LinksetId { get; }
public string VulnerabilityId { get; }
public string ProductKey { get; }
public string? JustificationSummary { get; }
public DateTimeOffset IssuedAt { get; }
public ImmutableDictionary<string, string> Metadata { get; }
private static ImmutableDictionary<string, string> NormalizeMetadata(ImmutableDictionary<string, string>? metadata)
{
if (metadata is null || metadata.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
foreach (var pair in metadata.OrderBy(kv => kv.Key, StringComparer.Ordinal))
{
var key = TrimToNull(pair.Key);
var value = TrimToNull(pair.Value);
if (key is null || value is null)
{
continue;
}
builder[key] = value;
}
return builder.ToImmutable();
}
private static string EnsureNotNullOrWhiteSpace(string value, string name)
=> string.IsNullOrWhiteSpace(value) ? throw new ArgumentException($"{name} must be provided.", name) : value.Trim();
private static string? TrimToNull(string? value)
=> string.IsNullOrWhiteSpace(value) ? null : value.Trim();
}
/// <summary>
/// Lightweight mapping from attestation IDs back to the observation/linkset/product tuple for provenance tracing.
/// </summary>
public sealed record VexAttestationLink
{
public VexAttestationLink(string attestationId, string observationId, string linksetId, string productKey)
{
AttestationId = EnsureNotNullOrWhiteSpace(attestationId, nameof(attestationId));
ObservationId = EnsureNotNullOrWhiteSpace(observationId, nameof(observationId));
LinksetId = EnsureNotNullOrWhiteSpace(linksetId, nameof(linksetId));
ProductKey = EnsureNotNullOrWhiteSpace(productKey, nameof(productKey));
}
public string AttestationId { get; }
public string ObservationId { get; }
public string LinksetId { get; }
public string ProductKey { get; }
private static string EnsureNotNullOrWhiteSpace(string value, string name)
=> string.IsNullOrWhiteSpace(value) ? throw new ArgumentException($"{name} must be provided.", name) : value.Trim();
}
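
A usage sketch for the payload record above, showing how `NormalizeMetadata` and `TrimToNull` shape the stored values (all identifiers and URLs here are illustrative):

```csharp
// Metadata keys/values are trimmed; pairs with blank keys or values are dropped.
var payload = new VexAttestationPayload(
    attestationId: "att-123",
    supplierId: "supplier-a",
    observationId: "obs-1",
    linksetId: "CVE-2025-0001:pkg:npm/demo",
    vulnerabilityId: "CVE-2025-0001",
    productKey: "pkg:npm/demo",
    justificationSummary: "  not affected  ", // stored trimmed
    issuedAt: DateTimeOffset.Now,             // normalized to UTC by the constructor
    metadata: ImmutableDictionary<string, string>.Empty
        .Add(" rekor ", "https://rekor.example ") // key and value are trimmed
        .Add("empty", " "));                      // dropped: whitespace-only value
```

Because the record validates and normalizes in its constructor, any instance that exists is already canonical, which keeps downstream hashing and signing deterministic.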

View File

@@ -0,0 +1,12 @@
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Storage.Mongo;
public interface IVexAttestationLinkStore
{
ValueTask UpsertAsync(VexAttestationPayload payload, CancellationToken cancellationToken);
ValueTask<VexAttestationPayload?> FindAsync(string attestationId, CancellationToken cancellationToken);
}

View File

@@ -0,0 +1,43 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Storage.Mongo;
public sealed class MongoVexAttestationLinkStore : IVexAttestationLinkStore
{
private readonly IMongoCollection<VexAttestationLinkRecord> _collection;
public MongoVexAttestationLinkStore(IMongoDatabase database)
{
ArgumentNullException.ThrowIfNull(database);
VexMongoMappingRegistry.Register();
_collection = database.GetCollection<VexAttestationLinkRecord>(VexMongoCollectionNames.Attestations);
}
public async ValueTask UpsertAsync(VexAttestationPayload payload, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(payload);
var record = VexAttestationLinkRecord.FromDomain(payload);
var filter = Builders<VexAttestationLinkRecord>.Filter.Eq(x => x.AttestationId, record.AttestationId);
var options = new ReplaceOptions { IsUpsert = true };
await _collection.ReplaceOneAsync(filter, record, options, cancellationToken).ConfigureAwait(false);
}
public async ValueTask<VexAttestationPayload?> FindAsync(string attestationId, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(attestationId))
{
throw new ArgumentException("Attestation id must be provided.", nameof(attestationId));
}
var filter = Builders<VexAttestationLinkRecord>.Filter.Eq(x => x.AttestationId, attestationId.Trim());
var record = await _collection.Find(filter).FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return record?.ToDomain();
}
}

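A minimal usage sketch for the store above, illustrating the upsert-then-find round trip. This is illustrative only: the connection string and database name are placeholders, and the `VexAttestationPayload` constructor shape is inferred from `VexAttestationLinkRecord.ToDomain` in this commit.

```csharp
// Hypothetical caller; constants here are placeholders, not shipped configuration.
var database = new MongoClient("mongodb://localhost:27017").GetDatabase("excititor");
IVexAttestationLinkStore store = new MongoVexAttestationLinkStore(database);

var payload = new VexAttestationPayload(
    "att-123", "supplier-a", "obs-1", "link-1",
    "CVE-2025-0001", "pkg:demo",
    justificationSummary: null,
    DateTimeOffset.UtcNow,
    ImmutableDictionary<string, string>.Empty);

// ReplaceOneAsync with IsUpsert = true makes this idempotent per AttestationId.
await store.UpsertAsync(payload, CancellationToken.None);
var found = await store.FindAsync("att-123", CancellationToken.None);
```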

@@ -0,0 +1,63 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using MongoDB.Bson.Serialization.Attributes;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Storage.Mongo;
[BsonIgnoreExtraElements]
internal sealed class VexAttestationLinkRecord
{
[BsonId]
public string AttestationId { get; set; } = default!;
public string SupplierId { get; set; } = default!;
public string ObservationId { get; set; } = default!;
public string LinksetId { get; set; } = default!;
public string VulnerabilityId { get; set; } = default!;
public string ProductKey { get; set; } = default!;
public string? JustificationSummary { get; set; }
= null;
public DateTime IssuedAt { get; set; }
= DateTime.SpecifyKind(DateTime.UtcNow, DateTimeKind.Utc);
public Dictionary<string, string> Metadata { get; set; } = new(StringComparer.Ordinal);
public static VexAttestationLinkRecord FromDomain(VexAttestationPayload payload)
=> new()
{
AttestationId = payload.AttestationId,
SupplierId = payload.SupplierId,
ObservationId = payload.ObservationId,
LinksetId = payload.LinksetId,
VulnerabilityId = payload.VulnerabilityId,
ProductKey = payload.ProductKey,
JustificationSummary = payload.JustificationSummary,
IssuedAt = payload.IssuedAt.UtcDateTime,
Metadata = payload.Metadata.ToDictionary(kv => kv.Key, kv => kv.Value, StringComparer.Ordinal),
};
public VexAttestationPayload ToDomain()
{
var metadata = (Metadata ?? new Dictionary<string, string>(StringComparer.Ordinal))
.ToImmutableDictionary(StringComparer.Ordinal);
return new VexAttestationPayload(
AttestationId,
SupplierId,
ObservationId,
LinksetId,
VulnerabilityId,
ProductKey,
JustificationSummary,
new DateTimeOffset(DateTime.SpecifyKind(IssuedAt, DateTimeKind.Utc)),
metadata);
}
}


@@ -70,10 +70,11 @@ public static class VexMongoCollectionNames
public const string Providers = "vex.providers";
public const string Raw = "vex.raw";
public const string Statements = "vex.statements";
public const string Claims = Statements;
public const string Consensus = "vex.consensus";
public const string Exports = "vex.exports";
public const string Cache = "vex.cache";
public const string ConnectorState = "vex.connector_state";
public const string ConsensusHolds = "vex.consensus_holds";
public const string Attestations = "vex.attestations";
}


@@ -0,0 +1,41 @@
using System;
using System.Collections.Immutable;
using FluentAssertions;
using StellaOps.Excititor.Core.Observations;
using Xunit;
namespace StellaOps.Excititor.Core.Tests.Observations;
public class TimelineEventTests
{
[Fact]
public void Normalizes_and_requires_fields()
{
var evt = new TimelineEvent(
eventId: " EVT-1 ",
tenant: "TenantA",
providerId: "prov",
streamId: "stream",
eventType: "ingest",
traceId: "trace-123",
justificationSummary: " summary ",
createdAt: DateTimeOffset.UnixEpoch,
evidenceHash: " evhash ",
payloadHash: " pwhash ",
attributes: ImmutableDictionary<string, string>.Empty.Add(" a ", " b " ));
evt.EventId.Should().Be("EVT-1");
evt.Tenant.Should().Be("tenanta");
evt.JustificationSummary.Should().Be("summary");
evt.EvidenceHash.Should().Be("evhash");
evt.PayloadHash.Should().Be("pwhash");
evt.Attributes.Should().ContainKey("a");
}
[Fact]
public void Throws_on_missing_required()
{
Action act = () => new TimelineEvent(" ", "t", "p", "s", "t", "trace", "", DateTimeOffset.UtcNow);
act.Should().Throw<ArgumentException>();
}
}


@@ -0,0 +1,15 @@
using System;
using System.Collections.Immutable;
using FluentAssertions;
using StellaOps.Excititor.Core;
using Xunit;
namespace StellaOps.Excititor.Core.Tests;
public sealed class VexAttestationPayloadTests
{
[Fact]
public void Payload_NormalizesAndOrdersMetadata()
{
var metadata = ImmutableDictionary<string, string>.Empty
.Add(b,


@@ -0,0 +1,86 @@
using System;
using System.Collections.Generic;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using EphemeralMongo;
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.Extensions.Configuration;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Storage.Mongo;
using Xunit;
namespace StellaOps.Excititor.WebService.Tests;
public sealed class VexAttestationLinkEndpointTests : IDisposable
{
private readonly IMongoRunner _runner;
private readonly TestWebApplicationFactory _factory;
public VexAttestationLinkEndpointTests()
{
_runner = MongoRunner.Run(new MongoRunnerOptions { UseSingleNodeReplicaSet = true });
_factory = new TestWebApplicationFactory(
configureConfiguration: configuration =>
{
configuration.AddInMemoryCollection(new Dictionary<string, string?>
{
["Excititor:Storage:Mongo:ConnectionString"] = _runner.ConnectionString,
["Excititor:Storage:Mongo:DatabaseName"] = "vex_attestation_links",
["Excititor:Storage:Mongo:DefaultTenant"] = "tests",
});
},
configureServices: services =>
{
TestServiceOverrides.Apply(services);
services.AddTestAuthentication();
});
SeedLink();
}
[Fact]
public async Task GetAttestationLink_ReturnsPayload()
{
using var client = _factory.CreateClient(new WebApplicationFactoryClientOptions { AllowAutoRedirect = false });
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "vex.read");
var response = await client.GetAsync("/v1/vex/attestations/att-123");
response.EnsureSuccessStatusCode();
var payload = await response.Content.ReadFromJsonAsync<VexAttestationPayload>();
Assert.NotNull(payload);
Assert.Equal("att-123", payload!.AttestationId);
Assert.Equal("supplier-a", payload.SupplierId);
Assert.Equal("CVE-2025-0001", payload.VulnerabilityId);
Assert.Equal("pkg:demo", payload.ProductKey);
}
private void SeedLink()
{
var client = new MongoDB.Driver.MongoClient(_runner.ConnectionString);
var database = client.GetDatabase("vex_attestation_links");
var collection = database.GetCollection<VexAttestationLinkRecord>(VexMongoCollectionNames.Attestations);
var record = new VexAttestationLinkRecord
{
AttestationId = "att-123",
SupplierId = "supplier-a",
ObservationId = "obs-1",
LinksetId = "link-1",
VulnerabilityId = "CVE-2025-0001",
ProductKey = "pkg:demo",
JustificationSummary = "summary",
IssuedAt = DateTime.UtcNow,
Metadata = new Dictionary<string, string> { ["policyRevisionId"] = "rev-1" },
};
collection.InsertOne(record);
}
public void Dispose()
{
_factory.Dispose();
_runner.Dispose();
}
}


@@ -0,0 +1,117 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using FluentAssertions;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Storage.Mongo;
using StellaOps.Excititor.WebService.Services;
using Xunit;
namespace StellaOps.Excititor.WebService.Tests;
public sealed class VexEvidenceChunkServiceTests
{
[Fact]
public async Task QueryAsync_FiltersAndLimitsResults()
{
var now = new DateTimeOffset(2025, 11, 16, 12, 0, 0, TimeSpan.Zero);
var claims = new[]
{
CreateClaim("provider-a", VexClaimStatus.Affected, now.AddHours(-6), now.AddHours(-5), score: 0.9),
CreateClaim("provider-b", VexClaimStatus.NotAffected, now.AddHours(-4), now.AddHours(-3), score: 0.2)
};
var service = new VexEvidenceChunkService(new FakeClaimStore(claims), new FixedTimeProvider(now));
var request = new VexEvidenceChunkRequest(
Tenant: "tenant-a",
VulnerabilityId: "CVE-2025-0001",
ProductKey: "pkg:docker/demo",
ProviderIds: ImmutableHashSet.Create("provider-b"),
Statuses: ImmutableHashSet.Create(VexClaimStatus.NotAffected),
Since: null,
Limit: 1);
var result = await service.QueryAsync(request, CancellationToken.None);
result.Truncated.Should().BeTrue();
result.TotalCount.Should().Be(1);
result.GeneratedAtUtc.Should().Be(now);
var chunk = result.Chunks.Single();
chunk.ProviderId.Should().Be("provider-b");
chunk.Status.Should().Be(VexClaimStatus.NotAffected.ToString());
chunk.ScopeScore.Should().Be(0.2);
chunk.ObservationId.Should().Contain("provider-b");
chunk.Document.Digest.Should().NotBeNullOrWhiteSpace();
}
private static VexClaim CreateClaim(string providerId, VexClaimStatus status, DateTimeOffset firstSeen, DateTimeOffset lastSeen, double? score)
{
var product = new VexProduct("pkg:docker/demo", "demo", "1.0.0", "pkg:docker/demo:1.0.0", null, new[] { "component-a" });
var document = new VexClaimDocument(
VexDocumentFormat.SbomCycloneDx,
digest: Guid.NewGuid().ToString("N"),
sourceUri: new Uri("https://example.test/vex.json"),
revision: "r1",
signature: new VexSignatureMetadata("cosign", "demo", "issuer", keyId: "kid", verifiedAt: firstSeen, transparencyLogReference: null));
var signals = score.HasValue
? new VexSignalSnapshot(new VexSeveritySignal("cvss", score, "low", vector: null), Kev: null, Epss: null)
: null;
return new VexClaim(
"CVE-2025-0001",
providerId,
product,
status,
document,
firstSeen,
lastSeen,
justification: VexJustification.ComponentNotPresent,
detail: "demo detail",
confidence: null,
signals: signals,
additionalMetadata: ImmutableDictionary<string, string>.Empty);
}
private sealed class FakeClaimStore : IVexClaimStore
{
private readonly IReadOnlyCollection<VexClaim> _claims;
public FakeClaimStore(IReadOnlyCollection<VexClaim> claims)
{
_claims = claims;
}
public ValueTask AppendAsync(IEnumerable<VexClaim> claims, DateTimeOffset observedAt, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
=> throw new NotSupportedException();
public ValueTask<IReadOnlyCollection<VexClaim>> FindAsync(string vulnerabilityId, string productKey, DateTimeOffset? since, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
{
var query = _claims
.Where(claim => claim.VulnerabilityId == vulnerabilityId)
.Where(claim => claim.Product.Key == productKey);
if (since.HasValue)
{
query = query.Where(claim => claim.LastSeen >= since.Value);
}
return ValueTask.FromResult<IReadOnlyCollection<VexClaim>>(query.ToList());
}
}
private sealed class FixedTimeProvider : TimeProvider
{
private readonly DateTimeOffset _timestamp;
public FixedTimeProvider(DateTimeOffset timestamp)
{
_timestamp = timestamp;
}
public override DateTimeOffset GetUtcNow() => _timestamp;
}
}


@@ -0,0 +1,128 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http.Headers;
using System.Text.Json;
using System.Threading.Tasks;
using EphemeralMongo;
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using MongoDB.Driver;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Storage.Mongo;
using StellaOps.Excititor.WebService.Contracts;
using Xunit;
namespace StellaOps.Excititor.WebService.Tests;
public sealed class VexEvidenceChunksEndpointTests : IDisposable
{
private readonly IMongoRunner _runner;
private readonly TestWebApplicationFactory _factory;
public VexEvidenceChunksEndpointTests()
{
_runner = MongoRunner.Run(new MongoRunnerOptions { UseSingleNodeReplicaSet = true });
_factory = new TestWebApplicationFactory(
configureConfiguration: configuration =>
{
configuration.AddInMemoryCollection(new Dictionary<string, string?>
{
["Excititor:Storage:Mongo:ConnectionString"] = _runner.ConnectionString,
["Excititor:Storage:Mongo:DatabaseName"] = "vex_chunks_tests",
["Excititor:Storage:Mongo:DefaultTenant"] = "tests",
});
},
configureServices: services =>
{
TestServiceOverrides.Apply(services);
services.AddTestAuthentication();
});
SeedStatements();
}
[Fact]
public async Task ChunksEndpoint_Filters_ByProvider_AndStreamsNdjson()
{
using var client = _factory.CreateClient(new WebApplicationFactoryClientOptions { AllowAutoRedirect = false });
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", "vex.read");
client.DefaultRequestHeaders.Add("X-Stella-Tenant", "tests");
var response = await client.GetAsync("/v1/vex/evidence/chunks?vulnerabilityId=CVE-2025-0001&productKey=pkg:docker/demo&providerId=provider-b&limit=1");
response.EnsureSuccessStatusCode();
Assert.True(response.Headers.TryGetValues("Excititor-Results-Truncated", out var truncatedValues));
Assert.Contains("true", truncatedValues, StringComparer.OrdinalIgnoreCase);
var body = await response.Content.ReadAsStringAsync();
var lines = body.Split('\n', StringSplitOptions.RemoveEmptyEntries);
Assert.Single(lines);
var chunk = JsonSerializer.Deserialize<VexEvidenceChunkResponse>(lines[0], new JsonSerializerOptions(JsonSerializerDefaults.Web));
Assert.NotNull(chunk);
Assert.Equal("provider-b", chunk!.ProviderId);
Assert.Equal("NotAffected", chunk.Status);
Assert.Equal("pkg:docker/demo", chunk.Scope.Key);
Assert.Equal("CVE-2025-0001", chunk.VulnerabilityId);
}
private void SeedStatements()
{
var client = new MongoClient(_runner.ConnectionString);
var database = client.GetDatabase("vex_chunks_tests");
var collection = database.GetCollection<VexStatementRecord>(VexMongoCollectionNames.Statements);
var now = DateTimeOffset.UtcNow;
var claims = new[]
{
CreateClaim("provider-a", VexClaimStatus.Affected, now.AddHours(-6), now.AddHours(-5), 0.9),
CreateClaim("provider-b", VexClaimStatus.NotAffected, now.AddHours(-4), now.AddHours(-3), 0.2),
CreateClaim("provider-c", VexClaimStatus.Affected, now.AddHours(-2), now.AddHours(-1), 0.5)
};
var records = claims
.Select(claim => VexStatementRecord.FromDomain(claim, now))
.ToList();
collection.InsertMany(records);
}
private static VexClaim CreateClaim(string providerId, VexClaimStatus status, DateTimeOffset firstSeen, DateTimeOffset lastSeen, double? score)
{
var product = new VexProduct("pkg:docker/demo", "demo", "1.0.0", "pkg:docker/demo:1.0.0", null, new[] { "component-a" });
var document = new VexClaimDocument(
VexDocumentFormat.SbomCycloneDx,
digest: Guid.NewGuid().ToString("N"),
sourceUri: new Uri("https://example.test/vex.json"),
revision: "r1",
signature: new VexSignatureMetadata("cosign", "demo", "issuer", keyId: "kid", verifiedAt: firstSeen, transparencyLogReference: null));
var signals = score.HasValue
? new VexSignalSnapshot(new VexSeveritySignal("cvss", score, "low", vector: null), Kev: null, Epss: null)
: null;
return new VexClaim(
"CVE-2025-0001",
providerId,
product,
status,
document,
firstSeen,
lastSeen,
justification: VexJustification.ComponentNotPresent,
detail: "demo detail",
confidence: null,
signals: signals,
additionalMetadata: null);
}
public void Dispose()
{
_factory.Dispose();
_runner.Dispose();
}
}


@@ -0,0 +1,7 @@
// Temporary shim for compilers that do not surface System.Runtime.CompilerServices.IsExternalInit
// (needed for record types). Remove when toolchain natively provides the type.
namespace System.Runtime.CompilerServices;
internal static class IsExternalInit
{
}


@@ -0,0 +1,2 @@
{"tenant": "tenant-a", "chain_id": "c8d6f7f1-58f8-4c2d-8d92-f9b8790a0001", "sequence_no": 1, "event_id": "c0e6d9b4-1d89-4b07-b622-1c7b6d111001", "event_type": "finding.assignment", "policy_version": "2025.01", "finding_id": "F-001", "artifact_id": "artifact-1", "actor_id": "system", "actor_type": "system", "occurred_at": "2025-01-01T00:00:00Z", "recorded_at": "2025-01-01T00:00:01Z", "payload": {"comment": "seed event"}, "previous_hash": "0000000000000000000000000000000000000000000000000000000000000000", "event_hash": "0d95f63532b6488407e8fd2e837edb3e9bfc8a2defde232aca99dbfd518558c6", "merkle_leaf_hash": "d08d4da76da50fbe4274a394c73fcaae0180fd591238d224bc7d5efee2ad3696"}
{"tenant": "tenant-a", "chain_id": "c8d6f7f1-58f8-4c2d-8d92-f9b8790a0001", "sequence_no": 2, "event_id": "c0e6d9b4-1d89-4b07-b622-1c7b6d111002", "event_type": "finding.comment", "policy_version": "2025.01", "finding_id": "F-001", "artifact_id": "artifact-1", "actor_id": "analyst", "actor_type": "operator", "occurred_at": "2025-01-01T00:00:10Z", "recorded_at": "2025-01-01T00:00:11Z", "payload": {"comment": "follow-up"}, "previous_hash": "PLACEHOLDER", "event_hash": "0e77979af948be38de028a2497f15529473ae5aeb0a95f5d9d648efc8afb9fa3", "merkle_leaf_hash": "2854050efba048f2674ba27fd7dc2f1b65e90e150098bfeeb4fc6e23334c3790"}
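The fixture above encodes the ledger's hash chain: the first event's `previous_hash` is 64 zero hex characters, and each subsequent event must carry its predecessor's `event_hash`. The real hashes come from `LedgerHashing.ComputeHashes` over the canonical envelope; the sketch below shows only the chain invariant itself, under that assumption.

```csharp
// Illustrative chain check, not the production verifier.
const string EmptyHash = "0000000000000000000000000000000000000000000000000000000000000000";

static bool ChainIsValid(IReadOnlyList<(string PreviousHash, string EventHash)> events)
{
    var expectedPrev = EmptyHash;
    foreach (var (previousHash, eventHash) in events)
    {
        if (!string.Equals(previousHash, expectedPrev, StringComparison.Ordinal))
        {
            return false; // event does not reference its predecessor's hash
        }
        expectedPrev = eventHash; // next event must point here
    }
    return true;
}
```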


@@ -0,0 +1,15 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\..\StellaOps.Findings.Ledger.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="System.CommandLine" Version="2.0.0-beta4.22272.1" />
</ItemGroup>
</Project>


@@ -0,0 +1,502 @@
using System.CommandLine;
using System.Diagnostics;
using System.Diagnostics.Metrics;
using System.Runtime.CompilerServices;
using System.Text.Json;
using System.Text.Json.Nodes;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Findings.Ledger.Domain;
using StellaOps.Findings.Ledger.Hashing;
using StellaOps.Findings.Ledger.Infrastructure;
using StellaOps.Findings.Ledger.Infrastructure.Merkle;
using StellaOps.Findings.Ledger.Infrastructure.Postgres;
using StellaOps.Findings.Ledger.Infrastructure.Projection;
using StellaOps.Findings.Ledger.Options;
using StellaOps.Findings.Ledger.Observability;
using StellaOps.Findings.Ledger.Services;
// Command-line options
var fixturesOption = new Option<FileInfo[]>(
name: "--fixture",
description: "NDJSON fixtures containing canonical ledger envelopes (sequence-ordered)")
{
IsRequired = true
};
fixturesOption.AllowMultipleArgumentsPerToken = true;
var connectionOption = new Option<string>(
name: "--connection",
description: "PostgreSQL connection string for ledger DB")
{
IsRequired = true
};
var tenantOption = new Option<string>(
name: "--tenant",
getDefaultValue: () => "tenant-a",
description: "Tenant identifier for appended events");
var maxParallelOption = new Option<int>(
name: "--maxParallel",
getDefaultValue: () => 4,
description: "Maximum concurrent append operations");
var reportOption = new Option<FileInfo?>(
name: "--report",
description: "Path to write harness report JSON (with DSSE placeholder)");
var metricsOption = new Option<FileInfo?>(
name: "--metrics",
description: "Optional path to write metrics snapshot JSON");
var root = new RootCommand("Findings Ledger Replay Harness (LEDGER-29-008)");
root.AddOption(fixturesOption);
root.AddOption(connectionOption);
root.AddOption(tenantOption);
root.AddOption(maxParallelOption);
root.AddOption(reportOption);
root.AddOption(metricsOption);
root.SetHandler(async (FileInfo[] fixtures, string connection, string tenant, int maxParallel, FileInfo? reportFile, FileInfo? metricsFile) =>
{
await using var host = BuildHost(connection);
using var scope = host.Services.CreateScope();
var writeService = scope.ServiceProvider.GetRequiredService<ILedgerEventWriteService>();
var projectionWorker = scope.ServiceProvider.GetRequiredService<LedgerProjectionWorker>();
var anchorWorker = scope.ServiceProvider.GetRequiredService<LedgerMerkleAnchorWorker>();
var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("Harness");
var timeProvider = scope.ServiceProvider.GetRequiredService<TimeProvider>();
var cts = new CancellationTokenSource();
var projectionTask = projectionWorker.StartAsync(cts.Token);
var anchorTask = anchorWorker.StartAsync(cts.Token);
var (meterListener, metrics) = CreateMeterListener();
var sw = Stopwatch.StartNew();
long eventsWritten = 0;
await Parallel.ForEachAsync(fixtures, new ParallelOptions { MaxDegreeOfParallelism = maxParallel, CancellationToken = cts.Token }, async (file, token) =>
{
await foreach (var draft in ReadDraftsAsync(file, tenant, timeProvider, token))
{
var result = await writeService.AppendAsync(draft, token).ConfigureAwait(false);
if (result.Status is LedgerWriteStatus.ValidationFailed or LedgerWriteStatus.Conflict)
{
throw new InvalidOperationException($"Append failed for {draft.EventId}: {string.Join(",", result.Errors)} ({result.ConflictCode})");
}
Interlocked.Increment(ref eventsWritten);
if (eventsWritten % 50_000 == 0)
{
logger.LogInformation("Appended {Count} events...", eventsWritten);
}
}
}).ConfigureAwait(false);
// Wait for projector to catch up
await Task.Delay(TimeSpan.FromSeconds(2), cts.Token);
sw.Stop();
meterListener.RecordObservableInstruments();
var verification = await VerifyLedgerAsync(scope.ServiceProvider, tenant, eventsWritten, cts.Token).ConfigureAwait(false);
var writeLatencyP95Ms = Percentile(metrics.HistDouble("ledger_write_latency_seconds"), 95) * 1000;
var rebuildP95Ms = Percentile(metrics.HistDouble("ledger_projection_rebuild_seconds"), 95) * 1000;
var projectionLagSeconds = metrics.GaugeDouble("ledger_projection_lag_seconds").DefaultIfEmpty(0).Max();
var backlogEvents = metrics.GaugeLong("ledger_ingest_backlog_events").DefaultIfEmpty(0).Max();
var dbConnections = metrics.GaugeLong("ledger_db_connections_active").DefaultIfEmpty(0).Sum();
var report = new HarnessReport(
tenant,
fixtures.Select(f => f.FullName).ToArray(),
eventsWritten,
sw.Elapsed.TotalSeconds,
status: verification.Success ? "pass" : "fail",
WriteLatencyP95Ms: writeLatencyP95Ms,
ProjectionRebuildP95Ms: rebuildP95Ms,
ProjectionLagSecondsMax: projectionLagSeconds,
BacklogEventsMax: backlogEvents,
DbConnectionsObserved: dbConnections,
VerificationErrors: verification.Errors.ToArray());
var jsonOptions = new JsonSerializerOptions { WriteIndented = true };
var json = JsonSerializer.Serialize(report, jsonOptions);
Console.WriteLine(json);
if (reportFile is not null)
{
await File.WriteAllTextAsync(reportFile.FullName, json, cts.Token).ConfigureAwait(false);
await WriteDssePlaceholderAsync(reportFile.FullName, json, cts.Token).ConfigureAwait(false);
}
if (metricsFile is not null)
{
var snapshot = metrics.ToSnapshot();
var metricsJson = JsonSerializer.Serialize(snapshot, jsonOptions);
await File.WriteAllTextAsync(metricsFile.FullName, metricsJson, cts.Token).ConfigureAwait(false);
}
cts.Cancel();
await Task.WhenAll(projectionTask, anchorTask).WaitAsync(TimeSpan.FromSeconds(5));
}, fixturesOption, connectionOption, tenantOption, maxParallelOption, reportOption, metricsOption);
await root.InvokeAsync(args);
static async Task WriteDssePlaceholderAsync(string reportPath, string json, CancellationToken cancellationToken)
{
using var sha = System.Security.Cryptography.SHA256.Create();
var digest = sha.ComputeHash(System.Text.Encoding.UTF8.GetBytes(json));
var sig = new
{
payloadType = "application/vnd.stella-ledger-harness+json",
sha256 = Convert.ToHexString(digest).ToLowerInvariant(),
signedBy = "harness-local",
createdAt = DateTimeOffset.UtcNow
};
var sigJson = JsonSerializer.Serialize(sig, new JsonSerializerOptions { WriteIndented = true });
await File.WriteAllTextAsync(reportPath + ".sig", sigJson, cancellationToken).ConfigureAwait(false);
}
static (MeterListener Listener, MetricsBag Bag) CreateMeterListener()
{
var bag = new MetricsBag();
var listener = new MeterListener
{
InstrumentPublished = (instrument, meterListener) =>
{
if (instrument.Meter.Name == "StellaOps.Findings.Ledger")
{
meterListener.EnableMeasurementEvents(instrument);
}
}
};
listener.SetMeasurementEventCallback<double>((instrument, measurement, tags, _) =>
{
bag.Add(instrument, measurement, tags);
});
listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, _) =>
{
bag.Add(instrument, measurement, tags);
});
listener.Start();
return (listener, bag);
}
static IHost BuildHost(string connectionString)
{
return Host.CreateDefaultBuilder()
.ConfigureLogging(logging =>
{
logging.ClearProviders();
logging.AddSimpleConsole(options =>
{
options.SingleLine = true;
options.TimestampFormat = "HH:mm:ss ";
});
})
.ConfigureServices(services =>
{
services.Configure<LedgerServiceOptions>(opts =>
{
opts.Database.ConnectionString = connectionString;
});
services.AddSingleton<TimeProvider>(_ => TimeProvider.System);
services.AddSingleton<LedgerDataSource>();
services.AddSingleton<ILedgerEventRepository, PostgresLedgerEventRepository>();
services.AddSingleton<IFindingProjectionRepository, NoOpProjectionRepository>();
services.AddSingleton<ILedgerEventStream, PostgresLedgerEventStream>();
services.AddSingleton<IPolicyEvaluationService, NoOpPolicyEvaluationService>();
services.AddSingleton<IMerkleAnchorRepository, NoOpMerkleAnchorRepository>();
services.AddSingleton<LedgerAnchorQueue>();
services.AddSingleton<IMerkleAnchorScheduler, QueueMerkleAnchorScheduler>();
services.AddSingleton<LedgerMerkleAnchorWorker>();
services.AddSingleton<LedgerProjectionWorker>();
services.AddSingleton<ILedgerEventWriteService, LedgerEventWriteService>();
})
.Build();
}
static async IAsyncEnumerable<LedgerEventDraft> ReadDraftsAsync(FileInfo file, string tenant, TimeProvider timeProvider, [EnumeratorCancellation] CancellationToken cancellationToken)
{
await using var stream = file.OpenRead();
using var reader = new StreamReader(stream);
var recordedAtBase = timeProvider.GetUtcNow();
while (!reader.EndOfStream)
{
var line = await reader.ReadLineAsync().ConfigureAwait(false);
if (string.IsNullOrWhiteSpace(line))
{
continue;
}
var node = JsonNode.Parse(line)?.AsObject();
if (node is null)
{
continue;
}
yield return ToDraft(node, tenant, recordedAtBase);
cancellationToken.ThrowIfCancellationRequested();
}
}
static LedgerEventDraft ToDraft(JsonObject node, string defaultTenant, DateTimeOffset recordedAtBase)
{
string required(string name) => node[name]?.GetValue<string>() ?? throw new InvalidOperationException($"{name} missing");
var tenantId = node.TryGetPropertyValue("tenant", out var tenantNode)
? tenantNode!.GetValue<string>()
: defaultTenant;
var chainId = Guid.Parse(required("chain_id"));
var sequence = node["sequence_no"]?.GetValue<long>() ?? node["sequence"]?.GetValue<long>() ?? throw new InvalidOperationException("sequence_no missing");
var eventId = Guid.Parse(required("event_id"));
var eventType = required("event_type");
var policyVersion = required("policy_version");
var findingId = required("finding_id");
var artifactId = required("artifact_id");
Guid? sourceRunId = node.TryGetPropertyValue("source_run_id", out var sourceRunNode) && sourceRunNode is not null && !string.IsNullOrWhiteSpace(sourceRunNode.GetValue<string>())
? Guid.Parse(sourceRunNode.GetValue<string>())
: null;
var actorId = required("actor_id");
var actorType = required("actor_type");
var occurredAt = DateTimeOffset.Parse(required("occurred_at"));
var recordedAt = node.TryGetPropertyValue("recorded_at", out var recordedAtNode) && recordedAtNode is not null
? DateTimeOffset.Parse(recordedAtNode.GetValue<string>())
: recordedAtBase;
var payload = node.TryGetPropertyValue("payload", out var payloadNode) && payloadNode is JsonObject payloadObj
? payloadObj
: throw new InvalidOperationException("payload missing");
var canonicalEnvelope = LedgerCanonicalJsonSerializer.Canonicalize(payload);
var prev = node.TryGetPropertyValue("previous_hash", out var prevNode) ? prevNode?.GetValue<string>() : null;
return new LedgerEventDraft(
tenantId,
chainId,
sequence,
eventId,
eventType,
policyVersion,
findingId,
artifactId,
sourceRunId,
actorId,
actorType,
occurredAt,
recordedAt,
payload,
canonicalEnvelope,
prev);
}
static async Task<VerificationResult> VerifyLedgerAsync(IServiceProvider services, string tenant, long expectedEvents, CancellationToken cancellationToken)
{
var errors = new List<string>();
var dataSource = services.GetRequiredService<LedgerDataSource>();
await using var connection = await dataSource.OpenConnectionAsync(tenant, "verify", cancellationToken).ConfigureAwait(false);
// Count check
await using (var countCommand = new Npgsql.NpgsqlCommand("select count(*) from ledger_events where tenant_id = @tenant", connection))
{
countCommand.Parameters.AddWithValue("tenant", tenant);
var count = (long)(await countCommand.ExecuteScalarAsync(cancellationToken).ConfigureAwait(false) ?? 0L);
if (count < expectedEvents)
{
errors.Add($"event_count_mismatch:{count}/{expectedEvents}");
}
}
// Sequence and hash verification
const string query = """
select chain_id, sequence_no, event_id, event_body, event_hash, previous_hash, merkle_leaf_hash
from ledger_events
where tenant_id = @tenant
order by chain_id, sequence_no
""";
await using var command = new Npgsql.NpgsqlCommand(query, connection);
command.Parameters.AddWithValue("tenant", tenant);
await using var reader = await command.ExecuteReaderAsync(cancellationToken).ConfigureAwait(false);
Guid? currentChain = null;
long expectedSequence = 1;
string? prevHash = null;
while (await reader.ReadAsync(cancellationToken).ConfigureAwait(false))
{
var chainId = reader.GetGuid(0);
var sequence = reader.GetInt64(1);
var eventId = reader.GetGuid(2);
var eventBodyJson = reader.GetString(3);
var eventHash = reader.GetString(4);
var previousHash = reader.GetString(5);
var merkleLeafHash = reader.GetString(6);
if (currentChain != chainId)
{
currentChain = chainId;
expectedSequence = 1;
prevHash = LedgerEventConstants.EmptyHash;
}
if (sequence != expectedSequence)
{
errors.Add($"sequence_gap:{chainId}:{sequence}");
}
if (!string.Equals(previousHash, prevHash, StringComparison.Ordinal))
{
errors.Add($"previous_hash_mismatch:{chainId}:{sequence}");
}
var node = JsonNode.Parse(eventBodyJson)?.AsObject() ?? new JsonObject();
var canonical = LedgerCanonicalJsonSerializer.Canonicalize(node);
var hashResult = LedgerHashing.ComputeHashes(canonical, sequence);
if (!string.Equals(hashResult.EventHash, eventHash, StringComparison.Ordinal))
{
errors.Add($"event_hash_mismatch:{eventId}");
}
if (!string.Equals(hashResult.MerkleLeafHash, merkleLeafHash, StringComparison.Ordinal))
{
errors.Add($"merkle_leaf_mismatch:{eventId}");
}
prevHash = eventHash;
expectedSequence++;
}
if (errors.Count == 0)
{
// Additional check: projector caught up (no lag > 0)
var lagMax = LedgerMetricsSnapshot.LagMax;
if (lagMax > 0)
{
errors.Add($"projection_lag_remaining:{lagMax}");
}
}
return new VerificationResult(errors.Count == 0, errors);
}
static double Percentile(IEnumerable<double> values, double percentile)
{
var data = values.Where(v => !double.IsNaN(v)).OrderBy(v => v).ToArray();
if (data.Length == 0)
{
return 0;
}
var rank = (percentile / 100.0) * (data.Length - 1);
var lowerIndex = (int)Math.Floor(rank);
var upperIndex = (int)Math.Ceiling(rank);
if (lowerIndex == upperIndex)
{
return data[lowerIndex];
}
var fraction = rank - lowerIndex;
return data[lowerIndex] + (data[upperIndex] - data[lowerIndex]) * fraction;
}
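A worked example of the linear-interpolation percentile above, useful when sanity-checking harness reports:

```csharp
// data = [1, 2, 3, 4], p95:
// rank = 0.95 * (4 - 1) = 2.85, lowerIndex = 2, upperIndex = 3, fraction = 0.85
// result = data[2] + (data[3] - data[2]) * 0.85 = 3 + 1 * 0.85
var p95 = Percentile(new double[] { 1, 2, 3, 4 }, 95);   // ≈ 3.85
```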
internal sealed record HarnessReport(
string Tenant,
IReadOnlyList<string> Fixtures,
long EventsWritten,
double DurationSeconds,
string Status,
double WriteLatencyP95Ms,
double ProjectionRebuildP95Ms,
double ProjectionLagSecondsMax,
double BacklogEventsMax,
long DbConnectionsObserved,
IReadOnlyList<string> VerificationErrors);
internal sealed record VerificationResult(bool Success, IReadOnlyList<string> Errors);
internal sealed class MetricsBag
{
private readonly List<(string Name, double Value)> doubles = new();
private readonly List<(string Name, long Value)> longs = new();
public void Add(Instrument instrument, double value, ReadOnlySpan<KeyValuePair<string, object?>> _)
=> doubles.Add((instrument.Name, value));
public void Add(Instrument instrument, long value, ReadOnlySpan<KeyValuePair<string, object?>> _)
=> longs.Add((instrument.Name, value));
public IEnumerable<double> HistDouble(string name) => doubles.Where(d => d.Name == name).Select(d => d.Value);
public IEnumerable<double> GaugeDouble(string name) => doubles.Where(d => d.Name == name).Select(d => d.Value);
public IEnumerable<long> GaugeLong(string name) => longs.Where(l => l.Name == name).Select(l => l.Value);
public object ToSnapshot() => new
{
doubles = doubles.GroupBy(x => x.Name).ToDictionary(g => g.Key, g => g.Select(v => v.Value).ToArray()),
longs = longs.GroupBy(x => x.Name).ToDictionary(g => g.Key, g => g.Select(v => v.Value).ToArray())
};
}
// Harness lightweight no-op implementations for projection/merkle to keep replay fast
internal sealed class NoOpPolicyEvaluationService : IPolicyEvaluationService
{
public Task<PolicyEvaluationResult> EvaluateAsync(LedgerEventRecord record, FindingProjection? current, CancellationToken cancellationToken)
{
return Task.FromResult(new PolicyEvaluationResult("noop", record.OccurredAt, record.RecordedAt, current?.Status ?? "new"));
}
}
internal sealed class NoOpProjectionRepository : IFindingProjectionRepository
{
public Task<FindingProjection?> GetAsync(string tenantId, string findingId, string policyVersion, CancellationToken cancellationToken) =>
Task.FromResult<FindingProjection?>(null);
public Task InsertActionAsync(FindingAction action, CancellationToken cancellationToken) => Task.CompletedTask;
public Task InsertHistoryAsync(FindingHistory history, CancellationToken cancellationToken) => Task.CompletedTask;
public Task SaveCheckpointAsync(ProjectionCheckpoint checkpoint, CancellationToken cancellationToken) => Task.CompletedTask;
public Task<ProjectionCheckpoint> GetCheckpointAsync(CancellationToken cancellationToken) =>
Task.FromResult(new ProjectionCheckpoint(DateTimeOffset.MinValue, Guid.Empty, DateTimeOffset.MinValue));
public Task UpsertAsync(FindingProjection projection, CancellationToken cancellationToken) => Task.CompletedTask;
public Task EnsureIndexesAsync(CancellationToken cancellationToken) => Task.CompletedTask;
}
internal sealed class NoOpMerkleAnchorRepository : IMerkleAnchorRepository
{
public Task InsertAsync(string tenantId, Guid anchorId, DateTimeOffset windowStart, DateTimeOffset windowEnd, long sequenceStart, long sequenceEnd, string rootHash, long leafCount, DateTime anchoredAt, string? anchorReference, CancellationToken cancellationToken)
=> Task.CompletedTask;
public Task<MerkleAnchor?> GetLatestAsync(string tenantId, CancellationToken cancellationToken) =>
Task.FromResult<MerkleAnchor?>(null);
}
internal sealed class QueueMerkleAnchorScheduler : IMerkleAnchorScheduler
{
private readonly LedgerAnchorQueue _queue;
public QueueMerkleAnchorScheduler(LedgerAnchorQueue queue)
{
_queue = queue ?? throw new ArgumentNullException(nameof(queue));
}
public Task EnqueueAsync(LedgerEventRecord record, CancellationToken cancellationToken)
=> _queue.EnqueueAsync(record, cancellationToken).AsTask();
}
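The `Percentile` helper above interpolates linearly between the two closest ranks after dropping NaNs. A minimal Python sketch of the same formula (the function name is illustrative):

```python
import math

def percentile(values, pct):
    # Mirror of the C# helper: drop NaNs, sort, interpolate between ranks.
    data = sorted(v for v in values if not math.isnan(v))
    if not data:
        return 0.0
    rank = (pct / 100.0) * (len(data) - 1)
    lower = math.floor(rank)
    upper = math.ceil(rank)
    if lower == upper:
        return data[lower]
    fraction = rank - lower
    return data[lower] + (data[upper] - data[lower]) * fraction

print(percentile([1.0, 2.0, 3.0, 4.0], 50))  # → 2.5
```

For an even-length list the 50th percentile lands between the two middle values, which is why the result is 2.5 rather than either sample.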


@@ -0,0 +1,43 @@
import json
import sys
from hashlib import sha256
EMPTY_PREV = "0" * 64
def canonical(obj):
return json.dumps(obj, separators=(",", ":"), sort_keys=True)
def hash_event(payload, sequence_no):
canonical_json = canonical(payload).encode()
event_hash = sha256(canonical_json + str(sequence_no).encode()).hexdigest()
merkle_leaf = sha256(event_hash.encode()).hexdigest()
return event_hash, merkle_leaf
def main(path):
out_lines = []
last_hash = {}
with open(path, "r") as f:
events = [json.loads(line) for line in f if line.strip()]
events.sort(key=lambda e: (e["chain_id"], e["sequence_no"]))
for e in events:
prev = e.get("previous_hash") or last_hash.get(e["chain_id"], EMPTY_PREV)
payload = e.get("payload") or e
event_hash, leaf = hash_event(payload, e["sequence_no"])
e["event_hash"] = event_hash
e["merkle_leaf_hash"] = leaf
e["previous_hash"] = prev
last_hash[e["chain_id"]] = event_hash
out_lines.append(json.dumps(e))
with open(path, "w") as f:
for line in out_lines:
f.write(line + "\n")
if __name__ == "__main__":
if len(sys.argv) != 2:
print("usage: compute_hashes.py <ndjson>")
sys.exit(1)
main(sys.argv[1])
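The hashes written by this script chain together per `chain_id`: each event's `previous_hash` must equal the prior event's `event_hash`, starting from the empty hash. A sketch of that invariant, mirroring the scheme above (`verify_chain` is an illustrative helper, not part of the script):

```python
import json
from hashlib import sha256

EMPTY_PREV = "0" * 64

def hash_event(payload, sequence_no):
    # Same scheme as compute_hashes.py: hash canonical JSON + sequence number,
    # then derive the Merkle leaf from the event hash.
    canonical = json.dumps(payload, separators=(",", ":"), sort_keys=True).encode()
    event_hash = sha256(canonical + str(sequence_no).encode()).hexdigest()
    merkle_leaf = sha256(event_hash.encode()).hexdigest()
    return event_hash, merkle_leaf

def verify_chain(events):
    # Events ordered by sequence_no; previous_hash of event N must equal
    # event_hash of event N-1 (or the empty hash for the first event).
    prev = EMPTY_PREV
    for e in events:
        if e["previous_hash"] != prev:
            return False
        computed, _ = hash_event(e["payload"], e["sequence_no"])
        if computed != e["event_hash"]:
            return False
        prev = computed
    return True

h1, _ = hash_event({"a": 1}, 1)
h2, _ = hash_event({"a": 2}, 2)
chain = [
    {"sequence_no": 1, "payload": {"a": 1}, "previous_hash": EMPTY_PREV, "event_hash": h1},
    {"sequence_no": 2, "payload": {"a": 2}, "previous_hash": h1, "event_hash": h2},
]
print(verify_chain(chain))  # → True
```

Tampering with any payload or link breaks every check downstream of it, which is what the harness's `sequence_gap` / `previous_hash_mismatch` errors surface.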


@@ -0,0 +1,37 @@
using System.Text.Json;
using LedgerReplayHarness;
using FluentAssertions;
using Xunit;
namespace StellaOps.Findings.Ledger.Tests;
public class HarnessRunnerTests
{
[Fact]
public async Task HarnessRunner_WritesReportAndValidatesHashes()
{
var fixturePath = Path.Combine(AppContext.BaseDirectory, "fixtures", "sample.ndjson");
var tempReport = Path.GetTempFileName();
try
{
var runner = new HarnessRunner(new InMemoryLedgerClient());
var exitCode = await runner.RunAsync(new[] { fixturePath }, "tenant-test", tempReport, CancellationToken.None);
exitCode.Should().Be(0);
var json = await File.ReadAllTextAsync(tempReport);
using var doc = JsonDocument.Parse(json);
doc.RootElement.GetProperty("eventsWritten").GetInt64().Should().BeGreaterThan(0);
doc.RootElement.GetProperty("status").GetString().Should().Be("pass");
doc.RootElement.GetProperty("tenant").GetString().Should().Be("tenant-test");
doc.RootElement.GetProperty("hashSummary").GetProperty("uniqueEventHashes").GetInt32().Should().Be(1);
doc.RootElement.GetProperty("hashSummary").GetProperty("uniqueMerkleLeaves").GetInt32().Should().Be(1);
}
finally
{
if (File.Exists(tempReport))
{
File.Delete(tempReport);
}
}
}
}


@@ -0,0 +1,223 @@
using System.Diagnostics.Metrics;
using System.Linq;
using FluentAssertions;
using StellaOps.Findings.Ledger.Observability;
using Xunit;
namespace StellaOps.Findings.Ledger.Tests;
public class LedgerMetricsTests
{
[Fact]
public void ProjectionLagGauge_RecordsLatestPerTenant()
{
using var listener = CreateListener();
var measurements = new List<Measurement<double>>();
listener.SetMeasurementEventCallback<double>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_projection_lag_seconds")
{
measurements.Add(new Measurement<double>(value, tags));
}
});
LedgerMetrics.RecordProjectionLag(TimeSpan.FromSeconds(42), "tenant-a");
listener.RecordObservableInstruments();
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().BeApproximately(42, precision: 0.001);
measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value)
.Should().Contain(new KeyValuePair<string, object?>("tenant", "tenant-a"));
}
[Fact]
public void MerkleAnchorDuration_EmitsHistogramMeasurement()
{
using var listener = CreateListener();
var measurements = new List<Measurement<double>>();
listener.SetMeasurementEventCallback<double>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_merkle_anchor_duration_seconds")
{
measurements.Add(new Measurement<double>(value, tags));
}
});
LedgerMetrics.RecordMerkleAnchorDuration(TimeSpan.FromSeconds(1.5), "tenant-b");
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().BeApproximately(1.5, precision: 0.001);
measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value)
.Should().Contain(new KeyValuePair<string, object?>("tenant", "tenant-b"));
}
[Fact]
public void MerkleAnchorFailure_IncrementsCounter()
{
using var listener = CreateListener();
var measurements = new List<Measurement<long>>();
listener.SetMeasurementEventCallback<long>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_merkle_anchor_failures_total")
{
measurements.Add(new Measurement<long>(value, tags));
}
});
LedgerMetrics.RecordMerkleAnchorFailure("tenant-c", "persist_failure");
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().Be(1);
var tagValues = measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value);
tagValues.Should().Contain(new KeyValuePair<string, object?>("tenant", "tenant-c"));
tagValues.Should().Contain(new KeyValuePair<string, object?>("reason", "persist_failure"));
}
[Fact]
public void AttachmentFailure_IncrementsCounter()
{
using var listener = CreateListener();
var measurements = new List<Measurement<long>>();
listener.SetMeasurementEventCallback<long>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_attachments_encryption_failures_total")
{
measurements.Add(new Measurement<long>(value, tags));
}
});
LedgerMetrics.RecordAttachmentFailure("tenant-d", "encrypt");
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().Be(1);
var tagValues = measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value);
tagValues.Should().Contain(new KeyValuePair<string, object?>("tenant", "tenant-d"));
tagValues.Should().Contain(new KeyValuePair<string, object?>("stage", "encrypt"));
}
[Fact]
public void BacklogGauge_ReflectsOutstandingQueue()
{
using var listener = CreateListener();
var measurements = new List<Measurement<long>>();
// Reset
LedgerMetrics.DecrementBacklog("tenant-q");
LedgerMetrics.IncrementBacklog("tenant-q");
LedgerMetrics.IncrementBacklog("tenant-q");
LedgerMetrics.DecrementBacklog("tenant-q");
listener.SetMeasurementEventCallback<long>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_ingest_backlog_events")
{
measurements.Add(new Measurement<long>(value, tags));
}
});
listener.RecordObservableInstruments();
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().Be(1);
measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value)
.Should().Contain(new KeyValuePair<string, object?>("tenant", "tenant-q"));
}
[Fact]
public void ProjectionRebuildHistogram_RecordsScenarioTags()
{
using var listener = CreateListener();
var measurements = new List<Measurement<double>>();
listener.SetMeasurementEventCallback<double>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_projection_rebuild_seconds")
{
measurements.Add(new Measurement<double>(value, tags));
}
});
LedgerMetrics.RecordProjectionRebuild(TimeSpan.FromSeconds(3.2), "tenant-r", "replay");
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().BeApproximately(3.2, 0.001);
var tagValues = measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value);
tagValues.Should().Contain(new KeyValuePair<string, object?>("tenant", "tenant-r"));
tagValues.Should().Contain(new KeyValuePair<string, object?>("scenario", "replay"));
}
[Fact]
public void DbConnectionsGauge_TracksRoleCounts()
{
using var listener = CreateListener();
var measurements = new List<Measurement<long>>();
// Reset
LedgerMetrics.DecrementDbConnection("writer");
LedgerMetrics.IncrementDbConnection("writer");
listener.SetMeasurementEventCallback<long>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_db_connections_active")
{
measurements.Add(new Measurement<long>(value, tags));
}
});
listener.RecordObservableInstruments();
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().Be(1);
measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value)
.Should().Contain(new KeyValuePair<string, object?>("role", "writer"));
LedgerMetrics.DecrementDbConnection("writer");
}
[Fact]
public void VersionInfoGauge_EmitsConstantOne()
{
using var listener = CreateListener();
var measurements = new List<Measurement<long>>();
listener.SetMeasurementEventCallback<long>((instrument, value, tags, state) =>
{
if (instrument.Name == "ledger_app_version_info")
{
measurements.Add(new Measurement<long>(value, tags));
}
});
listener.RecordObservableInstruments();
var measurement = measurements.Should().ContainSingle().Subject;
measurement.Value.Should().Be(1);
var tagValues = measurement.Tags.ToArray().ToDictionary(kvp => kvp.Key, kvp => kvp.Value);
tagValues.Should().ContainKey("version");
tagValues.Should().ContainKey("git_sha");
}
private static MeterListener CreateListener()
{
var listener = new MeterListener
{
InstrumentPublished = (instrument, l) =>
{
if (instrument.Meter.Name == "StellaOps.Findings.Ledger")
{
l.EnableMeasurementEvents(instrument);
}
}
};
listener.Start();
return listener;
}
}


@@ -0,0 +1,148 @@
using System.Collections.Concurrent;
using System.Diagnostics;
using System.Text.Json;
using System.Text.Json.Nodes;
using StellaOps.Findings.Ledger.Domain;
using StellaOps.Findings.Ledger.Hashing;
namespace LedgerReplayHarness;
public sealed class HarnessRunner
{
private readonly ILedgerClient _client;
private readonly int _maxParallel;
public HarnessRunner(ILedgerClient client, int maxParallel = 4)
{
_client = client ?? throw new ArgumentNullException(nameof(client));
_maxParallel = maxParallel <= 0 ? 1 : maxParallel;
}
public async Task<int> RunAsync(IEnumerable<string> fixtures, string tenant, string reportPath, CancellationToken cancellationToken)
{
if (fixtures is null || !fixtures.Any())
{
throw new ArgumentException("At least one fixture is required.", nameof(fixtures));
}
var stats = new HarnessStats();
tenant = string.IsNullOrWhiteSpace(tenant) ? "default" : tenant;
reportPath = string.IsNullOrWhiteSpace(reportPath) ? "harness-report.json" : reportPath;
var eventCount = 0L;
var hashesValid = true;
DateTimeOffset? earliest = null;
DateTimeOffset? latest = null;
var leafHashes = new List<string>();
string? expectedMerkleRoot = null;
var latencies = new ConcurrentBag<double>();
var swTotal = Stopwatch.StartNew();
var throttler = new TaskThrottler(_maxParallel);
foreach (var fixture in fixtures)
{
await foreach (var line in ReadLinesAsync(fixture, cancellationToken))
{
if (string.IsNullOrWhiteSpace(line)) continue;
var node = JsonNode.Parse(line)?.AsObject();
if (node is null) continue;
eventCount++;
var recordedAt = node["recorded_at"]?.GetValue<DateTimeOffset>() ?? DateTimeOffset.UtcNow;
earliest = earliest is null ? recordedAt : DateTimeOffset.Compare(recordedAt, earliest.Value) < 0 ? recordedAt : earliest;
latest = latest is null
? recordedAt
: DateTimeOffset.Compare(recordedAt, latest.Value) > 0 ? recordedAt : latest;
if (node["canonical_envelope"] is JsonObject envelope && node["sequence_no"] is not null)
{
var seq = node["sequence_no"]!.GetValue<long>();
var computed = LedgerHashing.ComputeHashes(envelope, seq);
var expected = node["event_hash"]?.GetValue<string>();
if (!string.IsNullOrEmpty(expected) && !string.Equals(expected, computed.EventHash, StringComparison.Ordinal))
{
hashesValid = false;
}
stats.UpdateHashes(computed.EventHash, computed.MerkleLeafHash);
leafHashes.Add(computed.MerkleLeafHash);
expectedMerkleRoot ??= node["merkle_root"]?.GetValue<string>();
// enqueue for concurrent append
var record = new LedgerEventRecord(
tenant,
envelope["chain_id"]?.GetValue<Guid>() ?? Guid.Empty,
seq,
envelope["event_id"]?.GetValue<Guid>() ?? Guid.Empty,
envelope["event_type"]?.GetValue<string>() ?? string.Empty,
envelope["policy_version"]?.GetValue<string>() ?? string.Empty,
envelope["finding_id"]?.GetValue<string>() ?? string.Empty,
envelope["artifact_id"]?.GetValue<string>() ?? string.Empty,
envelope["source_run_id"]?.GetValue<Guid?>(),
envelope["actor_id"]?.GetValue<string>() ?? "system",
envelope["actor_type"]?.GetValue<string>() ?? "system",
envelope["occurred_at"]?.GetValue<DateTimeOffset>() ?? recordedAt,
recordedAt,
envelope,
computed.EventHash,
envelope["previous_hash"]?.GetValue<string>() ?? string.Empty,
computed.MerkleLeafHash,
computed.CanonicalJson);
// fire-and-track latency
await throttler.RunAsync(async () =>
{
var sw = Stopwatch.StartNew();
await _client.AppendAsync(record, cancellationToken).ConfigureAwait(false);
sw.Stop();
latencies.Add(sw.Elapsed.TotalMilliseconds);
}, cancellationToken).ConfigureAwait(false);
}
}
}
await throttler.DrainAsync(cancellationToken).ConfigureAwait(false);
swTotal.Stop();
var latencyArray = latencies.ToArray();
Array.Sort(latencyArray);
double p95 = latencyArray.Length == 0 ? 0 : latencyArray[(int)Math.Ceiling(latencyArray.Length * 0.95) - 1];
string? computedRoot = leafHashes.Count == 0 ? null : MerkleCalculator.ComputeRoot(leafHashes);
var merkleOk = expectedMerkleRoot is null || string.Equals(expectedMerkleRoot, computedRoot, StringComparison.OrdinalIgnoreCase);
var report = new
{
tenant,
fixtures = fixtures.ToArray(),
eventsWritten = eventCount,
durationSeconds = Math.Max(swTotal.Elapsed.TotalSeconds, (latest - earliest)?.TotalSeconds ?? 0),
throughputEps = swTotal.Elapsed.TotalSeconds > 0 ? eventCount / swTotal.Elapsed.TotalSeconds : 0,
latencyP95Ms = p95,
projectionLagMaxSeconds = 0,
cpuPercentMax = 0,
memoryMbMax = 0,
status = hashesValid && merkleOk ? "pass" : "fail",
timestamp = DateTimeOffset.UtcNow.ToString("O"),
hashSummary = stats.ToReport(),
merkleRoot = computedRoot,
merkleExpected = expectedMerkleRoot
};
var json = JsonSerializer.Serialize(report, new JsonSerializerOptions { WriteIndented = true });
await File.WriteAllTextAsync(reportPath, json);
return hashesValid && merkleOk ? 0 : 1;
}
private static async IAsyncEnumerable<string> ReadLinesAsync(string path, [System.Runtime.CompilerServices.EnumeratorCancellation] CancellationToken cancellationToken)
{
await using var stream = File.OpenRead(path);
using var reader = new StreamReader(stream);
string? line;
while (!reader.EndOfStream && !cancellationToken.IsCancellationRequested && (line = await reader.ReadLineAsync()) is not null)
{
yield return line;
}
}
}
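Note that `HarnessRunner` reports p95 with a nearest-rank index over the sorted latencies (`ceil(n * 0.95) - 1`), not linear interpolation. Sketched in Python (function name is illustrative):

```python
import math

def p95_nearest_rank(latencies_ms):
    data = sorted(latencies_ms)
    if not data:
        return 0.0
    # Nearest-rank: smallest sample such that at least 95% of samples are <= it.
    index = math.ceil(len(data) * 0.95) - 1
    return data[index]

print(p95_nearest_rank(list(range(1, 101))))  # → 95
```

Nearest-rank always returns an observed sample, so percentile figures in the report correspond to real append latencies.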


@@ -0,0 +1,26 @@
namespace LedgerReplayHarness;
internal sealed class HarnessStats
{
private readonly HashSet<string> _eventHashes = new(StringComparer.OrdinalIgnoreCase);
private readonly HashSet<string> _leafHashes = new(StringComparer.OrdinalIgnoreCase);
public void UpdateHashes(string eventHash, string leafHash)
{
if (!string.IsNullOrWhiteSpace(eventHash))
{
_eventHashes.Add(eventHash);
}
if (!string.IsNullOrWhiteSpace(leafHash))
{
_leafHashes.Add(leafHash);
}
}
public object ToReport() => new
{
uniqueEventHashes = _eventHashes.Count,
uniqueMerkleLeaves = _leafHashes.Count
};
}


@@ -0,0 +1,8 @@
using StellaOps.Findings.Ledger.Domain;
namespace LedgerReplayHarness;
public interface ILedgerClient
{
Task AppendAsync(LedgerEventRecord record, CancellationToken cancellationToken);
}


@@ -0,0 +1,15 @@
using System.Collections.Concurrent;
using StellaOps.Findings.Ledger.Domain;
namespace LedgerReplayHarness;
public sealed class InMemoryLedgerClient : ILedgerClient
{
private readonly ConcurrentDictionary<(string Tenant, Guid EventId), LedgerEventRecord> _store = new();
public Task AppendAsync(LedgerEventRecord record, CancellationToken cancellationToken)
{
_store.TryAdd((record.TenantId, record.EventId), record);
return Task.CompletedTask;
}
}


@@ -0,0 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\\..\\StellaOps.Findings.Ledger\\StellaOps.Findings.Ledger.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="System.CommandLine" Version="2.0.0-beta4.22272.1" />
</ItemGroup>
</Project>


@@ -0,0 +1,41 @@
using System.Security.Cryptography;
using System.Text;
namespace LedgerReplayHarness;
internal static class MerkleCalculator
{
public static string ComputeRoot(IReadOnlyList<string> leafHashes)
{
if (leafHashes is null || leafHashes.Count == 0)
{
throw new ArgumentException("At least one leaf hash is required.", nameof(leafHashes));
}
var level = leafHashes.Select(Normalize).ToList();
while (level.Count > 1)
{
var next = new List<string>((level.Count + 1) / 2);
for (int i = 0; i < level.Count; i += 2)
{
var left = level[i];
var right = i + 1 < level.Count ? level[i + 1] : level[i];
next.Add(HashPair(left, right));
}
level = next;
}
return level[0];
}
private static string Normalize(string hex)
=> hex?.Trim().ToLowerInvariant() ?? string.Empty;
private static string HashPair(string left, string right)
{
using var sha = SHA256.Create();
var data = Encoding.UTF8.GetBytes(left + right);
var hash = sha.ComputeHash(data);
return Convert.ToHexString(hash).ToLowerInvariant();
}
}
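`MerkleCalculator` pairs leaves left-to-right and duplicates the last leaf when a level has an odd count. An equivalent Python sketch of that scheme (hex normalization as in the class above):

```python
from hashlib import sha256

def hash_pair(left, right):
    # Concatenate the lowercase hex strings and hash their UTF-8 bytes,
    # matching MerkleCalculator.HashPair.
    return sha256((left + right).encode()).hexdigest()

def merkle_root(leaves):
    if not leaves:
        raise ValueError("at least one leaf hash is required")
    level = [leaf.strip().lower() for leaf in leaves]
    while len(level) > 1:
        nxt = []
        for i in range(0, len(level), 2):
            left = level[i]
            right = level[i + 1] if i + 1 < len(level) else level[i]  # duplicate last leaf
            nxt.append(hash_pair(left, right))
        level = nxt
    return level[0]

a, b, c = (sha256(x).hexdigest() for x in (b"a", b"b", b"c"))
root = merkle_root([a, b, c])
# For three leaves the root is H(H(a+b) + H(c+c)).
assert root == hash_pair(hash_pair(a, b), hash_pair(c, c))
```

Duplicating the last leaf keeps the tree complete, so the root is defined for any leaf count and a single-leaf tree's root is the leaf itself.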


@@ -0,0 +1,22 @@
using System.CommandLine;
using LedgerReplayHarness;
var fixtureOption = new Option<string[]>("--fixture", "NDJSON fixture path(s)") { IsRequired = true, AllowMultipleArgumentsPerToken = true };
var tenantOption = new Option<string>("--tenant", () => "default", "Tenant identifier");
var reportOption = new Option<string>("--report", () => "harness-report.json", "Path to write JSON report");
var parallelOption = new Option<int>("--maxParallel", () => 4, "Maximum parallelism when sending events");
var root = new RootCommand("Findings Ledger replay & determinism harness");
root.AddOption(fixtureOption);
root.AddOption(tenantOption);
root.AddOption(reportOption);
root.AddOption(parallelOption);
root.SetHandler(async (fixtures, tenant, report, maxParallel) =>
{
var runner = new HarnessRunner(new InMemoryLedgerClient(), maxParallel);
var exitCode = await runner.RunAsync(fixtures, tenant, report, CancellationToken.None);
Environment.Exit(exitCode);
}, fixtureOption, tenantOption, reportOption, parallelOption);
return await root.InvokeAsync(args);


@@ -0,0 +1,36 @@
namespace LedgerReplayHarness;
internal sealed class TaskThrottler
{
private readonly SemaphoreSlim _semaphore;
private readonly List<Task> _tasks = new();
public TaskThrottler(int maxDegreeOfParallelism)
{
_semaphore = new SemaphoreSlim(maxDegreeOfParallelism > 0 ? maxDegreeOfParallelism : 1);
}
public async Task RunAsync(Func<Task> taskFactory, CancellationToken cancellationToken)
{
await _semaphore.WaitAsync(cancellationToken).ConfigureAwait(false);
// Do not pass the cancellation token to Task.Run: if the token were already
// cancelled, the body would never run and the acquired semaphore slot would leak.
var task = Task.Run(async () =>
{
try
{
await taskFactory().ConfigureAwait(false);
}
finally
{
_semaphore.Release();
}
});
lock (_tasks) _tasks.Add(task);
}
public async Task DrainAsync(CancellationToken cancellationToken)
{
Task[] pending;
lock (_tasks) pending = _tasks.ToArray();
await Task.WhenAll(pending).WaitAsync(cancellationToken).ConfigureAwait(false);
}
}
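`TaskThrottler` is a semaphore-bounded fan-out with a drain step: acquire before scheduling, release in `finally`, await everything on drain. The same shape in Python's asyncio (names are illustrative):

```python
import asyncio

class Throttler:
    def __init__(self, max_parallel):
        self._semaphore = asyncio.Semaphore(max(1, max_parallel))
        self._tasks = []

    async def run(self, factory):
        await self._semaphore.acquire()  # block until a slot frees up

        async def wrapped():
            try:
                await factory()
            finally:
                self._semaphore.release()  # always return the slot

        self._tasks.append(asyncio.ensure_future(wrapped()))

    async def drain(self):
        await asyncio.gather(*self._tasks)  # wait for all queued work

async def main():
    throttler = Throttler(2)
    active, peak = 0, 0

    async def work():
        nonlocal active, peak
        active += 1
        peak = max(peak, active)
        await asyncio.sleep(0.01)
        active -= 1

    for _ in range(8):
        await throttler.run(work)
    await throttler.drain()
    return peak

print(asyncio.run(main()))  # peak concurrency never exceeds the limit of 2
```

Acquiring in the caller (rather than inside the scheduled task) is what bounds in-flight work: the producer loop itself stalls once the limit is reached.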


@@ -0,0 +1,77 @@
using System.Text.Json;
using Xunit;
namespace StellaOps.Notifier.Tests;
public sealed class AttestationTemplateCoverageTests
{
private static readonly string RepoRoot = LocateRepoRoot();
[Fact]
public void Attestation_templates_cover_required_channels()
{
var directory = Path.Combine(RepoRoot, "offline", "notifier", "templates", "attestation");
Assert.True(Directory.Exists(directory), $"Expected template directory at {directory}");
var templates = Directory
.GetFiles(directory, "*.template.json")
.Select(path => new
{
Path = path,
Document = JsonDocument.Parse(File.ReadAllText(path)).RootElement
})
.ToList();
var required = new Dictionary<string, string[]>
{
["tmpl-attest-verify-fail"] = new[] { "slack", "email", "webhook" },
["tmpl-attest-expiry-warning"] = new[] { "email", "slack" },
["tmpl-attest-key-rotation"] = new[] { "email", "webhook" },
["tmpl-attest-transparency-anomaly"] = new[] { "slack", "webhook" }
};
foreach (var pair in required)
{
var matches = templates.Where(t => t.Document.GetProperty("key").GetString() == pair.Key);
var channels = matches
.Select(t => t.Document.GetProperty("channelType").GetString() ?? string.Empty)
.ToHashSet(StringComparer.OrdinalIgnoreCase);
var missing = pair.Value.Where(requiredChannel => !channels.Contains(requiredChannel)).ToArray();
Assert.True(missing.Length == 0, $"{pair.Key} missing channels: {string.Join(", ", missing)}");
}
}
[Fact]
public void Attestation_templates_include_schema_and_locale_metadata()
{
var directory = Path.Combine(RepoRoot, "offline", "notifier", "templates", "attestation");
Assert.True(Directory.Exists(directory), $"Expected template directory at {directory}");
foreach (var path in Directory.GetFiles(directory, "*.template.json"))
{
var document = JsonDocument.Parse(File.ReadAllText(path)).RootElement;
Assert.True(document.TryGetProperty("schemaVersion", out var schemaVersion) && !string.IsNullOrWhiteSpace(schemaVersion.GetString()), $"schemaVersion missing for {Path.GetFileName(path)}");
Assert.True(document.TryGetProperty("locale", out var locale) && !string.IsNullOrWhiteSpace(locale.GetString()), $"locale missing for {Path.GetFileName(path)}");
Assert.True(document.TryGetProperty("key", out var key) && !string.IsNullOrWhiteSpace(key.GetString()), $"key missing for {Path.GetFileName(path)}");
}
}
private static string LocateRepoRoot()
{
var directory = AppContext.BaseDirectory;
while (directory != null)
{
var candidate = Path.Combine(directory, "offline", "notifier", "templates", "attestation");
if (Directory.Exists(candidate))
{
return directory;
}
directory = Directory.GetParent(directory)?.FullName;
}
throw new InvalidOperationException("Unable to locate repository root containing offline/notifier/templates/attestation.");
}
}


@@ -0,0 +1,66 @@
using System.Text.Json;
using Xunit;
namespace StellaOps.Notifier.Tests;
public sealed class DeprecationTemplateTests
{
[Fact]
public void Deprecation_templates_cover_slack_and_email()
{
var directory = LocateOfflineDeprecationDir();
Assert.True(Directory.Exists(directory), $"Expected template directory at {directory}");
var templates = Directory
.GetFiles(directory, "*.template.json")
.Select(path => new
{
Path = path,
Document = JsonDocument.Parse(File.ReadAllText(path)).RootElement
})
.ToList();
var channels = templates
.Where(t => t.Document.GetProperty("key").GetString() == "tmpl-api-deprecation")
.Select(t => t.Document.GetProperty("channelType").GetString() ?? string.Empty)
.ToHashSet(StringComparer.OrdinalIgnoreCase);
Assert.Contains("slack", channels);
Assert.Contains("email", channels);
}
[Fact]
public void Deprecation_templates_require_core_metadata()
{
var directory = LocateOfflineDeprecationDir();
Assert.True(Directory.Exists(directory), $"Expected template directory at {directory}");
foreach (var path in Directory.GetFiles(directory, "*.template.json"))
{
var document = JsonDocument.Parse(File.ReadAllText(path)).RootElement;
Assert.True(document.TryGetProperty("metadata", out var meta), $"metadata missing for {Path.GetFileName(path)}");
// Ensure documented metadata keys are present for offline baseline.
Assert.True(meta.TryGetProperty("version", out _), $"metadata.version missing for {Path.GetFileName(path)}");
Assert.True(meta.TryGetProperty("author", out _), $"metadata.author missing for {Path.GetFileName(path)}");
}
}
private static string LocateOfflineDeprecationDir()
{
var directory = AppContext.BaseDirectory;
while (directory != null)
{
var candidate = Path.Combine(directory, "offline", "notifier", "templates", "deprecation");
if (Directory.Exists(candidate))
{
return candidate;
}
directory = Directory.GetParent(directory)?.FullName;
}
throw new InvalidOperationException("Unable to locate offline/notifier/templates/deprecation directory.");
}
}


@@ -0,0 +1,87 @@
using System.Net;
using System.Text;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Notifier.Tests.Support;
using StellaOps.Notifier.WebService;
using StellaOps.Notify.Storage.Mongo.Repositories;
using Xunit;
namespace StellaOps.Notifier.Tests;
public sealed class OpenApiEndpointTests : IClassFixture<WebApplicationFactory<WebServiceAssemblyMarker>>
{
private readonly HttpClient _client;
private readonly InMemoryPackApprovalRepository _packRepo = new();
private readonly InMemoryLockRepository _lockRepo = new();
private readonly InMemoryAuditRepository _auditRepo = new();
public OpenApiEndpointTests(WebApplicationFactory<WebServiceAssemblyMarker> factory)
{
_client = factory
.WithWebHostBuilder(builder =>
{
builder.ConfigureServices(services =>
{
services.AddSingleton<INotifyPackApprovalRepository>(_packRepo);
services.AddSingleton<INotifyLockRepository>(_lockRepo);
services.AddSingleton<INotifyAuditRepository>(_auditRepo);
});
})
.CreateClient();
}
[Fact]
public async Task OpenApi_endpoint_serves_yaml_with_scope_header()
{
var response = await _client.GetAsync("/.well-known/openapi", TestContext.Current.CancellationToken);
Assert.Equal(HttpStatusCode.OK, response.StatusCode);
Assert.Equal("application/yaml", response.Content.Headers.ContentType?.MediaType);
Assert.True(response.Headers.TryGetValues("X-OpenAPI-Scope", out var values) &&
values.Contains("notify"));
Assert.True(response.Headers.ETag is not null && response.Headers.ETag.Tag.Length > 2);
var body = await response.Content.ReadAsStringAsync(TestContext.Current.CancellationToken);
Assert.Contains("openapi: 3.1.0", body);
Assert.Contains("/api/v1/notify/quiet-hours", body);
Assert.Contains("/api/v1/notify/incidents", body);
}
[Fact]
public async Task Deprecation_headers_emitted_for_api_surface()
{
var response = await _client.GetAsync("/api/v1/notify/rules", TestContext.Current.CancellationToken);
Assert.True(response.Headers.TryGetValues("Deprecation", out var depValues) &&
depValues.Contains("true"));
Assert.True(response.Headers.TryGetValues("Sunset", out var sunsetValues) &&
sunsetValues.Any());
Assert.True(response.Headers.TryGetValues("Link", out var linkValues) &&
linkValues.Any(v => v.Contains("rel=\"deprecation\"")));
}
[Fact]
public async Task PackApprovals_endpoint_validates_missing_headers()
{
var content = new StringContent("""{"eventId":"00000000-0000-0000-0000-000000000001","issuedAt":"2025-11-17T16:00:00Z","kind":"pack.approval.granted","packId":"offline-kit","decision":"approved","actor":"task-runner"}""", Encoding.UTF8, "application/json");
var response = await _client.PostAsync("/api/v1/notify/pack-approvals", content, TestContext.Current.CancellationToken);
Assert.Equal(HttpStatusCode.BadRequest, response.StatusCode);
}
[Fact]
public async Task PackApprovals_endpoint_accepts_happy_path_and_echoes_resume_token()
{
var content = new StringContent("""{"eventId":"00000000-0000-0000-0000-000000000002","issuedAt":"2025-11-17T16:00:00Z","kind":"pack.approval.granted","packId":"offline-kit","decision":"approved","actor":"task-runner","resumeToken":"rt-ok"}""", Encoding.UTF8, "application/json");
var request = new HttpRequestMessage(HttpMethod.Post, "/api/v1/notify/pack-approvals")
{
Content = content
};
request.Headers.Add("X-StellaOps-Tenant", "tenant-a");
request.Headers.Add("Idempotency-Key", Guid.NewGuid().ToString());
var response = await _client.SendAsync(request, TestContext.Current.CancellationToken);
Assert.Equal(HttpStatusCode.Accepted, response.StatusCode);
Assert.True(response.Headers.TryGetValues("X-Resume-After", out var resumeValues) &&
resumeValues.Contains("rt-ok"));
Assert.True(_packRepo.Exists("tenant-a", Guid.Parse("00000000-0000-0000-0000-000000000002"), "offline-kit"));
}
}


@@ -0,0 +1,30 @@
using StellaOps.Notify.Storage.Mongo.Documents;
using StellaOps.Notify.Storage.Mongo.Repositories;
namespace StellaOps.Notifier.Tests.Support;
internal sealed class InMemoryAuditRepository : INotifyAuditRepository
{
private readonly List<NotifyAuditEntryDocument> _entries = new();
public Task AppendAsync(NotifyAuditEntryDocument entry, CancellationToken cancellationToken = default)
{
_entries.Add(entry);
return Task.CompletedTask;
}
public Task<IReadOnlyList<NotifyAuditEntryDocument>> QueryAsync(string tenantId, DateTimeOffset? since, int? limit, CancellationToken cancellationToken = default)
{
var items = _entries
.Where(e => e.TenantId == tenantId && (!since.HasValue || e.Timestamp >= since.Value))
.OrderByDescending(e => e.Timestamp)
.ToList();
if (limit is > 0)
{
items = items.Take(limit.Value).ToList();
}
return Task.FromResult<IReadOnlyList<NotifyAuditEntryDocument>>(items);
}
}


@@ -0,0 +1,18 @@
using StellaOps.Notify.Storage.Mongo.Documents;
using StellaOps.Notify.Storage.Mongo.Repositories;
namespace StellaOps.Notifier.Tests.Support;
internal sealed class InMemoryPackApprovalRepository : INotifyPackApprovalRepository
{
private readonly Dictionary<(string TenantId, Guid EventId, string PackId), PackApprovalDocument> _records = new();
public Task UpsertAsync(PackApprovalDocument document, CancellationToken cancellationToken = default)
{
_records[(document.TenantId, document.EventId, document.PackId)] = document;
return Task.CompletedTask;
}
public bool Exists(string tenantId, Guid eventId, string packId)
=> _records.ContainsKey((tenantId, eventId, packId));
}


@@ -0,0 +1,45 @@
using System.Text.Json.Serialization;
namespace StellaOps.Notifier.WebService.Contracts;
public sealed class PackApprovalRequest
{
[JsonPropertyName("eventId")]
public Guid EventId { get; init; }
[JsonPropertyName("issuedAt")]
public DateTimeOffset IssuedAt { get; init; }
[JsonPropertyName("kind")]
public string Kind { get; init; } = string.Empty;
[JsonPropertyName("packId")]
public string PackId { get; init; } = string.Empty;
[JsonPropertyName("policy")]
public PackApprovalPolicy? Policy { get; init; }
[JsonPropertyName("decision")]
public string Decision { get; init; } = string.Empty;
[JsonPropertyName("actor")]
public string Actor { get; init; } = string.Empty;
[JsonPropertyName("resumeToken")]
public string? ResumeToken { get; init; }
[JsonPropertyName("summary")]
public string? Summary { get; init; }
[JsonPropertyName("labels")]
public Dictionary<string, string>? Labels { get; init; }
}
public sealed class PackApprovalPolicy
{
[JsonPropertyName("id")]
public string? Id { get; init; }
[JsonPropertyName("version")]
public string? Version { get; init; }
}


@@ -1,24 +1,141 @@
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using StellaOps.Notify.Storage.Mongo;
using StellaOps.Notifier.WebService.Setup;
var builder = WebApplication.CreateBuilder(args);
builder.Configuration
.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
.AddEnvironmentVariables(prefix: "NOTIFIER_");
var mongoSection = builder.Configuration.GetSection("notifier:storage:mongo");
builder.Services.AddNotifyMongoStorage(mongoSection);
builder.Services.AddHealthChecks();
builder.Services.AddHostedService<MongoInitializationHostedService>();
var app = builder.Build();
app.MapHealthChecks("/healthz");
app.Run();
using System.Text.Json;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using StellaOps.Notifier.WebService.Contracts;
using StellaOps.Notifier.WebService.Setup;
using StellaOps.Notify.Storage.Mongo;
using StellaOps.Notify.Storage.Mongo.Documents;
using StellaOps.Notify.Storage.Mongo.Repositories;
var builder = WebApplication.CreateBuilder(args);
builder.Configuration
.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
.AddEnvironmentVariables(prefix: "NOTIFIER_");
var mongoSection = builder.Configuration.GetSection("notifier:storage:mongo");
builder.Services.AddNotifyMongoStorage(mongoSection);
builder.Services.AddSingleton<OpenApiDocumentCache>();
builder.Services.AddHealthChecks();
builder.Services.AddHostedService<MongoInitializationHostedService>();
var app = builder.Build();
app.MapHealthChecks("/healthz");
// Deprecation headers for retiring v1 APIs (RFC 8594 / IETF Sunset)
app.Use(async (context, next) =>
{
if (context.Request.Path.StartsWithSegments("/api/v1", StringComparison.OrdinalIgnoreCase))
{
context.Response.Headers["Deprecation"] = "true";
context.Response.Headers["Sunset"] = "Tue, 31 Mar 2026 00:00:00 GMT";
context.Response.Headers["Link"] =
"<https://docs.stellaops.example.com/notify/deprecations>; rel=\"deprecation\"; type=\"text/html\"";
}
await next().ConfigureAwait(false);
});
app.MapPost("/api/v1/notify/pack-approvals", async (
HttpContext context,
PackApprovalRequest request,
INotifyLockRepository locks,
INotifyPackApprovalRepository packApprovals,
INotifyAuditRepository audit,
TimeProvider timeProvider) =>
{
var tenantId = context.Request.Headers["X-StellaOps-Tenant"].ToString();
if (string.IsNullOrWhiteSpace(tenantId))
{
return Results.BadRequest(Error("tenant_missing", "X-StellaOps-Tenant header is required.", context));
}
var idempotencyKey = context.Request.Headers["Idempotency-Key"].ToString();
if (string.IsNullOrWhiteSpace(idempotencyKey))
{
return Results.BadRequest(Error("idempotency_key_missing", "Idempotency-Key header is required.", context));
}
if (request.EventId == Guid.Empty || string.IsNullOrWhiteSpace(request.PackId) ||
string.IsNullOrWhiteSpace(request.Kind) || string.IsNullOrWhiteSpace(request.Decision) ||
string.IsNullOrWhiteSpace(request.Actor))
{
return Results.BadRequest(Error("invalid_request", "eventId, packId, kind, decision, actor are required.", context));
}
var lockKey = $"pack-approvals|{tenantId}|{idempotencyKey}";
var ttl = TimeSpan.FromMinutes(15);
var reserved = await locks.TryAcquireAsync(tenantId, lockKey, "pack-approvals", ttl, context.RequestAborted)
.ConfigureAwait(false);
if (!reserved)
{
    // Idempotent replay: this key was already accepted, so acknowledge without re-ingesting.
    return Results.Ok();
}
var document = new PackApprovalDocument
{
TenantId = tenantId,
EventId = request.EventId,
PackId = request.PackId,
Kind = request.Kind,
Decision = request.Decision,
Actor = request.Actor,
IssuedAt = request.IssuedAt,
PolicyId = request.Policy?.Id,
PolicyVersion = request.Policy?.Version,
ResumeToken = request.ResumeToken,
Summary = request.Summary,
Labels = request.Labels,
CreatedAt = timeProvider.GetUtcNow()
};
await packApprovals.UpsertAsync(document, context.RequestAborted).ConfigureAwait(false);
var auditEntry = new NotifyAuditEntryDocument
{
TenantId = tenantId,
Actor = request.Actor,
Action = "pack.approval.ingested",
EntityId = request.PackId,
EntityType = "pack-approval",
Timestamp = timeProvider.GetUtcNow(),
Payload = MongoDB.Bson.BsonDocument.Parse(JsonSerializer.Serialize(request))
};
await audit.AppendAsync(auditEntry, context.RequestAborted).ConfigureAwait(false);
if (!string.IsNullOrWhiteSpace(request.ResumeToken))
{
context.Response.Headers["X-Resume-After"] = request.ResumeToken;
}
return Results.Accepted();
});
app.MapGet("/.well-known/openapi", (HttpContext context, OpenApiDocumentCache cache) =>
{
context.Response.Headers.CacheControl = "public, max-age=300";
context.Response.Headers["X-OpenAPI-Scope"] = "notify";
context.Response.Headers.ETag = $"\"{cache.Sha256}\"";
return Results.Content(cache.Document, "application/yaml");
});
app.Run();
static object Error(string code, string message, HttpContext context) => new
{
    error = new
    {
        code,
        message,
        traceId = context.TraceIdentifier
    }
};

public partial class Program;


@@ -0,0 +1,28 @@
using System.Text;
namespace StellaOps.Notifier.WebService.Setup;
public sealed class OpenApiDocumentCache
{
private readonly string _document;
private readonly string _hash;
public OpenApiDocumentCache(IHostEnvironment environment)
{
var path = Path.Combine(environment.ContentRootPath, "openapi", "notify-openapi.yaml");
if (!File.Exists(path))
{
throw new FileNotFoundException("OpenAPI document not found.", path);
}
_document = File.ReadAllText(path, Encoding.UTF8);
using var sha = System.Security.Cryptography.SHA256.Create();
var bytes = Encoding.UTF8.GetBytes(_document);
_hash = Convert.ToHexString(sha.ComputeHash(bytes)).ToLowerInvariant();
}
public string Document => _document;
public string Sha256 => _hash;
}


@@ -0,0 +1,6 @@
namespace StellaOps.Notifier.WebService;
/// <summary>
/// Marker type used for testing/hosting the web application.
/// </summary>
public sealed class WebServiceAssemblyMarker;


@@ -0,0 +1,501 @@
# OpenAPI 3.1 specification for StellaOps Notifier WebService (draft)
openapi: 3.1.0
info:
title: StellaOps Notifier API
version: 0.6.0-draft
description: |
Contract for Notifications Studio (Notifier) covering rules, templates, incidents,
and quiet hours. Uses the platform error envelope and tenant header `X-StellaOps-Tenant`.
servers:
- url: https://api.stellaops.example.com
description: Production
- url: https://api.dev.stellaops.example.com
description: Development
security:
- oauth2: [notify.viewer]
- oauth2: [notify.operator]
- oauth2: [notify.admin]
paths:
/api/v1/notify/rules:
get:
summary: List notification rules
tags: [Rules]
parameters:
- $ref: '#/components/parameters/Tenant'
- $ref: '#/components/parameters/PageSize'
- $ref: '#/components/parameters/PageToken'
responses:
'200':
description: Paginated rule list
content:
application/json:
schema:
type: object
properties:
items:
type: array
items: { $ref: '#/components/schemas/NotifyRule' }
nextPageToken:
type: string
examples:
default:
value:
items:
- ruleId: rule-critical
tenantId: tenant-dev
name: Critical scanner verdicts
enabled: true
match:
eventKinds: [scanner.report.ready]
minSeverity: critical
actions:
- actionId: act-slack-critical
channel: chn-slack-soc
template: tmpl-critical
digest: instant
nextPageToken: null
default:
$ref: '#/components/responses/Error'
post:
summary: Create a notification rule
tags: [Rules]
parameters:
- $ref: '#/components/parameters/Tenant'
requestBody:
required: true
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyRule' }
examples:
create-rule:
value:
ruleId: rule-attest-fail
tenantId: tenant-dev
name: Attestation failures → SOC
enabled: true
match:
eventKinds: [attestor.verification.failed]
actions:
- actionId: act-soc
channel: chn-webhook-soc
template: tmpl-attest-verify-fail
responses:
'201':
description: Rule created
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyRule' }
default:
$ref: '#/components/responses/Error'
/api/v1/notify/rules/{ruleId}:
get:
summary: Fetch a rule
tags: [Rules]
parameters:
- $ref: '#/components/parameters/Tenant'
- $ref: '#/components/parameters/RuleId'
responses:
'200':
description: Rule
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyRule' }
default:
$ref: '#/components/responses/Error'
patch:
summary: Update a rule (partial)
tags: [Rules]
parameters:
- $ref: '#/components/parameters/Tenant'
- $ref: '#/components/parameters/RuleId'
requestBody:
required: true
content:
application/json:
schema:
type: object
description: JSON Merge Patch
responses:
'200':
description: Updated rule
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyRule' }
default:
$ref: '#/components/responses/Error'
/api/v1/notify/templates:
get:
summary: List templates
tags: [Templates]
parameters:
- $ref: '#/components/parameters/Tenant'
- name: key
in: query
description: Filter by template key
schema: { type: string }
responses:
'200':
description: Templates
content:
application/json:
schema:
type: array
items: { $ref: '#/components/schemas/NotifyTemplate' }
default:
$ref: '#/components/responses/Error'
post:
summary: Create a template
tags: [Templates]
parameters:
- $ref: '#/components/parameters/Tenant'
requestBody:
required: true
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyTemplate' }
responses:
'201':
description: Template created
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyTemplate' }
default:
$ref: '#/components/responses/Error'
/api/v1/notify/templates/{templateId}:
get:
summary: Fetch a template
tags: [Templates]
parameters:
- $ref: '#/components/parameters/Tenant'
- $ref: '#/components/parameters/TemplateId'
responses:
'200':
description: Template
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyTemplate' }
default:
$ref: '#/components/responses/Error'
patch:
summary: Update a template (partial)
tags: [Templates]
parameters:
- $ref: '#/components/parameters/Tenant'
- $ref: '#/components/parameters/TemplateId'
requestBody:
required: true
content:
application/json:
schema:
type: object
description: JSON Merge Patch
responses:
'200':
description: Updated template
content:
application/json:
schema: { $ref: '#/components/schemas/NotifyTemplate' }
default:
$ref: '#/components/responses/Error'
/api/v1/notify/incidents:
get:
summary: List incidents (paged)
tags: [Incidents]
parameters:
- $ref: '#/components/parameters/Tenant'
- $ref: '#/components/parameters/PageSize'
- $ref: '#/components/parameters/PageToken'
responses:
'200':
description: Incident page
content:
application/json:
schema:
type: object
properties:
items:
type: array
items: { $ref: '#/components/schemas/Incident' }
nextPageToken: { type: string }
default:
$ref: '#/components/responses/Error'
post:
summary: Raise an incident (ops/toggle/override)
tags: [Incidents]
parameters:
- $ref: '#/components/parameters/Tenant'
requestBody:
required: true
content:
application/json:
schema: { $ref: '#/components/schemas/Incident' }
examples:
start-incident:
value:
incidentId: inc-telemetry-outage
kind: outage
severity: major
startedAt: 2025-11-17T04:02:00Z
shortDescription: "Telemetry pipeline degraded; burn-rate breach"
metadata:
source: slo-evaluator
responses:
'202':
description: Incident accepted
default:
$ref: '#/components/responses/Error'
/api/v1/notify/incidents/{incidentId}/ack:
post:
summary: Acknowledge an incident notification
tags: [Incidents]
parameters:
- $ref: '#/components/parameters/Tenant'
- $ref: '#/components/parameters/IncidentId'
requestBody:
required: true
content:
application/json:
schema:
type: object
properties:
ackToken:
type: string
description: DSSE-signed acknowledgement token
responses:
'204':
description: Acknowledged
default:
$ref: '#/components/responses/Error'
/api/v1/notify/quiet-hours:
get:
summary: Get quiet-hours schedule
tags: [QuietHours]
parameters:
- $ref: '#/components/parameters/Tenant'
responses:
'200':
description: Quiet hours schedule
content:
application/json:
schema: { $ref: '#/components/schemas/QuietHours' }
examples:
current:
value:
quietHoursId: qh-default
windows:
- timezone: UTC
days: [Mon, Tue, Wed, Thu, Fri]
start: "22:00"
end: "06:00"
exemptions:
- eventKinds: [attestor.verification.failed]
reason: "Always alert for attestation failures"
default:
$ref: '#/components/responses/Error'
post:
summary: Set quiet-hours schedule
tags: [QuietHours]
parameters:
- $ref: '#/components/parameters/Tenant'
requestBody:
required: true
content:
application/json:
schema: { $ref: '#/components/schemas/QuietHours' }
responses:
'200':
description: Updated quiet hours
content:
application/json:
schema: { $ref: '#/components/schemas/QuietHours' }
default:
$ref: '#/components/responses/Error'
components:
securitySchemes:
oauth2:
type: oauth2
flows:
clientCredentials:
tokenUrl: https://auth.stellaops.example.com/oauth/token
scopes:
notify.viewer: Read-only Notifier access
notify.operator: Manage rules/templates/incidents within tenant
notify.admin: Tenant-scoped administration
parameters:
Tenant:
name: X-StellaOps-Tenant
in: header
required: true
description: Tenant slug
schema: { type: string }
PageSize:
name: pageSize
in: query
schema: { type: integer, minimum: 1, maximum: 200, default: 50 }
PageToken:
name: pageToken
in: query
schema: { type: string }
RuleId:
name: ruleId
in: path
required: true
schema: { type: string }
TemplateId:
name: templateId
in: path
required: true
schema: { type: string }
IncidentId:
name: incidentId
in: path
required: true
schema: { type: string }
responses:
Error:
description: Standard error envelope
content:
application/json:
schema: { $ref: '#/components/schemas/ErrorEnvelope' }
examples:
validation:
value:
error:
code: validation_failed
message: "quietHours.windows[0].start must be HH:mm"
traceId: "f62f3c2b9c8e4c53"
schemas:
ErrorEnvelope:
type: object
required: [error]
properties:
error:
type: object
required: [code, message, traceId]
properties:
code: { type: string }
message: { type: string }
traceId: { type: string }
NotifyRule:
type: object
required: [ruleId, tenantId, name, match, actions]
properties:
ruleId: { type: string }
tenantId: { type: string }
name: { type: string }
description: { type: string }
enabled: { type: boolean, default: true }
match: { $ref: '#/components/schemas/RuleMatch' }
actions:
type: array
items: { $ref: '#/components/schemas/RuleAction' }
labels:
type: object
additionalProperties: { type: string }
metadata:
type: object
additionalProperties: { type: string }
RuleMatch:
type: object
properties:
eventKinds:
type: array
items: { type: string }
minSeverity: { type: string, enum: [info, low, medium, high, critical] }
verdicts:
type: array
items: { type: string }
labels:
type: array
items: { type: string }
kevOnly: { type: boolean }
RuleAction:
type: object
required: [actionId, channel]
properties:
actionId: { type: string }
channel: { type: string }
template: { type: string }
digest: { type: string, description: "Digest window key e.g. instant|5m|15m|1h|1d" }
throttle: { type: string, description: "ISO-8601 duration, e.g. PT5M" }
locale: { type: string }
enabled: { type: boolean, default: true }
metadata:
type: object
additionalProperties: { type: string }
NotifyTemplate:
type: object
required: [templateId, tenantId, key, channelType, locale, body, renderMode, format]
properties:
templateId: { type: string }
tenantId: { type: string }
key: { type: string }
channelType: { type: string, enum: [slack, teams, email, webhook, custom] }
locale: { type: string, description: "BCP-47, lower-case" }
renderMode: { type: string, enum: [Markdown, Html, AdaptiveCard, PlainText, Json] }
format: { type: string, enum: [slack, teams, email, webhook, json] }
description: { type: string }
body: { type: string }
metadata:
type: object
additionalProperties: { type: string }
Incident:
type: object
required: [incidentId, kind, severity, startedAt]
properties:
incidentId: { type: string }
kind: { type: string, description: "outage|degradation|security|ops-drill" }
severity: { type: string, enum: [minor, major, critical] }
startedAt: { type: string, format: date-time }
endedAt: { type: string, format: date-time }
shortDescription: { type: string }
description: { type: string }
metadata:
type: object
additionalProperties: { type: string }
QuietHours:
type: object
required: [quietHoursId, windows]
properties:
quietHoursId: { type: string }
windows:
type: array
items: { $ref: '#/components/schemas/QuietHoursWindow' }
exemptions:
type: array
description: Event kinds that bypass quiet hours
items:
type: object
properties:
eventKinds:
type: array
items: { type: string }
reason: { type: string }
QuietHoursWindow:
type: object
required: [timezone, days, start, end]
properties:
timezone: { type: string, description: "IANA TZ, e.g., UTC" }
days:
type: array
items:
type: string
enum: [Mon, Tue, Wed, Thu, Fri, Sat, Sun]
start: { type: string, description: "HH:mm" }
end: { type: string, description: "HH:mm" }


@@ -0,0 +1,15 @@
# Sprint 171 · Notifier.I
| ID | Status | Owner(s) | Notes |
| --- | --- | --- | --- |
| NOTIFY-ATTEST-74-001 | DONE (2025-11-16) | Notifications Service Guild | Attestation template suite complete; Slack expiry template added; coverage tests guard required channels. |
| NOTIFY-ATTEST-74-002 | TODO | Notifications Service Guild · KMS Guild | Wire notifications to key rotation/revocation events + transparency witness failures (depends on 74-001). |
| NOTIFY-OAS-61-001 | DONE (2025-11-17) | Notifications Service Guild · API Contracts Guild | OAS updated with rules/templates/incidents/quiet hours and standard error envelope. |
| NOTIFY-OAS-61-002 | DONE (2025-11-17) | Notifications Service Guild | `.well-known/openapi` discovery endpoint with scope metadata implemented. |
| NOTIFY-OAS-62-001 | DONE (2025-11-17) | Notifications Service Guild · SDK Generator Guild | SDK usage examples + smoke tests (depends on 61-002). |
| NOTIFY-OAS-63-001 | TODO | Notifications Service Guild · API Governance Guild | Deprecation headers + template notices for retiring APIs (depends on 62-001). |
| NOTIFY-OBS-51-001 | TODO | Notifications Service Guild · Observability Guild | Integrate SLO evaluator webhooks once schema lands. |
| NOTIFY-OBS-55-001 | TODO | Notifications Service Guild · Ops Guild | Incident mode start/stop notifications; quiet-hour overrides. |
| NOTIFY-RISK-66-001 | TODO | Notifications Service Guild · Risk Engine Guild | Trigger risk severity escalation/downgrade notifications (waiting on Policy export). |
| NOTIFY-RISK-67-001 | TODO | Notifications Service Guild · Policy Guild | Notify when risk profiles publish/deprecate/threshold-change (depends on 66-001). |
| NOTIFY-RISK-68-001 | TODO | Notifications Service Guild | Per-profile routing rules + quiet hours for risk alerts (depends on 67-001). |


@@ -0,0 +1,15 @@
# Notifier OAS Discovery — ETag Guidance
The Notifier WebService exposes its OpenAPI document at `/.well-known/openapi` with headers:
- `X-OpenAPI-Scope: notify`
- `ETag: "<sha256>"` (stable per spec bytes)
- `Cache-Control: public, max-age=300`
Usage notes:
- SDK generators and CI smoke tests should re-use the `ETag` for conditional GETs (`If-None-Match`) to avoid redundant downloads.
- Mirror/Offline bundles should copy `openapi/notify-openapi.yaml` and retain the `ETag` alongside the file hash used in air-gap validation.
- When the spec changes, the SHA-256 and `ETag` change together; callers can detect breaking/non-breaking updates via the published changelog (source of truth in `docs/api/notify-openapi.yaml`).
Applies to tasks: NOTIFY-OAS-61-001/61-002/63-001.
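The `ETag` scheme above (a lowercase hex SHA-256 of the spec bytes, wrapped in double quotes) can be reproduced client-side before issuing a conditional GET. A minimal sketch, assuming `sha256sum` is available and using a stand-in spec file in place of `openapi/notify-openapi.yaml`:

```shell
# Recompute the ETag the service derives from the spec bytes:
# lowercase hex SHA-256 of the file, wrapped in double quotes.
spec=$(mktemp)
printf 'openapi: 3.1.0\n' > "$spec"   # stand-in for openapi/notify-openapi.yaml
etag="\"$(sha256sum "$spec" | cut -d' ' -f1)\""
echo "$etag"
rm -f "$spec"
```

A caller can then send this value in `If-None-Match` on subsequent requests to `/.well-known/openapi` and treat a `304 Not Modified` as cache-valid.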


@@ -0,0 +1,49 @@
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Notify.Storage.Mongo.Documents;
public sealed class PackApprovalDocument
{
[BsonId]
public ObjectId Id { get; init; }
[BsonElement("tenantId")]
public required string TenantId { get; init; }
[BsonElement("eventId")]
public required Guid EventId { get; init; }
[BsonElement("packId")]
public required string PackId { get; init; }
[BsonElement("kind")]
public required string Kind { get; init; }
[BsonElement("decision")]
public required string Decision { get; init; }
[BsonElement("actor")]
public required string Actor { get; init; }
[BsonElement("issuedAt")]
public required DateTimeOffset IssuedAt { get; init; }
[BsonElement("policyId")]
public string? PolicyId { get; init; }
[BsonElement("policyVersion")]
public string? PolicyVersion { get; init; }
[BsonElement("resumeToken")]
public string? ResumeToken { get; init; }
[BsonElement("summary")]
public string? Summary { get; init; }
[BsonElement("labels")]
public Dictionary<string, string>? Labels { get; init; }
[BsonElement("createdAt")]
public required DateTimeOffset CreatedAt { get; init; }
}


@@ -0,0 +1,34 @@
using Microsoft.Extensions.Logging;
using StellaOps.Notify.Storage.Mongo.Internal;
namespace StellaOps.Notify.Storage.Mongo.Migrations;
internal sealed class EnsurePackApprovalsCollectionMigration : INotifyMongoMigration
{
private readonly ILogger<EnsurePackApprovalsCollectionMigration> _logger;
public EnsurePackApprovalsCollectionMigration(ILogger<EnsurePackApprovalsCollectionMigration> logger)
=> _logger = logger ?? throw new ArgumentNullException(nameof(logger));
public string Id => "20251117_pack_approvals_collection_v1";
public async ValueTask ExecuteAsync(NotifyMongoContext context, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
var target = context.Options.PackApprovalsCollection;
var cursor = await context.Database
.ListCollectionNamesAsync(cancellationToken: cancellationToken)
.ConfigureAwait(false);
var existing = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
if (existing.Contains(target, StringComparer.Ordinal))
{
return;
}
_logger.LogInformation("Creating pack approvals collection '{Collection}'.", target);
await context.Database.CreateCollectionAsync(target, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}


@@ -0,0 +1,41 @@
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Notify.Storage.Mongo.Internal;
namespace StellaOps.Notify.Storage.Mongo.Migrations;
internal sealed class EnsurePackApprovalsIndexesMigration : INotifyMongoMigration
{
public string Id => "20251117_pack_approvals_indexes_v1";
public async ValueTask ExecuteAsync(NotifyMongoContext context, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
var collection = context.Database.GetCollection<BsonDocument>(context.Options.PackApprovalsCollection);
var unique = new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys
.Ascending("tenantId")
.Ascending("packId")
.Ascending("eventId"),
new CreateIndexOptions
{
Name = "tenant_pack_event",
Unique = true
});
await collection.Indexes.CreateOneAsync(unique, cancellationToken: cancellationToken).ConfigureAwait(false);
var issuedAt = new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys
.Ascending("tenantId")
.Descending("issuedAt"),
new CreateIndexOptions
{
Name = "tenant_issuedAt"
});
await collection.Indexes.CreateOneAsync(issuedAt, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}


@@ -0,0 +1,8 @@
using StellaOps.Notify.Storage.Mongo.Documents;
namespace StellaOps.Notify.Storage.Mongo.Repositories;
public interface INotifyPackApprovalRepository
{
Task UpsertAsync(PackApprovalDocument document, CancellationToken cancellationToken = default);
}


@@ -0,0 +1,29 @@
using MongoDB.Driver;
using StellaOps.Notify.Storage.Mongo.Documents;
using StellaOps.Notify.Storage.Mongo.Internal;
namespace StellaOps.Notify.Storage.Mongo.Repositories;
internal sealed class NotifyPackApprovalRepository : INotifyPackApprovalRepository
{
private readonly IMongoCollection<PackApprovalDocument> _collection;
public NotifyPackApprovalRepository(NotifyMongoContext context)
{
ArgumentNullException.ThrowIfNull(context);
_collection = context.Database.GetCollection<PackApprovalDocument>(context.Options.PackApprovalsCollection);
}
public async Task UpsertAsync(PackApprovalDocument document, CancellationToken cancellationToken = default)
{
ArgumentNullException.ThrowIfNull(document);
var filter = Builders<PackApprovalDocument>.Filter.And(
Builders<PackApprovalDocument>.Filter.Eq(x => x.TenantId, document.TenantId),
Builders<PackApprovalDocument>.Filter.Eq(x => x.PackId, document.PackId),
Builders<PackApprovalDocument>.Filter.Eq(x => x.EventId, document.EventId));
var options = new ReplaceOptions { IsUpsert = true };
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
}


@@ -0,0 +1,5 @@
<Project>
<PropertyGroup>
<RestoreSources>;;</RestoreSources>
</PropertyGroup>
</Project>


@@ -0,0 +1,60 @@
using System.Text;
using System.Text.Json;
using StellaOps.Provenance.Attestation;
static int PrintUsage()
{
Console.Error.WriteLine("Usage: stella-forensic-verify --payload <file> --signature-hex <hex> --key-hex <hex> [--key-id <id>] [--content-type <ct>]");
return 1;
}
string? GetArg(string name)
{
for (int i = 0; i < args.Length - 1; i++)
{
if (args[i].Equals(name, StringComparison.OrdinalIgnoreCase))
return args[i + 1];
}
return null;
}
string? payloadPath = GetArg("--payload");
string? signatureHex = GetArg("--signature-hex");
string? keyHex = GetArg("--key-hex");
string keyId = GetArg("--key-id") ?? "hmac";
string contentType = GetArg("--content-type") ?? "application/octet-stream";
if (payloadPath is null || signatureHex is null || keyHex is null)
{
return PrintUsage();
}
byte[] payload = await System.IO.File.ReadAllBytesAsync(payloadPath);
byte[] signature;
byte[] key;
try
{
signature = Hex.FromHex(signatureHex);
key = Hex.FromHex(keyHex);
}
catch (Exception ex)
{
Console.Error.WriteLine($"hex parse error: {ex.Message}");
return 1;
}
var request = new SignRequest(payload, contentType);
var signResult = new SignResult(signature, keyId, DateTimeOffset.MinValue, null);
var verifier = new HmacVerifier(new InMemoryKeyProvider(keyId, key));
var result = await verifier.VerifyAsync(request, signResult);
var json = JsonSerializer.Serialize(new
{
valid = result.IsValid,
reason = result.Reason,
verifiedAt = result.VerifiedAt.ToUniversalTime().ToString("O")
});
Console.WriteLine(json);
return result.IsValid ? 0 : 2;


@@ -0,0 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<PackAsTool>true</PackAsTool>
<ToolCommandName>stella-forensic-verify</ToolCommandName>
<PackageOutputPath>../../out/tools</PackageOutputPath>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../StellaOps.Provenance.Attestation/StellaOps.Provenance.Attestation.csproj" />
</ItemGroup>
</Project>


@@ -0,0 +1 @@
test


@@ -0,0 +1,113 @@
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Linq;
namespace StellaOps.Provenance.Attestation;
public sealed record BuildDefinition(
string BuildType,
IReadOnlyDictionary<string, string>? ExternalParameters = null,
IReadOnlyDictionary<string, string>? ResolvedDependencies = null);
public sealed record BuildMetadata(
string? BuildInvocationId,
DateTimeOffset? BuildStartedOn,
DateTimeOffset? BuildFinishedOn,
bool? Reproducible = null,
IReadOnlyDictionary<string, bool>? Completeness = null,
IReadOnlyDictionary<string, string>? Environment = null);
public static class CanonicalJson
{
private static readonly JsonSerializerOptions Options = new()
{
PropertyNamingPolicy = null,
WriteIndented = false,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull
};
public static byte[] SerializeToUtf8Bytes<T>(T value)
{
var element = JsonSerializer.SerializeToElement(value, Options);
using var stream = new MemoryStream();
using var writer = new Utf8JsonWriter(stream, new JsonWriterOptions { Indented = false });
WriteCanonical(element, writer);
writer.Flush();
return stream.ToArray();
}
public static string SerializeToString<T>(T value) => Encoding.UTF8.GetString(SerializeToUtf8Bytes(value));
private static void WriteCanonical(JsonElement element, Utf8JsonWriter writer)
{
switch (element.ValueKind)
{
case JsonValueKind.Object:
writer.WriteStartObject();
foreach (var property in element.EnumerateObject().OrderBy(p => p.Name, StringComparer.Ordinal))
{
writer.WritePropertyName(property.Name);
WriteCanonical(property.Value, writer);
}
writer.WriteEndObject();
break;
case JsonValueKind.Array:
writer.WriteStartArray();
foreach (var item in element.EnumerateArray())
{
WriteCanonical(item, writer);
}
writer.WriteEndArray();
break;
default:
element.WriteTo(writer);
break;
}
}
}
public static class MerkleTree
{
public static byte[] ComputeRoot(IEnumerable<byte[]> leaves)
{
var leafList = leaves?.ToList() ?? throw new ArgumentNullException(nameof(leaves));
if (leafList.Count == 0) throw new ArgumentException("At least one leaf required", nameof(leaves));
var level = leafList.Select(NormalizeLeaf).ToList();
using var sha = SHA256.Create();
while (level.Count > 1)
{
var next = new List<byte[]>((level.Count + 1) / 2);
for (var i = 0; i < level.Count; i += 2)
{
var left = level[i];
var right = i + 1 < level.Count ? level[i + 1] : left;
var combined = new byte[left.Length + right.Length];
Buffer.BlockCopy(left, 0, combined, 0, left.Length);
Buffer.BlockCopy(right, 0, combined, left.Length, right.Length);
next.Add(sha.ComputeHash(combined));
}
level = next;
}
return level[0];
static byte[] NormalizeLeaf(byte[] data)
{
if (data.Length == 32) return data;
using var sha = SHA256.Create();
return sha.ComputeHash(data);
}
}
}
public sealed record BuildStatement(
BuildDefinition BuildDefinition,
BuildMetadata BuildMetadata);
public static class BuildStatementFactory
{
public static BuildStatement Create(BuildDefinition definition, BuildMetadata metadata) => new(definition, metadata);
}


@@ -0,0 +1,20 @@
using System.Globalization;
namespace StellaOps.Provenance.Attestation;
public static class Hex
{
public static byte[] FromHex(string hex)
{
if (string.IsNullOrWhiteSpace(hex)) throw new ArgumentException("hex is required", nameof(hex));
if (hex.StartsWith("0x", StringComparison.OrdinalIgnoreCase)) hex = hex[2..];
if (hex.Length % 2 != 0) throw new FormatException("hex length must be even");
var bytes = new byte[hex.Length / 2];
for (int i = 0; i < bytes.Length; i++)
{
bytes[i] = byte.Parse(hex.Substring(i * 2, 2), NumberStyles.HexNumber, CultureInfo.InvariantCulture);
}
return bytes;
}
}

View File

@@ -0,0 +1,21 @@
using System.Text.Json;
using System.Text.Json.Serialization;
namespace StellaOps.Provenance.Attestation;
public sealed record PromotionPredicate(
string ImageDigest,
string SbomDigest,
string VexDigest,
string PromotionId,
string? RekorEntry = null,
IReadOnlyDictionary<string,string>? Metadata = null);
public static class PromotionAttestationBuilder
{
public static byte[] CreateCanonicalJson(PromotionPredicate predicate)
{
if (predicate is null) throw new ArgumentNullException(nameof(predicate));
return CanonicalJson.SerializeToUtf8Bytes(predicate);
}
}

View File

@@ -0,0 +1,107 @@
using System.Security.Cryptography;
namespace StellaOps.Provenance.Attestation;
public sealed record SignRequest(
byte[] Payload,
string ContentType,
IReadOnlyDictionary<string, string>? Claims = null,
IReadOnlyCollection<string>? RequiredClaims = null);
public sealed record SignResult(
byte[] Signature,
string KeyId,
DateTimeOffset SignedAt,
IReadOnlyDictionary<string, string>? Claims);
public interface IKeyProvider
{
string KeyId { get; }
byte[] KeyMaterial { get; }
}
public interface IAuditSink
{
void LogSigned(string keyId, string contentType, IReadOnlyDictionary<string, string>? claims, DateTimeOffset signedAt);
void LogMissingClaim(string keyId, string claimName);
}
public sealed class NullAuditSink : IAuditSink
{
public static readonly NullAuditSink Instance = new();
private NullAuditSink() { }
public void LogSigned(string keyId, string contentType, IReadOnlyDictionary<string, string>? claims, DateTimeOffset signedAt) { }
public void LogMissingClaim(string keyId, string claimName) { }
}
public sealed class HmacSigner : ISigner
{
private readonly IKeyProvider _keyProvider;
private readonly IAuditSink _audit;
private readonly TimeProvider _timeProvider;
public HmacSigner(IKeyProvider keyProvider, IAuditSink? audit = null, TimeProvider? timeProvider = null)
{
_keyProvider = keyProvider ?? throw new ArgumentNullException(nameof(keyProvider));
_audit = audit ?? NullAuditSink.Instance;
_timeProvider = timeProvider ?? TimeProvider.System;
}
public Task<SignResult> SignAsync(SignRequest request, CancellationToken cancellationToken = default)
{
if (request is null) throw new ArgumentNullException(nameof(request));
if (request.RequiredClaims is not null)
{
foreach (var required in request.RequiredClaims)
{
if (request.Claims is null || !request.Claims.ContainsKey(required))
{
_audit.LogMissingClaim(_keyProvider.KeyId, required);
throw new InvalidOperationException($"Missing required claim {required}.");
}
}
}
using var hmac = new HMACSHA256(_keyProvider.KeyMaterial);
var signature = hmac.ComputeHash(request.Payload);
var signedAt = _timeProvider.GetUtcNow();
_audit.LogSigned(_keyProvider.KeyId, request.ContentType, request.Claims, signedAt);
return Task.FromResult(new SignResult(
Signature: signature,
KeyId: _keyProvider.KeyId,
SignedAt: signedAt,
Claims: request.Claims));
}
}
public interface ISigner
{
Task<SignResult> SignAsync(SignRequest request, CancellationToken cancellationToken = default);
}
public sealed class InMemoryKeyProvider : IKeyProvider
{
public string KeyId { get; }
public byte[] KeyMaterial { get; }
public InMemoryKeyProvider(string keyId, byte[] keyMaterial)
{
KeyId = keyId ?? throw new ArgumentNullException(nameof(keyId));
KeyMaterial = keyMaterial ?? throw new ArgumentNullException(nameof(keyMaterial));
}
}
public sealed class InMemoryAuditSink : IAuditSink
{
public List<(string keyId, string contentType, IReadOnlyDictionary<string, string>? claims, DateTimeOffset signedAt)> Signed { get; } = new();
public List<(string keyId, string claim)> Missing { get; } = new();
public void LogSigned(string keyId, string contentType, IReadOnlyDictionary<string, string>? claims, DateTimeOffset signedAt)
=> Signed.Add((keyId, contentType, claims, signedAt));
public void LogMissingClaim(string keyId, string claimName)
=> Missing.Add((keyId, claimName));
}
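`HmacSigner` and its verifier counterpart boil down to HMAC-SHA256 over the payload plus a constant-time comparison. A Python sketch of that core, for illustration (the production path is the C# `HmacSigner`/`HmacVerifier` pair):

```python
import hashlib
import hmac

def sign(key: bytes, payload: bytes) -> bytes:
    # HMAC-SHA256, as in HMACSHA256.ComputeHash
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify(key: bytes, payload: bytes, signature: bytes) -> bool:
    # compare_digest is constant-time, like CryptographicOperations.FixedTimeEquals
    return hmac.compare_digest(sign(key, payload), signature)

sig = sign(b"secret", b"payload")
assert verify(b"secret", b"payload", sig)
assert not verify(b"secret", b"payload-tampered", sig)
```

Determinism falls out of HMAC itself: the same key and payload always yield the same signature, which is what the signer tests later assert.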

View File

@@ -0,0 +1,9 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
</Project>

View File

@@ -0,0 +1,35 @@
using System.Security.Cryptography;
namespace StellaOps.Provenance.Attestation;
public sealed record VerificationResult(bool IsValid, string Reason, DateTimeOffset VerifiedAt);
public interface IVerifier
{
Task<VerificationResult> VerifyAsync(SignRequest request, SignResult signature, CancellationToken cancellationToken = default);
}
public sealed class HmacVerifier : IVerifier
{
private readonly IKeyProvider _keyProvider;
private readonly TimeProvider _timeProvider;
public HmacVerifier(IKeyProvider keyProvider, TimeProvider? timeProvider = null)
{
_keyProvider = keyProvider ?? throw new ArgumentNullException(nameof(keyProvider));
_timeProvider = timeProvider ?? TimeProvider.System;
}
public Task<VerificationResult> VerifyAsync(SignRequest request, SignResult signature, CancellationToken cancellationToken = default)
{
if (request is null) throw new ArgumentNullException(nameof(request));
if (signature is null) throw new ArgumentNullException(nameof(signature));
using var hmac = new HMACSHA256(_keyProvider.KeyMaterial);
var expected = hmac.ComputeHash(request.Payload);
var ok = CryptographicOperations.FixedTimeEquals(expected, signature.Signature) &&
string.Equals(_keyProvider.KeyId, signature.KeyId, StringComparison.Ordinal);
var result = new VerificationResult(
IsValid: ok,
Reason: ok ? "ok" : "signature mismatch",
VerifiedAt: _timeProvider.GetUtcNow());
return Task.FromResult(result);
}
}

View File

@@ -0,0 +1,27 @@
using System.Collections.Generic;
using FluentAssertions;
using StellaOps.Provenance.Attestation;
using Xunit;
namespace StellaOps.Provenance.Attestation.Tests;
public class CanonicalJsonTests
{
[Fact]
public void Canonicalizes_property_order_and_omits_nulls()
{
var model = new BuildDefinition(
BuildType: "https://slsa.dev/provenance/v1",
ExternalParameters: new Dictionary<string, string>
{
["b"] = "2",
["a"] = "1",
["c"] = "3"
},
ResolvedDependencies: null);
var json = CanonicalJson.SerializeToString(model);
json.Should().Be("{\"BuildType\":\"https://slsa.dev/provenance/v1\",\"ExternalParameters\":{\"a\":\"1\",\"b\":\"2\",\"c\":\"3\"}}");
}
}
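The expected string in this test pins down the canonical form: object keys sorted ordinally, null record members omitted, no whitespace. A small Python sketch of that canonicalization (assuming nulls are dropped wherever they appear, which matches the omitted `ResolvedDependencies` here):

```python
import json

def canonicalize(value):
    # Sort object keys and drop nulls, matching the expected output above.
    if isinstance(value, dict):
        return {k: canonicalize(v) for k, v in sorted(value.items()) if v is not None}
    if isinstance(value, list):
        return [canonicalize(v) for v in value]
    return value

def to_canonical_json(obj):
    # Compact separators give the whitespace-free form the test asserts.
    return json.dumps(canonicalize(obj), separators=(",", ":"))

print(to_canonical_json({"ExternalParameters": {"b": "2", "a": "1"}, "ResolvedDependencies": None}))
```

Sorting at serialization time means two logically equal models always produce byte-identical JSON, which is what makes the downstream digests and signatures stable.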

View File

@@ -0,0 +1,21 @@
using FluentAssertions;
using StellaOps.Provenance.Attestation;
using Xunit;
namespace StellaOps.Provenance.Attestation.Tests;
public class HexTests
{
[Fact]
public void Parses_even_length_hex()
{
Hex.FromHex("0A0b").Should().BeEquivalentTo(new byte[] { 0x0A, 0x0B });
}
[Fact]
public void Throws_on_odd_length()
{
Action act = () => Hex.FromHex("ABC");
act.Should().Throw<FormatException>();
}
}

View File

@@ -0,0 +1,38 @@
using System.Security.Cryptography;
using System.Text;
using FluentAssertions;
using StellaOps.Provenance.Attestation;
using Xunit;
namespace StellaOps.Provenance.Attestation.Tests;
public class MerkleTreeTests
{
[Fact]
public void Computes_deterministic_root_for_same_inputs()
{
var leaves = new[]
{
Encoding.UTF8.GetBytes("a"),
Encoding.UTF8.GetBytes("b"),
Encoding.UTF8.GetBytes("c")
};
var root1 = MerkleTree.ComputeRoot(leaves);
var root2 = MerkleTree.ComputeRoot(leaves);
root1.Should().BeEquivalentTo(root2);
}
[Fact]
public void Normalizes_non_hash_leaves()
{
var leaves = new[] { Encoding.UTF8.GetBytes("single") };
var root = MerkleTree.ComputeRoot(leaves);
using var sha = SHA256.Create();
var expected = sha.ComputeHash(leaves[0]);
root.Should().BeEquivalentTo(expected);
}
}

View File

@@ -0,0 +1,26 @@
using System.Text;
using FluentAssertions;
using StellaOps.Provenance.Attestation;
using Xunit;
namespace StellaOps.Provenance.Attestation.Tests;
public class PromotionAttestationBuilderTests
{
[Fact]
public void Produces_canonical_json_for_predicate()
{
var predicate = new PromotionPredicate(
ImageDigest: "sha256:img",
SbomDigest: "sha256:sbom",
VexDigest: "sha256:vex",
PromotionId: "prom-1",
RekorEntry: "uuid",
Metadata: new Dictionary<string, string> { ["env"] = "prod" });
var bytes = PromotionAttestationBuilder.CreateCanonicalJson(predicate);
var json = Encoding.UTF8.GetString(bytes);
json.Should().Contain("\"ImageDigest\":\"sha256:img\"");
}
}

View File

@@ -0,0 +1,47 @@
using System;
using System.Text;
using System.Threading.Tasks;
using System.Collections.Generic;
using FluentAssertions;
using StellaOps.Provenance.Attestation;
using Xunit;
namespace StellaOps.Provenance.Attestation.Tests;
public class SignerTests
{
[Fact]
public async Task HmacSigner_is_deterministic_for_same_input()
{
var key = new InMemoryKeyProvider("test-key", Encoding.UTF8.GetBytes("secret"));
var audit = new InMemoryAuditSink();
var signer = new HmacSigner(key, audit, TimeProvider.System);
var request = new SignRequest(Encoding.UTF8.GetBytes("payload"), "application/json");
var r1 = await signer.SignAsync(request);
var r2 = await signer.SignAsync(request);
r1.Signature.Should().BeEquivalentTo(r2.Signature);
r1.KeyId.Should().Be("test-key");
audit.Signed.Should().HaveCount(2);
}
[Fact]
public async Task HmacSigner_enforces_required_claims()
{
var key = new InMemoryKeyProvider("test-key", Encoding.UTF8.GetBytes("secret"));
var audit = new InMemoryAuditSink();
var signer = new HmacSigner(key, audit, TimeProvider.System);
var request = new SignRequest(
Payload: Encoding.UTF8.GetBytes("payload"),
ContentType: "application/json",
Claims: new Dictionary<string, string> { ["foo"] = "bar" },
RequiredClaims: new[] { "foo", "bar" });
var ex = await Assert.ThrowsAsync<InvalidOperationException>(() => signer.SignAsync(request));
ex.Message.Should().Contain("bar");
audit.Missing.Should().ContainSingle(m => m.claim == "bar");
}
}

View File

@@ -0,0 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<IsPackable>false</IsPackable>
<Nullable>enable</Nullable>
<LangVersion>preview</LangVersion>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../../StellaOps.Provenance.Attestation/StellaOps.Provenance.Attestation.csproj" />
<PackageReference Include="FluentAssertions" Version="6.12.0" />
<PackageReference Include="xunit" Version="2.7.0" />
<PackageReference Include="xunit.runner.visualstudio" Version="2.5.8" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,42 @@
using System.Text;
using FluentAssertions;
using StellaOps.Provenance.Attestation;
using Xunit;
namespace StellaOps.Provenance.Attestation.Tests;
public class VerificationTests
{
[Fact]
public async Task Verifier_accepts_valid_signature()
{
var key = new InMemoryKeyProvider("test-key", Encoding.UTF8.GetBytes("secret"));
var signer = new HmacSigner(key);
var verifier = new HmacVerifier(key);
var request = new SignRequest(Encoding.UTF8.GetBytes("payload"), "application/json");
var signature = await signer.SignAsync(request);
var result = await verifier.VerifyAsync(request, signature);
result.IsValid.Should().BeTrue();
result.Reason.Should().Be("ok");
}
[Fact]
public async Task Verifier_rejects_tampered_payload()
{
var key = new InMemoryKeyProvider("test-key", Encoding.UTF8.GetBytes("secret"));
var signer = new HmacSigner(key);
var verifier = new HmacVerifier(key);
var request = new SignRequest(Encoding.UTF8.GetBytes("payload"), "application/json");
var signature = await signer.SignAsync(request);
var tampered = new SignRequest(Encoding.UTF8.GetBytes("payload-tampered"), "application/json");
var result = await verifier.VerifyAsync(tampered, signature);
result.IsValid.Should().BeFalse();
result.Reason.Should().Contain("mismatch");
}
}

View File

@@ -0,0 +1,215 @@
using System.Text.Json;
namespace StellaOps.Scanner.Analyzers.Lang.DotNet.Internal;
/// <summary>
/// Resolves publish artifacts (deps/runtimeconfig) into deterministic entrypoint identities.
/// </summary>
public static class DotNetEntrypointResolver
{
private static readonly EnumerationOptions Enumeration = new()
{
RecurseSubdirectories = true,
IgnoreInaccessible = true,
AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint
};
public static ValueTask<IReadOnlyList<DotNetEntrypoint>> ResolveAsync(
LanguageAnalyzerContext context,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
var depsFiles = Directory
.EnumerateFiles(context.RootPath, "*.deps.json", Enumeration)
.OrderBy(static path => path, StringComparer.Ordinal)
.ToArray();
if (depsFiles.Length == 0)
{
return ValueTask.FromResult<IReadOnlyList<DotNetEntrypoint>>(Array.Empty<DotNetEntrypoint>());
}
var results = new List<DotNetEntrypoint>(depsFiles.Length);
foreach (var depsPath in depsFiles)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
var relativeDepsPath = NormalizeRelative(context.GetRelativePath(depsPath));
var depsFile = DotNetDepsFile.Load(depsPath, relativeDepsPath, cancellationToken);
if (depsFile is null)
{
continue;
}
DotNetRuntimeConfig? runtimeConfig = null;
var runtimeConfigPath = Path.ChangeExtension(depsPath, ".runtimeconfig.json");
string? relativeRuntimeConfig = null;
if (!string.IsNullOrEmpty(runtimeConfigPath) && File.Exists(runtimeConfigPath))
{
relativeRuntimeConfig = NormalizeRelative(context.GetRelativePath(runtimeConfigPath));
runtimeConfig = DotNetRuntimeConfig.Load(runtimeConfigPath, relativeRuntimeConfig, cancellationToken);
}
var tfms = CollectTargetFrameworks(depsFile, runtimeConfig);
var rids = CollectRuntimeIdentifiers(depsFile, runtimeConfig);
var publishKind = DeterminePublishKind(depsFile);
var name = GetEntrypointName(depsPath);
var id = BuildDeterministicId(name, tfms, rids, publishKind);
results.Add(new DotNetEntrypoint(
Id: id,
Name: name,
TargetFrameworks: tfms,
RuntimeIdentifiers: rids,
RelativeDepsPath: relativeDepsPath,
RelativeRuntimeConfigPath: relativeRuntimeConfig,
PublishKind: publishKind));
}
catch (IOException)
{
continue;
}
catch (JsonException)
{
continue;
}
catch (UnauthorizedAccessException)
{
continue;
}
}
return ValueTask.FromResult<IReadOnlyList<DotNetEntrypoint>>(results);
}
private static string GetEntrypointName(string depsPath)
{
// Strip .json then any trailing .deps suffix to yield a logical entrypoint name.
var stem = Path.GetFileNameWithoutExtension(depsPath); // removes .json
if (stem.EndsWith(".deps", StringComparison.OrdinalIgnoreCase))
{
stem = stem[..^".deps".Length];
}
return stem;
}
private static IReadOnlyCollection<string> CollectTargetFrameworks(DotNetDepsFile depsFile, DotNetRuntimeConfig? runtimeConfig)
{
var tfms = new SortedSet<string>(StringComparer.OrdinalIgnoreCase);
foreach (var library in depsFile.Libraries.Values)
{
foreach (var tfm in library.TargetFrameworks)
{
tfms.Add(tfm);
}
}
if (runtimeConfig is not null)
{
foreach (var tfm in runtimeConfig.Tfms)
{
tfms.Add(tfm);
}
foreach (var framework in runtimeConfig.Frameworks)
{
tfms.Add(framework);
}
}
return tfms;
}
private static IReadOnlyCollection<string> CollectRuntimeIdentifiers(DotNetDepsFile depsFile, DotNetRuntimeConfig? runtimeConfig)
{
var rids = new SortedSet<string>(StringComparer.OrdinalIgnoreCase);
foreach (var library in depsFile.Libraries.Values)
{
foreach (var rid in library.RuntimeIdentifiers)
{
rids.Add(rid);
}
}
if (runtimeConfig is not null)
{
foreach (var entry in runtimeConfig.RuntimeGraph)
{
rids.Add(entry.Rid);
foreach (var fallback in entry.Fallbacks)
{
if (!string.IsNullOrWhiteSpace(fallback))
{
rids.Add(fallback);
}
}
}
}
return rids;
}
private static DotNetPublishKind DeterminePublishKind(DotNetDepsFile depsFile)
{
foreach (var library in depsFile.Libraries.Values)
{
if (library.Id.StartsWith("Microsoft.NETCore.App.Runtime.", StringComparison.OrdinalIgnoreCase) ||
library.Id.StartsWith("Microsoft.WindowsDesktop.App.Runtime.", StringComparison.OrdinalIgnoreCase))
{
return DotNetPublishKind.SelfContained;
}
}
return DotNetPublishKind.FrameworkDependent;
}
private static string BuildDeterministicId(
string name,
IReadOnlyCollection<string> tfms,
IReadOnlyCollection<string> rids,
DotNetPublishKind publishKind)
{
var tfmPart = tfms.Count == 0 ? "unknown" : string.Join('+', tfms.OrderBy(t => t, StringComparer.OrdinalIgnoreCase));
var ridPart = rids.Count == 0 ? "none" : string.Join('+', rids.OrderBy(r => r, StringComparer.OrdinalIgnoreCase));
var publishPart = publishKind.ToString().ToLowerInvariant();
return $"{name}:{tfmPart}:{ridPart}:{publishPart}";
}
private static string NormalizeRelative(string path)
{
if (string.IsNullOrWhiteSpace(path) || path == ".")
{
return ".";
}
var normalized = path.Replace('\\', '/');
return string.IsNullOrWhiteSpace(normalized) ? "." : normalized;
}
}
public sealed record DotNetEntrypoint(
string Id,
string Name,
IReadOnlyCollection<string> TargetFrameworks,
IReadOnlyCollection<string> RuntimeIdentifiers,
string RelativeDepsPath,
string? RelativeRuntimeConfigPath,
DotNetPublishKind PublishKind);
public enum DotNetPublishKind
{
Unknown = 0,
FrameworkDependent = 1,
SelfContained = 2
}
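`BuildDeterministicId` concatenates the entrypoint name with case-insensitively sorted, `+`-joined TFM and RID sets and a lowercased publish kind, so the identity is stable regardless of enumeration order. A Python sketch of the same formula (approximating the ordinal-ignore-case sort with a lowercase key; illustration only):

```python
def build_id(name: str, tfms: list[str], rids: list[str], publish_kind: str) -> str:
    # Sorted joins make the ID independent of input ordering.
    tfm_part = "+".join(sorted(tfms, key=str.lower)) if tfms else "unknown"
    rid_part = "+".join(sorted(rids, key=str.lower)) if rids else "none"
    return f"{name}:{tfm_part}:{rid_part}:{publish_kind.lower()}"

print(build_id("Sample.App", ["net10.0"], ["linux-x64", "any"], "FrameworkDependent"))
# → Sample.App:net10.0:any+linux-x64:frameworkdependent
```

Because both the file enumeration (`OrderBy` on path) and the ID parts are sorted, two resolver runs over the same tree emit identical entrypoint lists, which the determinism test below relies on.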

View File

@@ -0,0 +1,7 @@
namespace StellaOps.Scanner.Analyzers.Lang;
public sealed class AnalysisSnapshot
{
public IReadOnlyList<LanguageEntrypointSnapshot> Entrypoints { get; set; } = Array.Empty<LanguageEntrypointSnapshot>();
public IReadOnlyList<LanguageComponentSnapshot> Components { get; set; } = Array.Empty<LanguageComponentSnapshot>();
}

View File

@@ -0,0 +1,15 @@
using System;
namespace StellaOps.Scanner.Analyzers.Lang;
internal static class LanguageComponentEvidenceExtensions
{
/// <summary>
/// Builds a stable key for evidence items to support deterministic dictionaries.
/// </summary>
public static string ToKey(this LanguageComponentEvidence evidence)
{
ArgumentNullException.ThrowIfNull(evidence);
return $"{evidence.Kind}:{evidence.Source}:{evidence.Locator}".ToLowerInvariant();
}
}

View File

@@ -0,0 +1,96 @@
using System.Text.Json.Serialization;
namespace StellaOps.Scanner.Analyzers.Lang;
public sealed class LanguageEntrypointRecord
{
private readonly SortedDictionary<string, string?> _metadata;
private readonly SortedDictionary<string, LanguageComponentEvidence> _evidence;
public LanguageEntrypointRecord(
string id,
string name,
IEnumerable<KeyValuePair<string, string?>>? metadata = null,
IEnumerable<LanguageComponentEvidence>? evidence = null)
{
if (string.IsNullOrWhiteSpace(id))
{
throw new ArgumentException("Entrypoint id is required", nameof(id));
}
Id = id.Trim();
Name = string.IsNullOrWhiteSpace(name) ? Id : name.Trim();
_metadata = new SortedDictionary<string, string?>(StringComparer.Ordinal);
foreach (var pair in metadata ?? Array.Empty<KeyValuePair<string, string?>>())
{
_metadata[pair.Key] = pair.Value;
}
_evidence = new SortedDictionary<string, LanguageComponentEvidence>(StringComparer.Ordinal);
foreach (var item in evidence ?? Array.Empty<LanguageComponentEvidence>())
{
var key = item.ToKey();
_evidence[key] = item;
}
}
public string Id { get; }
public string Name { get; }
public IReadOnlyDictionary<string, string?> Metadata => _metadata;
public IReadOnlyCollection<LanguageComponentEvidence> Evidence => _evidence.Values;
internal LanguageEntrypointSnapshot ToSnapshot()
=> new()
{
Id = Id,
Name = Name,
Metadata = _metadata.ToDictionary(static kvp => kvp.Key, static kvp => kvp.Value, StringComparer.Ordinal),
Evidence = _evidence.Values
.Select(static item => new LanguageComponentEvidenceSnapshot
{
Kind = item.Kind,
Source = item.Source,
Locator = item.Locator,
Value = item.Value,
Sha256 = item.Sha256
})
.ToList()
};
internal static LanguageEntrypointRecord FromSnapshot(LanguageEntrypointSnapshot snapshot)
{
if (snapshot is null)
{
throw new ArgumentNullException(nameof(snapshot));
}
var evidence = snapshot.Evidence?
.Select(static e => new LanguageComponentEvidence(e.Kind, e.Source, e.Locator, e.Value, e.Sha256))
.ToArray();
return new LanguageEntrypointRecord(
snapshot.Id ?? string.Empty,
snapshot.Name ?? snapshot.Id ?? string.Empty,
snapshot.Metadata,
evidence);
}
}
public sealed class LanguageEntrypointSnapshot
{
[JsonPropertyName("id")]
public string? Id { get; set; }
[JsonPropertyName("name")]
public string? Name { get; set; }
[JsonPropertyName("metadata")]
public Dictionary<string, string?> Metadata { get; set; } = new(StringComparer.Ordinal);
[JsonPropertyName("evidence")]
public List<LanguageComponentEvidenceSnapshot> Evidence { get; set; } = new();
}

View File

@@ -0,0 +1,5 @@
# EntryTrace Tasks
| Task ID | Status | Date | Summary |
| --- | --- | --- | --- |
| SCANNER-ENG-0008 | DONE | 2025-11-16 | Documented quarterly EntryTrace heuristic cadence and workflow; attached to Sprint 0138 Execution Log. |

View File

@@ -0,0 +1,51 @@
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Scanner.Analyzers.Lang;
using StellaOps.Scanner.Analyzers.Lang.DotNet.Internal;
using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities;
namespace StellaOps.Scanner.Analyzers.Lang.Tests.DotNet;
public sealed class DotNetEntrypointResolverTests
{
[Fact]
public async Task SimpleFixtureResolvesSingleEntrypointAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "dotnet", "simple");
var context = new LanguageAnalyzerContext(fixturePath, TimeProvider.System);
var entrypoints = await DotNetEntrypointResolver.ResolveAsync(context, cancellationToken);
Assert.Single(entrypoints);
var entrypoint = entrypoints[0];
Assert.Equal("Sample.App", entrypoint.Name);
Assert.Equal("Sample.App:Microsoft.AspNetCore.App@10.0.0+Microsoft.NETCore.App@10.0.0+net10.0:any+linux+linux-x64+unix+win+win-x86:frameworkdependent", entrypoint.Id);
Assert.Contains("net10.0", entrypoint.TargetFrameworks);
Assert.Contains("linux-x64", entrypoint.RuntimeIdentifiers);
Assert.Equal("Sample.App.deps.json", entrypoint.RelativeDepsPath);
Assert.Equal("Sample.App.runtimeconfig.json", entrypoint.RelativeRuntimeConfigPath);
Assert.Equal(DotNetPublishKind.FrameworkDependent, entrypoint.PublishKind);
}
[Fact]
public async Task DeterministicOrderingIsStableAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "dotnet", "multi");
var context = new LanguageAnalyzerContext(fixturePath, TimeProvider.System);
var first = await DotNetEntrypointResolver.ResolveAsync(context, cancellationToken);
var second = await DotNetEntrypointResolver.ResolveAsync(context, cancellationToken);
Assert.Equal(first.Count, second.Count);
for (var i = 0; i < first.Count; i++)
{
Assert.Equal(first[i].Id, second[i].Id);
Assert.True(first[i].TargetFrameworks.SequenceEqual(second[i].TargetFrameworks));
Assert.True(first[i].RuntimeIdentifiers.SequenceEqual(second[i].RuntimeIdentifiers));
}
}
}

View File

@@ -0,0 +1,9 @@
# Active Tasks
| ID | Status | Owner(s) | Depends on | Description | Notes |
|----|--------|----------|------------|-------------|-------|
| SCHED-WORKER-23-101 | BLOCKED (2025-11-17) | Scheduler Worker Guild | SCHED-WORKER-21-203 | Implement policy re-evaluation worker that shards assets, honours rate limits, and updates progress for Console after policy activation events. | Waiting on Policy guild contract for activation event shape and throttle source. |
| SCHED-WORKER-15-401 | DONE (2025-11-17) | Scheduler Worker Guild | — | Investigate and stabilize PlannerBackgroundService fairness tests (tenant fairness cap; manual/event trigger priority). | Increased monotonic wait tolerance; tests now stable. |
| SCHED-WORKER-99-901 | DONE (2025-11-17) | Scheduler Worker Guild | — | Harden PolicyRunTargetingService coverage for incremental delta rules (MaxSboms, selector replay). | Added focused unit tests + deterministic stubs. |
| SCHED-SURFACE-01 | BLOCKED (2025-11-17) | Scheduler Worker Guild | — | Evaluate Surface.FS pointers when planning delta scans to avoid redundant work and prioritise drift-triggered assets. | Blocked: Surface.FS pointer schema/data source not documented; need contract from Surface/Policy guild. |
| SCHED-WORKER-99-902 | DONE (2025-11-17) | Scheduler Worker Guild | — | Housekeeping: add guard test to PolicyRunDispatchBackgroundService to ensure disabled policy mode performs no leases. | Added stub repository & clients; verified no lease attempts when disabled. |

View File

@@ -0,0 +1,130 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using StellaOps.Scheduler.Models;
using StellaOps.Scheduler.Storage.Mongo.Repositories;
using StellaOps.Scheduler.Worker.Options;
using StellaOps.Scheduler.Worker.Policy;
namespace StellaOps.Scheduler.Worker.Tests;
public sealed class PolicyRunDispatchBackgroundServiceTests
{
[Fact]
public async Task ExecuteAsync_DoesNotLease_WhenPolicyDispatchDisabled()
{
var repository = new RecordingPolicyRunJobRepository();
var options = CreateOptions(policyEnabled: false, idleDelay: TimeSpan.FromMilliseconds(1));
var service = CreateService(repository, options);
using var cts = new CancellationTokenSource(TimeSpan.FromMilliseconds(50));
await service.StartAsync(cts.Token);
await service.StopAsync(CancellationToken.None);
Assert.Equal(0, repository.LeaseAttempts);
}
private static PolicyRunDispatchBackgroundService CreateService(
RecordingPolicyRunJobRepository repository,
SchedulerWorkerOptions options)
{
var executionService = new PolicyRunExecutionService(
repository,
new StubPolicyRunClient(),
Options.Create(options),
timeProvider: null,
new SchedulerWorkerMetrics(),
new StubPolicyRunTargetingService(),
new StubPolicySimulationWebhookClient(),
NullLogger<PolicyRunExecutionService>.Instance);
return new PolicyRunDispatchBackgroundService(
repository,
executionService,
Options.Create(options),
timeProvider: null,
NullLogger<PolicyRunDispatchBackgroundService>.Instance);
}
private static SchedulerWorkerOptions CreateOptions(bool policyEnabled, TimeSpan idleDelay)
{
var options = new SchedulerWorkerOptions();
options.Policy.Enabled = policyEnabled;
options.Policy.Dispatch.IdleDelay = idleDelay;
options.Policy.Dispatch.BatchSize = 1;
return options;
}
private sealed class RecordingPolicyRunJobRepository : IPolicyRunJobRepository
{
public int LeaseAttempts { get; private set; }
public Task InsertAsync(PolicyRunJob job, IClientSessionHandle? session = null, CancellationToken cancellationToken = default)
=> Task.CompletedTask;
public Task<PolicyRunJob?> GetAsync(string tenantId, string jobId, IClientSessionHandle? session = null, CancellationToken cancellationToken = default)
=> Task.FromResult<PolicyRunJob?>(null);
public Task<PolicyRunJob?> GetByRunIdAsync(string tenantId, string runId, IClientSessionHandle? session = null, CancellationToken cancellationToken = default)
=> Task.FromResult<PolicyRunJob?>(null);
public Task<PolicyRunJob?> LeaseAsync(
string leaseOwner,
DateTimeOffset now,
TimeSpan leaseDuration,
int maxAttempts,
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
{
LeaseAttempts++; // Interlocked needs a field, not a property; a plain increment suffices for this test stub
return Task.FromResult<PolicyRunJob?>(null);
}
public Task<IReadOnlyList<PolicyRunJob>> ListAsync(
string tenantId,
string? policyId = null,
PolicyRunMode? mode = null,
IReadOnlyCollection<PolicyRunJobStatus>? statuses = null,
DateTimeOffset? queuedAfter = null,
int limit = 50,
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
=> Task.FromResult<IReadOnlyList<PolicyRunJob>>(Array.Empty<PolicyRunJob>());
public Task<bool> ReplaceAsync(
PolicyRunJob job,
string? expectedLeaseOwner = null,
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
=> Task.FromResult(true);
public Task<long> CountAsync(
string tenantId,
PolicyRunMode mode,
IReadOnlyCollection<PolicyRunJobStatus> statuses,
CancellationToken cancellationToken = default)
=> Task.FromResult(0L);
}
private sealed class StubPolicyRunClient : IPolicyRunClient
{
public Task<PolicyRunSubmissionResult> SubmitAsync(PolicyRunJob job, PolicyRunRequest request, CancellationToken cancellationToken)
=> Task.FromResult(PolicyRunSubmissionResult.Failed("disabled"));
}
private sealed class StubPolicyRunTargetingService : IPolicyRunTargetingService
{
public Task<PolicyRunTargetingResult> EnsureTargetsAsync(PolicyRunJob job, CancellationToken cancellationToken)
=> Task.FromResult(PolicyRunTargetingResult.Unchanged(job));
}
private sealed class StubPolicySimulationWebhookClient : IPolicySimulationWebhookClient
{
public Task NotifyAsync(PolicySimulationWebhookPayload payload, CancellationToken cancellationToken)
=> Task.CompletedTask;
}
}

View File

@@ -0,0 +1,62 @@
using MongoDB.Driver;
using StellaOps.TaskRunner.Infrastructure.Execution;
using Xunit;
namespace StellaOps.TaskRunner.Tests;
public sealed class MongoIndexModelTests
{
[Fact]
public void StateStore_indexes_match_contract()
{
var models = MongoPackRunStateStore.GetIndexModels().ToArray();
Assert.Collection(models,
model => Assert.Equal("pack_runs_updatedAt_desc", model.Options.Name),
model => Assert.Equal("pack_runs_tenant_updatedAt_desc", model.Options.Name));
Assert.True(models[1].Options.Sparse ?? false);
}
[Fact]
public void LogStore_indexes_match_contract()
{
var models = MongoPackRunLogStore.GetIndexModels().ToArray();
Assert.Collection(models,
model =>
{
Assert.Equal("pack_run_logs_run_sequence", model.Options.Name);
Assert.True(model.Options.Unique ?? false);
},
model => Assert.Equal("pack_run_logs_run_timestamp", model.Options.Name));
}
[Fact]
public void ArtifactStore_indexes_match_contract()
{
var models = MongoPackRunArtifactUploader.GetIndexModels().ToArray();
Assert.Collection(models,
model =>
{
Assert.Equal("pack_artifacts_run_name", model.Options.Name);
Assert.True(model.Options.Unique ?? false);
},
model => Assert.Equal("pack_artifacts_run", model.Options.Name));
}
[Fact]
public void ApprovalStore_indexes_match_contract()
{
var models = MongoPackRunApprovalStore.GetIndexModels().ToArray();
Assert.Collection(models,
model =>
{
Assert.Equal("pack_run_approvals_run_approval", model.Options.Name);
Assert.True(model.Options.Unique ?? false);
},
model => Assert.Equal("pack_run_approvals_run_status", model.Options.Name));
}
}

View File

@@ -0,0 +1,27 @@
using StellaOps.TaskRunner.WebService;
namespace StellaOps.TaskRunner.Tests;
public sealed class OpenApiMetadataFactoryTests
{
[Fact]
public void Create_ProducesExpectedDefaults()
{
var metadata = OpenApiMetadataFactory.Create();
Assert.Equal("/openapi", metadata.Url);
Assert.False(string.IsNullOrWhiteSpace(metadata.Build));
Assert.StartsWith("W/\"", metadata.ETag);
Assert.EndsWith("\"", metadata.ETag);
Assert.Equal(64, metadata.Signature.Length);
Assert.True(metadata.Signature.All(c => char.IsDigit(c) || (c >= 'a' && c <= 'f')));
}
[Fact]
public void Create_AllowsOverrideUrl()
{
var metadata = OpenApiMetadataFactory.Create("/docs/openapi.json");
Assert.Equal("/docs/openapi.json", metadata.Url);
}
}

View File

@@ -0,0 +1,38 @@
using System.Reflection;
namespace StellaOps.TaskRunner.WebService;
internal static class OpenApiMetadataFactory
{
internal static Type ResponseType => typeof(OpenApiMetadata);
public static OpenApiMetadata Create(string? specUrl = null)
{
var assembly = Assembly.GetExecutingAssembly().GetName();
var version = assembly.Version?.ToString() ?? "0.0.0";
var url = string.IsNullOrWhiteSpace(specUrl) ? "/openapi" : specUrl;
var etag = CreateWeakEtag(version);
var signature = ComputeSignature(url, version);
return new OpenApiMetadata(url, version, etag, signature);
}
private static string CreateWeakEtag(string input)
{
if (string.IsNullOrWhiteSpace(input))
{
input = "0.0.0";
}
return $"W/\"{input}\"";
}
private static string ComputeSignature(string url, string build)
{
var data = System.Text.Encoding.UTF8.GetBytes(url + build);
var hash = System.Security.Cryptography.SHA256.HashData(data);
return Convert.ToHexString(hash).ToLowerInvariant();
}
internal sealed record OpenApiMetadata(string Url, string Build, string ETag, string Signature);
}
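The factory above derives a weak ETag from the assembly version and a lowercase hex SHA-256 signature over `url + build`, which is why the tests expect a `W/"…"` wrapper and a 64-character hex string. A Python sketch of the same derivation (illustrative; the real values come from the executing assembly):

```python
import hashlib

def openapi_metadata(url: str = "/openapi", build: str = "0.0.0") -> dict:
    etag = f'W/"{build}"'  # weak ETag wrapping the build version
    signature = hashlib.sha256((url + build).encode("utf-8")).hexdigest()  # 64 lowercase hex chars
    return {"url": url, "build": build, "etag": etag, "signature": signature}

meta = openapi_metadata()
assert meta["etag"].startswith('W/"') and meta["etag"].endswith('"')
assert len(meta["signature"]) == 64
```

Folding the URL into the signature means an override like `/docs/openapi.json` yields a different signature for the same build.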

View File

@@ -0,0 +1,58 @@
using Microsoft.Extensions.Options;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.Secrets;
using StellaOps.Zastava.Core.Configuration;
using StellaOps.Zastava.Observer.Configuration;
namespace StellaOps.Zastava.Observer.Secrets;
internal interface IObserverSurfaceSecrets
{
ValueTask<CasAccessSecret> GetCasAccessAsync(string? name, CancellationToken cancellationToken = default);
ValueTask<AttestationSecret> GetAttestationAsync(string? name, CancellationToken cancellationToken = default);
}
internal sealed class ObserverSurfaceSecrets : IObserverSurfaceSecrets
{
private const string Component = "Zastava.Observer";
private readonly ISurfaceSecretProvider _provider;
private readonly IOptions<ZastavaRuntimeOptions> _runtime;
private readonly IOptions<ZastavaObserverOptions> _observer;
public ObserverSurfaceSecrets(
ISurfaceSecretProvider provider,
IOptions<ZastavaRuntimeOptions> runtime,
IOptions<ZastavaObserverOptions> observer)
{
_provider = provider ?? throw new ArgumentNullException(nameof(provider));
_runtime = runtime ?? throw new ArgumentNullException(nameof(runtime));
_observer = observer ?? throw new ArgumentNullException(nameof(observer));
}
public async ValueTask<CasAccessSecret> GetCasAccessAsync(string? name, CancellationToken cancellationToken = default)
{
var options = _observer.Value.Secrets;
var request = new SurfaceSecretRequest(
Tenant: _runtime.Value.Tenant,
Component: Component,
SecretType: "cas-access",
Name: string.IsNullOrWhiteSpace(name) ? options.CasAccessName : name);
var handle = await _provider.GetAsync(request, cancellationToken).ConfigureAwait(false);
return SurfaceSecretParser.ParseCasAccessSecret(handle);
}
public async ValueTask<AttestationSecret> GetAttestationAsync(string? name, CancellationToken cancellationToken = default)
{
var options = _observer.Value.Secrets;
var request = new SurfaceSecretRequest(
Tenant: _runtime.Value.Tenant,
Component: Component,
SecretType: "attestation",
Name: string.IsNullOrWhiteSpace(name) ? options.AttestationName : name);
var handle = await _provider.GetAsync(request, cancellationToken).ConfigureAwait(false);
return SurfaceSecretParser.ParseAttestationSecret(handle);
}
}


@@ -0,0 +1,32 @@
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.FS;
namespace StellaOps.Zastava.Observer.Surface;
internal interface IRuntimeSurfaceFsClient
{
Task<SurfaceManifestDocument?> TryGetManifestAsync(string manifestDigest, CancellationToken cancellationToken = default);
}
internal sealed class RuntimeSurfaceFsClient : IRuntimeSurfaceFsClient
{
private readonly ISurfaceManifestReader _manifestReader;
private readonly SurfaceEnvironmentSettings _environment;
public RuntimeSurfaceFsClient(ISurfaceManifestReader manifestReader, SurfaceEnvironmentSettings environment)
{
_manifestReader = manifestReader ?? throw new ArgumentNullException(nameof(manifestReader));
_environment = environment ?? throw new ArgumentNullException(nameof(environment));
}
public Task<SurfaceManifestDocument?> TryGetManifestAsync(string manifestDigest, CancellationToken cancellationToken = default)
{
if (string.IsNullOrWhiteSpace(manifestDigest))
{
return Task.FromResult<SurfaceManifestDocument?>(null);
}
// manifest digests follow sha256:<hex>; manifest reader handles validation and tenant discovery
return _manifestReader.TryGetByDigestAsync(manifestDigest.Trim(), cancellationToken);
}
}
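The inline comment notes that manifest digests follow `sha256:<hex>` and that the manifest reader owns validation. For callers that want to fail fast before reaching the reader, a pre-check could look like the following sketch (the helper name is hypothetical, not part of this diff):

```csharp
using System;
using System.Text.RegularExpressions;

// Cheap shape check for the "sha256:<hex>" digest format the
// RuntimeSurfaceFsClient comment describes: 64 lowercase hex chars.
static bool LooksLikeSha256Digest(string? digest) =>
    digest is not null
    && Regex.IsMatch(digest.Trim(), "^sha256:[0-9a-f]{64}$");

Console.WriteLine(LooksLikeSha256Digest("sha256:" + new string('a', 64))); // True
Console.WriteLine(LooksLikeSha256Digest("md5:abc"));                       // False
```

This is advisory only; authoritative validation stays in the manifest reader, matching the division of responsibility in the client.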


@@ -0,0 +1,43 @@
using Microsoft.Extensions.Options;
using StellaOps.Scanner.Surface.Secrets;
using StellaOps.Zastava.Core.Configuration;
using StellaOps.Zastava.Webhook.Configuration;
namespace StellaOps.Zastava.Webhook.Secrets;
internal interface IWebhookSurfaceSecrets
{
ValueTask<AttestationSecret> GetAttestationAsync(string? name, CancellationToken cancellationToken = default);
}
internal sealed class WebhookSurfaceSecrets : IWebhookSurfaceSecrets
{
private const string Component = "Zastava.Webhook";
private readonly ISurfaceSecretProvider _provider;
private readonly IOptions<ZastavaRuntimeOptions> _runtime;
private readonly IOptions<ZastavaWebhookOptions> _webhook;
public WebhookSurfaceSecrets(
ISurfaceSecretProvider provider,
IOptions<ZastavaRuntimeOptions> runtime,
IOptions<ZastavaWebhookOptions> webhook)
{
_provider = provider ?? throw new ArgumentNullException(nameof(provider));
_runtime = runtime ?? throw new ArgumentNullException(nameof(runtime));
_webhook = webhook ?? throw new ArgumentNullException(nameof(webhook));
}
public async ValueTask<AttestationSecret> GetAttestationAsync(string? name, CancellationToken cancellationToken = default)
{
var options = _webhook.Value.Secrets;
var request = new SurfaceSecretRequest(
Tenant: _runtime.Value.Tenant,
Component: Component,
SecretType: "attestation",
Name: string.IsNullOrWhiteSpace(name) ? options.AttestationName : name);
var handle = await _provider.GetAsync(request, cancellationToken).ConfigureAwait(false);
return SurfaceSecretParser.ParseAttestationSecret(handle);
}
}
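Both `ObserverSurfaceSecrets` and `WebhookSurfaceSecrets` apply the same name-resolution rule when building a `SurfaceSecretRequest`: an explicit, non-blank name wins, otherwise the configured default applies. Isolated as a standalone sketch (the helper and sample names are illustrative):

```csharp
using System;

// The fallback rule shared by both secret clients:
// string.IsNullOrWhiteSpace(name) ? options.XxxName : name
static string ResolveSecretName(string? requested, string configuredDefault) =>
    string.IsNullOrWhiteSpace(requested) ? configuredDefault : requested;

Console.WriteLine(ResolveSecretName(null, "attestation-default"));     // attestation-default
Console.WriteLine(ResolveSecretName("   ", "attestation-default"));    // attestation-default
Console.WriteLine(ResolveSecretName("custom", "attestation-default")); // custom
```

Whitespace-only names fall back too, which keeps misconfigured blank values from being sent to the secret provider verbatim.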

Some files were not shown because too many files have changed in this diff.