Add tests and implement timeline ingestion options with NATS and Redis subscribers

- Introduced `BinaryReachabilityLifterTests` to validate binary lifting functionality.
- Created `PackRunWorkerOptions` for configuring worker paths and execution persistence.
- Added `TimelineIngestionOptions` for configuring NATS and Redis ingestion transports.
- Implemented `NatsTimelineEventSubscriber` for subscribing to NATS events.
- Developed `RedisTimelineEventSubscriber` for reading from Redis Streams.
- Added `TimelineEnvelopeParser` to normalize incoming event envelopes.
- Created unit tests for `TimelineEnvelopeParser` to ensure correct field mapping.
- Implemented `TimelineAuthorizationAuditSink` for logging authorization outcomes.
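The summary above mentions `TimelineEnvelopeParser` normalizing incoming event envelopes. A minimal sketch of what such normalization could look like follows; the record shape and the field names (`eventId`, `tenant`, `occurredAt`) are illustrative assumptions, not the actual contract shipped in this commit:

```csharp
using System;
using System.Text.Json;

// Hypothetical normalized envelope; the real record lives in the commit's sources.
public sealed record TimelineEnvelope(string EventId, string Tenant, DateTimeOffset OccurredAt);

public static class TimelineEnvelopeParserSketch
{
    // Parses a raw JSON envelope, rejecting envelopes without an id and
    // falling back to defaults for optional fields.
    public static TimelineEnvelope? TryParse(string json)
    {
        using var doc = JsonDocument.Parse(json);
        var root = doc.RootElement;
        if (!root.TryGetProperty("eventId", out var id) || id.ValueKind != JsonValueKind.String)
        {
            return null; // an envelope without an id cannot be correlated
        }
        var tenant = root.TryGetProperty("tenant", out var t) && t.ValueKind == JsonValueKind.String
            ? t.GetString()!
            : "unknown";
        var occurredAt = root.TryGetProperty("occurredAt", out var ts)
            && ts.ValueKind == JsonValueKind.String
            && DateTimeOffset.TryParse(ts.GetString(), out var parsed)
            ? parsed.ToUniversalTime()
            : DateTimeOffset.UtcNow;
        return new TimelineEnvelope(id.GetString()!, tenant, occurredAt);
    }
}
```

The same normalized record would then be handed to both the NATS and Redis subscribers, keeping transport-specific decoding out of downstream consumers.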
Author: StellaOps Bot
Date: 2025-12-03 09:46:48 +02:00
Commit: 35c8f9216f (parent: e923880694)
520 changed files with 4416 additions and 31492 deletions


@@ -1,40 +0,0 @@
# AGENTS
## Role
Canonical persistence for raw documents, DTOs, canonical advisories, jobs, and state. Provides repositories and bootstrapper for collections/indexes.
## Scope
- Collections (MongoStorageDefaults): source, source_state, document, dto, advisory, alias, affected, reference, kev_flag, ru_flags, jp_flags, psirt_flags, merge_event, export_state, locks, jobs; GridFS bucket fs.documents; field names include ttlAt (locks), sourceName, uri, advisoryKey.
- Records: SourceState (cursor, lastSuccess/error, failCount, backoffUntil), JobRun, MergeEvent, ExportState, Advisory documents mirroring Models with embedded arrays when practical.
- Bootstrapper: create collections, indexes (unique advisoryKey, scheme/value, platform/name, published, modified), TTL on locks, and validate connectivity for /ready health probes.
- Job store: create, read, mark completed/failed; compute durations; recent/last queries; active by status.
- Advisory store: CRUD for canonical advisories; query by key/alias and list for exporters with deterministic paging.
## Participants
- Core jobs read/write runs and leases; WebService /ready pings database; /jobs APIs query runs/definitions.
- Source connectors store raw docs, DTOs, and mapped canonical advisories with provenance; Update SourceState cursor/backoff.
- Exporters read advisories and write export_state.
## Interfaces & contracts
- IMongoDatabase injected; MongoUrl from options; database name from options or MongoUrl or default "concelier".
- Repositories expose async methods with CancellationToken; deterministic sorting.
- All date/time values stored as UTC; identifiers normalized.
## In/Out of scope
In: persistence, bootstrap, indexes, basic query helpers.
Out: business mapping logic, HTTP, packaging.
## Observability & security expectations
- Log collection/index creation; warn on existing mismatches.
- Timeouts and retry policies; avoid unbounded scans; page reads.
- Do not log DSNs with credentials; redact in diagnostics.
## Tests
- Author and review coverage in `../StellaOps.Concelier.Storage.Mongo.Tests`.
- Shared fixtures (e.g., `MongoIntegrationFixture`, `ConnectorTestHarness`) live in `../StellaOps.Concelier.Testing`.
- Keep fixtures deterministic; match new cases to real-world advisories or regression scenarios.
## Required Reading
- `docs/modules/concelier/architecture.md`
- `docs/modules/platform/architecture-overview.md`
## Working Agreement
- 1. Update task status to `DOING`/`DONE` in both the corresponding sprint file `/docs/implplan/SPRINT_*.md` and the local `TASKS.md` when you start or finish work.
- 2. Review this charter and the Required Reading documents before coding; confirm prerequisites are met.
- 3. Keep changes deterministic (stable ordering, timestamps, hashes) and align with offline/air-gap expectations.
- 4. Coordinate doc updates, tests, and cross-guild communication whenever contracts or workflows change.
- 5. Revert to `TODO` if you pause the task without shipping changes; leave notes in commit/PR descriptions for context.
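The bootstrapper duties described in this charter (unique `advisoryKey` index, TTL on `locks.ttlAt`) could be sketched as follows; this is an illustrative outline against the MongoDB .NET driver, not the project's actual bootstrapper:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

// Hypothetical sketch: ensure the advisory unique index and the locks TTL index.
public static class BootstrapSketch
{
    public static async Task EnsureIndexesAsync(IMongoDatabase db, CancellationToken ct)
    {
        var advisories = db.GetCollection<BsonDocument>("advisory");
        await advisories.Indexes.CreateOneAsync(
            new CreateIndexModel<BsonDocument>(
                Builders<BsonDocument>.IndexKeys.Ascending("advisoryKey"),
                new CreateIndexOptions { Unique = true }),
            cancellationToken: ct);

        var locks = db.GetCollection<BsonDocument>("locks");
        await locks.Indexes.CreateOneAsync(
            new CreateIndexModel<BsonDocument>(
                Builders<BsonDocument>.IndexKeys.Ascending("ttlAt"),
                // ExpireAfter = zero: documents expire as soon as ttlAt passes
                new CreateIndexOptions { ExpireAfter = TimeSpan.Zero }),
            cancellationToken: ct);
    }
}
```

Index creation is idempotent in MongoDB when the definition matches, which is what lets the bootstrapper run on every startup and back the `/ready` probe.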


@@ -1,32 +0,0 @@
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
[BsonIgnoreExtraElements]
public sealed class AdvisoryDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("advisoryKey")]
public string AdvisoryKey
{
get => Id;
set => Id = value;
}
[BsonElement("payload")]
public BsonDocument Payload { get; set; } = new();
[BsonElement("modified")]
public DateTime Modified { get; set; }
[BsonElement("published")]
public DateTime? Published { get; set; }
[BsonElement("normalizedVersions")]
[BsonIgnoreIfNull]
public List<NormalizedVersionDocument>? NormalizedVersions { get; set; }
}


@@ -1,568 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Storage.Mongo.Aliases;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
public sealed class AdvisoryStore : IAdvisoryStore
{
private readonly IMongoDatabase _database;
private readonly IMongoCollection<AdvisoryDocument> _collection;
private readonly ILogger<AdvisoryStore> _logger;
private readonly IAliasStore _aliasStore;
private readonly TimeProvider _timeProvider;
private readonly MongoStorageOptions _options;
private IMongoCollection<AdvisoryDocument>? _legacyCollection;
public AdvisoryStore(
IMongoDatabase database,
IAliasStore aliasStore,
ILogger<AdvisoryStore> logger,
IOptions<MongoStorageOptions> options,
TimeProvider? timeProvider = null)
{
_database = database ?? throw new ArgumentNullException(nameof(database));
_collection = _database.GetCollection<AdvisoryDocument>(MongoStorageDefaults.Collections.Advisory);
_aliasStore = aliasStore ?? throw new ArgumentNullException(nameof(aliasStore));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_timeProvider = timeProvider ?? TimeProvider.System;
}
public async Task UpsertAsync(Advisory advisory, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(advisory);
var missing = ProvenanceInspector.FindMissingProvenance(advisory);
var primarySource = advisory.Provenance.FirstOrDefault()?.Source ?? "unknown";
foreach (var item in missing)
{
var source = string.IsNullOrWhiteSpace(item.Source) ? primarySource : item.Source;
_logger.LogWarning(
"Missing provenance detected for {Component} in advisory {AdvisoryKey} (source {Source}).",
item.Component,
advisory.AdvisoryKey,
source);
ProvenanceDiagnostics.RecordMissing(source, item.Component, item.RecordedAt, item.FieldMask);
}
var payload = CanonicalJsonSerializer.Serialize(advisory);
var normalizedVersions = _options.EnableSemVerStyle
? NormalizedVersionDocumentFactory.Create(advisory)
: null;
var document = new AdvisoryDocument
{
AdvisoryKey = advisory.AdvisoryKey,
Payload = BsonDocument.Parse(payload),
Modified = advisory.Modified?.UtcDateTime ?? DateTime.UtcNow,
Published = advisory.Published?.UtcDateTime,
NormalizedVersions = normalizedVersions,
};
var options = new ReplaceOptions { IsUpsert = true };
var filter = Builders<AdvisoryDocument>.Filter.Eq(x => x.AdvisoryKey, advisory.AdvisoryKey);
await ReplaceAsync(filter, document, options, session, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted advisory {AdvisoryKey}", advisory.AdvisoryKey);
var aliasEntries = BuildAliasEntries(advisory);
var updatedAt = _timeProvider.GetUtcNow();
await _aliasStore.ReplaceAsync(advisory.AdvisoryKey, aliasEntries, updatedAt, cancellationToken).ConfigureAwait(false);
}
public async Task<Advisory?> FindAsync(string advisoryKey, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
var filter = Builders<AdvisoryDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document is null ? null : Deserialize(document.Payload);
}
private static IEnumerable<AliasEntry> BuildAliasEntries(Advisory advisory)
{
foreach (var alias in advisory.Aliases)
{
if (AliasSchemeRegistry.TryGetScheme(alias, out var scheme))
{
yield return new AliasEntry(scheme, alias);
}
else
{
yield return new AliasEntry(AliasStoreConstants.UnscopedScheme, alias);
}
}
yield return new AliasEntry(AliasStoreConstants.PrimaryScheme, advisory.AdvisoryKey);
}
public async Task<IReadOnlyList<Advisory>> GetRecentAsync(int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = FilterDefinition<AdvisoryDocument>.Empty;
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.Modified)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static doc => Deserialize(doc.Payload)).ToArray();
}
private async Task ReplaceAsync(
FilterDefinition<AdvisoryDocument> filter,
AdvisoryDocument document,
ReplaceOptions options,
IClientSessionHandle? session,
CancellationToken cancellationToken)
{
try
{
if (session is null)
{
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.ReplaceOneAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
}
}
catch (MongoWriteException ex) when (IsNamespaceViewError(ex))
{
var legacyCollection = await GetLegacyAdvisoryCollectionAsync(cancellationToken).ConfigureAwait(false);
if (session is null)
{
await legacyCollection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
else
{
await legacyCollection.ReplaceOneAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
}
}
}
private static bool IsNamespaceViewError(MongoWriteException ex)
=> ex?.WriteError?.Code == 166 ||
(ex?.WriteError?.Message?.Contains("is a view", StringComparison.OrdinalIgnoreCase) ?? false);
private async ValueTask<IMongoCollection<AdvisoryDocument>> GetLegacyAdvisoryCollectionAsync(CancellationToken cancellationToken)
{
if (_legacyCollection is not null)
{
return _legacyCollection;
}
var filter = new BsonDocument("name", MongoStorageDefaults.Collections.Advisory);
using var cursor = await _database
.ListCollectionsAsync(new ListCollectionsOptions { Filter = filter }, cancellationToken)
.ConfigureAwait(false);
var info = await cursor.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false)
?? throw new InvalidOperationException("Advisory collection metadata not found.");
if (!info.TryGetValue("options", out var optionsValue) || optionsValue is not BsonDocument optionsDocument)
{
throw new InvalidOperationException("Advisory view options missing.");
}
if (!optionsDocument.TryGetValue("viewOn", out var viewOnValue) || viewOnValue.BsonType != BsonType.String)
{
throw new InvalidOperationException("Advisory view target not specified.");
}
var targetName = viewOnValue.AsString;
_legacyCollection = _database.GetCollection<AdvisoryDocument>(targetName);
return _legacyCollection;
}
public async IAsyncEnumerable<Advisory> StreamAsync([EnumeratorCancellation] CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var options = new FindOptions<AdvisoryDocument>
{
Sort = Builders<AdvisoryDocument>.Sort.Ascending(static doc => doc.AdvisoryKey),
};
using var cursor = session is null
? await _collection.FindAsync(FilterDefinition<AdvisoryDocument>.Empty, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindAsync(session, FilterDefinition<AdvisoryDocument>.Empty, options, cancellationToken).ConfigureAwait(false);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var document in cursor.Current)
{
cancellationToken.ThrowIfCancellationRequested();
yield return Deserialize(document.Payload);
}
}
}
private static Advisory Deserialize(BsonDocument payload)
{
ArgumentNullException.ThrowIfNull(payload);
var advisoryKey = payload.GetValue("advisoryKey", defaultValue: null)?.AsString
?? throw new InvalidOperationException("advisoryKey missing from payload.");
var title = payload.GetValue("title", defaultValue: null)?.AsString ?? advisoryKey;
string? summary = payload.TryGetValue("summary", out var summaryValue) && summaryValue.IsString ? summaryValue.AsString : null;
string? description = payload.TryGetValue("description", out var descriptionValue) && descriptionValue.IsString ? descriptionValue.AsString : null;
string? language = payload.TryGetValue("language", out var languageValue) && languageValue.IsString ? languageValue.AsString : null;
DateTimeOffset? published = TryReadDateTime(payload, "published");
DateTimeOffset? modified = TryReadDateTime(payload, "modified");
string? severity = payload.TryGetValue("severity", out var severityValue) && severityValue.IsString ? severityValue.AsString : null;
var exploitKnown = payload.TryGetValue("exploitKnown", out var exploitValue) && exploitValue.IsBoolean && exploitValue.AsBoolean;
var aliases = payload.TryGetValue("aliases", out var aliasValue) && aliasValue is BsonArray aliasArray
? aliasArray.OfType<BsonValue>().Where(static x => x.IsString).Select(static x => x.AsString)
: Array.Empty<string>();
var credits = payload.TryGetValue("credits", out var creditsValue) && creditsValue is BsonArray creditsArray
? creditsArray.OfType<BsonDocument>().Select(DeserializeCredit).ToArray()
: Array.Empty<AdvisoryCredit>();
var references = payload.TryGetValue("references", out var referencesValue) && referencesValue is BsonArray referencesArray
? referencesArray.OfType<BsonDocument>().Select(DeserializeReference).ToArray()
: Array.Empty<AdvisoryReference>();
var affectedPackages = payload.TryGetValue("affectedPackages", out var affectedValue) && affectedValue is BsonArray affectedArray
? affectedArray.OfType<BsonDocument>().Select(DeserializeAffectedPackage).ToArray()
: Array.Empty<AffectedPackage>();
var cvssMetrics = payload.TryGetValue("cvssMetrics", out var cvssValue) && cvssValue is BsonArray cvssArray
? cvssArray.OfType<BsonDocument>().Select(DeserializeCvssMetric).ToArray()
: Array.Empty<CvssMetric>();
var cwes = payload.TryGetValue("cwes", out var cweValue) && cweValue is BsonArray cweArray
? cweArray.OfType<BsonDocument>().Select(DeserializeWeakness).ToArray()
: Array.Empty<AdvisoryWeakness>();
string? canonicalMetricId = payload.TryGetValue("canonicalMetricId", out var canonicalMetricValue) && canonicalMetricValue.IsString
? canonicalMetricValue.AsString
: null;
var provenance = payload.TryGetValue("provenance", out var provenanceValue) && provenanceValue is BsonArray provenanceArray
? provenanceArray.OfType<BsonDocument>().Select(DeserializeProvenance).ToArray()
: Array.Empty<AdvisoryProvenance>();
return new Advisory(
advisoryKey,
title,
summary,
language,
published,
modified,
severity,
exploitKnown,
aliases,
credits,
references,
affectedPackages,
cvssMetrics,
provenance,
description,
cwes,
canonicalMetricId);
}
private static AdvisoryReference DeserializeReference(BsonDocument document)
{
var url = document.GetValue("url", defaultValue: null)?.AsString
?? throw new InvalidOperationException("reference.url missing from payload.");
string? kind = document.TryGetValue("kind", out var kindValue) && kindValue.IsString ? kindValue.AsString : null;
string? sourceTag = document.TryGetValue("sourceTag", out var sourceTagValue) && sourceTagValue.IsString ? sourceTagValue.AsString : null;
string? summary = document.TryGetValue("summary", out var summaryValue) && summaryValue.IsString ? summaryValue.AsString : null;
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AdvisoryReference(url, kind, sourceTag, summary, provenance);
}
private static AdvisoryCredit DeserializeCredit(BsonDocument document)
{
var displayName = document.GetValue("displayName", defaultValue: null)?.AsString
?? throw new InvalidOperationException("credits.displayName missing from payload.");
string? role = document.TryGetValue("role", out var roleValue) && roleValue.IsString ? roleValue.AsString : null;
var contacts = document.TryGetValue("contacts", out var contactsValue) && contactsValue is BsonArray contactsArray
? contactsArray.OfType<BsonValue>().Where(static value => value.IsString).Select(static value => value.AsString).ToArray()
: Array.Empty<string>();
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AdvisoryCredit(displayName, role, contacts, provenance);
}
private static AffectedPackage DeserializeAffectedPackage(BsonDocument document)
{
var type = document.GetValue("type", defaultValue: null)?.AsString
?? throw new InvalidOperationException("affectedPackages.type missing from payload.");
var identifier = document.GetValue("identifier", defaultValue: null)?.AsString
?? throw new InvalidOperationException("affectedPackages.identifier missing from payload.");
string? platform = document.TryGetValue("platform", out var platformValue) && platformValue.IsString ? platformValue.AsString : null;
var versionRanges = document.TryGetValue("versionRanges", out var rangesValue) && rangesValue is BsonArray rangesArray
? rangesArray.OfType<BsonDocument>().Select(DeserializeVersionRange).ToArray()
: Array.Empty<AffectedVersionRange>();
var statuses = document.TryGetValue("statuses", out var statusesValue) && statusesValue is BsonArray statusesArray
? statusesArray.OfType<BsonDocument>().Select(DeserializeStatus).ToArray()
: Array.Empty<AffectedPackageStatus>();
var normalizedVersions = document.TryGetValue("normalizedVersions", out var normalizedValue) && normalizedValue is BsonArray normalizedArray
? normalizedArray.OfType<BsonDocument>().Select(DeserializeNormalizedVersionRule).ToArray()
: Array.Empty<NormalizedVersionRule>();
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue is BsonArray provenanceArray
? provenanceArray.OfType<BsonDocument>().Select(DeserializeProvenance).ToArray()
: Array.Empty<AdvisoryProvenance>();
return new AffectedPackage(type, identifier, platform, versionRanges, statuses, provenance, normalizedVersions);
}
private static AffectedVersionRange DeserializeVersionRange(BsonDocument document)
{
var rangeKind = document.GetValue("rangeKind", defaultValue: null)?.AsString
?? throw new InvalidOperationException("versionRanges.rangeKind missing from payload.");
string? introducedVersion = document.TryGetValue("introducedVersion", out var introducedValue) && introducedValue.IsString ? introducedValue.AsString : null;
string? fixedVersion = document.TryGetValue("fixedVersion", out var fixedValue) && fixedValue.IsString ? fixedValue.AsString : null;
string? lastAffectedVersion = document.TryGetValue("lastAffectedVersion", out var lastValue) && lastValue.IsString ? lastValue.AsString : null;
string? rangeExpression = document.TryGetValue("rangeExpression", out var expressionValue) && expressionValue.IsString ? expressionValue.AsString : null;
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
RangePrimitives? primitives = null;
if (document.TryGetValue("primitives", out var primitivesValue) && primitivesValue.IsBsonDocument)
{
primitives = DeserializePrimitives(primitivesValue.AsBsonDocument);
}
return new AffectedVersionRange(rangeKind, introducedVersion, fixedVersion, lastAffectedVersion, rangeExpression, provenance, primitives);
}
private static AffectedPackageStatus DeserializeStatus(BsonDocument document)
{
var status = document.GetValue("status", defaultValue: null)?.AsString
?? throw new InvalidOperationException("statuses.status missing from payload.");
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AffectedPackageStatus(status, provenance);
}
private static CvssMetric DeserializeCvssMetric(BsonDocument document)
{
var version = document.GetValue("version", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cvssMetrics.version missing from payload.");
var vector = document.GetValue("vector", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cvssMetrics.vector missing from payload.");
var baseScore = document.TryGetValue("baseScore", out var scoreValue) && scoreValue.IsNumeric ? scoreValue.ToDouble() : 0d;
var baseSeverity = document.GetValue("baseSeverity", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cvssMetrics.baseSeverity missing from payload.");
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new CvssMetric(version, vector, baseScore, baseSeverity, provenance);
}
private static AdvisoryWeakness DeserializeWeakness(BsonDocument document)
{
var taxonomy = document.GetValue("taxonomy", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cwes.taxonomy missing from payload.");
var identifier = document.GetValue("identifier", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cwes.identifier missing from payload.");
string? name = document.TryGetValue("name", out var nameValue) && nameValue.IsString ? nameValue.AsString : null;
string? uri = document.TryGetValue("uri", out var uriValue) && uriValue.IsString ? uriValue.AsString : null;
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AdvisoryWeakness(taxonomy, identifier, name, uri, new[] { provenance });
}
private static AdvisoryProvenance DeserializeProvenance(BsonDocument document)
{
var source = document.GetValue("source", defaultValue: null)?.AsString
?? throw new InvalidOperationException("provenance.source missing from payload.");
var kind = document.GetValue("kind", defaultValue: null)?.AsString
?? throw new InvalidOperationException("provenance.kind missing from payload.");
string? value = document.TryGetValue("value", out var valueElement) && valueElement.IsString ? valueElement.AsString : null;
string? decisionReason = document.TryGetValue("decisionReason", out var reasonElement) && reasonElement.IsString ? reasonElement.AsString : null;
var recordedAt = TryConvertDateTime(document.GetValue("recordedAt", defaultValue: null));
IEnumerable<string>? fieldMask = null;
if (document.TryGetValue("fieldMask", out var fieldMaskValue) && fieldMaskValue is BsonArray fieldMaskArray)
{
fieldMask = fieldMaskArray
.OfType<BsonValue>()
.Where(static element => element.IsString)
.Select(static element => element.AsString);
}
return new AdvisoryProvenance(
source,
kind,
value ?? string.Empty,
recordedAt ?? DateTimeOffset.UtcNow,
fieldMask,
decisionReason);
}
private static NormalizedVersionRule DeserializeNormalizedVersionRule(BsonDocument document)
{
var scheme = document.GetValue("scheme", defaultValue: null)?.AsString
?? throw new InvalidOperationException("normalizedVersions.scheme missing from payload.");
var type = document.GetValue("type", defaultValue: null)?.AsString
?? throw new InvalidOperationException("normalizedVersions.type missing from payload.");
string? min = document.TryGetValue("min", out var minValue) && minValue.IsString ? minValue.AsString : null;
bool? minInclusive = document.TryGetValue("minInclusive", out var minInclusiveValue) && minInclusiveValue.IsBoolean ? minInclusiveValue.AsBoolean : null;
string? max = document.TryGetValue("max", out var maxValue) && maxValue.IsString ? maxValue.AsString : null;
bool? maxInclusive = document.TryGetValue("maxInclusive", out var maxInclusiveValue) && maxInclusiveValue.IsBoolean ? maxInclusiveValue.AsBoolean : null;
string? value = document.TryGetValue("value", out var valueElement) && valueElement.IsString ? valueElement.AsString : null;
string? notes = document.TryGetValue("notes", out var notesValue) && notesValue.IsString ? notesValue.AsString : null;
return new NormalizedVersionRule(
scheme,
type,
min,
minInclusive,
max,
maxInclusive,
value,
notes);
}
private static RangePrimitives? DeserializePrimitives(BsonDocument document)
{
SemVerPrimitive? semVer = null;
NevraPrimitive? nevra = null;
EvrPrimitive? evr = null;
IReadOnlyDictionary<string, string>? vendor = null;
if (document.TryGetValue("semVer", out var semverValue) && semverValue.IsBsonDocument)
{
var semverDoc = semverValue.AsBsonDocument;
semVer = new SemVerPrimitive(
semverDoc.TryGetValue("introduced", out var semIntroduced) && semIntroduced.IsString ? semIntroduced.AsString : null,
semverDoc.TryGetValue("introducedInclusive", out var semIntroducedInclusive) && semIntroducedInclusive.IsBoolean && semIntroducedInclusive.AsBoolean,
semverDoc.TryGetValue("fixed", out var semFixed) && semFixed.IsString ? semFixed.AsString : null,
semverDoc.TryGetValue("fixedInclusive", out var semFixedInclusive) && semFixedInclusive.IsBoolean && semFixedInclusive.AsBoolean,
semverDoc.TryGetValue("lastAffected", out var semLast) && semLast.IsString ? semLast.AsString : null,
semverDoc.TryGetValue("lastAffectedInclusive", out var semLastInclusive) && semLastInclusive.IsBoolean && semLastInclusive.AsBoolean,
semverDoc.TryGetValue("constraintExpression", out var constraint) && constraint.IsString ? constraint.AsString : null,
semverDoc.TryGetValue("exactValue", out var exact) && exact.IsString ? exact.AsString : null);
}
if (document.TryGetValue("nevra", out var nevraValue) && nevraValue.IsBsonDocument)
{
var nevraDoc = nevraValue.AsBsonDocument;
nevra = new NevraPrimitive(
DeserializeNevraComponent(nevraDoc, "introduced"),
DeserializeNevraComponent(nevraDoc, "fixed"),
DeserializeNevraComponent(nevraDoc, "lastAffected"));
}
if (document.TryGetValue("evr", out var evrValue) && evrValue.IsBsonDocument)
{
var evrDoc = evrValue.AsBsonDocument;
evr = new EvrPrimitive(
DeserializeEvrComponent(evrDoc, "introduced"),
DeserializeEvrComponent(evrDoc, "fixed"),
DeserializeEvrComponent(evrDoc, "lastAffected"));
}
if (document.TryGetValue("vendorExtensions", out var vendorValue) && vendorValue.IsBsonDocument)
{
vendor = vendorValue.AsBsonDocument.Elements
.Where(static e => e.Value.IsString)
.ToDictionary(static e => e.Name, static e => e.Value.AsString, StringComparer.Ordinal);
if (vendor.Count == 0)
{
vendor = null;
}
}
if (semVer is null && nevra is null && evr is null && vendor is null)
{
return null;
}
return new RangePrimitives(semVer, nevra, evr, vendor);
}
private static NevraComponent? DeserializeNevraComponent(BsonDocument parent, string field)
{
if (!parent.TryGetValue(field, out var value) || !value.IsBsonDocument)
{
return null;
}
var component = value.AsBsonDocument;
var name = component.TryGetValue("name", out var nameValue) && nameValue.IsString ? nameValue.AsString : null;
var version = component.TryGetValue("version", out var versionValue) && versionValue.IsString ? versionValue.AsString : null;
if (name is null || version is null)
{
return null;
}
var epoch = component.TryGetValue("epoch", out var epochValue) && epochValue.IsNumeric ? epochValue.ToInt32() : 0;
var release = component.TryGetValue("release", out var releaseValue) && releaseValue.IsString ? releaseValue.AsString : string.Empty;
var architecture = component.TryGetValue("architecture", out var archValue) && archValue.IsString ? archValue.AsString : null;
return new NevraComponent(name, epoch, version, release, architecture);
}
private static EvrComponent? DeserializeEvrComponent(BsonDocument parent, string field)
{
if (!parent.TryGetValue(field, out var value) || !value.IsBsonDocument)
{
return null;
}
var component = value.AsBsonDocument;
var epoch = component.TryGetValue("epoch", out var epochValue) && epochValue.IsNumeric ? epochValue.ToInt32() : 0;
var upstream = component.TryGetValue("upstreamVersion", out var upstreamValue) && upstreamValue.IsString ? upstreamValue.AsString : null;
if (upstream is null)
{
return null;
}
var revision = component.TryGetValue("revision", out var revisionValue) && revisionValue.IsString ? revisionValue.AsString : null;
return new EvrComponent(epoch, upstream, revision);
}
private static DateTimeOffset? TryReadDateTime(BsonDocument document, string field)
=> document.TryGetValue(field, out var value) ? TryConvertDateTime(value) : null;
private static DateTimeOffset? TryConvertDateTime(BsonValue? value)
{
if (value is null)
{
return null;
}
return value switch
{
BsonDateTime dateTime => DateTime.SpecifyKind(dateTime.ToUniversalTime(), DateTimeKind.Utc),
BsonString stringValue when DateTimeOffset.TryParse(stringValue.AsString, out var parsed) => parsed.ToUniversalTime(),
_ => null,
};
}
}


@@ -1,15 +0,0 @@
using MongoDB.Driver;
using StellaOps.Concelier.Models;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
public interface IAdvisoryStore
{
Task UpsertAsync(Advisory advisory, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<Advisory?> FindAsync(string advisoryKey, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<IReadOnlyList<Advisory>> GetRecentAsync(int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null);
IAsyncEnumerable<Advisory> StreamAsync(CancellationToken cancellationToken, IClientSessionHandle? session = null);
}
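A hedged consumption example for the interface above — for instance, an exporter draining `StreamAsync` in its deterministic `advisoryKey` order. The caller and its name are hypothetical; `store` is assumed to arrive via dependency injection:

```csharp
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Storage.Mongo.Advisories;

public static class AdvisoryExportSketch
{
    // Hypothetical caller: enumerate every advisory in stable key order.
    public static async Task<int> CountAdvisoriesAsync(IAdvisoryStore store, CancellationToken ct)
    {
        var count = 0;
        await foreach (var advisory in store.StreamAsync(ct))
        {
            count++; // an exporter would serialize each advisory here
        }
        return count;
    }
}
```

Because `StreamAsync` sorts by `advisoryKey`, repeated runs over unchanged data yield identical output, in line with the charter's determinism requirement.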


@@ -1,64 +0,0 @@
using System;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
[BsonIgnoreExtraElements]
public sealed class NormalizedVersionDocument
{
[BsonElement("packageId")]
public string PackageId { get; set; } = string.Empty;
[BsonElement("packageType")]
public string PackageType { get; set; } = string.Empty;
[BsonElement("scheme")]
public string Scheme { get; set; } = string.Empty;
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("style")]
[BsonIgnoreIfNull]
public string? Style { get; set; }
[BsonElement("min")]
[BsonIgnoreIfNull]
public string? Min { get; set; }
[BsonElement("minInclusive")]
[BsonIgnoreIfNull]
public bool? MinInclusive { get; set; }
[BsonElement("max")]
[BsonIgnoreIfNull]
public string? Max { get; set; }
[BsonElement("maxInclusive")]
[BsonIgnoreIfNull]
public bool? MaxInclusive { get; set; }
[BsonElement("value")]
[BsonIgnoreIfNull]
public string? Value { get; set; }
[BsonElement("notes")]
[BsonIgnoreIfNull]
public string? Notes { get; set; }
[BsonElement("decisionReason")]
[BsonIgnoreIfNull]
public string? DecisionReason { get; set; }
[BsonElement("constraint")]
[BsonIgnoreIfNull]
public string? Constraint { get; set; }
[BsonElement("source")]
[BsonIgnoreIfNull]
public string? Source { get; set; }
[BsonElement("recordedAt")]
[BsonIgnoreIfNull]
public DateTime? RecordedAtUtc { get; set; }
}


@@ -1,100 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using StellaOps.Concelier.Models;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
internal static class NormalizedVersionDocumentFactory
{
public static List<NormalizedVersionDocument>? Create(Advisory advisory)
{
if (advisory.AffectedPackages.IsDefaultOrEmpty || advisory.AffectedPackages.Length == 0)
{
return null;
}
var documents = new List<NormalizedVersionDocument>();
var advisoryFallbackReason = advisory.Provenance.FirstOrDefault()?.DecisionReason;
var advisoryFallbackSource = advisory.Provenance.FirstOrDefault()?.Source;
var advisoryFallbackRecordedAt = advisory.Provenance.FirstOrDefault()?.RecordedAt;
foreach (var package in advisory.AffectedPackages)
{
if (package.NormalizedVersions.IsDefaultOrEmpty || package.NormalizedVersions.Length == 0)
{
continue;
}
foreach (var rule in package.NormalizedVersions)
{
var matchingRange = FindMatchingRange(package, rule);
var decisionReason = matchingRange?.Provenance.DecisionReason
?? package.Provenance.FirstOrDefault()?.DecisionReason
?? advisoryFallbackReason;
var source = matchingRange?.Provenance.Source
?? package.Provenance.FirstOrDefault()?.Source
?? advisoryFallbackSource;
var recordedAt = matchingRange?.Provenance.RecordedAt
?? package.Provenance.FirstOrDefault()?.RecordedAt
?? advisoryFallbackRecordedAt;
var constraint = matchingRange?.Primitives?.SemVer?.ConstraintExpression
?? matchingRange?.RangeExpression;
var style = matchingRange?.Primitives?.SemVer?.Style ?? rule.Type;
documents.Add(new NormalizedVersionDocument
{
PackageId = package.Identifier ?? string.Empty,
PackageType = package.Type ?? string.Empty,
Scheme = rule.Scheme,
Type = rule.Type,
Style = style,
Min = rule.Min,
MinInclusive = rule.MinInclusive,
Max = rule.Max,
MaxInclusive = rule.MaxInclusive,
Value = rule.Value,
Notes = rule.Notes,
DecisionReason = decisionReason,
Constraint = constraint,
Source = source,
RecordedAtUtc = recordedAt?.UtcDateTime,
});
}
}
return documents.Count == 0 ? null : documents;
}
private static AffectedVersionRange? FindMatchingRange(AffectedPackage package, NormalizedVersionRule rule)
{
foreach (var range in package.VersionRanges)
{
var candidate = range.ToNormalizedVersionRule(rule.Notes);
if (candidate is null)
{
continue;
}
if (NormalizedRulesEquivalent(candidate, rule))
{
return range;
}
}
return null;
}
private static bool NormalizedRulesEquivalent(NormalizedVersionRule left, NormalizedVersionRule right)
=> string.Equals(left.Scheme, right.Scheme, StringComparison.Ordinal)
&& string.Equals(left.Type, right.Type, StringComparison.Ordinal)
&& string.Equals(left.Min, right.Min, StringComparison.Ordinal)
&& left.MinInclusive == right.MinInclusive
&& string.Equals(left.Max, right.Max, StringComparison.Ordinal)
&& left.MaxInclusive == right.MaxInclusive
&& string.Equals(left.Value, right.Value, StringComparison.Ordinal);
}

@@ -1,38 +0,0 @@
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
[BsonIgnoreExtraElements]
internal sealed class AliasDocument
{
[BsonId]
public ObjectId Id { get; set; }
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("scheme")]
public string Scheme { get; set; } = string.Empty;
[BsonElement("value")]
public string Value { get; set; } = string.Empty;
[BsonElement("updatedAt")]
public DateTime UpdatedAt { get; set; }
}
internal static class AliasDocumentExtensions
{
public static AliasRecord ToRecord(this AliasDocument document)
{
ArgumentNullException.ThrowIfNull(document);
var updatedAt = DateTime.SpecifyKind(document.UpdatedAt, DateTimeKind.Utc);
return new AliasRecord(
document.AdvisoryKey,
document.Scheme,
document.Value,
new DateTimeOffset(updatedAt));
}
}

@@ -1,185 +0,0 @@
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Logging;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
public sealed class AliasStore : IAliasStore
{
private readonly IMongoCollection<AliasDocument> _collection;
private readonly ILogger<AliasStore> _logger;
public AliasStore(IMongoDatabase database, ILogger<AliasStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<AliasDocument>(MongoStorageDefaults.Collections.Alias);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<AliasUpsertResult> ReplaceAsync(
string advisoryKey,
IEnumerable<AliasEntry> aliases,
DateTimeOffset updatedAt,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(advisoryKey);
var aliasList = Normalize(aliases).ToArray();
var deleteFilter = Builders<AliasDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
await _collection.DeleteManyAsync(deleteFilter, cancellationToken).ConfigureAwait(false);
if (aliasList.Length > 0)
{
var documents = new List<AliasDocument>(aliasList.Length);
var updatedAtUtc = updatedAt.ToUniversalTime().UtcDateTime;
foreach (var alias in aliasList)
{
documents.Add(new AliasDocument
{
Id = ObjectId.GenerateNewId(),
AdvisoryKey = advisoryKey,
Scheme = alias.Scheme,
Value = alias.Value,
UpdatedAt = updatedAtUtc,
});
            }
            if (documents.Count > 0)
{
try
{
await _collection.InsertManyAsync(
documents,
new InsertManyOptions { IsOrdered = false },
cancellationToken).ConfigureAwait(false);
}
catch (MongoBulkWriteException<AliasDocument> ex) when (ex.WriteErrors.Any(error => error.Category == ServerErrorCategory.DuplicateKey))
{
foreach (var writeError in ex.WriteErrors.Where(error => error.Category == ServerErrorCategory.DuplicateKey))
{
var duplicateDocument = documents.ElementAtOrDefault(writeError.Index);
_logger.LogError(
ex,
"Alias duplicate detected while inserting {Scheme}:{Value} for advisory {AdvisoryKey}. Existing aliases: {Existing}",
duplicateDocument?.Scheme,
duplicateDocument?.Value,
duplicateDocument?.AdvisoryKey,
string.Join(", ", aliasList.Select(a => $"{a.Scheme}:{a.Value}")));
}
throw;
}
catch (MongoWriteException ex) when (ex.WriteError?.Category == ServerErrorCategory.DuplicateKey)
{
_logger.LogError(
ex,
"Alias duplicate detected while inserting aliases for advisory {AdvisoryKey}. Aliases: {Aliases}",
advisoryKey,
string.Join(", ", aliasList.Select(a => $"{a.Scheme}:{a.Value}")));
throw;
}
}
}
if (aliasList.Length == 0)
{
return new AliasUpsertResult(advisoryKey, Array.Empty<AliasCollision>());
}
var collisions = new List<AliasCollision>();
foreach (var alias in aliasList)
{
var filter = Builders<AliasDocument>.Filter.Eq(x => x.Scheme, alias.Scheme)
& Builders<AliasDocument>.Filter.Eq(x => x.Value, alias.Value);
using var cursor = await _collection.FindAsync(filter, cancellationToken: cancellationToken).ConfigureAwait(false);
var advisoryKeys = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var document in cursor.Current)
{
advisoryKeys.Add(document.AdvisoryKey);
}
}
if (advisoryKeys.Count <= 1)
{
continue;
}
var collision = new AliasCollision(alias.Scheme, alias.Value, advisoryKeys.ToArray());
collisions.Add(collision);
AliasStoreMetrics.RecordCollision(alias.Scheme, advisoryKeys.Count);
_logger.LogWarning(
"Alias collision detected for {Scheme}:{Value}; advisories: {Advisories}",
alias.Scheme,
alias.Value,
string.Join(", ", advisoryKeys));
}
return new AliasUpsertResult(advisoryKey, collisions);
}
public async Task<IReadOnlyList<AliasRecord>> GetByAliasAsync(string scheme, string value, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(scheme);
ArgumentException.ThrowIfNullOrWhiteSpace(value);
var normalizedScheme = NormalizeScheme(scheme);
var normalizedValue = value.Trim();
var filter = Builders<AliasDocument>.Filter.Eq(x => x.Scheme, normalizedScheme)
& Builders<AliasDocument>.Filter.Eq(x => x.Value, normalizedValue);
var documents = await _collection.Find(filter).ToListAsync(cancellationToken).ConfigureAwait(false);
return documents.Select(static d => d.ToRecord()).ToArray();
}
public async Task<IReadOnlyList<AliasRecord>> GetByAdvisoryAsync(string advisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(advisoryKey);
var filter = Builders<AliasDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
var documents = await _collection.Find(filter).ToListAsync(cancellationToken).ConfigureAwait(false);
return documents.Select(static d => d.ToRecord()).ToArray();
}
private static IEnumerable<AliasEntry> Normalize(IEnumerable<AliasEntry> aliases)
{
if (aliases is null)
{
yield break;
}
var seen = new HashSet<string>(StringComparer.Ordinal);
foreach (var alias in aliases)
{
if (alias is null)
{
continue;
}
var scheme = NormalizeScheme(alias.Scheme);
var value = alias.Value?.Trim();
if (string.IsNullOrEmpty(value))
{
continue;
}
var key = $"{scheme}\u0001{value}";
if (!seen.Add(key))
{
continue;
}
yield return new AliasEntry(scheme, value);
}
}
private static string NormalizeScheme(string scheme)
{
return string.IsNullOrWhiteSpace(scheme)
? AliasStoreConstants.UnscopedScheme
: scheme.Trim().ToUpperInvariant();
}
}

@@ -1,7 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
public static class AliasStoreConstants
{
public const string PrimaryScheme = "PRIMARY";
public const string UnscopedScheme = "UNSCOPED";
}

@@ -1,22 +0,0 @@
using System.Collections.Generic;
using System.Diagnostics.Metrics;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
internal static class AliasStoreMetrics
{
private static readonly Meter Meter = new("StellaOps.Concelier.Merge");
internal static readonly Counter<long> AliasCollisionCounter = Meter.CreateCounter<long>(
"concelier.merge.alias_conflict",
unit: "count",
description: "Number of alias collisions detected when the same alias maps to multiple advisories.");
public static void RecordCollision(string scheme, int advisoryCount)
{
AliasCollisionCounter.Add(
1,
new KeyValuePair<string, object?>("scheme", scheme),
new KeyValuePair<string, object?>("advisory_count", advisoryCount));
}
}

@@ -1,27 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
public interface IAliasStore
{
Task<AliasUpsertResult> ReplaceAsync(
string advisoryKey,
IEnumerable<AliasEntry> aliases,
DateTimeOffset updatedAt,
CancellationToken cancellationToken);
Task<IReadOnlyList<AliasRecord>> GetByAliasAsync(string scheme, string value, CancellationToken cancellationToken);
Task<IReadOnlyList<AliasRecord>> GetByAdvisoryAsync(string advisoryKey, CancellationToken cancellationToken);
}
public sealed record AliasEntry(string Scheme, string Value);
public sealed record AliasRecord(string AdvisoryKey, string Scheme, string Value, DateTimeOffset UpdatedAt);
public sealed record AliasCollision(string Scheme, string Value, IReadOnlyList<string> AdvisoryKeys);
public sealed record AliasUpsertResult(string AdvisoryKey, IReadOnlyList<AliasCollision> Collisions);

@@ -1,43 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
[BsonIgnoreExtraElements]
public sealed class ChangeHistoryDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("source")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("documentId")]
public string DocumentId { get; set; } = string.Empty;
[BsonElement("documentSha256")]
public string DocumentSha256 { get; set; } = string.Empty;
[BsonElement("currentHash")]
public string CurrentHash { get; set; } = string.Empty;
[BsonElement("previousHash")]
public string? PreviousHash { get; set; }
[BsonElement("currentSnapshot")]
public string CurrentSnapshot { get; set; } = string.Empty;
[BsonElement("previousSnapshot")]
public string? PreviousSnapshot { get; set; }
[BsonElement("changes")]
public List<BsonDocument> Changes { get; set; } = new();
[BsonElement("capturedAt")]
public DateTime CapturedAt { get; set; }
}

@@ -1,70 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
internal static class ChangeHistoryDocumentExtensions
{
public static ChangeHistoryDocument ToDocument(this ChangeHistoryRecord record)
{
var changes = new List<BsonDocument>(record.Changes.Count);
foreach (var change in record.Changes)
{
changes.Add(new BsonDocument
{
["field"] = change.Field,
["type"] = change.ChangeType,
["previous"] = change.PreviousValue is null ? BsonNull.Value : new BsonString(change.PreviousValue),
["current"] = change.CurrentValue is null ? BsonNull.Value : new BsonString(change.CurrentValue),
});
}
return new ChangeHistoryDocument
{
Id = record.Id.ToString(),
SourceName = record.SourceName,
AdvisoryKey = record.AdvisoryKey,
DocumentId = record.DocumentId.ToString(),
DocumentSha256 = record.DocumentSha256,
CurrentHash = record.CurrentHash,
PreviousHash = record.PreviousHash,
CurrentSnapshot = record.CurrentSnapshot,
PreviousSnapshot = record.PreviousSnapshot,
Changes = changes,
CapturedAt = record.CapturedAt.UtcDateTime,
};
}
public static ChangeHistoryRecord ToRecord(this ChangeHistoryDocument document)
{
var changes = new List<ChangeHistoryFieldChange>(document.Changes.Count);
foreach (var change in document.Changes)
{
var previousValue = change.TryGetValue("previous", out var previousBson) && previousBson is not BsonNull
? previousBson.AsString
: null;
var currentValue = change.TryGetValue("current", out var currentBson) && currentBson is not BsonNull
? currentBson.AsString
: null;
var fieldName = change.GetValue("field", "").AsString;
var changeType = change.GetValue("type", "").AsString;
changes.Add(new ChangeHistoryFieldChange(fieldName, changeType, previousValue, currentValue));
}
var capturedAtUtc = DateTime.SpecifyKind(document.CapturedAt, DateTimeKind.Utc);
return new ChangeHistoryRecord(
Guid.Parse(document.Id),
document.SourceName,
document.AdvisoryKey,
Guid.Parse(document.DocumentId),
document.DocumentSha256,
document.CurrentHash,
document.PreviousHash,
document.CurrentSnapshot,
document.PreviousSnapshot,
changes,
new DateTimeOffset(capturedAtUtc));
}
}

@@ -1,24 +0,0 @@
using System;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public sealed record ChangeHistoryFieldChange
{
public ChangeHistoryFieldChange(string field, string changeType, string? previousValue, string? currentValue)
{
ArgumentException.ThrowIfNullOrEmpty(field);
ArgumentException.ThrowIfNullOrEmpty(changeType);
Field = field;
ChangeType = changeType;
PreviousValue = previousValue;
CurrentValue = currentValue;
}
public string Field { get; }
public string ChangeType { get; }
public string? PreviousValue { get; }
public string? CurrentValue { get; }
}

@@ -1,62 +0,0 @@
using System;
using System.Collections.Generic;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public sealed class ChangeHistoryRecord
{
public ChangeHistoryRecord(
Guid id,
string sourceName,
string advisoryKey,
Guid documentId,
string documentSha256,
string currentHash,
string? previousHash,
string currentSnapshot,
string? previousSnapshot,
IReadOnlyList<ChangeHistoryFieldChange> changes,
DateTimeOffset capturedAt)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
ArgumentException.ThrowIfNullOrEmpty(documentSha256);
ArgumentException.ThrowIfNullOrEmpty(currentHash);
ArgumentException.ThrowIfNullOrEmpty(currentSnapshot);
ArgumentNullException.ThrowIfNull(changes);
Id = id;
SourceName = sourceName;
AdvisoryKey = advisoryKey;
DocumentId = documentId;
DocumentSha256 = documentSha256;
CurrentHash = currentHash;
PreviousHash = previousHash;
CurrentSnapshot = currentSnapshot;
PreviousSnapshot = previousSnapshot;
Changes = changes;
CapturedAt = capturedAt;
}
public Guid Id { get; }
public string SourceName { get; }
public string AdvisoryKey { get; }
public Guid DocumentId { get; }
public string DocumentSha256 { get; }
public string CurrentHash { get; }
public string? PreviousHash { get; }
public string CurrentSnapshot { get; }
public string? PreviousSnapshot { get; }
public IReadOnlyList<ChangeHistoryFieldChange> Changes { get; }
public DateTimeOffset CapturedAt { get; }
}

@@ -1,12 +0,0 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public interface IChangeHistoryStore
{
Task AddAsync(ChangeHistoryRecord record, CancellationToken cancellationToken);
Task<IReadOnlyList<ChangeHistoryRecord>> GetRecentAsync(string sourceName, string advisoryKey, int limit, CancellationToken cancellationToken);
}

@@ -1,53 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public sealed class MongoChangeHistoryStore : IChangeHistoryStore
{
private readonly IMongoCollection<ChangeHistoryDocument> _collection;
private readonly ILogger<MongoChangeHistoryStore> _logger;
public MongoChangeHistoryStore(IMongoDatabase database, ILogger<MongoChangeHistoryStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<ChangeHistoryDocument>(MongoStorageDefaults.Collections.ChangeHistory);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task AddAsync(ChangeHistoryRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = record.ToDocument();
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Recorded change history for {Source}/{Advisory} with hash {Hash}", record.SourceName, record.AdvisoryKey, record.CurrentHash);
}
public async Task<IReadOnlyList<ChangeHistoryRecord>> GetRecentAsync(string sourceName, string advisoryKey, int limit, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
if (limit <= 0)
{
limit = 10;
}
var cursor = await _collection.Find(x => x.SourceName == sourceName && x.AdvisoryKey == advisoryKey)
.SortByDescending(x => x.CapturedAt)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
var records = new List<ChangeHistoryRecord>(cursor.Count);
foreach (var document in cursor)
{
records.Add(document.ToRecord());
}
return records;
}
}

@@ -1,69 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Conflicts;
[BsonIgnoreExtraElements]
public sealed class AdvisoryConflictDocument
{
[BsonId]
public string Id { get; set; } = Guid.Empty.ToString("N");
[BsonElement("vulnerabilityKey")]
public string VulnerabilityKey { get; set; } = string.Empty;
[BsonElement("conflictHash")]
public byte[] ConflictHash { get; set; } = Array.Empty<byte>();
[BsonElement("asOf")]
public DateTime AsOf { get; set; }
[BsonElement("recordedAt")]
public DateTime RecordedAt { get; set; }
[BsonElement("statementIds")]
public List<string> StatementIds { get; set; } = new();
[BsonElement("details")]
public BsonDocument Details { get; set; } = new();
[BsonElement("provenance")]
[BsonIgnoreIfNull]
public BsonDocument? Provenance { get; set; }
[BsonElement("trust")]
[BsonIgnoreIfNull]
public BsonDocument? Trust { get; set; }
}
internal static class AdvisoryConflictDocumentExtensions
{
public static AdvisoryConflictDocument FromRecord(AdvisoryConflictRecord record)
=> new()
{
Id = record.Id.ToString(),
VulnerabilityKey = record.VulnerabilityKey,
ConflictHash = record.ConflictHash,
AsOf = record.AsOf.UtcDateTime,
RecordedAt = record.RecordedAt.UtcDateTime,
StatementIds = record.StatementIds.Select(static id => id.ToString()).ToList(),
Details = (BsonDocument)record.Details.DeepClone(),
Provenance = record.Provenance is null ? null : (BsonDocument)record.Provenance.DeepClone(),
Trust = record.Trust is null ? null : (BsonDocument)record.Trust.DeepClone(),
};
public static AdvisoryConflictRecord ToRecord(this AdvisoryConflictDocument document)
=> new(
Guid.Parse(document.Id),
document.VulnerabilityKey,
document.ConflictHash,
DateTime.SpecifyKind(document.AsOf, DateTimeKind.Utc),
DateTime.SpecifyKind(document.RecordedAt, DateTimeKind.Utc),
document.StatementIds.Select(static value => Guid.Parse(value)).ToList(),
(BsonDocument)document.Details.DeepClone(),
document.Provenance is null ? null : (BsonDocument)document.Provenance.DeepClone(),
document.Trust is null ? null : (BsonDocument)document.Trust.DeepClone());
}

@@ -1,16 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Conflicts;
public sealed record AdvisoryConflictRecord(
Guid Id,
string VulnerabilityKey,
byte[] ConflictHash,
DateTimeOffset AsOf,
DateTimeOffset RecordedAt,
IReadOnlyList<Guid> StatementIds,
BsonDocument Details,
BsonDocument? Provenance = null,
BsonDocument? Trust = null);

@@ -1,93 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Conflicts;
public interface IAdvisoryConflictStore
{
ValueTask InsertAsync(
IReadOnlyCollection<AdvisoryConflictRecord> conflicts,
CancellationToken cancellationToken,
IClientSessionHandle? session = null);
ValueTask<IReadOnlyList<AdvisoryConflictRecord>> GetConflictsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken,
IClientSessionHandle? session = null);
}
public sealed class AdvisoryConflictStore : IAdvisoryConflictStore
{
private readonly IMongoCollection<AdvisoryConflictDocument> _collection;
public AdvisoryConflictStore(IMongoDatabase database)
{
ArgumentNullException.ThrowIfNull(database);
_collection = database.GetCollection<AdvisoryConflictDocument>(MongoStorageDefaults.Collections.AdvisoryConflicts);
}
public async ValueTask InsertAsync(
IReadOnlyCollection<AdvisoryConflictRecord> conflicts,
CancellationToken cancellationToken,
IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(conflicts);
if (conflicts.Count == 0)
{
return;
}
var documents = conflicts.Select(AdvisoryConflictDocumentExtensions.FromRecord).ToList();
var options = new InsertManyOptions { IsOrdered = true };
try
{
if (session is null)
{
await _collection.InsertManyAsync(documents, options, cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.InsertManyAsync(session, documents, options, cancellationToken).ConfigureAwait(false);
}
}
catch (MongoBulkWriteException ex) when (ex.WriteErrors.All(error => error.Category == ServerErrorCategory.DuplicateKey))
{
// Conflicts already persisted for this state; ignore duplicates.
}
}
public async ValueTask<IReadOnlyList<AdvisoryConflictRecord>> GetConflictsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken,
IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrWhiteSpace(vulnerabilityKey);
var filter = Builders<AdvisoryConflictDocument>.Filter.Eq(document => document.VulnerabilityKey, vulnerabilityKey);
if (asOf.HasValue)
{
filter &= Builders<AdvisoryConflictDocument>.Filter.Lte(document => document.AsOf, asOf.Value.UtcDateTime);
}
var find = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var documents = await find
.SortByDescending(document => document.AsOf)
.ThenByDescending(document => document.RecordedAt)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(static document => document.ToRecord()).ToList();
}
}

@@ -1,131 +0,0 @@
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
[BsonIgnoreExtraElements]
public sealed class DocumentDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("sourceName")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("uri")]
public string Uri { get; set; } = string.Empty;
[BsonElement("fetchedAt")]
public DateTime FetchedAt { get; set; }
[BsonElement("sha256")]
public string Sha256 { get; set; } = string.Empty;
[BsonElement("status")]
public string Status { get; set; } = string.Empty;
[BsonElement("contentType")]
[BsonIgnoreIfNull]
public string? ContentType { get; set; }
[BsonElement("headers")]
[BsonIgnoreIfNull]
public BsonDocument? Headers { get; set; }
[BsonElement("metadata")]
[BsonIgnoreIfNull]
public BsonDocument? Metadata { get; set; }
[BsonElement("etag")]
[BsonIgnoreIfNull]
public string? Etag { get; set; }
[BsonElement("lastModified")]
[BsonIgnoreIfNull]
public DateTime? LastModified { get; set; }
[BsonElement("expiresAt")]
[BsonIgnoreIfNull]
public DateTime? ExpiresAt { get; set; }
[BsonElement("gridFsId")]
[BsonIgnoreIfNull]
public ObjectId? GridFsId { get; set; }
}
internal static class DocumentDocumentExtensions
{
public static DocumentDocument FromRecord(DocumentRecord record)
{
return new DocumentDocument
{
Id = record.Id.ToString(),
SourceName = record.SourceName,
Uri = record.Uri,
FetchedAt = record.FetchedAt.UtcDateTime,
Sha256 = record.Sha256,
Status = record.Status,
ContentType = record.ContentType,
Headers = ToBson(record.Headers),
Metadata = ToBson(record.Metadata),
Etag = record.Etag,
LastModified = record.LastModified?.UtcDateTime,
GridFsId = record.GridFsId,
ExpiresAt = record.ExpiresAt?.UtcDateTime,
};
}
public static DocumentRecord ToRecord(this DocumentDocument document)
{
IReadOnlyDictionary<string, string>? headers = null;
if (document.Headers is not null)
{
headers = document.Headers.Elements.ToDictionary(
static e => e.Name,
static e => e.Value?.ToString() ?? string.Empty,
StringComparer.Ordinal);
}
IReadOnlyDictionary<string, string>? metadata = null;
if (document.Metadata is not null)
{
metadata = document.Metadata.Elements.ToDictionary(
static e => e.Name,
static e => e.Value?.ToString() ?? string.Empty,
StringComparer.Ordinal);
}
return new DocumentRecord(
Guid.Parse(document.Id),
document.SourceName,
document.Uri,
DateTime.SpecifyKind(document.FetchedAt, DateTimeKind.Utc),
document.Sha256,
document.Status,
document.ContentType,
headers,
metadata,
document.Etag,
document.LastModified.HasValue ? DateTime.SpecifyKind(document.LastModified.Value, DateTimeKind.Utc) : null,
document.GridFsId,
document.ExpiresAt.HasValue ? DateTime.SpecifyKind(document.ExpiresAt.Value, DateTimeKind.Utc) : null);
}
private static BsonDocument? ToBson(IReadOnlyDictionary<string, string>? values)
{
if (values is null)
{
return null;
}
var document = new BsonDocument();
foreach (var kvp in values)
{
document[kvp.Key] = kvp.Value;
}
return document;
}
}

@@ -1,22 +0,0 @@
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
public sealed record DocumentRecord(
Guid Id,
string SourceName,
string Uri,
DateTimeOffset FetchedAt,
string Sha256,
string Status,
string? ContentType,
IReadOnlyDictionary<string, string>? Headers,
IReadOnlyDictionary<string, string>? Metadata,
string? Etag,
DateTimeOffset? LastModified,
ObjectId? GridFsId,
DateTimeOffset? ExpiresAt = null)
{
public DocumentRecord WithStatus(string status)
=> this with { Status = status };
}

@@ -1,87 +0,0 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
public sealed class DocumentStore : IDocumentStore
{
private readonly IMongoCollection<DocumentDocument> _collection;
private readonly ILogger<DocumentStore> _logger;
public DocumentStore(IMongoDatabase database, ILogger<DocumentStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<DocumentDocument>(MongoStorageDefaults.Collections.Document);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<DocumentRecord> UpsertAsync(DocumentRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(record);
var document = DocumentDocumentExtensions.FromRecord(record);
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.SourceName, record.SourceName)
& Builders<DocumentDocument>.Filter.Eq(x => x.Uri, record.Uri);
var options = new FindOneAndReplaceOptions<DocumentDocument>
{
IsUpsert = true,
ReturnDocument = ReturnDocument.After,
};
var replaced = session is null
? await _collection.FindOneAndReplaceAsync(filter, document, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndReplaceAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted document {Source}/{Uri}", record.SourceName, record.Uri);
return (replaced ?? document).ToRecord();
}
public async Task<DocumentRecord?> FindBySourceAndUriAsync(string sourceName, string uri, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
ArgumentException.ThrowIfNullOrEmpty(uri);
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.SourceName, sourceName)
& Builders<DocumentDocument>.Filter.Eq(x => x.Uri, uri);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<DocumentRecord?> FindAsync(Guid id, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var idValue = id.ToString();
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.Id, idValue);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<bool> UpdateStatusAsync(Guid id, string status, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(status);
var update = Builders<DocumentDocument>.Update
.Set(x => x.Status, status)
.Set(x => x.LastModified, DateTime.UtcNow);
var idValue = id.ToString();
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.Id, idValue);
UpdateResult result;
if (session is null)
{
result = await _collection.UpdateOneAsync(filter, update, cancellationToken: cancellationToken).ConfigureAwait(false);
}
else
{
result = await _collection.UpdateOneAsync(session, filter, update, cancellationToken: cancellationToken).ConfigureAwait(false);
}
return result.MatchedCount > 0;
}
}

@@ -1,14 +0,0 @@
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
public interface IDocumentStore
{
Task<DocumentRecord> UpsertAsync(DocumentRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<DocumentRecord?> FindBySourceAndUriAsync(string sourceName, string uri, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<DocumentRecord?> FindAsync(Guid id, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<bool> UpdateStatusAsync(Guid id, string status, CancellationToken cancellationToken, IClientSessionHandle? session = null);
}

@@ -1,50 +0,0 @@
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
[BsonIgnoreExtraElements]
public sealed class DtoDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("documentId")]
public string DocumentId { get; set; } = string.Empty;
[BsonElement("sourceName")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("schemaVersion")]
public string SchemaVersion { get; set; } = string.Empty;
[BsonElement("payload")]
public BsonDocument Payload { get; set; } = new();
[BsonElement("validatedAt")]
public DateTime ValidatedAt { get; set; }
}
internal static class DtoDocumentExtensions
{
public static DtoDocument FromRecord(DtoRecord record)
=> new()
{
Id = record.Id.ToString(),
DocumentId = record.DocumentId.ToString(),
SourceName = record.SourceName,
SchemaVersion = record.SchemaVersion,
Payload = record.Payload ?? new BsonDocument(),
ValidatedAt = record.ValidatedAt.UtcDateTime,
};
public static DtoRecord ToRecord(this DtoDocument document)
=> new(
Guid.Parse(document.Id),
Guid.Parse(document.DocumentId),
document.SourceName,
document.SchemaVersion,
document.Payload,
DateTime.SpecifyKind(document.ValidatedAt, DateTimeKind.Utc));
}


@@ -1,11 +0,0 @@
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
public sealed record DtoRecord(
Guid Id,
Guid DocumentId,
string SourceName,
string SchemaVersion,
BsonDocument Payload,
DateTimeOffset ValidatedAt);


@@ -1,66 +0,0 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
public sealed class DtoStore : IDtoStore
{
private readonly IMongoCollection<DtoDocument> _collection;
private readonly ILogger<DtoStore> _logger;
public DtoStore(IMongoDatabase database, ILogger<DtoStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<DtoDocument>(MongoStorageDefaults.Collections.Dto);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<DtoRecord> UpsertAsync(DtoRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(record);
var document = DtoDocumentExtensions.FromRecord(record);
var documentId = record.DocumentId.ToString();
var filter = Builders<DtoDocument>.Filter.Eq(x => x.DocumentId, documentId)
& Builders<DtoDocument>.Filter.Eq(x => x.SourceName, record.SourceName);
var options = new FindOneAndReplaceOptions<DtoDocument>
{
IsUpsert = true,
ReturnDocument = ReturnDocument.After,
};
var replaced = session is null
? await _collection.FindOneAndReplaceAsync(filter, document, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndReplaceAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted DTO for {Source}/{DocumentId}", record.SourceName, record.DocumentId);
return (replaced ?? document).ToRecord();
}
public async Task<DtoRecord?> FindByDocumentIdAsync(Guid documentId, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var documentIdValue = documentId.ToString();
var filter = Builders<DtoDocument>.Filter.Eq(x => x.DocumentId, documentIdValue);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<IReadOnlyList<DtoRecord>> GetBySourceAsync(string sourceName, int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<DtoDocument>.Filter.Eq(x => x.SourceName, sourceName);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var documents = await query
    .SortByDescending(x => x.ValidatedAt)
    .Limit(limit)
    .ToListAsync(cancellationToken)
    .ConfigureAwait(false);
return documents.Select(static x => x.ToRecord()).ToArray();
}
}


@@ -1,12 +0,0 @@
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
public interface IDtoStore
{
Task<DtoRecord> UpsertAsync(DtoRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<DtoRecord?> FindByDocumentIdAsync(Guid documentId, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<IReadOnlyList<DtoRecord>> GetBySourceAsync(string sourceName, int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null);
}


@@ -1,425 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.Encodings.Web;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using StellaOps.Concelier.Core.Events;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Storage.Mongo.Conflicts;
using StellaOps.Concelier.Storage.Mongo.Statements;
using StellaOps.Provenance.Mongo;
namespace StellaOps.Concelier.Storage.Mongo.Events;
public sealed class MongoAdvisoryEventRepository : IAdvisoryEventRepository
{
private readonly IAdvisoryStatementStore _statementStore;
private readonly IAdvisoryConflictStore _conflictStore;
public MongoAdvisoryEventRepository(
IAdvisoryStatementStore statementStore,
IAdvisoryConflictStore conflictStore)
{
_statementStore = statementStore ?? throw new ArgumentNullException(nameof(statementStore));
_conflictStore = conflictStore ?? throw new ArgumentNullException(nameof(conflictStore));
}
public async ValueTask InsertStatementsAsync(
IReadOnlyCollection<AdvisoryStatementEntry> statements,
CancellationToken cancellationToken)
{
if (statements is null)
{
throw new ArgumentNullException(nameof(statements));
}
if (statements.Count == 0)
{
return;
}
var records = statements
.Select(static entry =>
{
var payload = BsonDocument.Parse(entry.CanonicalJson);
var (provenanceDoc, trustDoc) = BuildMetadata(entry.Provenance, entry.Trust);
return new AdvisoryStatementRecord(
entry.StatementId,
entry.VulnerabilityKey,
entry.AdvisoryKey,
entry.StatementHash.ToArray(),
entry.AsOf,
entry.RecordedAt,
payload,
entry.InputDocumentIds.ToArray(),
provenanceDoc,
trustDoc);
})
.ToList();
await _statementStore.InsertAsync(records, cancellationToken).ConfigureAwait(false);
}
public async ValueTask InsertConflictsAsync(
IReadOnlyCollection<AdvisoryConflictEntry> conflicts,
CancellationToken cancellationToken)
{
if (conflicts is null)
{
throw new ArgumentNullException(nameof(conflicts));
}
if (conflicts.Count == 0)
{
return;
}
var records = conflicts
.Select(static entry =>
{
var payload = BsonDocument.Parse(entry.CanonicalJson);
var (provenanceDoc, trustDoc) = BuildMetadata(entry.Provenance, entry.Trust);
return new AdvisoryConflictRecord(
entry.ConflictId,
entry.VulnerabilityKey,
entry.ConflictHash.ToArray(),
entry.AsOf,
entry.RecordedAt,
entry.StatementIds.ToArray(),
payload,
provenanceDoc,
trustDoc);
})
.ToList();
await _conflictStore.InsertAsync(records, cancellationToken).ConfigureAwait(false);
}
public async ValueTask<IReadOnlyList<AdvisoryStatementEntry>> GetStatementsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken)
{
var records = await _statementStore
.GetStatementsAsync(vulnerabilityKey, asOf, cancellationToken)
.ConfigureAwait(false);
if (records.Count == 0)
{
return Array.Empty<AdvisoryStatementEntry>();
}
var entries = records
.Select(static record =>
{
var advisory = CanonicalJsonSerializer.Deserialize<Advisory>(record.Payload.ToJson());
var canonicalJson = CanonicalJsonSerializer.Serialize(advisory);
var (provenance, trust) = ParseMetadata(record.Provenance, record.Trust);
return new AdvisoryStatementEntry(
record.Id,
record.VulnerabilityKey,
record.AdvisoryKey,
canonicalJson,
record.StatementHash.ToImmutableArray(),
record.AsOf,
record.RecordedAt,
record.InputDocumentIds.ToImmutableArray(),
provenance,
trust);
})
.ToList();
return entries;
}
public async ValueTask<IReadOnlyList<AdvisoryConflictEntry>> GetConflictsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken)
{
var records = await _conflictStore
.GetConflictsAsync(vulnerabilityKey, asOf, cancellationToken)
.ConfigureAwait(false);
if (records.Count == 0)
{
return Array.Empty<AdvisoryConflictEntry>();
}
var entries = records
.Select(static record =>
{
var canonicalJson = Canonicalize(record.Details);
var (provenance, trust) = ParseMetadata(record.Provenance, record.Trust);
return new AdvisoryConflictEntry(
record.Id,
record.VulnerabilityKey,
canonicalJson,
record.ConflictHash.ToImmutableArray(),
record.AsOf,
record.RecordedAt,
record.StatementIds.ToImmutableArray(),
provenance,
trust);
})
.ToList();
return entries;
}
public async ValueTask AttachStatementProvenanceAsync(
Guid statementId,
DsseProvenance dsse,
TrustInfo trust,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(dsse);
ArgumentNullException.ThrowIfNull(trust);
var (provenanceDoc, trustDoc) = BuildMetadata(dsse, trust);
if (provenanceDoc is null || trustDoc is null)
{
throw new InvalidOperationException("Failed to build provenance documents.");
}
await _statementStore
.UpdateProvenanceAsync(statementId, provenanceDoc, trustDoc, cancellationToken)
.ConfigureAwait(false);
}
private static readonly JsonWriterOptions CanonicalWriterOptions = new()
{
Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
Indented = false,
SkipValidation = false,
};
private static string Canonicalize(BsonDocument document)
{
using var json = JsonDocument.Parse(document.ToJson());
using var stream = new MemoryStream();
using (var writer = new Utf8JsonWriter(stream, CanonicalWriterOptions))
{
WriteCanonical(json.RootElement, writer);
}
return Encoding.UTF8.GetString(stream.ToArray());
}
private static (BsonDocument? Provenance, BsonDocument? Trust) BuildMetadata(DsseProvenance? provenance, TrustInfo? trust)
{
if (provenance is null || trust is null)
{
return (null, null);
}
var metadata = new BsonDocument();
metadata.AttachDsseProvenance(provenance, trust);
var provenanceDoc = metadata.TryGetValue("provenance", out var provenanceValue)
? (BsonDocument)provenanceValue.DeepClone()
: null;
var trustDoc = metadata.TryGetValue("trust", out var trustValue)
? (BsonDocument)trustValue.DeepClone()
: null;
return (provenanceDoc, trustDoc);
}
private static (DsseProvenance?, TrustInfo?) ParseMetadata(BsonDocument? provenanceDoc, BsonDocument? trustDoc)
{
DsseProvenance? dsse = null;
if (provenanceDoc is not null &&
provenanceDoc.TryGetValue("dsse", out var dsseValue) &&
dsseValue is BsonDocument dsseBody)
{
if (TryGetString(dsseBody, "envelopeDigest", out var envelopeDigest) &&
TryGetString(dsseBody, "payloadType", out var payloadType) &&
dsseBody.TryGetValue("key", out var keyValue) &&
keyValue is BsonDocument keyDoc &&
TryGetString(keyDoc, "keyId", out var keyId))
{
var keyInfo = new DsseKeyInfo
{
KeyId = keyId,
Issuer = GetOptionalString(keyDoc, "issuer"),
Algo = GetOptionalString(keyDoc, "algo"),
};
dsse = new DsseProvenance
{
EnvelopeDigest = envelopeDigest,
PayloadType = payloadType,
Key = keyInfo,
Rekor = ParseRekor(dsseBody),
Chain = ParseChain(dsseBody)
};
}
}
TrustInfo? trust = null;
if (trustDoc is not null)
{
trust = new TrustInfo
{
Verified = trustDoc.TryGetValue("verified", out var verifiedValue) && verifiedValue.ToBoolean(),
Verifier = GetOptionalString(trustDoc, "verifier"),
Witnesses = trustDoc.TryGetValue("witnesses", out var witnessValue) && witnessValue.IsInt32 ? witnessValue.AsInt32 : (int?)null,
PolicyScore = trustDoc.TryGetValue("policyScore", out var scoreValue) && scoreValue.IsNumeric ? scoreValue.ToDouble() : (double?)null
};
}
return (dsse, trust);
}
private static DsseRekorInfo? ParseRekor(BsonDocument dsseBody)
{
if (!dsseBody.TryGetValue("rekor", out var rekorValue) || !rekorValue.IsBsonDocument)
{
return null;
}
var rekorDoc = rekorValue.AsBsonDocument;
if (!TryGetInt64(rekorDoc, "logIndex", out var logIndex))
{
return null;
}
return new DsseRekorInfo
{
LogIndex = logIndex,
Uuid = GetOptionalString(rekorDoc, "uuid") ?? string.Empty,
IntegratedTime = TryGetInt64(rekorDoc, "integratedTime", out var integratedTime) ? integratedTime : null,
MirrorSeq = TryGetInt64(rekorDoc, "mirrorSeq", out var mirrorSeq) ? mirrorSeq : null
};
}
private static IReadOnlyCollection<DsseChainLink>? ParseChain(BsonDocument dsseBody)
{
if (!dsseBody.TryGetValue("chain", out var chainValue) || !chainValue.IsBsonArray)
{
return null;
}
var links = new List<DsseChainLink>();
foreach (var element in chainValue.AsBsonArray)
{
if (!element.IsBsonDocument)
{
continue;
}
var linkDoc = element.AsBsonDocument;
if (!TryGetString(linkDoc, "type", out var type) ||
!TryGetString(linkDoc, "id", out var id) ||
!TryGetString(linkDoc, "digest", out var digest))
{
continue;
}
links.Add(new DsseChainLink
{
Type = type,
Id = id,
Digest = digest
});
}
return links.Count == 0 ? null : links;
}
private static bool TryGetString(BsonDocument document, string name, out string value)
{
if (document.TryGetValue(name, out var bsonValue) && bsonValue.IsString)
{
value = bsonValue.AsString;
return true;
}
value = string.Empty;
return false;
}
private static string? GetOptionalString(BsonDocument document, string name)
=> document.TryGetValue(name, out var bsonValue) && bsonValue.IsString ? bsonValue.AsString : null;
private static bool TryGetInt64(BsonDocument document, string name, out long value)
{
if (document.TryGetValue(name, out var bsonValue))
{
if (bsonValue.IsInt64)
{
value = bsonValue.AsInt64;
return true;
}
if (bsonValue.IsInt32)
{
value = bsonValue.AsInt32;
return true;
}
if (bsonValue.IsString && long.TryParse(bsonValue.AsString, out var parsed))
{
value = parsed;
return true;
}
}
value = 0;
return false;
}
private static void WriteCanonical(JsonElement element, Utf8JsonWriter writer)
{
switch (element.ValueKind)
{
case JsonValueKind.Object:
writer.WriteStartObject();
foreach (var property in element.EnumerateObject().OrderBy(static p => p.Name, StringComparer.Ordinal))
{
writer.WritePropertyName(property.Name);
WriteCanonical(property.Value, writer);
}
writer.WriteEndObject();
break;
case JsonValueKind.Array:
writer.WriteStartArray();
foreach (var item in element.EnumerateArray())
{
WriteCanonical(item, writer);
}
writer.WriteEndArray();
break;
case JsonValueKind.String:
writer.WriteStringValue(element.GetString());
break;
case JsonValueKind.Number:
writer.WriteRawValue(element.GetRawText());
break;
case JsonValueKind.True:
writer.WriteBooleanValue(true);
break;
case JsonValueKind.False:
writer.WriteBooleanValue(false);
break;
case JsonValueKind.Null:
writer.WriteNullValue();
break;
default:
writer.WriteRawValue(element.GetRawText());
break;
}
}
}
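The private `Canonicalize`/`WriteCanonical` pair above derives a deterministic JSON form by sorting object properties with `StringComparer.Ordinal` and recursing through arrays. A minimal standalone sketch of the same idea, using only `System.Text.Json` (the sample documents are illustrative, not taken from the repository):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.Json;

// Canonicalize: parse, then re-emit with object keys in ordinal order.
static string Canonicalize(string json)
{
    using var doc = JsonDocument.Parse(json);
    using var stream = new MemoryStream();
    using (var writer = new Utf8JsonWriter(stream, new JsonWriterOptions { Indented = false }))
    {
        Write(doc.RootElement, writer);
    }
    return Encoding.UTF8.GetString(stream.ToArray());

    static void Write(JsonElement element, Utf8JsonWriter writer)
    {
        switch (element.ValueKind)
        {
            case JsonValueKind.Object:
                writer.WriteStartObject();
                // Ordinal ordering makes output independent of input key order.
                foreach (var property in element.EnumerateObject().OrderBy(static p => p.Name, StringComparer.Ordinal))
                {
                    writer.WritePropertyName(property.Name);
                    Write(property.Value, writer);
                }
                writer.WriteEndObject();
                break;
            case JsonValueKind.Array:
                writer.WriteStartArray();
                foreach (var item in element.EnumerateArray())
                {
                    Write(item, writer);
                }
                writer.WriteEndArray();
                break;
            default:
                // Scalars (strings, numbers, booleans, null) pass through verbatim.
                writer.WriteRawValue(element.GetRawText());
                break;
        }
    }
}

// Two documents that differ only in key order canonicalize identically.
Console.WriteLine(Canonicalize("{\"b\":1,\"a\":[{\"y\":2,\"x\":3}]}"));
Console.WriteLine(Canonicalize("{\"a\":[{\"x\":3,\"y\":2}],\"b\":1}"));
// Both print: {"a":[{"x":3,"y":2}],"b":1}
```

This ordering guarantee is what lets conflict payloads be hashed and compared byte-for-byte regardless of how the source BSON happened to order its fields.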


@@ -1,90 +0,0 @@
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
[BsonIgnoreExtraElements]
public sealed class ExportStateDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("baseExportId")]
public string? BaseExportId { get; set; }
[BsonElement("baseDigest")]
public string? BaseDigest { get; set; }
[BsonElement("lastFullDigest")]
public string? LastFullDigest { get; set; }
[BsonElement("lastDeltaDigest")]
public string? LastDeltaDigest { get; set; }
[BsonElement("exportCursor")]
public string? ExportCursor { get; set; }
[BsonElement("targetRepo")]
public string? TargetRepository { get; set; }
[BsonElement("exporterVersion")]
public string? ExporterVersion { get; set; }
[BsonElement("updatedAt")]
public DateTime UpdatedAt { get; set; }
[BsonElement("files")]
public List<ExportStateFileDocument>? Files { get; set; }
}
public sealed class ExportStateFileDocument
{
[BsonElement("path")]
public string Path { get; set; } = string.Empty;
[BsonElement("length")]
public long Length { get; set; }
[BsonElement("digest")]
public string Digest { get; set; } = string.Empty;
}
internal static class ExportStateDocumentExtensions
{
public static ExportStateDocument FromRecord(ExportStateRecord record)
=> new()
{
Id = record.Id,
BaseExportId = record.BaseExportId,
BaseDigest = record.BaseDigest,
LastFullDigest = record.LastFullDigest,
LastDeltaDigest = record.LastDeltaDigest,
ExportCursor = record.ExportCursor,
TargetRepository = record.TargetRepository,
ExporterVersion = record.ExporterVersion,
UpdatedAt = record.UpdatedAt.UtcDateTime,
Files = record.Files.Select(static file => new ExportStateFileDocument
{
Path = file.Path,
Length = file.Length,
Digest = file.Digest,
}).ToList(),
};
public static ExportStateRecord ToRecord(this ExportStateDocument document)
=> new(
document.Id,
document.BaseExportId,
document.BaseDigest,
document.LastFullDigest,
document.LastDeltaDigest,
document.ExportCursor,
document.TargetRepository,
document.ExporterVersion,
DateTime.SpecifyKind(document.UpdatedAt, DateTimeKind.Utc),
(document.Files ?? new List<ExportStateFileDocument>())
.Where(static entry => !string.IsNullOrWhiteSpace(entry.Path))
.Select(static entry => new ExportFileRecord(entry.Path, entry.Length, entry.Digest))
.ToArray());
}


@@ -1,135 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
/// <summary>
/// Helper for exporters to read and persist their export metadata in Mongo-backed storage.
/// </summary>
public sealed class ExportStateManager
{
private readonly IExportStateStore _store;
private readonly TimeProvider _timeProvider;
public ExportStateManager(IExportStateStore store, TimeProvider? timeProvider = null)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
_timeProvider = timeProvider ?? TimeProvider.System;
}
public Task<ExportStateRecord?> GetAsync(string exporterId, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(exporterId);
return _store.FindAsync(exporterId, cancellationToken);
}
public async Task<ExportStateRecord> StoreFullExportAsync(
string exporterId,
string exportId,
string exportDigest,
string? cursor,
string? targetRepository,
string exporterVersion,
bool resetBaseline,
IReadOnlyList<ExportFileRecord> manifest,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(exporterId);
ArgumentException.ThrowIfNullOrEmpty(exportId);
ArgumentException.ThrowIfNullOrEmpty(exportDigest);
ArgumentException.ThrowIfNullOrEmpty(exporterVersion);
manifest ??= Array.Empty<ExportFileRecord>();
var existing = await _store.FindAsync(exporterId, cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
if (existing is null)
{
var resolvedRepository = string.IsNullOrWhiteSpace(targetRepository) ? null : targetRepository;
return await _store.UpsertAsync(
new ExportStateRecord(
exporterId,
BaseExportId: exportId,
BaseDigest: exportDigest,
LastFullDigest: exportDigest,
LastDeltaDigest: null,
ExportCursor: cursor ?? exportDigest,
TargetRepository: resolvedRepository,
ExporterVersion: exporterVersion,
UpdatedAt: now,
Files: manifest),
cancellationToken).ConfigureAwait(false);
}
var repositorySpecified = !string.IsNullOrWhiteSpace(targetRepository);
var resolvedRepo = repositorySpecified ? targetRepository : existing.TargetRepository;
var repositoryChanged = repositorySpecified
&& !string.Equals(existing.TargetRepository, targetRepository, StringComparison.Ordinal);
var shouldResetBaseline =
resetBaseline
|| string.IsNullOrWhiteSpace(existing.BaseExportId)
|| string.IsNullOrWhiteSpace(existing.BaseDigest)
|| repositoryChanged;
var updatedRecord = shouldResetBaseline
? existing with
{
BaseExportId = exportId,
BaseDigest = exportDigest,
LastFullDigest = exportDigest,
LastDeltaDigest = null,
ExportCursor = cursor ?? exportDigest,
TargetRepository = resolvedRepo,
ExporterVersion = exporterVersion,
UpdatedAt = now,
Files = manifest,
}
: existing with
{
LastFullDigest = exportDigest,
LastDeltaDigest = null,
ExportCursor = cursor ?? existing.ExportCursor,
TargetRepository = resolvedRepo,
ExporterVersion = exporterVersion,
UpdatedAt = now,
Files = manifest,
};
return await _store.UpsertAsync(updatedRecord, cancellationToken).ConfigureAwait(false);
}
public async Task<ExportStateRecord> StoreDeltaExportAsync(
string exporterId,
string deltaDigest,
string? cursor,
string exporterVersion,
IReadOnlyList<ExportFileRecord> manifest,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(exporterId);
ArgumentException.ThrowIfNullOrEmpty(deltaDigest);
ArgumentException.ThrowIfNullOrEmpty(exporterVersion);
manifest ??= Array.Empty<ExportFileRecord>();
var existing = await _store.FindAsync(exporterId, cancellationToken).ConfigureAwait(false);
if (existing is null)
{
throw new InvalidOperationException($"Full export state missing for '{exporterId}'.");
}
var now = _timeProvider.GetUtcNow();
var record = existing with
{
LastDeltaDigest = deltaDigest,
ExportCursor = cursor ?? existing.ExportCursor,
ExporterVersion = exporterVersion,
UpdatedAt = now,
Files = manifest,
};
return await _store.UpsertAsync(record, cancellationToken).ConfigureAwait(false);
}
}
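A quick way to see the baseline rules in `StoreFullExportAsync` in action is to drive `ExportStateManager` with a throwaway in-memory store; the `InMemoryExportStateStore` stub below is hypothetical and exists only for this sketch:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Storage.Mongo.Exporting;

var manager = new ExportStateManager(new InMemoryExportStateStore());

// First full export seeds the baseline.
var first = await manager.StoreFullExportAsync(
    "exporter:json", "export-1", "sha256:aaa", cursor: null,
    targetRepository: "registry.example/feeds", exporterVersion: "1.0.0",
    resetBaseline: false, manifest: Array.Empty<ExportFileRecord>(), CancellationToken.None);

// Pointing the exporter at a different repository resets the baseline
// even though resetBaseline is false.
var second = await manager.StoreFullExportAsync(
    "exporter:json", "export-2", "sha256:bbb", cursor: null,
    targetRepository: "registry.example/other", exporterVersion: "1.0.0",
    resetBaseline: false, manifest: Array.Empty<ExportFileRecord>(), CancellationToken.None);

Console.WriteLine(first.BaseExportId);   // export-1
Console.WriteLine(second.BaseExportId);  // export-2

// Hypothetical in-memory stand-in for the Mongo-backed IExportStateStore.
sealed class InMemoryExportStateStore : IExportStateStore
{
    private readonly ConcurrentDictionary<string, ExportStateRecord> _records = new();

    public Task<ExportStateRecord> UpsertAsync(ExportStateRecord record, CancellationToken cancellationToken)
    {
        _records[record.Id] = record;
        return Task.FromResult(record);
    }

    public Task<ExportStateRecord?> FindAsync(string id, CancellationToken cancellationToken)
        => Task.FromResult<ExportStateRecord?>(_records.TryGetValue(id, out var record) ? record : null);
}
```

The same stub also shows why delta exports throw without a prior full export: `StoreDeltaExportAsync` refuses to run when `FindAsync` returns null for the exporter id.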


@@ -1,15 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
public sealed record ExportStateRecord(
string Id,
string? BaseExportId,
string? BaseDigest,
string? LastFullDigest,
string? LastDeltaDigest,
string? ExportCursor,
string? TargetRepository,
string? ExporterVersion,
DateTimeOffset UpdatedAt,
IReadOnlyList<ExportFileRecord> Files);
public sealed record ExportFileRecord(string Path, long Length, string Digest);


@@ -1,43 +0,0 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
public sealed class ExportStateStore : IExportStateStore
{
private readonly IMongoCollection<ExportStateDocument> _collection;
private readonly ILogger<ExportStateStore> _logger;
public ExportStateStore(IMongoDatabase database, ILogger<ExportStateStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<ExportStateDocument>(MongoStorageDefaults.Collections.ExportState);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<ExportStateRecord> UpsertAsync(ExportStateRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = ExportStateDocumentExtensions.FromRecord(record);
var options = new FindOneAndReplaceOptions<ExportStateDocument>
{
IsUpsert = true,
ReturnDocument = ReturnDocument.After,
};
var replaced = await _collection.FindOneAndReplaceAsync<ExportStateDocument, ExportStateDocument>(
x => x.Id == record.Id,
document,
options,
cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Stored export state {StateId}", record.Id);
return (replaced ?? document).ToRecord();
}
public async Task<ExportStateRecord?> FindAsync(string id, CancellationToken cancellationToken)
{
var document = await _collection.Find(x => x.Id == id).FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
}


@@ -1,8 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
public interface IExportStateStore
{
Task<ExportStateRecord> UpsertAsync(ExportStateRecord record, CancellationToken cancellationToken);
Task<ExportStateRecord?> FindAsync(string id, CancellationToken cancellationToken);
}


@@ -1,15 +0,0 @@
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public interface ISourceStateRepository
{
Task<SourceStateRecord?> TryGetAsync(string sourceName, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<SourceStateRecord> UpsertAsync(SourceStateRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<SourceStateRecord?> UpdateCursorAsync(string sourceName, BsonDocument cursor, DateTimeOffset completedAt, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<SourceStateRecord?> MarkFailureAsync(string sourceName, DateTimeOffset failedAt, TimeSpan? backoff, string? failureReason, CancellationToken cancellationToken, IClientSessionHandle? session = null);
}


@@ -1,38 +0,0 @@
using MongoDB.Bson.Serialization.Attributes;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
[BsonIgnoreExtraElements]
public sealed class JobLeaseDocument
{
[BsonId]
public string Key { get; set; } = string.Empty;
[BsonElement("holder")]
public string Holder { get; set; } = string.Empty;
[BsonElement("acquiredAt")]
public DateTime AcquiredAt { get; set; }
[BsonElement("heartbeatAt")]
public DateTime HeartbeatAt { get; set; }
[BsonElement("leaseMs")]
public long LeaseMs { get; set; }
[BsonElement("ttlAt")]
public DateTime TtlAt { get; set; }
}
internal static class JobLeaseDocumentExtensions
{
public static JobLease ToLease(this JobLeaseDocument document)
=> new(
document.Key,
document.Holder,
DateTime.SpecifyKind(document.AcquiredAt, DateTimeKind.Utc),
DateTime.SpecifyKind(document.HeartbeatAt, DateTimeKind.Utc),
TimeSpan.FromMilliseconds(document.LeaseMs),
DateTime.SpecifyKind(document.TtlAt, DateTimeKind.Utc));
}


@@ -1,119 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
[BsonIgnoreExtraElements]
public sealed class JobRunDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("kind")]
public string Kind { get; set; } = string.Empty;
[BsonElement("status")]
public string Status { get; set; } = JobRunStatus.Pending.ToString();
[BsonElement("trigger")]
public string Trigger { get; set; } = string.Empty;
[BsonElement("parameters")]
public BsonDocument Parameters { get; set; } = new();
[BsonElement("parametersHash")]
[BsonIgnoreIfNull]
public string? ParametersHash { get; set; }
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; }
[BsonElement("startedAt")]
[BsonIgnoreIfNull]
public DateTime? StartedAt { get; set; }
[BsonElement("completedAt")]
[BsonIgnoreIfNull]
public DateTime? CompletedAt { get; set; }
[BsonElement("error")]
[BsonIgnoreIfNull]
public string? Error { get; set; }
[BsonElement("timeoutMs")]
[BsonIgnoreIfNull]
public long? TimeoutMs { get; set; }
[BsonElement("leaseMs")]
[BsonIgnoreIfNull]
public long? LeaseMs { get; set; }
}
internal static class JobRunDocumentExtensions
{
public static JobRunDocument FromRequest(JobRunCreateRequest request, Guid id)
{
return new JobRunDocument
{
Id = id.ToString(),
Kind = request.Kind,
Status = JobRunStatus.Pending.ToString(),
Trigger = request.Trigger,
Parameters = request.Parameters is { Count: > 0 }
? BsonDocument.Parse(JsonSerializer.Serialize(request.Parameters))
: new BsonDocument(),
ParametersHash = request.ParametersHash,
CreatedAt = request.CreatedAt.UtcDateTime,
TimeoutMs = request.Timeout?.MillisecondsFromTimespan(),
LeaseMs = request.LeaseDuration?.MillisecondsFromTimespan(),
};
}
public static JobRunSnapshot ToSnapshot(this JobRunDocument document)
{
var parameters = document.Parameters?.ToDictionary() ?? new Dictionary<string, object?>();
return new JobRunSnapshot(
Guid.Parse(document.Id),
document.Kind,
Enum.Parse<JobRunStatus>(document.Status, ignoreCase: true),
DateTime.SpecifyKind(document.CreatedAt, DateTimeKind.Utc),
document.StartedAt.HasValue ? DateTime.SpecifyKind(document.StartedAt.Value, DateTimeKind.Utc) : null,
document.CompletedAt.HasValue ? DateTime.SpecifyKind(document.CompletedAt.Value, DateTimeKind.Utc) : null,
document.Trigger,
document.ParametersHash,
document.Error,
document.TimeoutMs?.MillisecondsToTimespan(),
document.LeaseMs?.MillisecondsToTimespan(),
parameters);
}
public static Dictionary<string, object?> ToDictionary(this BsonDocument document)
{
return document.Elements.ToDictionary(
static element => element.Name,
static element => element.Value switch
{
BsonString s => (object?)s.AsString,
BsonBoolean b => b.AsBoolean,
BsonInt32 i => i.AsInt32,
BsonInt64 l => l.AsInt64,
BsonDouble d => d.AsDouble,
BsonNull => null,
BsonArray array => array.Select(v => v.IsBsonDocument ? ToDictionary(v.AsBsonDocument) : (object?)v.ToString()).ToArray(),
BsonDocument doc => ToDictionary(doc),
_ => element.Value.ToString(),
});
}
private static long MillisecondsFromTimespan(this TimeSpan timeSpan)
=> (long)timeSpan.TotalMilliseconds;
private static TimeSpan MillisecondsToTimespan(this long milliseconds)
=> TimeSpan.FromMilliseconds(milliseconds);
}


@@ -1,11 +0,0 @@
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
public interface IJpFlagStore
{
Task UpsertAsync(JpFlagRecord record, CancellationToken cancellationToken);
Task<JpFlagRecord?> FindAsync(string advisoryKey, CancellationToken cancellationToken);
}


@@ -1,54 +0,0 @@
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
[BsonIgnoreExtraElements]
public sealed class JpFlagDocument
{
[BsonId]
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("sourceName")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("category")]
[BsonIgnoreIfNull]
public string? Category { get; set; }
[BsonElement("vendorStatus")]
[BsonIgnoreIfNull]
public string? VendorStatus { get; set; }
[BsonElement("recordedAt")]
public DateTime RecordedAt { get; set; }
}
internal static class JpFlagDocumentExtensions
{
public static JpFlagDocument FromRecord(JpFlagRecord record)
{
ArgumentNullException.ThrowIfNull(record);
return new JpFlagDocument
{
AdvisoryKey = record.AdvisoryKey,
SourceName = record.SourceName,
Category = record.Category,
VendorStatus = record.VendorStatus,
RecordedAt = record.RecordedAt.UtcDateTime,
};
}
public static JpFlagRecord ToRecord(this JpFlagDocument document)
{
ArgumentNullException.ThrowIfNull(document);
return new JpFlagRecord(
document.AdvisoryKey,
document.SourceName,
document.Category,
document.VendorStatus,
DateTime.SpecifyKind(document.RecordedAt, DateTimeKind.Utc));
}
}


@@ -1,15 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
/// <summary>
/// Captures Japan-specific enrichment flags derived from JVN payloads.
/// </summary>
public sealed record JpFlagRecord(
string AdvisoryKey,
string SourceName,
string? Category,
string? VendorStatus,
DateTimeOffset RecordedAt)
{
public JpFlagRecord WithRecordedAt(DateTimeOffset recordedAt)
=> this with { RecordedAt = recordedAt.ToUniversalTime() };
}


@@ -1,39 +0,0 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
public sealed class JpFlagStore : IJpFlagStore
{
private readonly IMongoCollection<JpFlagDocument> _collection;
private readonly ILogger<JpFlagStore> _logger;
public JpFlagStore(IMongoDatabase database, ILogger<JpFlagStore> logger)
{
ArgumentNullException.ThrowIfNull(database);
ArgumentNullException.ThrowIfNull(logger);
_collection = database.GetCollection<JpFlagDocument>(MongoStorageDefaults.Collections.JpFlags);
_logger = logger;
}
public async Task UpsertAsync(JpFlagRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = JpFlagDocumentExtensions.FromRecord(record);
var filter = Builders<JpFlagDocument>.Filter.Eq(x => x.AdvisoryKey, record.AdvisoryKey);
var options = new ReplaceOptions { IsUpsert = true };
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted jp_flag for {AdvisoryKey}", record.AdvisoryKey);
}
public async Task<JpFlagRecord?> FindAsync(string advisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
var filter = Builders<JpFlagDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
var document = await _collection.Find(filter).FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
}


@@ -1,122 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
[BsonIgnoreExtraElements]
public sealed class AdvisoryLinksetDocument
{
[BsonId]
public ObjectId Id { get; set; } = ObjectId.GenerateNewId();
[BsonElement("tenantId")]
public string TenantId { get; set; } = string.Empty;
[BsonElement("source")]
public string Source { get; set; } = string.Empty;
[BsonElement("advisoryId")]
public string AdvisoryId { get; set; } = string.Empty;
[BsonElement("observations")]
public List<string> Observations { get; set; } = new();
[BsonElement("normalized")]
[BsonIgnoreIfNull]
public AdvisoryLinksetNormalizedDocument? Normalized { get; set; }
[BsonElement("confidence")]
[BsonIgnoreIfNull]
public double? Confidence { get; set; }
[BsonElement("conflicts")]
[BsonIgnoreIfNull]
public List<AdvisoryLinksetConflictDocument>? Conflicts { get; set; }
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
[BsonElement("builtByJobId")]
[BsonIgnoreIfNull]
public string? BuiltByJobId { get; set; }
[BsonElement("provenance")]
[BsonIgnoreIfNull]
public AdvisoryLinksetProvenanceDocument? Provenance { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryLinksetNormalizedDocument
{
[BsonElement("purls")]
[BsonIgnoreIfNull]
public List<string>? Purls { get; set; } = new();
[BsonElement("cpes")]
[BsonIgnoreIfNull]
public List<string>? Cpes { get; set; } = new();
[BsonElement("versions")]
[BsonIgnoreIfNull]
public List<string>? Versions { get; set; } = new();
[BsonElement("ranges")]
[BsonIgnoreIfNull]
public List<BsonDocument>? Ranges { get; set; } = new();
[BsonElement("severities")]
[BsonIgnoreIfNull]
public List<BsonDocument>? Severities { get; set; } = new();
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryLinksetProvenanceDocument
{
[BsonElement("observationHashes")]
[BsonIgnoreIfNull]
public List<string>? ObservationHashes { get; set; } = new();
[BsonElement("toolVersion")]
[BsonIgnoreIfNull]
public string? ToolVersion { get; set; }
[BsonElement("policyHash")]
[BsonIgnoreIfNull]
public string? PolicyHash { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryLinksetConflictDocument
{
[BsonElement("field")]
public string Field { get; set; } = string.Empty;
[BsonElement("reason")]
public string Reason { get; set; } = string.Empty;
[BsonElement("values")]
[BsonIgnoreIfNull]
public List<string>? Values { get; set; } = new();
[BsonElement("sourceIds")]
[BsonIgnoreIfNull]
public List<string>? SourceIds { get; set; } = new();
}


@@ -1,23 +0,0 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using CoreLinksets = StellaOps.Concelier.Core.Linksets;
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
// Backcompat sink name retained for compile includes; forwards to the Mongo-specific store.
internal sealed class AdvisoryLinksetSink : CoreLinksets.IAdvisoryLinksetSink
{
private readonly IMongoAdvisoryLinksetStore _store;
public AdvisoryLinksetSink(IMongoAdvisoryLinksetStore store)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
}
public Task UpsertAsync(CoreLinksets.AdvisoryLinkset linkset, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(linkset);
return _store.UpsertAsync(linkset, cancellationToken);
}
}


@@ -1,20 +0,0 @@
using System;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
internal sealed class ConcelierMongoLinksetSink : global::StellaOps.Concelier.Core.Linksets.IAdvisoryLinksetSink
{
private readonly IMongoAdvisoryLinksetStore _store;
public ConcelierMongoLinksetSink(IMongoAdvisoryLinksetStore store)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
}
public Task UpsertAsync(global::StellaOps.Concelier.Core.Linksets.AdvisoryLinkset linkset, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(linkset);
return _store.UpsertAsync(linkset, cancellationToken);
}
}


@@ -1,186 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
using CoreLinksets = StellaOps.Concelier.Core.Linksets;
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
// Storage implementation of advisory linkset persistence.
internal sealed class ConcelierMongoLinksetStore : IMongoAdvisoryLinksetStore
{
private readonly IMongoCollection<AdvisoryLinksetDocument> _collection;
public ConcelierMongoLinksetStore(IMongoCollection<AdvisoryLinksetDocument> collection)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
}
public async Task UpsertAsync(CoreLinksets.AdvisoryLinkset linkset, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(linkset);
var document = MapToDocument(linkset);
var tenant = linkset.TenantId.ToLowerInvariant();
var filter = Builders<AdvisoryLinksetDocument>.Filter.And(
Builders<AdvisoryLinksetDocument>.Filter.Eq(d => d.TenantId, tenant),
Builders<AdvisoryLinksetDocument>.Filter.Eq(d => d.Source, linkset.Source),
Builders<AdvisoryLinksetDocument>.Filter.Eq(d => d.AdvisoryId, linkset.AdvisoryId));
var options = new ReplaceOptions { IsUpsert = true };
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
public async Task<IReadOnlyList<CoreLinksets.AdvisoryLinkset>> FindByTenantAsync(
string tenantId,
IEnumerable<string>? advisoryIds,
IEnumerable<string>? sources,
CoreLinksets.AdvisoryLinksetCursor? cursor,
int limit,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenantId);
if (limit <= 0)
{
throw new ArgumentOutOfRangeException(nameof(limit));
}
var builder = Builders<AdvisoryLinksetDocument>.Filter;
var filters = new List<FilterDefinition<AdvisoryLinksetDocument>>
{
builder.Eq(d => d.TenantId, tenantId.ToLowerInvariant())
};
if (advisoryIds is not null)
{
var ids = advisoryIds.Where(v => !string.IsNullOrWhiteSpace(v)).ToArray();
if (ids.Length > 0)
{
filters.Add(builder.In(d => d.AdvisoryId, ids));
}
}
if (sources is not null)
{
var srcs = sources.Where(v => !string.IsNullOrWhiteSpace(v)).ToArray();
if (srcs.Length > 0)
{
filters.Add(builder.In(d => d.Source, srcs));
}
}
var filter = builder.And(filters);
if (cursor is not null)
{
var cursorFilter = builder.Or(
builder.Lt(d => d.CreatedAt, cursor.CreatedAt.UtcDateTime),
builder.And(
builder.Eq(d => d.CreatedAt, cursor.CreatedAt.UtcDateTime),
builder.Gt(d => d.AdvisoryId, cursor.AdvisoryId)));
filter = builder.And(filter, cursorFilter);
}
var sort = Builders<AdvisoryLinksetDocument>.Sort.Descending(d => d.CreatedAt).Ascending(d => d.AdvisoryId);
var documents = await _collection.Find(filter)
.Sort(sort)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(FromDocument).ToArray();
}
private static AdvisoryLinksetDocument MapToDocument(CoreLinksets.AdvisoryLinkset linkset)
{
return new AdvisoryLinksetDocument
{
TenantId = linkset.TenantId.ToLowerInvariant(),
Source = linkset.Source,
AdvisoryId = linkset.AdvisoryId,
Observations = new List<string>(linkset.ObservationIds),
CreatedAt = linkset.CreatedAt.UtcDateTime,
BuiltByJobId = linkset.BuiltByJobId,
Confidence = linkset.Confidence,
Conflicts = linkset.Conflicts is null
? null
: linkset.Conflicts.Select(conflict => new AdvisoryLinksetConflictDocument
{
Field = conflict.Field,
Reason = conflict.Reason,
Values = conflict.Values is null ? null : new List<string>(conflict.Values),
SourceIds = conflict.SourceIds is null ? null : new List<string>(conflict.SourceIds)
}).ToList(),
Provenance = linkset.Provenance is null ? null : new AdvisoryLinksetProvenanceDocument
{
ObservationHashes = linkset.Provenance.ObservationHashes is null
? null
: new List<string>(linkset.Provenance.ObservationHashes),
ToolVersion = linkset.Provenance.ToolVersion,
PolicyHash = linkset.Provenance.PolicyHash,
},
Normalized = linkset.Normalized is null ? null : new AdvisoryLinksetNormalizedDocument
{
Purls = linkset.Normalized.Purls is null ? null : new List<string>(linkset.Normalized.Purls),
Cpes = linkset.Normalized.Cpes is null ? null : new List<string>(linkset.Normalized.Cpes),
Versions = linkset.Normalized.Versions is null ? null : new List<string>(linkset.Normalized.Versions),
Ranges = linkset.Normalized.RangesToBson(),
Severities = linkset.Normalized.SeveritiesToBson(),
}
};
}
private static CoreLinksets.AdvisoryLinkset FromDocument(AdvisoryLinksetDocument doc)
{
return new CoreLinksets.AdvisoryLinkset(
doc.TenantId,
doc.Source,
doc.AdvisoryId,
doc.Observations.ToImmutableArray(),
doc.Normalized is null ? null : new CoreLinksets.AdvisoryLinksetNormalized(
doc.Normalized.Purls,
doc.Normalized.Cpes,
doc.Normalized.Versions,
doc.Normalized.Ranges?.Select(ToDictionary).ToList(),
doc.Normalized.Severities?.Select(ToDictionary).ToList()),
doc.Provenance is null ? null : new CoreLinksets.AdvisoryLinksetProvenance(
doc.Provenance.ObservationHashes,
doc.Provenance.ToolVersion,
doc.Provenance.PolicyHash),
doc.Confidence,
doc.Conflicts is null
? null
: doc.Conflicts.Select(conflict => new CoreLinksets.AdvisoryLinksetConflict(
conflict.Field,
conflict.Reason,
conflict.Values,
conflict.SourceIds)).ToList(),
DateTime.SpecifyKind(doc.CreatedAt, DateTimeKind.Utc),
doc.BuiltByJobId);
}
private static Dictionary<string, object?> ToDictionary(MongoDB.Bson.BsonDocument bson)
{
var dict = new Dictionary<string, object?>(StringComparer.Ordinal);
foreach (var element in bson.Elements)
{
dict[element.Name] = element.Value switch
{
MongoDB.Bson.BsonString s => s.AsString,
MongoDB.Bson.BsonInt32 i => i.AsInt32,
MongoDB.Bson.BsonInt64 l => l.AsInt64,
MongoDB.Bson.BsonDouble d => d.AsDouble,
MongoDB.Bson.BsonDecimal128 dec => dec.ToDecimal(),
MongoDB.Bson.BsonBoolean b => b.AsBoolean,
MongoDB.Bson.BsonDateTime dt => dt.ToUniversalTime(),
MongoDB.Bson.BsonNull => (object?)null,
MongoDB.Bson.BsonArray arr => arr.Select(v => v.ToString()).ToArray(),
_ => element.Value.ToString()
};
}
return dict;
}
}


@@ -1,5 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.Linksets;
public interface IMongoAdvisoryLinksetStore : global::StellaOps.Concelier.Core.Linksets.IAdvisoryLinksetStore, global::StellaOps.Concelier.Core.Linksets.IAdvisoryLinksetLookup
{
}


@@ -1,61 +0,0 @@
# Mongo Schema Migration Playbook
This module owns the persistent shape of Concelier's MongoDB database. Upgrades must be deterministic and safe to run on live replicas. The `MongoMigrationRunner` executes idempotent migrations on startup immediately after the bootstrapper completes its collection and index checks.
## Execution Path
1. `StellaOps.Concelier.WebService` calls `MongoBootstrapper.InitializeAsync()` during startup.
2. Once collections and baseline indexes are ensured, the bootstrapper invokes `MongoMigrationRunner.RunAsync()`.
3. Each `IMongoMigration` implementation is sorted by its `Id` (ordinal compare) and executed exactly once. Completion is recorded in the `schema_migrations` collection.
4. Failures surface during startup and prevent the service from serving traffic, matching our "fail-fast" requirement for storage incompatibilities.
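The fail-fast flow above can be sketched as follows. This is a hypothetical illustration, not the actual `MongoMigrationRunner` body; only `schema_migrations` and its `_id`/`description`/`appliedAt` fields, the ordinal sort on `Id`, and the `ApplyAsync` signature come from this playbook and the migration sources.

```csharp
// Illustrative core loop: sort by Id (ordinal), skip applied migrations,
// record completion, and let exceptions abort startup.
var history = database.GetCollection<BsonDocument>("schema_migrations");
foreach (var migration in migrations.OrderBy(m => m.Id, StringComparer.Ordinal))
{
    var filter = Builders<BsonDocument>.Filter.Eq("_id", migration.Id);
    if (await history.Find(filter).AnyAsync(cancellationToken).ConfigureAwait(false))
    {
        continue; // already applied exactly once
    }
    // Failures propagate so the service never serves traffic on a bad schema.
    await migration.ApplyAsync(database, cancellationToken).ConfigureAwait(false);
    await history.InsertOneAsync(new BsonDocument
    {
        { "_id", migration.Id },
        { "description", migration.Description },
        { "appliedAt", DateTime.UtcNow }
    }, cancellationToken: cancellationToken).ConfigureAwait(false);
}
```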
## Creating a Migration
1. Implement `IMongoMigration` under `StellaOps.Concelier.Storage.Mongo.Migrations`. Use a monotonically increasing identifier such as `yyyyMMdd_description`.
2. Keep the body idempotent: query state first, drop/re-create indexes only when mismatch is detected, and avoid multi-document transactions unless required.
3. Add the migration to DI in `ServiceCollectionExtensions` so it flows into the runner.
4. Write an integration test that exercises the migration against a Mongo2Go instance to validate behaviour.
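A minimal migration following the steps above might look like this; the class name, collection, and index shape are illustrative assumptions, while the `IMongoMigration` members mirror the migrations in this module.

```csharp
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

namespace StellaOps.Concelier.Storage.Mongo.Migrations;

public sealed class EnsureExampleIndexMigration : IMongoMigration
{
    public string Id => "20991231_example_index";
    public string Description => "Illustrative idempotent index migration.";

    public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
    {
        var collection = database.GetCollection<BsonDocument>("advisory");
        // Index creation is a no-op when an identical index already exists,
        // which keeps re-runs of this migration safe.
        var model = new CreateIndexModel<BsonDocument>(
            Builders<BsonDocument>.IndexKeys.Ascending("advisoryKey"),
            new CreateIndexOptions { Name = "advisory_example_key" });
        await collection.Indexes.CreateOneAsync(model, cancellationToken: cancellationToken).ConfigureAwait(false);
    }
}
```

Register the class in `ServiceCollectionExtensions` (step 3) so the runner picks it up on the next boot.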
## Current Migrations
| Id | Description |
| --- | --- |
| `20241005_document_expiry_indexes` | Ensures `document` collection uses the correct TTL/partial index depending on raw document retention settings. |
| `20241005_gridfs_expiry_indexes` | Aligns the GridFS `documents.files` TTL index with retention settings. |
| `20251019_advisory_event_collections` | Creates/aligns indexes for `advisory_statements` and `advisory_conflicts` collections powering the event log + conflict replay pipeline. |
| `20251028_advisory_raw_idempotency_index` | Applies compound unique index on `(source.vendor, upstream.upstream_id, upstream.content_hash, tenant)` after verifying no duplicates exist. |
| `20251028_advisory_supersedes_backfill` | Renames legacy `advisory` collection to a read-only backup view and backfills `supersedes` chains across `advisory_raw`. |
| `20251028_advisory_raw_validator` | Applies Aggregation-Only Contract JSON schema validator to the `advisory_raw` collection with configurable enforcement level. |
| `20251104_advisory_observations_raw_linkset` | Backfills `rawLinkset` on `advisory_observations` using stored `advisory_raw` documents so canonical and raw projections co-exist for downstream policy joins. |
| `20251120_advisory_observation_events` | Creates `advisory_observation_events` collection with tenant/hash indexes for observation event fan-out (advisory.observation.updated@1). Includes optional `publishedAt` marker for transport outbox. |
| `20251117_advisory_linksets_tenant_lower` | Lowercases `advisory_linksets.tenantId` to align writes with lookup filters. |
| `20251116_link_not_merge_collections` | Ensures `advisory_observations` and `advisory_linksets` collections exist with JSON schema validators and baseline indexes for LNM. |
| `20251127_lnm_sharding_and_ttl` | Adds hashed shard key indexes on `tenantId` for horizontal scaling and optional TTL indexes on `ingestedAt`/`createdAt` for storage retention. Creates `advisory_linkset_events` collection for linkset event outbox (LNM-21-101-DEV). |
| `20251127_lnm_legacy_backfill` | Backfills `advisory_observations` from `advisory_raw` documents and creates/updates `advisory_linksets` by grouping observations. Seeds `backfill_marker` tombstones on migrated documents for rollback tracking (LNM-21-102-DEV). |
| `20251128_policy_delta_checkpoints` | Creates `policy_delta_checkpoints` collection with tenant/consumer indexes for deterministic policy delta tracking. Supports cursor-based pagination and change-stream resume tokens for policy consumers (CONCELIER-POLICY-20-003). |
| `20251128_policy_lookup_indexes` | Adds secondary indexes for policy lookup patterns: alias multikey index on observations, confidence/severity indexes on linksets. Supports efficient policy joins without cached verdicts (CONCELIER-POLICY-23-001). |
## Operator Runbook
- `schema_migrations` records each applied migration (`_id`, `description`, `appliedAt`). Review this collection when auditing upgrades.
- Prior to applying `20251028_advisory_raw_idempotency_index`, run the duplicate audit script against the target database:

  ```bash
  mongo concelier ops/devops/scripts/check-advisory-raw-duplicates.js --eval 'var LIMIT=200;'
  ```

  Resolve any reported rows before rolling out the migration.
- After `20251028_advisory_supersedes_backfill` completes, ensure `db.advisory` reports `type: "view"` and `options.viewOn: "advisory_backup_20251028"`. Supersedes chains can be spot-checked via `db.advisory_raw.find({ supersedes: { $exists: true } }).limit(5)`.
- To re-run a migration in a lab, delete the corresponding document from `schema_migrations` and restart the service. **Do not** do this in production unless the migration body is known to be idempotent and safe.
- When changing retention settings (`RawDocumentRetention`), deploy the new configuration and restart Concelier. The migration runner will adjust indexes on the next boot.
- For the event-log collections (`advisory_statements`, `advisory_conflicts`), rollback is simply `db.advisory_statements.drop()` / `db.advisory_conflicts.drop()` followed by a restart if you must revert to the pre-event-log schema (only in labs). Production rollbacks should instead gate merge features that rely on these collections.
- For `20251127_lnm_legacy_backfill` rollback, use the provided Offline Kit script:

  ```bash
  mongo concelier ops/devops/scripts/rollback-lnm-backfill.js
  ```

  This script removes backfilled observations and linksets by querying the `backfill_marker` field (`lnm_21_102_dev`), then clears the tombstone markers from `advisory_raw`. After rollback, delete `20251127_lnm_legacy_backfill` from `schema_migrations` and restart.
- If migrations fail, restart with `Logging__LogLevel__StellaOps.Concelier.Storage.Mongo.Migrations=Debug` to surface diagnostic output. Remediate underlying index/collection drift before retrying.
## Validating an Upgrade
1. Run `dotnet test --filter MongoMigrationRunnerTests` to exercise integration coverage.
2. In staging, execute `db.schema_migrations.find().sort({_id:1})` to verify applied migrations and timestamps.
3. Inspect index shapes: `db.document.getIndexes()` and `db.documents.files.getIndexes()` for TTL/partial filter alignment.


@@ -1,8 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public interface IMergeEventStore
{
Task AppendAsync(MergeEventRecord record, CancellationToken cancellationToken);
Task<IReadOnlyList<MergeEventRecord>> GetRecentAsync(string advisoryKey, int limit, CancellationToken cancellationToken);
}


@@ -1,101 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
[BsonIgnoreExtraElements]
public sealed class MergeEventDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("beforeHash")]
public byte[] BeforeHash { get; set; } = Array.Empty<byte>();
[BsonElement("afterHash")]
public byte[] AfterHash { get; set; } = Array.Empty<byte>();
[BsonElement("mergedAt")]
public DateTime MergedAt { get; set; }
[BsonElement("inputDocuments")]
public List<string> InputDocuments { get; set; } = new();
[BsonElement("fieldDecisions")]
[BsonIgnoreIfNull]
public List<MergeFieldDecisionDocument>? FieldDecisions { get; set; }
}
internal static class MergeEventDocumentExtensions
{
public static MergeEventDocument FromRecord(MergeEventRecord record)
=> new()
{
Id = record.Id.ToString(),
AdvisoryKey = record.AdvisoryKey,
BeforeHash = record.BeforeHash,
AfterHash = record.AfterHash,
MergedAt = record.MergedAt.UtcDateTime,
InputDocuments = record.InputDocumentIds.Select(static id => id.ToString()).ToList(),
FieldDecisions = record.FieldDecisions.Count == 0
? null
: record.FieldDecisions.Select(MergeFieldDecisionDocument.FromRecord).ToList(),
};
public static MergeEventRecord ToRecord(this MergeEventDocument document)
=> new(
Guid.Parse(document.Id),
document.AdvisoryKey,
document.BeforeHash,
document.AfterHash,
DateTime.SpecifyKind(document.MergedAt, DateTimeKind.Utc),
document.InputDocuments.Select(static value => Guid.Parse(value)).ToList(),
document.FieldDecisions is null
? Array.Empty<MergeFieldDecision>()
: document.FieldDecisions.Select(static decision => decision.ToRecord()).ToList());
}
[BsonIgnoreExtraElements]
public sealed class MergeFieldDecisionDocument
{
[BsonElement("field")]
public string Field { get; set; } = string.Empty;
[BsonElement("selectedSource")]
[BsonIgnoreIfNull]
public string? SelectedSource { get; set; }
[BsonElement("decisionReason")]
public string DecisionReason { get; set; } = string.Empty;
[BsonElement("selectedModified")]
[BsonIgnoreIfNull]
public DateTime? SelectedModified { get; set; }
[BsonElement("consideredSources")]
public List<string> ConsideredSources { get; set; } = new();
public static MergeFieldDecisionDocument FromRecord(MergeFieldDecision record)
=> new()
{
Field = record.Field,
SelectedSource = record.SelectedSource,
DecisionReason = record.DecisionReason,
SelectedModified = record.SelectedModified?.UtcDateTime,
ConsideredSources = record.ConsideredSources.ToList(),
};
public MergeFieldDecision ToRecord()
=> new(
Field,
SelectedSource,
DecisionReason,
SelectedModified.HasValue ? DateTime.SpecifyKind(SelectedModified.Value, DateTimeKind.Utc) : null,
ConsideredSources);
}


@@ -1,10 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public sealed record MergeEventRecord(
Guid Id,
string AdvisoryKey,
byte[] BeforeHash,
byte[] AfterHash,
DateTimeOffset MergedAt,
IReadOnlyList<Guid> InputDocumentIds,
IReadOnlyList<MergeFieldDecision> FieldDecisions);


@@ -1,36 +0,0 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public sealed class MergeEventStore : IMergeEventStore
{
private readonly IMongoCollection<MergeEventDocument> _collection;
private readonly ILogger<MergeEventStore> _logger;
public MergeEventStore(IMongoDatabase database, ILogger<MergeEventStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<MergeEventDocument>(MongoStorageDefaults.Collections.MergeEvent);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task AppendAsync(MergeEventRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = MergeEventDocumentExtensions.FromRecord(record);
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Appended merge event {MergeId} for {AdvisoryKey}", record.Id, record.AdvisoryKey);
}
public async Task<IReadOnlyList<MergeEventRecord>> GetRecentAsync(string advisoryKey, int limit, CancellationToken cancellationToken)
{
var cursor = await _collection.Find(x => x.AdvisoryKey == advisoryKey)
.SortByDescending(x => x.MergedAt)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static x => x.ToRecord()).ToArray();
}
}


@@ -1,8 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public sealed record MergeFieldDecision(
string Field,
string? SelectedSource,
string DecisionReason,
DateTimeOffset? SelectedModified,
IReadOnlyList<string> ConsideredSources);


@@ -1,166 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Raw;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisoryCanonicalKeyBackfillMigration : IMongoMigration
{
public string Id => "2025-11-07-advisory-canonical-key";
public string Description => "Populate advisory_key and links for advisory_raw documents.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
var filter = Builders<BsonDocument>.Filter.Or(
Builders<BsonDocument>.Filter.Exists("advisory_key", false),
Builders<BsonDocument>.Filter.Type("advisory_key", BsonType.Null),
Builders<BsonDocument>.Filter.Eq("advisory_key", string.Empty),
Builders<BsonDocument>.Filter.Or(
Builders<BsonDocument>.Filter.Exists("links", false),
Builders<BsonDocument>.Filter.Type("links", BsonType.Null)));
using var cursor = await collection.Find(filter).ToCursorAsync(cancellationToken).ConfigureAwait(false);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var document in cursor.Current)
{
cancellationToken.ThrowIfCancellationRequested();
if (!document.TryGetValue("_id", out var idValue) || idValue.IsBsonNull)
{
continue;
}
var source = ParseSource(document.GetValue("source", new BsonDocument()).AsBsonDocument);
var upstream = ParseUpstream(document.GetValue("upstream", new BsonDocument()).AsBsonDocument);
var identifiers = ParseIdentifiers(document.GetValue("identifiers", new BsonDocument()).AsBsonDocument);
var canonical = AdvisoryCanonicalizer.Canonicalize(identifiers, source, upstream);
var linksArray = new BsonArray((canonical.Links.IsDefaultOrEmpty ? ImmutableArray<RawLink>.Empty : canonical.Links)
.Select(link => new BsonDocument
{
{ "scheme", link.Scheme },
{ "value", link.Value }
}));
var update = Builders<BsonDocument>.Update
.Set("advisory_key", canonical.AdvisoryKey)
.Set("links", linksArray);
await collection.UpdateOneAsync(
Builders<BsonDocument>.Filter.Eq("_id", idValue),
update,
cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
private static RawSourceMetadata ParseSource(BsonDocument source)
{
return new RawSourceMetadata(
GetRequiredString(source, "vendor"),
GetOptionalString(source, "connector") ?? string.Empty,
GetOptionalString(source, "version") ?? "unknown",
GetOptionalString(source, "stream"));
}
private static RawUpstreamMetadata ParseUpstream(BsonDocument upstream)
{
var provenance = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (upstream.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument)
{
foreach (var element in provenanceValue.AsBsonDocument)
{
provenance[element.Name] = BsonValueToString(element.Value);
}
}
var signature = upstream.TryGetValue("signature", out var signatureValue) && signatureValue.IsBsonDocument
? signatureValue.AsBsonDocument
: new BsonDocument();
var signatureMetadata = new RawSignatureMetadata(
signature.GetValue("present", BsonBoolean.False).AsBoolean,
signature.TryGetValue("format", out var format) && !format.IsBsonNull ? format.AsString : null,
signature.TryGetValue("key_id", out var keyId) && !keyId.IsBsonNull ? keyId.AsString : null,
signature.TryGetValue("sig", out var sig) && !sig.IsBsonNull ? sig.AsString : null,
signature.TryGetValue("certificate", out var certificate) && !certificate.IsBsonNull ? certificate.AsString : null,
signature.TryGetValue("digest", out var digest) && !digest.IsBsonNull ? digest.AsString : null);
return new RawUpstreamMetadata(
GetRequiredString(upstream, "upstream_id"),
upstream.TryGetValue("document_version", out var version) && !version.IsBsonNull ? version.AsString : null,
GetDateTimeOffset(upstream, "retrieved_at", DateTimeOffset.UtcNow),
GetRequiredString(upstream, "content_hash"),
signatureMetadata,
provenance.ToImmutable());
}
private static RawIdentifiers ParseIdentifiers(BsonDocument identifiers)
{
var aliases = identifiers.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
return new RawIdentifiers(
aliases,
GetRequiredString(identifiers, "primary"));
}
private static string GetRequiredString(BsonDocument document, string name)
{
if (!document.TryGetValue(name, out var value) || value.IsBsonNull)
{
return string.Empty;
}
return value.IsString ? value.AsString : value.ToString() ?? string.Empty;
}
private static string? GetOptionalString(BsonDocument document, string name)
{
if (!document.TryGetValue(name, out var value) || value.IsBsonNull)
{
return null;
}
return value.IsString ? value.AsString : value.ToString();
}
private static string BsonValueToString(BsonValue value)
{
return value switch
{
null => string.Empty,
BsonString s => s.AsString,
BsonBoolean b => b.AsBoolean.ToString(),
BsonDateTime dateTime => dateTime.ToUniversalTime().ToString("O"),
BsonInt32 i => i.AsInt32.ToString(CultureInfo.InvariantCulture),
BsonInt64 l => l.AsInt64.ToString(CultureInfo.InvariantCulture),
BsonDouble d => d.AsDouble.ToString(CultureInfo.InvariantCulture),
_ => value?.ToString() ?? string.Empty
};
}
private static DateTimeOffset GetDateTimeOffset(BsonDocument document, string name, DateTimeOffset defaultValue)
{
if (!document.TryGetValue(name, out var value) || value.IsBsonNull)
{
return defaultValue;
}
return value.ToUniversalTime();
}
}


@@ -1,44 +0,0 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisoryEventCollectionsMigration : IMongoMigration
{
public string Id => "20251019_advisory_event_collections";
public string Description => "Ensure advisory_statements and advisory_conflicts indexes exist for event log storage.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var statements = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryStatements);
var conflicts = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryConflicts);
var statementIndexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_statements_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("statementHash"),
new CreateIndexOptions { Name = "advisory_statements_statementHash_unique", Unique = true }),
};
await statements.Indexes.CreateManyAsync(statementIndexes, cancellationToken).ConfigureAwait(false);
var conflictIndexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_conflicts_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("conflictHash"),
new CreateIndexOptions { Name = "advisory_conflicts_conflictHash_unique", Unique = true }),
};
await conflicts.Indexes.CreateManyAsync(conflictIndexes, cancellationToken).ConfigureAwait(false);
}
}


@@ -1,69 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Normalises advisory_linksets tenant ids to lowercase to keep lookups/write paths consistent.
/// </summary>
public sealed class EnsureAdvisoryLinksetsTenantLowerMigration : IMongoMigration
{
private const string MigrationId = "20251117_advisory_linksets_tenant_lower";
private const int BatchSize = 500;
public string Id => MigrationId;
public string Description => "Lowercase tenant ids in advisory_linksets to match query filters.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryLinksets);
using var cursor = await collection
.Find(Builders<BsonDocument>.Filter.Empty)
.ToCursorAsync(cancellationToken)
.ConfigureAwait(false);
var writes = new List<WriteModel<BsonDocument>>(BatchSize);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var doc in cursor.Current)
{
if (!doc.TryGetValue("tenantId", out var tenantValue) || tenantValue.BsonType != BsonType.String)
{
continue;
}
var currentTenant = tenantValue.AsString;
var lower = currentTenant.ToLowerInvariant();
if (lower == currentTenant)
{
continue;
}
var idFilter = Builders<BsonDocument>.Filter.Eq("_id", doc["_id"]);
// Use the serialized field name from [BsonElement("tenantId")] so the update
// actually hits the documents that lookup filters query.
var update = Builders<BsonDocument>.Update.Set("tenantId", lower);
writes.Add(new UpdateOneModel<BsonDocument>(idFilter, update));
if (writes.Count >= BatchSize)
{
await collection.BulkWriteAsync(writes, new BulkWriteOptions { IsOrdered = false }, cancellationToken)
.ConfigureAwait(false);
writes.Clear();
}
}
}
if (writes.Count > 0)
{
await collection.BulkWriteAsync(writes, new BulkWriteOptions { IsOrdered = false }, cancellationToken)
.ConfigureAwait(false);
}
}
}


@@ -1,31 +0,0 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisoryObservationEventCollectionMigration : IMongoMigration
{
public string Id => "20251120_advisory_observation_events";
public string Description => "Ensure advisory_observation_events collection and indexes exist for observation event fan-out.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservationEvents);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("tenantId").Descending("ingestedAt"),
new CreateIndexOptions { Name = "advisory_observation_events_tenant_ingested_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("observationHash"),
new CreateIndexOptions { Name = "advisory_observation_events_hash_unique", Unique = true }),
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken).ConfigureAwait(false);
}
}


@@ -1,442 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using MongoDB.Driver;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Backfills the raw linkset projection on advisory observations so downstream services
/// can rely on both canonical and raw linkset shapes.
/// </summary>
public sealed class EnsureAdvisoryObservationsRawLinksetMigration : IMongoMigration
{
private const string MigrationId = "20251104_advisory_observations_raw_linkset";
private const int BulkBatchSize = 500;
public string Id => MigrationId;
public string Description => "Populate rawLinkset field for advisory observations using stored advisory_raw documents.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var observations = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var rawCollection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
var filter = Builders<BsonDocument>.Filter.Exists("rawLinkset", false) |
Builders<BsonDocument>.Filter.Type("rawLinkset", BsonType.Null);
using var cursor = await observations
.Find(filter)
.ToCursorAsync(cancellationToken)
.ConfigureAwait(false);
var updates = new List<WriteModel<BsonDocument>>(BulkBatchSize);
var missingRawDocuments = new List<string>();
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var observationDocument in cursor.Current)
{
cancellationToken.ThrowIfCancellationRequested();
if (!TryExtractObservationKey(observationDocument, out var key))
{
continue;
}
var rawFilter = Builders<BsonDocument>.Filter.Eq("tenant", key.Tenant) &
Builders<BsonDocument>.Filter.Eq("source.vendor", key.Vendor) &
Builders<BsonDocument>.Filter.Eq("upstream.upstream_id", key.UpstreamId) &
Builders<BsonDocument>.Filter.Eq("upstream.content_hash", key.ContentHash);
var rawDocument = await rawCollection
.Find(rawFilter)
.Sort(Builders<BsonDocument>.Sort.Descending("ingested_at").Descending("_id"))
.Limit(1)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
if (rawDocument is null)
{
missingRawDocuments.Add(key.ToString());
continue;
}
var advisoryRaw = MapToRawDocument(rawDocument);
var rawLinkset = BuildRawLinkset(advisoryRaw.Identifiers, advisoryRaw.Linkset);
var rawLinksetDocument = BuildRawLinksetBson(rawLinkset);
var update = Builders<BsonDocument>.Update.Set("rawLinkset", rawLinksetDocument);
updates.Add(new UpdateOneModel<BsonDocument>(
Builders<BsonDocument>.Filter.Eq("_id", observationDocument["_id"].AsString),
update));
if (updates.Count >= BulkBatchSize)
{
await observations.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
updates.Clear();
}
}
}
if (updates.Count > 0)
{
await observations.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
}
if (missingRawDocuments.Count > 0)
{
throw new InvalidOperationException(
$"Unable to locate advisory_raw documents for {missingRawDocuments.Count} observations: {string.Join(", ", missingRawDocuments.Take(10))}");
}
}
private static bool TryExtractObservationKey(BsonDocument observation, out ObservationKey key)
{
key = default;
if (!observation.TryGetValue("tenant", out var tenantValue) || tenantValue.IsBsonNull)
{
return false;
}
if (!observation.TryGetValue("source", out var sourceValue) || sourceValue is not BsonDocument sourceDocument)
{
return false;
}
if (!observation.TryGetValue("upstream", out var upstreamValue) || upstreamValue is not BsonDocument upstreamDocument)
{
return false;
}
var tenant = tenantValue.AsString;
var vendor = sourceDocument.GetValue("vendor", BsonString.Empty).AsString;
var upstreamId = upstreamDocument.GetValue("upstream_id", BsonString.Empty).AsString;
var contentHash = upstreamDocument.GetValue("contentHash", BsonString.Empty).AsString;
var createdAt = observation.GetValue("createdAt", BsonNull.Value);
key = new ObservationKey(
tenant,
vendor,
upstreamId,
contentHash,
BsonValueToDateTimeOffset(createdAt) ?? DateTimeOffset.UtcNow);
return !string.IsNullOrWhiteSpace(tenant) &&
!string.IsNullOrWhiteSpace(vendor) &&
!string.IsNullOrWhiteSpace(upstreamId) &&
!string.IsNullOrWhiteSpace(contentHash);
}
private static AdvisoryRawDocument MapToRawDocument(BsonDocument document)
{
var tenant = GetRequiredString(document, "tenant");
var source = MapSource(document["source"].AsBsonDocument);
var upstream = MapUpstream(document["upstream"].AsBsonDocument);
var content = MapContent(document["content"].AsBsonDocument);
var identifiers = MapIdentifiers(document["identifiers"].AsBsonDocument);
var linkset = MapLinkset(document["linkset"].AsBsonDocument);
var supersedes = document.GetValue("supersedes", BsonNull.Value);
return new AdvisoryRawDocument(
tenant,
source,
upstream,
content,
identifiers,
linkset,
Supersedes: supersedes.IsBsonNull ? null : supersedes.AsString);
}
private static RawSourceMetadata MapSource(BsonDocument source)
{
return new RawSourceMetadata(
GetRequiredString(source, "vendor"),
GetRequiredString(source, "connector"),
GetRequiredString(source, "version"),
GetOptionalString(source, "stream"));
}
private static RawUpstreamMetadata MapUpstream(BsonDocument upstream)
{
var provenanceBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (upstream.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument)
{
foreach (var element in provenanceValue.AsBsonDocument)
{
provenanceBuilder[element.Name] = BsonValueToString(element.Value);
}
}
var signatureDocument = upstream["signature"].AsBsonDocument;
var signature = new RawSignatureMetadata(
signatureDocument.GetValue("present", BsonBoolean.False).AsBoolean,
signatureDocument.TryGetValue("format", out var format) && !format.IsBsonNull ? format.AsString : null,
signatureDocument.TryGetValue("key_id", out var keyId) && !keyId.IsBsonNull ? keyId.AsString : null,
signatureDocument.TryGetValue("sig", out var sig) && !sig.IsBsonNull ? sig.AsString : null,
signatureDocument.TryGetValue("certificate", out var certificate) && !certificate.IsBsonNull ? certificate.AsString : null,
signatureDocument.TryGetValue("digest", out var digest) && !digest.IsBsonNull ? digest.AsString : null);
return new RawUpstreamMetadata(
GetRequiredString(upstream, "upstream_id"),
upstream.TryGetValue("document_version", out var version) && !version.IsBsonNull ? version.AsString : null,
GetDateTimeOffset(upstream, "retrieved_at", DateTimeOffset.UtcNow),
GetRequiredString(upstream, "content_hash"),
signature,
provenanceBuilder.ToImmutable());
}
private static RawContent MapContent(BsonDocument content)
{
var rawValue = content.GetValue("raw", BsonNull.Value);
string rawJson;
if (rawValue.IsBsonNull)
{
rawJson = "{}";
}
else if (rawValue.IsString)
{
rawJson = rawValue.AsString ?? "{}";
}
else
{
rawJson = rawValue.ToJson(new JsonWriterSettings { OutputMode = JsonOutputMode.RelaxedExtendedJson });
}
using var document = JsonDocument.Parse(string.IsNullOrWhiteSpace(rawJson) ? "{}" : rawJson);
return new RawContent(
GetRequiredString(content, "format"),
content.TryGetValue("spec_version", out var specVersion) && !specVersion.IsBsonNull ? specVersion.AsString : null,
document.RootElement.Clone(),
content.TryGetValue("encoding", out var encoding) && !encoding.IsBsonNull ? encoding.AsString : null);
}
private static RawIdentifiers MapIdentifiers(BsonDocument identifiers)
{
var aliases = identifiers.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
return new RawIdentifiers(
aliases,
GetRequiredString(identifiers, "primary"));
}
private static RawLinkset MapLinkset(BsonDocument linkset)
{
var aliases = linkset.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var purls = linkset.TryGetValue("purls", out var purlsValue) && purlsValue.IsBsonArray
? purlsValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var cpes = linkset.TryGetValue("cpes", out var cpesValue) && cpesValue.IsBsonArray
? cpesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var references = linkset.TryGetValue("references", out var referencesValue) && referencesValue.IsBsonArray
? referencesValue.AsBsonArray
.Where(static value => value.IsBsonDocument)
.Select(value =>
{
var doc = value.AsBsonDocument;
return new RawReference(
GetRequiredString(doc, "type"),
GetRequiredString(doc, "url"),
doc.TryGetValue("source", out var source) && !source.IsBsonNull ? source.AsString : null);
})
.ToImmutableArray()
: ImmutableArray<RawReference>.Empty;
var reconciledFrom = linkset.TryGetValue("reconciled_from", out var reconciledValue) && reconciledValue.IsBsonArray
? reconciledValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var notesBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (linkset.TryGetValue("notes", out var notesValue) && notesValue.IsBsonDocument)
{
foreach (var element in notesValue.AsBsonDocument)
{
notesBuilder[element.Name] = BsonValueToString(element.Value);
}
}
return new RawLinkset
{
Aliases = aliases,
PackageUrls = purls,
Cpes = cpes,
References = references,
ReconciledFrom = reconciledFrom,
Notes = notesBuilder.ToImmutable()
};
}
private static RawLinkset BuildRawLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliasBuilder = ImmutableArray.CreateBuilder<string>();
if (!string.IsNullOrWhiteSpace(identifiers.PrimaryId))
{
aliasBuilder.Add(identifiers.PrimaryId);
}
if (!identifiers.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in identifiers.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
if (!linkset.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in linkset.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
static ImmutableArray<string> EnsureArray(ImmutableArray<string> values)
=> values.IsDefault ? ImmutableArray<string>.Empty : values;
static ImmutableArray<RawReference> EnsureReferences(ImmutableArray<RawReference> values)
=> values.IsDefault ? ImmutableArray<RawReference>.Empty : values;
return linkset with
{
Aliases = aliasBuilder.ToImmutable(),
PackageUrls = EnsureArray(linkset.PackageUrls),
Cpes = EnsureArray(linkset.Cpes),
References = EnsureReferences(linkset.References),
ReconciledFrom = EnsureArray(linkset.ReconciledFrom),
Notes = linkset.Notes ?? ImmutableDictionary<string, string>.Empty
};
}
private static BsonDocument BuildRawLinksetBson(RawLinkset rawLinkset)
{
var references = new BsonArray(rawLinkset.References.Select(reference =>
{
var referenceDocument = new BsonDocument
{
{ "type", reference.Type },
{ "url", reference.Url }
};
if (!string.IsNullOrWhiteSpace(reference.Source))
{
referenceDocument["source"] = reference.Source;
}
return referenceDocument;
}));
var notes = new BsonDocument();
if (rawLinkset.Notes is not null)
{
foreach (var entry in rawLinkset.Notes)
{
notes[entry.Key] = entry.Value;
}
}
return new BsonDocument
{
{ "aliases", new BsonArray(rawLinkset.Aliases) },
{ "purls", new BsonArray(rawLinkset.PackageUrls) },
{ "cpes", new BsonArray(rawLinkset.Cpes) },
{ "references", references },
{ "reconciled_from", new BsonArray(rawLinkset.ReconciledFrom) },
{ "notes", notes }
};
}
private static string GetRequiredString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return string.Empty;
}
return value.IsString ? value.AsString : value.ToString() ?? string.Empty;
}
private static string? GetOptionalString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return null;
}
return value.IsString ? value.AsString : value.ToString();
}
private static string BsonValueToString(BsonValue value)
{
if (value.IsString)
{
return value.AsString ?? string.Empty;
}
if (value.IsBsonNull)
{
return string.Empty;
}
return value.ToString() ?? string.Empty;
}
private static DateTimeOffset GetDateTimeOffset(BsonDocument document, string field, DateTimeOffset fallback)
{
if (!document.TryGetValue(field, out var value) || value.IsBsonNull)
{
return fallback;
}
return BsonValueToDateTimeOffset(value) ?? fallback;
}
private static DateTimeOffset? BsonValueToDateTimeOffset(BsonValue value)
{
return value.BsonType switch
{
BsonType.DateTime => new DateTimeOffset(DateTime.SpecifyKind(value.ToUniversalTime(), DateTimeKind.Utc)),
BsonType.String when DateTimeOffset.TryParse(value.AsString, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed)
=> parsed.ToUniversalTime(),
BsonType.Int64 => DateTimeOffset.FromUnixTimeMilliseconds(value.AsInt64).ToUniversalTime(),
_ => null
};
}
private readonly record struct ObservationKey(
string Tenant,
string Vendor,
string UpstreamId,
string ContentHash,
DateTimeOffset CreatedAt)
{
public override string ToString()
=> $"{Tenant}:{Vendor}:{UpstreamId}:{ContentHash}";
}
}
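`BuildRawLinkset` assembles the alias list in a fixed order: the primary identifier first, then the identifier aliases, then any aliases already present on the linkset, dropping blank entries but keeping duplicates. A minimal Python sketch of that merge order (names taken from the C# records above; this is an illustration, not the shipped implementation):

```python
def merge_aliases(primary, identifier_aliases, linkset_aliases):
    """Combine aliases in the same order as BuildRawLinkset:
    primary id first, then identifier aliases, then linkset aliases.
    Blank entries are skipped; duplicates are kept, as in the original."""
    merged = []
    if primary and primary.strip():
        merged.append(primary)
    for alias in identifier_aliases:
        if alias:
            merged.append(alias)
    for alias in linkset_aliases:
        if alias:
            merged.append(alias)
    return merged
```

Keeping duplicates here is deliberate: the C# builder does not de-duplicate either, so a primary id repeated in the linkset aliases appears twice in the projection.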


@@ -1,156 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisoryRawIdempotencyIndexMigration : IMongoMigration
{
private const string IndexName = "advisory_raw_idempotency";
public string Id => "20251028_advisory_raw_idempotency_index";
public string Description => "Ensure advisory_raw collection enforces idempotency via unique compound index.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
await EnsureNoDuplicatesAsync(collection, cancellationToken).ConfigureAwait(false);
var existingIndex = await FindExistingIndexAsync(collection, cancellationToken).ConfigureAwait(false);
if (existingIndex is not null &&
existingIndex.TryGetValue("unique", out var uniqueValue) &&
uniqueValue.ToBoolean() &&
existingIndex.TryGetValue("key", out var keyValue) &&
keyValue is BsonDocument keyDocument &&
KeysMatch(keyDocument))
{
return;
}
if (existingIndex is not null)
{
try
{
await collection.Indexes.DropOneAsync(IndexName, cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.Code == 27)
{
// Index not found; safe to ignore.
}
}
var keys = Builders<BsonDocument>.IndexKeys
.Ascending("source.vendor")
.Ascending("upstream.upstream_id")
.Ascending("upstream.content_hash")
.Ascending("tenant");
var options = new CreateIndexOptions
{
Name = IndexName,
Unique = true,
};
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
private static async Task<BsonDocument?> FindExistingIndexAsync(
IMongoCollection<BsonDocument> collection,
CancellationToken cancellationToken)
{
using var cursor = await collection.Indexes.ListAsync(cancellationToken).ConfigureAwait(false);
var indexes = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
return indexes.FirstOrDefault(doc => doc.TryGetValue("name", out var name) && string.Equals(name.AsString, IndexName, StringComparison.Ordinal));
}
private static async Task EnsureNoDuplicatesAsync(IMongoCollection<BsonDocument> collection, CancellationToken cancellationToken)
{
var pipeline = new[]
{
new BsonDocument("$group", new BsonDocument
{
{
"_id",
new BsonDocument
{
{ "vendor", "$source.vendor" },
{ "upstreamId", "$upstream.upstream_id" },
{ "contentHash", "$upstream.content_hash" },
{ "tenant", "$tenant" },
}
},
{ "count", new BsonDocument("$sum", 1) },
{ "ids", new BsonDocument("$push", "$_id") },
}),
new BsonDocument("$match", new BsonDocument("count", new BsonDocument("$gt", 1))),
new BsonDocument("$limit", 1),
};
var pipelineDefinition = PipelineDefinition<BsonDocument, BsonDocument>.Create(pipeline);
var duplicate = await collection.Aggregate(pipelineDefinition, cancellationToken: cancellationToken)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
if (duplicate is null)
{
return;
}
var keyDocument = duplicate["_id"].AsBsonDocument;
var vendor = keyDocument.GetValue("vendor", BsonNull.Value)?.ToString() ?? "<null>";
var upstreamId = keyDocument.GetValue("upstreamId", BsonNull.Value)?.ToString() ?? "<null>";
var contentHash = keyDocument.GetValue("contentHash", BsonNull.Value)?.ToString() ?? "<null>";
var tenant = keyDocument.GetValue("tenant", BsonNull.Value)?.ToString() ?? "<null>";
BsonArray idArray = duplicate.TryGetValue("ids", out var idsValue) && idsValue is BsonArray array
? array
: new BsonArray();
var ids = new string[idArray.Count];
for (var i = 0; i < idArray.Count; i++)
{
ids[i] = idArray[i]?.ToString() ?? "<null>";
}
throw new InvalidOperationException(
$"Cannot create advisory_raw idempotency index because duplicate documents exist for vendor '{vendor}', upstream '{upstreamId}', content hash '{contentHash}', tenant '{tenant}'. Conflicting document ids: {string.Join(", ", ids)}.");
}
private static bool KeysMatch(BsonDocument keyDocument)
{
if (keyDocument.ElementCount != 4)
{
return false;
}
var expected = new[]
{
("source.vendor", 1),
("upstream.upstream_id", 1),
("upstream.content_hash", 1),
("tenant", 1),
};
var index = 0;
foreach (var element in keyDocument.Elements)
{
if (!string.Equals(element.Name, expected[index].Item1, StringComparison.Ordinal))
{
return false;
}
if (!element.Value.IsInt32 || element.Value.AsInt32 != expected[index].Item2)
{
return false;
}
index++;
}
return true;
}
}
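Before creating the unique index, the migration aggregates on the four-field compound key and fails fast if any key occurs more than once. The same duplicate check can be sketched in Python over in-memory documents (the compound key mirrors `source.vendor` / `upstream.upstream_id` / `upstream.content_hash` / `tenant` from the `$group` stage above):

```python
from collections import defaultdict

def find_idempotency_duplicates(docs):
    """Group documents by the idempotency compound key and return
    a map of key -> ids for any key with more than one member,
    mirroring the $group/$match pipeline in the C# migration."""
    groups = defaultdict(list)
    for doc in docs:
        key = (
            doc["source"]["vendor"],
            doc["upstream"]["upstream_id"],
            doc["upstream"]["content_hash"],
            doc["tenant"],
        )
        groups[key].append(doc["_id"])
    return {key: ids for key, ids in groups.items() if len(ids) > 1}
```

The C# pipeline adds a `$limit: 1` so the aggregation stops at the first conflict; this sketch returns all conflicts, which is handy when cleaning up data before re-running the migration.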


@@ -1,372 +0,0 @@
using System.Collections.Generic;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureAdvisoryRawValidatorMigration : IMongoMigration
{
private const string ForbiddenExactPattern = "^(?i)(severity|cvss|cvss_vector|merged_from|consensus_provider|reachability|asset_criticality|risk_score)$";
private const string ForbiddenEffectivePattern = "^(?i)effective_";
private static readonly IReadOnlyList<object> AllBsonTypeNames = new object[]
{
"double",
"string",
"object",
"array",
"binData",
"undefined",
"objectId",
"bool",
"date",
"null",
"regex",
"dbPointer",
"javascript",
"symbol",
"javascriptWithScope",
"int",
"timestamp",
"long",
"decimal",
"minKey",
"maxKey",
};
private readonly MongoStorageOptions _options;
public EnsureAdvisoryRawValidatorMigration(IOptions<MongoStorageOptions> options)
{
ArgumentNullException.ThrowIfNull(options);
_options = options.Value ?? throw new ArgumentNullException(nameof(options.Value));
}
public string Id => "20251028_advisory_raw_validator";
public string Description => "Ensure advisory_raw collection enforces Aggregation-Only Contract schema";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var validatorOptions = _options.AdvisoryRawValidator ?? new MongoCollectionValidatorOptions();
var validator = new BsonDocument("$jsonSchema", BuildSchema());
var command = new BsonDocument
{
{ "collMod", MongoStorageDefaults.Collections.AdvisoryRaw },
{ "validator", validator },
{ "validationLevel", GetValidationLevelString(validatorOptions.Level) },
{ "validationAction", GetValidationActionString(validatorOptions.Action) },
};
try
{
await database.RunCommandAsync<BsonDocument>(command, cancellationToken: cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.Code == 26)
{
var createOptions = new CreateCollectionOptions<BsonDocument>
{
Validator = validator,
ValidationLevel = MapValidationLevel(validatorOptions.Level),
ValidationAction = MapValidationAction(validatorOptions.Action),
};
await database.CreateCollectionAsync(
MongoStorageDefaults.Collections.AdvisoryRaw,
createOptions,
cancellationToken).ConfigureAwait(false);
}
}
private static BsonDocument BuildSchema()
{
var schema = new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "tenant", "source", "upstream", "content", "linkset" }) },
{ "properties", BuildTopLevelProperties() },
{ "patternProperties", BuildForbiddenPatterns() },
};
return schema;
}
private static BsonDocument BuildTopLevelProperties()
{
var properties = new BsonDocument
{
{ "_id", new BsonDocument("bsonType", "string") },
{
"tenant",
new BsonDocument
{
{ "bsonType", "string" },
{ "minLength", 1 },
}
},
{ "source", BuildSourceSchema() },
{ "upstream", BuildUpstreamSchema() },
{ "content", BuildContentSchema() },
{ "identifiers", BuildIdentifiersSchema() },
{ "linkset", BuildLinksetSchema() },
{ "supersedes", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{
"created_at",
new BsonDocument
{
{ "bsonType", new BsonArray(new object[] { "date", "string", "null" }) },
}
},
{
"ingested_at",
new BsonDocument
{
{ "bsonType", new BsonArray(new object[] { "date", "string", "null" }) },
}
},
};
return properties;
}
private static BsonDocument BuildSourceSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "vendor", "connector", "version" }) },
{
"properties",
new BsonDocument
{
{ "vendor", new BsonDocument("bsonType", "string") },
{ "connector", new BsonDocument("bsonType", "string") },
{ "version", new BsonDocument("bsonType", "string") },
{ "stream", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildUpstreamSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "upstream_id", "retrieved_at", "content_hash", "signature", "provenance" }) },
{
"properties",
new BsonDocument
{
{ "upstream_id", new BsonDocument("bsonType", "string") },
{ "document_version", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "retrieved_at", new BsonDocument("bsonType", new BsonArray(new object[] { "date", "string" })) },
{ "content_hash", new BsonDocument("bsonType", "string") },
{ "signature", BuildSignatureSchema() },
{ "provenance", BuildProvenanceSchema() },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildSignatureSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "present" }) },
{
"properties",
new BsonDocument
{
{ "present", new BsonDocument("bsonType", "bool") },
{ "format", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "key_id", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "sig", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "certificate", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "digest", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildProvenanceSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "additionalProperties", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
};
}
private static BsonDocument BuildContentSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "format", "raw" }) },
{
"properties",
new BsonDocument
{
{ "format", new BsonDocument("bsonType", "string") },
{ "spec_version", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{
"raw",
new BsonDocument
{
{ "bsonType", new BsonArray(new object[]
{
"object",
"array",
"string",
"bool",
"double",
"int",
"long",
"decimal",
})
},
}
},
{ "encoding", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildIdentifiersSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "aliases", "primary" }) },
{
"properties",
new BsonDocument
{
{ "aliases", CreateStringArraySchema() },
{ "primary", new BsonDocument("bsonType", "string") },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildLinksetSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{
"properties",
new BsonDocument
{
{ "aliases", CreateStringArraySchema() },
{ "purls", CreateStringArraySchema() },
{ "cpes", CreateStringArraySchema() },
{
"references",
new BsonDocument
{
{ "bsonType", "array" },
{
"items",
new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "type", "url" }) },
{
"properties",
new BsonDocument
{
{ "type", new BsonDocument("bsonType", "string") },
{ "url", new BsonDocument("bsonType", "string") },
{ "source", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
}
}
}
},
{ "reconciled_from", CreateStringArraySchema() },
{
"notes",
new BsonDocument
{
{ "bsonType", "object" },
{ "additionalProperties", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument CreateStringArraySchema()
{
return new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument("bsonType", "string") },
};
}
private static BsonDocument BuildForbiddenPatterns()
{
return new BsonDocument
{
{ ForbiddenExactPattern, CreateForbiddenPattern("Derived and normalized fields are forbidden by the Aggregation-Only Contract.") },
{ ForbiddenEffectivePattern, CreateForbiddenPattern("Fields starting with 'effective_' must not be persisted in advisory_raw.") },
};
}
private static BsonDocument CreateForbiddenPattern(string description)
{
return new BsonDocument
{
{ "description", description },
{ "not", new BsonDocument("bsonType", new BsonArray(AllBsonTypeNames)) },
};
}
private static string GetValidationLevelString(MongoValidationLevel level) => level switch
{
MongoValidationLevel.Off => "off",
MongoValidationLevel.Moderate => "moderate",
MongoValidationLevel.Strict => "strict",
_ => "moderate",
};
private static string GetValidationActionString(MongoValidationAction action) => action switch
{
MongoValidationAction.Warn => "warn",
MongoValidationAction.Error => "error",
_ => "warn",
};
private static DocumentValidationLevel MapValidationLevel(MongoValidationLevel level) => level switch
{
MongoValidationLevel.Off => DocumentValidationLevel.Off,
MongoValidationLevel.Moderate => DocumentValidationLevel.Moderate,
MongoValidationLevel.Strict => DocumentValidationLevel.Strict,
_ => DocumentValidationLevel.Moderate,
};
private static DocumentValidationAction MapValidationAction(MongoValidationAction action) => action switch
{
MongoValidationAction.Warn => DocumentValidationAction.Warn,
MongoValidationAction.Error => DocumentValidationAction.Error,
_ => DocumentValidationAction.Warn,
};
}
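The validator rejects derived fields through two `patternProperties` regexes: an exact-match list (`severity`, `cvss`, and so on) and an `effective_` prefix rule. A hypothetical Python pre-flight check applying the same patterns to a candidate document's top-level keys (patterns copied from the constants above; the .NET-style inline `(?i)` flag is rewritten as `re.IGNORECASE`, since Python rejects mid-pattern global flags):

```python
import re

FORBIDDEN_EXACT = re.compile(
    r"^(severity|cvss|cvss_vector|merged_from|consensus_provider|"
    r"reachability|asset_criticality|risk_score)$",
    re.IGNORECASE,
)
FORBIDDEN_PREFIX = re.compile(r"^effective_", re.IGNORECASE)

def forbidden_fields(document):
    """Return the top-level keys that the Aggregation-Only Contract
    validator would flag on an advisory_raw document."""
    return [
        key for key in document
        if FORBIDDEN_EXACT.match(key) or FORBIDDEN_PREFIX.match(key)
    ]
```

A check like this can catch contract violations in connector code before MongoDB's schema validation rejects (or merely warns about, depending on `validationAction`) the write.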


@@ -1,242 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisorySupersedesBackfillMigration : IMongoMigration
{
private const string BackupCollectionName = "advisory_backup_20251028";
private const string SupersedesMigrationId = "20251028_advisory_supersedes_backfill";
public string Id => SupersedesMigrationId;
public string Description => "Backfill advisory_raw supersedes chains and replace legacy advisory collection with read-only view.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
await EnsureLegacyAdvisoryViewAsync(database, cancellationToken).ConfigureAwait(false);
await BackfillSupersedesAsync(database, cancellationToken).ConfigureAwait(false);
}
private static async Task EnsureLegacyAdvisoryViewAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var advisoryInfo = await GetCollectionInfoAsync(database, MongoStorageDefaults.Collections.Advisory, cancellationToken).ConfigureAwait(false);
var backupInfo = await GetCollectionInfoAsync(database, BackupCollectionName, cancellationToken).ConfigureAwait(false);
if (advisoryInfo is not null && !IsView(advisoryInfo))
{
if (backupInfo is null)
{
await RenameCollectionAsync(database, MongoStorageDefaults.Collections.Advisory, BackupCollectionName, cancellationToken).ConfigureAwait(false);
backupInfo = await GetCollectionInfoAsync(database, BackupCollectionName, cancellationToken).ConfigureAwait(false);
}
else
{
await database.DropCollectionAsync(MongoStorageDefaults.Collections.Advisory, cancellationToken).ConfigureAwait(false);
}
advisoryInfo = null;
}
if (backupInfo is null)
{
await database.CreateCollectionAsync(BackupCollectionName, cancellationToken: cancellationToken).ConfigureAwait(false);
}
if (advisoryInfo is null)
{
await CreateViewAsync(database, MongoStorageDefaults.Collections.Advisory, BackupCollectionName, cancellationToken).ConfigureAwait(false);
}
else if (!ViewTargets(advisoryInfo, BackupCollectionName))
{
await database.DropCollectionAsync(MongoStorageDefaults.Collections.Advisory, cancellationToken).ConfigureAwait(false);
await CreateViewAsync(database, MongoStorageDefaults.Collections.Advisory, BackupCollectionName, cancellationToken).ConfigureAwait(false);
}
}
private static async Task BackfillSupersedesAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
var pipeline = new EmptyPipelineDefinition<BsonDocument>()
.Group(new BsonDocument
{
{
"_id",
new BsonDocument
{
{ "tenant", "$tenant" },
{ "vendor", "$source.vendor" },
{ "upstreamId", "$upstream.upstream_id" },
}
},
{
"documents",
new BsonDocument("$push", new BsonDocument
{
{ "_id", "$_id" },
{ "retrievedAt", "$upstream.retrieved_at" },
{ "contentHash", "$upstream.content_hash" },
{ "supersedes", "$supersedes" },
})
}
})
.Match(new BsonDocument("documents.1", new BsonDocument("$exists", true)));
using var cursor = await collection.AggregateAsync(pipeline, cancellationToken: cancellationToken).ConfigureAwait(false);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var group in cursor.Current)
{
if (!group.TryGetValue("documents", out var documentsValue) || documentsValue is not BsonArray documentsArray || documentsArray.Count == 0)
{
continue;
}
var ordered = documentsArray
.Select(x => x.AsBsonDocument)
.Select(x => new AdvisoryRawRecord(
x.GetValue("_id").AsString,
GetDateTime(x, "retrievedAt"),
x.TryGetValue("supersedes", out var sup) ? sup : BsonNull.Value))
.OrderBy(record => record.RetrievedAt)
.ThenBy(record => record.Id, StringComparer.Ordinal)
.ToArray();
if (ordered.Length <= 1)
{
continue;
}
var updates = new List<WriteModel<BsonDocument>>(ordered.Length);
for (var index = 0; index < ordered.Length; index++)
{
var expectedSupersedes = index == 0 ? BsonNull.Value : BsonValue.Create(ordered[index - 1].Id);
var current = ordered[index];
if (AreSupersedesEqual(current.Supersedes, expectedSupersedes))
{
continue;
}
var filter = Builders<BsonDocument>.Filter.Eq("_id", current.Id);
var update = Builders<BsonDocument>.Update.Set("supersedes", expectedSupersedes);
updates.Add(new UpdateOneModel<BsonDocument>(filter, update));
}
if (updates.Count > 0)
{
await collection.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
}
private static async Task<BsonDocument?> GetCollectionInfoAsync(IMongoDatabase database, string name, CancellationToken cancellationToken)
{
var command = new BsonDocument
{
{ "listCollections", 1 },
{ "filter", new BsonDocument("name", name) },
};
var result = await database.RunCommandAsync<BsonDocument>(command, cancellationToken: cancellationToken).ConfigureAwait(false);
var batch = result["cursor"]["firstBatch"].AsBsonArray;
return batch.Count > 0 ? batch[0].AsBsonDocument : null;
}
private static bool IsView(BsonDocument collectionInfo)
=> string.Equals(collectionInfo.GetValue("type", BsonString.Empty).AsString, "view", StringComparison.OrdinalIgnoreCase);
private static bool ViewTargets(BsonDocument collectionInfo, string target)
{
if (!IsView(collectionInfo))
{
return false;
}
if (!collectionInfo.TryGetValue("options", out var options) || options is not BsonDocument optionsDocument)
{
return false;
}
return optionsDocument.TryGetValue("viewOn", out var viewOn) && string.Equals(viewOn.AsString, target, StringComparison.Ordinal);
}
private static async Task RenameCollectionAsync(IMongoDatabase database, string source, string destination, CancellationToken cancellationToken)
{
var admin = database.Client.GetDatabase("admin");
var renameCommand = new BsonDocument
{
{ "renameCollection", $"{database.DatabaseNamespace.DatabaseName}.{source}" },
{ "to", $"{database.DatabaseNamespace.DatabaseName}.{destination}" },
{ "dropTarget", false },
};
try
{
await admin.RunCommandAsync<BsonDocument>(renameCommand, cancellationToken: cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.Code == 26)
{
// Source namespace not found; ignore.
}
catch (MongoCommandException ex) when (ex.Code == 48)
{
// Target namespace exists; drop the stale destination and retry the rename.
await database.DropCollectionAsync(destination, cancellationToken).ConfigureAwait(false);
await admin.RunCommandAsync<BsonDocument>(renameCommand, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
private static async Task CreateViewAsync(IMongoDatabase database, string viewName, string sourceName, CancellationToken cancellationToken)
{
var createCommand = new BsonDocument
{
{ "create", viewName },
{ "viewOn", sourceName },
};
await database.RunCommandAsync<BsonDocument>(createCommand, cancellationToken: cancellationToken).ConfigureAwait(false);
}
private static DateTime GetDateTime(BsonDocument document, string fieldName)
{
if (!document.TryGetValue(fieldName, out var value))
{
return DateTime.MinValue;
}
return value.BsonType switch
{
BsonType.DateTime => value.ToUniversalTime(),
BsonType.String when DateTime.TryParse(value.AsString, System.Globalization.CultureInfo.InvariantCulture, System.Globalization.DateTimeStyles.AssumeUniversal, out var parsed) => parsed.ToUniversalTime(),
_ => DateTime.MinValue,
};
}
private static bool AreSupersedesEqual(BsonValue? left, BsonValue? right)
{
if (left is null || left.IsBsonNull)
{
return right is null || right.IsBsonNull;
}
if (right is null || right.IsBsonNull)
{
// left is known non-null and non-BsonNull at this point, so the values differ.
return false;
}
return left.Equals(right);
}
private sealed record AdvisoryRawRecord(string Id, DateTime RetrievedAt, BsonValue Supersedes);
}
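As an illustrative sketch of the invariant `BackfillSupersedesAsync` enforces (Python here purely for illustration; the function name is ours, not part of the codebase): within each (tenant, vendor, upstreamId) group, documents ordered by (retrievedAt, _id) form a chain where each entry supersedes the previous one and the first entry supersedes nothing.

```python
def expected_supersedes(docs):
    """docs: list of dicts with '_id' and a comparable 'retrievedAt'.

    Returns {_id: id_of_superseded_doc_or_None}, mirroring the bulk
    update the migration issues when a stored 'supersedes' value
    disagrees with this expectation.
    """
    # Deterministic order: retrievedAt first, then _id as the tiebreaker,
    # matching OrderBy(RetrievedAt).ThenBy(Id, StringComparer.Ordinal).
    ordered = sorted(docs, key=lambda d: (d["retrievedAt"], d["_id"]))
    chain = {}
    for index, doc in enumerate(ordered):
        chain[doc["_id"]] = None if index == 0 else ordered[index - 1]["_id"]
    return chain
```

Groups with a single document are skipped by the migration, since a one-element chain has nothing to repair.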


@@ -1,146 +0,0 @@
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureDocumentExpiryIndexesMigration : IMongoMigration
{
private readonly MongoStorageOptions _options;
public EnsureDocumentExpiryIndexesMigration(IOptions<MongoStorageOptions> options)
{
ArgumentNullException.ThrowIfNull(options);
_options = options.Value;
}
public string Id => "20241005_document_expiry_indexes";
public string Description => "Ensure document.expiresAt index matches configured retention";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var needsTtl = _options.RawDocumentRetention > TimeSpan.Zero;
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Document);
using var cursor = await collection.Indexes.ListAsync(cancellationToken).ConfigureAwait(false);
var indexes = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
var ttlIndex = indexes.FirstOrDefault(x => TryGetName(x, out var name) && string.Equals(name, "document_expiresAt_ttl", StringComparison.Ordinal));
var nonTtlIndex = indexes.FirstOrDefault(x => TryGetName(x, out var name) && string.Equals(name, "document_expiresAt", StringComparison.Ordinal));
if (needsTtl)
{
var shouldRebuild = ttlIndex is null || !IndexMatchesTtlExpectations(ttlIndex);
if (shouldRebuild)
{
if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
if (nonTtlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt", cancellationToken).ConfigureAwait(false);
}
var options = new CreateIndexOptions<BsonDocument>
{
Name = "document_expiresAt_ttl",
ExpireAfter = TimeSpan.Zero,
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("expiresAt", true),
};
var keys = Builders<BsonDocument>.IndexKeys.Ascending("expiresAt");
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
else if (nonTtlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt", cancellationToken).ConfigureAwait(false);
}
}
else
{
if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
var shouldRebuild = nonTtlIndex is null || !IndexMatchesNonTtlExpectations(nonTtlIndex);
if (shouldRebuild)
{
if (nonTtlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt", cancellationToken).ConfigureAwait(false);
}
var options = new CreateIndexOptions<BsonDocument>
{
Name = "document_expiresAt",
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("expiresAt", true),
};
var keys = Builders<BsonDocument>.IndexKeys.Ascending("expiresAt");
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
private static bool IndexMatchesTtlExpectations(BsonDocument index)
{
if (!index.TryGetValue("expireAfterSeconds", out var expireAfter) || expireAfter.ToDouble() != 0)
{
return false;
}
if (!index.TryGetValue("partialFilterExpression", out var partialFilter) || partialFilter is not BsonDocument partialDoc)
{
return false;
}
if (!partialDoc.TryGetValue("expiresAt", out var expiresAtRule) || expiresAtRule is not BsonDocument expiresAtDoc)
{
return false;
}
return expiresAtDoc.Contains("$exists") && expiresAtDoc["$exists"].ToBoolean();
}
private static bool IndexMatchesNonTtlExpectations(BsonDocument index)
{
if (index.Contains("expireAfterSeconds"))
{
return false;
}
if (!index.TryGetValue("partialFilterExpression", out var partialFilter) || partialFilter is not BsonDocument partialDoc)
{
return false;
}
if (!partialDoc.TryGetValue("expiresAt", out var expiresAtRule) || expiresAtRule is not BsonDocument expiresAtDoc)
{
return false;
}
return expiresAtDoc.Contains("$exists") && expiresAtDoc["$exists"].ToBoolean();
}
private static bool TryGetName(BsonDocument index, out string name)
{
if (index.TryGetValue("name", out var value) && value.IsString)
{
name = value.AsString;
return true;
}
name = string.Empty;
return false;
}
}
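The decision logic above reduces to a single target index shape per retention setting. A minimal sketch (Python for illustration only; the function name is ours) of the state `EnsureDocumentExpiryIndexesMigration` converges to:

```python
def desired_expiry_index(retention_seconds):
    """Target index on document.expiresAt for a given raw-document retention.

    retention > 0: a TTL index (expireAfterSeconds=0, so expiry happens at
    the stored expiresAt timestamp) restricted to documents that have the
    field. Otherwise: a plain partial index with no TTL behavior.
    """
    partial = {"expiresAt": {"$exists": True}}
    if retention_seconds > 0:
        return {
            "name": "document_expiresAt_ttl",
            "expireAfterSeconds": 0,
            "partialFilterExpression": partial,
        }
    return {
        "name": "document_expiresAt",
        "partialFilterExpression": partial,
    }
```

The migration drops whichever of the two named indexes does not match this target before recreating the correct one, so repeated runs are idempotent.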


@@ -1,95 +0,0 @@
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureGridFsExpiryIndexesMigration : IMongoMigration
{
private readonly MongoStorageOptions _options;
public EnsureGridFsExpiryIndexesMigration(IOptions<MongoStorageOptions> options)
{
ArgumentNullException.ThrowIfNull(options);
_options = options.Value;
}
public string Id => "20241005_gridfs_expiry_indexes";
public string Description => "Ensure GridFS metadata.expiresAt TTL index reflects retention settings";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var needsTtl = _options.RawDocumentRetention > TimeSpan.Zero;
var collection = database.GetCollection<BsonDocument>("documents.files");
using var cursor = await collection.Indexes.ListAsync(cancellationToken).ConfigureAwait(false);
var indexes = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
var ttlIndex = indexes.FirstOrDefault(x => TryGetName(x, out var name) && string.Equals(name, "gridfs_files_expiresAt_ttl", StringComparison.Ordinal));
if (needsTtl)
{
var shouldRebuild = ttlIndex is null || !IndexMatchesTtlExpectations(ttlIndex);
if (shouldRebuild)
{
if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("gridfs_files_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
var keys = Builders<BsonDocument>.IndexKeys.Ascending("metadata.expiresAt");
var options = new CreateIndexOptions<BsonDocument>
{
Name = "gridfs_files_expiresAt_ttl",
ExpireAfter = TimeSpan.Zero,
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("metadata.expiresAt", true),
};
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
else if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("gridfs_files_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
}
private static bool IndexMatchesTtlExpectations(BsonDocument index)
{
if (!index.TryGetValue("expireAfterSeconds", out var expireAfter) || expireAfter.ToDouble() != 0)
{
return false;
}
if (!index.TryGetValue("partialFilterExpression", out var partialFilter) || partialFilter is not BsonDocument partialDoc)
{
return false;
}
if (!partialDoc.TryGetValue("metadata.expiresAt", out var expiresAtRule) || expiresAtRule is not BsonDocument expiresAtDoc)
{
return false;
}
return expiresAtDoc.Contains("$exists") && expiresAtDoc["$exists"].ToBoolean();
}
private static bool TryGetName(BsonDocument index, out string name)
{
if (index.TryGetValue("name", out var value) && value.IsString)
{
name = value.AsString;
return true;
}
name = string.Empty;
return false;
}
}


@@ -1,548 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Nodes;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using MongoDB.Driver;
using StellaOps.Concelier.Storage.Mongo.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Backfills advisory_observations and advisory_linksets from existing advisory_raw documents.
/// Per LNM-21-102-DEV: Creates immutable observations from raw documents and groups them into linksets.
/// Also seeds tombstones for rollback tracking (backfill_marker field) to support Offline Kit rollback.
/// </summary>
internal sealed class EnsureLegacyAdvisoriesBackfillMigration : IMongoMigration
{
private const int BulkBatchSize = 250;
private const string BackfillMarkerField = "backfill_marker";
private const string BackfillMarkerValue = "lnm_21_102_dev";
private static readonly JsonWriterSettings JsonSettings = new() { OutputMode = JsonOutputMode.RelaxedExtendedJson };
private readonly MongoStorageOptions _options;
private readonly ILogger<EnsureLegacyAdvisoriesBackfillMigration> _logger;
public EnsureLegacyAdvisoriesBackfillMigration(
IOptions<MongoStorageOptions> options,
ILogger<EnsureLegacyAdvisoriesBackfillMigration> logger)
{
ArgumentNullException.ThrowIfNull(options);
ArgumentNullException.ThrowIfNull(logger);
_options = options.Value;
_logger = logger;
}
public string Id => "20251127_lnm_legacy_backfill";
public string Description => "Backfill advisory_observations and advisory_linksets from advisory_raw; seed tombstones for rollback (LNM-21-102-DEV)";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var rawCollection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
var observationsCollection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var linksetsCollection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryLinksets);
_logger.LogInformation("Starting legacy advisory backfill migration {MigrationId}", Id);
var backfilledObservations = await BackfillObservationsAsync(
rawCollection,
observationsCollection,
cancellationToken).ConfigureAwait(false);
_logger.LogInformation(
"Backfilled {Count} observations from advisory_raw",
backfilledObservations.Count);
if (backfilledObservations.Count > 0)
{
var linksetsCreated = await CreateLinksetsFromObservationsAsync(
observationsCollection,
linksetsCollection,
backfilledObservations,
cancellationToken).ConfigureAwait(false);
_logger.LogInformation(
"Created/updated {Count} linksets from backfilled observations",
linksetsCreated);
}
await SeedTombstonesAsync(rawCollection, cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Completed legacy advisory backfill migration {MigrationId}", Id);
}
private async Task<IReadOnlyList<string>> BackfillObservationsAsync(
IMongoCollection<BsonDocument> rawCollection,
IMongoCollection<BsonDocument> observationsCollection,
CancellationToken ct)
{
var backfilledIds = new List<string>();
var batchSize = Math.Max(25, _options.BackfillBatchSize);
string? lastId = null;
while (true)
{
var filter = Builders<BsonDocument>.Filter.Empty;
if (!string.IsNullOrEmpty(lastId))
{
filter = Builders<BsonDocument>.Filter.Gt("_id", lastId);
}
var rawDocs = await rawCollection
.Find(filter)
.Sort(Builders<BsonDocument>.Sort.Ascending("_id"))
.Limit(batchSize)
.ToListAsync(ct)
.ConfigureAwait(false);
if (rawDocs.Count == 0)
{
break;
}
lastId = rawDocs[^1]["_id"].AsString;
var candidateObservationIds = rawDocs
.Select(d => BuildObservationIdFromRaw(d))
.Where(id => !string.IsNullOrEmpty(id))
.ToArray();
var existingFilter = Builders<BsonDocument>.Filter.In("_id", candidateObservationIds);
var existingObservations = await observationsCollection
.Find(existingFilter)
.Project(Builders<BsonDocument>.Projection.Include("_id"))
.ToListAsync(ct)
.ConfigureAwait(false);
var existingIds = existingObservations
.Select(d => d["_id"].AsString)
.ToHashSet(StringComparer.Ordinal);
var newObservations = new List<BsonDocument>();
foreach (var rawDoc in rawDocs)
{
var observationId = BuildObservationIdFromRaw(rawDoc);
if (string.IsNullOrEmpty(observationId) || existingIds.Contains(observationId))
{
continue;
}
var observation = MapRawToObservation(rawDoc, observationId);
if (observation is not null)
{
newObservations.Add(observation);
backfilledIds.Add(observationId);
}
}
if (newObservations.Count > 0)
{
try
{
await observationsCollection.InsertManyAsync(
newObservations,
new InsertManyOptions { IsOrdered = false },
ct).ConfigureAwait(false);
}
catch (MongoBulkWriteException ex) when (ex.WriteErrors.All(e => e.Category == ServerErrorCategory.DuplicateKey))
{
_logger.LogDebug(
"Some observations already exist during backfill batch; continuing with {Inserted} inserted",
newObservations.Count - ex.WriteErrors.Count);
}
}
}
return backfilledIds;
}
private async Task<int> CreateLinksetsFromObservationsAsync(
IMongoCollection<BsonDocument> observationsCollection,
IMongoCollection<BsonDocument> linksetsCollection,
IReadOnlyList<string> observationIds,
CancellationToken ct)
{
var filter = Builders<BsonDocument>.Filter.In("_id", observationIds);
var pipeline = new EmptyPipelineDefinition<BsonDocument>()
.Match(filter)
.Group(new BsonDocument
{
{
"_id",
new BsonDocument
{
{ "tenant", "$tenant" },
// Falls back to the aliases array when advisoryKey is absent;
// ExtractAdvisoryKeyFromGroup unwraps that array into a single key below.
{ "advisoryKey", new BsonDocument("$ifNull", new BsonArray { "$advisoryKey", "$linkset.aliases" }) },
{ "vendor", "$source.vendor" }
}
},
{ "observationIds", new BsonDocument("$push", "$_id") },
{ "latestCreatedAt", new BsonDocument("$max", "$createdAt") },
{
"purls",
new BsonDocument("$push", new BsonDocument("$ifNull", new BsonArray { "$linkset.purls", new BsonArray() }))
},
{
"cpes",
new BsonDocument("$push", new BsonDocument("$ifNull", new BsonArray { "$linkset.cpes", new BsonArray() }))
}
});
using var cursor = await observationsCollection
.AggregateAsync(pipeline, cancellationToken: ct)
.ConfigureAwait(false);
var linksetUpdates = new List<WriteModel<BsonDocument>>();
var createdCount = 0;
while (await cursor.MoveNextAsync(ct).ConfigureAwait(false))
{
foreach (var group in cursor.Current)
{
var groupId = group["_id"].AsBsonDocument;
var tenant = groupId.GetValue("tenant", BsonString.Empty).AsString;
var advisoryKey = ExtractAdvisoryKeyFromGroup(groupId);
var vendor = groupId.GetValue("vendor", BsonString.Empty).AsString;
var observations = group["observationIds"].AsBsonArray.Select(v => v.AsString).ToList();
var latestCreatedAt = group["latestCreatedAt"].ToUniversalTime();
if (string.IsNullOrWhiteSpace(tenant) || string.IsNullOrWhiteSpace(advisoryKey) || observations.Count == 0)
{
continue;
}
var purls = FlattenArrayOfArrays(group["purls"].AsBsonArray);
var cpes = FlattenArrayOfArrays(group["cpes"].AsBsonArray);
var linksetFilter = Builders<BsonDocument>.Filter.And(
Builders<BsonDocument>.Filter.Eq("tenantId", tenant.ToLowerInvariant()),
Builders<BsonDocument>.Filter.Eq("source", vendor),
Builders<BsonDocument>.Filter.Eq("advisoryId", advisoryKey));
var linksetUpdate = new BsonDocument
{
{ "$setOnInsert", new BsonDocument
{
{ "tenantId", tenant.ToLowerInvariant() },
{ "source", vendor },
{ "advisoryId", advisoryKey },
{ "createdAt", latestCreatedAt },
{ BackfillMarkerField, BackfillMarkerValue }
}
},
{ "$addToSet", new BsonDocument
{
{ "observations", new BsonDocument("$each", new BsonArray(observations)) }
}
},
{ "$set", new BsonDocument
{
{ "normalized.purls", new BsonArray(purls.Distinct(StringComparer.Ordinal)) },
{ "normalized.cpes", new BsonArray(cpes.Distinct(StringComparer.Ordinal)) }
}
}
};
linksetUpdates.Add(new UpdateOneModel<BsonDocument>(linksetFilter, linksetUpdate)
{
IsUpsert = true
});
createdCount++;
if (linksetUpdates.Count >= BulkBatchSize)
{
await linksetsCollection.BulkWriteAsync(linksetUpdates, cancellationToken: ct).ConfigureAwait(false);
linksetUpdates.Clear();
}
}
}
if (linksetUpdates.Count > 0)
{
await linksetsCollection.BulkWriteAsync(linksetUpdates, cancellationToken: ct).ConfigureAwait(false);
}
return createdCount;
}
private async Task SeedTombstonesAsync(
IMongoCollection<BsonDocument> rawCollection,
CancellationToken ct)
{
var filter = Builders<BsonDocument>.Filter.Exists(BackfillMarkerField, false);
var update = Builders<BsonDocument>.Update.Set(BackfillMarkerField, BackfillMarkerValue);
var result = await rawCollection
.UpdateManyAsync(filter, update, cancellationToken: ct)
.ConfigureAwait(false);
_logger.LogInformation(
"Seeded tombstone markers on {Count} advisory_raw documents for rollback tracking",
result.ModifiedCount);
}
private static string BuildObservationIdFromRaw(BsonDocument rawDoc)
{
var tenant = rawDoc.GetValue("tenant", BsonString.Empty).AsString;
var sourceDoc = rawDoc.GetValue("source", BsonNull.Value);
var upstreamDoc = rawDoc.GetValue("upstream", BsonNull.Value);
if (sourceDoc.IsBsonNull || upstreamDoc.IsBsonNull)
{
return string.Empty;
}
var vendor = sourceDoc.AsBsonDocument.GetValue("vendor", BsonString.Empty).AsString;
var upstreamId = upstreamDoc.AsBsonDocument.GetValue("upstream_id", BsonString.Empty).AsString;
var contentHash = upstreamDoc.AsBsonDocument.GetValue("content_hash", BsonString.Empty).AsString;
if (string.IsNullOrWhiteSpace(tenant) || string.IsNullOrWhiteSpace(vendor) ||
string.IsNullOrWhiteSpace(upstreamId) || string.IsNullOrWhiteSpace(contentHash))
{
return string.Empty;
}
return $"obs:{tenant}:{vendor}:{SanitizeIdSegment(upstreamId)}:{ShortenHash(contentHash)}";
}
private static BsonDocument? MapRawToObservation(BsonDocument rawDoc, string observationId)
{
try
{
var tenant = rawDoc.GetValue("tenant", BsonString.Empty).AsString;
var sourceDoc = rawDoc["source"].AsBsonDocument;
var upstreamDoc = rawDoc["upstream"].AsBsonDocument;
var contentDoc = rawDoc["content"].AsBsonDocument;
var linksetDoc = rawDoc.GetValue("linkset", new BsonDocument()).AsBsonDocument;
var advisoryKey = rawDoc.GetValue("advisory_key", BsonString.Empty).AsString;
var ingestedAt = GetDateTime(rawDoc, "ingested_at");
var retrievedAt = GetDateTime(upstreamDoc, "retrieved_at");
var observation = new BsonDocument
{
{ "_id", observationId },
{ "tenant", tenant },
{ "advisoryKey", advisoryKey },
{
"source", new BsonDocument
{
{ "vendor", sourceDoc.GetValue("vendor", BsonString.Empty).AsString },
{ "stream", sourceDoc.GetValue("stream", BsonString.Empty).AsString },
{ "api", sourceDoc.GetValue("connector", BsonString.Empty).AsString },
{ "collectorVersion", sourceDoc.GetValue("version", BsonNull.Value) }
}
},
{
"upstream", new BsonDocument
{
{ "upstream_id", upstreamDoc.GetValue("upstream_id", BsonString.Empty).AsString },
{ "document_version", upstreamDoc.GetValue("document_version", BsonNull.Value) },
{ "fetchedAt", retrievedAt },
{ "receivedAt", ingestedAt },
{ "contentHash", upstreamDoc.GetValue("content_hash", BsonString.Empty).AsString },
{
"signature", MapSignature(upstreamDoc.GetValue("signature", new BsonDocument()).AsBsonDocument)
},
{ "metadata", upstreamDoc.GetValue("provenance", new BsonDocument()) }
}
},
{
"content", new BsonDocument
{
{ "format", contentDoc.GetValue("format", BsonString.Empty).AsString },
{ "specVersion", contentDoc.GetValue("spec_version", BsonNull.Value) },
{ "raw", contentDoc.GetValue("raw", new BsonDocument()) },
{ "metadata", new BsonDocument() }
}
},
{ "linkset", MapLinkset(linksetDoc) },
{ "rawLinkset", MapRawLinkset(linksetDoc, rawDoc.GetValue("identifiers", new BsonDocument()).AsBsonDocument) },
{ "createdAt", ingestedAt },
{ "ingestedAt", ingestedAt },
{ BackfillMarkerField, BackfillMarkerValue }
};
return observation;
}
catch (Exception)
{
// Best effort: a malformed raw document is skipped rather than failing the whole backfill.
return null;
}
}
private static BsonDocument MapSignature(BsonDocument signatureDoc)
{
return new BsonDocument
{
{ "present", signatureDoc.GetValue("present", BsonBoolean.False).AsBoolean },
{ "format", signatureDoc.GetValue("format", BsonNull.Value) },
{ "keyId", signatureDoc.GetValue("key_id", BsonNull.Value) },
{ "signature", signatureDoc.GetValue("sig", BsonNull.Value) }
};
}
private static BsonDocument MapLinkset(BsonDocument linksetDoc)
{
return new BsonDocument
{
{ "aliases", linksetDoc.GetValue("aliases", new BsonArray()) },
{ "purls", linksetDoc.GetValue("purls", new BsonArray()) },
{ "cpes", linksetDoc.GetValue("cpes", new BsonArray()) },
{ "references", MapReferences(linksetDoc.GetValue("references", new BsonArray()).AsBsonArray) }
};
}
private static BsonArray MapReferences(BsonArray referencesArray)
{
var result = new BsonArray();
foreach (var refValue in referencesArray)
{
if (!refValue.IsBsonDocument)
{
continue;
}
var refDoc = refValue.AsBsonDocument;
result.Add(new BsonDocument
{
{ "type", refDoc.GetValue("type", BsonString.Empty).AsString },
{ "url", refDoc.GetValue("url", BsonString.Empty).AsString }
});
}
return result;
}
private static BsonDocument MapRawLinkset(BsonDocument linksetDoc, BsonDocument identifiersDoc)
{
var aliases = new BsonArray();
if (identifiersDoc.TryGetValue("primary", out var primary) && !primary.IsBsonNull)
{
aliases.Add(primary);
}
if (identifiersDoc.TryGetValue("aliases", out var idAliases) && idAliases.IsBsonArray)
{
foreach (var alias in idAliases.AsBsonArray)
{
aliases.Add(alias);
}
}
if (linksetDoc.TryGetValue("aliases", out var linkAliases) && linkAliases.IsBsonArray)
{
foreach (var alias in linkAliases.AsBsonArray)
{
aliases.Add(alias);
}
}
return new BsonDocument
{
{ "aliases", aliases },
{ "scopes", new BsonArray() },
{ "relationships", new BsonArray() },
{ "purls", linksetDoc.GetValue("purls", new BsonArray()) },
{ "cpes", linksetDoc.GetValue("cpes", new BsonArray()) },
{ "references", linksetDoc.GetValue("references", new BsonArray()) },
{ "reconciled_from", linksetDoc.GetValue("reconciled_from", new BsonArray()) },
{ "notes", linksetDoc.GetValue("notes", new BsonDocument()) }
};
}
private static string ExtractAdvisoryKeyFromGroup(BsonDocument groupId)
{
var advisoryKeyValue = groupId.GetValue("advisoryKey", BsonNull.Value);
if (advisoryKeyValue.IsBsonArray)
{
var array = advisoryKeyValue.AsBsonArray;
return array.Count > 0 ? array[0].AsString : string.Empty;
}
return advisoryKeyValue.IsBsonNull ? string.Empty : advisoryKeyValue.AsString;
}
private static IReadOnlyList<string> FlattenArrayOfArrays(BsonArray arrayOfArrays)
{
var result = new List<string>();
foreach (var item in arrayOfArrays)
{
if (item.IsBsonArray)
{
foreach (var subItem in item.AsBsonArray)
{
if (subItem.IsString && !string.IsNullOrWhiteSpace(subItem.AsString))
{
result.Add(subItem.AsString);
}
}
}
else if (item.IsString && !string.IsNullOrWhiteSpace(item.AsString))
{
result.Add(item.AsString);
}
}
return result;
}
private static DateTime GetDateTime(BsonDocument doc, string field)
{
if (!doc.TryGetValue(field, out var value) || value.IsBsonNull)
{
return DateTime.UtcNow;
}
return value.BsonType switch
{
BsonType.DateTime => value.ToUniversalTime(),
BsonType.String when DateTime.TryParse(value.AsString, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed)
=> parsed.ToUniversalTime(),
BsonType.Int64 => DateTimeOffset.FromUnixTimeMilliseconds(value.AsInt64).UtcDateTime,
_ => DateTime.UtcNow
};
}
private static string SanitizeIdSegment(string value)
{
if (string.IsNullOrWhiteSpace(value))
{
return "unknown";
}
var sanitized = string.Concat(value.Select(c =>
char.IsLetterOrDigit(c) ? char.ToLowerInvariant(c) : (c is '-' or '.' ? c : '-')));
sanitized = sanitized.Trim('-');
if (string.IsNullOrEmpty(sanitized))
{
return "unknown";
}
return sanitized.Length > 48 ? sanitized[..48] : sanitized;
}
private static string ShortenHash(string hash)
{
if (string.IsNullOrWhiteSpace(hash))
{
return "0";
}
var clean = hash.Replace(":", "-");
return clean.Length > 12 ? clean[..12] : clean;
}
}
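The observation identifier built by `BuildObservationIdFromRaw`, `SanitizeIdSegment`, and `ShortenHash` can be sketched compactly (Python for illustration only; function names are ours, and the empty-field guard clauses of the C# builder are elided):

```python
def sanitize_segment(value, max_len=48):
    """Mirrors SanitizeIdSegment: letters/digits are lowercased and kept,
    '-' and '.' pass through, everything else becomes '-'; the result is
    trimmed of leading/trailing dashes and capped at 48 characters."""
    if not value or value.isspace():
        return "unknown"
    out = "".join(
        c.lower() if c.isalnum() else (c if c in "-." else "-")
        for c in value
    )
    out = out.strip("-")
    if not out:
        return "unknown"
    return out[:max_len]

def observation_id(tenant, vendor, upstream_id, content_hash):
    """obs:{tenant}:{vendor}:{sanitized upstream id}:{first 12 chars of hash}
    with ':' in the hash replaced by '-' (ShortenHash)."""
    short = content_hash.replace(":", "-")[:12] if content_hash.strip() else "0"
    return f"obs:{tenant}:{vendor}:{sanitize_segment(upstream_id)}:{short}"
```

Because the id is a pure function of tenant, vendor, upstream id, and content hash, re-running the backfill produces the same ids and duplicate inserts are safely rejected by the `_id` key.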


@@ -1,243 +0,0 @@
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureLinkNotMergeCollectionsMigration : IMongoMigration
{
public string Id => "20251116_link_not_merge_collections";
public string Description => "Ensure advisory_observations and advisory_linksets collections exist with validators and indexes for Link-Not-Merge";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
await EnsureObservationsAsync(database, cancellationToken).ConfigureAwait(false);
await EnsureLinksetsAsync(database, cancellationToken).ConfigureAwait(false);
}
private static async Task EnsureObservationsAsync(IMongoDatabase database, CancellationToken ct)
{
var collectionName = MongoStorageDefaults.Collections.AdvisoryObservations;
var validator = new BsonDocument("$jsonSchema", BuildObservationSchema());
await EnsureCollectionWithValidatorAsync(database, collectionName, validator, ct).ConfigureAwait(false);
var collection = database.GetCollection<BsonDocument>(collectionName);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(new BsonDocument
{
{"tenant", 1},
{"source", 1},
{"advisoryId", 1},
{"upstream.fetchedAt", -1},
},
new CreateIndexOptions { Name = "obs_tenant_source_adv_fetchedAt" }),
new(new BsonDocument
{
{"provenance.sourceArtifactSha", 1},
},
// Uniqueness assumes every observation carries provenance.sourceArtifactSha;
// documents missing the field would all index (and collide) on the null key.
new CreateIndexOptions { Name = "obs_prov_sourceArtifactSha_unique", Unique = true }),
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureLinksetsAsync(IMongoDatabase database, CancellationToken ct)
{
var collectionName = MongoStorageDefaults.Collections.AdvisoryLinksets;
var validator = new BsonDocument("$jsonSchema", BuildLinksetSchema());
await EnsureCollectionWithValidatorAsync(database, collectionName, validator, ct).ConfigureAwait(false);
var collection = database.GetCollection<BsonDocument>(collectionName);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(new BsonDocument
{
{"tenantId", 1},
{"advisoryId", 1},
{"source", 1},
},
new CreateIndexOptions { Name = "linkset_tenant_advisory_source", Unique = true }),
new(new BsonDocument { { "observations", 1 } }, new CreateIndexOptions { Name = "linkset_observations" })
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureCollectionWithValidatorAsync(
IMongoDatabase database,
string collectionName,
BsonDocument validator,
CancellationToken ct)
{
var filter = new BsonDocument("name", collectionName);
var existing = await database.ListCollectionsAsync(new ListCollectionsOptions { Filter = filter }, ct)
.ConfigureAwait(false);
var exists = await existing.AnyAsync(ct).ConfigureAwait(false);
if (!exists)
{
var options = new CreateCollectionOptions<BsonDocument>
{
Validator = validator,
ValidationLevel = DocumentValidationLevel.Moderate,
ValidationAction = DocumentValidationAction.Error,
};
await database.CreateCollectionAsync(collectionName, options, ct).ConfigureAwait(false);
}
else
{
var command = new BsonDocument
{
{ "collMod", collectionName },
{ "validator", validator },
{ "validationLevel", "moderate" },
{ "validationAction", "error" },
};
await database.RunCommandAsync<BsonDocument>(command, cancellationToken: ct).ConfigureAwait(false);
}
}
private static BsonDocument BuildObservationSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "_id", "tenantId", "source", "advisoryId", "affected", "provenance", "ingestedAt" } },
{ "properties", new BsonDocument
{
{ "_id", new BsonDocument("bsonType", "string") },
{ "tenantId", new BsonDocument("bsonType", "string") },
{ "source", new BsonDocument("bsonType", "string") },
{ "advisoryId", new BsonDocument("bsonType", "string") },
{ "title", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "summary", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "severities", new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "system", "score" } },
{ "properties", new BsonDocument
{
{ "system", new BsonDocument("bsonType", "string") },
{ "score", new BsonDocument("bsonType", new BsonArray { "double", "int", "long", "decimal" }) },
{ "vector", new BsonDocument("bsonType", new BsonArray { "string", "null" }) }
}
}
}
}
}
},
{ "affected", new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "purl" } },
{ "properties", new BsonDocument
{
{ "purl", new BsonDocument("bsonType", "string") },
{ "package", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "versions", new BsonDocument("bsonType", new BsonArray { "array", "null" }) },
{ "ranges", new BsonDocument("bsonType", new BsonArray { "array", "null" }) },
{ "ecosystem", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "cpe", new BsonDocument("bsonType", new BsonArray { "array", "null" }) },
{ "cpes", new BsonDocument("bsonType", new BsonArray { "array", "null" }) }
}
}
}
}
}
},
{ "references", new BsonDocument
{
{ "bsonType", new BsonArray { "array", "null" } },
{ "items", new BsonDocument("bsonType", "string") }
}
},
{ "weaknesses", new BsonDocument
{
{ "bsonType", new BsonArray { "array", "null" } },
{ "items", new BsonDocument("bsonType", "string") }
}
},
{ "published", new BsonDocument("bsonType", new BsonArray { "date", "null" }) },
{ "modified", new BsonDocument("bsonType", new BsonArray { "date", "null" }) },
{ "provenance", new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "sourceArtifactSha", "fetchedAt" } },
{ "properties", new BsonDocument
{
{ "sourceArtifactSha", new BsonDocument("bsonType", "string") },
{ "fetchedAt", new BsonDocument("bsonType", "date") },
{ "ingestJobId", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "signature", new BsonDocument("bsonType", new BsonArray { "object", "null" }) }
}
}
}
},
{ "ingestedAt", new BsonDocument("bsonType", "date") }
}
}
};
}
private static BsonDocument BuildLinksetSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "_id", "tenantId", "source", "advisoryId", "observations", "createdAt" } },
{ "properties", new BsonDocument
{
{ "_id", new BsonDocument("bsonType", "objectId") },
{ "tenantId", new BsonDocument("bsonType", "string") },
{ "source", new BsonDocument("bsonType", "string") },
{ "advisoryId", new BsonDocument("bsonType", "string") },
{ "observations", new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument("bsonType", "string") }
}
},
{ "normalized", new BsonDocument
{
{ "bsonType", new BsonArray { "object", "null" } },
{ "properties", new BsonDocument
{
{ "purls", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "string") } } },
{ "cpes", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "string") } } },
{ "versions", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "string") } } },
{ "ranges", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "object") } } },
{ "severities", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "object") } } }
}
}
}
},
{ "createdAt", new BsonDocument("bsonType", "date") },
{ "builtByJobId", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "provenance", new BsonDocument
{
{ "bsonType", new BsonArray { "object", "null" } },
{ "properties", new BsonDocument
{
{ "observationHashes", new BsonDocument { { "bsonType", new BsonArray { "array", "null" } }, { "items", new BsonDocument("bsonType", "string") } } },
{ "toolVersion", new BsonDocument("bsonType", new BsonArray { "string", "null" }) },
{ "policyHash", new BsonDocument("bsonType", new BsonArray { "string", "null" }) }
}
}
}
}
}
}
};
}
}


@@ -1,203 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Adds hashed shard key indexes and TTL indexes for LNM collections.
/// Per LNM-21-101-DEV: hashed shard keys for horizontal scaling, tenant indexes, TTL for ingest metadata.
/// </summary>
internal sealed class EnsureLinkNotMergeShardingAndTtlMigration : IMongoMigration
{
private readonly MongoStorageOptions _options;
public EnsureLinkNotMergeShardingAndTtlMigration(IOptions<MongoStorageOptions> options)
{
ArgumentNullException.ThrowIfNull(options);
_options = options.Value;
}
public string Id => "20251127_lnm_sharding_and_ttl";
public string Description => "Add hashed shard key indexes and TTL indexes for advisory_observations and advisory_linksets (LNM-21-101-DEV)";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
await EnsureObservationShardingAndTtlAsync(database, cancellationToken).ConfigureAwait(false);
await EnsureLinksetShardingAndTtlAsync(database, cancellationToken).ConfigureAwait(false);
await EnsureLinksetEventShardingAndTtlAsync(database, cancellationToken).ConfigureAwait(false);
}
private async Task EnsureObservationShardingAndTtlAsync(IMongoDatabase database, CancellationToken ct)
{
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var indexes = new List<CreateIndexModel<BsonDocument>>();
// Hashed shard key on tenantId for horizontal scaling
indexes.Add(new CreateIndexModel<BsonDocument>(
new BsonDocument("tenantId", "hashed"),
new CreateIndexOptions { Name = "obs_tenantId_hashed", Background = true }));
// TTL index on ingestedAt if retention is configured
var needsTtl = _options.ObservationRetention > TimeSpan.Zero;
if (needsTtl)
{
await EnsureTtlIndexAsync(
collection,
"ingestedAt",
"obs_ingestedAt_ttl",
_options.ObservationRetention,
ct).ConfigureAwait(false);
}
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private async Task EnsureLinksetShardingAndTtlAsync(IMongoDatabase database, CancellationToken ct)
{
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryLinksets);
var indexes = new List<CreateIndexModel<BsonDocument>>();
// Hashed shard key on tenantId for horizontal scaling
indexes.Add(new CreateIndexModel<BsonDocument>(
new BsonDocument("tenantId", "hashed"),
new CreateIndexOptions { Name = "linkset_tenantId_hashed", Background = true }));
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
// TTL index on createdAt if retention is configured
var needsTtl = _options.LinksetRetention > TimeSpan.Zero;
if (needsTtl)
{
await EnsureTtlIndexAsync(
collection,
"createdAt",
"linkset_createdAt_ttl",
_options.LinksetRetention,
ct).ConfigureAwait(false);
}
}
private async Task EnsureLinksetEventShardingAndTtlAsync(IMongoDatabase database, CancellationToken ct)
{
// Check if linkset events collection exists (future-proofing for event outbox)
var collectionName = "advisory_linkset_events";
var filter = new BsonDocument("name", collectionName);
using var cursor = await database.ListCollectionsAsync(new ListCollectionsOptions { Filter = filter }, ct).ConfigureAwait(false);
var exists = await cursor.AnyAsync(ct).ConfigureAwait(false);
if (!exists)
{
// Create the collection for linkset events with basic schema
var validator = new BsonDocument("$jsonSchema", new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray { "_id", "tenantId", "eventId", "linksetId", "createdAt" } },
{ "properties", new BsonDocument
{
{ "_id", new BsonDocument("bsonType", "objectId") },
{ "tenantId", new BsonDocument("bsonType", "string") },
{ "eventId", new BsonDocument("bsonType", "string") },
{ "linksetId", new BsonDocument("bsonType", "string") },
{ "advisoryId", new BsonDocument("bsonType", "string") },
{ "payload", new BsonDocument("bsonType", "object") },
{ "createdAt", new BsonDocument("bsonType", "date") },
{ "publishedAt", new BsonDocument("bsonType", new BsonArray { "date", "null" }) }
}
}
});
var createOptions = new CreateCollectionOptions<BsonDocument>
{
Validator = validator,
ValidationLevel = DocumentValidationLevel.Moderate,
ValidationAction = DocumentValidationAction.Error,
};
await database.CreateCollectionAsync(collectionName, createOptions, ct).ConfigureAwait(false);
}
var collection = database.GetCollection<BsonDocument>(collectionName);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
// Hashed shard key
new(new BsonDocument("tenantId", "hashed"),
new CreateIndexOptions { Name = "linkset_event_tenantId_hashed", Background = true }),
// Unique event ID index
new(new BsonDocument("eventId", 1),
new CreateIndexOptions { Name = "linkset_event_eventId_unique", Unique = true, Background = true }),
// Outbox processing index (unpublished events)
new(new BsonDocument { { "publishedAt", 1 }, { "createdAt", 1 } },
new CreateIndexOptions { Name = "linkset_event_outbox", Background = true })
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
// TTL for event cleanup
var needsTtl = _options.EventRetention > TimeSpan.Zero;
if (needsTtl)
{
await EnsureTtlIndexAsync(
collection,
"createdAt",
"linkset_event_createdAt_ttl",
_options.EventRetention,
ct).ConfigureAwait(false);
}
}
private static async Task EnsureTtlIndexAsync(
IMongoCollection<BsonDocument> collection,
string field,
string indexName,
TimeSpan expiration,
CancellationToken ct)
{
using var cursor = await collection.Indexes.ListAsync(ct).ConfigureAwait(false);
var indexes = await cursor.ToListAsync(ct).ConfigureAwait(false);
var existing = indexes.FirstOrDefault(x =>
x.TryGetValue("name", out var name) &&
name.IsString &&
name.AsString == indexName);
if (existing is not null)
{
// Check if TTL value matches expected
if (existing.TryGetValue("expireAfterSeconds", out var expireAfter))
{
var expectedSeconds = (long)expiration.TotalSeconds;
if (expireAfter.ToInt64() == expectedSeconds)
{
return; // Index already correct
}
}
// Drop and recreate with correct TTL
await collection.Indexes.DropOneAsync(indexName, ct).ConfigureAwait(false);
}
var options = new CreateIndexOptions<BsonDocument>
{
Name = indexName,
ExpireAfter = expiration,
Background = true
};
var keys = Builders<BsonDocument>.IndexKeys.Ascending(field);
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: ct).ConfigureAwait(false);
}
}


@@ -1,102 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Storage.Mongo.Orchestrator;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureOrchestratorCollectionsMigration : IMongoMigration
{
public string Id => "20251122_orchestrator_registry_commands";
public string Description => "Ensure orchestrator registry, commands, and heartbeats collections exist with indexes";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
await EnsureRegistryAsync(database, cancellationToken).ConfigureAwait(false);
await EnsureCommandsAsync(database, cancellationToken).ConfigureAwait(false);
await EnsureHeartbeatsAsync(database, cancellationToken).ConfigureAwait(false);
}
private static async Task EnsureRegistryAsync(IMongoDatabase database, CancellationToken ct)
{
var name = MongoStorageDefaults.Collections.OrchestratorRegistry;
await EnsureCollectionAsync(database, name, ct).ConfigureAwait(false);
var collection = database.GetCollection<BsonDocument>(name);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(new BsonDocument
{
{"tenant", 1},
{"connectorId", 1},
}, new CreateIndexOptions { Name = "orch_registry_tenant_connector", Unique = true }),
new(new BsonDocument
{
{"source", 1},
}, new CreateIndexOptions { Name = "orch_registry_source" }),
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureCommandsAsync(IMongoDatabase database, CancellationToken ct)
{
var name = MongoStorageDefaults.Collections.OrchestratorCommands;
await EnsureCollectionAsync(database, name, ct).ConfigureAwait(false);
var collection = database.GetCollection<BsonDocument>(name);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(new BsonDocument
{
{"tenant", 1},
{"connectorId", 1},
{"runId", 1},
{"sequence", 1},
}, new CreateIndexOptions { Name = "orch_cmd_tenant_connector_run_seq" }),
new(new BsonDocument { {"expiresAt", 1} }, new CreateIndexOptions
{
Name = "orch_cmd_expiresAt_ttl",
ExpireAfter = TimeSpan.FromSeconds(0),
})
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureHeartbeatsAsync(IMongoDatabase database, CancellationToken ct)
{
var name = MongoStorageDefaults.Collections.OrchestratorHeartbeats;
await EnsureCollectionAsync(database, name, ct).ConfigureAwait(false);
var collection = database.GetCollection<BsonDocument>(name);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(new BsonDocument
{
{"tenant", 1},
{"connectorId", 1},
{"runId", 1},
{"sequence", 1},
}, new CreateIndexOptions { Name = "orch_hb_tenant_connector_run_seq" }),
new(new BsonDocument { {"timestamp", -1} }, new CreateIndexOptions { Name = "orch_hb_timestamp_desc" })
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureCollectionAsync(IMongoDatabase database, string collectionName, CancellationToken ct)
{
var filter = new BsonDocument("name", collectionName);
using var cursor = await database.ListCollectionsAsync(new ListCollectionsOptions { Filter = filter }, ct).ConfigureAwait(false);
var exists = await cursor.AnyAsync(ct).ConfigureAwait(false);
if (!exists)
{
await database.CreateCollectionAsync(collectionName, cancellationToken: ct).ConfigureAwait(false);
}
}
}


@@ -1,81 +0,0 @@
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Storage.Mongo.PolicyDelta;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Creates the policy_delta_checkpoints collection with indexes for deterministic policy delta tracking.
/// </summary>
internal sealed class EnsurePolicyDeltaCheckpointsCollectionMigration : IMongoMigration
{
public string Id => "20251128_policy_delta_checkpoints";
public string Description =>
"Creates policy_delta_checkpoints collection with tenant/consumer indexes for deterministic policy deltas (CONCELIER-POLICY-20-003).";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var collectionName = MongoStorageDefaults.Collections.PolicyDeltaCheckpoints;
// Ensure collection exists
var collectionNames = await database
.ListCollectionNames(cancellationToken: cancellationToken)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
var exists = collectionNames.Contains(collectionName);
if (!exists)
{
await database.CreateCollectionAsync(collectionName, cancellationToken: cancellationToken)
.ConfigureAwait(false);
}
var collection = database.GetCollection<PolicyDeltaCheckpointDocument>(collectionName);
// Index: tenantId for listing checkpoints by tenant
var tenantIndex = new CreateIndexModel<PolicyDeltaCheckpointDocument>(
Builders<PolicyDeltaCheckpointDocument>.IndexKeys.Ascending(d => d.TenantId),
new CreateIndexOptions
{
Name = "ix_tenantId",
Background = true
});
// Index: consumerId for querying checkpoints by consumer
var consumerIndex = new CreateIndexModel<PolicyDeltaCheckpointDocument>(
Builders<PolicyDeltaCheckpointDocument>.IndexKeys.Ascending(d => d.ConsumerId),
new CreateIndexOptions
{
Name = "ix_consumerId",
Background = true
});
// Compound index: (tenantId, consumerId) for efficient lookups
var compoundIndex = new CreateIndexModel<PolicyDeltaCheckpointDocument>(
Builders<PolicyDeltaCheckpointDocument>.IndexKeys
.Ascending(d => d.TenantId)
.Ascending(d => d.ConsumerId),
new CreateIndexOptions
{
Name = "ix_tenantId_consumerId",
Background = true
});
// Index: updatedAt for maintenance queries (stale checkpoint detection)
var updatedAtIndex = new CreateIndexModel<PolicyDeltaCheckpointDocument>(
Builders<PolicyDeltaCheckpointDocument>.IndexKeys.Ascending(d => d.UpdatedAt),
new CreateIndexOptions
{
Name = "ix_updatedAt",
Background = true
});
await collection.Indexes.CreateManyAsync(
[tenantIndex, consumerIndex, compoundIndex, updatedAtIndex],
cancellationToken)
.ConfigureAwait(false);
}
}


@@ -1,131 +0,0 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Adds secondary indexes for policy lookup patterns: alias lookups, confidence filtering, and severity-based queries.
/// Supports efficient policy joins without cached verdicts per CONCELIER-POLICY-23-001.
/// </summary>
/// <remarks>
/// Query patterns supported:
/// <list type="bullet">
/// <item>Find observations by alias (CVE-ID, GHSA-ID): db.advisory_observations.find({"linkset.aliases": "cve-2024-1234"})</item>
/// <item>Find linksets by confidence range: db.advisory_linksets.find({"confidence": {$gte: 0.7}})</item>
/// <item>Find linksets by provider severity: db.advisory_linksets.find({"normalized.severities.system": "cvss_v31", "normalized.severities.score": {$gte: 7.0}})</item>
/// <item>Find linksets by tenant and advisory with confidence: db.advisory_linksets.find({"tenantId": "...", "advisoryId": "...", "confidence": {$gte: 0.5}})</item>
/// </list>
/// </remarks>
internal sealed class EnsurePolicyLookupIndexesMigration : IMongoMigration
{
public string Id => "20251128_policy_lookup_indexes";
public string Description => "Add secondary indexes for alias, confidence, and severity-based policy lookups (CONCELIER-POLICY-23-001)";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
await EnsureObservationPolicyIndexesAsync(database, cancellationToken).ConfigureAwait(false);
await EnsureLinksetPolicyIndexesAsync(database, cancellationToken).ConfigureAwait(false);
}
private static async Task EnsureObservationPolicyIndexesAsync(IMongoDatabase database, CancellationToken ct)
{
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
// Multikey index on linkset.aliases for alias-based lookups (CVE-ID, GHSA-ID, etc.)
// Query pattern: db.advisory_observations.find({"linkset.aliases": "cve-2024-1234"})
new(new BsonDocument("linkset.aliases", 1),
new CreateIndexOptions
{
Name = "obs_linkset_aliases",
Background = true,
Sparse = true
}),
// Compound index for tenant + alias lookups
// Query pattern: db.advisory_observations.find({"tenant": "...", "linkset.aliases": "cve-2024-1234"})
new(new BsonDocument { { "tenant", 1 }, { "linkset.aliases", 1 } },
new CreateIndexOptions
{
Name = "obs_tenant_aliases",
Background = true
})
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
private static async Task EnsureLinksetPolicyIndexesAsync(IMongoDatabase database, CancellationToken ct)
{
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryLinksets);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
// Index on confidence for confidence-based filtering
// Query pattern: db.advisory_linksets.find({"confidence": {$gte: 0.7}})
new(new BsonDocument("confidence", -1),
new CreateIndexOptions
{
Name = "linkset_confidence",
Background = true,
Sparse = true
}),
// Compound index for tenant + confidence lookups
// Query pattern: db.advisory_linksets.find({"tenantId": "...", "confidence": {$gte: 0.7}})
new(new BsonDocument { { "tenantId", 1 }, { "confidence", -1 } },
new CreateIndexOptions
{
Name = "linkset_tenant_confidence",
Background = true
}),
// Index on normalized.severities.system for severity system filtering
// Query pattern: db.advisory_linksets.find({"normalized.severities.system": "cvss_v31"})
new(new BsonDocument("normalized.severities.system", 1),
new CreateIndexOptions
{
Name = "linkset_severity_system",
Background = true,
Sparse = true
}),
// Compound index for severity system + score for range queries
// Query pattern: db.advisory_linksets.find({"normalized.severities.system": "cvss_v31", "normalized.severities.score": {$gte: 7.0}})
new(new BsonDocument { { "normalized.severities.system", 1 }, { "normalized.severities.score", -1 } },
new CreateIndexOptions
{
Name = "linkset_severity_system_score",
Background = true,
Sparse = true
}),
// Compound index for tenant + advisory + confidence (policy delta queries)
// Query pattern: db.advisory_linksets.find({"tenantId": "...", "advisoryId": "...", "confidence": {$gte: 0.5}})
new(new BsonDocument { { "tenantId", 1 }, { "advisoryId", 1 }, { "confidence", -1 } },
new CreateIndexOptions
{
Name = "linkset_tenant_advisory_confidence",
Background = true
}),
// Index for createdAt-based pagination (policy delta cursors)
// Query pattern: db.advisory_linksets.find({"tenantId": "...", "createdAt": {$gt: ISODate("...")}}).sort({"createdAt": 1})
new(new BsonDocument { { "tenantId", 1 }, { "createdAt", 1 } },
new CreateIndexOptions
{
Name = "linkset_tenant_createdAt",
Background = true
})
};
await collection.Indexes.CreateManyAsync(indexes, cancellationToken: ct).ConfigureAwait(false);
}
}


@@ -1,24 +0,0 @@
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Represents a single, idempotent MongoDB migration.
/// </summary>
public interface IMongoMigration
{
/// <summary>
/// Unique identifier for the migration. Sorting is performed using ordinal comparison.
/// </summary>
string Id { get; }
/// <summary>
/// Short description surfaced in logs to aid runbooks.
/// </summary>
string Description { get; }
/// <summary>
/// Executes the migration.
/// </summary>
Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken);
}
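As a hedged sketch of how this contract is consumed (the class name, Id, and collection name below are illustrative, not part of the repository), a minimal migration mirrors the ensure-collection pattern used by the concrete migrations above:

```csharp
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;

// Illustrative only: a minimal IMongoMigration that ensures one collection exists.
// The Id follows the date-prefixed convention used by the real migrations.
internal sealed class ExampleEnsureCollectionMigration : IMongoMigration
{
    public string Id => "20990101_example_collection";

    public string Description => "Example: ensure a hypothetical collection exists.";

    public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
    {
        using var cursor = await database.ListCollectionNamesAsync(cancellationToken: cancellationToken).ConfigureAwait(false);
        var names = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
        if (!names.Contains("example_collection"))
        {
            await database.CreateCollectionAsync("example_collection", cancellationToken: cancellationToken).ConfigureAwait(false);
        }
    }
}
```

Because the runner sorts by Id ordinally and records applied Ids, a migration body only needs to be idempotent against its own re-execution during a failed run, not against re-application across runs.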


@@ -1,18 +0,0 @@
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
[BsonIgnoreExtraElements]
internal sealed class MongoMigrationDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("description")]
[BsonIgnoreIfNull]
public string? Description { get; set; }
[BsonElement("appliedAt")]
public DateTime AppliedAtUtc { get; set; }
}


@@ -1,102 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Executes pending schema migrations tracked inside MongoDB to keep upgrades deterministic.
/// </summary>
public sealed class MongoMigrationRunner
{
private readonly IMongoDatabase _database;
private readonly IReadOnlyList<IMongoMigration> _migrations;
private readonly ILogger<MongoMigrationRunner> _logger;
private readonly TimeProvider _timeProvider;
public MongoMigrationRunner(
IMongoDatabase database,
IEnumerable<IMongoMigration> migrations,
ILogger<MongoMigrationRunner> logger,
TimeProvider? timeProvider = null)
{
_database = database ?? throw new ArgumentNullException(nameof(database));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_timeProvider = timeProvider ?? TimeProvider.System;
_migrations = (migrations ?? throw new ArgumentNullException(nameof(migrations)))
.OrderBy(m => m.Id, StringComparer.Ordinal)
.ToArray();
}
public async Task RunAsync(CancellationToken cancellationToken)
{
if (_migrations.Count == 0)
{
return;
}
var collection = _database.GetCollection<MongoMigrationDocument>(MongoStorageDefaults.Collections.Migrations);
await EnsureCollectionExistsAsync(_database, cancellationToken).ConfigureAwait(false);
var appliedIds = await LoadAppliedMigrationIdsAsync(collection, cancellationToken).ConfigureAwait(false);
foreach (var migration in _migrations)
{
if (appliedIds.Contains(migration.Id, StringComparer.Ordinal))
{
continue;
}
_logger.LogInformation("Applying Mongo migration {MigrationId}: {Description}", migration.Id, migration.Description);
try
{
await migration.ApplyAsync(_database, cancellationToken).ConfigureAwait(false);
var document = new MongoMigrationDocument
{
Id = migration.Id,
Description = string.IsNullOrWhiteSpace(migration.Description) ? null : migration.Description,
AppliedAtUtc = _timeProvider.GetUtcNow().UtcDateTime,
};
await collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Mongo migration {MigrationId} applied", migration.Id);
}
catch (Exception ex)
{
_logger.LogError(ex, "Mongo migration {MigrationId} failed", migration.Id);
throw;
}
}
}
private static async Task<HashSet<string>> LoadAppliedMigrationIdsAsync(
IMongoCollection<MongoMigrationDocument> collection,
CancellationToken cancellationToken)
{
using var cursor = await collection.FindAsync(FilterDefinition<MongoMigrationDocument>.Empty, cancellationToken: cancellationToken).ConfigureAwait(false);
var applied = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
var set = new HashSet<string>(StringComparer.Ordinal);
foreach (var document in applied)
{
if (!string.IsNullOrWhiteSpace(document.Id))
{
set.Add(document.Id);
}
}
return set;
}
private static async Task EnsureCollectionExistsAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
using var cursor = await database.ListCollectionNamesAsync(cancellationToken: cancellationToken).ConfigureAwait(false);
var names = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
if (!names.Contains(MongoStorageDefaults.Collections.Migrations, StringComparer.Ordinal))
{
await database.CreateCollectionAsync(MongoStorageDefaults.Collections.Migrations, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
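A minimal consumption sketch (the DI wiring here is assumed for illustration; the actual registration lives outside this diff): migrations are registered as `IMongoMigration` services, and the runner applies any whose Id is not yet recorded in the migrations collection:

```csharp
using System.Threading;
using Microsoft.Extensions.DependencyInjection;
using MongoDB.Driver;

// Hypothetical host wiring for MongoMigrationRunner.
var services = new ServiceCollection();
services.AddLogging();
services.AddSingleton<IMongoDatabase>(_ =>
    new MongoClient("mongodb://localhost:27017").GetDatabase("concelier"));
services.AddSingleton<IMongoMigration, EnsureOrchestratorCollectionsMigration>();
services.AddSingleton<MongoMigrationRunner>();

await using var provider = services.BuildServiceProvider();
var runner = provider.GetRequiredService<MongoMigrationRunner>();
// Idempotent: Ids already present in the migrations collection are skipped.
await runner.RunAsync(CancellationToken.None);
```

Registering every migration against the same `IMongoMigration` service type lets the runner receive them as an `IEnumerable<IMongoMigration>` and order them deterministically by Id.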


@@ -1,81 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Storage.Mongo.Advisories;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class SemVerStyleBackfillMigration : IMongoMigration
{
private readonly MongoStorageOptions _options;
private readonly ILogger<SemVerStyleBackfillMigration> _logger;
public SemVerStyleBackfillMigration(IOptions<MongoStorageOptions> options, ILogger<SemVerStyleBackfillMigration> logger)
{
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string Id => "20251011-semver-style-backfill";
public string Description => "Populate advisory.normalizedVersions for existing documents when SemVer style storage is enabled.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
if (!_options.EnableSemVerStyle)
{
_logger.LogInformation("SemVer style flag disabled; skipping migration {MigrationId}.", Id);
return;
}
var collection = database.GetCollection<AdvisoryDocument>(MongoStorageDefaults.Collections.Advisory);
var filter = Builders<AdvisoryDocument>.Filter.Or(
Builders<AdvisoryDocument>.Filter.Exists(doc => doc.NormalizedVersions, false),
Builders<AdvisoryDocument>.Filter.Where(doc => doc.NormalizedVersions == null || doc.NormalizedVersions.Count == 0));
var batchSize = Math.Max(25, _options.BackfillBatchSize);
// Page forward by AdvisoryKey: documents whose normalizedVersions ends up unset
// still match the base filter, so re-querying it alone would loop forever.
var lastKey = string.Empty;
while (true)
{
var batchFilter = Builders<AdvisoryDocument>.Filter.And(
filter,
Builders<AdvisoryDocument>.Filter.Gt(doc => doc.AdvisoryKey, lastKey));
var pending = await collection.Find(batchFilter)
.SortBy(doc => doc.AdvisoryKey)
.Limit(batchSize)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
if (pending.Count == 0)
{
break;
}
lastKey = pending[^1].AdvisoryKey;
var updates = new List<WriteModel<AdvisoryDocument>>(pending.Count);
foreach (var document in pending)
{
var advisory = CanonicalJsonSerializer.Deserialize<Advisory>(document.Payload.ToJson());
var normalized = NormalizedVersionDocumentFactory.Create(advisory);
if (normalized is null || normalized.Count == 0)
{
updates.Add(new UpdateOneModel<AdvisoryDocument>(
Builders<AdvisoryDocument>.Filter.Eq(doc => doc.AdvisoryKey, document.AdvisoryKey),
Builders<AdvisoryDocument>.Update.Unset(doc => doc.NormalizedVersions)));
continue;
}
updates.Add(new UpdateOneModel<AdvisoryDocument>(
Builders<AdvisoryDocument>.Filter.Eq(doc => doc.AdvisoryKey, document.AdvisoryKey),
Builders<AdvisoryDocument>.Update.Set(doc => doc.NormalizedVersions, normalized)));
}
if (updates.Count > 0)
{
await collection.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
}


@@ -1,413 +0,0 @@
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Storage.Mongo.Migrations;
namespace StellaOps.Concelier.Storage.Mongo;
/// <summary>
/// Ensures required collections and indexes exist before the service begins processing.
/// </summary>
public sealed class MongoBootstrapper
{
private const string RawDocumentBucketName = "documents";
private static readonly string[] RequiredCollections =
{
MongoStorageDefaults.Collections.Source,
MongoStorageDefaults.Collections.SourceState,
MongoStorageDefaults.Collections.Document,
MongoStorageDefaults.Collections.Dto,
MongoStorageDefaults.Collections.Advisory,
MongoStorageDefaults.Collections.AdvisoryRaw,
MongoStorageDefaults.Collections.Alias,
MongoStorageDefaults.Collections.Affected,
MongoStorageDefaults.Collections.Reference,
MongoStorageDefaults.Collections.KevFlag,
MongoStorageDefaults.Collections.RuFlags,
MongoStorageDefaults.Collections.JpFlags,
MongoStorageDefaults.Collections.PsirtFlags,
MongoStorageDefaults.Collections.MergeEvent,
MongoStorageDefaults.Collections.ExportState,
MongoStorageDefaults.Collections.ChangeHistory,
MongoStorageDefaults.Collections.AdvisoryStatements,
MongoStorageDefaults.Collections.AdvisoryConflicts,
MongoStorageDefaults.Collections.AdvisoryObservations,
MongoStorageDefaults.Collections.Locks,
MongoStorageDefaults.Collections.Jobs,
MongoStorageDefaults.Collections.Migrations,
};
private readonly IMongoDatabase _database;
private readonly MongoStorageOptions _options;
private readonly ILogger<MongoBootstrapper> _logger;
private readonly MongoMigrationRunner _migrationRunner;
public MongoBootstrapper(
IMongoDatabase database,
IOptions<MongoStorageOptions> options,
ILogger<MongoBootstrapper> logger,
MongoMigrationRunner migrationRunner)
{
_database = database ?? throw new ArgumentNullException(nameof(database));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_migrationRunner = migrationRunner ?? throw new ArgumentNullException(nameof(migrationRunner));
}
public async Task InitializeAsync(CancellationToken cancellationToken)
{
var existingCollections = await ListCollectionsAsync(cancellationToken).ConfigureAwait(false);
foreach (var collectionName in RequiredCollections)
{
if (!existingCollections.Contains(collectionName))
{
await _database.CreateCollectionAsync(collectionName, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Created Mongo collection {Collection}", collectionName);
}
}
await Task.WhenAll(
EnsureLocksIndexesAsync(cancellationToken),
EnsureJobsIndexesAsync(cancellationToken),
EnsureAdvisoryIndexesAsync(cancellationToken),
EnsureDocumentsIndexesAsync(cancellationToken),
EnsureDtoIndexesAsync(cancellationToken),
EnsureAliasIndexesAsync(cancellationToken),
EnsureAffectedIndexesAsync(cancellationToken),
EnsureReferenceIndexesAsync(cancellationToken),
EnsureSourceStateIndexesAsync(cancellationToken),
EnsurePsirtFlagIndexesAsync(cancellationToken),
EnsureAdvisoryStatementIndexesAsync(cancellationToken),
EnsureAdvisoryConflictIndexesAsync(cancellationToken),
EnsureObservationIndexesAsync(cancellationToken),
EnsureChangeHistoryIndexesAsync(cancellationToken),
EnsureGridFsIndexesAsync(cancellationToken)).ConfigureAwait(false);
await _migrationRunner.RunAsync(cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Mongo bootstrapper completed");
}
private async Task<HashSet<string>> ListCollectionsAsync(CancellationToken cancellationToken)
{
using var cursor = await _database.ListCollectionNamesAsync(cancellationToken: cancellationToken).ConfigureAwait(false);
var list = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
return new HashSet<string>(list, StringComparer.Ordinal);
}
private async Task<bool> CollectionIsViewAsync(string collectionName, CancellationToken cancellationToken)
{
var filter = Builders<BsonDocument>.Filter.Eq("name", collectionName);
var options = new ListCollectionsOptions { Filter = filter };
using var cursor = await _database.ListCollectionsAsync(options, cancellationToken).ConfigureAwait(false);
var collections = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
if (collections.Count == 0)
{
return false;
}
var typeValue = collections[0].GetValue("type", BsonString.Empty).AsString;
return string.Equals(typeValue, "view", StringComparison.OrdinalIgnoreCase);
}
private Task EnsureLocksIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Locks);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("ttlAt"),
new CreateIndexOptions { Name = "ttl_at_ttl", ExpireAfter = TimeSpan.Zero }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureJobsIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Jobs);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Descending("createdAt"),
new CreateIndexOptions { Name = "jobs_createdAt_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("kind").Descending("createdAt"),
new CreateIndexOptions { Name = "jobs_kind_createdAt" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("status").Descending("createdAt"),
new CreateIndexOptions { Name = "jobs_status_createdAt" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private async Task EnsureAdvisoryIndexesAsync(CancellationToken cancellationToken)
{
if (await CollectionIsViewAsync(MongoStorageDefaults.Collections.Advisory, cancellationToken).ConfigureAwait(false))
{
_logger.LogDebug("Skipping advisory index creation because {Collection} is a view", MongoStorageDefaults.Collections.Advisory);
return;
}
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Advisory);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("advisoryKey"),
new CreateIndexOptions { Name = "advisory_key_unique", Unique = true }),
new(
Builders<BsonDocument>.IndexKeys.Descending("modified"),
new CreateIndexOptions { Name = "advisory_modified_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Descending("published"),
new CreateIndexOptions { Name = "advisory_published_desc" }),
};
if (_options.EnableSemVerStyle)
{
indexes.Add(new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys
.Ascending("normalizedVersions.packageId")
.Ascending("normalizedVersions.scheme")
.Ascending("normalizedVersions.type"),
new CreateIndexOptions { Name = "advisory_normalizedVersions_pkg_scheme_type" }));
indexes.Add(new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys.Ascending("normalizedVersions.value"),
new CreateIndexOptions { Name = "advisory_normalizedVersions_value", Sparse = true }));
}
await collection.Indexes.CreateManyAsync(indexes, cancellationToken).ConfigureAwait(false);
}
private Task EnsureDocumentsIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Document);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("sourceName").Ascending("uri"),
new CreateIndexOptions { Name = "document_source_uri_unique", Unique = true }),
new(
Builders<BsonDocument>.IndexKeys.Descending("fetchedAt"),
new CreateIndexOptions { Name = "document_fetchedAt_desc" }),
};
var expiresKey = Builders<BsonDocument>.IndexKeys.Ascending("expiresAt");
var expiresOptions = new CreateIndexOptions<BsonDocument>
{
Name = _options.RawDocumentRetention > TimeSpan.Zero ? "document_expiresAt_ttl" : "document_expiresAt",
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("expiresAt", true),
};
if (_options.RawDocumentRetention > TimeSpan.Zero)
{
expiresOptions.ExpireAfter = TimeSpan.Zero;
}
indexes.Add(new CreateIndexModel<BsonDocument>(expiresKey, expiresOptions));
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAliasIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Alias);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("scheme").Ascending("value"),
new CreateIndexOptions { Name = "alias_scheme_value", Unique = false }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureGridFsIndexesAsync(CancellationToken cancellationToken)
{
if (_options.RawDocumentRetention <= TimeSpan.Zero)
{
return Task.CompletedTask;
}
var collectionName = $"{RawDocumentBucketName}.files";
var collection = _database.GetCollection<BsonDocument>(collectionName);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("metadata.expiresAt"),
new CreateIndexOptions<BsonDocument>
{
Name = "gridfs_files_expiresAt_ttl",
ExpireAfter = TimeSpan.Zero,
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("metadata.expiresAt", true),
}),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAffectedIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Affected);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("platform").Ascending("name"),
new CreateIndexOptions { Name = "affected_platform_name" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("advisoryId"),
new CreateIndexOptions { Name = "affected_advisoryId" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureReferenceIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Reference);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("url"),
new CreateIndexOptions { Name = "reference_url" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("advisoryId"),
new CreateIndexOptions { Name = "reference_advisoryId" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureObservationIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Ascending("upstream.upstream_id")
.Ascending("upstream.document_version"),
new CreateIndexOptions { Name = "advisory_obs_tenant_upstream", Unique = false }),
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Ascending("linkset.aliases"),
new CreateIndexOptions { Name = "advisory_obs_tenant_aliases" }),
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Ascending("linkset.purls"),
new CreateIndexOptions { Name = "advisory_obs_tenant_purls" }),
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Descending("createdAt"),
new CreateIndexOptions { Name = "advisory_obs_tenant_createdAt" })
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAdvisoryStatementIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryStatements);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_statements_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("statementHash"),
new CreateIndexOptions { Name = "advisory_statements_statementHash_unique", Unique = true }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAdvisoryConflictIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryConflicts);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_conflicts_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("conflictHash"),
new CreateIndexOptions { Name = "advisory_conflicts_conflictHash_unique", Unique = true }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureSourceStateIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.SourceState);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("sourceName"),
new CreateIndexOptions { Name = "source_state_unique", Unique = true }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureDtoIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Dto);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("documentId"),
new CreateIndexOptions { Name = "dto_documentId" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("sourceName").Descending("validatedAt"),
new CreateIndexOptions { Name = "dto_source_validated" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private async Task EnsurePsirtFlagIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.PsirtFlags);
try
{
await collection.Indexes.DropOneAsync("psirt_advisoryKey_unique", cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.CodeName == "IndexNotFound")
{
    // The legacy psirt_advisoryKey_unique index was already absent; nothing to drop.
}
var index = new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys.Ascending("vendor"),
new CreateIndexOptions { Name = "psirt_vendor" });
await collection.Indexes.CreateOneAsync(index, cancellationToken: cancellationToken).ConfigureAwait(false);
}
private Task EnsureChangeHistoryIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.ChangeHistory);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("source").Ascending("advisoryKey").Descending("capturedAt"),
new CreateIndexOptions { Name = "history_source_advisory_capturedAt" }),
new(
Builders<BsonDocument>.IndexKeys.Descending("capturedAt"),
new CreateIndexOptions { Name = "history_capturedAt" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("documentId"),
new CreateIndexOptions { Name = "history_documentId" })
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
}
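The locks collection above relies on a Mongo TTL index with `ExpireAfter = TimeSpan.Zero` over `ttlAt`: each document carries its own expiry instant instead of a collection-wide age limit, so the server deletes a lock once the clock passes its stored `ttlAt`. A minimal in-memory model of that rule (Python, illustrative only, not the driver code):

```python
from datetime import datetime, timedelta, timezone

def is_lock_expired(ttl_at: datetime, now: datetime) -> bool:
    """Model of a TTL index with expireAfterSeconds=0: the document
    becomes eligible for deletion once `now` reaches its ttlAt value."""
    return now >= ttl_at

now = datetime(2025, 1, 1, 12, 0, tzinfo=timezone.utc)
live = now + timedelta(minutes=5)    # lease still held
stale = now - timedelta(seconds=1)   # lease past its ttlAt

assert not is_lock_expired(live, now)
assert is_lock_expired(stale, now)
```

The same `ExpireAfter = TimeSpan.Zero` pattern appears again below for `document.expiresAt` and `gridfs.files.metadata.expiresAt`, gated on `RawDocumentRetention`.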


@@ -1,21 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo;
public enum MongoValidationLevel
{
Off,
Moderate,
Strict,
}
public enum MongoValidationAction
{
Warn,
Error,
}
public sealed class MongoCollectionValidatorOptions
{
public MongoValidationLevel Level { get; set; } = MongoValidationLevel.Moderate;
public MongoValidationAction Action { get; set; } = MongoValidationAction.Warn;
}
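`MongoValidationLevel` and `MongoValidationAction` presumably end up as Mongo's `validationLevel`/`validationAction` collection settings. A hedged sketch of that mapping (the translation table is an assumption, not taken from this source; the server-side setting names are standard Mongo):

```python
def to_collmod(level: str, action: str) -> dict:
    """Assumed mapping from the C# enums onto Mongo collection
    validation settings ("off"/"moderate"/"strict", "warn"/"error")."""
    levels = {"Off": "off", "Moderate": "moderate", "Strict": "strict"}
    actions = {"Warn": "warn", "Error": "error"}
    return {"validationLevel": levels[level], "validationAction": actions[action]}

# Defaults from MongoCollectionValidatorOptions: Moderate + Warn.
assert to_collmod("Moderate", "Warn") == {
    "validationLevel": "moderate",
    "validationAction": "warn",
}
```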


@@ -1,223 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Logging;
using MongoDB.Bson;
using MongoDB.Bson.Serialization;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoJobStore : IJobStore
{
private static readonly string PendingStatus = JobRunStatus.Pending.ToString();
private static readonly string RunningStatus = JobRunStatus.Running.ToString();
private readonly IMongoCollection<JobRunDocument> _collection;
private readonly ILogger<MongoJobStore> _logger;
public MongoJobStore(IMongoCollection<JobRunDocument> collection, ILogger<MongoJobStore> logger)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<JobRunSnapshot> CreateAsync(JobRunCreateRequest request, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var runId = Guid.NewGuid();
var document = JobRunDocumentExtensions.FromRequest(request, runId);
if (session is null)
{
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.InsertOneAsync(session, document, cancellationToken: cancellationToken).ConfigureAwait(false);
}
_logger.LogDebug("Created job run {RunId} for {Kind} with trigger {Trigger}", runId, request.Kind, request.Trigger);
return document.ToSnapshot();
}
public async Task<JobRunSnapshot?> TryStartAsync(Guid runId, DateTimeOffset startedAt, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var runIdValue = runId.ToString();
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Id, runIdValue)
& Builders<JobRunDocument>.Filter.Eq(x => x.Status, PendingStatus);
var update = Builders<JobRunDocument>.Update
.Set(x => x.Status, RunningStatus)
.Set(x => x.StartedAt, startedAt.UtcDateTime);
var options = new FindOneAndUpdateOptions<JobRunDocument>
{
ReturnDocument = ReturnDocument.After,
};
var result = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
if (result is null)
{
_logger.LogDebug("Failed to start job run {RunId}; status transition rejected", runId);
return null;
}
return result.ToSnapshot();
}
public async Task<JobRunSnapshot?> TryCompleteAsync(Guid runId, JobRunCompletion completion, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var runIdValue = runId.ToString();
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Id, runIdValue)
& Builders<JobRunDocument>.Filter.In(x => x.Status, new[] { PendingStatus, RunningStatus });
var update = Builders<JobRunDocument>.Update
.Set(x => x.Status, completion.Status.ToString())
.Set(x => x.CompletedAt, completion.CompletedAt.UtcDateTime)
.Set(x => x.Error, completion.Error);
var options = new FindOneAndUpdateOptions<JobRunDocument>
{
ReturnDocument = ReturnDocument.After,
};
var result = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
if (result is null)
{
_logger.LogWarning("Failed to mark job run {RunId} as {Status}", runId, completion.Status);
return null;
}
return result.ToSnapshot();
}
public async Task<JobRunSnapshot?> FindAsync(Guid runId, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Id, runId.ToString());
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToSnapshot();
}
public async Task<IReadOnlyList<JobRunSnapshot>> GetRecentRunsAsync(string? kind, int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
if (limit <= 0)
{
return Array.Empty<JobRunSnapshot>();
}
var filter = string.IsNullOrWhiteSpace(kind)
? Builders<JobRunDocument>.Filter.Empty
: Builders<JobRunDocument>.Filter.Eq(x => x.Kind, kind);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.CreatedAt)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static doc => doc.ToSnapshot()).ToArray();
}
public async Task<IReadOnlyList<JobRunSnapshot>> GetActiveRunsAsync(CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<JobRunDocument>.Filter.In(x => x.Status, new[] { PendingStatus, RunningStatus });
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.CreatedAt)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static doc => doc.ToSnapshot()).ToArray();
}
public async Task<JobRunSnapshot?> GetLastRunAsync(string kind, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Kind, kind);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.CreatedAt)
.Limit(1)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.FirstOrDefault()?.ToSnapshot();
}
public async Task<IReadOnlyDictionary<string, JobRunSnapshot>> GetLastRunsAsync(IEnumerable<string> kinds, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
if (kinds is null)
{
throw new ArgumentNullException(nameof(kinds));
}
var kindList = kinds
.Where(static kind => !string.IsNullOrWhiteSpace(kind))
.Select(static kind => kind.Trim())
.Distinct(StringComparer.Ordinal)
.ToArray();
if (kindList.Length == 0)
{
return new Dictionary<string, JobRunSnapshot>(StringComparer.Ordinal);
}
var matchStage = new BsonDocument("$match", new BsonDocument("kind", new BsonDocument("$in", new BsonArray(kindList))));
var sortStage = new BsonDocument("$sort", new BsonDocument("createdAt", -1));
var groupStage = new BsonDocument("$group", new BsonDocument
{
{ "_id", "$kind" },
{ "document", new BsonDocument("$first", "$$ROOT") }
});
var pipeline = new[] { matchStage, sortStage, groupStage };
var aggregateFluent = session is null
? _collection.Aggregate<BsonDocument>(pipeline)
: _collection.Aggregate<BsonDocument>(session, pipeline);
var aggregate = await aggregateFluent
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
var results = new Dictionary<string, JobRunSnapshot>(StringComparer.Ordinal);
foreach (var element in aggregate)
{
if (!element.TryGetValue("_id", out var idValue) || idValue.BsonType != BsonType.String)
{
continue;
}
if (!element.TryGetValue("document", out var documentValue) || documentValue.BsonType != BsonType.Document)
{
continue;
}
var document = BsonSerializer.Deserialize<JobRunDocument>(documentValue.AsBsonDocument);
results[idValue.AsString] = document.ToSnapshot();
}
return results;
}
}
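`GetLastRunsAsync` finds the newest run per kind with a `$match` → `$sort(createdAt desc)` → `$group` + `$first` pipeline. An in-memory Python model of that pipeline over hypothetical run documents:

```python
def last_runs_by_kind(runs, kinds):
    """Model of the $match -> $sort(createdAt desc) -> $group($first)
    aggregation used by GetLastRunsAsync."""
    matched = [r for r in runs if r["kind"] in kinds]           # $match
    matched.sort(key=lambda r: r["createdAt"], reverse=True)    # $sort desc
    result = {}
    for r in matched:                                           # $group with $first:
        result.setdefault(r["kind"], r)                         # keep newest per kind
    return result

runs = [
    {"kind": "export", "createdAt": 1},
    {"kind": "export", "createdAt": 3},
    {"kind": "fetch",  "createdAt": 2},
]
latest = last_runs_by_kind(runs, {"export", "fetch"})
assert latest["export"]["createdAt"] == 3
assert latest["fetch"]["createdAt"] == 2
```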


@@ -1,116 +0,0 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoLeaseStore : ILeaseStore
{
private readonly IMongoCollection<JobLeaseDocument> _collection;
private readonly ILogger<MongoLeaseStore> _logger;
public MongoLeaseStore(IMongoCollection<JobLeaseDocument> collection, ILogger<MongoLeaseStore> logger)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<JobLease?> TryAcquireAsync(string key, string holder, TimeSpan leaseDuration, DateTimeOffset now, CancellationToken cancellationToken)
{
var nowUtc = now.UtcDateTime;
var ttlUtc = nowUtc.Add(leaseDuration);
var filter = Builders<JobLeaseDocument>.Filter.Eq(x => x.Key, key)
& Builders<JobLeaseDocument>.Filter.Or(
Builders<JobLeaseDocument>.Filter.Lte(x => x.TtlAt, nowUtc),
Builders<JobLeaseDocument>.Filter.Eq(x => x.Holder, holder));
var update = Builders<JobLeaseDocument>.Update
.Set(x => x.Holder, holder)
.Set(x => x.AcquiredAt, nowUtc)
.Set(x => x.HeartbeatAt, nowUtc)
.Set(x => x.LeaseMs, (long)leaseDuration.TotalMilliseconds)
.Set(x => x.TtlAt, ttlUtc);
var options = new FindOneAndUpdateOptions<JobLeaseDocument>
{
ReturnDocument = ReturnDocument.After,
};
var updated = await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false);
if (updated is not null)
{
_logger.LogDebug("Lease {Key} acquired by {Holder}", key, holder);
return updated.ToLease();
}
try
{
var document = new JobLeaseDocument
{
Key = key,
Holder = holder,
AcquiredAt = nowUtc,
HeartbeatAt = nowUtc,
LeaseMs = (long)leaseDuration.TotalMilliseconds,
TtlAt = ttlUtc,
};
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Lease {Key} inserted for {Holder}", key, holder);
return document.ToLease();
}
catch (MongoWriteException ex) when (ex.WriteError.Category == ServerErrorCategory.DuplicateKey)
{
_logger.LogDebug(ex, "Lease {Key} already held by another process", key);
return null;
}
}
public async Task<JobLease?> HeartbeatAsync(string key, string holder, TimeSpan leaseDuration, DateTimeOffset now, CancellationToken cancellationToken)
{
var nowUtc = now.UtcDateTime;
var ttlUtc = nowUtc.Add(leaseDuration);
var filter = Builders<JobLeaseDocument>.Filter.Eq(x => x.Key, key)
& Builders<JobLeaseDocument>.Filter.Eq(x => x.Holder, holder);
var update = Builders<JobLeaseDocument>.Update
.Set(x => x.HeartbeatAt, nowUtc)
.Set(x => x.LeaseMs, (long)leaseDuration.TotalMilliseconds)
.Set(x => x.TtlAt, ttlUtc);
var updated = await _collection.FindOneAndUpdateAsync(
filter,
update,
new FindOneAndUpdateOptions<JobLeaseDocument>
{
ReturnDocument = ReturnDocument.After,
},
cancellationToken).ConfigureAwait(false);
if (updated is null)
{
_logger.LogDebug("Heartbeat rejected for lease {Key} held by {Holder}", key, holder);
}
return updated?.ToLease();
}
public async Task<bool> ReleaseAsync(string key, string holder, CancellationToken cancellationToken)
{
var result = await _collection.DeleteOneAsync(
Builders<JobLeaseDocument>.Filter.Eq(x => x.Key, key)
& Builders<JobLeaseDocument>.Filter.Eq(x => x.Holder, holder),
cancellationToken).ConfigureAwait(false);
if (result.DeletedCount == 0)
{
_logger.LogDebug("Lease {Key} not released by {Holder}; no matching document", key, holder);
return false;
}
_logger.LogDebug("Lease {Key} released by {Holder}", key, holder);
return true;
}
}
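`TryAcquireAsync` takes the lease when it is free, expired (`ttlAt <= now`), or already held by the same holder (a re-entrant renewal); any other contender falls through to the duplicate-key path and gets `null`. A simplified single-process model of that decision (Python, illustrative only; the real store does this atomically via `FindOneAndUpdate` plus a unique-key insert):

```python
from datetime import datetime, timedelta, timezone

def try_acquire(leases, key, holder, duration, now):
    """Model of MongoLeaseStore.TryAcquireAsync: succeed when no lease
    exists, the lease has expired, or the same holder renews it."""
    current = leases.get(key)
    if current is None or current["ttl_at"] <= now or current["holder"] == holder:
        leases[key] = {"holder": holder, "ttl_at": now + duration}
        return True
    return False

now = datetime(2025, 1, 1, tzinfo=timezone.utc)
leases = {}
assert try_acquire(leases, "job", "a", timedelta(minutes=1), now)      # fresh insert
assert not try_acquire(leases, "job", "b", timedelta(minutes=1), now)  # held by "a"
assert try_acquire(leases, "job", "a", timedelta(minutes=1), now)      # re-entrant renew
assert try_acquire(leases, "job", "b", timedelta(minutes=1),
                   now + timedelta(minutes=2))                         # expired: steal
```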


@@ -1,34 +0,0 @@
using Microsoft.Extensions.Options;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public interface IMongoSessionProvider
{
Task<IClientSessionHandle> StartSessionAsync(CancellationToken cancellationToken = default);
}
internal sealed class MongoSessionProvider : IMongoSessionProvider
{
private readonly IMongoClient _client;
private readonly MongoStorageOptions _options;
public MongoSessionProvider(IMongoClient client, IOptions<MongoStorageOptions> options)
{
_client = client ?? throw new ArgumentNullException(nameof(client));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
}
public Task<IClientSessionHandle> StartSessionAsync(CancellationToken cancellationToken = default)
{
var sessionOptions = new ClientSessionOptions
{
DefaultTransactionOptions = new TransactionOptions(
readPreference: ReadPreference.Primary,
readConcern: ReadConcern.Majority,
writeConcern: WriteConcern.WMajority.With(wTimeout: _options.CommandTimeout))
};
return _client.StartSessionAsync(sessionOptions, cancellationToken);
}
}


@@ -1,115 +0,0 @@
using Microsoft.Extensions.Logging;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoSourceStateRepository : ISourceStateRepository
{
private readonly IMongoCollection<SourceStateDocument> _collection;
private const int MaxFailureReasonLength = 1024;
private readonly ILogger<MongoSourceStateRepository> _logger;
public MongoSourceStateRepository(IMongoDatabase database, ILogger<MongoSourceStateRepository> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<SourceStateDocument>(MongoStorageDefaults.Collections.SourceState);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<SourceStateRecord?> TryGetAsync(string sourceName, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, sourceName);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<SourceStateRecord> UpsertAsync(SourceStateRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var document = SourceStateDocumentExtensions.FromRecord(record with { UpdatedAt = DateTimeOffset.UtcNow });
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, record.SourceName);
var options = new ReplaceOptions { IsUpsert = true };
if (session is null)
{
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.ReplaceOneAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
}
_logger.LogDebug("Upserted source state for {Source}", record.SourceName);
return document.ToRecord();
}
public async Task<SourceStateRecord?> UpdateCursorAsync(string sourceName, BsonDocument cursor, DateTimeOffset completedAt, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
var update = Builders<SourceStateDocument>.Update
.Set(x => x.Cursor, cursor ?? new BsonDocument())
.Set(x => x.LastSuccess, completedAt.UtcDateTime)
.Set(x => x.FailCount, 0)
.Set(x => x.BackoffUntil, (DateTime?)null)
.Set(x => x.LastFailureReason, null)
.Set(x => x.UpdatedAt, DateTime.UtcNow)
.SetOnInsert(x => x.SourceName, sourceName);
var options = new FindOneAndUpdateOptions<SourceStateDocument>
{
ReturnDocument = ReturnDocument.After,
IsUpsert = true,
};
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, sourceName);
var document = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<SourceStateRecord?> MarkFailureAsync(string sourceName, DateTimeOffset failedAt, TimeSpan? backoff, string? failureReason, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
var reasonValue = NormalizeFailureReason(failureReason);
var update = Builders<SourceStateDocument>.Update
.Inc(x => x.FailCount, 1)
.Set(x => x.LastFailure, failedAt.UtcDateTime)
.Set(x => x.BackoffUntil, backoff.HasValue ? failedAt.UtcDateTime.Add(backoff.Value) : null)
.Set(x => x.LastFailureReason, reasonValue)
.Set(x => x.UpdatedAt, DateTime.UtcNow)
.SetOnInsert(x => x.SourceName, sourceName);
var options = new FindOneAndUpdateOptions<SourceStateDocument>
{
ReturnDocument = ReturnDocument.After,
IsUpsert = true,
};
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, sourceName);
var document = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
private static string? NormalizeFailureReason(string? reason)
{
if (string.IsNullOrWhiteSpace(reason))
{
return null;
}
var trimmed = reason.Trim();
if (trimmed.Length <= MaxFailureReasonLength)
{
return trimmed;
}
return trimmed[..MaxFailureReasonLength];
}
}
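`MarkFailureAsync` bundles the failure bookkeeping into one atomic update: increment `failCount`, stamp `lastFailure`, compute `backoffUntil` as `failedAt + backoff`, and store the reason trimmed and capped at 1024 characters. A compact Python model of that bookkeeping (field names adapted, logic mirrored):

```python
from datetime import datetime, timedelta, timezone

MAX_REASON = 1024  # mirrors MaxFailureReasonLength

def mark_failure(state, failed_at, backoff=None, reason=None):
    """Model of MongoSourceStateRepository.MarkFailureAsync bookkeeping."""
    trimmed = reason.strip() if reason and reason.strip() else None
    state["fail_count"] = state.get("fail_count", 0) + 1
    state["last_failure"] = failed_at
    state["backoff_until"] = failed_at + backoff if backoff else None
    state["last_failure_reason"] = trimmed[:MAX_REASON] if trimmed else None
    return state

now = datetime(2025, 1, 1, tzinfo=timezone.utc)
s = mark_failure({}, now, timedelta(minutes=5), "  timeout  " + "x" * 2000)
assert s["fail_count"] == 1
assert s["backoff_until"] == now + timedelta(minutes=5)
assert len(s["last_failure_reason"]) == MAX_REASON
```

A successful `UpdateCursorAsync` then resets this state: `failCount` back to 0, `backoffUntil` and `lastFailureReason` cleared.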


@@ -1,38 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo;
public static class MongoStorageDefaults
{
public const string DefaultDatabaseName = "concelier";
public static class Collections
{
public const string Source = "source";
public const string SourceState = "source_state";
public const string Document = "document";
public const string Dto = "dto";
public const string Advisory = "advisory";
public const string AdvisoryRaw = "advisory_raw";
public const string Alias = "alias";
public const string Affected = "affected";
public const string Reference = "reference";
public const string KevFlag = "kev_flag";
public const string RuFlags = "ru_flags";
public const string JpFlags = "jp_flags";
public const string PsirtFlags = "psirt_flags";
public const string MergeEvent = "merge_event";
public const string ExportState = "export_state";
public const string Locks = "locks";
public const string Jobs = "jobs";
public const string Migrations = "schema_migrations";
public const string ChangeHistory = "source_change_history";
public const string AdvisoryStatements = "advisory_statements";
public const string AdvisoryConflicts = "advisory_conflicts";
public const string AdvisoryObservations = "advisory_observations";
public const string AdvisoryLinksets = "advisory_linksets";
public const string AdvisoryObservationEvents = "advisory_observation_events";
public const string OrchestratorRegistry = "orchestrator_registry";
public const string OrchestratorCommands = "orchestrator_commands";
public const string OrchestratorHeartbeats = "orchestrator_heartbeats";
public const string PolicyDeltaCheckpoints = "policy_delta_checkpoints";
}
}


@@ -1,162 +0,0 @@
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoStorageOptions
{
public string ConnectionString { get; set; } = string.Empty;
public string? DatabaseName { get; set; }
public TimeSpan CommandTimeout { get; set; } = TimeSpan.FromSeconds(30);
/// <summary>
/// Retention period for raw documents (document + DTO + GridFS payloads).
/// Set to <see cref="TimeSpan.Zero"/> to disable automatic expiry.
/// </summary>
public TimeSpan RawDocumentRetention { get; set; } = TimeSpan.FromDays(45);
/// <summary>
/// Additional grace period applied on top of <see cref="RawDocumentRetention"/> before TTL purges old rows.
/// Allows the retention background service to delete GridFS blobs first.
/// </summary>
public TimeSpan RawDocumentRetentionTtlGrace { get; set; } = TimeSpan.FromDays(1);
/// <summary>
/// Interval between retention sweeps. Only used when <see cref="RawDocumentRetention"/> is greater than zero.
/// </summary>
public TimeSpan RawDocumentRetentionSweepInterval { get; set; } = TimeSpan.FromHours(6);
/// <summary>
/// Retention period for observation documents (advisory_observations).
/// Set to <see cref="TimeSpan.Zero"/> to disable automatic expiry.
/// Per LNM-21-101-DEV: observations are append-only but may be TTL-pruned for storage efficiency.
/// </summary>
public TimeSpan ObservationRetention { get; set; } = TimeSpan.Zero;
/// <summary>
/// Retention period for linkset documents (advisory_linksets).
/// Set to <see cref="TimeSpan.Zero"/> to disable automatic expiry.
/// </summary>
public TimeSpan LinksetRetention { get; set; } = TimeSpan.Zero;
/// <summary>
/// Retention period for event documents (advisory_observation_events, advisory_linkset_events).
/// Set to <see cref="TimeSpan.Zero"/> to disable automatic expiry.
/// </summary>
public TimeSpan EventRetention { get; set; } = TimeSpan.FromDays(30);
/// <summary>
/// Enables dual-write of normalized SemVer analytics for affected packages.
/// </summary>
public bool EnableSemVerStyle { get; set; } = true;
/// <summary>
/// Batch size used by backfill migrations when repopulating normalized version documents.
/// </summary>
public int BackfillBatchSize { get; set; } = 250;
/// <summary>
/// Tenant identifier associated with advisory ingestion when upstream context does not specify one.
/// </summary>
public string DefaultTenant { get; set; } = "tenant-default";
/// <summary>
/// JSON schema validator settings for the advisory_raw collection.
/// </summary>
public MongoCollectionValidatorOptions AdvisoryRawValidator { get; set; } = new();
public string GetDatabaseName()
{
if (!string.IsNullOrWhiteSpace(DatabaseName))
{
return DatabaseName.Trim();
}
if (!string.IsNullOrWhiteSpace(ConnectionString))
{
var url = MongoUrl.Create(ConnectionString);
if (!string.IsNullOrWhiteSpace(url.DatabaseName))
{
return url.DatabaseName;
}
}
return MongoStorageDefaults.DefaultDatabaseName;
}
public void EnsureValid()
{
var isTesting = string.Equals(
Environment.GetEnvironmentVariable("DOTNET_ENVIRONMENT"),
"Testing",
StringComparison.OrdinalIgnoreCase);
var bypass = string.Equals(
Environment.GetEnvironmentVariable("CONCELIER_BYPASS_MONGO"),
"1",
StringComparison.OrdinalIgnoreCase);
if (isTesting || bypass)
{
// Under test, skip validation entirely; callers may stub Mongo.
return;
}
if (string.IsNullOrWhiteSpace(ConnectionString))
{
var fallback = Environment.GetEnvironmentVariable("CONCELIER_TEST_STORAGE_DSN");
if (!string.IsNullOrWhiteSpace(fallback))
{
ConnectionString = fallback;
}
}
if (string.IsNullOrWhiteSpace(ConnectionString))
{
throw new InvalidOperationException("Mongo connection string is not configured.");
}
if (CommandTimeout <= TimeSpan.Zero)
{
throw new InvalidOperationException("Command timeout must be greater than zero.");
}
if (RawDocumentRetention < TimeSpan.Zero)
{
throw new InvalidOperationException("Raw document retention cannot be negative.");
}
if (RawDocumentRetentionTtlGrace < TimeSpan.Zero)
{
throw new InvalidOperationException("Raw document retention TTL grace cannot be negative.");
}
if (RawDocumentRetention > TimeSpan.Zero && RawDocumentRetentionSweepInterval <= TimeSpan.Zero)
{
throw new InvalidOperationException("Raw document retention sweep interval must be positive when retention is enabled.");
}
if (BackfillBatchSize <= 0)
{
throw new InvalidOperationException("Backfill batch size must be greater than zero.");
}
if (string.IsNullOrWhiteSpace(DefaultTenant))
{
throw new InvalidOperationException("Default tenant must be provided for advisory ingestion.");
}
AdvisoryRawValidator ??= new MongoCollectionValidatorOptions();
if (!Enum.IsDefined(typeof(MongoValidationLevel), AdvisoryRawValidator.Level))
{
throw new InvalidOperationException($"Unsupported Mongo validation level '{AdvisoryRawValidator.Level}' configured for advisory raw validator.");
}
if (!Enum.IsDefined(typeof(MongoValidationAction), AdvisoryRawValidator.Action))
{
throw new InvalidOperationException($"Unsupported Mongo validation action '{AdvisoryRawValidator.Action}' configured for advisory raw validator.");
}
_ = GetDatabaseName();
}
}
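
For reference, these options are typically bound from application configuration. A hypothetical appsettings fragment is sketched below; the section path (`Concelier:Storage:Mongo`) and the concrete values are assumptions for illustration, not taken from this commit. `TimeSpan` properties bind from the .NET `d.hh:mm:ss` string format.

```json
{
  "Concelier": {
    "Storage": {
      "Mongo": {
        "ConnectionString": "mongodb://localhost:27017/concelier",
        "CommandTimeout": "00:00:30",
        "RawDocumentRetention": "45.00:00:00",
        "RawDocumentRetentionTtlGrace": "1.00:00:00",
        "RawDocumentRetentionSweepInterval": "06:00:00",
        "EventRetention": "30.00:00:00",
        "BackfillBatchSize": 250,
        "DefaultTenant": "tenant-default"
      }
    }
  }
}
```

Setting any retention value to `"00:00:00"` (i.e. `TimeSpan.Zero`) disables the corresponding automatic expiry, matching the property defaults above.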


@@ -1,247 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("tenant")]
public string Tenant { get; set; } = string.Empty;
[BsonElement("source")]
public AdvisoryObservationSourceDocument Source { get; set; } = new();
[BsonElement("upstream")]
public AdvisoryObservationUpstreamDocument Upstream { get; set; } = new();
[BsonElement("content")]
public AdvisoryObservationContentDocument Content { get; set; } = new();
[BsonElement("linkset")]
public AdvisoryObservationLinksetDocument Linkset { get; set; } = new();
[BsonElement("rawLinkset")]
[BsonIgnoreIfNull]
public AdvisoryObservationRawLinksetDocument? RawLinkset { get; set; }
= null;
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; }
= DateTime.UtcNow;
[BsonElement("attributes")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Attributes { get; set; }
= new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationSourceDocument
{
[BsonElement("vendor")]
public string Vendor { get; set; } = string.Empty;
[BsonElement("stream")]
public string Stream { get; set; } = string.Empty;
[BsonElement("api")]
public string Api { get; set; } = string.Empty;
[BsonElement("collectorVersion")]
[BsonIgnoreIfNull]
public string? CollectorVersion { get; set; }
= null;
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationUpstreamDocument
{
[BsonElement("upstream_id")]
public string UpstreamId { get; set; } = string.Empty;
[BsonElement("document_version")]
[BsonIgnoreIfNull]
public string? DocumentVersion { get; set; }
= null;
[BsonElement("fetchedAt")]
public DateTime FetchedAt { get; set; }
= DateTime.UtcNow;
[BsonElement("receivedAt")]
public DateTime ReceivedAt { get; set; }
= DateTime.UtcNow;
[BsonElement("contentHash")]
public string ContentHash { get; set; } = string.Empty;
[BsonElement("signature")]
public AdvisoryObservationSignatureDocument Signature { get; set; } = new();
[BsonElement("metadata")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Metadata { get; set; }
= new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationSignatureDocument
{
[BsonElement("present")]
public bool Present { get; set; }
= false;
[BsonElement("format")]
[BsonIgnoreIfNull]
public string? Format { get; set; }
= null;
[BsonElement("keyId")]
[BsonIgnoreIfNull]
public string? KeyId { get; set; }
= null;
[BsonElement("signature")]
[BsonIgnoreIfNull]
public string? Signature { get; set; }
= null;
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationContentDocument
{
[BsonElement("format")]
public string Format { get; set; } = string.Empty;
[BsonElement("specVersion")]
[BsonIgnoreIfNull]
public string? SpecVersion { get; set; }
= null;
[BsonElement("raw")]
public BsonDocument Raw { get; set; } = new();
[BsonElement("metadata")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Metadata { get; set; }
= new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationLinksetDocument
{
[BsonElement("aliases")]
[BsonIgnoreIfNull]
public List<string>? Aliases { get; set; }
= new();
[BsonElement("purls")]
[BsonIgnoreIfNull]
public List<string>? Purls { get; set; }
= new();
[BsonElement("cpes")]
[BsonIgnoreIfNull]
public List<string>? Cpes { get; set; }
= new();
[BsonElement("references")]
[BsonIgnoreIfNull]
public List<AdvisoryObservationReferenceDocument>? References { get; set; }
= new();
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationReferenceDocument
{
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("url")]
public string Url { get; set; } = string.Empty;
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationRawLinksetDocument
{
[BsonElement("aliases")]
[BsonIgnoreIfNull]
public List<string>? Aliases { get; set; }
= new();
[BsonElement("scopes")]
[BsonIgnoreIfNull]
public List<string>? Scopes { get; set; }
= new();
[BsonElement("relationships")]
[BsonIgnoreIfNull]
public List<AdvisoryObservationRawRelationshipDocument>? Relationships { get; set; }
= new();
[BsonElement("purls")]
[BsonIgnoreIfNull]
public List<string>? PackageUrls { get; set; }
= new();
[BsonElement("cpes")]
[BsonIgnoreIfNull]
public List<string>? Cpes { get; set; }
= new();
[BsonElement("references")]
[BsonIgnoreIfNull]
public List<AdvisoryObservationRawReferenceDocument>? References { get; set; }
= new();
[BsonElement("reconciled_from")]
[BsonIgnoreIfNull]
public List<string>? ReconciledFrom { get; set; }
= new();
[BsonElement("notes")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Notes { get; set; }
= new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationRawReferenceDocument
{
[BsonElement("type")]
[BsonIgnoreIfNull]
public string? Type { get; set; }
= null;
[BsonElement("url")]
public string Url { get; set; } = string.Empty;
[BsonElement("source")]
[BsonIgnoreIfNull]
public string? Source { get; set; }
= null;
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationRawRelationshipDocument
{
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("source")]
public string Source { get; set; } = string.Empty;
[BsonElement("target")]
public string Target { get; set; } = string.Empty;
[BsonElement("provenance")]
[BsonIgnoreIfNull]
public string? Provenance { get; set; }
= null;
}


@@ -1,274 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Nodes;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models.Observations;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal static class AdvisoryObservationDocumentFactory
{
private static readonly JsonWriterSettings JsonSettings = new() { OutputMode = JsonOutputMode.RelaxedExtendedJson };
public static AdvisoryObservation ToModel(AdvisoryObservationDocument document)
{
ArgumentNullException.ThrowIfNull(document);
var rawNode = ParseJsonNode(document.Content.Raw);
var attributes = ToImmutable(document.Attributes);
var contentMetadata = ToImmutable(document.Content.Metadata);
var upstreamMetadata = ToImmutable(document.Upstream.Metadata);
var rawLinkset = ToRawLinkset(document.RawLinkset);
var observation = new AdvisoryObservation(
document.Id,
document.Tenant,
new AdvisoryObservationSource(
document.Source.Vendor,
document.Source.Stream,
document.Source.Api,
document.Source.CollectorVersion),
new AdvisoryObservationUpstream(
document.Upstream.UpstreamId,
document.Upstream.DocumentVersion,
DateTime.SpecifyKind(document.Upstream.FetchedAt, DateTimeKind.Utc),
DateTime.SpecifyKind(document.Upstream.ReceivedAt, DateTimeKind.Utc),
document.Upstream.ContentHash,
new AdvisoryObservationSignature(
document.Upstream.Signature.Present,
document.Upstream.Signature.Format,
document.Upstream.Signature.KeyId,
document.Upstream.Signature.Signature),
upstreamMetadata),
new AdvisoryObservationContent(
document.Content.Format,
document.Content.SpecVersion,
rawNode,
contentMetadata),
new AdvisoryObservationLinkset(
document.Linkset.Aliases ?? Enumerable.Empty<string>(),
document.Linkset.Purls ?? Enumerable.Empty<string>(),
document.Linkset.Cpes ?? Enumerable.Empty<string>(),
document.Linkset.References?.Select(reference => new AdvisoryObservationReference(reference.Type, reference.Url))),
rawLinkset,
DateTime.SpecifyKind(document.CreatedAt, DateTimeKind.Utc),
attributes);
return observation;
}
public static AdvisoryObservationDocument ToDocument(AdvisoryObservation observation)
{
ArgumentNullException.ThrowIfNull(observation);
var contentRaw = observation.Content.Raw?.ToJsonString(new JsonSerializerOptions
{
Encoder = System.Text.Encodings.Web.JavaScriptEncoder.UnsafeRelaxedJsonEscaping
}) ?? "{}";
var document = new AdvisoryObservationDocument
{
Id = observation.ObservationId,
Tenant = observation.Tenant,
Source = new AdvisoryObservationSourceDocument
{
Vendor = observation.Source.Vendor,
Stream = observation.Source.Stream,
Api = observation.Source.Api,
CollectorVersion = observation.Source.CollectorVersion
},
Upstream = new AdvisoryObservationUpstreamDocument
{
UpstreamId = observation.Upstream.UpstreamId,
DocumentVersion = observation.Upstream.DocumentVersion,
FetchedAt = observation.Upstream.FetchedAt.UtcDateTime,
ReceivedAt = observation.Upstream.ReceivedAt.UtcDateTime,
ContentHash = observation.Upstream.ContentHash,
Signature = new AdvisoryObservationSignatureDocument
{
Present = observation.Upstream.Signature.Present,
Format = observation.Upstream.Signature.Format,
KeyId = observation.Upstream.Signature.KeyId,
Signature = observation.Upstream.Signature.Signature
},
Metadata = observation.Upstream.Metadata.ToDictionary(pair => pair.Key, pair => pair.Value, StringComparer.Ordinal)
},
Content = new AdvisoryObservationContentDocument
{
Format = observation.Content.Format,
SpecVersion = observation.Content.SpecVersion,
Raw = BsonDocument.Parse(contentRaw),
Metadata = observation.Content.Metadata.ToDictionary(pair => pair.Key, pair => pair.Value, StringComparer.Ordinal)
},
Linkset = new AdvisoryObservationLinksetDocument
{
Aliases = observation.Linkset.Aliases.ToList(),
Purls = observation.Linkset.Purls.ToList(),
Cpes = observation.Linkset.Cpes.ToList(),
References = observation.Linkset.References.Select(reference => new AdvisoryObservationReferenceDocument
{
Type = reference.Type,
Url = reference.Url
}).ToList()
},
RawLinkset = ToRawLinksetDocument(observation.RawLinkset),
CreatedAt = observation.CreatedAt.UtcDateTime,
Attributes = observation.Attributes.ToDictionary(pair => pair.Key, pair => pair.Value, StringComparer.Ordinal)
};
return document;
}
private static JsonNode ParseJsonNode(BsonDocument raw)
{
if (raw is null || raw.ElementCount == 0)
{
return JsonNode.Parse("{}")!;
}
var json = raw.ToJson(JsonSettings);
return JsonNode.Parse(json)!;
}
private static ImmutableDictionary<string, string> ToImmutable(Dictionary<string, string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
foreach (var pair in values)
{
if (string.IsNullOrWhiteSpace(pair.Key) || pair.Value is null)
{
continue;
}
builder[pair.Key.Trim()] = pair.Value;
}
return builder.ToImmutable();
}
private static RawLinkset ToRawLinkset(AdvisoryObservationRawLinksetDocument? document)
{
if (document is null)
{
return new RawLinkset();
}
static ImmutableArray<string> ToImmutableStringArray(List<string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableArray<string>.Empty;
}
return values
.Select(static value => value ?? string.Empty)
.ToImmutableArray();
}
static ImmutableArray<RawRelationship> ToImmutableRelationships(List<AdvisoryObservationRawRelationshipDocument>? relationships)
{
if (relationships is null || relationships.Count == 0)
{
return ImmutableArray<RawRelationship>.Empty;
}
return relationships
.Select(static relationship => new RawRelationship(
relationship.Type ?? string.Empty,
relationship.Source ?? string.Empty,
relationship.Target ?? string.Empty,
relationship.Provenance))
.ToImmutableArray();
}
static ImmutableArray<RawReference> ToImmutableReferences(List<AdvisoryObservationRawReferenceDocument>? references)
{
if (references is null || references.Count == 0)
{
return ImmutableArray<RawReference>.Empty;
}
return references
.Select(static reference => new RawReference(
reference.Type ?? string.Empty,
reference.Url,
reference.Source))
.ToImmutableArray();
}
static ImmutableDictionary<string, string> ToImmutableDictionary(Dictionary<string, string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
foreach (var pair in values)
{
if (pair.Key is null)
{
continue;
}
builder[pair.Key] = pair.Value;
}
return builder.ToImmutable();
}
return new RawLinkset
{
Aliases = ToImmutableStringArray(document.Aliases),
Scopes = ToImmutableStringArray(document.Scopes),
Relationships = ToImmutableRelationships(document.Relationships),
PackageUrls = ToImmutableStringArray(document.PackageUrls),
Cpes = ToImmutableStringArray(document.Cpes),
References = ToImmutableReferences(document.References),
ReconciledFrom = ToImmutableStringArray(document.ReconciledFrom),
Notes = ToImmutableDictionary(document.Notes)
};
}
private static AdvisoryObservationRawLinksetDocument ToRawLinksetDocument(RawLinkset rawLinkset)
{
return new AdvisoryObservationRawLinksetDocument
{
Aliases = rawLinkset.Aliases.IsDefault ? new List<string>() : rawLinkset.Aliases.ToList(),
Scopes = rawLinkset.Scopes.IsDefault ? new List<string>() : rawLinkset.Scopes.ToList(),
Relationships = rawLinkset.Relationships.IsDefault
? new List<AdvisoryObservationRawRelationshipDocument>()
: rawLinkset.Relationships.Select(relationship => new AdvisoryObservationRawRelationshipDocument
{
Type = relationship.Type,
Source = relationship.Source,
Target = relationship.Target,
Provenance = relationship.Provenance
}).ToList(),
PackageUrls = rawLinkset.PackageUrls.IsDefault ? new List<string>() : rawLinkset.PackageUrls.ToList(),
Cpes = rawLinkset.Cpes.IsDefault ? new List<string>() : rawLinkset.Cpes.ToList(),
References = rawLinkset.References.IsDefault
? new List<AdvisoryObservationRawReferenceDocument>()
: rawLinkset.References.Select(reference => new AdvisoryObservationRawReferenceDocument
{
Type = reference.Type,
Url = reference.Url,
Source = reference.Source
}).ToList(),
ReconciledFrom = rawLinkset.ReconciledFrom.IsDefault ? new List<string>() : rawLinkset.ReconciledFrom.ToList(),
Notes = rawLinkset.Notes.Count == 0 ? new Dictionary<string, string>(StringComparer.Ordinal)
: rawLinkset.Notes.ToDictionary(pair => pair.Key, pair => pair.Value, StringComparer.Ordinal)
};
}
}


@@ -1,82 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
public sealed class AdvisoryObservationEventDocument
{
[BsonId]
public Guid Id { get; set; }
[BsonElement("tenantId")]
public string TenantId { get; set; } = string.Empty;
[BsonElement("observationId")]
public string ObservationId { get; set; } = string.Empty;
[BsonElement("advisoryId")]
public string AdvisoryId { get; set; } = string.Empty;
[BsonElement("source")]
public AdvisoryObservationSourceDocument Source { get; set; } = new();
[BsonElement("linksetSummary")]
public AdvisoryObservationLinksetSummaryDocument LinksetSummary { get; set; } = new();
[BsonElement("documentSha")]
public string DocumentSha { get; set; } = string.Empty;
[BsonElement("observationHash")]
public string ObservationHash { get; set; } = string.Empty;
[BsonElement("ingestedAt")]
public DateTime IngestedAt { get; set; }
= DateTime.SpecifyKind(DateTime.UtcNow, DateTimeKind.Utc);
[BsonElement("replayCursor")]
public string ReplayCursor { get; set; } = string.Empty;
[BsonElement("supersedesId")]
public string? SupersedesId { get; set; }
[BsonElement("traceId")]
public string? TraceId { get; set; }
[BsonElement("publishedAt")]
public DateTime? PublishedAt { get; set; }
}
public sealed class AdvisoryObservationLinksetSummaryDocument
{
[BsonElement("aliases")]
public List<string> Aliases { get; set; } = new();
[BsonElement("purls")]
public List<string> Purls { get; set; } = new();
[BsonElement("cpes")]
public List<string> Cpes { get; set; } = new();
[BsonElement("scopes")]
public List<string> Scopes { get; set; } = new();
[BsonElement("relationships")]
public List<AdvisoryObservationRelationshipDocument> Relationships { get; set; } = new();
}
public sealed class AdvisoryObservationRelationshipDocument
{
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("source")]
public string Source { get; set; } = string.Empty;
[BsonElement("target")]
public string Target { get; set; } = string.Empty;
[BsonElement("provenance")]
public string? Provenance { get; set; }
}


@@ -1,60 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class AdvisoryObservationLookup : IAdvisoryObservationLookup
{
private readonly IAdvisoryObservationStore _store;
public AdvisoryObservationLookup(IAdvisoryObservationStore store)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
}
public ValueTask<IReadOnlyList<AdvisoryObservation>> ListByTenantAsync(
string tenant,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
cancellationToken.ThrowIfCancellationRequested();
return new ValueTask<IReadOnlyList<AdvisoryObservation>>(
_store.ListByTenantAsync(tenant, cancellationToken));
}
public ValueTask<IReadOnlyList<AdvisoryObservation>> FindByFiltersAsync(
string tenant,
IReadOnlyCollection<string> observationIds,
IReadOnlyCollection<string> aliases,
IReadOnlyCollection<string> purls,
IReadOnlyCollection<string> cpes,
AdvisoryObservationCursor? cursor,
int limit,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
ArgumentNullException.ThrowIfNull(observationIds);
ArgumentNullException.ThrowIfNull(aliases);
ArgumentNullException.ThrowIfNull(purls);
ArgumentNullException.ThrowIfNull(cpes);
if (limit <= 0)
{
throw new ArgumentOutOfRangeException(nameof(limit), "Limit must be greater than zero.");
}
cancellationToken.ThrowIfCancellationRequested();
return new ValueTask<IReadOnlyList<AdvisoryObservation>>(
_store.FindByFiltersAsync(
tenant,
observationIds,
aliases,
purls,
cpes,
cursor,
limit,
cancellationToken));
}
}


@@ -1,46 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class AdvisoryObservationSink : IAdvisoryObservationSink
{
private readonly IAdvisoryObservationStore _store;
private readonly IAdvisoryObservationEventPublisher _publisher;
private readonly TimeProvider _timeProvider;
public AdvisoryObservationSink(
IAdvisoryObservationStore store,
IAdvisoryObservationEventPublisher publisher,
TimeProvider? timeProvider = null)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
_publisher = publisher ?? throw new ArgumentNullException(nameof(publisher));
_timeProvider = timeProvider ?? TimeProvider.System;
}
public Task UpsertAsync(AdvisoryObservation observation, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(observation);
return UpsertAndPublishAsync(observation, cancellationToken);
}
private async Task UpsertAndPublishAsync(AdvisoryObservation observation, CancellationToken cancellationToken)
{
await _store.UpsertAsync(observation, cancellationToken).ConfigureAwait(false);
var evt = AdvisoryObservationUpdatedEvent.FromObservation(
observation,
supersedesId: observation.Attributes.GetValueOrDefault("supersedesId")
?? observation.Attributes.GetValueOrDefault("supersedes"),
traceId: observation.Attributes.GetValueOrDefault("traceId"),
replayCursor: _timeProvider.GetUtcNow().Ticks.ToString());
await _publisher.PublishAsync(evt, cancellationToken).ConfigureAwait(false);
}
}


@@ -1,201 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class AdvisoryObservationStore : IAdvisoryObservationStore
{
private readonly IMongoCollection<AdvisoryObservationDocument> collection;
public AdvisoryObservationStore(IMongoCollection<AdvisoryObservationDocument> collection)
{
this.collection = collection ?? throw new ArgumentNullException(nameof(collection));
}
public async Task<IReadOnlyList<AdvisoryObservation>> ListByTenantAsync(string tenant, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
var filter = Builders<AdvisoryObservationDocument>.Filter.Eq(document => document.Tenant, tenant.ToLowerInvariant());
var documents = await collection
.Find(filter)
.SortByDescending(document => document.CreatedAt)
.ThenBy(document => document.Id)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(AdvisoryObservationDocumentFactory.ToModel).ToArray();
}
public async Task<IReadOnlyList<AdvisoryObservation>> FindByFiltersAsync(
string tenant,
IEnumerable<string>? observationIds,
IEnumerable<string>? aliases,
IEnumerable<string>? purls,
IEnumerable<string>? cpes,
AdvisoryObservationCursor? cursor,
int limit,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
if (limit <= 0)
{
throw new ArgumentOutOfRangeException(nameof(limit), "Limit must be greater than zero.");
}
cancellationToken.ThrowIfCancellationRequested();
var normalizedTenant = tenant.ToLowerInvariant();
var normalizedObservationIds = NormalizeValues(observationIds, static value => value);
var normalizedAliases = NormalizeAliasFilters(aliases);
var normalizedPurls = NormalizeValues(purls, static value => value);
var normalizedCpes = NormalizeValues(cpes, static value => value);
var builder = Builders<AdvisoryObservationDocument>.Filter;
var filters = new List<FilterDefinition<AdvisoryObservationDocument>>
{
builder.Eq(document => document.Tenant, normalizedTenant)
};
if (normalizedObservationIds.Length > 0)
{
filters.Add(builder.In(document => document.Id, normalizedObservationIds));
}
if (normalizedAliases.Length > 0)
{
var aliasFilters = normalizedAliases
.Select(alias => CreateAliasFilter(builder, alias))
.ToList();
filters.Add(builder.Or(aliasFilters));
}
if (normalizedPurls.Length > 0)
{
filters.Add(builder.In("linkset.purls", normalizedPurls));
}
if (normalizedCpes.Length > 0)
{
filters.Add(builder.In("linkset.cpes", normalizedCpes));
}
if (cursor.HasValue)
{
var createdAtUtc = cursor.Value.CreatedAt.UtcDateTime;
var observationId = cursor.Value.ObservationId;
var createdBefore = builder.Lt(document => document.CreatedAt, createdAtUtc);
var sameCreatedNextId = builder.And(
builder.Eq(document => document.CreatedAt, createdAtUtc),
builder.Gt(document => document.Id, observationId));
filters.Add(builder.Or(createdBefore, sameCreatedNextId));
}
var filter = builder.And(filters);
var documents = await collection
.Find(filter)
.SortByDescending(document => document.CreatedAt)
.ThenBy(document => document.Id)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(AdvisoryObservationDocumentFactory.ToModel).ToArray();
}
public async Task UpsertAsync(AdvisoryObservation observation, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(observation);
cancellationToken.ThrowIfCancellationRequested();
var document = AdvisoryObservationDocumentFactory.ToDocument(observation);
var filter = Builders<AdvisoryObservationDocument>.Filter.Eq(d => d.Id, document.Id);
var options = new ReplaceOptions { IsUpsert = true };
await collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
private static string[] NormalizeValues(IEnumerable<string>? values, Func<string, string> projector)
{
if (values is null)
{
return Array.Empty<string>();
}
var set = new HashSet<string>(StringComparer.Ordinal);
foreach (var value in values)
{
if (string.IsNullOrWhiteSpace(value))
{
continue;
}
var projected = projector(value.Trim());
if (!string.IsNullOrEmpty(projected))
{
set.Add(projected);
}
}
if (set.Count == 0)
{
return Array.Empty<string>();
}
return set.ToArray();
}
private static string[] NormalizeAliasFilters(IEnumerable<string>? aliases)
{
if (aliases is null)
{
return Array.Empty<string>();
}
var set = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var list = new List<string>();
foreach (var alias in aliases)
{
if (alias is null)
{
continue;
}
var trimmed = alias.Trim();
if (trimmed.Length == 0)
{
continue;
}
if (set.Add(trimmed))
{
list.Add(trimmed);
}
}
return list.Count == 0 ? Array.Empty<string>() : list.ToArray();
}
private static FilterDefinition<AdvisoryObservationDocument> CreateAliasFilter(
FilterDefinitionBuilder<AdvisoryObservationDocument> builder,
string alias)
{
var escaped = Regex.Escape(alias);
var regex = new BsonRegularExpression($"^{escaped}$", "i");
return builder.Or(
builder.Regex("rawLinkset.aliases", regex),
builder.Regex("linkset.aliases", regex));
}
}


@@ -1,66 +0,0 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class AdvisoryObservationTransportWorker : BackgroundService
{
private readonly IAdvisoryObservationEventOutbox _outbox;
private readonly IAdvisoryObservationEventTransport _transport;
private readonly ILogger<AdvisoryObservationTransportWorker> _logger;
private readonly AdvisoryObservationEventPublisherOptions _options;
public AdvisoryObservationTransportWorker(
IAdvisoryObservationEventOutbox outbox,
IAdvisoryObservationEventTransport transport,
IOptions<AdvisoryObservationEventPublisherOptions> options,
ILogger<AdvisoryObservationTransportWorker> logger)
{
_outbox = outbox ?? throw new ArgumentNullException(nameof(outbox));
_transport = transport ?? throw new ArgumentNullException(nameof(transport));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_options = options.Value;
}
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
if (!_options.Enabled)
{
_logger.LogInformation("Observation transport worker disabled.");
return;
}
while (!stoppingToken.IsCancellationRequested)
{
try
{
var batch = await _outbox.DequeueAsync(25, stoppingToken).ConfigureAwait(false);
if (batch.Count == 0)
{
await Task.Delay(TimeSpan.FromSeconds(2), stoppingToken).ConfigureAwait(false);
continue;
}
foreach (var evt in batch)
{
await _transport.SendAsync(evt, stoppingToken).ConfigureAwait(false);
await _outbox.MarkPublishedAsync(evt.EventId, DateTimeOffset.UtcNow, stoppingToken).ConfigureAwait(false);
}
}
catch (OperationCanceledException)
{
break;
}
catch (Exception ex)
{
_logger.LogWarning(ex, "Observation transport worker error; retrying");
await Task.Delay(TimeSpan.FromSeconds(5), stoppingToken).ConfigureAwait(false);
}
}
}
}


@@ -1,23 +0,0 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
public interface IAdvisoryObservationStore
{
Task<IReadOnlyList<AdvisoryObservation>> ListByTenantAsync(string tenant, CancellationToken cancellationToken);
Task<IReadOnlyList<AdvisoryObservation>> FindByFiltersAsync(
string tenant,
IEnumerable<string>? observationIds,
IEnumerable<string>? aliases,
IEnumerable<string>? purls,
IEnumerable<string>? cpes,
AdvisoryObservationCursor? cursor,
int limit,
CancellationToken cancellationToken);
Task UpsertAsync(AdvisoryObservation observation, CancellationToken cancellationToken);
}


@@ -1,75 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class MongoAdvisoryObservationEventOutbox : IAdvisoryObservationEventOutbox
{
private readonly IMongoCollection<AdvisoryObservationEventDocument> _collection;
public MongoAdvisoryObservationEventOutbox(IMongoCollection<AdvisoryObservationEventDocument> collection)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
}
public async Task<IReadOnlyCollection<AdvisoryObservationUpdatedEvent>> DequeueAsync(int take, CancellationToken cancellationToken)
{
if (take <= 0)
{
return Array.Empty<AdvisoryObservationUpdatedEvent>();
}
var filter = Builders<AdvisoryObservationEventDocument>.Filter.Eq(doc => doc.PublishedAt, null);
var documents = await _collection
.Find(filter)
// Oldest unpublished events first so subscribers observe ingestion order.
.SortBy(doc => doc.IngestedAt)
.Limit(take)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(ToDomain).ToArray();
}
public Task MarkPublishedAsync(Guid eventId, DateTimeOffset publishedAt, CancellationToken cancellationToken)
{
var update = Builders<AdvisoryObservationEventDocument>.Update.Set(doc => doc.PublishedAt, publishedAt.UtcDateTime);
return _collection.UpdateOneAsync(
Builders<AdvisoryObservationEventDocument>.Filter.Eq(doc => doc.Id, eventId),
update,
cancellationToken: cancellationToken);
}
private static AdvisoryObservationUpdatedEvent ToDomain(AdvisoryObservationEventDocument doc)
{
return new AdvisoryObservationUpdatedEvent(
doc.Id,
doc.TenantId,
doc.ObservationId,
doc.AdvisoryId,
new Models.Observations.AdvisoryObservationSource(
doc.Source.Vendor,
doc.Source.Stream,
doc.Source.Api,
doc.Source.CollectorVersion),
new AdvisoryObservationLinksetSummary(
doc.LinksetSummary.Aliases.ToImmutableArray(),
doc.LinksetSummary.Purls.ToImmutableArray(),
doc.LinksetSummary.Cpes.ToImmutableArray(),
doc.LinksetSummary.Scopes.ToImmutableArray(),
doc.LinksetSummary.Relationships
.Select(rel => new AdvisoryObservationRelationshipSummary(rel.Type, rel.Source, rel.Target, rel.Provenance))
.ToImmutableArray()),
doc.DocumentSha,
doc.ObservationHash,
doc.IngestedAt,
doc.ReplayCursor,
doc.SupersedesId,
doc.TraceId);
}
}


@@ -1,62 +0,0 @@
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class MongoAdvisoryObservationEventPublisher : IAdvisoryObservationEventPublisher
{
private readonly IMongoCollection<AdvisoryObservationEventDocument> _collection;
public MongoAdvisoryObservationEventPublisher(IMongoCollection<AdvisoryObservationEventDocument> collection)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
}
public Task PublishAsync(AdvisoryObservationUpdatedEvent @event, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(@event);
var document = new AdvisoryObservationEventDocument
{
Id = @event.EventId,
TenantId = @event.TenantId,
ObservationId = @event.ObservationId,
AdvisoryId = @event.AdvisoryId,
DocumentSha = @event.DocumentSha,
ObservationHash = @event.ObservationHash,
IngestedAt = @event.IngestedAt.UtcDateTime,
ReplayCursor = @event.ReplayCursor,
SupersedesId = @event.SupersedesId,
TraceId = @event.TraceId,
Source = new AdvisoryObservationSourceDocument
{
Vendor = @event.Source.Vendor,
Stream = @event.Source.Stream,
Api = @event.Source.Api,
CollectorVersion = @event.Source.CollectorVersion
},
LinksetSummary = new AdvisoryObservationLinksetSummaryDocument
{
Aliases = @event.LinksetSummary.Aliases.ToList(),
Purls = @event.LinksetSummary.Purls.ToList(),
Cpes = @event.LinksetSummary.Cpes.ToList(),
Scopes = @event.LinksetSummary.Scopes.ToList(),
Relationships = @event.LinksetSummary.Relationships
.Select(static rel => new AdvisoryObservationRelationshipDocument
{
Type = rel.Type,
Source = rel.Source,
Target = rel.Target,
Provenance = rel.Provenance
})
.ToList()
}
};
return _collection.InsertOneAsync(document, cancellationToken: cancellationToken);
}
}


@@ -1,65 +0,0 @@
using System;
using System.Text;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using NATS.Client.Core;
using NATS.Client.JetStream;
using NATS.Client.JetStream.Models;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class NatsAdvisoryObservationEventPublisher : IAdvisoryObservationEventTransport
{
private readonly ILogger<NatsAdvisoryObservationEventPublisher> _logger;
private readonly AdvisoryObservationEventPublisherOptions _options;
public NatsAdvisoryObservationEventPublisher(
IOptions<AdvisoryObservationEventPublisherOptions> options,
ILogger<NatsAdvisoryObservationEventPublisher> logger)
{
ArgumentNullException.ThrowIfNull(options);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_options = options.Value;
}
public async Task SendAsync(AdvisoryObservationUpdatedEvent @event, CancellationToken cancellationToken)
{
if (!_options.Enabled)
{
return;
}
var subject = _options.Subject;
var payload = JsonSerializer.SerializeToUtf8Bytes(@event);
var opts = new NatsOpts { Url = _options.NatsUrl ?? "nats://127.0.0.1:4222" };
// Opens a fresh connection per publish: simple and stateless, at the cost of
// connection setup on every event. A shared connection would suit higher volume.
await using var connection = new NatsConnection(opts);
var js = new NatsJSContext(connection);
await EnsureStreamAsync(js, cancellationToken).ConfigureAwait(false);
await js.PublishAsync(subject, payload, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Published advisory.observation.updated@1 to NATS subject {Subject} for observation {ObservationId}", subject, @event.ObservationId);
}
private async Task EnsureStreamAsync(NatsJSContext js, CancellationToken cancellationToken)
{
var stream = _options.Stream;
try
{
await js.GetStreamAsync(stream, cancellationToken: cancellationToken).ConfigureAwait(false);
}
catch (NatsJSApiException ex) when (ex.Error?.Code == 404)
{
var cfg = new StreamConfig(stream, new[] { _options.Subject })
{
Description = "Concelier advisory observation events",
MaxMsgSize = 512 * 1024,
};
await js.CreateStreamAsync(cfg, cancellationToken).ConfigureAwait(false);
}
}
}
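The publisher reads `Enabled`, `Subject`, `NatsUrl`, and `Stream` from its options; the options class is not part of this diff, but a minimal sketch consistent with that usage looks like the following (defaults here are illustrative assumptions, not the project's real configuration):

```csharp
// Hypothetical sketch: property names come from the publisher's usage above;
// the defaults are placeholders.
public sealed class AdvisoryObservationEventPublisherOptions
{
    public bool Enabled { get; set; }
    public string Subject { get; set; } = string.Empty;   // JetStream subject to publish to
    public string? NatsUrl { get; set; }                  // falls back to nats://127.0.0.1:4222
    public string Stream { get; set; } = string.Empty;    // stream ensured by EnsureStreamAsync
}
```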


@@ -1,15 +0,0 @@
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class NullAdvisoryObservationEventTransport : IAdvisoryObservationEventTransport
{
public static readonly NullAdvisoryObservationEventTransport Instance = new();
private NullAdvisoryObservationEventTransport() { }
public Task SendAsync(AdvisoryObservationUpdatedEvent @event, CancellationToken cancellationToken)
=> Task.CompletedTask;
}


@@ -1,167 +0,0 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Observations.V1;
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationV1Document
{
[BsonId]
public ObjectId Id { get; set; }
[BsonElement("tenantId")]
public string TenantId { get; set; } = string.Empty;
[BsonElement("source")]
public string Source { get; set; } = string.Empty;
[BsonElement("advisoryId")]
public string AdvisoryId { get; set; } = string.Empty;
[BsonElement("title")]
[BsonIgnoreIfNull]
public string? Title { get; set; }
[BsonElement("summary")]
[BsonIgnoreIfNull]
public string? Summary { get; set; }
[BsonElement("severities")]
[BsonIgnoreIfNull]
public List<ObservationSeverityDocument>? Severities { get; set; }
[BsonElement("affected")]
[BsonIgnoreIfNull]
public List<ObservationAffectedDocument>? Affected { get; set; }
[BsonElement("references")]
[BsonIgnoreIfNull]
public List<string>? References { get; set; }
[BsonElement("weaknesses")]
[BsonIgnoreIfNull]
public List<string>? Weaknesses { get; set; }
[BsonElement("published")]
[BsonDateTimeOptions(Kind = DateTimeKind.Utc)]
[BsonIgnoreIfNull]
public DateTime? Published { get; set; }
[BsonElement("modified")]
[BsonDateTimeOptions(Kind = DateTimeKind.Utc)]
[BsonIgnoreIfNull]
public DateTime? Modified { get; set; }
[BsonElement("provenance")]
public ObservationProvenanceDocument Provenance { get; set; } = new();
[BsonElement("ingestedAt")]
[BsonDateTimeOptions(Kind = DateTimeKind.Utc)]
public DateTime IngestedAt { get; set; }
[BsonElement("supersedesId")]
[BsonIgnoreIfNull]
public ObjectId? SupersedesId { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class ObservationSeverityDocument
{
[BsonElement("system")]
public string System { get; set; } = string.Empty;
[BsonElement("score")]
public double Score { get; set; }
[BsonElement("vector")]
[BsonIgnoreIfNull]
public string? Vector { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class ObservationAffectedDocument
{
[BsonElement("purl")]
public string Purl { get; set; } = string.Empty;
[BsonElement("package")]
[BsonIgnoreIfNull]
public string? Package { get; set; }
[BsonElement("versions")]
[BsonIgnoreIfNull]
public List<string>? Versions { get; set; }
[BsonElement("ranges")]
[BsonIgnoreIfNull]
public List<ObservationVersionRangeDocument>? Ranges { get; set; }
[BsonElement("ecosystem")]
[BsonIgnoreIfNull]
public string? Ecosystem { get; set; }
[BsonElement("cpes")]
[BsonIgnoreIfNull]
public List<string>? Cpes { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class ObservationVersionRangeDocument
{
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("events")]
[BsonIgnoreIfNull]
public List<ObservationRangeEventDocument>? Events { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class ObservationRangeEventDocument
{
[BsonElement("event")]
public string Event { get; set; } = string.Empty;
[BsonElement("value")]
public string Value { get; set; } = string.Empty;
}
[BsonIgnoreExtraElements]
public sealed class ObservationProvenanceDocument
{
[BsonElement("sourceArtifactSha")]
public string SourceArtifactSha { get; set; } = string.Empty;
[BsonElement("fetchedAt")]
[BsonDateTimeOptions(Kind = DateTimeKind.Utc)]
public DateTime FetchedAt { get; set; }
[BsonElement("ingestJobId")]
[BsonIgnoreIfNull]
public string? IngestJobId { get; set; }
[BsonElement("signature")]
[BsonIgnoreIfNull]
public ObservationSignatureDocument? Signature { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class ObservationSignatureDocument
{
[BsonElement("present")]
public bool Present { get; set; }
[BsonElement("format")]
[BsonIgnoreIfNull]
public string? Format { get; set; }
[BsonElement("keyId")]
[BsonIgnoreIfNull]
public string? KeyId { get; set; }
[BsonElement("signature")]
[BsonIgnoreIfNull]
public string? Signature { get; set; }
}


@@ -1,178 +0,0 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using MongoDB.Bson;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations.V1;
internal static class AdvisoryObservationV1DocumentFactory
{
public static AdvisoryObservationV1 ToModel(AdvisoryObservationV1Document document)
{
ArgumentNullException.ThrowIfNull(document);
var severities = document.Severities is null
? ImmutableArray<ObservationSeverity>.Empty
: document.Severities.Select(s => new ObservationSeverity(s.System, s.Score, s.Vector)).ToImmutableArray();
var affected = document.Affected is null
? ImmutableArray<ObservationAffected>.Empty
: document.Affected.Select(ToAffected).ToImmutableArray();
var references = ToImmutableStrings(document.References);
var weaknesses = ToImmutableStrings(document.Weaknesses);
var provenanceDoc = document.Provenance ?? throw new ArgumentNullException(nameof(document.Provenance));
var signatureDoc = provenanceDoc.Signature;
var signature = signatureDoc is null
? null
: new ObservationSignature(signatureDoc.Present, signatureDoc.Format, signatureDoc.KeyId, signatureDoc.Signature);
var provenance = new ObservationProvenance(
provenanceDoc.SourceArtifactSha,
DateTime.SpecifyKind(provenanceDoc.FetchedAt, DateTimeKind.Utc),
provenanceDoc.IngestJobId,
signature);
return new AdvisoryObservationV1(
document.Id.ToString(),
document.TenantId,
document.Source,
document.AdvisoryId,
document.Title,
document.Summary,
severities,
affected,
references,
weaknesses,
document.Published.HasValue ? DateTime.SpecifyKind(document.Published.Value, DateTimeKind.Utc) : null,
document.Modified.HasValue ? DateTime.SpecifyKind(document.Modified.Value, DateTimeKind.Utc) : null,
provenance,
DateTime.SpecifyKind(document.IngestedAt, DateTimeKind.Utc),
document.SupersedesId?.ToString());
}
public static AdvisoryObservationV1Document FromModel(AdvisoryObservationV1 model)
{
ArgumentNullException.ThrowIfNull(model);
var document = new AdvisoryObservationV1Document
{
Id = ObjectId.Parse(model.ObservationId),
TenantId = model.Tenant,
Source = model.Source,
AdvisoryId = model.AdvisoryId,
Title = model.Title,
Summary = model.Summary,
Severities = model.Severities
.Select(severity => new ObservationSeverityDocument
{
System = severity.System,
Score = severity.Score,
Vector = severity.Vector
})
.ToList(),
Affected = model.Affected.Select(ToDocument).ToList(),
References = model.References.IsDefaultOrEmpty ? null : model.References.ToList(),
Weaknesses = model.Weaknesses.IsDefaultOrEmpty ? null : model.Weaknesses.ToList(),
Published = model.Published?.UtcDateTime,
Modified = model.Modified?.UtcDateTime,
SupersedesId = string.IsNullOrWhiteSpace(model.SupersedesObservationId)
? null
: ObjectId.Parse(model.SupersedesObservationId!),
IngestedAt = model.IngestedAt.UtcDateTime,
Provenance = new ObservationProvenanceDocument
{
SourceArtifactSha = model.Provenance.SourceArtifactSha,
FetchedAt = model.Provenance.FetchedAt.UtcDateTime,
IngestJobId = model.Provenance.IngestJobId,
Signature = model.Provenance.Signature is null
? null
: new ObservationSignatureDocument
{
Present = model.Provenance.Signature.Present,
Format = model.Provenance.Signature.Format,
KeyId = model.Provenance.Signature.KeyId,
Signature = model.Provenance.Signature.SignatureValue
}
}
};
return document;
}
private static ImmutableArray<string> ToImmutableStrings(IEnumerable<string>? values)
{
if (values is null)
{
return ImmutableArray<string>.Empty;
}
var builder = ImmutableArray.CreateBuilder<string>();
foreach (var value in values)
{
if (string.IsNullOrWhiteSpace(value))
{
continue;
}
builder.Add(value.Trim());
}
return builder.Count == 0 ? ImmutableArray<string>.Empty : builder.ToImmutable();
}
private static ObservationAffected ToAffected(ObservationAffectedDocument document)
{
var ranges = document.Ranges is null
? ImmutableArray<ObservationVersionRange>.Empty
: document.Ranges.Select(ToRange).ToImmutableArray();
return new ObservationAffected(
document.Purl,
document.Package,
ToImmutableStrings(document.Versions),
ranges,
document.Ecosystem,
ToImmutableStrings(document.Cpes));
}
private static ObservationVersionRange ToRange(ObservationVersionRangeDocument document)
{
var events = document.Events is null
? ImmutableArray<ObservationRangeEvent>.Empty
: document.Events.Select(evt => new ObservationRangeEvent(evt.Event, evt.Value)).ToImmutableArray();
return new ObservationVersionRange(document.Type, events);
}
private static ObservationAffectedDocument ToDocument(ObservationAffected model)
{
return new ObservationAffectedDocument
{
Purl = model.Purl,
Package = model.Package,
Versions = model.Versions.IsDefaultOrEmpty ? null : model.Versions.ToList(),
Ranges = model.Ranges.IsDefaultOrEmpty ? null : model.Ranges.Select(ToDocument).ToList(),
Ecosystem = model.Ecosystem,
Cpes = model.Cpes.IsDefaultOrEmpty ? null : model.Cpes.ToList()
};
}
private static ObservationVersionRangeDocument ToDocument(ObservationVersionRange model)
{
return new ObservationVersionRangeDocument
{
Type = model.Type,
Events = model.Events.IsDefaultOrEmpty
? null
: model.Events.Select(evt => new ObservationRangeEventDocument
{
Event = evt.Event,
Value = evt.Value
}).ToList()
};
}
}


@@ -1,26 +0,0 @@
using System;
using System.Security.Cryptography;
using System.Text;
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Observations.V1;
internal static class ObservationIdBuilder
{
public static ObjectId Create(string tenant, string source, string advisoryId, string sourceArtifactSha)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
ArgumentException.ThrowIfNullOrWhiteSpace(source);
ArgumentException.ThrowIfNullOrWhiteSpace(advisoryId);
ArgumentException.ThrowIfNullOrWhiteSpace(sourceArtifactSha);
var material = $"{tenant.Trim().ToLowerInvariant()}|{source.Trim().ToLowerInvariant()}|{advisoryId.Trim()}|{sourceArtifactSha.Trim()}";
var hash = SHA256.HashData(Encoding.UTF8.GetBytes(material));
Span<byte> objectIdBytes = stackalloc byte[12];
hash.AsSpan(0, 12).CopyTo(objectIdBytes);
// ObjectId requires a byte[]; copy the stackalloc span into a managed array.
return new ObjectId(objectIdBytes.ToArray());
}
}
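Because the id is the first 12 bytes of a SHA-256 over normalized inputs, re-ingesting the same artifact deterministically maps to the same `ObjectId`, which makes upserts idempotent. A usage sketch (values are illustrative):

```csharp
// Tenant and source are trimmed and lower-cased before hashing, so the two
// calls below produce identical ObjectIds; advisoryId and the SHA are not folded.
var a = ObservationIdBuilder.Create("Tenant-A", "GHSA", "GHSA-1234-abcd", "0f1e2d3c");
var b = ObservationIdBuilder.Create(" tenant-a ", "ghsa", "GHSA-1234-abcd", "0f1e2d3c");
System.Diagnostics.Debug.Assert(a == b);
```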


@@ -1,19 +0,0 @@
namespace StellaOps.Concelier.Storage.Mongo.Orchestrator;
public interface IOrchestratorRegistryStore
{
Task UpsertAsync(OrchestratorRegistryRecord record, CancellationToken cancellationToken);
Task<OrchestratorRegistryRecord?> GetAsync(string tenant, string connectorId, CancellationToken cancellationToken);
Task EnqueueCommandAsync(OrchestratorCommandRecord command, CancellationToken cancellationToken);
Task<IReadOnlyList<OrchestratorCommandRecord>> GetPendingCommandsAsync(
string tenant,
string connectorId,
Guid runId,
long? afterSequence,
CancellationToken cancellationToken);
Task AppendHeartbeatAsync(OrchestratorHeartbeatRecord heartbeat, CancellationToken cancellationToken);
}


@@ -1,94 +0,0 @@
using System;
using System.Linq;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Orchestrator;
public sealed class MongoOrchestratorRegistryStore : IOrchestratorRegistryStore
{
private readonly IMongoCollection<OrchestratorRegistryDocument> _registry;
private readonly IMongoCollection<OrchestratorCommandDocument> _commands;
private readonly IMongoCollection<OrchestratorHeartbeatDocument> _heartbeats;
public MongoOrchestratorRegistryStore(
IMongoCollection<OrchestratorRegistryDocument> registry,
IMongoCollection<OrchestratorCommandDocument> commands,
IMongoCollection<OrchestratorHeartbeatDocument> heartbeats)
{
_registry = registry ?? throw new ArgumentNullException(nameof(registry));
_commands = commands ?? throw new ArgumentNullException(nameof(commands));
_heartbeats = heartbeats ?? throw new ArgumentNullException(nameof(heartbeats));
}
public async Task UpsertAsync(OrchestratorRegistryRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = OrchestratorRegistryDocumentExtensions.FromRecord(record);
var filter = Builders<OrchestratorRegistryDocument>.Filter.And(
Builders<OrchestratorRegistryDocument>.Filter.Eq(x => x.Tenant, record.Tenant),
Builders<OrchestratorRegistryDocument>.Filter.Eq(x => x.ConnectorId, record.ConnectorId));
var options = new ReplaceOptions { IsUpsert = true };
await _registry.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
public async Task<OrchestratorRegistryRecord?> GetAsync(string tenant, string connectorId, CancellationToken cancellationToken)
{
var filter = Builders<OrchestratorRegistryDocument>.Filter.And(
Builders<OrchestratorRegistryDocument>.Filter.Eq(x => x.Tenant, tenant),
Builders<OrchestratorRegistryDocument>.Filter.Eq(x => x.ConnectorId, connectorId));
var document = await _registry
.Find(filter)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
return document?.ToRecord();
}
public async Task EnqueueCommandAsync(OrchestratorCommandRecord command, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(command);
var document = OrchestratorCommandDocumentExtensions.FromRecord(command);
await _commands.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
}
public async Task<IReadOnlyList<OrchestratorCommandRecord>> GetPendingCommandsAsync(
string tenant,
string connectorId,
Guid runId,
long? afterSequence,
CancellationToken cancellationToken)
{
var filter = Builders<OrchestratorCommandDocument>.Filter.And(
Builders<OrchestratorCommandDocument>.Filter.Eq(x => x.Tenant, tenant),
Builders<OrchestratorCommandDocument>.Filter.Eq(x => x.ConnectorId, connectorId),
Builders<OrchestratorCommandDocument>.Filter.Eq(x => x.RunId, runId));
if (afterSequence.HasValue)
{
filter &= Builders<OrchestratorCommandDocument>.Filter.Gt(x => x.Sequence, afterSequence.Value);
}
var results = await _commands
.Find(filter)
.Sort(Builders<OrchestratorCommandDocument>.Sort.Ascending(x => x.Sequence))
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return results
.Select(static c => c.ToRecord())
.ToArray();
}
public async Task AppendHeartbeatAsync(OrchestratorHeartbeatRecord heartbeat, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(heartbeat);
var document = OrchestratorHeartbeatDocumentExtensions.FromRecord(heartbeat);
await _heartbeats.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
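Callers would typically drain pending commands with a monotonically increasing sequence cursor so each poll only sees commands enqueued since the last one. A hedged sketch (`store`, `runId`, `HandleAsync`, and `ct` are assumed surrounding context):

```csharp
// Poll-loop sketch: commands come back sorted ascending by Sequence, so the
// last element's Sequence is a safe resume cursor for the next poll.
long? afterSequence = null;
var pending = await store.GetPendingCommandsAsync("tenant-a", "connector-x", runId, afterSequence, ct);
foreach (var command in pending)
{
    await HandleAsync(command, ct); // HandleAsync is hypothetical
    afterSequence = command.Sequence;
}
```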


@@ -1,141 +0,0 @@
using System;
using System.Globalization;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Orchestrator;
[BsonIgnoreExtraElements]
public sealed class OrchestratorCommandDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("tenant")]
public string Tenant { get; set; } = string.Empty;
[BsonElement("connectorId")]
public string ConnectorId { get; set; } = string.Empty;
[BsonElement("runId")]
public Guid RunId { get; set; }
= Guid.Empty;
[BsonElement("sequence")]
public long Sequence { get; set; }
= 0;
[BsonElement("command")]
public OrchestratorCommandKind Command { get; set; }
= OrchestratorCommandKind.Pause;
[BsonElement("throttle")]
public OrchestratorThrottleOverrideDocument? Throttle { get; set; }
= null;
[BsonElement("backfill")]
public OrchestratorBackfillRangeDocument? Backfill { get; set; }
= null;
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; }
= DateTime.SpecifyKind(DateTime.UnixEpoch, DateTimeKind.Utc);
[BsonElement("expiresAt")]
public DateTime? ExpiresAt { get; set; }
= null;
}
[BsonIgnoreExtraElements]
public sealed class OrchestratorThrottleOverrideDocument
{
[BsonElement("rpm")]
public int? Rpm { get; set; }
= null;
[BsonElement("burst")]
public int? Burst { get; set; }
= null;
[BsonElement("cooldownSeconds")]
public int? CooldownSeconds { get; set; }
= null;
[BsonElement("expiresAt")]
public DateTime? ExpiresAt { get; set; }
= null;
}
[BsonIgnoreExtraElements]
public sealed class OrchestratorBackfillRangeDocument
{
[BsonElement("fromCursor")]
public string? FromCursor { get; set; }
= null;
[BsonElement("toCursor")]
public string? ToCursor { get; set; }
= null;
}
internal static class OrchestratorCommandDocumentExtensions
{
public static OrchestratorCommandDocument FromRecord(OrchestratorCommandRecord record)
{
ArgumentNullException.ThrowIfNull(record);
return new OrchestratorCommandDocument
{
Id = BuildId(record.Tenant, record.ConnectorId, record.RunId, record.Sequence),
Tenant = record.Tenant,
ConnectorId = record.ConnectorId,
RunId = record.RunId,
Sequence = record.Sequence,
Command = record.Command,
Throttle = record.Throttle is null
? null
: new OrchestratorThrottleOverrideDocument
{
Rpm = record.Throttle.Rpm,
Burst = record.Throttle.Burst,
CooldownSeconds = record.Throttle.CooldownSeconds,
ExpiresAt = record.Throttle.ExpiresAt?.UtcDateTime,
},
Backfill = record.Backfill is null
? null
: new OrchestratorBackfillRangeDocument
{
FromCursor = record.Backfill.FromCursor,
ToCursor = record.Backfill.ToCursor,
},
CreatedAt = record.CreatedAt.UtcDateTime,
ExpiresAt = record.ExpiresAt?.UtcDateTime,
};
}
public static OrchestratorCommandRecord ToRecord(this OrchestratorCommandDocument document)
{
ArgumentNullException.ThrowIfNull(document);
return new OrchestratorCommandRecord(
document.Tenant,
document.ConnectorId,
document.RunId,
document.Sequence,
document.Command,
document.Throttle is null
? null
: new OrchestratorThrottleOverride(
document.Throttle.Rpm,
document.Throttle.Burst,
document.Throttle.CooldownSeconds,
document.Throttle.ExpiresAt is null ? null : DateTime.SpecifyKind(document.Throttle.ExpiresAt.Value, DateTimeKind.Utc)),
document.Backfill is null
? null
: new OrchestratorBackfillRange(document.Backfill.FromCursor, document.Backfill.ToCursor),
DateTime.SpecifyKind(document.CreatedAt, DateTimeKind.Utc),
document.ExpiresAt is null ? null : DateTime.SpecifyKind(document.ExpiresAt.Value, DateTimeKind.Utc));
}
private static string BuildId(string tenant, string connectorId, Guid runId, long sequence)
=> string.Create(CultureInfo.InvariantCulture, $"{tenant}:{connectorId}:{runId}:{sequence}");
}


@@ -1,26 +0,0 @@
using System;
namespace StellaOps.Concelier.Storage.Mongo.Orchestrator;
public sealed record OrchestratorCommandRecord(
string Tenant,
string ConnectorId,
Guid RunId,
long Sequence,
OrchestratorCommandKind Command,
OrchestratorThrottleOverride? Throttle,
OrchestratorBackfillRange? Backfill,
DateTimeOffset CreatedAt,
DateTimeOffset? ExpiresAt);
public enum OrchestratorCommandKind
{
Pause,
Resume,
Throttle,
Backfill,
}
public sealed record OrchestratorThrottleOverride(int? Rpm, int? Burst, int? CooldownSeconds, DateTimeOffset? ExpiresAt);
public sealed record OrchestratorBackfillRange(string? FromCursor, string? ToCursor);


@@ -1,105 +0,0 @@
using System;
using System.Globalization;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Orchestrator;
[BsonIgnoreExtraElements]
public sealed class OrchestratorHeartbeatDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("tenant")]
public string Tenant { get; set; } = string.Empty;
[BsonElement("connectorId")]
public string ConnectorId { get; set; } = string.Empty;
[BsonElement("runId")]
public Guid RunId { get; set; }
= Guid.Empty;
[BsonElement("sequence")]
public long Sequence { get; set; }
= 0;
[BsonElement("status")]
public OrchestratorHeartbeatStatus Status { get; set; }
= OrchestratorHeartbeatStatus.Starting;
[BsonElement("progress")]
public int? Progress { get; set; }
= null;
[BsonElement("queueDepth")]
public int? QueueDepth { get; set; }
= null;
[BsonElement("lastArtifactHash")]
public string? LastArtifactHash { get; set; }
= null;
[BsonElement("lastArtifactKind")]
public string? LastArtifactKind { get; set; }
= null;
[BsonElement("errorCode")]
public string? ErrorCode { get; set; }
= null;
[BsonElement("retryAfterSeconds")]
public int? RetryAfterSeconds { get; set; }
= null;
[BsonElement("timestamp")]
public DateTime Timestamp { get; set; }
= DateTime.SpecifyKind(DateTime.UnixEpoch, DateTimeKind.Utc);
}
internal static class OrchestratorHeartbeatDocumentExtensions
{
public static OrchestratorHeartbeatDocument FromRecord(OrchestratorHeartbeatRecord record)
{
ArgumentNullException.ThrowIfNull(record);
return new OrchestratorHeartbeatDocument
{
Id = BuildId(record.Tenant, record.ConnectorId, record.RunId, record.Sequence),
Tenant = record.Tenant,
ConnectorId = record.ConnectorId,
RunId = record.RunId,
Sequence = record.Sequence,
Status = record.Status,
Progress = record.Progress,
QueueDepth = record.QueueDepth,
LastArtifactHash = record.LastArtifactHash,
LastArtifactKind = record.LastArtifactKind,
ErrorCode = record.ErrorCode,
RetryAfterSeconds = record.RetryAfterSeconds,
Timestamp = record.TimestampUtc.UtcDateTime,
};
}
public static OrchestratorHeartbeatRecord ToRecord(this OrchestratorHeartbeatDocument document)
{
ArgumentNullException.ThrowIfNull(document);
return new OrchestratorHeartbeatRecord(
document.Tenant,
document.ConnectorId,
document.RunId,
document.Sequence,
document.Status,
document.Progress,
document.QueueDepth,
document.LastArtifactHash,
document.LastArtifactKind,
document.ErrorCode,
document.RetryAfterSeconds,
DateTime.SpecifyKind(document.Timestamp, DateTimeKind.Utc));
}
private static string BuildId(string tenant, string connectorId, Guid runId, long sequence)
=> string.Create(CultureInfo.InvariantCulture, $"{tenant}:{connectorId}:{runId}:{sequence}");
}

Some files were not shown because too many files have changed in this diff.