Restructure solution layout by module

This commit is contained in:
master
2025-10-28 15:10:40 +02:00
parent 95daa159c4
commit d870da18ce
4103 changed files with 192899 additions and 187024 deletions

View File

@@ -0,0 +1,29 @@
# AGENTS
## Role
Canonical persistence for raw documents, DTOs, canonical advisories, jobs, and state. Provides repositories and bootstrapper for collections/indexes.
## Scope
- Collections (MongoStorageDefaults): source, source_state, document, dto, advisory, alias, affected, reference, kev_flag, ru_flags, jp_flags, psirt_flags, merge_event, export_state, locks, jobs; GridFS bucket fs.documents; field names include ttlAt (locks), sourceName, uri, advisoryKey.
- Records: SourceState (cursor, lastSuccess/error, failCount, backoffUntil), JobRun, MergeEvent, ExportState, and Advisory documents that mirror the Models types, with embedded arrays where practical.
- Bootstrapper: create collections, indexes (unique advisoryKey, scheme/value, platform/name, published, modified), TTL on locks, and validate connectivity for /ready health probes.
- Job store: create, read, mark completed/failed; compute durations; recent/last queries; active by status.
- Advisory store: CRUD for canonical advisories; query by key/alias and list for exporters with deterministic paging.
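The bootstrapper duties above (unique `advisoryKey` index, TTL on `locks`) can be sketched with the MongoDB .NET driver. Collection and field names come from this section; the method shape itself is illustrative, not the real bootstrapper:

```csharp
using MongoDB.Bson;
using MongoDB.Driver;

public static class BootstrapSketch
{
    // Illustrative only: the real bootstrapper also creates the remaining
    // collections and indexes listed above and validates connectivity for /ready.
    public static async Task EnsureIndexesAsync(IMongoDatabase database, CancellationToken ct)
    {
        var advisories = database.GetCollection<BsonDocument>("advisory");
        await advisories.Indexes.CreateOneAsync(
            new CreateIndexModel<BsonDocument>(
                Builders<BsonDocument>.IndexKeys.Ascending("advisoryKey"),
                new CreateIndexOptions { Unique = true }),
            cancellationToken: ct);

        var locks = database.GetCollection<BsonDocument>("locks");
        await locks.Indexes.CreateOneAsync(
            new CreateIndexModel<BsonDocument>(
                Builders<BsonDocument>.IndexKeys.Ascending("ttlAt"),
                // ExpireAfter = 0 makes MongoDB evict each lock as soon as its ttlAt passes.
                new CreateIndexOptions { ExpireAfter = TimeSpan.Zero }),
            cancellationToken: ct);
    }
}
```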
## Participants
- Core jobs read/write runs and leases; WebService /ready pings database; /jobs APIs query runs/definitions.
- Source connectors store raw docs, DTOs, and mapped canonical advisories with provenance; update SourceState cursor/backoff.
- Exporters read advisories and write export_state.
## Interfaces & contracts
- IMongoDatabase injected; MongoUrl from options; database name from options or MongoUrl or default "concelier".
- Repositories expose async methods with CancellationToken; deterministic sorting.
- All date/time values stored as UTC; identifiers normalized.
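The database-name resolution contract above (explicit option, then the MongoUrl's database, then the default `"concelier"`) can be sketched as follows; the `MongoStorageOptions` member names here are assumptions, not the real shape:

```csharp
using MongoDB.Driver;

// Hypothetical stand-in for the real MongoStorageOptions.
public sealed class MongoStorageOptionsSketch
{
    public string ConnectionString { get; set; } = "mongodb://localhost:27017";
    public string? DatabaseName { get; set; }
}

public static class DatabaseNameResolver
{
    // Precedence per the contract: option value, then the URL's database, then "concelier".
    public static string Resolve(MongoStorageOptionsSketch options)
    {
        var url = new MongoUrl(options.ConnectionString);
        return options.DatabaseName
            ?? (string.IsNullOrEmpty(url.DatabaseName) ? null : url.DatabaseName)
            ?? "concelier";
    }
}
```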
## In/Out of scope
In: persistence, bootstrap, indexes, basic query helpers.
Out: business mapping logic, HTTP, packaging.
## Observability & security expectations
- Log collection/index creation; warn on existing mismatches.
- Apply timeouts and retry policies; avoid unbounded scans; page large reads.
- Do not log DSNs with credentials; redact in diagnostics.
## Tests
- Author and review coverage in `../StellaOps.Concelier.Storage.Mongo.Tests`.
- Shared fixtures (e.g., `MongoIntegrationFixture`, `ConnectorTestHarness`) live in `../StellaOps.Concelier.Testing`.
- Keep fixtures deterministic; match new cases to real-world advisories or regression scenarios.

View File

@@ -0,0 +1,32 @@
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
[BsonIgnoreExtraElements]
public sealed class AdvisoryDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("advisoryKey")]
public string AdvisoryKey
{
get => Id;
set => Id = value;
}
[BsonElement("payload")]
public BsonDocument Payload { get; set; } = new();
[BsonElement("modified")]
public DateTime Modified { get; set; }
[BsonElement("published")]
public DateTime? Published { get; set; }
[BsonElement("normalizedVersions")]
[BsonIgnoreIfNull]
public List<NormalizedVersionDocument>? NormalizedVersions { get; set; }
}
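Because `AdvisoryKey` writes through to `Id`, the key is serialized twice — once as `_id` and once as `advisoryKey` — and the two can never diverge, which keeps the `advisoryKey` field indexable without a separate sync step. A quick sketch of the round-trip (using the driver's `ToBsonDocument()` extension):

```csharp
using MongoDB.Bson;

var document = new AdvisoryDocument { AdvisoryKey = "CVE-2024-0001" };

// The setter wrote through to Id, so both mapped fields carry the same value:
// { "_id": "CVE-2024-0001", "advisoryKey": "CVE-2024-0001", ... }
var bson = document.ToBsonDocument();
```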

View File

@@ -0,0 +1,508 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Storage.Mongo.Aliases;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
public sealed class AdvisoryStore : IAdvisoryStore
{
private readonly IMongoCollection<AdvisoryDocument> _collection;
private readonly ILogger<AdvisoryStore> _logger;
private readonly IAliasStore _aliasStore;
private readonly TimeProvider _timeProvider;
private readonly MongoStorageOptions _options;
public AdvisoryStore(
IMongoDatabase database,
IAliasStore aliasStore,
ILogger<AdvisoryStore> logger,
IOptions<MongoStorageOptions> options,
TimeProvider? timeProvider = null)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<AdvisoryDocument>(MongoStorageDefaults.Collections.Advisory);
_aliasStore = aliasStore ?? throw new ArgumentNullException(nameof(aliasStore));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_timeProvider = timeProvider ?? TimeProvider.System;
}
public async Task UpsertAsync(Advisory advisory, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(advisory);
var missing = ProvenanceInspector.FindMissingProvenance(advisory);
var primarySource = advisory.Provenance.FirstOrDefault()?.Source ?? "unknown";
foreach (var item in missing)
{
var source = string.IsNullOrWhiteSpace(item.Source) ? primarySource : item.Source;
_logger.LogWarning(
"Missing provenance detected for {Component} in advisory {AdvisoryKey} (source {Source}).",
item.Component,
advisory.AdvisoryKey,
source);
ProvenanceDiagnostics.RecordMissing(source, item.Component, item.RecordedAt, item.FieldMask);
}
var payload = CanonicalJsonSerializer.Serialize(advisory);
var normalizedVersions = _options.EnableSemVerStyle
? NormalizedVersionDocumentFactory.Create(advisory)
: null;
var document = new AdvisoryDocument
{
AdvisoryKey = advisory.AdvisoryKey,
Payload = BsonDocument.Parse(payload),
Modified = advisory.Modified?.UtcDateTime ?? _timeProvider.GetUtcNow().UtcDateTime,
Published = advisory.Published?.UtcDateTime,
NormalizedVersions = normalizedVersions,
};
var options = new ReplaceOptions { IsUpsert = true };
var filter = Builders<AdvisoryDocument>.Filter.Eq(x => x.AdvisoryKey, advisory.AdvisoryKey);
if (session is null)
{
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.ReplaceOneAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
}
_logger.LogDebug("Upserted advisory {AdvisoryKey}", advisory.AdvisoryKey);
var aliasEntries = BuildAliasEntries(advisory);
var updatedAt = _timeProvider.GetUtcNow();
await _aliasStore.ReplaceAsync(advisory.AdvisoryKey, aliasEntries, updatedAt, cancellationToken).ConfigureAwait(false);
}
public async Task<Advisory?> FindAsync(string advisoryKey, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
var filter = Builders<AdvisoryDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document is null ? null : Deserialize(document.Payload);
}
private static IEnumerable<AliasEntry> BuildAliasEntries(Advisory advisory)
{
foreach (var alias in advisory.Aliases)
{
if (AliasSchemeRegistry.TryGetScheme(alias, out var scheme))
{
yield return new AliasEntry(scheme, alias);
}
else
{
yield return new AliasEntry(AliasStoreConstants.UnscopedScheme, alias);
}
}
yield return new AliasEntry(AliasStoreConstants.PrimaryScheme, advisory.AdvisoryKey);
}
public async Task<IReadOnlyList<Advisory>> GetRecentAsync(int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = FilterDefinition<AdvisoryDocument>.Empty;
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.Modified)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static doc => Deserialize(doc.Payload)).ToArray();
}
public async IAsyncEnumerable<Advisory> StreamAsync([EnumeratorCancellation] CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var options = new FindOptions<AdvisoryDocument>
{
Sort = Builders<AdvisoryDocument>.Sort.Ascending(static doc => doc.AdvisoryKey),
};
using var cursor = session is null
? await _collection.FindAsync(FilterDefinition<AdvisoryDocument>.Empty, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindAsync(session, FilterDefinition<AdvisoryDocument>.Empty, options, cancellationToken).ConfigureAwait(false);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var document in cursor.Current)
{
cancellationToken.ThrowIfCancellationRequested();
yield return Deserialize(document.Payload);
}
}
}
private static Advisory Deserialize(BsonDocument payload)
{
ArgumentNullException.ThrowIfNull(payload);
var advisoryKey = payload.GetValue("advisoryKey", defaultValue: null)?.AsString
?? throw new InvalidOperationException("advisoryKey missing from payload.");
var title = payload.GetValue("title", defaultValue: null)?.AsString ?? advisoryKey;
string? summary = payload.TryGetValue("summary", out var summaryValue) && summaryValue.IsString ? summaryValue.AsString : null;
string? description = payload.TryGetValue("description", out var descriptionValue) && descriptionValue.IsString ? descriptionValue.AsString : null;
string? language = payload.TryGetValue("language", out var languageValue) && languageValue.IsString ? languageValue.AsString : null;
DateTimeOffset? published = TryReadDateTime(payload, "published");
DateTimeOffset? modified = TryReadDateTime(payload, "modified");
string? severity = payload.TryGetValue("severity", out var severityValue) && severityValue.IsString ? severityValue.AsString : null;
var exploitKnown = payload.TryGetValue("exploitKnown", out var exploitValue) && exploitValue.IsBoolean && exploitValue.AsBoolean;
var aliases = payload.TryGetValue("aliases", out var aliasValue) && aliasValue is BsonArray aliasArray
? aliasArray.OfType<BsonValue>().Where(static x => x.IsString).Select(static x => x.AsString)
: Array.Empty<string>();
var credits = payload.TryGetValue("credits", out var creditsValue) && creditsValue is BsonArray creditsArray
? creditsArray.OfType<BsonDocument>().Select(DeserializeCredit).ToArray()
: Array.Empty<AdvisoryCredit>();
var references = payload.TryGetValue("references", out var referencesValue) && referencesValue is BsonArray referencesArray
? referencesArray.OfType<BsonDocument>().Select(DeserializeReference).ToArray()
: Array.Empty<AdvisoryReference>();
var affectedPackages = payload.TryGetValue("affectedPackages", out var affectedValue) && affectedValue is BsonArray affectedArray
? affectedArray.OfType<BsonDocument>().Select(DeserializeAffectedPackage).ToArray()
: Array.Empty<AffectedPackage>();
var cvssMetrics = payload.TryGetValue("cvssMetrics", out var cvssValue) && cvssValue is BsonArray cvssArray
? cvssArray.OfType<BsonDocument>().Select(DeserializeCvssMetric).ToArray()
: Array.Empty<CvssMetric>();
var cwes = payload.TryGetValue("cwes", out var cweValue) && cweValue is BsonArray cweArray
? cweArray.OfType<BsonDocument>().Select(DeserializeWeakness).ToArray()
: Array.Empty<AdvisoryWeakness>();
string? canonicalMetricId = payload.TryGetValue("canonicalMetricId", out var canonicalMetricValue) && canonicalMetricValue.IsString
? canonicalMetricValue.AsString
: null;
var provenance = payload.TryGetValue("provenance", out var provenanceValue) && provenanceValue is BsonArray provenanceArray
? provenanceArray.OfType<BsonDocument>().Select(DeserializeProvenance).ToArray()
: Array.Empty<AdvisoryProvenance>();
return new Advisory(
advisoryKey,
title,
summary,
language,
published,
modified,
severity,
exploitKnown,
aliases,
credits,
references,
affectedPackages,
cvssMetrics,
provenance,
description,
cwes,
canonicalMetricId);
}
private static AdvisoryReference DeserializeReference(BsonDocument document)
{
var url = document.GetValue("url", defaultValue: null)?.AsString
?? throw new InvalidOperationException("reference.url missing from payload.");
string? kind = document.TryGetValue("kind", out var kindValue) && kindValue.IsString ? kindValue.AsString : null;
string? sourceTag = document.TryGetValue("sourceTag", out var sourceTagValue) && sourceTagValue.IsString ? sourceTagValue.AsString : null;
string? summary = document.TryGetValue("summary", out var summaryValue) && summaryValue.IsString ? summaryValue.AsString : null;
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AdvisoryReference(url, kind, sourceTag, summary, provenance);
}
private static AdvisoryCredit DeserializeCredit(BsonDocument document)
{
var displayName = document.GetValue("displayName", defaultValue: null)?.AsString
?? throw new InvalidOperationException("credits.displayName missing from payload.");
string? role = document.TryGetValue("role", out var roleValue) && roleValue.IsString ? roleValue.AsString : null;
var contacts = document.TryGetValue("contacts", out var contactsValue) && contactsValue is BsonArray contactsArray
? contactsArray.OfType<BsonValue>().Where(static value => value.IsString).Select(static value => value.AsString).ToArray()
: Array.Empty<string>();
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AdvisoryCredit(displayName, role, contacts, provenance);
}
private static AffectedPackage DeserializeAffectedPackage(BsonDocument document)
{
var type = document.GetValue("type", defaultValue: null)?.AsString
?? throw new InvalidOperationException("affectedPackages.type missing from payload.");
var identifier = document.GetValue("identifier", defaultValue: null)?.AsString
?? throw new InvalidOperationException("affectedPackages.identifier missing from payload.");
string? platform = document.TryGetValue("platform", out var platformValue) && platformValue.IsString ? platformValue.AsString : null;
var versionRanges = document.TryGetValue("versionRanges", out var rangesValue) && rangesValue is BsonArray rangesArray
? rangesArray.OfType<BsonDocument>().Select(DeserializeVersionRange).ToArray()
: Array.Empty<AffectedVersionRange>();
var statuses = document.TryGetValue("statuses", out var statusesValue) && statusesValue is BsonArray statusesArray
? statusesArray.OfType<BsonDocument>().Select(DeserializeStatus).ToArray()
: Array.Empty<AffectedPackageStatus>();
var normalizedVersions = document.TryGetValue("normalizedVersions", out var normalizedValue) && normalizedValue is BsonArray normalizedArray
? normalizedArray.OfType<BsonDocument>().Select(DeserializeNormalizedVersionRule).ToArray()
: Array.Empty<NormalizedVersionRule>();
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue is BsonArray provenanceArray
? provenanceArray.OfType<BsonDocument>().Select(DeserializeProvenance).ToArray()
: Array.Empty<AdvisoryProvenance>();
return new AffectedPackage(type, identifier, platform, versionRanges, statuses, provenance, normalizedVersions);
}
private static AffectedVersionRange DeserializeVersionRange(BsonDocument document)
{
var rangeKind = document.GetValue("rangeKind", defaultValue: null)?.AsString
?? throw new InvalidOperationException("versionRanges.rangeKind missing from payload.");
string? introducedVersion = document.TryGetValue("introducedVersion", out var introducedValue) && introducedValue.IsString ? introducedValue.AsString : null;
string? fixedVersion = document.TryGetValue("fixedVersion", out var fixedValue) && fixedValue.IsString ? fixedValue.AsString : null;
string? lastAffectedVersion = document.TryGetValue("lastAffectedVersion", out var lastValue) && lastValue.IsString ? lastValue.AsString : null;
string? rangeExpression = document.TryGetValue("rangeExpression", out var expressionValue) && expressionValue.IsString ? expressionValue.AsString : null;
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
RangePrimitives? primitives = null;
if (document.TryGetValue("primitives", out var primitivesValue) && primitivesValue.IsBsonDocument)
{
primitives = DeserializePrimitives(primitivesValue.AsBsonDocument);
}
return new AffectedVersionRange(rangeKind, introducedVersion, fixedVersion, lastAffectedVersion, rangeExpression, provenance, primitives);
}
private static AffectedPackageStatus DeserializeStatus(BsonDocument document)
{
var status = document.GetValue("status", defaultValue: null)?.AsString
?? throw new InvalidOperationException("statuses.status missing from payload.");
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AffectedPackageStatus(status, provenance);
}
private static CvssMetric DeserializeCvssMetric(BsonDocument document)
{
var version = document.GetValue("version", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cvssMetrics.version missing from payload.");
var vector = document.GetValue("vector", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cvssMetrics.vector missing from payload.");
var baseScore = document.TryGetValue("baseScore", out var scoreValue) && scoreValue.IsNumeric ? scoreValue.ToDouble() : 0d;
var baseSeverity = document.GetValue("baseSeverity", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cvssMetrics.baseSeverity missing from payload.");
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new CvssMetric(version, vector, baseScore, baseSeverity, provenance);
}
private static AdvisoryWeakness DeserializeWeakness(BsonDocument document)
{
var taxonomy = document.GetValue("taxonomy", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cwes.taxonomy missing from payload.");
var identifier = document.GetValue("identifier", defaultValue: null)?.AsString
?? throw new InvalidOperationException("cwes.identifier missing from payload.");
string? name = document.TryGetValue("name", out var nameValue) && nameValue.IsString ? nameValue.AsString : null;
string? uri = document.TryGetValue("uri", out var uriValue) && uriValue.IsString ? uriValue.AsString : null;
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
? DeserializeProvenance(provenanceValue.AsBsonDocument)
: AdvisoryProvenance.Empty;
return new AdvisoryWeakness(taxonomy, identifier, name, uri, new[] { provenance });
}
private static AdvisoryProvenance DeserializeProvenance(BsonDocument document)
{
var source = document.GetValue("source", defaultValue: null)?.AsString
?? throw new InvalidOperationException("provenance.source missing from payload.");
var kind = document.GetValue("kind", defaultValue: null)?.AsString
?? throw new InvalidOperationException("provenance.kind missing from payload.");
string? value = document.TryGetValue("value", out var valueElement) && valueElement.IsString ? valueElement.AsString : null;
string? decisionReason = document.TryGetValue("decisionReason", out var reasonElement) && reasonElement.IsString ? reasonElement.AsString : null;
var recordedAt = TryConvertDateTime(document.GetValue("recordedAt", defaultValue: null));
IEnumerable<string>? fieldMask = null;
if (document.TryGetValue("fieldMask", out var fieldMaskValue) && fieldMaskValue is BsonArray fieldMaskArray)
{
fieldMask = fieldMaskArray
.OfType<BsonValue>()
.Where(static element => element.IsString)
.Select(static element => element.AsString);
}
return new AdvisoryProvenance(
source,
kind,
value ?? string.Empty,
recordedAt ?? DateTimeOffset.UtcNow,
fieldMask,
decisionReason);
}
private static NormalizedVersionRule DeserializeNormalizedVersionRule(BsonDocument document)
{
var scheme = document.GetValue("scheme", defaultValue: null)?.AsString
?? throw new InvalidOperationException("normalizedVersions.scheme missing from payload.");
var type = document.GetValue("type", defaultValue: null)?.AsString
?? throw new InvalidOperationException("normalizedVersions.type missing from payload.");
string? min = document.TryGetValue("min", out var minValue) && minValue.IsString ? minValue.AsString : null;
bool? minInclusive = document.TryGetValue("minInclusive", out var minInclusiveValue) && minInclusiveValue.IsBoolean ? minInclusiveValue.AsBoolean : null;
string? max = document.TryGetValue("max", out var maxValue) && maxValue.IsString ? maxValue.AsString : null;
bool? maxInclusive = document.TryGetValue("maxInclusive", out var maxInclusiveValue) && maxInclusiveValue.IsBoolean ? maxInclusiveValue.AsBoolean : null;
string? value = document.TryGetValue("value", out var valueElement) && valueElement.IsString ? valueElement.AsString : null;
string? notes = document.TryGetValue("notes", out var notesValue) && notesValue.IsString ? notesValue.AsString : null;
return new NormalizedVersionRule(
scheme,
type,
min,
minInclusive,
max,
maxInclusive,
value,
notes);
}
private static RangePrimitives? DeserializePrimitives(BsonDocument document)
{
SemVerPrimitive? semVer = null;
NevraPrimitive? nevra = null;
EvrPrimitive? evr = null;
IReadOnlyDictionary<string, string>? vendor = null;
if (document.TryGetValue("semVer", out var semverValue) && semverValue.IsBsonDocument)
{
var semverDoc = semverValue.AsBsonDocument;
semVer = new SemVerPrimitive(
semverDoc.TryGetValue("introduced", out var semIntroduced) && semIntroduced.IsString ? semIntroduced.AsString : null,
semverDoc.TryGetValue("introducedInclusive", out var semIntroducedInclusive) && semIntroducedInclusive.IsBoolean && semIntroducedInclusive.AsBoolean,
semverDoc.TryGetValue("fixed", out var semFixed) && semFixed.IsString ? semFixed.AsString : null,
semverDoc.TryGetValue("fixedInclusive", out var semFixedInclusive) && semFixedInclusive.IsBoolean && semFixedInclusive.AsBoolean,
semverDoc.TryGetValue("lastAffected", out var semLast) && semLast.IsString ? semLast.AsString : null,
semverDoc.TryGetValue("lastAffectedInclusive", out var semLastInclusive) && semLastInclusive.IsBoolean && semLastInclusive.AsBoolean,
semverDoc.TryGetValue("constraintExpression", out var constraint) && constraint.IsString ? constraint.AsString : null,
semverDoc.TryGetValue("exactValue", out var exact) && exact.IsString ? exact.AsString : null);
}
if (document.TryGetValue("nevra", out var nevraValue) && nevraValue.IsBsonDocument)
{
var nevraDoc = nevraValue.AsBsonDocument;
nevra = new NevraPrimitive(
DeserializeNevraComponent(nevraDoc, "introduced"),
DeserializeNevraComponent(nevraDoc, "fixed"),
DeserializeNevraComponent(nevraDoc, "lastAffected"));
}
if (document.TryGetValue("evr", out var evrValue) && evrValue.IsBsonDocument)
{
var evrDoc = evrValue.AsBsonDocument;
evr = new EvrPrimitive(
DeserializeEvrComponent(evrDoc, "introduced"),
DeserializeEvrComponent(evrDoc, "fixed"),
DeserializeEvrComponent(evrDoc, "lastAffected"));
}
if (document.TryGetValue("vendorExtensions", out var vendorValue) && vendorValue.IsBsonDocument)
{
vendor = vendorValue.AsBsonDocument.Elements
.Where(static e => e.Value.IsString)
.ToDictionary(static e => e.Name, static e => e.Value.AsString, StringComparer.Ordinal);
if (vendor.Count == 0)
{
vendor = null;
}
}
if (semVer is null && nevra is null && evr is null && vendor is null)
{
return null;
}
return new RangePrimitives(semVer, nevra, evr, vendor);
}
private static NevraComponent? DeserializeNevraComponent(BsonDocument parent, string field)
{
if (!parent.TryGetValue(field, out var value) || !value.IsBsonDocument)
{
return null;
}
var component = value.AsBsonDocument;
var name = component.TryGetValue("name", out var nameValue) && nameValue.IsString ? nameValue.AsString : null;
var version = component.TryGetValue("version", out var versionValue) && versionValue.IsString ? versionValue.AsString : null;
if (name is null || version is null)
{
return null;
}
var epoch = component.TryGetValue("epoch", out var epochValue) && epochValue.IsNumeric ? epochValue.ToInt32() : 0;
var release = component.TryGetValue("release", out var releaseValue) && releaseValue.IsString ? releaseValue.AsString : string.Empty;
var architecture = component.TryGetValue("architecture", out var archValue) && archValue.IsString ? archValue.AsString : null;
return new NevraComponent(name, epoch, version, release, architecture);
}
private static EvrComponent? DeserializeEvrComponent(BsonDocument parent, string field)
{
if (!parent.TryGetValue(field, out var value) || !value.IsBsonDocument)
{
return null;
}
var component = value.AsBsonDocument;
var epoch = component.TryGetValue("epoch", out var epochValue) && epochValue.IsNumeric ? epochValue.ToInt32() : 0;
var upstream = component.TryGetValue("upstreamVersion", out var upstreamValue) && upstreamValue.IsString ? upstreamValue.AsString : null;
if (upstream is null)
{
return null;
}
var revision = component.TryGetValue("revision", out var revisionValue) && revisionValue.IsString ? revisionValue.AsString : null;
return new EvrComponent(epoch, upstream, revision);
}
private static DateTimeOffset? TryReadDateTime(BsonDocument document, string field)
=> document.TryGetValue(field, out var value) ? TryConvertDateTime(value) : null;
private static DateTimeOffset? TryConvertDateTime(BsonValue? value)
{
if (value is null)
{
return null;
}
return value switch
{
BsonDateTime dateTime => DateTime.SpecifyKind(dateTime.ToUniversalTime(), DateTimeKind.Utc),
BsonString stringValue when DateTimeOffset.TryParse(stringValue.AsString, out var parsed) => parsed.ToUniversalTime(),
_ => null,
};
}
}
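A typical consumer resolves the store through DI and round-trips an advisory; the wiring below is illustrative (`AdvisoryLookup` is a hypothetical consumer, not part of this module), while the `IAdvisoryStore` calls match the signatures in this file:

```csharp
public sealed class AdvisoryLookup
{
    private readonly IAdvisoryStore _store;

    public AdvisoryLookup(IAdvisoryStore store) => _store = store;

    public async Task<Advisory?> RefreshAsync(Advisory advisory, CancellationToken ct)
    {
        // Upsert replaces the canonical payload and then rewrites the alias entries.
        await _store.UpsertAsync(advisory, ct);

        // Reads deserialize the stored BSON payload back into the canonical model.
        return await _store.FindAsync(advisory.AdvisoryKey, ct);
    }
}
```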

View File

@@ -0,0 +1,15 @@
using MongoDB.Driver;
using StellaOps.Concelier.Models;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
public interface IAdvisoryStore
{
Task UpsertAsync(Advisory advisory, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<Advisory?> FindAsync(string advisoryKey, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<IReadOnlyList<Advisory>> GetRecentAsync(int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null);
IAsyncEnumerable<Advisory> StreamAsync(CancellationToken cancellationToken, IClientSessionHandle? session = null);
}

View File

@@ -0,0 +1,64 @@
using System;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
[BsonIgnoreExtraElements]
public sealed class NormalizedVersionDocument
{
[BsonElement("packageId")]
public string PackageId { get; set; } = string.Empty;
[BsonElement("packageType")]
public string PackageType { get; set; } = string.Empty;
[BsonElement("scheme")]
public string Scheme { get; set; } = string.Empty;
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("style")]
[BsonIgnoreIfNull]
public string? Style { get; set; }
[BsonElement("min")]
[BsonIgnoreIfNull]
public string? Min { get; set; }
[BsonElement("minInclusive")]
[BsonIgnoreIfNull]
public bool? MinInclusive { get; set; }
[BsonElement("max")]
[BsonIgnoreIfNull]
public string? Max { get; set; }
[BsonElement("maxInclusive")]
[BsonIgnoreIfNull]
public bool? MaxInclusive { get; set; }
[BsonElement("value")]
[BsonIgnoreIfNull]
public string? Value { get; set; }
[BsonElement("notes")]
[BsonIgnoreIfNull]
public string? Notes { get; set; }
[BsonElement("decisionReason")]
[BsonIgnoreIfNull]
public string? DecisionReason { get; set; }
[BsonElement("constraint")]
[BsonIgnoreIfNull]
public string? Constraint { get; set; }
[BsonElement("source")]
[BsonIgnoreIfNull]
public string? Source { get; set; }
[BsonElement("recordedAt")]
[BsonIgnoreIfNull]
public DateTime? RecordedAtUtc { get; set; }
}

View File

@@ -0,0 +1,100 @@
using System;
using System.Collections.Generic;
using System.Linq;
using StellaOps.Concelier.Models;
namespace StellaOps.Concelier.Storage.Mongo.Advisories;
internal static class NormalizedVersionDocumentFactory
{
public static List<NormalizedVersionDocument>? Create(Advisory advisory)
{
if (advisory.AffectedPackages.IsDefaultOrEmpty)
{
return null;
}
var documents = new List<NormalizedVersionDocument>();
var advisoryFallbackReason = advisory.Provenance.FirstOrDefault()?.DecisionReason;
var advisoryFallbackSource = advisory.Provenance.FirstOrDefault()?.Source;
var advisoryFallbackRecordedAt = advisory.Provenance.FirstOrDefault()?.RecordedAt;
foreach (var package in advisory.AffectedPackages)
{
if (package.NormalizedVersions.IsDefaultOrEmpty)
{
continue;
}
foreach (var rule in package.NormalizedVersions)
{
var matchingRange = FindMatchingRange(package, rule);
var decisionReason = matchingRange?.Provenance.DecisionReason
?? package.Provenance.FirstOrDefault()?.DecisionReason
?? advisoryFallbackReason;
var source = matchingRange?.Provenance.Source
?? package.Provenance.FirstOrDefault()?.Source
?? advisoryFallbackSource;
var recordedAt = matchingRange?.Provenance.RecordedAt
?? package.Provenance.FirstOrDefault()?.RecordedAt
?? advisoryFallbackRecordedAt;
var constraint = matchingRange?.Primitives?.SemVer?.ConstraintExpression
?? matchingRange?.RangeExpression;
var style = matchingRange?.Primitives?.SemVer?.Style ?? rule.Type;
documents.Add(new NormalizedVersionDocument
{
PackageId = package.Identifier ?? string.Empty,
PackageType = package.Type ?? string.Empty,
Scheme = rule.Scheme,
Type = rule.Type,
Style = style,
Min = rule.Min,
MinInclusive = rule.MinInclusive,
Max = rule.Max,
MaxInclusive = rule.MaxInclusive,
Value = rule.Value,
Notes = rule.Notes,
DecisionReason = decisionReason,
Constraint = constraint,
Source = source,
RecordedAtUtc = recordedAt?.UtcDateTime,
});
}
}
return documents.Count == 0 ? null : documents;
}
private static AffectedVersionRange? FindMatchingRange(AffectedPackage package, NormalizedVersionRule rule)
{
foreach (var range in package.VersionRanges)
{
var candidate = range.ToNormalizedVersionRule(rule.Notes);
if (candidate is null)
{
continue;
}
if (NormalizedRulesEquivalent(candidate, rule))
{
return range;
}
}
return null;
}
private static bool NormalizedRulesEquivalent(NormalizedVersionRule left, NormalizedVersionRule right)
=> string.Equals(left.Scheme, right.Scheme, StringComparison.Ordinal)
&& string.Equals(left.Type, right.Type, StringComparison.Ordinal)
&& string.Equals(left.Min, right.Min, StringComparison.Ordinal)
&& left.MinInclusive == right.MinInclusive
&& string.Equals(left.Max, right.Max, StringComparison.Ordinal)
&& left.MaxInclusive == right.MaxInclusive
&& string.Equals(left.Value, right.Value, StringComparison.Ordinal);
}
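`FindMatchingRange` relies on structural equality of the projected rules, and `NormalizedRulesEquivalent` deliberately ignores `Notes`: each candidate is projected with `rule.Notes`, so notes always match by construction. The same field-by-field idea in a standalone form, using a hypothetical record in place of `NormalizedVersionRule`:

```csharp
// Hypothetical stand-in for NormalizedVersionRule, for illustration only.
public sealed record RuleSketch(
    string Scheme, string Type,
    string? Min, bool? MinInclusive,
    string? Max, bool? MaxInclusive,
    string? Value);

public static class RuleEquivalence
{
    // Ordinal comparison on every identity-bearing field; Notes is excluded.
    public static bool Equivalent(RuleSketch left, RuleSketch right)
        => string.Equals(left.Scheme, right.Scheme, StringComparison.Ordinal)
        && string.Equals(left.Type, right.Type, StringComparison.Ordinal)
        && string.Equals(left.Min, right.Min, StringComparison.Ordinal)
        && left.MinInclusive == right.MinInclusive
        && string.Equals(left.Max, right.Max, StringComparison.Ordinal)
        && left.MaxInclusive == right.MaxInclusive
        && string.Equals(left.Value, right.Value, StringComparison.Ordinal);
}
```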

View File

@@ -0,0 +1,38 @@
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
[BsonIgnoreExtraElements]
internal sealed class AliasDocument
{
[BsonId]
public ObjectId Id { get; set; }
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("scheme")]
public string Scheme { get; set; } = string.Empty;
[BsonElement("value")]
public string Value { get; set; } = string.Empty;
[BsonElement("updatedAt")]
public DateTime UpdatedAt { get; set; }
}
internal static class AliasDocumentExtensions
{
public static AliasRecord ToRecord(this AliasDocument document)
{
ArgumentNullException.ThrowIfNull(document);
var updatedAt = DateTime.SpecifyKind(document.UpdatedAt, DateTimeKind.Utc);
return new AliasRecord(
document.AdvisoryKey,
document.Scheme,
document.Value,
new DateTimeOffset(updatedAt));
}
}

View File

@@ -0,0 +1,157 @@
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Logging;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
public sealed class AliasStore : IAliasStore
{
private readonly IMongoCollection<AliasDocument> _collection;
private readonly ILogger<AliasStore> _logger;
public AliasStore(IMongoDatabase database, ILogger<AliasStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<AliasDocument>(MongoStorageDefaults.Collections.Alias);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<AliasUpsertResult> ReplaceAsync(
string advisoryKey,
IEnumerable<AliasEntry> aliases,
DateTimeOffset updatedAt,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(advisoryKey);
var aliasList = Normalize(aliases).ToArray();
var deleteFilter = Builders<AliasDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
await _collection.DeleteManyAsync(deleteFilter, cancellationToken).ConfigureAwait(false);
if (aliasList.Length == 0)
{
return new AliasUpsertResult(advisoryKey, Array.Empty<AliasCollision>());
}
var updatedAtUtc = updatedAt.UtcDateTime;
var documents = new List<AliasDocument>(aliasList.Length);
foreach (var alias in aliasList)
{
documents.Add(new AliasDocument
{
Id = ObjectId.GenerateNewId(),
AdvisoryKey = advisoryKey,
Scheme = alias.Scheme,
Value = alias.Value,
UpdatedAt = updatedAtUtc,
});
}
await _collection.InsertManyAsync(
documents,
new InsertManyOptions { IsOrdered = false },
cancellationToken).ConfigureAwait(false);
// Re-query each alias to detect values now shared by multiple advisories.
var collisions = new List<AliasCollision>();
foreach (var alias in aliasList)
{
var filter = Builders<AliasDocument>.Filter.Eq(x => x.Scheme, alias.Scheme)
& Builders<AliasDocument>.Filter.Eq(x => x.Value, alias.Value);
using var cursor = await _collection.FindAsync(filter, cancellationToken: cancellationToken).ConfigureAwait(false);
var advisoryKeys = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var document in cursor.Current)
{
advisoryKeys.Add(document.AdvisoryKey);
}
}
if (advisoryKeys.Count <= 1)
{
continue;
}
var collision = new AliasCollision(alias.Scheme, alias.Value, advisoryKeys.ToArray());
collisions.Add(collision);
AliasStoreMetrics.RecordCollision(alias.Scheme, advisoryKeys.Count);
_logger.LogWarning(
"Alias collision detected for {Scheme}:{Value}; advisories: {Advisories}",
alias.Scheme,
alias.Value,
string.Join(", ", advisoryKeys));
}
return new AliasUpsertResult(advisoryKey, collisions);
}
public async Task<IReadOnlyList<AliasRecord>> GetByAliasAsync(string scheme, string value, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(scheme);
ArgumentException.ThrowIfNullOrWhiteSpace(value);
var normalizedScheme = NormalizeScheme(scheme);
var normalizedValue = value.Trim();
var filter = Builders<AliasDocument>.Filter.Eq(x => x.Scheme, normalizedScheme)
& Builders<AliasDocument>.Filter.Eq(x => x.Value, normalizedValue);
var documents = await _collection.Find(filter).ToListAsync(cancellationToken).ConfigureAwait(false);
return documents.Select(static d => d.ToRecord()).ToArray();
}
public async Task<IReadOnlyList<AliasRecord>> GetByAdvisoryAsync(string advisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(advisoryKey);
var filter = Builders<AliasDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
var documents = await _collection.Find(filter).ToListAsync(cancellationToken).ConfigureAwait(false);
return documents.Select(static d => d.ToRecord()).ToArray();
}
private static IEnumerable<AliasEntry> Normalize(IEnumerable<AliasEntry> aliases)
{
if (aliases is null)
{
yield break;
}
var seen = new HashSet<string>(StringComparer.Ordinal);
foreach (var alias in aliases)
{
if (alias is null)
{
continue;
}
var scheme = NormalizeScheme(alias.Scheme);
var value = alias.Value?.Trim();
if (string.IsNullOrEmpty(value))
{
continue;
}
var key = $"{scheme}\u0001{value}";
if (!seen.Add(key))
{
continue;
}
yield return new AliasEntry(scheme, value);
}
}
private static string NormalizeScheme(string scheme)
{
return string.IsNullOrWhiteSpace(scheme)
? AliasStoreConstants.UnscopedScheme
: scheme.Trim().ToUpperInvariant();
}
}
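
A minimal consumer sketch (hypothetical wiring; `database` and `loggerFactory` are assumed to come from the host's DI container) showing that `ReplaceAsync` surfaces collisions in its result rather than throwing:

```csharp
// Sketch only: assumes an IMongoDatabase and ILoggerFactory resolved elsewhere.
var store = new AliasStore(database, loggerFactory.CreateLogger<AliasStore>());

var result = await store.ReplaceAsync(
    "CVE-2024-0001",
    new[]
    {
        new AliasEntry(AliasStoreConstants.PrimaryScheme, "CVE-2024-0001"),
        new AliasEntry("GHSA", "GHSA-xxxx-xxxx-xxxx"), // placeholder value
    },
    DateTimeOffset.UtcNow,
    CancellationToken.None);

// Collisions are reported via the result (and the metrics counter), not as exceptions.
foreach (var collision in result.Collisions)
{
    Console.WriteLine($"{collision.Scheme}:{collision.Value} -> {string.Join(", ", collision.AdvisoryKeys)}");
}
```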


@@ -0,0 +1,7 @@
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
public static class AliasStoreConstants
{
public const string PrimaryScheme = "PRIMARY";
public const string UnscopedScheme = "UNSCOPED";
}


@@ -0,0 +1,22 @@
using System.Collections.Generic;
using System.Diagnostics.Metrics;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
internal static class AliasStoreMetrics
{
private static readonly Meter Meter = new("StellaOps.Concelier.Merge");
internal static readonly Counter<long> AliasCollisionCounter = Meter.CreateCounter<long>(
"concelier.merge.alias_conflict",
unit: "count",
description: "Number of alias collisions detected when the same alias maps to multiple advisories.");
public static void RecordCollision(string scheme, int advisoryCount)
{
AliasCollisionCounter.Add(
1,
new KeyValuePair<string, object?>("scheme", scheme),
new KeyValuePair<string, object?>("advisory_count", advisoryCount));
}
}


@@ -0,0 +1,27 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.Aliases;
public interface IAliasStore
{
Task<AliasUpsertResult> ReplaceAsync(
string advisoryKey,
IEnumerable<AliasEntry> aliases,
DateTimeOffset updatedAt,
CancellationToken cancellationToken);
Task<IReadOnlyList<AliasRecord>> GetByAliasAsync(string scheme, string value, CancellationToken cancellationToken);
Task<IReadOnlyList<AliasRecord>> GetByAdvisoryAsync(string advisoryKey, CancellationToken cancellationToken);
}
public sealed record AliasEntry(string Scheme, string Value);
public sealed record AliasRecord(string AdvisoryKey, string Scheme, string Value, DateTimeOffset UpdatedAt);
public sealed record AliasCollision(string Scheme, string Value, IReadOnlyList<string> AdvisoryKeys);
public sealed record AliasUpsertResult(string AdvisoryKey, IReadOnlyList<AliasCollision> Collisions);


@@ -0,0 +1,43 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
[BsonIgnoreExtraElements]
public sealed class ChangeHistoryDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("source")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("documentId")]
public string DocumentId { get; set; } = string.Empty;
[BsonElement("documentSha256")]
public string DocumentSha256 { get; set; } = string.Empty;
[BsonElement("currentHash")]
public string CurrentHash { get; set; } = string.Empty;
[BsonElement("previousHash")]
public string? PreviousHash { get; set; }
[BsonElement("currentSnapshot")]
public string CurrentSnapshot { get; set; } = string.Empty;
[BsonElement("previousSnapshot")]
public string? PreviousSnapshot { get; set; }
[BsonElement("changes")]
public List<BsonDocument> Changes { get; set; } = new();
[BsonElement("capturedAt")]
public DateTime CapturedAt { get; set; }
}


@@ -0,0 +1,70 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
internal static class ChangeHistoryDocumentExtensions
{
public static ChangeHistoryDocument ToDocument(this ChangeHistoryRecord record)
{
var changes = new List<BsonDocument>(record.Changes.Count);
foreach (var change in record.Changes)
{
changes.Add(new BsonDocument
{
["field"] = change.Field,
["type"] = change.ChangeType,
["previous"] = change.PreviousValue is null ? BsonNull.Value : new BsonString(change.PreviousValue),
["current"] = change.CurrentValue is null ? BsonNull.Value : new BsonString(change.CurrentValue),
});
}
return new ChangeHistoryDocument
{
Id = record.Id.ToString(),
SourceName = record.SourceName,
AdvisoryKey = record.AdvisoryKey,
DocumentId = record.DocumentId.ToString(),
DocumentSha256 = record.DocumentSha256,
CurrentHash = record.CurrentHash,
PreviousHash = record.PreviousHash,
CurrentSnapshot = record.CurrentSnapshot,
PreviousSnapshot = record.PreviousSnapshot,
Changes = changes,
CapturedAt = record.CapturedAt.UtcDateTime,
};
}
public static ChangeHistoryRecord ToRecord(this ChangeHistoryDocument document)
{
var changes = new List<ChangeHistoryFieldChange>(document.Changes.Count);
foreach (var change in document.Changes)
{
var previousValue = change.TryGetValue("previous", out var previousBson) && previousBson is not BsonNull
? previousBson.AsString
: null;
var currentValue = change.TryGetValue("current", out var currentBson) && currentBson is not BsonNull
? currentBson.AsString
: null;
var fieldName = change.GetValue("field", "").AsString;
var changeType = change.GetValue("type", "").AsString;
changes.Add(new ChangeHistoryFieldChange(fieldName, changeType, previousValue, currentValue));
}
var capturedAtUtc = DateTime.SpecifyKind(document.CapturedAt, DateTimeKind.Utc);
return new ChangeHistoryRecord(
Guid.Parse(document.Id),
document.SourceName,
document.AdvisoryKey,
Guid.Parse(document.DocumentId),
document.DocumentSha256,
document.CurrentHash,
document.PreviousHash,
document.CurrentSnapshot,
document.PreviousSnapshot,
changes,
new DateTimeOffset(capturedAtUtc));
}
}


@@ -0,0 +1,24 @@
using System;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public sealed record ChangeHistoryFieldChange
{
public ChangeHistoryFieldChange(string field, string changeType, string? previousValue, string? currentValue)
{
ArgumentException.ThrowIfNullOrEmpty(field);
ArgumentException.ThrowIfNullOrEmpty(changeType);
Field = field;
ChangeType = changeType;
PreviousValue = previousValue;
CurrentValue = currentValue;
}
public string Field { get; }
public string ChangeType { get; }
public string? PreviousValue { get; }
public string? CurrentValue { get; }
}


@@ -0,0 +1,62 @@
using System;
using System.Collections.Generic;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public sealed class ChangeHistoryRecord
{
public ChangeHistoryRecord(
Guid id,
string sourceName,
string advisoryKey,
Guid documentId,
string documentSha256,
string currentHash,
string? previousHash,
string currentSnapshot,
string? previousSnapshot,
IReadOnlyList<ChangeHistoryFieldChange> changes,
DateTimeOffset capturedAt)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
ArgumentException.ThrowIfNullOrEmpty(documentSha256);
ArgumentException.ThrowIfNullOrEmpty(currentHash);
ArgumentException.ThrowIfNullOrEmpty(currentSnapshot);
ArgumentNullException.ThrowIfNull(changes);
Id = id;
SourceName = sourceName;
AdvisoryKey = advisoryKey;
DocumentId = documentId;
DocumentSha256 = documentSha256;
CurrentHash = currentHash;
PreviousHash = previousHash;
CurrentSnapshot = currentSnapshot;
PreviousSnapshot = previousSnapshot;
Changes = changes;
CapturedAt = capturedAt;
}
public Guid Id { get; }
public string SourceName { get; }
public string AdvisoryKey { get; }
public Guid DocumentId { get; }
public string DocumentSha256 { get; }
public string CurrentHash { get; }
public string? PreviousHash { get; }
public string CurrentSnapshot { get; }
public string? PreviousSnapshot { get; }
public IReadOnlyList<ChangeHistoryFieldChange> Changes { get; }
public DateTimeOffset CapturedAt { get; }
}


@@ -0,0 +1,12 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public interface IChangeHistoryStore
{
Task AddAsync(ChangeHistoryRecord record, CancellationToken cancellationToken);
Task<IReadOnlyList<ChangeHistoryRecord>> GetRecentAsync(string sourceName, string advisoryKey, int limit, CancellationToken cancellationToken);
}


@@ -0,0 +1,53 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.ChangeHistory;
public sealed class MongoChangeHistoryStore : IChangeHistoryStore
{
private readonly IMongoCollection<ChangeHistoryDocument> _collection;
private readonly ILogger<MongoChangeHistoryStore> _logger;
public MongoChangeHistoryStore(IMongoDatabase database, ILogger<MongoChangeHistoryStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<ChangeHistoryDocument>(MongoStorageDefaults.Collections.ChangeHistory);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task AddAsync(ChangeHistoryRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = record.ToDocument();
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Recorded change history for {Source}/{Advisory} with hash {Hash}", record.SourceName, record.AdvisoryKey, record.CurrentHash);
}
public async Task<IReadOnlyList<ChangeHistoryRecord>> GetRecentAsync(string sourceName, string advisoryKey, int limit, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
if (limit <= 0)
{
limit = 10;
}
var documents = await _collection.Find(x => x.SourceName == sourceName && x.AdvisoryKey == advisoryKey)
.SortByDescending(x => x.CapturedAt)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
var records = new List<ChangeHistoryRecord>(documents.Count);
foreach (var document in documents)
{
records.Add(document.ToRecord());
}
return records;
}
}
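
A hedged usage sketch for the store above; the record values below (`"nvd"`, the hashes, the snapshots) are illustrative placeholders computed by the caller, not produced by this store:

```csharp
// Sketch: record a field-level change captured during a re-fetch.
// Assumes `store` is an IChangeHistoryStore resolved from DI.
var record = new ChangeHistoryRecord(
    Guid.NewGuid(),
    sourceName: "nvd",
    advisoryKey: "CVE-2024-0001",
    documentId: Guid.NewGuid(),
    documentSha256: "sha256-placeholder",
    currentHash: "hash-v2",
    previousHash: "hash-v1",
    currentSnapshot: "{ /* canonical JSON, v2 */ }",
    previousSnapshot: "{ /* canonical JSON, v1 */ }",
    changes: new[] { new ChangeHistoryFieldChange("title", "modified", "Old title", "New title") },
    capturedAt: DateTimeOffset.UtcNow);

await store.AddAsync(record, CancellationToken.None);
```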


@@ -0,0 +1,57 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Conflicts;
[BsonIgnoreExtraElements]
public sealed class AdvisoryConflictDocument
{
[BsonId]
public string Id { get; set; } = Guid.Empty.ToString("N");
[BsonElement("vulnerabilityKey")]
public string VulnerabilityKey { get; set; } = string.Empty;
[BsonElement("conflictHash")]
public byte[] ConflictHash { get; set; } = Array.Empty<byte>();
[BsonElement("asOf")]
public DateTime AsOf { get; set; }
[BsonElement("recordedAt")]
public DateTime RecordedAt { get; set; }
[BsonElement("statementIds")]
public List<string> StatementIds { get; set; } = new();
[BsonElement("details")]
public BsonDocument Details { get; set; } = new();
}
internal static class AdvisoryConflictDocumentExtensions
{
public static AdvisoryConflictDocument FromRecord(AdvisoryConflictRecord record)
=> new()
{
Id = record.Id.ToString(),
VulnerabilityKey = record.VulnerabilityKey,
ConflictHash = record.ConflictHash,
AsOf = record.AsOf.UtcDateTime,
RecordedAt = record.RecordedAt.UtcDateTime,
StatementIds = record.StatementIds.Select(static id => id.ToString()).ToList(),
Details = (BsonDocument)record.Details.DeepClone(),
};
public static AdvisoryConflictRecord ToRecord(this AdvisoryConflictDocument document)
=> new(
Guid.Parse(document.Id),
document.VulnerabilityKey,
document.ConflictHash,
DateTime.SpecifyKind(document.AsOf, DateTimeKind.Utc),
DateTime.SpecifyKind(document.RecordedAt, DateTimeKind.Utc),
document.StatementIds.Select(static value => Guid.Parse(value)).ToList(),
(BsonDocument)document.Details.DeepClone());
}


@@ -0,0 +1,14 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Conflicts;
public sealed record AdvisoryConflictRecord(
Guid Id,
string VulnerabilityKey,
byte[] ConflictHash,
DateTimeOffset AsOf,
DateTimeOffset RecordedAt,
IReadOnlyList<Guid> StatementIds,
BsonDocument Details);


@@ -0,0 +1,93 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Conflicts;
public interface IAdvisoryConflictStore
{
ValueTask InsertAsync(
IReadOnlyCollection<AdvisoryConflictRecord> conflicts,
CancellationToken cancellationToken,
IClientSessionHandle? session = null);
ValueTask<IReadOnlyList<AdvisoryConflictRecord>> GetConflictsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken,
IClientSessionHandle? session = null);
}
public sealed class AdvisoryConflictStore : IAdvisoryConflictStore
{
private readonly IMongoCollection<AdvisoryConflictDocument> _collection;
public AdvisoryConflictStore(IMongoDatabase database)
{
ArgumentNullException.ThrowIfNull(database);
_collection = database.GetCollection<AdvisoryConflictDocument>(MongoStorageDefaults.Collections.AdvisoryConflicts);
}
public async ValueTask InsertAsync(
IReadOnlyCollection<AdvisoryConflictRecord> conflicts,
CancellationToken cancellationToken,
IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(conflicts);
if (conflicts.Count == 0)
{
return;
}
var documents = conflicts.Select(AdvisoryConflictDocumentExtensions.FromRecord).ToList();
var options = new InsertManyOptions { IsOrdered = true };
try
{
if (session is null)
{
await _collection.InsertManyAsync(documents, options, cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.InsertManyAsync(session, documents, options, cancellationToken).ConfigureAwait(false);
}
}
catch (MongoBulkWriteException ex) when (ex.WriteErrors.All(error => error.Category == ServerErrorCategory.DuplicateKey))
{
// Conflicts already persisted for this state; ignore duplicates.
}
}
public async ValueTask<IReadOnlyList<AdvisoryConflictRecord>> GetConflictsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken,
IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrWhiteSpace(vulnerabilityKey);
var filter = Builders<AdvisoryConflictDocument>.Filter.Eq(document => document.VulnerabilityKey, vulnerabilityKey);
if (asOf.HasValue)
{
filter &= Builders<AdvisoryConflictDocument>.Filter.Lte(document => document.AsOf, asOf.Value.UtcDateTime);
}
var find = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var documents = await find
.SortByDescending(document => document.AsOf)
.ThenByDescending(document => document.RecordedAt)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(static document => document.ToRecord()).ToList();
}
}
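
Because duplicate-key bulk errors are swallowed, `InsertAsync` is effectively idempotent per conflict identity, assuming the bootstrapper creates the corresponding unique index. A read-side sketch, with `store` assumed to be an `AdvisoryConflictStore` resolved from DI:

```csharp
// Sketch: read the conflict set for a vulnerability as of a point in time.
var conflicts = await store.GetConflictsAsync(
    "CVE-2024-0001",
    asOf: DateTimeOffset.UtcNow,
    CancellationToken.None);

foreach (var conflict in conflicts)
{
    // Most recent asOf first, per the store's descending sort.
    Console.WriteLine($"{conflict.AsOf:O} statements={conflict.StatementIds.Count}");
}
```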


@@ -0,0 +1,131 @@
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
[BsonIgnoreExtraElements]
public sealed class DocumentDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("sourceName")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("uri")]
public string Uri { get; set; } = string.Empty;
[BsonElement("fetchedAt")]
public DateTime FetchedAt { get; set; }
[BsonElement("sha256")]
public string Sha256 { get; set; } = string.Empty;
[BsonElement("status")]
public string Status { get; set; } = string.Empty;
[BsonElement("contentType")]
[BsonIgnoreIfNull]
public string? ContentType { get; set; }
[BsonElement("headers")]
[BsonIgnoreIfNull]
public BsonDocument? Headers { get; set; }
[BsonElement("metadata")]
[BsonIgnoreIfNull]
public BsonDocument? Metadata { get; set; }
[BsonElement("etag")]
[BsonIgnoreIfNull]
public string? Etag { get; set; }
[BsonElement("lastModified")]
[BsonIgnoreIfNull]
public DateTime? LastModified { get; set; }
[BsonElement("expiresAt")]
[BsonIgnoreIfNull]
public DateTime? ExpiresAt { get; set; }
[BsonElement("gridFsId")]
[BsonIgnoreIfNull]
public ObjectId? GridFsId { get; set; }
}
internal static class DocumentDocumentExtensions
{
public static DocumentDocument FromRecord(DocumentRecord record)
{
return new DocumentDocument
{
Id = record.Id.ToString(),
SourceName = record.SourceName,
Uri = record.Uri,
FetchedAt = record.FetchedAt.UtcDateTime,
Sha256 = record.Sha256,
Status = record.Status,
ContentType = record.ContentType,
Headers = ToBson(record.Headers),
Metadata = ToBson(record.Metadata),
Etag = record.Etag,
LastModified = record.LastModified?.UtcDateTime,
GridFsId = record.GridFsId,
ExpiresAt = record.ExpiresAt?.UtcDateTime,
};
}
public static DocumentRecord ToRecord(this DocumentDocument document)
{
IReadOnlyDictionary<string, string>? headers = null;
if (document.Headers is not null)
{
headers = document.Headers.Elements.ToDictionary(
static e => e.Name,
static e => e.Value?.ToString() ?? string.Empty,
StringComparer.Ordinal);
}
IReadOnlyDictionary<string, string>? metadata = null;
if (document.Metadata is not null)
{
metadata = document.Metadata.Elements.ToDictionary(
static e => e.Name,
static e => e.Value?.ToString() ?? string.Empty,
StringComparer.Ordinal);
}
return new DocumentRecord(
Guid.Parse(document.Id),
document.SourceName,
document.Uri,
DateTime.SpecifyKind(document.FetchedAt, DateTimeKind.Utc),
document.Sha256,
document.Status,
document.ContentType,
headers,
metadata,
document.Etag,
document.LastModified.HasValue ? DateTime.SpecifyKind(document.LastModified.Value, DateTimeKind.Utc) : null,
document.GridFsId,
document.ExpiresAt.HasValue ? DateTime.SpecifyKind(document.ExpiresAt.Value, DateTimeKind.Utc) : null);
}
private static BsonDocument? ToBson(IReadOnlyDictionary<string, string>? values)
{
if (values is null)
{
return null;
}
var document = new BsonDocument();
foreach (var kvp in values)
{
document[kvp.Key] = kvp.Value;
}
return document;
}
}


@@ -0,0 +1,22 @@
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
public sealed record DocumentRecord(
Guid Id,
string SourceName,
string Uri,
DateTimeOffset FetchedAt,
string Sha256,
string Status,
string? ContentType,
IReadOnlyDictionary<string, string>? Headers,
IReadOnlyDictionary<string, string>? Metadata,
string? Etag,
DateTimeOffset? LastModified,
ObjectId? GridFsId,
DateTimeOffset? ExpiresAt = null)
{
public DocumentRecord WithStatus(string status)
=> this with { Status = status };
}


@@ -0,0 +1,87 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
public sealed class DocumentStore : IDocumentStore
{
private readonly IMongoCollection<DocumentDocument> _collection;
private readonly ILogger<DocumentStore> _logger;
public DocumentStore(IMongoDatabase database, ILogger<DocumentStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<DocumentDocument>(MongoStorageDefaults.Collections.Document);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<DocumentRecord> UpsertAsync(DocumentRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(record);
var document = DocumentDocumentExtensions.FromRecord(record);
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.SourceName, record.SourceName)
& Builders<DocumentDocument>.Filter.Eq(x => x.Uri, record.Uri);
var options = new FindOneAndReplaceOptions<DocumentDocument>
{
IsUpsert = true,
ReturnDocument = ReturnDocument.After,
};
var replaced = session is null
? await _collection.FindOneAndReplaceAsync(filter, document, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndReplaceAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted document {Source}/{Uri}", record.SourceName, record.Uri);
return (replaced ?? document).ToRecord();
}
public async Task<DocumentRecord?> FindBySourceAndUriAsync(string sourceName, string uri, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
ArgumentException.ThrowIfNullOrEmpty(uri);
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.SourceName, sourceName)
& Builders<DocumentDocument>.Filter.Eq(x => x.Uri, uri);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<DocumentRecord?> FindAsync(Guid id, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var idValue = id.ToString();
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.Id, idValue);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<bool> UpdateStatusAsync(Guid id, string status, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(status);
var update = Builders<DocumentDocument>.Update
.Set(x => x.Status, status)
.Set(x => x.LastModified, DateTime.UtcNow);
var idValue = id.ToString();
var filter = Builders<DocumentDocument>.Filter.Eq(x => x.Id, idValue);
UpdateResult result;
if (session is null)
{
result = await _collection.UpdateOneAsync(filter, update, cancellationToken: cancellationToken).ConfigureAwait(false);
}
else
{
result = await _collection.UpdateOneAsync(session, filter, update, cancellationToken: cancellationToken).ConfigureAwait(false);
}
return result.MatchedCount > 0;
}
}
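
A sketch of the connector fetch path implied by this store's contract; the status strings and surrounding variables (`uri`, `computedSha256`, `ct`) are illustrative assumptions, not part of the API:

```csharp
// Sketch: skip unchanged payloads by SHA, otherwise upsert keyed on (source, uri).
var existing = await documentStore.FindBySourceAndUriAsync("nvd", uri, ct);
if (existing is not null && existing.Sha256 == computedSha256)
{
    // Unchanged payload: just refresh the status/lastModified stamp.
    await documentStore.UpdateStatusAsync(existing.Id, "unchanged", ct);
}
else
{
    var record = new DocumentRecord(
        existing?.Id ?? Guid.NewGuid(),
        "nvd",
        uri,
        DateTimeOffset.UtcNow,
        computedSha256,
        "pending-parse",
        "application/json",
        Headers: null,
        Metadata: null,
        Etag: null,
        LastModified: null,
        GridFsId: null);
    await documentStore.UpsertAsync(record, ct);
}
```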


@@ -0,0 +1,14 @@
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Documents;
public interface IDocumentStore
{
Task<DocumentRecord> UpsertAsync(DocumentRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<DocumentRecord?> FindBySourceAndUriAsync(string sourceName, string uri, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<DocumentRecord?> FindAsync(Guid id, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<bool> UpdateStatusAsync(Guid id, string status, CancellationToken cancellationToken, IClientSessionHandle? session = null);
}


@@ -0,0 +1,50 @@
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
[BsonIgnoreExtraElements]
public sealed class DtoDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("documentId")]
public string DocumentId { get; set; } = string.Empty;
[BsonElement("sourceName")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("schemaVersion")]
public string SchemaVersion { get; set; } = string.Empty;
[BsonElement("payload")]
public BsonDocument Payload { get; set; } = new();
[BsonElement("validatedAt")]
public DateTime ValidatedAt { get; set; }
}
internal static class DtoDocumentExtensions
{
public static DtoDocument FromRecord(DtoRecord record)
=> new()
{
Id = record.Id.ToString(),
DocumentId = record.DocumentId.ToString(),
SourceName = record.SourceName,
SchemaVersion = record.SchemaVersion,
Payload = record.Payload ?? new BsonDocument(),
ValidatedAt = record.ValidatedAt.UtcDateTime,
};
public static DtoRecord ToRecord(this DtoDocument document)
=> new(
Guid.Parse(document.Id),
Guid.Parse(document.DocumentId),
document.SourceName,
document.SchemaVersion,
document.Payload,
DateTime.SpecifyKind(document.ValidatedAt, DateTimeKind.Utc));
}


@@ -0,0 +1,11 @@
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
public sealed record DtoRecord(
Guid Id,
Guid DocumentId,
string SourceName,
string SchemaVersion,
BsonDocument Payload,
DateTimeOffset ValidatedAt);


@@ -0,0 +1,66 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
public sealed class DtoStore : IDtoStore
{
private readonly IMongoCollection<DtoDocument> _collection;
private readonly ILogger<DtoStore> _logger;
public DtoStore(IMongoDatabase database, ILogger<DtoStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<DtoDocument>(MongoStorageDefaults.Collections.Dto);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<DtoRecord> UpsertAsync(DtoRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(record);
var document = DtoDocumentExtensions.FromRecord(record);
var documentId = record.DocumentId.ToString();
var filter = Builders<DtoDocument>.Filter.Eq(x => x.DocumentId, documentId)
& Builders<DtoDocument>.Filter.Eq(x => x.SourceName, record.SourceName);
var options = new FindOneAndReplaceOptions<DtoDocument>
{
IsUpsert = true,
ReturnDocument = ReturnDocument.After,
};
var replaced = session is null
? await _collection.FindOneAndReplaceAsync(filter, document, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndReplaceAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted DTO for {Source}/{DocumentId}", record.SourceName, record.DocumentId);
return (replaced ?? document).ToRecord();
}
public async Task<DtoRecord?> FindByDocumentIdAsync(Guid documentId, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var documentIdValue = documentId.ToString();
var filter = Builders<DtoDocument>.Filter.Eq(x => x.DocumentId, documentIdValue);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<IReadOnlyList<DtoRecord>> GetBySourceAsync(string sourceName, int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
var filter = Builders<DtoDocument>.Filter.Eq(x => x.SourceName, sourceName);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var documents = await query
.SortByDescending(x => x.ValidatedAt)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(static x => x.ToRecord()).ToArray();
}
}


@@ -0,0 +1,12 @@
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Dtos;
public interface IDtoStore
{
Task<DtoRecord> UpsertAsync(DtoRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<DtoRecord?> FindByDocumentIdAsync(Guid documentId, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<IReadOnlyList<DtoRecord>> GetBySourceAsync(string sourceName, int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null);
}


@@ -0,0 +1,224 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.Encodings.Web;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using StellaOps.Concelier.Core.Events;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Storage.Mongo.Conflicts;
using StellaOps.Concelier.Storage.Mongo.Statements;
namespace StellaOps.Concelier.Storage.Mongo.Events;
public sealed class MongoAdvisoryEventRepository : IAdvisoryEventRepository
{
private readonly IAdvisoryStatementStore _statementStore;
private readonly IAdvisoryConflictStore _conflictStore;
public MongoAdvisoryEventRepository(
IAdvisoryStatementStore statementStore,
IAdvisoryConflictStore conflictStore)
{
_statementStore = statementStore ?? throw new ArgumentNullException(nameof(statementStore));
_conflictStore = conflictStore ?? throw new ArgumentNullException(nameof(conflictStore));
}
public async ValueTask InsertStatementsAsync(
IReadOnlyCollection<AdvisoryStatementEntry> statements,
CancellationToken cancellationToken)
{
if (statements is null)
{
throw new ArgumentNullException(nameof(statements));
}
if (statements.Count == 0)
{
return;
}
var records = statements
.Select(static entry =>
{
var payload = BsonDocument.Parse(entry.CanonicalJson);
return new AdvisoryStatementRecord(
entry.StatementId,
entry.VulnerabilityKey,
entry.AdvisoryKey,
entry.StatementHash.ToArray(),
entry.AsOf,
entry.RecordedAt,
payload,
entry.InputDocumentIds.ToArray());
})
.ToList();
await _statementStore.InsertAsync(records, cancellationToken).ConfigureAwait(false);
}
public async ValueTask InsertConflictsAsync(
IReadOnlyCollection<AdvisoryConflictEntry> conflicts,
CancellationToken cancellationToken)
{
if (conflicts is null)
{
throw new ArgumentNullException(nameof(conflicts));
}
if (conflicts.Count == 0)
{
return;
}
var records = conflicts
.Select(static entry =>
{
var payload = BsonDocument.Parse(entry.CanonicalJson);
return new AdvisoryConflictRecord(
entry.ConflictId,
entry.VulnerabilityKey,
entry.ConflictHash.ToArray(),
entry.AsOf,
entry.RecordedAt,
entry.StatementIds.ToArray(),
payload);
})
.ToList();
await _conflictStore.InsertAsync(records, cancellationToken).ConfigureAwait(false);
}
public async ValueTask<IReadOnlyList<AdvisoryStatementEntry>> GetStatementsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken)
{
var records = await _statementStore
.GetStatementsAsync(vulnerabilityKey, asOf, cancellationToken)
.ConfigureAwait(false);
if (records.Count == 0)
{
return Array.Empty<AdvisoryStatementEntry>();
}
var entries = records
.Select(static record =>
{
var advisory = CanonicalJsonSerializer.Deserialize<Advisory>(record.Payload.ToJson());
var canonicalJson = CanonicalJsonSerializer.Serialize(advisory);
return new AdvisoryStatementEntry(
record.Id,
record.VulnerabilityKey,
record.AdvisoryKey,
canonicalJson,
record.StatementHash.ToImmutableArray(),
record.AsOf,
record.RecordedAt,
record.InputDocumentIds.ToImmutableArray());
})
.ToList();
return entries;
}
public async ValueTask<IReadOnlyList<AdvisoryConflictEntry>> GetConflictsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken)
{
var records = await _conflictStore
.GetConflictsAsync(vulnerabilityKey, asOf, cancellationToken)
.ConfigureAwait(false);
if (records.Count == 0)
{
return Array.Empty<AdvisoryConflictEntry>();
}
var entries = records
.Select(static record =>
{
var canonicalJson = Canonicalize(record.Details);
return new AdvisoryConflictEntry(
record.Id,
record.VulnerabilityKey,
canonicalJson,
record.ConflictHash.ToImmutableArray(),
record.AsOf,
record.RecordedAt,
record.StatementIds.ToImmutableArray());
})
.ToList();
return entries;
}
private static readonly JsonWriterOptions CanonicalWriterOptions = new()
{
Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
Indented = false,
SkipValidation = false,
};
private static string Canonicalize(BsonDocument document)
{
using var json = JsonDocument.Parse(document.ToJson());
using var stream = new MemoryStream();
using (var writer = new Utf8JsonWriter(stream, CanonicalWriterOptions))
{
WriteCanonical(json.RootElement, writer);
}
return Encoding.UTF8.GetString(stream.ToArray());
}
private static void WriteCanonical(JsonElement element, Utf8JsonWriter writer)
{
switch (element.ValueKind)
{
case JsonValueKind.Object:
writer.WriteStartObject();
foreach (var property in element.EnumerateObject().OrderBy(static p => p.Name, StringComparer.Ordinal))
{
writer.WritePropertyName(property.Name);
WriteCanonical(property.Value, writer);
}
writer.WriteEndObject();
break;
case JsonValueKind.Array:
writer.WriteStartArray();
foreach (var item in element.EnumerateArray())
{
WriteCanonical(item, writer);
}
writer.WriteEndArray();
break;
case JsonValueKind.String:
writer.WriteStringValue(element.GetString());
break;
case JsonValueKind.Number:
writer.WriteRawValue(element.GetRawText());
break;
case JsonValueKind.True:
writer.WriteBooleanValue(true);
break;
case JsonValueKind.False:
writer.WriteBooleanValue(false);
break;
case JsonValueKind.Null:
writer.WriteNullValue();
break;
default:
writer.WriteRawValue(element.GetRawText());
break;
}
}
}


@@ -0,0 +1,90 @@
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
[BsonIgnoreExtraElements]
public sealed class ExportStateDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("baseExportId")]
public string? BaseExportId { get; set; }
[BsonElement("baseDigest")]
public string? BaseDigest { get; set; }
[BsonElement("lastFullDigest")]
public string? LastFullDigest { get; set; }
[BsonElement("lastDeltaDigest")]
public string? LastDeltaDigest { get; set; }
[BsonElement("exportCursor")]
public string? ExportCursor { get; set; }
[BsonElement("targetRepo")]
public string? TargetRepository { get; set; }
[BsonElement("exporterVersion")]
public string? ExporterVersion { get; set; }
[BsonElement("updatedAt")]
public DateTime UpdatedAt { get; set; }
[BsonElement("files")]
public List<ExportStateFileDocument>? Files { get; set; }
}
public sealed class ExportStateFileDocument
{
[BsonElement("path")]
public string Path { get; set; } = string.Empty;
[BsonElement("length")]
public long Length { get; set; }
[BsonElement("digest")]
public string Digest { get; set; } = string.Empty;
}
internal static class ExportStateDocumentExtensions
{
public static ExportStateDocument FromRecord(ExportStateRecord record)
=> new()
{
Id = record.Id,
BaseExportId = record.BaseExportId,
BaseDigest = record.BaseDigest,
LastFullDigest = record.LastFullDigest,
LastDeltaDigest = record.LastDeltaDigest,
ExportCursor = record.ExportCursor,
TargetRepository = record.TargetRepository,
ExporterVersion = record.ExporterVersion,
UpdatedAt = record.UpdatedAt.UtcDateTime,
Files = record.Files.Select(static file => new ExportStateFileDocument
{
Path = file.Path,
Length = file.Length,
Digest = file.Digest,
}).ToList(),
};
public static ExportStateRecord ToRecord(this ExportStateDocument document)
=> new(
document.Id,
document.BaseExportId,
document.BaseDigest,
document.LastFullDigest,
document.LastDeltaDigest,
document.ExportCursor,
document.TargetRepository,
document.ExporterVersion,
DateTime.SpecifyKind(document.UpdatedAt, DateTimeKind.Utc),
(document.Files ?? new List<ExportStateFileDocument>())
.Where(static entry => !string.IsNullOrWhiteSpace(entry.Path))
.Select(static entry => new ExportFileRecord(entry.Path, entry.Length, entry.Digest))
.ToArray());
}


@@ -0,0 +1,135 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
/// <summary>
/// Helper for exporters to read and persist their export metadata in Mongo-backed storage.
/// </summary>
public sealed class ExportStateManager
{
private readonly IExportStateStore _store;
private readonly TimeProvider _timeProvider;
public ExportStateManager(IExportStateStore store, TimeProvider? timeProvider = null)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
_timeProvider = timeProvider ?? TimeProvider.System;
}
public Task<ExportStateRecord?> GetAsync(string exporterId, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(exporterId);
return _store.FindAsync(exporterId, cancellationToken);
}
public async Task<ExportStateRecord> StoreFullExportAsync(
string exporterId,
string exportId,
string exportDigest,
string? cursor,
string? targetRepository,
string exporterVersion,
bool resetBaseline,
IReadOnlyList<ExportFileRecord> manifest,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(exporterId);
ArgumentException.ThrowIfNullOrEmpty(exportId);
ArgumentException.ThrowIfNullOrEmpty(exportDigest);
ArgumentException.ThrowIfNullOrEmpty(exporterVersion);
manifest ??= Array.Empty<ExportFileRecord>();
var existing = await _store.FindAsync(exporterId, cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
if (existing is null)
{
var resolvedRepository = string.IsNullOrWhiteSpace(targetRepository) ? null : targetRepository;
return await _store.UpsertAsync(
new ExportStateRecord(
exporterId,
BaseExportId: exportId,
BaseDigest: exportDigest,
LastFullDigest: exportDigest,
LastDeltaDigest: null,
ExportCursor: cursor ?? exportDigest,
TargetRepository: resolvedRepository,
ExporterVersion: exporterVersion,
UpdatedAt: now,
Files: manifest),
cancellationToken).ConfigureAwait(false);
}
var repositorySpecified = !string.IsNullOrWhiteSpace(targetRepository);
var resolvedRepo = repositorySpecified ? targetRepository : existing.TargetRepository;
var repositoryChanged = repositorySpecified
&& !string.Equals(existing.TargetRepository, targetRepository, StringComparison.Ordinal);
var shouldResetBaseline =
resetBaseline
|| string.IsNullOrWhiteSpace(existing.BaseExportId)
|| string.IsNullOrWhiteSpace(existing.BaseDigest)
|| repositoryChanged;
var updatedRecord = shouldResetBaseline
? existing with
{
BaseExportId = exportId,
BaseDigest = exportDigest,
LastFullDigest = exportDigest,
LastDeltaDigest = null,
ExportCursor = cursor ?? exportDigest,
TargetRepository = resolvedRepo,
ExporterVersion = exporterVersion,
UpdatedAt = now,
Files = manifest,
}
: existing with
{
LastFullDigest = exportDigest,
LastDeltaDigest = null,
ExportCursor = cursor ?? existing.ExportCursor,
TargetRepository = resolvedRepo,
ExporterVersion = exporterVersion,
UpdatedAt = now,
Files = manifest,
};
return await _store.UpsertAsync(updatedRecord, cancellationToken).ConfigureAwait(false);
}
public async Task<ExportStateRecord> StoreDeltaExportAsync(
string exporterId,
string deltaDigest,
string? cursor,
string exporterVersion,
IReadOnlyList<ExportFileRecord> manifest,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(exporterId);
ArgumentException.ThrowIfNullOrEmpty(deltaDigest);
ArgumentException.ThrowIfNullOrEmpty(exporterVersion);
manifest ??= Array.Empty<ExportFileRecord>();
var existing = await _store.FindAsync(exporterId, cancellationToken).ConfigureAwait(false);
if (existing is null)
{
throw new InvalidOperationException($"Full export state missing for '{exporterId}'.");
}
var now = _timeProvider.GetUtcNow();
var record = existing with
{
LastDeltaDigest = deltaDigest,
ExportCursor = cursor ?? existing.ExportCursor,
ExporterVersion = exporterVersion,
UpdatedAt = now,
Files = manifest,
};
return await _store.UpsertAsync(record, cancellationToken).ConfigureAwait(false);
}
}


@@ -0,0 +1,15 @@
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
public sealed record ExportStateRecord(
string Id,
string? BaseExportId,
string? BaseDigest,
string? LastFullDigest,
string? LastDeltaDigest,
string? ExportCursor,
string? TargetRepository,
string? ExporterVersion,
DateTimeOffset UpdatedAt,
IReadOnlyList<ExportFileRecord> Files);
public sealed record ExportFileRecord(string Path, long Length, string Digest);


@@ -0,0 +1,43 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
public sealed class ExportStateStore : IExportStateStore
{
private readonly IMongoCollection<ExportStateDocument> _collection;
private readonly ILogger<ExportStateStore> _logger;
public ExportStateStore(IMongoDatabase database, ILogger<ExportStateStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<ExportStateDocument>(MongoStorageDefaults.Collections.ExportState);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<ExportStateRecord> UpsertAsync(ExportStateRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = ExportStateDocumentExtensions.FromRecord(record);
var options = new FindOneAndReplaceOptions<ExportStateDocument>
{
IsUpsert = true,
ReturnDocument = ReturnDocument.After,
};
var replaced = await _collection.FindOneAndReplaceAsync<ExportStateDocument, ExportStateDocument>(
x => x.Id == record.Id,
document,
options,
cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Stored export state {StateId}", record.Id);
return (replaced ?? document).ToRecord();
}
public async Task<ExportStateRecord?> FindAsync(string id, CancellationToken cancellationToken)
{
var document = await _collection.Find(x => x.Id == id).FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
}


@@ -0,0 +1,8 @@
namespace StellaOps.Concelier.Storage.Mongo.Exporting;
public interface IExportStateStore
{
Task<ExportStateRecord> UpsertAsync(ExportStateRecord record, CancellationToken cancellationToken);
Task<ExportStateRecord?> FindAsync(string id, CancellationToken cancellationToken);
}


@@ -0,0 +1,15 @@
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public interface ISourceStateRepository
{
Task<SourceStateRecord?> TryGetAsync(string sourceName, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<SourceStateRecord> UpsertAsync(SourceStateRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<SourceStateRecord?> UpdateCursorAsync(string sourceName, BsonDocument cursor, DateTimeOffset completedAt, CancellationToken cancellationToken, IClientSessionHandle? session = null);
Task<SourceStateRecord?> MarkFailureAsync(string sourceName, DateTimeOffset failedAt, TimeSpan? backoff, string? failureReason, CancellationToken cancellationToken, IClientSessionHandle? session = null);
}


@@ -0,0 +1,38 @@
using MongoDB.Bson.Serialization.Attributes;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
[BsonIgnoreExtraElements]
public sealed class JobLeaseDocument
{
[BsonId]
public string Key { get; set; } = string.Empty;
[BsonElement("holder")]
public string Holder { get; set; } = string.Empty;
[BsonElement("acquiredAt")]
public DateTime AcquiredAt { get; set; }
[BsonElement("heartbeatAt")]
public DateTime HeartbeatAt { get; set; }
[BsonElement("leaseMs")]
public long LeaseMs { get; set; }
[BsonElement("ttlAt")]
public DateTime TtlAt { get; set; }
}
internal static class JobLeaseDocumentExtensions
{
public static JobLease ToLease(this JobLeaseDocument document)
=> new(
document.Key,
document.Holder,
DateTime.SpecifyKind(document.AcquiredAt, DateTimeKind.Utc),
DateTime.SpecifyKind(document.HeartbeatAt, DateTimeKind.Utc),
TimeSpan.FromMilliseconds(document.LeaseMs),
DateTime.SpecifyKind(document.TtlAt, DateTimeKind.Utc));
}


@@ -0,0 +1,119 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
[BsonIgnoreExtraElements]
public sealed class JobRunDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("kind")]
public string Kind { get; set; } = string.Empty;
[BsonElement("status")]
public string Status { get; set; } = JobRunStatus.Pending.ToString();
[BsonElement("trigger")]
public string Trigger { get; set; } = string.Empty;
[BsonElement("parameters")]
public BsonDocument Parameters { get; set; } = new();
[BsonElement("parametersHash")]
[BsonIgnoreIfNull]
public string? ParametersHash { get; set; }
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; }
[BsonElement("startedAt")]
[BsonIgnoreIfNull]
public DateTime? StartedAt { get; set; }
[BsonElement("completedAt")]
[BsonIgnoreIfNull]
public DateTime? CompletedAt { get; set; }
[BsonElement("error")]
[BsonIgnoreIfNull]
public string? Error { get; set; }
[BsonElement("timeoutMs")]
[BsonIgnoreIfNull]
public long? TimeoutMs { get; set; }
[BsonElement("leaseMs")]
[BsonIgnoreIfNull]
public long? LeaseMs { get; set; }
}
internal static class JobRunDocumentExtensions
{
public static JobRunDocument FromRequest(JobRunCreateRequest request, Guid id)
{
return new JobRunDocument
{
Id = id.ToString(),
Kind = request.Kind,
Status = JobRunStatus.Pending.ToString(),
Trigger = request.Trigger,
Parameters = request.Parameters is { Count: > 0 }
? BsonDocument.Parse(JsonSerializer.Serialize(request.Parameters))
: new BsonDocument(),
ParametersHash = request.ParametersHash,
CreatedAt = request.CreatedAt.UtcDateTime,
TimeoutMs = request.Timeout?.MillisecondsFromTimespan(),
LeaseMs = request.LeaseDuration?.MillisecondsFromTimespan(),
};
}
public static JobRunSnapshot ToSnapshot(this JobRunDocument document)
{
var parameters = document.Parameters?.ToDictionary() ?? new Dictionary<string, object?>();
return new JobRunSnapshot(
Guid.Parse(document.Id),
document.Kind,
Enum.Parse<JobRunStatus>(document.Status, ignoreCase: true),
DateTime.SpecifyKind(document.CreatedAt, DateTimeKind.Utc),
document.StartedAt.HasValue ? DateTime.SpecifyKind(document.StartedAt.Value, DateTimeKind.Utc) : null,
document.CompletedAt.HasValue ? DateTime.SpecifyKind(document.CompletedAt.Value, DateTimeKind.Utc) : null,
document.Trigger,
document.ParametersHash,
document.Error,
document.TimeoutMs?.MillisecondsToTimespan(),
document.LeaseMs?.MillisecondsToTimespan(),
parameters);
}
public static Dictionary<string, object?> ToDictionary(this BsonDocument document)
{
return document.Elements.ToDictionary(
static element => element.Name,
static element => element.Value switch
{
BsonString s => (object?)s.AsString,
BsonBoolean b => b.AsBoolean,
BsonInt32 i => i.AsInt32,
BsonInt64 l => l.AsInt64,
BsonDouble d => d.AsDouble,
BsonNull => null,
BsonArray array => array.Select(v => v.IsBsonDocument ? ToDictionary(v.AsBsonDocument) : (object?)v.ToString()).ToArray(),
BsonDocument doc => ToDictionary(doc),
_ => element.Value.ToString(),
});
}
private static long MillisecondsFromTimespan(this TimeSpan timeSpan)
=> (long)timeSpan.TotalMilliseconds;
private static TimeSpan MillisecondsToTimespan(this long milliseconds)
=> TimeSpan.FromMilliseconds(milliseconds);
}


@@ -0,0 +1,11 @@
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
public interface IJpFlagStore
{
Task UpsertAsync(JpFlagRecord record, CancellationToken cancellationToken);
Task<JpFlagRecord?> FindAsync(string advisoryKey, CancellationToken cancellationToken);
}


@@ -0,0 +1,54 @@
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
[BsonIgnoreExtraElements]
public sealed class JpFlagDocument
{
[BsonId]
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("sourceName")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("category")]
[BsonIgnoreIfNull]
public string? Category { get; set; }
[BsonElement("vendorStatus")]
[BsonIgnoreIfNull]
public string? VendorStatus { get; set; }
[BsonElement("recordedAt")]
public DateTime RecordedAt { get; set; }
}
internal static class JpFlagDocumentExtensions
{
public static JpFlagDocument FromRecord(JpFlagRecord record)
{
ArgumentNullException.ThrowIfNull(record);
return new JpFlagDocument
{
AdvisoryKey = record.AdvisoryKey,
SourceName = record.SourceName,
Category = record.Category,
VendorStatus = record.VendorStatus,
RecordedAt = record.RecordedAt.UtcDateTime,
};
}
public static JpFlagRecord ToRecord(this JpFlagDocument document)
{
ArgumentNullException.ThrowIfNull(document);
return new JpFlagRecord(
document.AdvisoryKey,
document.SourceName,
document.Category,
document.VendorStatus,
DateTime.SpecifyKind(document.RecordedAt, DateTimeKind.Utc));
}
}


@@ -0,0 +1,15 @@
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
/// <summary>
/// Captures Japan-specific enrichment flags derived from JVN payloads.
/// </summary>
public sealed record JpFlagRecord(
string AdvisoryKey,
string SourceName,
string? Category,
string? VendorStatus,
DateTimeOffset RecordedAt)
{
public JpFlagRecord WithRecordedAt(DateTimeOffset recordedAt)
=> this with { RecordedAt = recordedAt.ToUniversalTime() };
}


@@ -0,0 +1,39 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.JpFlags;
public sealed class JpFlagStore : IJpFlagStore
{
private readonly IMongoCollection<JpFlagDocument> _collection;
private readonly ILogger<JpFlagStore> _logger;
public JpFlagStore(IMongoDatabase database, ILogger<JpFlagStore> logger)
{
ArgumentNullException.ThrowIfNull(database);
ArgumentNullException.ThrowIfNull(logger);
_collection = database.GetCollection<JpFlagDocument>(MongoStorageDefaults.Collections.JpFlags);
_logger = logger;
}
public async Task UpsertAsync(JpFlagRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = JpFlagDocumentExtensions.FromRecord(record);
var filter = Builders<JpFlagDocument>.Filter.Eq(x => x.AdvisoryKey, record.AdvisoryKey);
var options = new ReplaceOptions { IsUpsert = true };
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted jp_flag for {AdvisoryKey}", record.AdvisoryKey);
}
public async Task<JpFlagRecord?> FindAsync(string advisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
var filter = Builders<JpFlagDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
var document = await _collection.Find(filter).FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
}


@@ -0,0 +1,48 @@
# Mongo Schema Migration Playbook
This module owns the persistent shape of Concelier's MongoDB database. Upgrades must be deterministic and safe to run on live replicas. The `MongoMigrationRunner` executes idempotent migrations on startup immediately after the bootstrapper completes its collection and index checks.
## Execution Path
1. `StellaOps.Concelier.WebService` calls `MongoBootstrapper.InitializeAsync()` during startup.
2. Once collections and baseline indexes are ensured, the bootstrapper invokes `MongoMigrationRunner.RunAsync()`.
3. `IMongoMigration` implementations are sorted by `Id` (ordinal comparison) and each is executed exactly once. Completion is recorded in the `schema_migrations` collection.
4. Failures surface during startup and prevent the service from serving traffic, matching our "fail-fast" requirement for storage incompatibilities.
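Steps 3 and 4 can be sketched as the following runner loop. This is an illustrative simplification, not the production `MongoMigrationRunner`; the `schema_migrations` document shape follows the fields listed in the Operator Runbook below.

```csharp
// Illustrative sketch of the migration loop (assumed shape, not the real runner).
foreach (var migration in migrations.OrderBy(m => m.Id, StringComparer.Ordinal))
{
    var alreadyApplied = await schemaMigrations
        .Find(Builders<BsonDocument>.Filter.Eq("_id", migration.Id))
        .AnyAsync(cancellationToken);
    if (alreadyApplied)
    {
        continue; // executed exactly once
    }

    // Any exception propagates to startup, so the service fails fast
    // instead of serving traffic against an incompatible schema.
    await migration.ApplyAsync(database, cancellationToken);

    await schemaMigrations.InsertOneAsync(
        new BsonDocument
        {
            ["_id"] = migration.Id,
            ["description"] = migration.Description,
            ["appliedAt"] = DateTime.UtcNow,
        },
        cancellationToken: cancellationToken);
}
```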
## Creating a Migration
1. Implement `IMongoMigration` under `StellaOps.Concelier.Storage.Mongo.Migrations`. Use a monotonically increasing identifier such as `yyyyMMdd_description`.
2. Keep the body idempotent: query existing state first, drop and re-create indexes only when a mismatch is detected, and avoid multi-document transactions unless strictly required.
3. Add the migration to DI in `ServiceCollectionExtensions` so it flows into the runner.
4. Write an integration test that exercises the migration against a Mongo2Go instance to validate behaviour.
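A minimal migration following these conventions might look like the sketch below. The collection name, index name, and identifier are placeholders; the `IMongoMigration` member shapes mirror the migrations included elsewhere in this commit.

```csharp
using MongoDB.Bson;
using MongoDB.Driver;

namespace StellaOps.Concelier.Storage.Mongo.Migrations;

// Hypothetical example migration; names are placeholders.
public sealed class EnsureExampleIndexMigration : IMongoMigration
{
    public string Id => "20251101_example_index";
    public string Description => "Ensure example collection index exists.";

    public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
    {
        var collection = database.GetCollection<BsonDocument>("example");

        // CreateOneAsync is a no-op when an identical index already exists,
        // which keeps this migration safe to re-run.
        var model = new CreateIndexModel<BsonDocument>(
            Builders<BsonDocument>.IndexKeys.Ascending("exampleKey"),
            new CreateIndexOptions { Name = "example_key_asc" });
        await collection.Indexes.CreateOneAsync(model, cancellationToken: cancellationToken);
    }
}
```

Remember to register the class in `ServiceCollectionExtensions` (step 3) so the runner discovers it.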
## Current Migrations
| Id | Description |
| --- | --- |
| `20241005_document_expiry_indexes` | Ensures `document` collection uses the correct TTL/partial index depending on raw document retention settings. |
| `20241005_gridfs_expiry_indexes` | Aligns the GridFS `documents.files` TTL index with retention settings. |
| `20251019_advisory_event_collections` | Creates/aligns indexes for `advisory_statements` and `advisory_conflicts` collections powering the event log + conflict replay pipeline. |
| `20251028_advisory_raw_idempotency_index` | Applies compound unique index on `(source.vendor, upstream.upstream_id, upstream.content_hash, tenant)` after verifying no duplicates exist. |
| `20251028_advisory_supersedes_backfill` | Renames legacy `advisory` collection to a read-only backup view and backfills `supersedes` chains across `advisory_raw`. |
| `20251028_advisory_raw_validator` | Applies Aggregation-Only Contract JSON schema validator to the `advisory_raw` collection with configurable enforcement level. |
## Operator Runbook
- `schema_migrations` records each applied migration (`_id`, `description`, `appliedAt`). Review this collection when auditing upgrades.
- Prior to applying `20251028_advisory_raw_idempotency_index`, run the duplicate audit script against the target database:
```bash
mongo concelier ops/devops/scripts/check-advisory-raw-duplicates.js --eval 'var LIMIT=200;'
```
Resolve any reported duplicates before rolling out the migration.
- After `20251028_advisory_supersedes_backfill` completes, ensure `db.advisory` reports `type: "view"` and `options.viewOn: "advisory_backup_20251028"`. Supersedes chains can be spot-checked via `db.advisory_raw.find({ supersedes: { $exists: true } }).limit(5)`.
- To re-run a migration in a lab, delete the corresponding document from `schema_migrations` and restart the service. **Do not** do this in production unless the migration body is known to be idempotent and safe.
- When changing retention settings (`RawDocumentRetention`), deploy the new configuration and restart Concelier. The migration runner will adjust indexes on the next boot.
- For the event-log collections (`advisory_statements`, `advisory_conflicts`), rollback is simply `db.advisory_statements.drop()` / `db.advisory_conflicts.drop()` followed by a restart if you must revert to the pre-event-log schema (only in labs). Production rollbacks should instead gate merge features that rely on these collections.
- If migrations fail, restart with `Logging__LogLevel__StellaOps.Concelier.Storage.Mongo.Migrations=Debug` to surface diagnostic output. Remediate underlying index/collection drift before retrying.
## Validating an Upgrade
1. Run `dotnet test --filter MongoMigrationRunnerTests` to exercise integration coverage.
2. In staging, execute `db.schema_migrations.find().sort({_id:1})` to verify applied migrations and timestamps.
3. Inspect index shapes: `db.document.getIndexes()` and `db.documents.files.getIndexes()` for TTL/partial filter alignment.


@@ -0,0 +1,8 @@
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public interface IMergeEventStore
{
Task AppendAsync(MergeEventRecord record, CancellationToken cancellationToken);
Task<IReadOnlyList<MergeEventRecord>> GetRecentAsync(string advisoryKey, int limit, CancellationToken cancellationToken);
}


@@ -0,0 +1,101 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
[BsonIgnoreExtraElements]
public sealed class MergeEventDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("beforeHash")]
public byte[] BeforeHash { get; set; } = Array.Empty<byte>();
[BsonElement("afterHash")]
public byte[] AfterHash { get; set; } = Array.Empty<byte>();
[BsonElement("mergedAt")]
public DateTime MergedAt { get; set; }
[BsonElement("inputDocuments")]
public List<string> InputDocuments { get; set; } = new();
[BsonElement("fieldDecisions")]
[BsonIgnoreIfNull]
public List<MergeFieldDecisionDocument>? FieldDecisions { get; set; }
}
internal static class MergeEventDocumentExtensions
{
public static MergeEventDocument FromRecord(MergeEventRecord record)
=> new()
{
Id = record.Id.ToString(),
AdvisoryKey = record.AdvisoryKey,
BeforeHash = record.BeforeHash,
AfterHash = record.AfterHash,
MergedAt = record.MergedAt.UtcDateTime,
InputDocuments = record.InputDocumentIds.Select(static id => id.ToString()).ToList(),
FieldDecisions = record.FieldDecisions.Count == 0
? null
: record.FieldDecisions.Select(MergeFieldDecisionDocument.FromRecord).ToList(),
};
public static MergeEventRecord ToRecord(this MergeEventDocument document)
=> new(
Guid.Parse(document.Id),
document.AdvisoryKey,
document.BeforeHash,
document.AfterHash,
DateTime.SpecifyKind(document.MergedAt, DateTimeKind.Utc),
document.InputDocuments.Select(static value => Guid.Parse(value)).ToList(),
document.FieldDecisions is null
? Array.Empty<MergeFieldDecision>()
: document.FieldDecisions.Select(static decision => decision.ToRecord()).ToList());
}
[BsonIgnoreExtraElements]
public sealed class MergeFieldDecisionDocument
{
[BsonElement("field")]
public string Field { get; set; } = string.Empty;
[BsonElement("selectedSource")]
[BsonIgnoreIfNull]
public string? SelectedSource { get; set; }
[BsonElement("decisionReason")]
public string DecisionReason { get; set; } = string.Empty;
[BsonElement("selectedModified")]
[BsonIgnoreIfNull]
public DateTime? SelectedModified { get; set; }
[BsonElement("consideredSources")]
public List<string> ConsideredSources { get; set; } = new();
public static MergeFieldDecisionDocument FromRecord(MergeFieldDecision record)
=> new()
{
Field = record.Field,
SelectedSource = record.SelectedSource,
DecisionReason = record.DecisionReason,
SelectedModified = record.SelectedModified?.UtcDateTime,
ConsideredSources = record.ConsideredSources.ToList(),
};
public MergeFieldDecision ToRecord()
=> new(
Field,
SelectedSource,
DecisionReason,
SelectedModified.HasValue ? DateTime.SpecifyKind(SelectedModified.Value, DateTimeKind.Utc) : null,
ConsideredSources);
}


@@ -0,0 +1,10 @@
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public sealed record MergeEventRecord(
Guid Id,
string AdvisoryKey,
byte[] BeforeHash,
byte[] AfterHash,
DateTimeOffset MergedAt,
IReadOnlyList<Guid> InputDocumentIds,
IReadOnlyList<MergeFieldDecision> FieldDecisions);


@@ -0,0 +1,36 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public sealed class MergeEventStore : IMergeEventStore
{
private readonly IMongoCollection<MergeEventDocument> _collection;
private readonly ILogger<MergeEventStore> _logger;
public MergeEventStore(IMongoDatabase database, ILogger<MergeEventStore> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<MergeEventDocument>(MongoStorageDefaults.Collections.MergeEvent);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task AppendAsync(MergeEventRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
var document = MergeEventDocumentExtensions.FromRecord(record);
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Appended merge event {MergeId} for {AdvisoryKey}", record.Id, record.AdvisoryKey);
}
public async Task<IReadOnlyList<MergeEventRecord>> GetRecentAsync(string advisoryKey, int limit, CancellationToken cancellationToken)
{
var cursor = await _collection.Find(x => x.AdvisoryKey == advisoryKey)
.SortByDescending(x => x.MergedAt)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static x => x.ToRecord()).ToArray();
}
}


@@ -0,0 +1,8 @@
namespace StellaOps.Concelier.Storage.Mongo.MergeEvents;
public sealed record MergeFieldDecision(
string Field,
string? SelectedSource,
string DecisionReason,
DateTimeOffset? SelectedModified,
IReadOnlyList<string> ConsideredSources);


@@ -0,0 +1,44 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisoryEventCollectionsMigration : IMongoMigration
{
public string Id => "20251019_advisory_event_collections";
public string Description => "Ensure advisory_statements and advisory_conflicts indexes exist for event log storage.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var statements = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryStatements);
var conflicts = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryConflicts);
var statementIndexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_statements_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("statementHash"),
new CreateIndexOptions { Name = "advisory_statements_statementHash_unique", Unique = true }),
};
await statements.Indexes.CreateManyAsync(statementIndexes, cancellationToken).ConfigureAwait(false);
var conflictIndexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_conflicts_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("conflictHash"),
new CreateIndexOptions { Name = "advisory_conflicts_conflictHash_unique", Unique = true }),
};
await conflicts.Indexes.CreateManyAsync(conflictIndexes, cancellationToken).ConfigureAwait(false);
}
}


@@ -0,0 +1,156 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisoryRawIdempotencyIndexMigration : IMongoMigration
{
private const string IndexName = "advisory_raw_idempotency";
public string Id => "20251028_advisory_raw_idempotency_index";
public string Description => "Ensure advisory_raw collection enforces idempotency via unique compound index.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
await EnsureNoDuplicatesAsync(collection, cancellationToken).ConfigureAwait(false);
var existingIndex = await FindExistingIndexAsync(collection, cancellationToken).ConfigureAwait(false);
if (existingIndex is not null &&
existingIndex.TryGetValue("unique", out var uniqueValue) &&
uniqueValue.ToBoolean() &&
existingIndex.TryGetValue("key", out var keyValue) &&
keyValue is BsonDocument keyDocument &&
KeysMatch(keyDocument))
{
return;
}
if (existingIndex is not null)
{
try
{
await collection.Indexes.DropOneAsync(IndexName, cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.Code == 27)
{
// Index not found; safe to ignore.
}
}
var keys = Builders<BsonDocument>.IndexKeys
.Ascending("source.vendor")
.Ascending("upstream.upstream_id")
.Ascending("upstream.content_hash")
.Ascending("tenant");
var options = new CreateIndexOptions
{
Name = IndexName,
Unique = true,
};
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
private static async Task<BsonDocument?> FindExistingIndexAsync(
IMongoCollection<BsonDocument> collection,
CancellationToken cancellationToken)
{
using var cursor = await collection.Indexes.ListAsync(cancellationToken).ConfigureAwait(false);
var indexes = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
return indexes.FirstOrDefault(doc => doc.TryGetValue("name", out var name) && string.Equals(name.AsString, IndexName, StringComparison.Ordinal));
}
private static async Task EnsureNoDuplicatesAsync(IMongoCollection<BsonDocument> collection, CancellationToken cancellationToken)
{
var pipeline = new[]
{
new BsonDocument("$group", new BsonDocument
{
{
"_id",
new BsonDocument
{
{ "vendor", "$source.vendor" },
{ "upstreamId", "$upstream.upstream_id" },
{ "contentHash", "$upstream.content_hash" },
{ "tenant", "$tenant" },
}
},
{ "count", new BsonDocument("$sum", 1) },
{ "ids", new BsonDocument("$push", "$_id") },
}),
new BsonDocument("$match", new BsonDocument("count", new BsonDocument("$gt", 1))),
new BsonDocument("$limit", 1),
};
var pipelineDefinition = PipelineDefinition<BsonDocument, BsonDocument>.Create(pipeline);
var duplicate = await collection.Aggregate(pipelineDefinition, cancellationToken: cancellationToken)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
if (duplicate is null)
{
return;
}
var keyDocument = duplicate["_id"].AsBsonDocument;
static string Format(BsonValue? value) => value is null || value.IsBsonNull ? "<null>" : value.ToString() ?? "<null>";
var vendor = Format(keyDocument.GetValue("vendor", BsonNull.Value));
var upstreamId = Format(keyDocument.GetValue("upstreamId", BsonNull.Value));
var contentHash = Format(keyDocument.GetValue("contentHash", BsonNull.Value));
var tenant = Format(keyDocument.GetValue("tenant", BsonNull.Value));
BsonArray idArray = duplicate.TryGetValue("ids", out var idsValue) && idsValue is BsonArray array
? array
: new BsonArray();
var ids = new string[idArray.Count];
for (var i = 0; i < idArray.Count; i++)
{
ids[i] = Format(idArray[i]);
}
throw new InvalidOperationException(
$"Cannot create advisory_raw idempotency index because duplicate documents exist for vendor '{vendor}', upstream '{upstreamId}', content hash '{contentHash}', tenant '{tenant}'. Conflicting document ids: {string.Join(", ", ids)}.");
}
private static bool KeysMatch(BsonDocument keyDocument)
{
if (keyDocument.ElementCount != 4)
{
return false;
}
var expected = new[]
{
("source.vendor", 1),
("upstream.upstream_id", 1),
("upstream.content_hash", 1),
("tenant", 1),
};
var index = 0;
foreach (var element in keyDocument.Elements)
{
if (!string.Equals(element.Name, expected[index].Item1, StringComparison.Ordinal))
{
return false;
}
if (!element.Value.IsInt32 || element.Value.AsInt32 != expected[index].Item2)
{
return false;
}
index++;
}
return true;
}
}


@@ -0,0 +1,372 @@
using System.Collections.Generic;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureAdvisoryRawValidatorMigration : IMongoMigration
{
private const string ForbiddenExactPattern = "^(?i)(severity|cvss|cvss_vector|merged_from|consensus_provider|reachability|asset_criticality|risk_score)$";
private const string ForbiddenEffectivePattern = "^(?i)effective_";
private static readonly IReadOnlyList<object> AllBsonTypeNames = new object[]
{
"double",
"string",
"object",
"array",
"binData",
"undefined",
"objectId",
"bool",
"date",
"null",
"regex",
"dbPointer",
"javascript",
"symbol",
"javascriptWithScope",
"int",
"timestamp",
"long",
"decimal",
"minKey",
"maxKey",
};
private readonly MongoStorageOptions _options;
public EnsureAdvisoryRawValidatorMigration(IOptions<MongoStorageOptions> options)
{
ArgumentNullException.ThrowIfNull(options);
_options = options.Value ?? throw new ArgumentNullException(nameof(options.Value));
}
public string Id => "20251028_advisory_raw_validator";
public string Description => "Ensure advisory_raw collection enforces Aggregation-Only Contract schema";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var validatorOptions = _options.AdvisoryRawValidator ?? new MongoCollectionValidatorOptions();
var validator = new BsonDocument("$jsonSchema", BuildSchema());
var command = new BsonDocument
{
{ "collMod", MongoStorageDefaults.Collections.AdvisoryRaw },
{ "validator", validator },
{ "validationLevel", GetValidationLevelString(validatorOptions.Level) },
{ "validationAction", GetValidationActionString(validatorOptions.Action) },
};
try
{
await database.RunCommandAsync<BsonDocument>(command, cancellationToken: cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.Code == 26)
{
var createOptions = new CreateCollectionOptions<BsonDocument>
{
Validator = validator,
ValidationLevel = MapValidationLevel(validatorOptions.Level),
ValidationAction = MapValidationAction(validatorOptions.Action),
};
await database.CreateCollectionAsync(
MongoStorageDefaults.Collections.AdvisoryRaw,
createOptions,
cancellationToken).ConfigureAwait(false);
}
}
private static BsonDocument BuildSchema()
{
var schema = new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "tenant", "source", "upstream", "content", "linkset" }) },
{ "properties", BuildTopLevelProperties() },
{ "patternProperties", BuildForbiddenPatterns() },
};
return schema;
}
private static BsonDocument BuildTopLevelProperties()
{
var properties = new BsonDocument
{
{ "_id", new BsonDocument("bsonType", "string") },
{
"tenant",
new BsonDocument
{
{ "bsonType", "string" },
{ "minLength", 1 },
}
},
{ "source", BuildSourceSchema() },
{ "upstream", BuildUpstreamSchema() },
{ "content", BuildContentSchema() },
{ "identifiers", BuildIdentifiersSchema() },
{ "linkset", BuildLinksetSchema() },
{ "supersedes", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{
"created_at",
new BsonDocument
{
{ "bsonType", new BsonArray(new object[] { "date", "string", "null" }) },
}
},
{
"ingested_at",
new BsonDocument
{
{ "bsonType", new BsonArray(new object[] { "date", "string", "null" }) },
}
},
};
return properties;
}
private static BsonDocument BuildSourceSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "vendor", "connector", "version" }) },
{
"properties",
new BsonDocument
{
{ "vendor", new BsonDocument("bsonType", "string") },
{ "connector", new BsonDocument("bsonType", "string") },
{ "version", new BsonDocument("bsonType", "string") },
{ "stream", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildUpstreamSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "upstream_id", "retrieved_at", "content_hash", "signature", "provenance" }) },
{
"properties",
new BsonDocument
{
{ "upstream_id", new BsonDocument("bsonType", "string") },
{ "document_version", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "retrieved_at", new BsonDocument("bsonType", new BsonArray(new object[] { "date", "string" })) },
{ "content_hash", new BsonDocument("bsonType", "string") },
{ "signature", BuildSignatureSchema() },
{ "provenance", BuildProvenanceSchema() },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildSignatureSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "present" }) },
{
"properties",
new BsonDocument
{
{ "present", new BsonDocument("bsonType", "bool") },
{ "format", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "key_id", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "sig", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "certificate", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{ "digest", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildProvenanceSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "additionalProperties", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
};
}
private static BsonDocument BuildContentSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "format", "raw" }) },
{
"properties",
new BsonDocument
{
{ "format", new BsonDocument("bsonType", "string") },
{ "spec_version", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
{
"raw",
new BsonDocument
{
{ "bsonType", new BsonArray(new object[]
{
"object",
"array",
"string",
"bool",
"double",
"int",
"long",
"decimal",
})
},
}
},
{ "encoding", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildIdentifiersSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "aliases", "primary" }) },
{
"properties",
new BsonDocument
{
{ "aliases", CreateStringArraySchema() },
{ "primary", new BsonDocument("bsonType", "string") },
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument BuildLinksetSchema()
{
return new BsonDocument
{
{ "bsonType", "object" },
{
"properties",
new BsonDocument
{
{ "aliases", CreateStringArraySchema() },
{ "purls", CreateStringArraySchema() },
{ "cpes", CreateStringArraySchema() },
{
"references",
new BsonDocument
{
{ "bsonType", "array" },
{
"items",
new BsonDocument
{
{ "bsonType", "object" },
{ "required", new BsonArray(new[] { "type", "url" }) },
{
"properties",
new BsonDocument
{
{ "type", new BsonDocument("bsonType", "string") },
{ "url", new BsonDocument("bsonType", "string") },
{ "source", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
{ "additionalProperties", false },
}
}
}
},
{ "reconciled_from", CreateStringArraySchema() },
{
"notes",
new BsonDocument
{
{ "bsonType", "object" },
{ "additionalProperties", new BsonDocument("bsonType", new BsonArray(new object[] { "string", "null" })) },
}
},
}
},
{ "additionalProperties", false },
};
}
private static BsonDocument CreateStringArraySchema()
{
return new BsonDocument
{
{ "bsonType", "array" },
{ "items", new BsonDocument("bsonType", "string") },
};
}
private static BsonDocument BuildForbiddenPatterns()
{
return new BsonDocument
{
{ ForbiddenExactPattern, CreateForbiddenPattern("Derived and normalized fields are forbidden by the Aggregation-Only Contract.") },
{ ForbiddenEffectivePattern, CreateForbiddenPattern("Fields starting with 'effective_' must not be persisted in advisory_raw.") },
};
}
private static BsonDocument CreateForbiddenPattern(string description)
{
return new BsonDocument
{
{ "description", description },
{ "not", new BsonDocument("bsonType", new BsonArray(AllBsonTypeNames)) },
};
}
private static string GetValidationLevelString(MongoValidationLevel level) => level switch
{
MongoValidationLevel.Off => "off",
MongoValidationLevel.Moderate => "moderate",
MongoValidationLevel.Strict => "strict",
_ => "moderate",
};
private static string GetValidationActionString(MongoValidationAction action) => action switch
{
MongoValidationAction.Warn => "warn",
MongoValidationAction.Error => "error",
_ => "warn",
};
private static DocumentValidationLevel MapValidationLevel(MongoValidationLevel level) => level switch
{
MongoValidationLevel.Off => DocumentValidationLevel.Off,
MongoValidationLevel.Moderate => DocumentValidationLevel.Moderate,
MongoValidationLevel.Strict => DocumentValidationLevel.Strict,
_ => DocumentValidationLevel.Moderate,
};
private static DocumentValidationAction MapValidationAction(MongoValidationAction action) => action switch
{
MongoValidationAction.Warn => DocumentValidationAction.Warn,
MongoValidationAction.Error => DocumentValidationAction.Error,
_ => DocumentValidationAction.Warn,
};
}


@@ -0,0 +1,242 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class EnsureAdvisorySupersedesBackfillMigration : IMongoMigration
{
private const string BackupCollectionName = "advisory_backup_20251028";
private const string SupersedesMigrationId = "20251028_advisory_supersedes_backfill";
public string Id => SupersedesMigrationId;
public string Description => "Backfill advisory_raw supersedes chains and replace legacy advisory collection with read-only view.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
await EnsureLegacyAdvisoryViewAsync(database, cancellationToken).ConfigureAwait(false);
await BackfillSupersedesAsync(database, cancellationToken).ConfigureAwait(false);
}
private static async Task EnsureLegacyAdvisoryViewAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var advisoryInfo = await GetCollectionInfoAsync(database, MongoStorageDefaults.Collections.Advisory, cancellationToken).ConfigureAwait(false);
var backupInfo = await GetCollectionInfoAsync(database, BackupCollectionName, cancellationToken).ConfigureAwait(false);
if (advisoryInfo is not null && !IsView(advisoryInfo))
{
if (backupInfo is null)
{
await RenameCollectionAsync(database, MongoStorageDefaults.Collections.Advisory, BackupCollectionName, cancellationToken).ConfigureAwait(false);
backupInfo = await GetCollectionInfoAsync(database, BackupCollectionName, cancellationToken).ConfigureAwait(false);
}
else
{
await database.DropCollectionAsync(MongoStorageDefaults.Collections.Advisory, cancellationToken).ConfigureAwait(false);
}
advisoryInfo = null;
}
if (backupInfo is null)
{
await database.CreateCollectionAsync(BackupCollectionName, cancellationToken: cancellationToken).ConfigureAwait(false);
}
if (advisoryInfo is null)
{
await CreateViewAsync(database, MongoStorageDefaults.Collections.Advisory, BackupCollectionName, cancellationToken).ConfigureAwait(false);
}
else if (!ViewTargets(advisoryInfo, BackupCollectionName))
{
await database.DropCollectionAsync(MongoStorageDefaults.Collections.Advisory, cancellationToken).ConfigureAwait(false);
await CreateViewAsync(database, MongoStorageDefaults.Collections.Advisory, BackupCollectionName, cancellationToken).ConfigureAwait(false);
}
}
private static async Task BackfillSupersedesAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
var pipeline = new EmptyPipelineDefinition<BsonDocument>()
.Group(new BsonDocument
{
{
"_id",
new BsonDocument
{
{ "tenant", "$tenant" },
{ "vendor", "$source.vendor" },
{ "upstreamId", "$upstream.upstream_id" },
}
},
{
"documents",
new BsonDocument("$push", new BsonDocument
{
{ "_id", "$_id" },
{ "retrievedAt", "$upstream.retrieved_at" },
{ "contentHash", "$upstream.content_hash" },
{ "supersedes", "$supersedes" },
})
}
})
.Match(new BsonDocument("documents.1", new BsonDocument("$exists", true)));
using var cursor = await collection.AggregateAsync(pipeline, cancellationToken: cancellationToken).ConfigureAwait(false);
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var group in cursor.Current)
{
if (!group.TryGetValue("documents", out var documentsValue) || documentsValue is not BsonArray documentsArray || documentsArray.Count == 0)
{
continue;
}
var ordered = documentsArray
.Select(x => x.AsBsonDocument)
.Select(x => new AdvisoryRawRecord(
x.GetValue("_id").AsString,
GetDateTime(x, "retrievedAt"),
x.TryGetValue("supersedes", out var sup) ? sup : BsonNull.Value))
.OrderBy(record => record.RetrievedAt)
.ThenBy(record => record.Id, StringComparer.Ordinal)
.ToArray();
if (ordered.Length <= 1)
{
continue;
}
var updates = new List<WriteModel<BsonDocument>>(ordered.Length);
for (var index = 0; index < ordered.Length; index++)
{
var expectedSupersedes = index == 0 ? BsonNull.Value : BsonValue.Create(ordered[index - 1].Id);
var current = ordered[index];
if (AreSupersedesEqual(current.Supersedes, expectedSupersedes))
{
continue;
}
var filter = Builders<BsonDocument>.Filter.Eq("_id", current.Id);
var update = Builders<BsonDocument>.Update.Set("supersedes", expectedSupersedes);
updates.Add(new UpdateOneModel<BsonDocument>(filter, update));
}
if (updates.Count > 0)
{
await collection.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
}
private static async Task<BsonDocument?> GetCollectionInfoAsync(IMongoDatabase database, string name, CancellationToken cancellationToken)
{
var command = new BsonDocument
{
{ "listCollections", 1 },
{ "filter", new BsonDocument("name", name) },
};
var result = await database.RunCommandAsync<BsonDocument>(command, cancellationToken: cancellationToken).ConfigureAwait(false);
var batch = result["cursor"]["firstBatch"].AsBsonArray;
return batch.Count > 0 ? batch[0].AsBsonDocument : null;
}
private static bool IsView(BsonDocument collectionInfo)
=> string.Equals(collectionInfo.GetValue("type", BsonString.Empty).AsString, "view", StringComparison.OrdinalIgnoreCase);
private static bool ViewTargets(BsonDocument collectionInfo, string target)
{
if (!IsView(collectionInfo))
{
return false;
}
if (!collectionInfo.TryGetValue("options", out var options) || options is not BsonDocument optionsDocument)
{
return false;
}
return optionsDocument.TryGetValue("viewOn", out var viewOn) && string.Equals(viewOn.AsString, target, StringComparison.Ordinal);
}
private static async Task RenameCollectionAsync(IMongoDatabase database, string source, string destination, CancellationToken cancellationToken)
{
var admin = database.Client.GetDatabase("admin");
var renameCommand = new BsonDocument
{
{ "renameCollection", $"{database.DatabaseNamespace.DatabaseName}.{source}" },
{ "to", $"{database.DatabaseNamespace.DatabaseName}.{destination}" },
{ "dropTarget", false },
};
try
{
await admin.RunCommandAsync<BsonDocument>(renameCommand, cancellationToken: cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.Code == 26)
{
// Source namespace not found; ignore.
}
catch (MongoCommandException ex) when (ex.Code == 48)
{
// Target namespace exists; drop the stale destination and retry the rename.
await database.DropCollectionAsync(destination, cancellationToken).ConfigureAwait(false);
await admin.RunCommandAsync<BsonDocument>(renameCommand, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
private static async Task CreateViewAsync(IMongoDatabase database, string viewName, string sourceName, CancellationToken cancellationToken)
{
var createCommand = new BsonDocument
{
{ "create", viewName },
{ "viewOn", sourceName },
};
await database.RunCommandAsync<BsonDocument>(createCommand, cancellationToken: cancellationToken).ConfigureAwait(false);
}
private static DateTime GetDateTime(BsonDocument document, string fieldName)
{
if (!document.TryGetValue(fieldName, out var value))
{
return DateTime.MinValue;
}
return value.BsonType switch
{
BsonType.DateTime => value.ToUniversalTime(),
BsonType.String when DateTime.TryParse(value.AsString, out var parsed) => parsed.ToUniversalTime(),
_ => DateTime.MinValue,
};
}
private static bool AreSupersedesEqual(BsonValue? left, BsonValue? right)
{
if (left is null || left.IsBsonNull)
{
return right is null || right.IsBsonNull;
}
if (right is null || right.IsBsonNull)
{
// left is known to be non-null here, so the values differ.
return false;
}
return left.Equals(right);
}
private sealed record AdvisoryRawRecord(string Id, DateTime RetrievedAt, BsonValue Supersedes);
}
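
The backfill above orders each (tenant, vendor, upstream_id) group by retrieval time, with ordinal id as the tie-breaker, and links every document to its immediate predecessor. A self-contained sketch of that chain rule, using hypothetical ids and timestamps:

```csharp
using System;
using System.Linq;

// Sketch of the supersedes chain rule: sort by retrievedAt (ties broken
// by id, ordinal), then each entry supersedes the one before it; the
// oldest entry in the group has no predecessor.
var docs = new[]
{
    (Id: "raw:2", RetrievedAt: new DateTime(2025, 10, 2, 0, 0, 0, DateTimeKind.Utc)),
    (Id: "raw:1", RetrievedAt: new DateTime(2025, 10, 1, 0, 0, 0, DateTimeKind.Utc)),
};
var ordered = docs
    .OrderBy(d => d.RetrievedAt)
    .ThenBy(d => d.Id, StringComparer.Ordinal)
    .ToArray();
for (var i = 0; i < ordered.Length; i++)
{
    var supersedes = i == 0 ? "<null>" : ordered[i - 1].Id;
    Console.WriteLine($"{ordered[i].Id} supersedes {supersedes}");
    // Prints: "raw:1 supersedes <null>", then "raw:2 supersedes raw:1".
}
```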


@@ -0,0 +1,146 @@
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureDocumentExpiryIndexesMigration : IMongoMigration
{
private readonly MongoStorageOptions _options;
public EnsureDocumentExpiryIndexesMigration(IOptions<MongoStorageOptions> options)
{
ArgumentNullException.ThrowIfNull(options);
_options = options.Value;
}
public string Id => "20241005_document_expiry_indexes";
public string Description => "Ensure document.expiresAt index matches configured retention";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var needsTtl = _options.RawDocumentRetention > TimeSpan.Zero;
var collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Document);
using var cursor = await collection.Indexes.ListAsync(cancellationToken).ConfigureAwait(false);
var indexes = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
var ttlIndex = indexes.FirstOrDefault(x => TryGetName(x, out var name) && string.Equals(name, "document_expiresAt_ttl", StringComparison.Ordinal));
var nonTtlIndex = indexes.FirstOrDefault(x => TryGetName(x, out var name) && string.Equals(name, "document_expiresAt", StringComparison.Ordinal));
if (needsTtl)
{
var shouldRebuild = ttlIndex is null || !IndexMatchesTtlExpectations(ttlIndex);
if (shouldRebuild)
{
if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
if (nonTtlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt", cancellationToken).ConfigureAwait(false);
}
var options = new CreateIndexOptions<BsonDocument>
{
Name = "document_expiresAt_ttl",
ExpireAfter = TimeSpan.Zero,
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("expiresAt", true),
};
var keys = Builders<BsonDocument>.IndexKeys.Ascending("expiresAt");
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
else if (nonTtlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt", cancellationToken).ConfigureAwait(false);
}
}
else
{
if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
var shouldRebuild = nonTtlIndex is null || !IndexMatchesNonTtlExpectations(nonTtlIndex);
if (shouldRebuild)
{
if (nonTtlIndex is not null)
{
await collection.Indexes.DropOneAsync("document_expiresAt", cancellationToken).ConfigureAwait(false);
}
var options = new CreateIndexOptions<BsonDocument>
{
Name = "document_expiresAt",
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("expiresAt", true),
};
var keys = Builders<BsonDocument>.IndexKeys.Ascending("expiresAt");
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
private static bool IndexMatchesTtlExpectations(BsonDocument index)
{
if (!index.TryGetValue("expireAfterSeconds", out var expireAfter) || expireAfter.ToDouble() != 0)
{
return false;
}
if (!index.TryGetValue("partialFilterExpression", out var partialFilter) || partialFilter is not BsonDocument partialDoc)
{
return false;
}
if (!partialDoc.TryGetValue("expiresAt", out var expiresAtRule) || expiresAtRule is not BsonDocument expiresAtDoc)
{
return false;
}
return expiresAtDoc.Contains("$exists") && expiresAtDoc["$exists"].ToBoolean();
}
private static bool IndexMatchesNonTtlExpectations(BsonDocument index)
{
if (index.Contains("expireAfterSeconds"))
{
return false;
}
if (!index.TryGetValue("partialFilterExpression", out var partialFilter) || partialFilter is not BsonDocument partialDoc)
{
return false;
}
if (!partialDoc.TryGetValue("expiresAt", out var expiresAtRule) || expiresAtRule is not BsonDocument expiresAtDoc)
{
return false;
}
return expiresAtDoc.Contains("$exists") && expiresAtDoc["$exists"].ToBoolean();
}
private static bool TryGetName(BsonDocument index, out string name)
{
if (index.TryGetValue("name", out var value) && value.IsString)
{
name = value.AsString;
return true;
}
name = string.Empty;
return false;
}
}


@@ -0,0 +1,95 @@
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
internal sealed class EnsureGridFsExpiryIndexesMigration : IMongoMigration
{
private readonly MongoStorageOptions _options;
public EnsureGridFsExpiryIndexesMigration(IOptions<MongoStorageOptions> options)
{
ArgumentNullException.ThrowIfNull(options);
_options = options.Value;
}
public string Id => "20241005_gridfs_expiry_indexes";
public string Description => "Ensure GridFS metadata.expiresAt TTL index reflects retention settings";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var needsTtl = _options.RawDocumentRetention > TimeSpan.Zero;
var collection = database.GetCollection<BsonDocument>("documents.files");
using var cursor = await collection.Indexes.ListAsync(cancellationToken).ConfigureAwait(false);
var indexes = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
var ttlIndex = indexes.FirstOrDefault(x => TryGetName(x, out var name) && string.Equals(name, "gridfs_files_expiresAt_ttl", StringComparison.Ordinal));
if (needsTtl)
{
var shouldRebuild = ttlIndex is null || !IndexMatchesTtlExpectations(ttlIndex);
if (shouldRebuild)
{
if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("gridfs_files_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
var keys = Builders<BsonDocument>.IndexKeys.Ascending("metadata.expiresAt");
var options = new CreateIndexOptions<BsonDocument>
{
Name = "gridfs_files_expiresAt_ttl",
ExpireAfter = TimeSpan.Zero,
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("metadata.expiresAt", true),
};
await collection.Indexes.CreateOneAsync(new CreateIndexModel<BsonDocument>(keys, options), cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
else if (ttlIndex is not null)
{
await collection.Indexes.DropOneAsync("gridfs_files_expiresAt_ttl", cancellationToken).ConfigureAwait(false);
}
}
private static bool IndexMatchesTtlExpectations(BsonDocument index)
{
if (!index.TryGetValue("expireAfterSeconds", out var expireAfter) || expireAfter.ToDouble() != 0)
{
return false;
}
if (!index.TryGetValue("partialFilterExpression", out var partialFilter) || partialFilter is not BsonDocument partialDoc)
{
return false;
}
if (!partialDoc.TryGetValue("metadata.expiresAt", out var expiresAtRule) || expiresAtRule is not BsonDocument expiresAtDoc)
{
return false;
}
return expiresAtDoc.Contains("$exists") && expiresAtDoc["$exists"].ToBoolean();
}
private static bool TryGetName(BsonDocument index, out string name)
{
if (index.TryGetValue("name", out var value) && value.IsString)
{
name = value.AsString;
return true;
}
name = string.Empty;
return false;
}
}


@@ -0,0 +1,24 @@
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Represents a single, idempotent MongoDB migration.
/// </summary>
public interface IMongoMigration
{
/// <summary>
/// Unique identifier for the migration. Sorting is performed using ordinal comparison.
/// </summary>
string Id { get; }
/// <summary>
/// Short description surfaced in logs to aid runbooks.
/// </summary>
string Description { get; }
/// <summary>
/// Executes the migration.
/// </summary>
Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken);
}
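
A minimal implementation sketch of this contract. The id, description, and class name are illustrative and not part of the real migration set:

```csharp
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;

public sealed class ExampleNoOpMigration : IMongoMigration
{
    // Ids are compared ordinally by the runner, so a date prefix keeps
    // execution order stable across releases.
    public string Id => "20990101_example_noop";

    public string Description => "Illustrative no-op migration.";

    // Real migrations must be idempotent: ApplyAsync may run again if the
    // process dies before the runner records the migration as applied.
    public Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
        => Task.CompletedTask;
}
```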


@@ -0,0 +1,18 @@
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
[BsonIgnoreExtraElements]
internal sealed class MongoMigrationDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("description")]
[BsonIgnoreIfNull]
public string? Description { get; set; }
[BsonElement("appliedAt")]
public DateTime AppliedAtUtc { get; set; }
}


@@ -0,0 +1,102 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Executes pending schema migrations tracked inside MongoDB to keep upgrades deterministic.
/// </summary>
public sealed class MongoMigrationRunner
{
private readonly IMongoDatabase _database;
private readonly IReadOnlyList<IMongoMigration> _migrations;
private readonly ILogger<MongoMigrationRunner> _logger;
private readonly TimeProvider _timeProvider;
public MongoMigrationRunner(
IMongoDatabase database,
IEnumerable<IMongoMigration> migrations,
ILogger<MongoMigrationRunner> logger,
TimeProvider? timeProvider = null)
{
_database = database ?? throw new ArgumentNullException(nameof(database));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_timeProvider = timeProvider ?? TimeProvider.System;
_migrations = (migrations ?? throw new ArgumentNullException(nameof(migrations)))
.OrderBy(m => m.Id, StringComparer.Ordinal)
.ToArray();
}
public async Task RunAsync(CancellationToken cancellationToken)
{
if (_migrations.Count == 0)
{
return;
}
var collection = _database.GetCollection<MongoMigrationDocument>(MongoStorageDefaults.Collections.Migrations);
await EnsureCollectionExistsAsync(_database, cancellationToken).ConfigureAwait(false);
var appliedIds = await LoadAppliedMigrationIdsAsync(collection, cancellationToken).ConfigureAwait(false);
foreach (var migration in _migrations)
{
if (appliedIds.Contains(migration.Id, StringComparer.Ordinal))
{
continue;
}
_logger.LogInformation("Applying Mongo migration {MigrationId}: {Description}", migration.Id, migration.Description);
try
{
await migration.ApplyAsync(_database, cancellationToken).ConfigureAwait(false);
var document = new MongoMigrationDocument
{
Id = migration.Id,
Description = string.IsNullOrWhiteSpace(migration.Description) ? null : migration.Description,
AppliedAtUtc = _timeProvider.GetUtcNow().UtcDateTime,
};
await collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Mongo migration {MigrationId} applied", migration.Id);
}
catch (Exception ex)
{
_logger.LogError(ex, "Mongo migration {MigrationId} failed", migration.Id);
throw;
}
}
}
private static async Task<HashSet<string>> LoadAppliedMigrationIdsAsync(
IMongoCollection<MongoMigrationDocument> collection,
CancellationToken cancellationToken)
{
using var cursor = await collection.FindAsync(FilterDefinition<MongoMigrationDocument>.Empty, cancellationToken: cancellationToken).ConfigureAwait(false);
var applied = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
var set = new HashSet<string>(StringComparer.Ordinal);
foreach (var document in applied)
{
if (!string.IsNullOrWhiteSpace(document.Id))
{
set.Add(document.Id);
}
}
return set;
}
private static async Task EnsureCollectionExistsAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
using var cursor = await database.ListCollectionNamesAsync(cancellationToken: cancellationToken).ConfigureAwait(false);
var names = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
if (!names.Contains(MongoStorageDefaults.Collections.Migrations, StringComparer.Ordinal))
{
await database.CreateCollectionAsync(MongoStorageDefaults.Collections.Migrations, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
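One plausible way to wire the runner: register each migration as an `IMongoMigration` in DI and run the pending set once at startup, before the service accepts traffic. The registration and resolution calls below are assumptions about the hosting composition root, not code from this module:

```csharp
// Hypothetical DI wiring; the actual composition root may differ.
services.AddSingleton<IMongoMigration, SemVerStyleBackfillMigration>();
services.AddSingleton<MongoMigrationRunner>();

// During startup, before /ready can report healthy:
var runner = provider.GetRequiredService<MongoMigrationRunner>();
await runner.RunAsync(cancellationToken);
```

Because applied migration ids are persisted in the migrations collection, re-running the same set on every boot is cheap and deterministic.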


@@ -0,0 +1,81 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Storage.Mongo.Advisories;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
public sealed class SemVerStyleBackfillMigration : IMongoMigration
{
private readonly MongoStorageOptions _options;
private readonly ILogger<SemVerStyleBackfillMigration> _logger;
public SemVerStyleBackfillMigration(IOptions<MongoStorageOptions> options, ILogger<SemVerStyleBackfillMigration> logger)
{
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string Id => "20251011-semver-style-backfill";
public string Description => "Populate advisory.normalizedVersions for existing documents when SemVer style storage is enabled.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
if (!_options.EnableSemVerStyle)
{
_logger.LogInformation("SemVer style flag disabled; skipping migration {MigrationId}.", Id);
return;
}
var collection = database.GetCollection<AdvisoryDocument>(MongoStorageDefaults.Collections.Advisory);
var filter = Builders<AdvisoryDocument>.Filter.Or(
Builders<AdvisoryDocument>.Filter.Exists(doc => doc.NormalizedVersions, false),
Builders<AdvisoryDocument>.Filter.Where(doc => doc.NormalizedVersions == null || doc.NormalizedVersions.Count == 0));
var batchSize = Math.Max(25, _options.BackfillBatchSize);
string? lastKey = null;
while (true)
{
// Page forward by advisoryKey: a document whose update leaves the field unset
// still matches the filter, so re-issuing the same query would loop forever.
var pageFilter = lastKey is null
? filter
: Builders<AdvisoryDocument>.Filter.And(filter, Builders<AdvisoryDocument>.Filter.Gt(doc => doc.AdvisoryKey, lastKey));
var pending = await collection.Find(pageFilter)
.SortBy(doc => doc.AdvisoryKey)
.Limit(batchSize)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
if (pending.Count == 0)
{
break;
}
lastKey = pending[^1].AdvisoryKey;
var updates = new List<WriteModel<AdvisoryDocument>>(pending.Count);
foreach (var document in pending)
{
var advisory = CanonicalJsonSerializer.Deserialize<Advisory>(document.Payload.ToJson());
var normalized = NormalizedVersionDocumentFactory.Create(advisory);
if (normalized is null || normalized.Count == 0)
{
updates.Add(new UpdateOneModel<AdvisoryDocument>(
Builders<AdvisoryDocument>.Filter.Eq(doc => doc.AdvisoryKey, document.AdvisoryKey),
Builders<AdvisoryDocument>.Update.Unset(doc => doc.NormalizedVersions)));
continue;
}
updates.Add(new UpdateOneModel<AdvisoryDocument>(
Builders<AdvisoryDocument>.Filter.Eq(doc => doc.AdvisoryKey, document.AdvisoryKey),
Builders<AdvisoryDocument>.Update.Set(doc => doc.NormalizedVersions, normalized)));
}
if (updates.Count > 0)
{
await collection.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
}
}
}
}
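For illustration, a backfilled advisory document gains a `normalizedVersions` array next to its payload. The field names below are taken from the bootstrapper's index definitions (`packageId`, `scheme`, `type`, `value`); the full shape is defined by `NormalizedVersionDocumentFactory` and may include more fields:

```csharp
using MongoDB.Bson;

// Illustrative shape only; values are made up.
var example = new BsonDocument
{
    { "advisoryKey", "CVE-2025-0001" },
    { "normalizedVersions", new BsonArray
        {
            new BsonDocument
            {
                { "packageId", "pkg:npm/example" },
                { "scheme", "semver" },
                { "type", "range" },
                { "value", ">=1.0.0 <1.2.3" },
            }
        }
    },
};
```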


@@ -0,0 +1,392 @@
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Driver;
using StellaOps.Concelier.Storage.Mongo.Migrations;
namespace StellaOps.Concelier.Storage.Mongo;
/// <summary>
/// Ensures required collections and indexes exist before the service begins processing.
/// </summary>
public sealed class MongoBootstrapper
{
private const string RawDocumentBucketName = "documents";
private static readonly string[] RequiredCollections =
{
MongoStorageDefaults.Collections.Source,
MongoStorageDefaults.Collections.SourceState,
MongoStorageDefaults.Collections.Document,
MongoStorageDefaults.Collections.Dto,
MongoStorageDefaults.Collections.Advisory,
MongoStorageDefaults.Collections.AdvisoryRaw,
MongoStorageDefaults.Collections.Alias,
MongoStorageDefaults.Collections.Affected,
MongoStorageDefaults.Collections.Reference,
MongoStorageDefaults.Collections.KevFlag,
MongoStorageDefaults.Collections.RuFlags,
MongoStorageDefaults.Collections.JpFlags,
MongoStorageDefaults.Collections.PsirtFlags,
MongoStorageDefaults.Collections.MergeEvent,
MongoStorageDefaults.Collections.ExportState,
MongoStorageDefaults.Collections.ChangeHistory,
MongoStorageDefaults.Collections.AdvisoryStatements,
MongoStorageDefaults.Collections.AdvisoryConflicts,
MongoStorageDefaults.Collections.AdvisoryObservations,
MongoStorageDefaults.Collections.Locks,
MongoStorageDefaults.Collections.Jobs,
MongoStorageDefaults.Collections.Migrations,
};
private readonly IMongoDatabase _database;
private readonly MongoStorageOptions _options;
private readonly ILogger<MongoBootstrapper> _logger;
private readonly MongoMigrationRunner _migrationRunner;
public MongoBootstrapper(
IMongoDatabase database,
IOptions<MongoStorageOptions> options,
ILogger<MongoBootstrapper> logger,
MongoMigrationRunner migrationRunner)
{
_database = database ?? throw new ArgumentNullException(nameof(database));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_migrationRunner = migrationRunner ?? throw new ArgumentNullException(nameof(migrationRunner));
}
public async Task InitializeAsync(CancellationToken cancellationToken)
{
var existingCollections = await ListCollectionsAsync(cancellationToken).ConfigureAwait(false);
foreach (var collectionName in RequiredCollections)
{
if (!existingCollections.Contains(collectionName))
{
await _database.CreateCollectionAsync(collectionName, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Created Mongo collection {Collection}", collectionName);
}
}
await Task.WhenAll(
EnsureLocksIndexesAsync(cancellationToken),
EnsureJobsIndexesAsync(cancellationToken),
EnsureAdvisoryIndexesAsync(cancellationToken),
EnsureDocumentsIndexesAsync(cancellationToken),
EnsureDtoIndexesAsync(cancellationToken),
EnsureAliasIndexesAsync(cancellationToken),
EnsureAffectedIndexesAsync(cancellationToken),
EnsureReferenceIndexesAsync(cancellationToken),
EnsureSourceStateIndexesAsync(cancellationToken),
EnsurePsirtFlagIndexesAsync(cancellationToken),
EnsureAdvisoryStatementIndexesAsync(cancellationToken),
EnsureAdvisoryConflictIndexesAsync(cancellationToken),
EnsureObservationIndexesAsync(cancellationToken),
EnsureChangeHistoryIndexesAsync(cancellationToken),
EnsureGridFsIndexesAsync(cancellationToken)).ConfigureAwait(false);
await _migrationRunner.RunAsync(cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Mongo bootstrapper completed");
}
private async Task<HashSet<string>> ListCollectionsAsync(CancellationToken cancellationToken)
{
using var cursor = await _database.ListCollectionNamesAsync(cancellationToken: cancellationToken).ConfigureAwait(false);
var list = await cursor.ToListAsync(cancellationToken).ConfigureAwait(false);
return new HashSet<string>(list, StringComparer.Ordinal);
}
private Task EnsureLocksIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Locks);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("ttlAt"),
new CreateIndexOptions { Name = "ttl_at_ttl", ExpireAfter = TimeSpan.Zero }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureJobsIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Jobs);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Descending("createdAt"),
new CreateIndexOptions { Name = "jobs_createdAt_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("kind").Descending("createdAt"),
new CreateIndexOptions { Name = "jobs_kind_createdAt" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("status").Descending("createdAt"),
new CreateIndexOptions { Name = "jobs_status_createdAt" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAdvisoryIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Advisory);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("advisoryKey"),
new CreateIndexOptions { Name = "advisory_key_unique", Unique = true }),
new(
Builders<BsonDocument>.IndexKeys.Descending("modified"),
new CreateIndexOptions { Name = "advisory_modified_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Descending("published"),
new CreateIndexOptions { Name = "advisory_published_desc" }),
};
if (_options.EnableSemVerStyle)
{
indexes.Add(new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys
.Ascending("normalizedVersions.packageId")
.Ascending("normalizedVersions.scheme")
.Ascending("normalizedVersions.type"),
new CreateIndexOptions { Name = "advisory_normalizedVersions_pkg_scheme_type" }));
indexes.Add(new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys.Ascending("normalizedVersions.value"),
new CreateIndexOptions { Name = "advisory_normalizedVersions_value", Sparse = true }));
}
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureDocumentsIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Document);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("sourceName").Ascending("uri"),
new CreateIndexOptions { Name = "document_source_uri_unique", Unique = true }),
new(
Builders<BsonDocument>.IndexKeys.Descending("fetchedAt"),
new CreateIndexOptions { Name = "document_fetchedAt_desc" }),
};
var expiresKey = Builders<BsonDocument>.IndexKeys.Ascending("expiresAt");
var expiresOptions = new CreateIndexOptions<BsonDocument>
{
Name = _options.RawDocumentRetention > TimeSpan.Zero ? "document_expiresAt_ttl" : "document_expiresAt",
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("expiresAt", true),
};
if (_options.RawDocumentRetention > TimeSpan.Zero)
{
expiresOptions.ExpireAfter = TimeSpan.Zero;
}
indexes.Add(new CreateIndexModel<BsonDocument>(expiresKey, expiresOptions));
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAliasIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Alias);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("scheme").Ascending("value"),
new CreateIndexOptions { Name = "alias_scheme_value", Unique = false }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureGridFsIndexesAsync(CancellationToken cancellationToken)
{
if (_options.RawDocumentRetention <= TimeSpan.Zero)
{
return Task.CompletedTask;
}
var collectionName = $"{RawDocumentBucketName}.files";
var collection = _database.GetCollection<BsonDocument>(collectionName);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("metadata.expiresAt"),
new CreateIndexOptions<BsonDocument>
{
Name = "gridfs_files_expiresAt_ttl",
ExpireAfter = TimeSpan.Zero,
PartialFilterExpression = Builders<BsonDocument>.Filter.Exists("metadata.expiresAt", true),
}),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAffectedIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Affected);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("platform").Ascending("name"),
new CreateIndexOptions { Name = "affected_platform_name" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("advisoryId"),
new CreateIndexOptions { Name = "affected_advisoryId" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureReferenceIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Reference);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("url"),
new CreateIndexOptions { Name = "reference_url" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("advisoryId"),
new CreateIndexOptions { Name = "reference_advisoryId" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureObservationIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Ascending("upstream.upstream_id")
.Ascending("upstream.document_version"),
new CreateIndexOptions { Name = "advisory_obs_tenant_upstream", Unique = false }),
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Ascending("linkset.aliases"),
new CreateIndexOptions { Name = "advisory_obs_tenant_aliases" }),
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Ascending("linkset.purls"),
new CreateIndexOptions { Name = "advisory_obs_tenant_purls" }),
new(
Builders<BsonDocument>.IndexKeys
.Ascending("tenant")
.Descending("createdAt"),
new CreateIndexOptions { Name = "advisory_obs_tenant_createdAt" })
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAdvisoryStatementIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryStatements);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_statements_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("statementHash"),
new CreateIndexOptions { Name = "advisory_statements_statementHash_unique", Unique = true }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureAdvisoryConflictIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryConflicts);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("vulnerabilityKey").Descending("asOf"),
new CreateIndexOptions { Name = "advisory_conflicts_vulnerability_asof_desc" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("conflictHash"),
new CreateIndexOptions { Name = "advisory_conflicts_conflictHash_unique", Unique = true }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureSourceStateIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.SourceState);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("sourceName"),
new CreateIndexOptions { Name = "source_state_unique", Unique = true }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private Task EnsureDtoIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.Dto);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("documentId"),
new CreateIndexOptions { Name = "dto_documentId" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("sourceName").Descending("validatedAt"),
new CreateIndexOptions { Name = "dto_source_validated" }),
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
private async Task EnsurePsirtFlagIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.PsirtFlags);
try
{
await collection.Indexes.DropOneAsync("psirt_advisoryKey_unique", cancellationToken).ConfigureAwait(false);
}
catch (MongoCommandException ex) when (ex.CodeName == "IndexNotFound")
{
// The legacy unique index was never created on this deployment; nothing to drop.
}
var index = new CreateIndexModel<BsonDocument>(
Builders<BsonDocument>.IndexKeys.Ascending("vendor"),
new CreateIndexOptions { Name = "psirt_vendor" });
await collection.Indexes.CreateOneAsync(index, cancellationToken: cancellationToken).ConfigureAwait(false);
}
private Task EnsureChangeHistoryIndexesAsync(CancellationToken cancellationToken)
{
var collection = _database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.ChangeHistory);
var indexes = new List<CreateIndexModel<BsonDocument>>
{
new(
Builders<BsonDocument>.IndexKeys.Ascending("source").Ascending("advisoryKey").Descending("capturedAt"),
new CreateIndexOptions { Name = "history_source_advisory_capturedAt" }),
new(
Builders<BsonDocument>.IndexKeys.Descending("capturedAt"),
new CreateIndexOptions { Name = "history_capturedAt" }),
new(
Builders<BsonDocument>.IndexKeys.Ascending("documentId"),
new CreateIndexOptions { Name = "history_documentId" })
};
return collection.Indexes.CreateManyAsync(indexes, cancellationToken);
}
}
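The bootstrapper is intended to run once during host startup so that `/ready` only reports healthy after collections, indexes, and migrations are in place. A hosted-service sketch — the wrapper type is an assumption, the real host may invoke `InitializeAsync` elsewhere:

```csharp
using Microsoft.Extensions.Hosting;

// Hypothetical wrapper around MongoBootstrapper.
internal sealed class MongoBootstrapHostedService : IHostedService
{
    private readonly MongoBootstrapper _bootstrapper;

    public MongoBootstrapHostedService(MongoBootstrapper bootstrapper)
        => _bootstrapper = bootstrapper;

    // Blocks host startup until collections, indexes, and migrations exist.
    public Task StartAsync(CancellationToken cancellationToken)
        => _bootstrapper.InitializeAsync(cancellationToken);

    public Task StopAsync(CancellationToken cancellationToken)
        => Task.CompletedTask;
}
```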


@@ -0,0 +1,21 @@
namespace StellaOps.Concelier.Storage.Mongo;
public enum MongoValidationLevel
{
Off,
Moderate,
Strict,
}
public enum MongoValidationAction
{
Warn,
Error,
}
public sealed class MongoCollectionValidatorOptions
{
public MongoValidationLevel Level { get; set; } = MongoValidationLevel.Moderate;
public MongoValidationAction Action { get; set; } = MongoValidationAction.Warn;
}
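These enums mirror MongoDB's collection `validationLevel` (`off`/`moderate`/`strict`) and `validationAction` (`warn`/`error`). A consumer would translate them when issuing `collMod`; a hedged sketch, with the validator schema supplied by the caller:

```csharp
using MongoDB.Bson;
using MongoDB.Driver;

// Map the options onto MongoDB's string values.
var level = options.Level switch
{
    MongoValidationLevel.Off => "off",
    MongoValidationLevel.Moderate => "moderate",
    _ => "strict",
};
var action = options.Action == MongoValidationAction.Warn ? "warn" : "error";

// Hypothetical usage: attach a validator to an existing collection.
var command = new BsonDocument
{
    { "collMod", "advisory" },
    { "validator", validatorDocument }, // JSON-schema document provided by the caller
    { "validationLevel", level },
    { "validationAction", action },
};
await database.RunCommandAsync<BsonDocument>(command, cancellationToken: cancellationToken);
```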


@@ -0,0 +1,223 @@
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Logging;
using MongoDB.Bson;
using MongoDB.Bson.Serialization;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoJobStore : IJobStore
{
private static readonly string PendingStatus = JobRunStatus.Pending.ToString();
private static readonly string RunningStatus = JobRunStatus.Running.ToString();
private readonly IMongoCollection<JobRunDocument> _collection;
private readonly ILogger<MongoJobStore> _logger;
public MongoJobStore(IMongoCollection<JobRunDocument> collection, ILogger<MongoJobStore> logger)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<JobRunSnapshot> CreateAsync(JobRunCreateRequest request, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var runId = Guid.NewGuid();
var document = JobRunDocumentExtensions.FromRequest(request, runId);
if (session is null)
{
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.InsertOneAsync(session, document, cancellationToken: cancellationToken).ConfigureAwait(false);
}
_logger.LogDebug("Created job run {RunId} for {Kind} with trigger {Trigger}", runId, request.Kind, request.Trigger);
return document.ToSnapshot();
}
public async Task<JobRunSnapshot?> TryStartAsync(Guid runId, DateTimeOffset startedAt, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var runIdValue = runId.ToString();
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Id, runIdValue)
& Builders<JobRunDocument>.Filter.Eq(x => x.Status, PendingStatus);
var update = Builders<JobRunDocument>.Update
.Set(x => x.Status, RunningStatus)
.Set(x => x.StartedAt, startedAt.UtcDateTime);
var options = new FindOneAndUpdateOptions<JobRunDocument>
{
ReturnDocument = ReturnDocument.After,
};
var result = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
if (result is null)
{
_logger.LogDebug("Failed to start job run {RunId}; status transition rejected", runId);
return null;
}
return result.ToSnapshot();
}
public async Task<JobRunSnapshot?> TryCompleteAsync(Guid runId, JobRunCompletion completion, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var runIdValue = runId.ToString();
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Id, runIdValue)
& Builders<JobRunDocument>.Filter.In(x => x.Status, new[] { PendingStatus, RunningStatus });
var update = Builders<JobRunDocument>.Update
.Set(x => x.Status, completion.Status.ToString())
.Set(x => x.CompletedAt, completion.CompletedAt.UtcDateTime)
.Set(x => x.Error, completion.Error);
var options = new FindOneAndUpdateOptions<JobRunDocument>
{
ReturnDocument = ReturnDocument.After,
};
var result = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
if (result is null)
{
_logger.LogWarning("Failed to mark job run {RunId} as {Status}", runId, completion.Status);
return null;
}
return result.ToSnapshot();
}
public async Task<JobRunSnapshot?> FindAsync(Guid runId, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Id, runId.ToString());
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToSnapshot();
}
public async Task<IReadOnlyList<JobRunSnapshot>> GetRecentRunsAsync(string? kind, int limit, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
if (limit <= 0)
{
return Array.Empty<JobRunSnapshot>();
}
var filter = string.IsNullOrWhiteSpace(kind)
? Builders<JobRunDocument>.Filter.Empty
: Builders<JobRunDocument>.Filter.Eq(x => x.Kind, kind);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.CreatedAt)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static doc => doc.ToSnapshot()).ToArray();
}
public async Task<IReadOnlyList<JobRunSnapshot>> GetActiveRunsAsync(CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<JobRunDocument>.Filter.In(x => x.Status, new[] { PendingStatus, RunningStatus });
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.CreatedAt)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.Select(static doc => doc.ToSnapshot()).ToArray();
}
public async Task<JobRunSnapshot?> GetLastRunAsync(string kind, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<JobRunDocument>.Filter.Eq(x => x.Kind, kind);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var cursor = await query
.SortByDescending(x => x.CreatedAt)
.Limit(1)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return cursor.FirstOrDefault()?.ToSnapshot();
}
public async Task<IReadOnlyDictionary<string, JobRunSnapshot>> GetLastRunsAsync(IEnumerable<string> kinds, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
if (kinds is null)
{
throw new ArgumentNullException(nameof(kinds));
}
var kindList = kinds
.Where(static kind => !string.IsNullOrWhiteSpace(kind))
.Select(static kind => kind.Trim())
.Distinct(StringComparer.Ordinal)
.ToArray();
if (kindList.Length == 0)
{
return new Dictionary<string, JobRunSnapshot>(StringComparer.Ordinal);
}
var matchStage = new BsonDocument("$match", new BsonDocument("kind", new BsonDocument("$in", new BsonArray(kindList))));
var sortStage = new BsonDocument("$sort", new BsonDocument("createdAt", -1));
var groupStage = new BsonDocument("$group", new BsonDocument
{
{ "_id", "$kind" },
{ "document", new BsonDocument("$first", "$$ROOT") }
});
var pipeline = new[] { matchStage, sortStage, groupStage };
var aggregateFluent = session is null
? _collection.Aggregate<BsonDocument>(pipeline)
: _collection.Aggregate<BsonDocument>(session, pipeline);
var aggregate = await aggregateFluent
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
var results = new Dictionary<string, JobRunSnapshot>(StringComparer.Ordinal);
foreach (var element in aggregate)
{
if (!element.TryGetValue("_id", out var idValue) || idValue.BsonType != BsonType.String)
{
continue;
}
if (!element.TryGetValue("document", out var documentValue) || documentValue.BsonType != BsonType.Document)
{
continue;
}
var document = BsonSerializer.Deserialize<JobRunDocument>(documentValue.AsBsonDocument);
results[idValue.AsString] = document.ToSnapshot();
}
return results;
}
}
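A run moves Pending → Running → terminal through guarded transitions, and a `null` return signals a lost race rather than an error. A sketch of the happy path — snapshot property names and the `JobRunCompletion` shape are assumptions from context:

```csharp
var run = await store.CreateAsync(request, cancellationToken);

// Only one worker wins the Pending -> Running transition.
var started = await store.TryStartAsync(run.RunId, DateTimeOffset.UtcNow, cancellationToken);
if (started is null)
{
    return; // another worker already started (or completed) this run
}

// ... perform the job's work ...

// Completion is also guarded: only Pending/Running runs can transition.
await store.TryCompleteAsync(
    run.RunId,
    new JobRunCompletion(JobRunStatus.Succeeded, DateTimeOffset.UtcNow, Error: null),
    cancellationToken);
```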


@@ -0,0 +1,116 @@
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoLeaseStore : ILeaseStore
{
private readonly IMongoCollection<JobLeaseDocument> _collection;
private readonly ILogger<MongoLeaseStore> _logger;
public MongoLeaseStore(IMongoCollection<JobLeaseDocument> collection, ILogger<MongoLeaseStore> logger)
{
_collection = collection ?? throw new ArgumentNullException(nameof(collection));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<JobLease?> TryAcquireAsync(string key, string holder, TimeSpan leaseDuration, DateTimeOffset now, CancellationToken cancellationToken)
{
var nowUtc = now.UtcDateTime;
var ttlUtc = nowUtc.Add(leaseDuration);
var filter = Builders<JobLeaseDocument>.Filter.Eq(x => x.Key, key)
& Builders<JobLeaseDocument>.Filter.Or(
Builders<JobLeaseDocument>.Filter.Lte(x => x.TtlAt, nowUtc),
Builders<JobLeaseDocument>.Filter.Eq(x => x.Holder, holder));
var update = Builders<JobLeaseDocument>.Update
.Set(x => x.Holder, holder)
.Set(x => x.AcquiredAt, nowUtc)
.Set(x => x.HeartbeatAt, nowUtc)
.Set(x => x.LeaseMs, (long)leaseDuration.TotalMilliseconds)
.Set(x => x.TtlAt, ttlUtc);
var options = new FindOneAndUpdateOptions<JobLeaseDocument>
{
ReturnDocument = ReturnDocument.After,
};
var updated = await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false);
if (updated is not null)
{
_logger.LogDebug("Lease {Key} acquired by {Holder}", key, holder);
return updated.ToLease();
}
try
{
var document = new JobLeaseDocument
{
Key = key,
Holder = holder,
AcquiredAt = nowUtc,
HeartbeatAt = nowUtc,
LeaseMs = (long)leaseDuration.TotalMilliseconds,
TtlAt = ttlUtc,
};
await _collection.InsertOneAsync(document, cancellationToken: cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Lease {Key} inserted for {Holder}", key, holder);
return document.ToLease();
}
catch (MongoWriteException ex) when (ex.WriteError.Category == ServerErrorCategory.DuplicateKey)
{
_logger.LogDebug(ex, "Lease {Key} already held by another process", key);
return null;
}
}
public async Task<JobLease?> HeartbeatAsync(string key, string holder, TimeSpan leaseDuration, DateTimeOffset now, CancellationToken cancellationToken)
{
var nowUtc = now.UtcDateTime;
var ttlUtc = nowUtc.Add(leaseDuration);
var filter = Builders<JobLeaseDocument>.Filter.Eq(x => x.Key, key)
& Builders<JobLeaseDocument>.Filter.Eq(x => x.Holder, holder);
var update = Builders<JobLeaseDocument>.Update
.Set(x => x.HeartbeatAt, nowUtc)
.Set(x => x.LeaseMs, (long)leaseDuration.TotalMilliseconds)
.Set(x => x.TtlAt, ttlUtc);
var updated = await _collection.FindOneAndUpdateAsync(
filter,
update,
new FindOneAndUpdateOptions<JobLeaseDocument>
{
ReturnDocument = ReturnDocument.After,
},
cancellationToken).ConfigureAwait(false);
if (updated is null)
{
_logger.LogDebug("Heartbeat rejected for lease {Key} held by {Holder}", key, holder);
}
return updated?.ToLease();
}
public async Task<bool> ReleaseAsync(string key, string holder, CancellationToken cancellationToken)
{
var result = await _collection.DeleteOneAsync(
Builders<JobLeaseDocument>.Filter.Eq(x => x.Key, key)
& Builders<JobLeaseDocument>.Filter.Eq(x => x.Holder, holder),
cancellationToken).ConfigureAwait(false);
if (result.DeletedCount == 0)
{
_logger.LogDebug("Lease {Key} not released by {Holder}; no matching document", key, holder);
return false;
}
_logger.LogDebug("Lease {Key} released by {Holder}", key, holder);
return true;
}
}
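Callers typically acquire, heartbeat at some fraction of the lease duration, and release in a `finally` block. A sketch — the lease key, duration, and renewal cadence are illustrative choices:

```csharp
var duration = TimeSpan.FromMinutes(2);
var lease = await leases.TryAcquireAsync("job:export", holder, duration, DateTimeOffset.UtcNow, ct);
if (lease is null)
{
    return; // held elsewhere; retry later
}

try
{
    while (hasMoreWork)
    {
        await DoUnitOfWorkAsync(ct);
        // Renew well before ttlAt; a null result means the lease was lost
        // (e.g. it expired and another holder took it over).
        var renewed = await leases.HeartbeatAsync("job:export", holder, duration, DateTimeOffset.UtcNow, ct);
        if (renewed is null)
        {
            break;
        }
    }
}
finally
{
    await leases.ReleaseAsync("job:export", holder, ct);
}
```

Note that the TTL index on `ttlAt` (created by the bootstrapper) only garbage-collects stale documents; the acquire filter's `Lte(TtlAt, now)` check is what makes takeover of an expired lease immediate.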


@@ -0,0 +1,34 @@
using Microsoft.Extensions.Options;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public interface IMongoSessionProvider
{
Task<IClientSessionHandle> StartSessionAsync(CancellationToken cancellationToken = default);
}
internal sealed class MongoSessionProvider : IMongoSessionProvider
{
private readonly IMongoClient _client;
private readonly MongoStorageOptions _options;
public MongoSessionProvider(IMongoClient client, IOptions<MongoStorageOptions> options)
{
_client = client ?? throw new ArgumentNullException(nameof(client));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
}
public Task<IClientSessionHandle> StartSessionAsync(CancellationToken cancellationToken = default)
{
var sessionOptions = new ClientSessionOptions
{
DefaultTransactionOptions = new TransactionOptions(
readPreference: ReadPreference.Primary,
readConcern: ReadConcern.Majority,
writeConcern: WriteConcern.WMajority.With(wTimeout: _options.CommandTimeout))
};
return _client.StartSessionAsync(sessionOptions, cancellationToken);
}
}
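The provider pre-configures majority read/write concerns, so callers only need to start a transaction on the returned session and pass it through to session-aware repository methods. A minimal sketch, using the `ISourceStateRepository.UpsertAsync` overload defined in this module:

```csharp
// Run a write inside a transaction using the session provider above.
public async Task SaveAtomicallyAsync(
    IMongoSessionProvider sessions,
    ISourceStateRepository states,
    SourceStateRecord record,
    CancellationToken ct)
{
    using var session = await sessions.StartSessionAsync(ct);
    // Inherits the majority read/write concerns set in DefaultTransactionOptions.
    session.StartTransaction();
    try
    {
        await states.UpsertAsync(record, ct, session);
        await session.CommitTransactionAsync(ct);
    }
    catch
    {
        await session.AbortTransactionAsync(ct);
        throw;
    }
}
```

Multi-document transactions require a replica set or sharded cluster; against a standalone `mongod` the `StartTransaction` call will fail at commit time.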


@@ -0,0 +1,115 @@
using Microsoft.Extensions.Logging;
using MongoDB.Bson;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoSourceStateRepository : ISourceStateRepository
{
private readonly IMongoCollection<SourceStateDocument> _collection;
private const int MaxFailureReasonLength = 1024;
private readonly ILogger<MongoSourceStateRepository> _logger;
public MongoSourceStateRepository(IMongoDatabase database, ILogger<MongoSourceStateRepository> logger)
{
_collection = (database ?? throw new ArgumentNullException(nameof(database)))
.GetCollection<SourceStateDocument>(MongoStorageDefaults.Collections.SourceState);
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<SourceStateRecord?> TryGetAsync(string sourceName, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, sourceName);
var query = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var document = await query.FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<SourceStateRecord> UpsertAsync(SourceStateRecord record, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
var document = SourceStateDocumentExtensions.FromRecord(record with { UpdatedAt = DateTimeOffset.UtcNow });
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, record.SourceName);
var options = new ReplaceOptions { IsUpsert = true };
if (session is null)
{
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.ReplaceOneAsync(session, filter, document, options, cancellationToken).ConfigureAwait(false);
}
_logger.LogDebug("Upserted source state for {Source}", record.SourceName);
return document.ToRecord();
}
public async Task<SourceStateRecord?> UpdateCursorAsync(string sourceName, BsonDocument cursor, DateTimeOffset completedAt, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
var update = Builders<SourceStateDocument>.Update
.Set(x => x.Cursor, cursor ?? new BsonDocument())
.Set(x => x.LastSuccess, completedAt.UtcDateTime)
.Set(x => x.FailCount, 0)
.Set(x => x.BackoffUntil, (DateTime?)null)
.Set(x => x.LastFailureReason, null)
.Set(x => x.UpdatedAt, DateTime.UtcNow)
.SetOnInsert(x => x.SourceName, sourceName);
var options = new FindOneAndUpdateOptions<SourceStateDocument>
{
ReturnDocument = ReturnDocument.After,
IsUpsert = true,
};
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, sourceName);
var document = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
public async Task<SourceStateRecord?> MarkFailureAsync(string sourceName, DateTimeOffset failedAt, TimeSpan? backoff, string? failureReason, CancellationToken cancellationToken, IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
var reasonValue = NormalizeFailureReason(failureReason);
var update = Builders<SourceStateDocument>.Update
.Inc(x => x.FailCount, 1)
.Set(x => x.LastFailure, failedAt.UtcDateTime)
.Set(x => x.BackoffUntil, backoff.HasValue ? failedAt.UtcDateTime.Add(backoff.Value) : null)
.Set(x => x.LastFailureReason, reasonValue)
.Set(x => x.UpdatedAt, DateTime.UtcNow)
.SetOnInsert(x => x.SourceName, sourceName);
var options = new FindOneAndUpdateOptions<SourceStateDocument>
{
ReturnDocument = ReturnDocument.After,
IsUpsert = true,
};
var filter = Builders<SourceStateDocument>.Filter.Eq(x => x.SourceName, sourceName);
var document = session is null
? await _collection.FindOneAndUpdateAsync(filter, update, options, cancellationToken).ConfigureAwait(false)
: await _collection.FindOneAndUpdateAsync(session, filter, update, options, cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
private static string? NormalizeFailureReason(string? reason)
{
if (string.IsNullOrWhiteSpace(reason))
{
return null;
}
var trimmed = reason.Trim();
if (trimmed.Length <= MaxFailureReasonLength)
{
return trimmed;
}
return trimmed[..MaxFailureReasonLength];
}
}
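A connector drives this repository by calling `UpdateCursorAsync` after a successful fetch (which resets `FailCount`, `BackoffUntil`, and `LastFailureReason`) and `MarkFailureAsync` otherwise. The backoff policy below (exponential with a one-hour cap) is an assumption for illustration; it is not part of the repository:

```csharp
// Sketch: record the outcome of one fetch cycle for a source connector.
public async Task RecordFetchOutcomeAsync(
    ISourceStateRepository states,
    string sourceName,
    BsonDocument? nextCursor,
    Exception? failure,
    CancellationToken ct)
{
    var now = DateTimeOffset.UtcNow;
    if (failure is null)
    {
        // Success path: advance the cursor and clear failure bookkeeping.
        await states.UpdateCursorAsync(sourceName, nextCursor ?? new BsonDocument(), now, ct);
        return;
    }

    var current = await states.TryGetAsync(sourceName, ct);
    var failCount = (current?.FailCount ?? 0) + 1;

    // Assumed policy: 1, 2, 4, ... minutes, capped at one hour.
    var backoff = TimeSpan.FromMinutes(Math.Min(Math.Pow(2, failCount - 1), 60));
    await states.MarkFailureAsync(sourceName, now, backoff, failure.Message, ct);
}
```

The scheduler can then skip the source while `BackoffUntil` lies in the future.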


@@ -0,0 +1,32 @@
namespace StellaOps.Concelier.Storage.Mongo;
public static class MongoStorageDefaults
{
public const string DefaultDatabaseName = "concelier";
public static class Collections
{
public const string Source = "source";
public const string SourceState = "source_state";
public const string Document = "document";
public const string Dto = "dto";
public const string Advisory = "advisory";
public const string AdvisoryRaw = "advisory_raw";
public const string Alias = "alias";
public const string Affected = "affected";
public const string Reference = "reference";
public const string KevFlag = "kev_flag";
public const string RuFlags = "ru_flags";
public const string JpFlags = "jp_flags";
public const string PsirtFlags = "psirt_flags";
public const string MergeEvent = "merge_event";
public const string ExportState = "export_state";
public const string Locks = "locks";
public const string Jobs = "jobs";
public const string Migrations = "schema_migrations";
public const string ChangeHistory = "source_change_history";
public const string AdvisoryStatements = "advisory_statements";
public const string AdvisoryConflicts = "advisory_conflicts";
public const string AdvisoryObservations = "advisory_observations";
}
}


@@ -0,0 +1,119 @@
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed class MongoStorageOptions
{
public string ConnectionString { get; set; } = string.Empty;
public string? DatabaseName { get; set; }
public TimeSpan CommandTimeout { get; set; } = TimeSpan.FromSeconds(30);
/// <summary>
/// Retention period for raw documents (document + DTO + GridFS payloads).
/// Set to <see cref="TimeSpan.Zero"/> to disable automatic expiry.
/// </summary>
public TimeSpan RawDocumentRetention { get; set; } = TimeSpan.FromDays(45);
/// <summary>
/// Additional grace period applied on top of <see cref="RawDocumentRetention"/> before TTL purges old rows.
/// Allows the retention background service to delete GridFS blobs first.
/// </summary>
public TimeSpan RawDocumentRetentionTtlGrace { get; set; } = TimeSpan.FromDays(1);
/// <summary>
/// Interval between retention sweeps. Only used when <see cref="RawDocumentRetention"/> is greater than zero.
/// </summary>
public TimeSpan RawDocumentRetentionSweepInterval { get; set; } = TimeSpan.FromHours(6);
/// <summary>
/// Enables dual-write of normalized SemVer analytics for affected packages.
/// </summary>
public bool EnableSemVerStyle { get; set; } = true;
/// <summary>
/// Batch size used by backfill migrations when repopulating normalized version documents.
/// </summary>
public int BackfillBatchSize { get; set; } = 250;
/// <summary>
/// Tenant identifier associated with advisory ingestion when upstream context does not specify one.
/// </summary>
public string DefaultTenant { get; set; } = "tenant-default";
/// <summary>
/// JSON schema validator settings for the advisory_raw collection.
/// </summary>
public MongoCollectionValidatorOptions AdvisoryRawValidator { get; set; } = new();
public string GetDatabaseName()
{
if (!string.IsNullOrWhiteSpace(DatabaseName))
{
return DatabaseName.Trim();
}
if (!string.IsNullOrWhiteSpace(ConnectionString))
{
var url = MongoUrl.Create(ConnectionString);
if (!string.IsNullOrWhiteSpace(url.DatabaseName))
{
return url.DatabaseName;
}
}
return MongoStorageDefaults.DefaultDatabaseName;
}
public void EnsureValid()
{
if (string.IsNullOrWhiteSpace(ConnectionString))
{
throw new InvalidOperationException("Mongo connection string is not configured.");
}
if (CommandTimeout <= TimeSpan.Zero)
{
throw new InvalidOperationException("Command timeout must be greater than zero.");
}
if (RawDocumentRetention < TimeSpan.Zero)
{
throw new InvalidOperationException("Raw document retention cannot be negative.");
}
if (RawDocumentRetentionTtlGrace < TimeSpan.Zero)
{
throw new InvalidOperationException("Raw document retention TTL grace cannot be negative.");
}
if (RawDocumentRetention > TimeSpan.Zero && RawDocumentRetentionSweepInterval <= TimeSpan.Zero)
{
throw new InvalidOperationException("Raw document retention sweep interval must be positive when retention is enabled.");
}
if (BackfillBatchSize <= 0)
{
throw new InvalidOperationException("Backfill batch size must be greater than zero.");
}
if (string.IsNullOrWhiteSpace(DefaultTenant))
{
throw new InvalidOperationException("Default tenant must be provided for advisory ingestion.");
}
AdvisoryRawValidator ??= new MongoCollectionValidatorOptions();
if (!Enum.IsDefined(typeof(MongoValidationLevel), AdvisoryRawValidator.Level))
{
throw new InvalidOperationException($"Unsupported Mongo validation level '{AdvisoryRawValidator.Level}' configured for advisory raw validator.");
}
if (!Enum.IsDefined(typeof(MongoValidationAction), AdvisoryRawValidator.Action))
{
throw new InvalidOperationException($"Unsupported Mongo validation action '{AdvisoryRawValidator.Action}' configured for advisory raw validator.");
}
_ = GetDatabaseName();
}
}
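At startup these options are typically bound from configuration and validated before any client is constructed. A hedged wiring sketch follows; the section path `"concelier:storage"` is an assumption, not a documented key:

```csharp
// Bind MongoStorageOptions and fail fast on invalid configuration.
var builder = WebApplication.CreateBuilder(args);
builder.Services.Configure<MongoStorageOptions>(
    builder.Configuration.GetSection("concelier:storage")); // assumed section name

builder.Services.AddSingleton<IMongoClient>(sp =>
{
    var options = sp.GetRequiredService<IOptions<MongoStorageOptions>>().Value;
    options.EnsureValid(); // throws on missing connection string, bad timeouts, etc.
    return new MongoClient(options.ConnectionString);
});

builder.Services.AddSingleton(sp =>
{
    var options = sp.GetRequiredService<IOptions<MongoStorageOptions>>().Value;
    // Resolution order: DatabaseName > database in the MongoUrl > "concelier".
    return sp.GetRequiredService<IMongoClient>().GetDatabase(options.GetDatabaseName());
});
```

Calling `EnsureValid` during container construction surfaces misconfiguration at boot rather than on the first query.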


@@ -0,0 +1,163 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("tenant")]
public string Tenant { get; set; } = string.Empty;
[BsonElement("source")]
public AdvisoryObservationSourceDocument Source { get; set; } = new();
[BsonElement("upstream")]
public AdvisoryObservationUpstreamDocument Upstream { get; set; } = new();
[BsonElement("content")]
public AdvisoryObservationContentDocument Content { get; set; } = new();
[BsonElement("linkset")]
public AdvisoryObservationLinksetDocument Linkset { get; set; } = new();
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
[BsonElement("attributes")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Attributes { get; set; } = new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationSourceDocument
{
[BsonElement("vendor")]
public string Vendor { get; set; } = string.Empty;
[BsonElement("stream")]
public string Stream { get; set; } = string.Empty;
[BsonElement("api")]
public string Api { get; set; } = string.Empty;
[BsonElement("collectorVersion")]
[BsonIgnoreIfNull]
public string? CollectorVersion { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationUpstreamDocument
{
[BsonElement("upstream_id")]
public string UpstreamId { get; set; } = string.Empty;
[BsonElement("document_version")]
[BsonIgnoreIfNull]
public string? DocumentVersion { get; set; }
[BsonElement("fetchedAt")]
public DateTime FetchedAt { get; set; } = DateTime.UtcNow;
[BsonElement("receivedAt")]
public DateTime ReceivedAt { get; set; } = DateTime.UtcNow;
[BsonElement("contentHash")]
public string ContentHash { get; set; } = string.Empty;
[BsonElement("signature")]
public AdvisoryObservationSignatureDocument Signature { get; set; } = new();
[BsonElement("metadata")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Metadata { get; set; } = new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationSignatureDocument
{
[BsonElement("present")]
public bool Present { get; set; }
[BsonElement("format")]
[BsonIgnoreIfNull]
public string? Format { get; set; }
[BsonElement("keyId")]
[BsonIgnoreIfNull]
public string? KeyId { get; set; }
[BsonElement("signature")]
[BsonIgnoreIfNull]
public string? Signature { get; set; }
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationContentDocument
{
[BsonElement("format")]
public string Format { get; set; } = string.Empty;
[BsonElement("specVersion")]
[BsonIgnoreIfNull]
public string? SpecVersion { get; set; }
[BsonElement("raw")]
public BsonDocument Raw { get; set; } = new();
[BsonElement("metadata")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Metadata { get; set; } = new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationLinksetDocument
{
[BsonElement("aliases")]
[BsonIgnoreIfNull]
public List<string>? Aliases { get; set; } = new();
[BsonElement("purls")]
[BsonIgnoreIfNull]
public List<string>? Purls { get; set; } = new();
[BsonElement("cpes")]
[BsonIgnoreIfNull]
public List<string>? Cpes { get; set; } = new();
[BsonElement("references")]
[BsonIgnoreIfNull]
public List<AdvisoryObservationReferenceDocument>? References { get; set; } = new();
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationReferenceDocument
{
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("url")]
public string Url { get; set; } = string.Empty;
}
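To make the mapping concrete, here is an illustrative in-memory instance of the document type above. All values (tenant, identifiers, hash, id format) are invented for the example; only the shape and element names come from the class definitions:

```csharp
// Example shape of a stored advisory observation (values are placeholders).
var observation = new AdvisoryObservationDocument
{
    Id = "tenant-a:osv:GHSA-example:1", // id format is an assumption
    Tenant = "tenant-a",
    Source = new AdvisoryObservationSourceDocument
    {
        Vendor = "osv",
        Stream = "advisories",
        Api = "https://api.osv.dev",
    },
    Upstream = new AdvisoryObservationUpstreamDocument
    {
        UpstreamId = "GHSA-example",
        ContentHash = "sha256:deadbeef", // placeholder digest
        FetchedAt = DateTime.UtcNow,
        ReceivedAt = DateTime.UtcNow,
    },
    Content = new AdvisoryObservationContentDocument
    {
        Format = "osv",
        SpecVersion = "1.6",
    },
    Linkset = new AdvisoryObservationLinksetDocument
    {
        Aliases = new List<string> { "cve-2025-0001" },
        Purls = new List<string> { "pkg:npm/example@1.2.3" },
    },
};
```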


@@ -0,0 +1,92 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json.Nodes;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal static class AdvisoryObservationDocumentFactory
{
private static readonly JsonWriterSettings JsonSettings = new() { OutputMode = JsonOutputMode.RelaxedExtendedJson };
public static AdvisoryObservation ToModel(AdvisoryObservationDocument document)
{
ArgumentNullException.ThrowIfNull(document);
var rawNode = ParseJsonNode(document.Content.Raw);
var attributes = ToImmutable(document.Attributes);
var contentMetadata = ToImmutable(document.Content.Metadata);
var upstreamMetadata = ToImmutable(document.Upstream.Metadata);
var observation = new AdvisoryObservation(
document.Id,
document.Tenant,
new AdvisoryObservationSource(
document.Source.Vendor,
document.Source.Stream,
document.Source.Api,
document.Source.CollectorVersion),
new AdvisoryObservationUpstream(
document.Upstream.UpstreamId,
document.Upstream.DocumentVersion,
DateTime.SpecifyKind(document.Upstream.FetchedAt, DateTimeKind.Utc),
DateTime.SpecifyKind(document.Upstream.ReceivedAt, DateTimeKind.Utc),
document.Upstream.ContentHash,
new AdvisoryObservationSignature(
document.Upstream.Signature.Present,
document.Upstream.Signature.Format,
document.Upstream.Signature.KeyId,
document.Upstream.Signature.Signature),
upstreamMetadata),
new AdvisoryObservationContent(
document.Content.Format,
document.Content.SpecVersion,
rawNode,
contentMetadata),
new AdvisoryObservationLinkset(
document.Linkset.Aliases ?? Enumerable.Empty<string>(),
document.Linkset.Purls ?? Enumerable.Empty<string>(),
document.Linkset.Cpes ?? Enumerable.Empty<string>(),
document.Linkset.References?.Select(reference => new AdvisoryObservationReference(reference.Type, reference.Url))),
DateTime.SpecifyKind(document.CreatedAt, DateTimeKind.Utc),
attributes);
return observation;
}
private static JsonNode ParseJsonNode(BsonDocument raw)
{
if (raw is null || raw.ElementCount == 0)
{
return JsonNode.Parse("{}")!;
}
var json = raw.ToJson(JsonSettings);
return JsonNode.Parse(json)!;
}
private static ImmutableDictionary<string, string> ToImmutable(Dictionary<string, string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
foreach (var pair in values)
{
if (string.IsNullOrWhiteSpace(pair.Key) || pair.Value is null)
{
continue;
}
builder[pair.Key.Trim()] = pair.Value;
}
return builder.ToImmutable();
}
}


@@ -0,0 +1,60 @@
using System;
using System.Collections.Generic;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class AdvisoryObservationLookup : IAdvisoryObservationLookup
{
private readonly IAdvisoryObservationStore _store;
public AdvisoryObservationLookup(IAdvisoryObservationStore store)
{
_store = store ?? throw new ArgumentNullException(nameof(store));
}
public ValueTask<IReadOnlyList<AdvisoryObservation>> ListByTenantAsync(
string tenant,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
cancellationToken.ThrowIfCancellationRequested();
return new ValueTask<IReadOnlyList<AdvisoryObservation>>(
_store.ListByTenantAsync(tenant, cancellationToken));
}
public ValueTask<IReadOnlyList<AdvisoryObservation>> FindByFiltersAsync(
string tenant,
IReadOnlyCollection<string> observationIds,
IReadOnlyCollection<string> aliases,
IReadOnlyCollection<string> purls,
IReadOnlyCollection<string> cpes,
AdvisoryObservationCursor? cursor,
int limit,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
ArgumentNullException.ThrowIfNull(observationIds);
ArgumentNullException.ThrowIfNull(aliases);
ArgumentNullException.ThrowIfNull(purls);
ArgumentNullException.ThrowIfNull(cpes);
if (limit <= 0)
{
throw new ArgumentOutOfRangeException(nameof(limit), "Limit must be greater than zero.");
}
cancellationToken.ThrowIfCancellationRequested();
return new ValueTask<IReadOnlyList<AdvisoryObservation>>(
_store.FindByFiltersAsync(
tenant,
observationIds,
aliases,
purls,
cpes,
cursor,
limit,
cancellationToken));
}
}


@@ -0,0 +1,137 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Observations;
using StellaOps.Concelier.Models.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
internal sealed class AdvisoryObservationStore : IAdvisoryObservationStore
{
private readonly IMongoCollection<AdvisoryObservationDocument> collection;
public AdvisoryObservationStore(IMongoCollection<AdvisoryObservationDocument> collection)
{
this.collection = collection ?? throw new ArgumentNullException(nameof(collection));
}
public async Task<IReadOnlyList<AdvisoryObservation>> ListByTenantAsync(string tenant, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
var filter = Builders<AdvisoryObservationDocument>.Filter.Eq(document => document.Tenant, tenant.ToLowerInvariant());
var documents = await collection
.Find(filter)
.SortByDescending(document => document.CreatedAt)
.ThenBy(document => document.Id)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(AdvisoryObservationDocumentFactory.ToModel).ToArray();
}
public async Task<IReadOnlyList<AdvisoryObservation>> FindByFiltersAsync(
string tenant,
IEnumerable<string>? observationIds,
IEnumerable<string>? aliases,
IEnumerable<string>? purls,
IEnumerable<string>? cpes,
AdvisoryObservationCursor? cursor,
int limit,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
if (limit <= 0)
{
throw new ArgumentOutOfRangeException(nameof(limit), "Limit must be greater than zero.");
}
cancellationToken.ThrowIfCancellationRequested();
var normalizedTenant = tenant.ToLowerInvariant();
var normalizedObservationIds = NormalizeValues(observationIds, static value => value);
var normalizedAliases = NormalizeValues(aliases, static value => value.ToLowerInvariant());
var normalizedPurls = NormalizeValues(purls, static value => value);
var normalizedCpes = NormalizeValues(cpes, static value => value);
var builder = Builders<AdvisoryObservationDocument>.Filter;
var filters = new List<FilterDefinition<AdvisoryObservationDocument>>
{
builder.Eq(document => document.Tenant, normalizedTenant)
};
if (normalizedObservationIds.Length > 0)
{
filters.Add(builder.In(document => document.Id, normalizedObservationIds));
}
if (normalizedAliases.Length > 0)
{
filters.Add(builder.In("linkset.aliases", normalizedAliases));
}
if (normalizedPurls.Length > 0)
{
filters.Add(builder.In("linkset.purls", normalizedPurls));
}
if (normalizedCpes.Length > 0)
{
filters.Add(builder.In("linkset.cpes", normalizedCpes));
}
if (cursor.HasValue)
{
var createdAtUtc = cursor.Value.CreatedAt.UtcDateTime;
var observationId = cursor.Value.ObservationId;
var createdBefore = builder.Lt(document => document.CreatedAt, createdAtUtc);
var sameCreatedNextId = builder.And(
builder.Eq(document => document.CreatedAt, createdAtUtc),
builder.Gt(document => document.Id, observationId));
filters.Add(builder.Or(createdBefore, sameCreatedNextId));
}
var filter = builder.And(filters);
var documents = await collection
.Find(filter)
.SortByDescending(document => document.CreatedAt)
.ThenBy(document => document.Id)
.Limit(limit)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(AdvisoryObservationDocumentFactory.ToModel).ToArray();
}
private static string[] NormalizeValues(IEnumerable<string>? values, Func<string, string> projector)
{
if (values is null)
{
return Array.Empty<string>();
}
var set = new HashSet<string>(StringComparer.Ordinal);
foreach (var value in values)
{
if (string.IsNullOrWhiteSpace(value))
{
continue;
}
var projected = projector(value.Trim());
if (!string.IsNullOrEmpty(projected))
{
set.Add(projected);
}
}
if (set.Count == 0)
{
return Array.Empty<string>();
}
return set.ToArray();
}
}
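Callers page through results by feeding the last item of each page back in as the cursor, matching the keyset predicate above (`createdAt` descending, `_id` ascending as the tie-breaker). A sketch, assuming `AdvisoryObservationCursor` exposes a `(CreatedAt, ObservationId)` constructor and the model carries matching properties (both are assumptions; only the store call is defined in this file):

```csharp
// Page through all observations for a tenant using the keyset cursor.
AdvisoryObservationCursor? cursor = null;
const int pageSize = 100;
while (true)
{
    var page = await store.FindByFiltersAsync(
        "tenant-a",
        observationIds: null,
        aliases: null,
        purls: null,
        cpes: null,
        cursor,
        pageSize,
        ct);

    foreach (var observation in page)
    {
        Process(observation); // placeholder for caller logic
    }

    if (page.Count < pageSize)
    {
        break; // short page: nothing left
    }

    var last = page[^1];
    cursor = new AdvisoryObservationCursor(last.CreatedAt, last.ObservationId);
}
```

Unlike skip/limit paging, the keyset predicate stays stable when new documents are inserted mid-iteration.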


@@ -0,0 +1,20 @@
using System.Collections.Generic;
using StellaOps.Concelier.Models.Observations;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
public interface IAdvisoryObservationStore
{
Task<IReadOnlyList<AdvisoryObservation>> ListByTenantAsync(string tenant, CancellationToken cancellationToken);
Task<IReadOnlyList<AdvisoryObservation>> FindByFiltersAsync(
string tenant,
IEnumerable<string>? observationIds,
IEnumerable<string>? aliases,
IEnumerable<string>? purls,
IEnumerable<string>? cpes,
AdvisoryObservationCursor? cursor,
int limit,
CancellationToken cancellationToken);
}


@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Concelier.Storage.Mongo.Tests")]


@@ -0,0 +1,11 @@
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo.PsirtFlags;
public interface IPsirtFlagStore
{
Task UpsertAsync(PsirtFlagRecord record, CancellationToken cancellationToken);
Task<PsirtFlagRecord?> FindAsync(string advisoryKey, CancellationToken cancellationToken);
}


@@ -0,0 +1,52 @@
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.PsirtFlags;
[BsonIgnoreExtraElements]
public sealed class PsirtFlagDocument
{
[BsonId]
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("vendor")]
public string Vendor { get; set; } = string.Empty;
[BsonElement("sourceName")]
public string SourceName { get; set; } = string.Empty;
[BsonElement("advisoryIdText")]
public string AdvisoryIdText { get; set; } = string.Empty;
[BsonElement("flaggedAt")]
public DateTime FlaggedAt { get; set; }
}
internal static class PsirtFlagDocumentExtensions
{
public static PsirtFlagDocument FromRecord(PsirtFlagRecord record)
{
ArgumentNullException.ThrowIfNull(record);
return new PsirtFlagDocument
{
AdvisoryKey = string.IsNullOrWhiteSpace(record.AdvisoryKey) ? record.AdvisoryIdText : record.AdvisoryKey,
Vendor = record.Vendor,
SourceName = record.SourceName,
AdvisoryIdText = record.AdvisoryIdText,
FlaggedAt = record.FlaggedAt.UtcDateTime,
};
}
public static PsirtFlagRecord ToRecord(this PsirtFlagDocument document)
{
ArgumentNullException.ThrowIfNull(document);
return new PsirtFlagRecord(
document.AdvisoryKey,
document.Vendor,
document.SourceName,
document.AdvisoryIdText,
DateTime.SpecifyKind(document.FlaggedAt, DateTimeKind.Utc));
}
}


@@ -0,0 +1,15 @@
namespace StellaOps.Concelier.Storage.Mongo.PsirtFlags;
/// <summary>
/// Describes a PSIRT precedence flag for a canonical advisory.
/// </summary>
public sealed record PsirtFlagRecord(
string AdvisoryKey,
string Vendor,
string SourceName,
string AdvisoryIdText,
DateTimeOffset FlaggedAt)
{
public PsirtFlagRecord WithFlaggedAt(DateTimeOffset flaggedAt)
=> this with { FlaggedAt = flaggedAt.ToUniversalTime() };
}


@@ -0,0 +1,50 @@
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.PsirtFlags;
public sealed class PsirtFlagStore : IPsirtFlagStore
{
private readonly IMongoCollection<PsirtFlagDocument> _collection;
private readonly ILogger<PsirtFlagStore> _logger;
public PsirtFlagStore(IMongoDatabase database, ILogger<PsirtFlagStore> logger)
{
ArgumentNullException.ThrowIfNull(database);
ArgumentNullException.ThrowIfNull(logger);
_collection = database.GetCollection<PsirtFlagDocument>(MongoStorageDefaults.Collections.PsirtFlags);
_logger = logger;
}
public async Task UpsertAsync(PsirtFlagRecord record, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(record);
ArgumentException.ThrowIfNullOrEmpty(record.AdvisoryKey);
var document = PsirtFlagDocumentExtensions.FromRecord(record);
var filter = Builders<PsirtFlagDocument>.Filter.Eq(x => x.AdvisoryKey, record.AdvisoryKey);
var options = new ReplaceOptions { IsUpsert = true };
try
{
await _collection.ReplaceOneAsync(filter, document, options, cancellationToken).ConfigureAwait(false);
_logger.LogDebug("Upserted PSIRT flag for {AdvisoryKey}", record.AdvisoryKey);
}
catch (MongoWriteException ex) when (ex.WriteError?.Category == ServerErrorCategory.DuplicateKey)
{
_logger.LogWarning(ex, "Duplicate PSIRT flag detected for {AdvisoryKey}", record.AdvisoryKey);
}
}
public async Task<PsirtFlagRecord?> FindAsync(string advisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
var filter = Builders<PsirtFlagDocument>.Filter.Eq(x => x.AdvisoryKey, advisoryKey);
var document = await _collection.Find(filter).FirstOrDefaultAsync(cancellationToken).ConfigureAwait(false);
return document?.ToRecord();
}
}


@@ -0,0 +1,720 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Text;
using System.Linq;
using System.Text.Json;
using MongoDB.Bson;
using MongoDB.Driver;
using MongoDB.Bson.IO;
using Microsoft.Extensions.Logging;
using StellaOps.Concelier.Core.Raw;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Raw;
internal sealed class MongoAdvisoryRawRepository : IAdvisoryRawRepository
{
private const int CursorSegmentCount = 2;
private readonly IMongoCollection<BsonDocument> _collection;
private readonly TimeProvider _timeProvider;
private readonly ILogger<MongoAdvisoryRawRepository> _logger;
public MongoAdvisoryRawRepository(
IMongoDatabase database,
TimeProvider timeProvider,
ILogger<MongoAdvisoryRawRepository> logger)
{
ArgumentNullException.ThrowIfNull(database);
_timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_collection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
}
public async Task<AdvisoryRawUpsertResult> UpsertAsync(AdvisoryRawDocument document, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(document);
var tenant = document.Tenant;
var vendor = document.Source.Vendor;
var upstreamId = document.Upstream.UpstreamId;
var contentHash = document.Upstream.ContentHash;
var baseFilter = Builders<BsonDocument>.Filter.Eq("tenant", tenant) &
Builders<BsonDocument>.Filter.Eq("source.vendor", vendor) &
Builders<BsonDocument>.Filter.Eq("upstream.upstream_id", upstreamId);
var duplicateFilter = baseFilter &
Builders<BsonDocument>.Filter.Eq("upstream.content_hash", contentHash);
var duplicate = await _collection
.Find(duplicateFilter)
.Limit(1)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
if (duplicate is not null)
{
var existing = MapToRecord(duplicate);
return new AdvisoryRawUpsertResult(false, existing);
}
var previous = await _collection
.Find(baseFilter)
.Sort(Builders<BsonDocument>.Sort.Descending("ingested_at").Descending("_id"))
.Limit(1)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
var supersedesId = previous?["_id"]?.AsString;
var recordDocument = CreateBsonDocument(document, supersedesId);
try
{
await _collection.InsertOneAsync(recordDocument, cancellationToken: cancellationToken).ConfigureAwait(false);
}
catch (MongoWriteException ex) when (ex.WriteError?.Category == ServerErrorCategory.DuplicateKey)
{
_logger.LogWarning(
ex,
"Duplicate key detected while inserting advisory_raw document tenant={Tenant} vendor={Vendor} upstream={Upstream} hash={Hash}",
tenant,
vendor,
upstreamId,
contentHash);
var existingDoc = await _collection
.Find(duplicateFilter)
.Limit(1)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
if (existingDoc is not null)
{
var existing = MapToRecord(existingDoc);
return new AdvisoryRawUpsertResult(false, existing);
}
throw;
}
var inserted = MapToRecord(recordDocument);
return new AdvisoryRawUpsertResult(true, inserted);
}
public async Task<AdvisoryRawRecord?> FindByIdAsync(string tenant, string id, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
ArgumentException.ThrowIfNullOrWhiteSpace(id);
var filter = Builders<BsonDocument>.Filter.Eq("tenant", tenant) &
Builders<BsonDocument>.Filter.Eq("_id", id);
var document = await _collection
.Find(filter)
.Limit(1)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
return document is null ? null : MapToRecord(document);
}
public async Task<AdvisoryRawQueryResult> QueryAsync(AdvisoryRawQueryOptions options, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(options);
var builder = Builders<BsonDocument>.Filter;
var filters = new List<FilterDefinition<BsonDocument>>
{
builder.Eq("tenant", options.Tenant)
};
if (!options.Vendors.IsDefaultOrEmpty)
{
var vendors = options.Vendors
.Where(static vendor => !string.IsNullOrWhiteSpace(vendor))
.Select(static vendor => vendor.Trim().ToLowerInvariant())
.ToArray();
if (vendors.Length > 0)
{
filters.Add(builder.In("source.vendor", vendors));
}
}
if (!options.UpstreamIds.IsDefaultOrEmpty)
{
var upstreams = options.UpstreamIds
.Where(static id => !string.IsNullOrWhiteSpace(id))
.Select(static id => id.Trim())
.ToArray();
if (upstreams.Length > 0)
{
filters.Add(builder.In("upstream.upstream_id", upstreams));
}
}
if (!options.ContentHashes.IsDefaultOrEmpty)
{
var hashes = options.ContentHashes
.Where(static hash => !string.IsNullOrWhiteSpace(hash))
.Select(static hash => hash.Trim())
.ToArray();
if (hashes.Length > 0)
{
filters.Add(builder.In("upstream.content_hash", hashes));
}
}
if (!options.Aliases.IsDefaultOrEmpty)
{
var aliases = options.Aliases
.Where(static alias => !string.IsNullOrWhiteSpace(alias))
.Select(static alias => alias.Trim().ToLowerInvariant())
.ToArray();
if (aliases.Length > 0)
{
filters.Add(builder.In("linkset.aliases", aliases));
}
}
if (!options.PackageUrls.IsDefaultOrEmpty)
{
var purls = options.PackageUrls
.Where(static purl => !string.IsNullOrWhiteSpace(purl))
.Select(static purl => purl.Trim())
.ToArray();
if (purls.Length > 0)
{
filters.Add(builder.In("linkset.purls", purls));
}
}
if (options.Since is { } since)
{
filters.Add(builder.Gte("ingested_at", since.ToUniversalTime().UtcDateTime));
}
if (!string.IsNullOrWhiteSpace(options.Cursor) && TryDecodeCursor(options.Cursor, out var cursor))
{
var ingestTime = cursor.IngestedAt.UtcDateTime;
var cursorFilter = builder.Or(
builder.Lt("ingested_at", ingestTime),
builder.And(
builder.Eq("ingested_at", ingestTime),
builder.Gt("_id", cursor.Id)));
filters.Add(cursorFilter);
}
var filter = filters.Count == 1 ? filters[0] : builder.And(filters);
var limit = Math.Clamp(options.Limit, 1, 200);
var sort = Builders<BsonDocument>.Sort.Descending("ingested_at").Descending("_id");
var documents = await _collection
.Find(filter)
.Sort(sort)
.Limit(limit + 1)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
var hasMore = documents.Count > limit;
if (hasMore)
{
documents.RemoveAt(documents.Count - 1);
}
var records = documents.Select(MapToRecord).ToArray();
var nextCursor = hasMore && records.Length > 0
? EncodeCursor(records[^1].IngestedAt.UtcDateTime, records[^1].Id)
: null;
return new AdvisoryRawQueryResult(records, nextCursor, hasMore);
}
public async Task<IReadOnlyList<AdvisoryRawRecord>> ListForVerificationAsync(
string tenant,
DateTimeOffset since,
DateTimeOffset until,
IReadOnlyCollection<string> sourceVendors,
CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(tenant);
if (until < since)
{
throw new ArgumentException("Verification window end must not precede the start.");
}
var builder = Builders<BsonDocument>.Filter;
var filters = new List<FilterDefinition<BsonDocument>>
{
builder.Eq("tenant", tenant),
builder.Gte("ingested_at", since.ToUniversalTime().UtcDateTime),
builder.Lte("ingested_at", until.ToUniversalTime().UtcDateTime)
};
if (sourceVendors is { Count: > 0 })
{
filters.Add(builder.In("source.vendor", sourceVendors.Select(static vendor => vendor.Trim().ToLowerInvariant())));
}
var filter = builder.And(filters);
var sort = Builders<BsonDocument>.Sort.Ascending("ingested_at").Ascending("_id");
var documents = await _collection
.Find(filter)
.Sort(sort)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(MapToRecord).ToArray();
}
private BsonDocument CreateBsonDocument(AdvisoryRawDocument document, string? supersedesId)
{
var now = _timeProvider.GetUtcNow().UtcDateTime;
var id = BuildDocumentId(document);
var supersedesValue = string.IsNullOrWhiteSpace(supersedesId) ? document.Supersedes : supersedesId;
var source = new BsonDocument
{
{ "vendor", document.Source.Vendor },
{ "connector", document.Source.Connector },
{ "version", document.Source.ConnectorVersion }
};
if (!string.IsNullOrWhiteSpace(document.Source.Stream))
{
source["stream"] = document.Source.Stream;
}
var signature = new BsonDocument
{
{ "present", document.Upstream.Signature.Present }
};
if (!string.IsNullOrWhiteSpace(document.Upstream.Signature.Format))
{
signature["format"] = document.Upstream.Signature.Format;
}
if (!string.IsNullOrWhiteSpace(document.Upstream.Signature.KeyId))
{
signature["key_id"] = document.Upstream.Signature.KeyId;
}
if (!string.IsNullOrWhiteSpace(document.Upstream.Signature.Signature))
{
signature["sig"] = document.Upstream.Signature.Signature;
}
if (!string.IsNullOrWhiteSpace(document.Upstream.Signature.Certificate))
{
signature["certificate"] = document.Upstream.Signature.Certificate;
}
if (!string.IsNullOrWhiteSpace(document.Upstream.Signature.Digest))
{
signature["digest"] = document.Upstream.Signature.Digest;
}
var provenance = new BsonDocument();
if (document.Upstream.Provenance is not null)
{
foreach (var entry in document.Upstream.Provenance)
{
provenance[entry.Key] = entry.Value;
}
}
var upstream = new BsonDocument
{
{ "upstream_id", document.Upstream.UpstreamId },
{ "document_version", string.IsNullOrWhiteSpace(document.Upstream.DocumentVersion) ? BsonNull.Value : BsonValue.Create(document.Upstream.DocumentVersion) },
{ "retrieved_at", document.Upstream.RetrievedAt.UtcDateTime },
{ "content_hash", document.Upstream.ContentHash },
{ "signature", signature },
{ "provenance", provenance }
};
var content = new BsonDocument
{
{ "format", document.Content.Format },
{ "raw", document.Content.Raw.GetRawText() }
};
if (!string.IsNullOrWhiteSpace(document.Content.SpecVersion))
{
content["spec_version"] = document.Content.SpecVersion;
}
if (!string.IsNullOrWhiteSpace(document.Content.Encoding))
{
content["encoding"] = document.Content.Encoding;
}
var identifiers = new BsonDocument
{
{ "aliases", new BsonArray(document.Identifiers.Aliases) },
{ "primary", document.Identifiers.PrimaryId }
};
var references = new BsonArray(document.Linkset.References.Select(reference =>
{
var referenceDocument = new BsonDocument
{
{ "type", reference.Type },
{ "url", reference.Url }
};
if (!string.IsNullOrWhiteSpace(reference.Source))
{
referenceDocument["source"] = reference.Source;
}
return referenceDocument;
}));
var notes = new BsonDocument();
if (document.Linkset.Notes is not null)
{
foreach (var entry in document.Linkset.Notes)
{
notes[entry.Key] = entry.Value;
}
}
var linkset = new BsonDocument
{
{ "aliases", new BsonArray(document.Linkset.Aliases) },
{ "purls", new BsonArray(document.Linkset.PackageUrls) },
{ "cpes", new BsonArray(document.Linkset.Cpes) },
{ "references", references },
{ "reconciled_from", new BsonArray(document.Linkset.ReconciledFrom) },
{ "notes", notes }
};
var bson = new BsonDocument
{
{ "_id", id },
{ "tenant", document.Tenant },
{ "source", source },
{ "upstream", upstream },
{ "content", content },
{ "identifiers", identifiers },
{ "linkset", linkset },
{ "supersedes", supersedesValue is null ? BsonNull.Value : supersedesValue },
{ "created_at", document.Upstream.RetrievedAt.UtcDateTime },
{ "ingested_at", now }
};
return bson;
}
private AdvisoryRawRecord MapToRecord(BsonDocument document)
{
var tenant = GetRequiredString(document, "tenant");
var source = MapSource(document["source"].AsBsonDocument);
var upstream = MapUpstream(document["upstream"].AsBsonDocument);
var content = MapContent(document["content"].AsBsonDocument);
var identifiers = MapIdentifiers(document["identifiers"].AsBsonDocument);
var linkset = MapLinkset(document["linkset"].AsBsonDocument);
var supersedes = document.GetValue("supersedes", BsonNull.Value);
var rawDocument = new AdvisoryRawDocument(
tenant,
source,
upstream,
content,
identifiers,
linkset,
supersedes.IsBsonNull ? null : supersedes.AsString);
var ingestedAt = GetDateTimeOffset(document, "ingested_at", rawDocument.Upstream.RetrievedAt);
var createdAt = GetDateTimeOffset(document, "created_at", rawDocument.Upstream.RetrievedAt);
return new AdvisoryRawRecord(
document.GetValue("_id").AsString,
rawDocument,
ingestedAt,
createdAt);
}
private static RawSourceMetadata MapSource(BsonDocument source)
{
return new RawSourceMetadata(
GetRequiredString(source, "vendor"),
GetRequiredString(source, "connector"),
GetRequiredString(source, "version"),
GetOptionalString(source, "stream"));
}
private static RawUpstreamMetadata MapUpstream(BsonDocument upstream)
{
var provenance = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (upstream.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument)
{
foreach (var element in provenanceValue.AsBsonDocument)
{
provenance[element.Name] = BsonValueToString(element.Value);
}
}
var signature = upstream["signature"].AsBsonDocument;
var signatureMetadata = new RawSignatureMetadata(
signature.GetValue("present", BsonBoolean.False).AsBoolean,
signature.TryGetValue("format", out var format) && !format.IsBsonNull ? format.AsString : null,
signature.TryGetValue("key_id", out var keyId) && !keyId.IsBsonNull ? keyId.AsString : null,
signature.TryGetValue("sig", out var sig) && !sig.IsBsonNull ? sig.AsString : null,
signature.TryGetValue("certificate", out var certificate) && !certificate.IsBsonNull ? certificate.AsString : null,
signature.TryGetValue("digest", out var digest) && !digest.IsBsonNull ? digest.AsString : null);
return new RawUpstreamMetadata(
GetRequiredString(upstream, "upstream_id"),
upstream.TryGetValue("document_version", out var version) && !version.IsBsonNull ? version.AsString : null,
GetDateTimeOffset(upstream, "retrieved_at", DateTimeOffset.UtcNow),
GetRequiredString(upstream, "content_hash"),
signatureMetadata,
provenance.ToImmutable());
}
private static RawContent MapContent(BsonDocument content)
{
var rawValue = content.GetValue("raw", BsonNull.Value);
string rawJson;
if (rawValue.IsBsonNull)
{
rawJson = "{}";
}
else if (rawValue.IsString)
{
rawJson = rawValue.AsString ?? "{}";
}
else
{
rawJson = rawValue.ToJson(new JsonWriterSettings { OutputMode = JsonOutputMode.RelaxedExtendedJson });
}
using var document = System.Text.Json.JsonDocument.Parse(string.IsNullOrWhiteSpace(rawJson) ? "{}" : rawJson);
return new RawContent(
GetRequiredString(content, "format"),
content.TryGetValue("spec_version", out var specVersion) && !specVersion.IsBsonNull ? specVersion.AsString : null,
document.RootElement.Clone(),
content.TryGetValue("encoding", out var encoding) && !encoding.IsBsonNull ? encoding.AsString : null);
}
private static RawIdentifiers MapIdentifiers(BsonDocument identifiers)
{
var aliases = identifiers.TryGetValue("aliases", out var aliasValue) && aliasValue.IsBsonArray
? aliasValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
return new RawIdentifiers(
aliases,
GetRequiredString(identifiers, "primary"));
}
private static RawLinkset MapLinkset(BsonDocument linkset)
{
var aliases = linkset.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var purls = linkset.TryGetValue("purls", out var purlsValue) && purlsValue.IsBsonArray
? purlsValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var cpes = linkset.TryGetValue("cpes", out var cpesValue) && cpesValue.IsBsonArray
? cpesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var references = linkset.TryGetValue("references", out var referencesValue) && referencesValue.IsBsonArray
? referencesValue.AsBsonArray
.Where(static value => value.IsBsonDocument)
.Select(value =>
{
var doc = value.AsBsonDocument;
return new RawReference(
GetRequiredString(doc, "type"),
GetRequiredString(doc, "url"),
doc.TryGetValue("source", out var sourceValue) && !sourceValue.IsBsonNull ? sourceValue.AsString : null);
})
.ToImmutableArray()
: ImmutableArray<RawReference>.Empty;
var reconciledFrom = linkset.TryGetValue("reconciled_from", out var reconciledValue) && reconciledValue.IsBsonArray
? reconciledValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var notesBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (linkset.TryGetValue("notes", out var notesValue) && notesValue.IsBsonDocument)
{
foreach (var element in notesValue.AsBsonDocument)
{
notesBuilder[element.Name] = BsonValueToString(element.Value);
}
}
return new RawLinkset
{
Aliases = aliases,
PackageUrls = purls,
Cpes = cpes,
References = references,
ReconciledFrom = reconciledFrom,
Notes = notesBuilder.ToImmutable()
};
}
private static DateTimeOffset GetDateTimeOffset(BsonDocument document, string field, DateTimeOffset fallback)
{
if (!document.TryGetValue(field, out var value) || value.IsBsonNull)
{
return fallback;
}
return BsonValueToDateTimeOffset(value) ?? fallback;
}
private static DateTimeOffset? BsonValueToDateTimeOffset(BsonValue value)
{
switch (value.BsonType)
{
case BsonType.DateTime:
var dateTime = value.ToUniversalTime();
return new DateTimeOffset(DateTime.SpecifyKind(dateTime, DateTimeKind.Utc));
case BsonType.String:
if (DateTimeOffset.TryParse(value.AsString, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed))
{
return parsed.ToUniversalTime();
}
break;
case BsonType.Int64:
return DateTimeOffset.FromUnixTimeMilliseconds(value.AsInt64).ToUniversalTime();
}
return null;
}
private static string GetRequiredString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return string.Empty;
}
return value.IsString ? value.AsString : value.ToString() ?? string.Empty;
}
private static string? GetOptionalString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return null;
}
return value.IsString ? value.AsString : value.ToString();
}
private static string BsonValueToString(BsonValue value)
{
if (value.IsString)
{
return value.AsString ?? string.Empty;
}
if (value.IsBsonNull)
{
return string.Empty;
}
return value.ToString() ?? string.Empty;
}
private static string BuildDocumentId(AdvisoryRawDocument document)
{
var vendorSegment = SanitizeIdSegment(document.Source.Vendor);
var upstreamSegment = SanitizeIdSegment(document.Upstream.UpstreamId);
var revisionSegment = ComputeRevisionSegment(document.Upstream);
return $"advisory_raw:{vendorSegment}:{upstreamSegment}:{revisionSegment}";
}
private static string ComputeRevisionSegment(RawUpstreamMetadata upstream)
{
var hashSegment = SanitizeIdSegment(upstream.ContentHash.Replace(":", "-"));
if (string.IsNullOrWhiteSpace(upstream.DocumentVersion))
{
return hashSegment;
}
var versionSegment = SanitizeIdSegment(upstream.DocumentVersion);
if (hashSegment.Length > 12)
{
hashSegment = hashSegment[..12];
}
return $"{versionSegment}-{hashSegment}";
}
private static string SanitizeIdSegment(string? value)
{
if (string.IsNullOrWhiteSpace(value))
{
return "unknown";
}
var builder = new StringBuilder(value.Length);
foreach (var character in value.Trim())
{
if (char.IsLetterOrDigit(character))
{
builder.Append(char.ToLowerInvariant(character));
}
else if (character is '-' or '.')
{
builder.Append(character);
}
else
{
builder.Append('-');
}
}
var sanitized = builder.ToString().Trim('-');
if (sanitized.Length == 0)
{
return "unknown";
}
if (sanitized.Length > 64)
{
sanitized = sanitized[..64];
}
return sanitized;
}
private static string EncodeCursor(DateTime dateTimeUtc, string id)
{
var payload = $"{dateTimeUtc.Ticks}:{id}";
return Convert.ToBase64String(Encoding.UTF8.GetBytes(payload));
}
private static bool TryDecodeCursor(string cursor, out AdvisoryRawCursor result)
{
result = default;
try
{
var bytes = Convert.FromBase64String(cursor);
var payload = Encoding.UTF8.GetString(bytes);
var parts = payload.Split(':', CursorSegmentCount);
if (parts.Length != CursorSegmentCount)
{
return false;
}
if (!long.TryParse(parts[0], NumberStyles.Integer, CultureInfo.InvariantCulture, out var ticks))
{
return false;
}
var dateTime = new DateTime(ticks, DateTimeKind.Utc);
result = new AdvisoryRawCursor(new DateTimeOffset(dateTime), parts[1]);
return true;
}
catch
{
return false;
}
}
private readonly record struct AdvisoryRawCursor(DateTimeOffset IngestedAt, string Id);
}
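The keyset pagination in `QueryAsync` leans on the opaque cursor helpers above. A standalone sketch of that round-trip (the helper bodies are restated outside the repository so they can run on their own; the sample id is illustrative):

```csharp
using System;
using System.Globalization;
using System.Text;

static string EncodeCursor(DateTime dateTimeUtc, string id) =>
    Convert.ToBase64String(Encoding.UTF8.GetBytes($"{dateTimeUtc.Ticks}:{id}"));

static bool TryDecodeCursor(string cursor, out (DateTimeOffset IngestedAt, string Id) result)
{
    result = default;
    try
    {
        var payload = Encoding.UTF8.GetString(Convert.FromBase64String(cursor));
        // Split into at most two parts: document ids contain ':' themselves.
        var parts = payload.Split(':', 2);
        if (parts.Length != 2 ||
            !long.TryParse(parts[0], NumberStyles.Integer, CultureInfo.InvariantCulture, out var ticks))
        {
            return false;
        }
        result = (new DateTimeOffset(new DateTime(ticks, DateTimeKind.Utc)), parts[1]);
        return true;
    }
    catch
    {
        // Malformed base64 or out-of-range ticks: treat as no cursor.
        return false;
    }
}

var encoded = EncodeCursor(DateTime.UtcNow, "advisory_raw:osv:ghsa-xxxx:abc123");
TryDecodeCursor(encoded, out var decoded);
```

Because the cursor carries both the ingest timestamp and the `_id` tie-breaker, a page boundary that falls inside a burst of identically timestamped documents still resumes deterministically.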


@@ -0,0 +1,155 @@
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Driver;
using MongoDB.Driver.GridFS;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
namespace StellaOps.Concelier.Storage.Mongo;
/// <summary>
/// Periodically purges expired raw documents, associated DTO payloads, and GridFS content.
/// Complements TTL indexes by ensuring deterministic cleanup before Mongo's background sweeper runs.
/// </summary>
internal sealed class RawDocumentRetentionService : BackgroundService
{
private readonly IMongoCollection<DocumentDocument> _documents;
private readonly IMongoCollection<DtoDocument> _dtos;
private readonly GridFSBucket _bucket;
private readonly MongoStorageOptions _options;
private readonly ILogger<RawDocumentRetentionService> _logger;
private readonly TimeProvider _timeProvider;
public RawDocumentRetentionService(
IMongoDatabase database,
IOptions<MongoStorageOptions> options,
ILogger<RawDocumentRetentionService> logger,
TimeProvider? timeProvider = null)
{
ArgumentNullException.ThrowIfNull(database);
ArgumentNullException.ThrowIfNull(options);
ArgumentNullException.ThrowIfNull(logger);
_documents = database.GetCollection<DocumentDocument>(MongoStorageDefaults.Collections.Document);
_dtos = database.GetCollection<DtoDocument>(MongoStorageDefaults.Collections.Dto);
_bucket = new GridFSBucket(database, new GridFSBucketOptions
{
BucketName = "documents",
ReadConcern = database.Settings.ReadConcern,
WriteConcern = database.Settings.WriteConcern,
});
_options = options.Value;
_logger = logger;
_timeProvider = timeProvider ?? TimeProvider.System;
}
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
if (_options.RawDocumentRetention <= TimeSpan.Zero)
{
_logger.LogInformation("Raw document retention disabled; purge service idle.");
return;
}
var sweepInterval = _options.RawDocumentRetentionSweepInterval > TimeSpan.Zero
? _options.RawDocumentRetentionSweepInterval
: TimeSpan.FromHours(6);
while (!stoppingToken.IsCancellationRequested)
{
try
{
await SweepExpiredDocumentsAsync(stoppingToken).ConfigureAwait(false);
}
catch (OperationCanceledException) when (stoppingToken.IsCancellationRequested)
{
break;
}
catch (Exception ex)
{
_logger.LogError(ex, "Raw document retention sweep failed");
}
try
{
await Task.Delay(sweepInterval, stoppingToken).ConfigureAwait(false);
}
catch (OperationCanceledException) when (stoppingToken.IsCancellationRequested)
{
break;
}
}
}
internal async Task<int> SweepExpiredDocumentsAsync(CancellationToken cancellationToken)
{
var grace = _options.RawDocumentRetentionTtlGrace >= TimeSpan.Zero
? _options.RawDocumentRetentionTtlGrace
: TimeSpan.Zero;
var threshold = _timeProvider.GetUtcNow() + grace;
var filterBuilder = Builders<DocumentDocument>.Filter;
var filter = filterBuilder.And(
filterBuilder.Ne(doc => doc.ExpiresAt, null),
filterBuilder.Lte(doc => doc.ExpiresAt, threshold.UtcDateTime));
var removed = 0;
while (!cancellationToken.IsCancellationRequested)
{
var batch = await _documents
.Find(filter)
.SortBy(doc => doc.ExpiresAt)
.Limit(200)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
if (batch.Count == 0)
{
break;
}
foreach (var document in batch)
{
if (cancellationToken.IsCancellationRequested)
{
break;
}
await PurgeDocumentAsync(document, cancellationToken).ConfigureAwait(false);
removed++;
}
}
if (removed > 0)
{
_logger.LogInformation("Purged {Count} expired raw documents (threshold <= {Threshold})", removed, threshold);
}
return removed;
}
private async Task PurgeDocumentAsync(DocumentDocument document, CancellationToken cancellationToken)
{
if (document.GridFsId.HasValue)
{
try
{
await _bucket.DeleteAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (GridFSFileNotFoundException)
{
// already removed or TTL swept
}
catch (Exception ex)
{
_logger.LogWarning(ex, "Failed to delete GridFS payload {GridFsId} for document {DocumentId}", document.GridFsId, document.Id);
}
}
await _dtos.DeleteManyAsync(x => x.DocumentId == document.Id, cancellationToken).ConfigureAwait(false);
await _documents.DeleteOneAsync(x => x.Id == document.Id, cancellationToken).ConfigureAwait(false);
}
}
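The sweep's eligibility rule is small enough to state on its own: a document is purged once its `ExpiresAt` falls at or before `now + grace`. A minimal sketch of that predicate (standalone; parameter names are illustrative):

```csharp
using System;

static bool ShouldPurge(DateTime? expiresAtUtc, DateTimeOffset now, TimeSpan ttlGrace)
{
    if (expiresAtUtc is null)
    {
        return false; // no expiry recorded, so retention never removes it
    }
    // Negative grace clamps to zero, mirroring the sweep above.
    var grace = ttlGrace >= TimeSpan.Zero ? ttlGrace : TimeSpan.Zero;
    return expiresAtUtc.Value <= (now + grace).UtcDateTime;
}
```

Running the purge slightly ahead of the TTL index (a positive grace) keeps GridFS payloads and DTO rows from outliving their parent document when Mongo's background sweeper lags.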


@@ -0,0 +1,117 @@
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Options;
using MongoDB.Driver;
using StellaOps.Concelier.Core.Jobs;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Aliases;
using StellaOps.Concelier.Storage.Mongo.ChangeHistory;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.Raw;
using StellaOps.Concelier.Core.Raw;
using StellaOps.Concelier.Storage.Mongo.Exporting;
using StellaOps.Concelier.Storage.Mongo.JpFlags;
using StellaOps.Concelier.Storage.Mongo.MergeEvents;
using StellaOps.Concelier.Storage.Mongo.Conflicts;
using StellaOps.Concelier.Storage.Mongo.PsirtFlags;
using StellaOps.Concelier.Storage.Mongo.Statements;
using StellaOps.Concelier.Storage.Mongo.Events;
using StellaOps.Concelier.Core.Events;
using StellaOps.Concelier.Storage.Mongo.Migrations;
using StellaOps.Concelier.Storage.Mongo.Observations;
using StellaOps.Concelier.Core.Observations;
namespace StellaOps.Concelier.Storage.Mongo;
public static class ServiceCollectionExtensions
{
public static IServiceCollection AddMongoStorage(this IServiceCollection services, Action<MongoStorageOptions> configureOptions)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configureOptions);
services.AddOptions<MongoStorageOptions>()
.Configure(configureOptions)
.PostConfigure(static options => options.EnsureValid());
services.TryAddSingleton(TimeProvider.System);
services.AddSingleton<IMongoClient>(static sp =>
{
var options = sp.GetRequiredService<IOptions<MongoStorageOptions>>().Value;
return new MongoClient(options.ConnectionString);
});
services.AddSingleton(static sp =>
{
var options = sp.GetRequiredService<IOptions<MongoStorageOptions>>().Value;
var client = sp.GetRequiredService<IMongoClient>();
var settings = new MongoDatabaseSettings
{
ReadConcern = ReadConcern.Majority,
WriteConcern = WriteConcern.WMajority,
ReadPreference = ReadPreference.PrimaryPreferred,
};
var database = client.GetDatabase(options.GetDatabaseName(), settings);
var writeConcern = database.Settings.WriteConcern.With(wTimeout: options.CommandTimeout);
return database.WithWriteConcern(writeConcern);
});
services.AddScoped<IMongoSessionProvider, MongoSessionProvider>();
services.AddSingleton<MongoBootstrapper>();
services.AddSingleton<IJobStore, MongoJobStore>();
services.AddSingleton<ILeaseStore, MongoLeaseStore>();
services.AddSingleton<ISourceStateRepository, MongoSourceStateRepository>();
services.AddSingleton<IDocumentStore, DocumentStore>();
services.AddSingleton<IDtoStore, DtoStore>();
services.AddSingleton<IAdvisoryStore, AdvisoryStore>();
services.AddSingleton<IAliasStore, AliasStore>();
services.AddSingleton<IChangeHistoryStore, MongoChangeHistoryStore>();
services.AddSingleton<IJpFlagStore, JpFlagStore>();
services.AddSingleton<IPsirtFlagStore, PsirtFlagStore>();
services.AddSingleton<IMergeEventStore, MergeEventStore>();
services.AddSingleton<IAdvisoryStatementStore, AdvisoryStatementStore>();
services.AddSingleton<IAdvisoryConflictStore, AdvisoryConflictStore>();
services.AddSingleton<IAdvisoryObservationStore, AdvisoryObservationStore>();
services.AddSingleton<IAdvisoryObservationLookup, AdvisoryObservationLookup>();
services.AddSingleton<IAdvisoryEventRepository, MongoAdvisoryEventRepository>();
services.AddSingleton<IAdvisoryEventLog, AdvisoryEventLog>();
services.AddSingleton<IAdvisoryRawRepository, MongoAdvisoryRawRepository>();
services.AddSingleton<IExportStateStore, ExportStateStore>();
services.TryAddSingleton<ExportStateManager>();
services.AddSingleton<IMongoCollection<JobRunDocument>>(static sp =>
{
var database = sp.GetRequiredService<IMongoDatabase>();
return database.GetCollection<JobRunDocument>(MongoStorageDefaults.Collections.Jobs);
});
services.AddSingleton<IMongoCollection<JobLeaseDocument>>(static sp =>
{
var database = sp.GetRequiredService<IMongoDatabase>();
return database.GetCollection<JobLeaseDocument>(MongoStorageDefaults.Collections.Locks);
});
services.AddSingleton<IMongoCollection<AdvisoryObservationDocument>>(static sp =>
{
var database = sp.GetRequiredService<IMongoDatabase>();
return database.GetCollection<AdvisoryObservationDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
});
services.AddHostedService<RawDocumentRetentionService>();
services.AddSingleton<MongoMigrationRunner>();
services.AddSingleton<IMongoMigration, EnsureDocumentExpiryIndexesMigration>();
services.AddSingleton<IMongoMigration, EnsureGridFsExpiryIndexesMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryRawIdempotencyIndexMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisorySupersedesBackfillMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryRawValidatorMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryEventCollectionsMigration>();
services.AddSingleton<IMongoMigration, SemVerStyleBackfillMigration>();
return services;
}
}
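Wiring the module from a host might look like the following sketch. `ConnectionString`, `RawDocumentRetention`, and `RawDocumentRetentionSweepInterval` are the option properties this file reads; treat the concrete values as placeholders:

```csharp
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();
services.AddMongoStorage(options =>
{
    options.ConnectionString = "mongodb://localhost:27017/concelier";
    options.RawDocumentRetention = TimeSpan.FromDays(30);              // <= 0 leaves the purge service idle
    options.RawDocumentRetentionSweepInterval = TimeSpan.FromHours(6);
});
// At startup, run MongoBootstrapper / MongoMigrationRunner before serving traffic
// so collections, validators, and indexes exist (exact invocation depends on the host).
```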


@@ -0,0 +1,73 @@
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo;
[BsonIgnoreExtraElements]
public sealed class SourceStateDocument
{
[BsonId]
public string SourceName { get; set; } = string.Empty;
[BsonElement("enabled")]
public bool Enabled { get; set; } = true;
[BsonElement("paused")]
public bool Paused { get; set; }
[BsonElement("cursor")]
public BsonDocument Cursor { get; set; } = new();
[BsonElement("lastSuccess")]
[BsonIgnoreIfNull]
public DateTime? LastSuccess { get; set; }
[BsonElement("lastFailure")]
[BsonIgnoreIfNull]
public DateTime? LastFailure { get; set; }
[BsonElement("failCount")]
public int FailCount { get; set; }
[BsonElement("backoffUntil")]
[BsonIgnoreIfNull]
public DateTime? BackoffUntil { get; set; }
[BsonElement("updatedAt")]
public DateTime UpdatedAt { get; set; }
[BsonElement("lastFailureReason")]
[BsonIgnoreIfNull]
public string? LastFailureReason { get; set; }
}
internal static class SourceStateDocumentExtensions
{
public static SourceStateDocument FromRecord(SourceStateRecord record)
=> new()
{
SourceName = record.SourceName,
Enabled = record.Enabled,
Paused = record.Paused,
Cursor = record.Cursor ?? new BsonDocument(),
LastSuccess = record.LastSuccess?.UtcDateTime,
LastFailure = record.LastFailure?.UtcDateTime,
FailCount = record.FailCount,
BackoffUntil = record.BackoffUntil?.UtcDateTime,
UpdatedAt = record.UpdatedAt.UtcDateTime,
LastFailureReason = record.LastFailureReason,
};
public static SourceStateRecord ToRecord(this SourceStateDocument document)
=> new(
document.SourceName,
document.Enabled,
document.Paused,
document.Cursor ?? new BsonDocument(),
document.LastSuccess.HasValue ? DateTime.SpecifyKind(document.LastSuccess.Value, DateTimeKind.Utc) : null,
document.LastFailure.HasValue ? DateTime.SpecifyKind(document.LastFailure.Value, DateTimeKind.Utc) : null,
document.FailCount,
document.BackoffUntil.HasValue ? DateTime.SpecifyKind(document.BackoffUntil.Value, DateTimeKind.Utc) : null,
DateTime.SpecifyKind(document.UpdatedAt, DateTimeKind.Utc),
document.LastFailureReason);
}


@@ -0,0 +1,15 @@
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo;
public sealed record SourceStateRecord(
string SourceName,
bool Enabled,
bool Paused,
BsonDocument Cursor,
DateTimeOffset? LastSuccess,
DateTimeOffset? LastFailure,
int FailCount,
DateTimeOffset? BackoffUntil,
DateTimeOffset UpdatedAt,
string? LastFailureReason);


@@ -0,0 +1,19 @@
using System;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Concelier.Storage.Mongo;
public static class SourceStateRepositoryExtensions
{
public static Task<SourceStateRecord?> MarkFailureAsync(
this ISourceStateRepository repository,
string sourceName,
DateTimeOffset failedAt,
TimeSpan? backoff,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(repository);
return repository.MarkFailureAsync(sourceName, failedAt, backoff, failureReason: null, cancellationToken);
}
}


@@ -0,0 +1,62 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
namespace StellaOps.Concelier.Storage.Mongo.Statements;
[BsonIgnoreExtraElements]
public sealed class AdvisoryStatementDocument
{
[BsonId]
public string Id { get; set; } = string.Empty;
[BsonElement("vulnerabilityKey")]
public string VulnerabilityKey { get; set; } = string.Empty;
[BsonElement("advisoryKey")]
public string AdvisoryKey { get; set; } = string.Empty;
[BsonElement("statementHash")]
public byte[] StatementHash { get; set; } = Array.Empty<byte>();
[BsonElement("asOf")]
public DateTime AsOf { get; set; }
[BsonElement("recordedAt")]
public DateTime RecordedAt { get; set; }
[BsonElement("payload")]
public BsonDocument Payload { get; set; } = new();
[BsonElement("inputDocuments")]
public List<string> InputDocuments { get; set; } = new();
}
internal static class AdvisoryStatementDocumentExtensions
{
public static AdvisoryStatementDocument FromRecord(AdvisoryStatementRecord record)
=> new()
{
Id = record.Id.ToString(),
VulnerabilityKey = record.VulnerabilityKey,
AdvisoryKey = record.AdvisoryKey,
StatementHash = record.StatementHash,
AsOf = record.AsOf.UtcDateTime,
RecordedAt = record.RecordedAt.UtcDateTime,
Payload = (BsonDocument)record.Payload.DeepClone(),
InputDocuments = record.InputDocumentIds.Select(static id => id.ToString()).ToList(),
};
public static AdvisoryStatementRecord ToRecord(this AdvisoryStatementDocument document)
=> new(
Guid.Parse(document.Id),
document.VulnerabilityKey,
document.AdvisoryKey,
document.StatementHash,
DateTime.SpecifyKind(document.AsOf, DateTimeKind.Utc),
DateTime.SpecifyKind(document.RecordedAt, DateTimeKind.Utc),
(BsonDocument)document.Payload.DeepClone(),
document.InputDocuments.Select(static value => Guid.Parse(value)).ToList());
}


@@ -0,0 +1,15 @@
using System;
using System.Collections.Generic;
using MongoDB.Bson;
namespace StellaOps.Concelier.Storage.Mongo.Statements;
public sealed record AdvisoryStatementRecord(
Guid Id,
string VulnerabilityKey,
string AdvisoryKey,
byte[] StatementHash,
DateTimeOffset AsOf,
DateTimeOffset RecordedAt,
BsonDocument Payload,
IReadOnlyList<Guid> InputDocumentIds);


@@ -0,0 +1,93 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
namespace StellaOps.Concelier.Storage.Mongo.Statements;
public interface IAdvisoryStatementStore
{
ValueTask InsertAsync(
IReadOnlyCollection<AdvisoryStatementRecord> statements,
CancellationToken cancellationToken,
IClientSessionHandle? session = null);
ValueTask<IReadOnlyList<AdvisoryStatementRecord>> GetStatementsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken,
IClientSessionHandle? session = null);
}
public sealed class AdvisoryStatementStore : IAdvisoryStatementStore
{
private readonly IMongoCollection<AdvisoryStatementDocument> _collection;
public AdvisoryStatementStore(IMongoDatabase database)
{
ArgumentNullException.ThrowIfNull(database);
_collection = database.GetCollection<AdvisoryStatementDocument>(MongoStorageDefaults.Collections.AdvisoryStatements);
}
public async ValueTask InsertAsync(
IReadOnlyCollection<AdvisoryStatementRecord> statements,
CancellationToken cancellationToken,
IClientSessionHandle? session = null)
{
ArgumentNullException.ThrowIfNull(statements);
if (statements.Count == 0)
{
return;
}
var documents = statements.Select(AdvisoryStatementDocumentExtensions.FromRecord).ToList();
var options = new InsertManyOptions { IsOrdered = true };
try
{
if (session is null)
{
await _collection.InsertManyAsync(documents, options, cancellationToken).ConfigureAwait(false);
}
else
{
await _collection.InsertManyAsync(session, documents, options, cancellationToken).ConfigureAwait(false);
}
}
catch (MongoBulkWriteException ex) when (ex.WriteErrors.All(error => error.Category == ServerErrorCategory.DuplicateKey))
{
// All duplicates already exist; safe to ignore for the immutable statement log.
}
}
public async ValueTask<IReadOnlyList<AdvisoryStatementRecord>> GetStatementsAsync(
string vulnerabilityKey,
DateTimeOffset? asOf,
CancellationToken cancellationToken,
IClientSessionHandle? session = null)
{
ArgumentException.ThrowIfNullOrWhiteSpace(vulnerabilityKey);
var filter = Builders<AdvisoryStatementDocument>.Filter.Eq(document => document.VulnerabilityKey, vulnerabilityKey);
if (asOf.HasValue)
{
filter &= Builders<AdvisoryStatementDocument>.Filter.Lte(document => document.AsOf, asOf.Value.UtcDateTime);
}
var find = session is null
? _collection.Find(filter)
: _collection.Find(session, filter);
var documents = await find
.SortByDescending(document => document.AsOf)
.ThenByDescending(document => document.RecordedAt)
.ToListAsync(cancellationToken)
.ConfigureAwait(false);
return documents.Select(static document => document.ToRecord()).ToList();
}
}
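The as-of query above answers "what did we believe at time T": every statement with `AsOf` at or before the requested instant, newest first. The same selection expressed over in-memory tuples (a sketch; the anonymous shape only mirrors the two time fields of `AdvisoryStatementRecord`):

```csharp
using System;
using System.Globalization;
using System.Linq;

var statements = new[]
{
    (AsOf: DateTimeOffset.Parse("2025-10-01T00:00:00Z", CultureInfo.InvariantCulture), RecordedAt: DateTimeOffset.Parse("2025-10-01T01:00:00Z", CultureInfo.InvariantCulture)),
    (AsOf: DateTimeOffset.Parse("2025-10-15T00:00:00Z", CultureInfo.InvariantCulture), RecordedAt: DateTimeOffset.Parse("2025-10-15T01:00:00Z", CultureInfo.InvariantCulture)),
    (AsOf: DateTimeOffset.Parse("2025-11-01T00:00:00Z", CultureInfo.InvariantCulture), RecordedAt: DateTimeOffset.Parse("2025-11-01T01:00:00Z", CultureInfo.InvariantCulture)),
};

DateTimeOffset? asOf = DateTimeOffset.Parse("2025-10-20T00:00:00Z", CultureInfo.InvariantCulture);
var visible = statements
    .Where(s => asOf is null || s.AsOf <= asOf.Value)
    .OrderByDescending(s => s.AsOf)
    .ThenByDescending(s => s.RecordedAt)
    .ToList();
// visible holds the 2025-10-15 and 2025-10-01 statements, newest first.
```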


@@ -0,0 +1,18 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <LangVersion>preview</LangVersion>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="MongoDB.Driver" Version="3.5.0" />
    <PackageReference Include="Microsoft.Extensions.Options" Version="10.0.0-rc.2.25502.107" />
    <PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj" />
    <ProjectReference Include="..\StellaOps.Concelier.Models\StellaOps.Concelier.Models.csproj" />
  </ItemGroup>

</Project>


@@ -0,0 +1,30 @@
# TASKS — Epic 1: Aggregation-Only Contract
> **AOC Reminder:** storage enforces append-only raw documents; no precedence/severity/normalization in ingestion collections.

| ID | Status | Owner(s) | Depends on | Notes |
|---|---|---|---|---|
| CONCELIER-STORE-AOC-19-001 `advisory_raw schema validator` | DONE (2025-10-28) | Concelier Storage Guild | Mongo cluster ops sign-off | Author MongoDB JSON schema enforcing required fields (`source`, `upstream`, `content`, `linkset`, `tenant`) and forbidding normalized/severity fields. Include migration toggles for staged rollout. |
> 2025-10-28: Added configurable validator migration (`20251028_advisory_raw_validator`), bootstrapper collection registration, storage options toggle, and Mongo migration tests covering schema + enforcement levels.
> Docs alignment (2025-10-26): Validator expectations + deployment steps documented in `docs/deploy/containers.md` §1.

| CONCELIER-STORE-AOC-19-002 `idempotency unique index` | DONE (2025-10-28) | Concelier Storage Guild | CONCELIER-STORE-AOC-19-001 | Create compound unique index on `(source.vendor, upstream.upstream_id, upstream.content_hash, tenant)` with backfill script verifying existing data, and document offline validator bootstrap. |
> 2025-10-28: Added `20251028_advisory_raw_idempotency_index` migration that detects duplicate raw advisories before creating the unique compound index, wired into DI, and extended migration tests to cover index shape + duplicate handling with supporting package updates.
> Docs alignment (2025-10-26): Idempotency contract + supersedes metrics in `docs/ingestion/aggregation-only-contract.md` §7 and observability guide.

| CONCELIER-STORE-AOC-19-003 `append-only supersedes migration` | DONE (2025-10-28) | Concelier Storage Guild | CONCELIER-STORE-AOC-19-002 | Introduce migration that freezes legacy `advisories` writes, copies data into `_backup_*`, and backfills supersedes pointers for raw revisions. Provide rollback plan. |
> 2025-10-28: Added supersedes backfill migration (`20251028_advisory_supersedes_backfill`) that renames `advisory` to a read-only view, snapshots data into `_backup_20251028`, and walks raw revisions to populate deterministic supersedes chains with integration coverage and operator scripts.
> Docs alignment (2025-10-26): Rollback guidance added to `docs/deploy/containers.md` §6.

| CONCELIER-STORE-AOC-19-004 `validator deployment playbook` | DONE (2025-10-28) | Concelier Storage Guild, DevOps Guild | CONCELIER-STORE-AOC-19-001 | Update `MIGRATIONS.md` and Offline Kit docs to cover enabling validators, rolling restarts, and validator smoke tests for air-gapped installs. |
> 2025-10-28: Documented duplicate audit + migration workflow in `docs/deploy/containers.md`, Offline Kit guide, and `MIGRATIONS.md`; published `ops/devops/scripts/check-advisory-raw-duplicates.js` for staging/offline clusters.
> Docs alignment (2025-10-26): Offline kit requirements documented in `docs/deploy/containers.md` §5.
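
The validator from CONCELIER-STORE-AOC-19-001 can be sketched as follows. This is a minimal illustration, not the shipped `20251028_advisory_raw_validator` migration: the property shapes, the forbidden field names beyond severity/normalization, and the `collMod` wiring are assumptions; only the required field list and the "no derived data" rule come from the task above.

```javascript
// Hypothetical $jsonSchema validator for the advisory_raw collection.
const advisoryRawValidator = {
  $jsonSchema: {
    bsonType: "object",
    required: ["source", "upstream", "content", "linkset", "tenant"],
    properties: {
      source: { bsonType: "object" },
      upstream: { bsonType: "object" },
      content: { bsonType: "object" },
      linkset: { bsonType: "object" },
      tenant: { bsonType: "string" },
    },
    // Reject documents that smuggle in derived data: ingestion stores raw
    // advisories only (no precedence/severity/normalization).
    not: {
      anyOf: [
        { required: ["severity"] },
        { required: ["normalized"] },
        { required: ["precedence"] },
      ],
    },
  },
};

// Staged-rollout toggle: start permissive, flip to strict/error once the
// existing data passes the duplicate/shape audit.
const validatorOptions = { validationLevel: "moderate", validationAction: "warn" };

// In mongosh this would be applied roughly as:
// db.runCommand({ collMod: "advisory_raw", validator: advisoryRawValidator, ...validatorOptions });
console.log(JSON.stringify(advisoryRawValidator.$jsonSchema.required));
// → ["source","upstream","content","linkset","tenant"]
```

The `not`/`anyOf` construction is the standard JSON-schema idiom for forbidding fields, which matters here because `additionalProperties: false` would also block legitimate ingest metadata.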
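
For CONCELIER-STORE-AOC-19-002/-004, the idempotency index and the pre-flight duplicate audit might look like this. The index and collection names are placeholders, and the aggregation is only in the spirit of `ops/devops/scripts/check-advisory-raw-duplicates.js` — the real script's output format may differ; the four-part key itself is taken from the task.

```javascript
// Hypothetical compound unique index enforcing the idempotency contract
// on (source.vendor, upstream.upstream_id, upstream.content_hash, tenant).
const idempotencyIndex = {
  key: {
    "source.vendor": 1,
    "upstream.upstream_id": 1,
    "upstream.content_hash": 1,
    tenant: 1,
  },
  name: "advisory_raw_idempotency",
  unique: true,
};

// Duplicate audit: group by the candidate key and surface any group with
// more than one document; these must be resolved before the unique index
// can be created on existing data.
const duplicateAuditPipeline = [
  {
    $group: {
      _id: {
        vendor: "$source.vendor",
        upstreamId: "$upstream.upstream_id",
        contentHash: "$upstream.content_hash",
        tenant: "$tenant",
      },
      count: { $sum: 1 },
      ids: { $push: "$_id" },
    },
  },
  { $match: { count: { $gt: 1 } } },
];

// mongosh sketch:
// db.advisory_raw.aggregate(duplicateAuditPipeline);  // must return nothing
// db.advisory_raw.createIndexes([idempotencyIndex]);  // then safe to build
console.log(Object.keys(idempotencyIndex.key).length); // → 4
```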
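
The core of the CONCELIER-STORE-AOC-19-003 backfill can be sketched as a pure function over one upstream advisory's raw revisions: order them deterministically, then point each revision at its predecessor. Field names (`recordedAt`, `supersedes`) and the tie-break rule are assumptions, not the shipped migration's schema.

```javascript
// Hypothetical backfill step: given all raw revisions of one upstream
// advisory, produce deterministic supersedes pointers (oldest first; the
// first revision supersedes nothing).
function buildSupersedesChain(revisions) {
  const ordered = [...revisions].sort((a, b) =>
    a.recordedAt === b.recordedAt
      ? a._id.localeCompare(b._id) // tie-break on _id for determinism
      : a.recordedAt.localeCompare(b.recordedAt));
  return ordered.map((doc, i) => ({
    _id: doc._id,
    supersedes: i === 0 ? null : ordered[i - 1]._id,
  }));
}

const chain = buildSupersedesChain([
  { _id: "raw-3", recordedAt: "2025-10-27T09:00:00Z" },
  { _id: "raw-1", recordedAt: "2025-10-25T12:00:00Z" },
  { _id: "raw-2", recordedAt: "2025-10-26T08:00:00Z" },
]);
console.log(chain.map(r => `${r._id}<-${r.supersedes}`).join(" "));
// → raw-1<-null raw-2<-raw-1 raw-3<-raw-2
```

Because the ordering is deterministic, re-running the backfill (or replaying it on a restored `_backup_*` snapshot during rollback) yields the same chains.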

## Policy Engine v2
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| CONCELIER-POLICY-20-003 `Selection cursors` | TODO | Concelier Storage Guild | CONCELIER-STORE-AOC-19-002, POLICY-ENGINE-20-003 | Add advisory/vex selection cursors (per policy run) with change stream checkpoints, indexes, and offline migration scripts to support incremental evaluations. |
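
The per-run selection cursor in CONCELIER-POLICY-20-003 could be persisted as a small checkpoint document carrying the change stream resume token. The shape below is an assumption for illustration, not the agreed schema; resume tokens are opaque values that are simply stored and handed back to `watch()`.

```javascript
// Hypothetical selection cursor for one policy run: remember where the
// change stream left off so the next evaluation is incremental.
function advanceCursor(cursor, changeEvent) {
  return {
    ...cursor,
    resumeToken: changeEvent._id,          // opaque token from the event
    lastDocumentKey: changeEvent.documentKey,
    updatedAt: changeEvent.clusterTime,
  };
}

let cursor = {
  policyRunId: "run-42",                    // hypothetical run identifier
  collection: "advisory_observations",
  resumeToken: null,
};

cursor = advanceCursor(cursor, {
  _id: { _data: "8266F0" },                 // placeholder resume token
  documentKey: { _id: "obs-1" },
  clusterTime: "2025-10-28T13:00:00Z",
});
console.log(cursor.lastDocumentKey._id); // → obs-1
```

On restart, the stored token would be passed as `resumeAfter` when reopening the change stream, so each policy run only sees advisories changed since its last checkpoint.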

## Link-Not-Merge v1
| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| CONCELIER-LNM-21-101 `Observations collections` | TODO | Concelier Storage Guild | CONCELIER-LNM-21-001 | Provision `advisory_observations` and `advisory_linksets` collections with hashed shard keys, TTL for ingest metadata, and required indexes (`aliases`, `purls`, `observation_ids`). |
| CONCELIER-LNM-21-102 `Migration tooling` | TODO | Concelier Storage Guild, DevOps Guild | CONCELIER-LNM-21-101 | Backfill legacy merged advisories into observation/linkset collections, create tombstones for merged docs, and supply rollback scripts. |
| CONCELIER-LNM-21-103 `Blob/store wiring` | TODO | Concelier Storage Guild | CONCELIER-LNM-21-101 | Store large raw payloads in object storage with pointers from observations; update bootstrapper/offline kit to seed sample blobs. |
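
The CONCELIER-LNM-21-101 provisioning might be shaped as below. The index names, the 30-day TTL window, and the choice of `observation_id` as the hashed shard key are all assumptions to confirm with cluster ops; the required query indexes (`aliases`, `purls`, `observation_ids`) come from the task itself.

```javascript
// Hypothetical index specs for the Link-Not-Merge collections.
const observationIndexes = [
  { key: { observation_id: "hashed" }, name: "shard_hashed" }, // hashed shard key
  { key: { aliases: 1, tenant: 1 }, name: "aliases_tenant" },
  { key: { purls: 1, tenant: 1 }, name: "purls_tenant" },
  // TTL covers ingest metadata only; 30 days is a placeholder window.
  { key: { "ingest.recordedAt": 1 }, name: "ingest_ttl", expireAfterSeconds: 30 * 24 * 3600 },
];

const linksetIndexes = [
  { key: { observation_ids: 1 }, name: "observation_ids" },
  { key: { aliases: 1, tenant: 1 }, name: "aliases_tenant" },
];

// mongosh sketch:
// sh.shardCollection("concelier.advisory_observations", { observation_id: "hashed" });
// db.advisory_observations.createIndexes(observationIndexes);
// db.advisory_linksets.createIndexes(linksetIndexes);
console.log(observationIndexes.find(i => i.name === "ingest_ttl").expireAfterSeconds);
// → 2592000
```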