Align AOC tasks for Excititor and Concelier

This commit is contained in: master
Committed by root on 2025-10-31 18:50:15 +02:00
parent 9e6d9fbae8
commit 8da4e12a90
334 changed files with 35528 additions and 34546 deletions

View File

@@ -1,59 +1,59 @@
# FEEDCONN-CERTCC-02-009 VINCE Detail & Map Reintegration Plan
- **Author:** BE-Conn-CERTCC (current on-call)
- **Date:** 2025-10-11
- **Scope:** Restore VINCE detail parsing and canonical mapping in Concelier without destabilising downstream Merge/Export pipelines.
## 1. Current State Snapshot (2025-10-11)
- ✅ Fetch pipeline, VINCE summary planner, and detail queue are live; documents land with `DocumentStatuses.PendingParse`.
- ✅ DTO aggregate (`CertCcNoteDto`) plus mapper emit vendor-centric `normalizedVersions` (`scheme=certcc.vendor`) and provenance aligned with `src/Concelier/__Libraries/StellaOps.Concelier.Models/PROVENANCE_GUIDELINES.md`.
- ✅ Regression coverage exists for fetch/parse/map flows (`CertCcConnectorSnapshotTests`), but snapshot regeneration is gated on harness refresh (FEEDCONN-CERTCC-02-007) and QA handoff (FEEDCONN-CERTCC-02-008).
- ⚠️ Parse/map jobs are not scheduled; production still operates in fetch-only mode.
- ⚠️ Downstream Merge team is finalising normalized range ingestion per `src/FASTER_MODELING_AND_NORMALIZATION.md`; we must avoid publishing canonical records until they certify compatibility.
## 2. Required Dependencies & Coordinated Tasks
| Dependency | Owner(s) | Blocking Condition | Handshake |
|------------|----------|--------------------|-----------|
| FEEDCONN-CERTCC-02-004 (Canonical mapping & range primitives hardening) | BE-Conn-CERTCC + Models | Ensure mapper emits deterministic `normalizedVersions` array and provenance field masks | Daily sync with Models/Merge leads; share fixture diff before each enablement phase |
| FEEDCONN-CERTCC-02-007 (Connector test harness remediation) | BE-Conn-CERTCC, QA | Restore `AddSourceCommon` harness + canned VINCE fixtures so we can shadow-run parse/map | Required before Phase 1 |
| FEEDCONN-CERTCC-02-008 (Snapshot coverage handoff) | QA | Snapshot refresh process green to surface regressions | Required before Phase 2 |
| FEEDCONN-CERTCC-02-010 (Partial-detail graceful degradation) | BE-Conn-CERTCC | Resiliency for missing VINCE endpoints to avoid job wedging after reintegration | Should land before Phase 2 cutover |
## 3. Phased Rollout Plan
| Phase | Window (UTC) | Actions | Success Signals | Rollback |
|-------|--------------|---------|-----------------|----------|
| **0 Pre-flight validation** | 2025-10-11 → 2025-10-12 | • Finish FEEDCONN-CERTCC-02-007 harness fixes and regenerate fixtures.<br>• Run `dotnet test src/Concelier/__Tests/StellaOps.Concelier.Connector.CertCc.Tests` with `UPDATE_CERTCC_FIXTURES=0` to confirm deterministic baselines.<br>• Generate sample advisory batch (`dotnet test … --filter SnapshotSmoke`) and deliver JSON diff to Merge for schema verification (`normalizedVersions[].scheme == certcc.vendor`, provenance masks populated). | • Harness tests green locally and in CI.<br>• Merge sign-off that sample advisories conform to `FASTER_MODELING_AND_NORMALIZATION.md`. | N/A (no production enablement yet). |
| **1 Shadow parse/map in staging** | Target start 2025-10-13 | • Register `source:cert-cc:parse` and `source:cert-cc:map` jobs, but gate them behind the new config flag `concelier:sources:cert-cc:enableDetailMapping` (default `false`; see the gating sketch below this table).<br>• Deploy (restart required for options rebinding), enable the flag, and point the connector at staging Mongo with an isolated collection (`advisories_certcc_shadow`).<br>• Run the connector for ≥2 cycles; compare advisory counts against the fetch-only baseline and validate that `concelier.range.primitives` metrics include `scheme=certcc.vendor`. | • No uncaught exceptions in staging logs.<br>• Shadow advisories match expected vendor counts (±5%).<br>• `certcc.summary.fetch.*` and the new `certcc.map.duration.ms` metrics stable. | Disable the flag; staging returns to fetch-only. No production impact. |
| **2 Controlled production enablement** | Target start 2025-10-14 | • Redeploy production with flag enabled, start with job concurrency `1`, and reduce `MaxNotesPerFetch` to 5 for first 24h.<br>• Observe metrics dashboards hourly (fetch/map latency, pending queues, Mongo write throughput).<br>• QA to replay latest snapshots and confirm no deterministic drift.<br>• Publish advisory sample (top 10 changed docs) to Merge Slack channel for validation. | • Pending parse/mapping queues drain within expected SLA (<30min).<br>• No increase in merge dedupe anomalies.<br>• Mongo writes stay within 10% of baseline. | Toggle flag off, re-run fetch-only. Clear `pendingMappings` via connector cursor reset if stuck. |
| **3 Full production & cleanup** | Target start 2025-10-15 | • Restore `MaxNotesPerFetch` to its configured default (20).<br>• Remove temporary throttles and leave the flag enabled by default.<br>• Update `README.md` rollout notes; close FEEDCONN-CERTCC-02-009.<br>• Kick off a post-merge audit with Merge to ensure new advisories dedupe with other sources. | • Stable operations for ≥48h, no degradation alerts.<br>• Merge confirms conflict resolver behaviour unchanged. | If a regression is detected, revert to the Phase 2 state or disable the jobs; retain this plan for reuse. |
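For reference, a minimal sketch of the Phase 1 gating, assuming an options class bound from `concelier:sources:cert-cc` and jobs registered through `IServiceCollection`; the `IJob` interface and job type names are illustrative stand-ins, not the connector's actual API:

```csharp
using Microsoft.Extensions.DependencyInjection;

// Stub types standing in for the real connector jobs (illustrative only).
public interface IJob { }
public sealed class CertCcFetchJob : IJob { }
public sealed class CertCcParseJob : IJob { }
public sealed class CertCcMapJob : IJob { }

public sealed class CertCcOptions
{
    // Bound from concelier:sources:cert-cc:enableDetailMapping (default false).
    public bool EnableDetailMapping { get; set; }
}

public static class CertCcJobRegistration
{
    public static void Register(IServiceCollection services, CertCcOptions options)
    {
        // Fetch always runs; parse/map only join the schedule when the flag is set,
        // so clearing the flag and restarting returns the connector to fetch-only mode.
        services.AddSingleton<IJob, CertCcFetchJob>();
        if (options.EnableDetailMapping)
        {
            services.AddSingleton<IJob, CertCcParseJob>();  // source:cert-cc:parse
            services.AddSingleton<IJob, CertCcMapJob>();    // source:cert-cc:map
        }
    }
}
```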
## 4. Monitoring & Validation Checklist
- Dashboards: `certcc.*` meters (plan, summary fetch, detail fetch) plus `concelier.range.primitives` with tag `scheme=certcc.vendor`.
- Logs: ensure Parse/Map jobs emit `correlationId` aligned with fetch events for traceability.
- Data QA: run `src/Tools/dump_advisory` against two VINCE notes (one multi-vendor, one single-vendor) every phase to spot-check normalized versions ordering and provenance.
- Storage: verify Mongo TTL/size for `raw_documents` and `dtos`; detail payload volume increases by ~3× when mapping resumes (see the size-check sketch below).
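A hedged helper for the storage check, using the MongoDB .NET driver's `collStats` command; the database name and connection string are placeholders, and the collection names follow the checklist wording:

```csharp
using System;
using MongoDB.Bson;
using MongoDB.Driver;

// Sketch: sample collection sizes before and after enabling mapping to confirm the ~3x estimate.
static long CollectionSizeBytes(IMongoDatabase database, string collectionName)
{
    var stats = database.RunCommand<BsonDocument>(new BsonDocument("collStats", collectionName));
    return stats.GetValue("size", 0).ToInt64();
}

var db = new MongoClient("mongodb://localhost:27017").GetDatabase("concelier");
Console.WriteLine($"raw_documents: {CollectionSizeBytes(db, "raw_documents")} bytes");
Console.WriteLine($"dtos: {CollectionSizeBytes(db, "dtos")} bytes");
```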
## 5. Rollback / Contingency Playbook
1. Disable `concelier:sources:cert-cc:enableDetailMapping` flag (and optionally set `MaxNotesPerFetch=0` for a single cycle) to halt new detail ingestion.
2. Run the connector once to update the cursor; verify `pendingMappings` drains (see the drain-check sketch after this list).
3. If advisories already persisted, coordinate with Merge to soft-delete affected `certcc/*` advisories by advisory key hash (no schema rollback required).
4. Re-run Phase 1 shadow validation before retrying.
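A sketch for verifying the drain in step 2, assuming the connector cursor is a document in a `source_cursors`-style collection carrying a `pendingMappings` array; both names are assumptions about the storage schema, not confirmed identifiers:

```csharp
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

// Sketch: report how many mappings are still pending on the cert-cc cursor document.
static async Task<int> PendingMappingCountAsync(IMongoDatabase database)
{
    var cursors = database.GetCollection<BsonDocument>("source_cursors");
    var cursor = await cursors
        .Find(Builders<BsonDocument>.Filter.Eq("source", "cert-cc"))
        .FirstOrDefaultAsync();

    return cursor is not null
        && cursor.TryGetValue("pendingMappings", out var pending)
        && pending.IsBsonArray
            ? pending.AsBsonArray.Count
            : 0;
}
```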
## 6. Communication Cadence
- Daily check-in with Models/Merge leads (09:30 EDT) to surface normalizedVersions/provenance diffs.
- Post-phase reports in `#concelier-certcc` Slack channel summarising metrics, advisory counts, and outstanding issues.
- Escalate blockers >12h via Runbook SEV-3 path and annotate `TASKS.md`.
## 7. Open Questions / Next Actions
- [ ] Confirm whether Merge requires additional provenance field masks before Phase 2 (waiting on feedback from the 2025-10-11 sample).
- [ ] Decide if CSAF endpoint ingestion (optional) should piggyback on Phase 3 or stay deferred.
- [ ] Validate that FEEDCONN-CERTCC-02-010 coverage handles mixed 200/404 VINCE endpoints during partial outages.
Once the dependencies in Section 2 are cleared and Phase 3 completes, update `src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.CertCc/TASKS.md` and close FEEDCONN-CERTCC-02-009.

View File

@@ -19,23 +19,25 @@ internal sealed class AdvisoryObservationFactory : IAdvisoryObservationFactory
ArgumentNullException.ThrowIfNull(rawDocument);
var source = CreateSource(rawDocument.Source, rawDocument.Upstream);
var upstream = CreateUpstream(rawDocument.Upstream);
var content = CreateContent(rawDocument.Content);
var linkset = CreateLinkset(rawDocument.Identifiers, rawDocument.Linkset);
var rawLinkset = CreateRawLinkset(rawDocument.Identifiers, rawDocument.Linkset);
var attributes = CreateAttributes(rawDocument);
var createdAt = (observedAt ?? rawDocument.Upstream.RetrievedAt).ToUniversalTime();
return new AdvisoryObservation(
observationId: BuildObservationId(rawDocument),
tenant: rawDocument.Tenant,
source: source,
upstream: upstream,
content: content,
linkset: linkset,
rawLinkset: rawLinkset,
createdAt: createdAt,
attributes: attributes);
}
private static AdvisoryObservationSource CreateSource(RawSourceMetadata source, RawUpstreamMetadata upstream)
{
@@ -110,16 +112,64 @@ internal sealed class AdvisoryObservationFactory : IAdvisoryObservationFactory
return JsonNode.Parse(document.RootElement.GetRawText()) ?? JsonNode.Parse("{}")!;
}
private static AdvisoryObservationLinkset CreateLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliases = NormalizeAliases(identifiers, linkset);
var purls = NormalizePackageUrls(linkset.PackageUrls);
var cpes = NormalizeCpes(linkset.Cpes);
var references = NormalizeReferences(linkset.References);
return new AdvisoryObservationLinkset(aliases, purls, cpes, references);
}
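/// <summary>
/// Builds the raw linkset projection: the primary identifier first, then identifier and
/// linkset aliases in upstream order (blank entries skipped, duplicates preserved), with
/// default immutable collections normalised to empty instances.
/// </summary>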
private static RawLinkset CreateRawLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliasBuilder = ImmutableArray.CreateBuilder<string>();
if (!string.IsNullOrWhiteSpace(identifiers.PrimaryId))
{
aliasBuilder.Add(identifiers.PrimaryId);
}
if (!identifiers.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in identifiers.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
if (!linkset.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in linkset.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
static ImmutableArray<string> EnsureArray(ImmutableArray<string> values)
=> values.IsDefault ? ImmutableArray<string>.Empty : values;
static ImmutableArray<RawReference> EnsureReferences(ImmutableArray<RawReference> values)
=> values.IsDefault ? ImmutableArray<RawReference>.Empty : values;
return linkset with
{
Aliases = aliasBuilder.ToImmutable(),
PackageUrls = EnsureArray(linkset.PackageUrls),
Cpes = EnsureArray(linkset.Cpes),
References = EnsureReferences(linkset.References),
ReconciledFrom = EnsureArray(linkset.ReconciledFrom),
Notes = linkset.Notes ?? ImmutableDictionary<string, string>.Empty
};
}
private static IEnumerable<string> NormalizeAliases(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

View File

@@ -1,6 +1,7 @@
using System.Collections.Immutable;
using System.Text.Json;
using System.Text.Json.Nodes;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Models.Observations;
@@ -12,19 +13,21 @@ public sealed record AdvisoryObservation
AdvisoryObservationSource source,
AdvisoryObservationUpstream upstream,
AdvisoryObservationContent content,
AdvisoryObservationLinkset linkset,
RawLinkset rawLinkset,
DateTimeOffset createdAt,
ImmutableDictionary<string, string>? attributes = null)
{
ObservationId = Validation.EnsureNotNullOrWhiteSpace(observationId, nameof(observationId));
Tenant = Validation.EnsureNotNullOrWhiteSpace(tenant, nameof(tenant)).ToLowerInvariant();
Source = source ?? throw new ArgumentNullException(nameof(source));
Upstream = upstream ?? throw new ArgumentNullException(nameof(upstream));
Content = content ?? throw new ArgumentNullException(nameof(content));
Linkset = linkset ?? throw new ArgumentNullException(nameof(linkset));
RawLinkset = SanitizeRawLinkset(rawLinkset);
CreatedAt = createdAt.ToUniversalTime();
Attributes = NormalizeAttributes(attributes);
}
public string ObservationId { get; }
@@ -34,15 +37,17 @@ public sealed record AdvisoryObservation
public AdvisoryObservationUpstream Upstream { get; }
public AdvisoryObservationContent Content { get; }
public AdvisoryObservationLinkset Linkset { get; }
public RawLinkset RawLinkset { get; }
public DateTimeOffset CreatedAt { get; }
public ImmutableDictionary<string, string> Attributes { get; }
private static ImmutableDictionary<string, string> NormalizeAttributes(ImmutableDictionary<string, string>? attributes)
{
if (attributes is null || attributes.Count == 0)
{
@@ -59,10 +64,58 @@ public sealed record AdvisoryObservation
builder[pair.Key.Trim()] = pair.Value;
}
return builder.ToImmutable();
}
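/// <summary>
/// Normalises the raw linkset before it is stored on the observation: a null instance
/// becomes an empty <see cref="RawLinkset"/>, default immutable arrays become empty, and
/// null notes become an empty dictionary.
/// </summary>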
private static RawLinkset SanitizeRawLinkset(RawLinkset? rawLinkset)
{
if (rawLinkset is null)
{
return new RawLinkset();
}
static ImmutableArray<string> SanitizeStrings(ImmutableArray<string> values)
{
if (values.IsDefault)
{
return ImmutableArray<string>.Empty;
}
return values;
}
static ImmutableArray<RawReference> SanitizeReferences(ImmutableArray<RawReference> references)
{
if (references.IsDefault)
{
return ImmutableArray<RawReference>.Empty;
}
return references;
}
static ImmutableDictionary<string, string> SanitizeNotes(ImmutableDictionary<string, string>? notes)
{
if (notes is null || notes.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
return notes;
}
return rawLinkset with
{
Aliases = SanitizeStrings(rawLinkset.Aliases),
PackageUrls = SanitizeStrings(rawLinkset.PackageUrls),
Cpes = SanitizeStrings(rawLinkset.Cpes),
References = SanitizeReferences(rawLinkset.References),
ReconciledFrom = SanitizeStrings(rawLinkset.ReconciledFrom),
Notes = SanitizeNotes(rawLinkset.Notes)
};
}
}
public sealed record AdvisoryObservationSource
{

View File

@@ -1,12 +1,15 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\StellaOps.Concelier.RawModels\StellaOps.Concelier.RawModels.csproj" />
</ItemGroup>
</Project>

View File

@@ -26,6 +26,7 @@ This module owns the persistent shape of Concelier's MongoDB database. Upgrades
| `20251028_advisory_raw_idempotency_index` | Applies compound unique index on `(source.vendor, upstream.upstream_id, upstream.content_hash, tenant)` after verifying no duplicates exist. |
| `20251028_advisory_supersedes_backfill` | Renames legacy `advisory` collection to a read-only backup view and backfills `supersedes` chains across `advisory_raw`. |
| `20251028_advisory_raw_validator` | Applies Aggregation-Only Contract JSON schema validator to the `advisory_raw` collection with configurable enforcement level. |
| `20251104_advisory_observations_raw_linkset` | Backfills `rawLinkset` on `advisory_observations` using stored `advisory_raw` documents so canonical and raw projections co-exist for downstream policy joins. |
## Operator Runbook

View File

@@ -0,0 +1,442 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using MongoDB.Driver;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Migrations;
/// <summary>
/// Backfills the raw linkset projection on advisory observations so downstream services
/// can rely on both canonical and raw linkset shapes.
/// </summary>
public sealed class EnsureAdvisoryObservationsRawLinksetMigration : IMongoMigration
{
private const string MigrationId = "20251104_advisory_observations_raw_linkset";
private const int BulkBatchSize = 500;
public string Id => MigrationId;
public string Description => "Populate rawLinkset field for advisory observations using stored advisory_raw documents.";
public async Task ApplyAsync(IMongoDatabase database, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(database);
var observations = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryObservations);
var rawCollection = database.GetCollection<BsonDocument>(MongoStorageDefaults.Collections.AdvisoryRaw);
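// Backfill only observations that never received the projection: rawLinkset missing or null.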
var filter = Builders<BsonDocument>.Filter.Exists("rawLinkset", false) |
Builders<BsonDocument>.Filter.Type("rawLinkset", BsonType.Null);
using var cursor = await observations
.Find(filter)
.ToCursorAsync(cancellationToken)
.ConfigureAwait(false);
var updates = new List<WriteModel<BsonDocument>>(BulkBatchSize);
var missingRawDocuments = new List<string>();
while (await cursor.MoveNextAsync(cancellationToken).ConfigureAwait(false))
{
foreach (var observationDocument in cursor.Current)
{
cancellationToken.ThrowIfCancellationRequested();
if (!TryExtractObservationKey(observationDocument, out var key))
{
continue;
}
var rawFilter = Builders<BsonDocument>.Filter.Eq("tenant", key.Tenant) &
Builders<BsonDocument>.Filter.Eq("source.vendor", key.Vendor) &
Builders<BsonDocument>.Filter.Eq("upstream.upstream_id", key.UpstreamId) &
Builders<BsonDocument>.Filter.Eq("upstream.content_hash", key.ContentHash);
var rawDocument = await rawCollection
.Find(rawFilter)
.Sort(Builders<BsonDocument>.Sort.Descending("ingested_at").Descending("_id"))
.Limit(1)
.FirstOrDefaultAsync(cancellationToken)
.ConfigureAwait(false);
if (rawDocument is null)
{
missingRawDocuments.Add(key.ToString());
continue;
}
var advisoryRaw = MapToRawDocument(rawDocument);
var rawLinkset = BuildRawLinkset(advisoryRaw.Identifiers, advisoryRaw.Linkset);
var rawLinksetDocument = BuildRawLinksetBson(rawLinkset);
var update = Builders<BsonDocument>.Update.Set("rawLinkset", rawLinksetDocument);
updates.Add(new UpdateOneModel<BsonDocument>(
Builders<BsonDocument>.Filter.Eq("_id", observationDocument["_id"].AsString),
update));
if (updates.Count >= BulkBatchSize)
{
await observations.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
updates.Clear();
}
}
}
if (updates.Count > 0)
{
await observations.BulkWriteAsync(updates, cancellationToken: cancellationToken).ConfigureAwait(false);
}
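// Fail fast when any observation lacks a matching advisory_raw document; the filter above
// only targets observations without rawLinkset, so the migration can be re-run after repair.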
if (missingRawDocuments.Count > 0)
{
throw new InvalidOperationException(
$"Unable to locate advisory_raw documents for {missingRawDocuments.Count} observations: {string.Join(", ", missingRawDocuments.Take(10))}");
}
}
private static bool TryExtractObservationKey(BsonDocument observation, out ObservationKey key)
{
key = default;
if (!observation.TryGetValue("tenant", out var tenantValue) || tenantValue.IsBsonNull)
{
return false;
}
if (!observation.TryGetValue("source", out var sourceValue) || sourceValue is not BsonDocument sourceDocument)
{
return false;
}
if (!observation.TryGetValue("upstream", out var upstreamValue) || upstreamValue is not BsonDocument upstreamDocument)
{
return false;
}
var tenant = tenantValue.AsString;
var vendor = sourceDocument.GetValue("vendor", BsonString.Empty).AsString;
var upstreamId = upstreamDocument.GetValue("upstream_id", BsonString.Empty).AsString;
var contentHash = upstreamDocument.GetValue("contentHash", BsonString.Empty).AsString;
var createdAt = observation.GetValue("createdAt", BsonNull.Value);
key = new ObservationKey(
tenant,
vendor,
upstreamId,
contentHash,
BsonValueToDateTimeOffset(createdAt) ?? DateTimeOffset.UtcNow);
return !string.IsNullOrWhiteSpace(tenant) &&
!string.IsNullOrWhiteSpace(vendor) &&
!string.IsNullOrWhiteSpace(upstreamId) &&
!string.IsNullOrWhiteSpace(contentHash);
}
private static AdvisoryRawDocument MapToRawDocument(BsonDocument document)
{
var tenant = GetRequiredString(document, "tenant");
var source = MapSource(document["source"].AsBsonDocument);
var upstream = MapUpstream(document["upstream"].AsBsonDocument);
var content = MapContent(document["content"].AsBsonDocument);
var identifiers = MapIdentifiers(document["identifiers"].AsBsonDocument);
var linkset = MapLinkset(document["linkset"].AsBsonDocument);
var supersedes = document.GetValue("supersedes", BsonNull.Value);
return new AdvisoryRawDocument(
tenant,
source,
upstream,
content,
identifiers,
linkset,
supersedes.IsBsonNull ? null : supersedes.AsString);
}
private static RawSourceMetadata MapSource(BsonDocument source)
{
return new RawSourceMetadata(
GetRequiredString(source, "vendor"),
GetRequiredString(source, "connector"),
GetRequiredString(source, "version"),
GetOptionalString(source, "stream"));
}
private static RawUpstreamMetadata MapUpstream(BsonDocument upstream)
{
var provenanceBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (upstream.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument)
{
foreach (var element in provenanceValue.AsBsonDocument)
{
provenanceBuilder[element.Name] = BsonValueToString(element.Value);
}
}
var signatureDocument = upstream["signature"].AsBsonDocument;
var signature = new RawSignatureMetadata(
signatureDocument.GetValue("present", BsonBoolean.False).AsBoolean,
signatureDocument.TryGetValue("format", out var format) && !format.IsBsonNull ? format.AsString : null,
signatureDocument.TryGetValue("key_id", out var keyId) && !keyId.IsBsonNull ? keyId.AsString : null,
signatureDocument.TryGetValue("sig", out var sig) && !sig.IsBsonNull ? sig.AsString : null,
signatureDocument.TryGetValue("certificate", out var certificate) && !certificate.IsBsonNull ? certificate.AsString : null,
signatureDocument.TryGetValue("digest", out var digest) && !digest.IsBsonNull ? digest.AsString : null);
return new RawUpstreamMetadata(
GetRequiredString(upstream, "upstream_id"),
upstream.TryGetValue("document_version", out var version) && !version.IsBsonNull ? version.AsString : null,
GetDateTimeOffset(upstream, "retrieved_at", DateTimeOffset.UtcNow),
GetRequiredString(upstream, "content_hash"),
signature,
provenanceBuilder.ToImmutable());
}
private static RawContent MapContent(BsonDocument content)
{
var rawValue = content.GetValue("raw", BsonNull.Value);
string rawJson;
if (rawValue.IsBsonNull)
{
rawJson = "{}";
}
else if (rawValue.IsString)
{
rawJson = rawValue.AsString ?? "{}";
}
else
{
rawJson = rawValue.ToJson(new JsonWriterSettings { OutputMode = JsonOutputMode.RelaxedExtendedJson });
}
using var document = JsonDocument.Parse(string.IsNullOrWhiteSpace(rawJson) ? "{}" : rawJson);
return new RawContent(
GetRequiredString(content, "format"),
content.TryGetValue("spec_version", out var specVersion) && !specVersion.IsBsonNull ? specVersion.AsString : null,
document.RootElement.Clone(),
content.TryGetValue("encoding", out var encoding) && !encoding.IsBsonNull ? encoding.AsString : null);
}
private static RawIdentifiers MapIdentifiers(BsonDocument identifiers)
{
var aliases = identifiers.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
return new RawIdentifiers(
aliases,
GetRequiredString(identifiers, "primary"));
}
private static RawLinkset MapLinkset(BsonDocument linkset)
{
var aliases = linkset.TryGetValue("aliases", out var aliasesValue) && aliasesValue.IsBsonArray
? aliasesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var purls = linkset.TryGetValue("purls", out var purlsValue) && purlsValue.IsBsonArray
? purlsValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var cpes = linkset.TryGetValue("cpes", out var cpesValue) && cpesValue.IsBsonArray
? cpesValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var references = linkset.TryGetValue("references", out var referencesValue) && referencesValue.IsBsonArray
? referencesValue.AsBsonArray
.Where(static value => value.IsBsonDocument)
.Select(value =>
{
var doc = value.AsBsonDocument;
return new RawReference(
GetRequiredString(doc, "type"),
GetRequiredString(doc, "url"),
doc.TryGetValue("source", out var source) && !source.IsBsonNull ? source.AsString : null);
})
.ToImmutableArray()
: ImmutableArray<RawReference>.Empty;
var reconciledFrom = linkset.TryGetValue("reconciled_from", out var reconciledValue) && reconciledValue.IsBsonArray
? reconciledValue.AsBsonArray.Select(BsonValueToString).ToImmutableArray()
: ImmutableArray<string>.Empty;
var notesBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (linkset.TryGetValue("notes", out var notesValue) && notesValue.IsBsonDocument)
{
foreach (var element in notesValue.AsBsonDocument)
{
notesBuilder[element.Name] = BsonValueToString(element.Value);
}
}
return new RawLinkset
{
Aliases = aliases,
PackageUrls = purls,
Cpes = cpes,
References = references,
ReconciledFrom = reconciledFrom,
Notes = notesBuilder.ToImmutable()
};
}
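// Mirrors AdvisoryObservationFactory.CreateRawLinkset so the backfill emits the same
// projection the factory produces for newly ingested observations.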
private static RawLinkset BuildRawLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliasBuilder = ImmutableArray.CreateBuilder<string>();
if (!string.IsNullOrWhiteSpace(identifiers.PrimaryId))
{
aliasBuilder.Add(identifiers.PrimaryId);
}
if (!identifiers.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in identifiers.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
if (!linkset.Aliases.IsDefaultOrEmpty)
{
foreach (var alias in linkset.Aliases)
{
if (!string.IsNullOrEmpty(alias))
{
aliasBuilder.Add(alias);
}
}
}
static ImmutableArray<string> EnsureArray(ImmutableArray<string> values)
=> values.IsDefault ? ImmutableArray<string>.Empty : values;
static ImmutableArray<RawReference> EnsureReferences(ImmutableArray<RawReference> values)
=> values.IsDefault ? ImmutableArray<RawReference>.Empty : values;
return linkset with
{
Aliases = aliasBuilder.ToImmutable(),
PackageUrls = EnsureArray(linkset.PackageUrls),
Cpes = EnsureArray(linkset.Cpes),
References = EnsureReferences(linkset.References),
ReconciledFrom = EnsureArray(linkset.ReconciledFrom),
Notes = linkset.Notes ?? ImmutableDictionary<string, string>.Empty
};
}
private static BsonDocument BuildRawLinksetBson(RawLinkset rawLinkset)
{
var references = new BsonArray(rawLinkset.References.Select(reference =>
{
var referenceDocument = new BsonDocument
{
{ "type", reference.Type },
{ "url", reference.Url }
};
if (!string.IsNullOrWhiteSpace(reference.Source))
{
referenceDocument["source"] = reference.Source;
}
return referenceDocument;
}));
var notes = new BsonDocument();
if (rawLinkset.Notes is not null)
{
foreach (var entry in rawLinkset.Notes)
{
notes[entry.Key] = entry.Value;
}
}
return new BsonDocument
{
{ "aliases", new BsonArray(rawLinkset.Aliases) },
{ "purls", new BsonArray(rawLinkset.PackageUrls) },
{ "cpes", new BsonArray(rawLinkset.Cpes) },
{ "references", references },
{ "reconciled_from", new BsonArray(rawLinkset.ReconciledFrom) },
{ "notes", notes }
};
}
private static string GetRequiredString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return string.Empty;
}
return value.IsString ? value.AsString : value.ToString() ?? string.Empty;
}
private static string? GetOptionalString(BsonDocument document, string key)
{
if (!document.TryGetValue(key, out var value) || value.IsBsonNull)
{
return null;
}
return value.IsString ? value.AsString : value.ToString();
}
private static string BsonValueToString(BsonValue value)
{
if (value.IsString)
{
return value.AsString ?? string.Empty;
}
if (value.IsBsonNull)
{
return string.Empty;
}
return value.ToString() ?? string.Empty;
}
private static DateTimeOffset GetDateTimeOffset(BsonDocument document, string field, DateTimeOffset fallback)
{
if (!document.TryGetValue(field, out var value) || value.IsBsonNull)
{
return fallback;
}
return BsonValueToDateTimeOffset(value) ?? fallback;
}
private static DateTimeOffset? BsonValueToDateTimeOffset(BsonValue value)
{
return value.BsonType switch
{
BsonType.DateTime => new DateTimeOffset(DateTime.SpecifyKind(value.ToUniversalTime(), DateTimeKind.Utc)),
BsonType.String when DateTimeOffset.TryParse(value.AsString, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed)
=> parsed.ToUniversalTime(),
BsonType.Int64 => DateTimeOffset.FromUnixTimeMilliseconds(value.AsInt64).ToUniversalTime(),
_ => null
};
}
private readonly record struct ObservationKey(
string Tenant,
string Vendor,
string UpstreamId,
string ContentHash,
DateTimeOffset CreatedAt)
{
public override string ToString()
=> $"{Tenant}:{Vendor}:{UpstreamId}:{ContentHash}";
}
}

View File

@@ -24,11 +24,16 @@ public sealed class AdvisoryObservationDocument
public AdvisoryObservationContentDocument Content { get; set; } = new();
[BsonElement("linkset")]
public AdvisoryObservationLinksetDocument Linkset { get; set; } = new();
[BsonElement("rawLinkset")]
[BsonIgnoreIfNull]
public AdvisoryObservationRawLinksetDocument? RawLinkset { get; set; }
= null;
[BsonElement("createdAt")]
public DateTime CreatedAt { get; set; }
= DateTime.UtcNow;
[BsonElement("attributes")]
[BsonIgnoreIfNull]
@@ -129,11 +134,11 @@ public sealed class AdvisoryObservationContentDocument
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationLinksetDocument
{
[BsonElement("aliases")]
[BsonIgnoreIfNull]
public List<string>? Aliases { get; set; }
= new();
[BsonElement("purls")]
@@ -153,11 +158,62 @@ public sealed class AdvisoryObservationLinksetDocument
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationReferenceDocument
{
[BsonElement("type")]
public string Type { get; set; } = string.Empty;
[BsonElement("url")]
public string Url { get; set; } = string.Empty;
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationRawLinksetDocument
{
[BsonElement("aliases")]
[BsonIgnoreIfNull]
public List<string>? Aliases { get; set; }
= new();
[BsonElement("purls")]
[BsonIgnoreIfNull]
public List<string>? PackageUrls { get; set; }
= new();
[BsonElement("cpes")]
[BsonIgnoreIfNull]
public List<string>? Cpes { get; set; }
= new();
[BsonElement("references")]
[BsonIgnoreIfNull]
public List<AdvisoryObservationRawReferenceDocument>? References { get; set; }
= new();
[BsonElement("reconciled_from")]
[BsonIgnoreIfNull]
public List<string>? ReconciledFrom { get; set; }
= new();
[BsonElement("notes")]
[BsonIgnoreIfNull]
public Dictionary<string, string>? Notes { get; set; }
= new(StringComparer.Ordinal);
}
[BsonIgnoreExtraElements]
public sealed class AdvisoryObservationRawReferenceDocument
{
[BsonElement("type")]
[BsonIgnoreIfNull]
public string? Type { get; set; }
= null;
[BsonElement("url")]
public string Url { get; set; } = string.Empty;
[BsonElement("source")]
[BsonIgnoreIfNull]
public string? Source { get; set; }
= null;
}

View File

@@ -5,7 +5,8 @@ using System.Linq;
using System.Text.Json.Nodes;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models.Observations;
using StellaOps.Concelier.RawModels;
namespace StellaOps.Concelier.Storage.Mongo.Observations;
@@ -22,12 +23,14 @@ internal static class AdvisoryObservationDocumentFactory
var contentMetadata = ToImmutable(document.Content.Metadata);
var upstreamMetadata = ToImmutable(document.Upstream.Metadata);
var rawLinkset = ToRawLinkset(document.RawLinkset);
var observation = new AdvisoryObservation(
document.Id,
document.Tenant,
new AdvisoryObservationSource(
document.Source.Vendor,
document.Source.Stream,
document.Source.Api,
document.Source.CollectorVersion),
new AdvisoryObservationUpstream(
@@ -42,21 +45,22 @@ internal static class AdvisoryObservationDocumentFactory
document.Upstream.Signature.KeyId,
document.Upstream.Signature.Signature),
upstreamMetadata),
new AdvisoryObservationContent(
document.Content.Format,
document.Content.SpecVersion,
rawNode,
contentMetadata),
new AdvisoryObservationLinkset(
document.Linkset.Aliases ?? Enumerable.Empty<string>(),
document.Linkset.Purls ?? Enumerable.Empty<string>(),
document.Linkset.Cpes ?? Enumerable.Empty<string>(),
document.Linkset.References?.Select(reference => new AdvisoryObservationReference(reference.Type, reference.Url))),
rawLinkset,
DateTime.SpecifyKind(document.CreatedAt, DateTimeKind.Utc),
attributes);
return observation;
}
private static JsonNode ParseJsonNode(BsonDocument raw)
{
@@ -87,6 +91,72 @@ internal static class AdvisoryObservationDocumentFactory
builder[pair.Key.Trim()] = pair.Value;
}
return builder.ToImmutable();
}
private static RawLinkset ToRawLinkset(AdvisoryObservationRawLinksetDocument? document)
{
if (document is null)
{
return new RawLinkset();
}
static ImmutableArray<string> ToImmutableStringArray(List<string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableArray<string>.Empty;
}
return values
.Select(static value => value ?? string.Empty)
.ToImmutableArray();
}
static ImmutableArray<RawReference> ToImmutableReferences(List<AdvisoryObservationRawReferenceDocument>? references)
{
if (references is null || references.Count == 0)
{
return ImmutableArray<RawReference>.Empty;
}
return references
.Select(static reference => new RawReference(
reference.Type ?? string.Empty,
reference.Url,
reference.Source))
.ToImmutableArray();
}
static ImmutableDictionary<string, string> ToImmutableDictionary(Dictionary<string, string>? values)
{
if (values is null || values.Count == 0)
{
return ImmutableDictionary<string, string>.Empty;
}
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
foreach (var pair in values)
{
if (pair.Key is null)
{
continue;
}
builder[pair.Key] = pair.Value;
}
return builder.ToImmutable();
}
return new RawLinkset
{
Aliases = ToImmutableStringArray(document.Aliases),
PackageUrls = ToImmutableStringArray(document.PackageUrls),
Cpes = ToImmutableStringArray(document.Cpes),
References = ToImmutableReferences(document.References),
ReconciledFrom = ToImmutableStringArray(document.ReconciledFrom),
Notes = ToImmutableDictionary(document.Notes)
};
}
}

View File

@@ -109,6 +109,7 @@ public static class ServiceCollectionExtensions
services.AddSingleton<IMongoMigration, EnsureAdvisoryRawIdempotencyIndexMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisorySupersedesBackfillMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryRawValidatorMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryObservationsRawLinksetMigration>();
services.AddSingleton<IMongoMigration, EnsureAdvisoryEventCollectionsMigration>();
services.AddSingleton<IMongoMigration, SemVerStyleBackfillMigration>();

View File

@@ -10,6 +10,7 @@
> Docs alignment (2025-10-26): Rollback guidance added to `docs/deploy/containers.md` §6.
> 2025-10-28: Documented duplicate audit + migration workflow in `docs/deploy/containers.md`, Offline Kit guide, and `MIGRATIONS.md`; published `ops/devops/scripts/check-advisory-raw-duplicates.js` for staging/offline clusters.
> Docs alignment (2025-10-26): Offline kit requirements documented in `docs/deploy/containers.md` §5.
| CONCELIER-STORE-AOC-19-005 `Raw linkset backfill` | TODO (2025-11-04) | Concelier Storage Guild, DevOps Guild | CONCELIER-CORE-AOC-19-004 | Plan and execute advisory_observations `rawLinkset` backfill (online + Offline Kit bundles), supply migration scripts + rehearse rollback. Follow the coordination plan in `docs/dev/raw-linkset-backfill-plan.md`. |
## Policy Engine v2