feat: Add CVSS receipt management endpoints and related functionality

- Introduced new API endpoints for creating, retrieving, amending, and listing CVSS receipts.
- Updated IPolicyEngineClient interface to include methods for CVSS receipt operations.
- Implemented PolicyEngineClient to handle CVSS receipt requests.
- Enhanced Program.cs to map new CVSS receipt routes with appropriate authorization.
- Added necessary models and contracts for CVSS receipt requests and responses.
- Integrated Postgres document store for managing CVSS receipts and related data.
- Updated database schema with new migrations for source documents and payload storage.
- Refactored existing components to support new CVSS functionality.
This commit is contained in:
StellaOps Bot
2025-12-07 00:43:14 +02:00
parent 0de92144d2
commit 53889d85e7
67 changed files with 17207 additions and 16293 deletions

@@ -0,0 +1,16 @@
# Mock Overlay (Dev Only)
Purpose: let deployment tasks progress with placeholder digests until real releases land.
Use:
```bash
helm template mock ./deploy/helm/stellaops -f deploy/helm/stellaops/values-mock.yaml
```
Contents:
- Mock deployments for orchestrator, policy-registry, packs-registry, task-runner, VEX Lens, issuer-directory, findings-ledger, vuln-explorer-api.
- Image pins pulled from `deploy/releases/2025.09-mock-dev.yaml`.
Notes:
- Annotated with `stellaops.dev/mock: "true"` to discourage production use.
- Swap to real values once official digests publish; keep mock overlay gated behind `mock.enabled`.
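As a sketch of how the `mock.enabled` gate and the annotation typically fit together in values and templates (key names beyond `mock.enabled` and the annotation itself are illustrative assumptions, not the shipped chart schema):

```yaml
# Illustrative sketch only — not the shipped values-mock.yaml.
mock:
  enabled: true          # production/real overlays must leave this false or absent

# Each *-mock.yaml template then guards itself and carries the dev-only marker:
# {{- if .Values.mock.enabled }}
# metadata:
#   annotations:
#     stellaops.dev/mock: "true"   # discourages accidental production use
# {{- end }}
```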

@@ -34,7 +34,7 @@
| 6 | CVSS-DSSE-190-006 | DONE (2025-11-28) | Depends on 190-005; uses Attestor primitives. | Policy Guild · Attestor Guild (`src/Policy/StellaOps.Policy.Scoring`, `src/Attestor/StellaOps.Attestor.Envelope`) | Attach DSSE attestations to score receipts: create `stella.ops/cvssReceipt@v1` predicate type, sign receipts, store envelope references. |
| 7 | CVSS-HISTORY-190-007 | DONE (2025-11-28) | Depends on 190-005. | Policy Guild (`src/Policy/StellaOps.Policy.Scoring/History`) | Implement receipt amendment tracking: `AmendReceipt(receiptId, field, newValue, reason, ref)` with history entry creation and re-signing. |
| 8 | CVSS-CONCELIER-190-008 | DONE (2025-12-06) | Depends on 190-001; Concelier AGENTS updated 2025-12-06. | Concelier Guild · Policy Guild (`src/Concelier/__Libraries/StellaOps.Concelier.Core`) | Ingest vendor-provided CVSS v4.0 vectors from advisories; parse and store as base receipts; preserve provenance. (Implemented CVSS priority ordering in Advisory → Postgres conversion so v4 vectors are primary and provenance-preserved.) |
-| 9 | CVSS-API-190-009 | BLOCKED (2025-12-06) | Depends on 190-005, 190-007; missing Policy Engine CVSS receipt endpoints to proxy. | Policy Guild (`src/Policy/StellaOps.Policy.Gateway`) | REST/gRPC APIs: `POST /cvss/receipts`, `GET /cvss/receipts/{id}`, `PUT /cvss/receipts/{id}/amend`, `GET /cvss/receipts/{id}/history`, `GET /cvss/policies`. |
+| 9 | CVSS-API-190-009 | DONE (2025-12-06) | Depends on 190-005, 190-007; Policy Engine + Gateway CVSS endpoints shipped. | Policy Guild (`src/Policy/StellaOps.Policy.Gateway`) | REST APIs delivered: `POST /cvss/receipts`, `GET /cvss/receipts/{id}`, `PUT /cvss/receipts/{id}/amend`, `GET /cvss/receipts/{id}/history`, `GET /cvss/policies`. |
| 10 | CVSS-CLI-190-010 | TODO | Depends on 190-009 (API readiness). | CLI Guild (`src/Cli/StellaOps.Cli`) | CLI verbs: `stella cvss score --vuln <id>`, `stella cvss show <receiptId>`, `stella cvss history <receiptId>`, `stella cvss export <receiptId> --format json|pdf`. |
| 11 | CVSS-UI-190-011 | TODO | Depends on 190-009 (API readiness). | UI Guild (`src/UI/StellaOps.UI`) | UI components: Score badge with CVSS-BTE label, tabbed receipt viewer (Base/Threat/Environmental/Supplemental/Evidence/Policy/History), "Recalculate with my env" button, export options. |
| 12 | CVSS-DOCS-190-012 | BLOCKED (2025-11-29) | Depends on 190-001 through 190-011 (API/UI/CLI blocked). | Docs Guild (`docs/modules/policy/cvss-v4.md`, `docs/09_API_CLI_REFERENCE.md`) | Document CVSS v4.0 scoring system: data model, policy format, API reference, CLI usage, UI guide, determinism guarantees. |
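The CVSS-API-190-009 endpoints above are plain REST calls; as a sketch, hypothetical payload builders for the create and amend operations. Only the paths and the `AmendReceipt(receiptId, field, newValue, reason, ref)` shape come from the task table — the JSON field names are illustrative assumptions, not the shipped contract.

```python
import json

def build_create_receipt_request(vuln_id: str, vector: str) -> dict:
    """Body for POST /cvss/receipts — score a vulnerability from a CVSS v4.0 vector."""
    return {"vulnId": vuln_id, "vector": vector}

def build_amend_request(field: str, new_value: str, reason: str, ref: str) -> dict:
    """Body for PUT /cvss/receipts/{id}/amend — mirrors AmendReceipt(receiptId, field, newValue, reason, ref)."""
    return {"field": field, "newValue": new_value, "reason": reason, "ref": ref}

create = build_create_receipt_request(
    "CVE-2025-0001",
    "CVSS:4.0/AV:N/AC:L/AT:N/PR:N/UI:N/VC:H/VI:H/VA:H/SC:N/SI:N/SA:N",
)
amend = build_amend_request("environmental.cr", "H", "Crown-jewel asset", "CHG-1234")
print(json.dumps(create))
```

Amendments never mutate a receipt in place; per task 7, each amend creates a history entry and triggers re-signing of the DSSE envelope.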
@@ -48,7 +48,7 @@
| --- | --- | --- | --- | --- |
| W1 Foundation | Policy Guild | None | DONE (2025-11-28) | Tasks 1-4: Data model, engine, tests, policy loader. |
| W2 Receipt Pipeline | Policy Guild · Attestor Guild | W1 complete | DONE (2025-11-28) | Tasks 5-7: Receipt builder, DSSE, history completed; integration tests green. |
-| W3 Integration | Concelier · Policy · CLI · UI Guilds | W2 complete; AGENTS delivered 2025-12-06 | BLOCKED (2025-12-06) | CVSS-API-190-009 blocked: Policy Engine lacks CVSS receipt endpoints to proxy; CLI/UI depend on it. |
+| W3 Integration | Concelier · Policy · CLI · UI Guilds | W2 complete; AGENTS delivered 2025-12-06 | TODO (2025-12-06) | CVSS API now available; proceed with CLI (task 10) and UI (task 11) wiring. |
| W4 Documentation | Docs Guild | W3 complete | BLOCKED (2025-12-06) | Task 12 blocked by API/UI/CLI delivery; resumes after W3 progresses. |
## Interlocks
@@ -75,11 +75,12 @@
| R3 | Receipt storage grows large with evidence links. | Storage costs; query performance. | Implement evidence reference deduplication; use CAS URIs; Platform Guild. |
| R4 | CVSS parser/ruleset changes ungoverned (CVM9). | Score drift, audit gaps. | Version parsers/rulesets; DSSE-sign releases; log scorer version in receipts; dual-review changes. |
| R5 | Missing AGENTS for Policy WebService and Concelier ingestion block integration (tasks 8–11). | API/CLI/UI delivery stalled. | AGENTS delivered 2025-12-06 (tasks 15–16). Risk mitigated; monitor API contract approvals. |
-| R6 | Policy Engine lacks CVSS receipt endpoints; gateway proxy cannot be implemented yet. | API/CLI/UI tasks remain blocked. | Policy Guild to add receipt API surface in Policy Engine; re-run gateway wiring once available. |
+| R6 | Policy Engine lacks CVSS receipt endpoints; gateway proxy cannot be implemented yet. | API/CLI/UI tasks remain blocked. | **Mitigated 2025-12-06:** CVSS receipt endpoints implemented in Policy Engine and Gateway; unblock CLI/UI. |
## Execution Log
| Date (UTC) | Update | Owner |
| --- | --- | --- |
+| 2025-12-06 | CVSS-API-190-009 DONE: added Policy Engine CVSS receipt endpoints and Gateway proxies (`/api/cvss/receipts`, history, amend, policies); W3 unblocked; risk R6 mitigated. | Implementer |
| 2025-12-06 | CVSS-CONCELIER-190-008 DONE: prioritized CVSS v4.0 vectors as primary in advisory→Postgres conversion; provenance preserved; enables Policy receipt ingestion. CVSS-API-190-009 set BLOCKED pending Policy Engine CVSS receipt endpoints (risk R6). | Implementer |
| 2025-12-06 | Created Policy Gateway AGENTS and refreshed Concelier AGENTS for CVSS v4 ingest (tasks 15–16 DONE); moved tasks 8–11 to TODO, set W3 to TODO, mitigated risk R5. | Project Mgmt |
| 2025-12-06 | Added tasks 15–16 to create AGENTS for Policy WebService and Concelier; set Wave 2 to DONE; marked Waves 3–4 BLOCKED until AGENTS exist; captured risk R5. | Project Mgmt |

@@ -39,6 +39,7 @@
| 2025-12-06 | CI workflow `.gitea/workflows/mock-dev-release.yml` now packages mock manifest + downloads JSON into `mock-dev-release.tgz` for dev pipelines. | Deployment Guild |
| 2025-12-06 | Mock Compose overlay (`deploy/compose/docker-compose.mock.yaml`) documented for dev-only configs using placeholder digests; production pins remain pending. | Deployment Guild |
| 2025-12-06 | Added production guard `.gitea/workflows/release-manifest-verify.yml` to fail CI if stable/airgap manifests or downloads JSON omit required components. | Deployment Guild |
+| 2025-12-06 | Added Helm mock overlays (`orchestrator/policy/packs/vex/vuln` under `deploy/helm/stellaops/templates/*-mock.yaml`) and `values-mock.yaml`; mock dev release workflow now renders `helm template` with mock values for dev packaging. | Deployment Guild |
| 2025-12-05 | HELM-45-003 DONE: added HPA template with per-service overrides, PDB support, Prometheus scrape annotations hook, and production defaults (prod enabled, airgap prometheus on but HPA off). | Deployment Guild |
| 2025-12-05 | HELM-45-002 DONE: added ingress/TLS toggles, NetworkPolicy defaults, pod security contexts, and ExternalSecret scaffold (prod enabled, airgap off); documented via values changes and templates (`core.yaml`, `networkpolicy.yaml`, `ingress.yaml`, `externalsecrets.yaml`). | Deployment Guild |
| 2025-12-05 | HELM-45-001 DONE: added migration job scaffolding and toggle to Helm chart (`deploy/helm/stellaops/templates/migrations.yaml`, values defaults), kept digest pins, and published install guide (`deploy/helm/stellaops/INSTALL.md`). | Deployment Guild |
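The HPA/PDB toggles from the HELM-45-003 entry typically surface as per-service value overrides; a minimal sketch of the shape such values might take (every key name here is an illustrative assumption, not the chart's actual schema):

```yaml
# Illustrative values sketch — not the shipped values.yaml schema.
autoscaling:
  enabled: true            # prod default per HELM-45-003; HPA off in the airgap profile
  minReplicas: 2
  maxReplicas: 6
  targetCPUUtilizationPercentage: 75

services:
  orchestrator:
    autoscaling:
      maxReplicas: 10      # a per-service override winning over the global block
  policy-registry:
    podDisruptionBudget:
      minAvailable: 1
```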

@@ -52,7 +52,7 @@
| 9 | PG-T7.1.9 | TODO | Depends on PG-T7.1.8 | Infrastructure Guild | Remove MongoDB configuration options |
| 10 | PG-T7.1.10 | TODO | Depends on PG-T7.1.9 | Infrastructure Guild | Run full build to verify no broken references |
| 14 | PG-T7.1.5a | DOING | Concelier Guild | Concelier: replace Mongo deps with Postgres equivalents; remove MongoDB packages; compat layer added. |
-| 15 | PG-T7.1.5b | TODO | Concelier Guild | Build Postgres document/raw storage + state repositories and wire DI. |
+| 15 | PG-T7.1.5b | DOING | Concelier Guild | Build Postgres document/raw storage + state repositories and wire DI. |
| 16 | PG-T7.1.5c | TODO | Concelier Guild | Refactor connectors/exporters/tests to Postgres storage; delete Storage.Mongo code. |
| 17 | PG-T7.1.5d | TODO | Concelier Guild | Add migrations for document/state/export tables; include in air-gap kit. |
| 18 | PG-T7.1.5e | TODO | Concelier Guild | Postgres-only Concelier build/tests green; remove Mongo artefacts and update docs. |
@@ -122,6 +122,7 @@
| 2025-12-06 | Attempted Scheduler Postgres tests; restore/build fails because `StellaOps.Concelier.Storage.Mongo` project is absent and Concelier connectors reference it. Need phased Concelier plan/shim to unblock test/build runs. | Scheduler Guild |
| 2025-12-06 | Began Concelier Mongo compatibility shim: added `FindAsync` to in-memory `IDocumentStore` in Postgres compat layer to unblock connector compile; full Mongo removal still pending. | Infrastructure Guild |
| 2025-12-06 | Added lightweight `StellaOps.Concelier.Storage.Mongo` in-memory stub (advisory/dto/document/state/export stores) to unblock Concelier connector build while Postgres rewiring continues; no Mongo driver/runtime. | Infrastructure Guild |
+| 2025-12-06 | PG-T7.1.5b set to DOING; began wiring Postgres document store (DI registration, repository find) to replace Mongo bindings. | Concelier Guild |
## Decisions & Risks
- Cleanup is strictly after all phases complete; do not start T7 tasks until module cutovers are DONE.

@@ -3,7 +3,7 @@
| # | Task ID | Status | Owner | Notes |
|---|---|---|---|---|
| 1 | PG-T7.1.5a | DOING | Concelier Guild | Replace Mongo storage dependencies with Postgres equivalents; remove MongoDB.Driver/Bson packages from Concelier projects. |
-| 2 | PG-T7.1.5b | TODO | Concelier Guild | Implement Postgres document/raw storage (bytea/LargeObject) + state repos to satisfy connector fetch/store paths. |
+| 2 | PG-T7.1.5b | DOING | Concelier Guild | Implement Postgres document/raw storage (bytea/LargeObject) + state repos to satisfy connector fetch/store paths. |
| 3 | PG-T7.1.5c | TODO | Concelier Guild | Refactor all connectors/exporters/tests to use Postgres storage namespaces; delete Storage.Mongo code/tests. |
| 4 | PG-T7.1.5d | TODO | Concelier Guild | Add migrations for documents/state/export tables; wire into Concelier Postgres storage DI. |
| 5 | PG-T7.1.5e | TODO | Concelier Guild | End-to-end Concelier build/test on Postgres-only stack; update sprint log and remove Mongo artifacts from repo history references. |

@@ -245,7 +245,7 @@ public sealed class CertBundConnector : IFeedConnector
continue;
}
-if (!document.GridFsId.HasValue)
+if (!document.PayloadId.HasValue)
{
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
@@ -258,7 +258,7 @@ public sealed class CertBundConnector : IFeedConnector
byte[] payload;
try
{
-payload = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
+payload = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{

@@ -1,337 +1,337 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Connector.CertFr.Configuration;
using StellaOps.Concelier.Connector.CertFr.Internal;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.CertFr;
public sealed class CertFrConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new()
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly CertFrFeedClient _feedClient;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly CertFrOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<CertFrConnector> _logger;
public CertFrConnector(
CertFrFeedClient feedClient,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<CertFrOptions> options,
TimeProvider? timeProvider,
ILogger<CertFrConnector> logger)
{
_feedClient = feedClient ?? throw new ArgumentNullException(nameof(feedClient));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => CertFrConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var windowEnd = now;
var lastPublished = cursor.LastPublished ?? now - _options.InitialBackfill;
var windowStart = lastPublished - _options.WindowOverlap;
var minStart = now - _options.InitialBackfill;
if (windowStart < minStart)
{
windowStart = minStart;
}
IReadOnlyList<CertFrFeedItem> items;
try
{
items = await _feedClient.LoadAsync(windowStart, windowEnd, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Cert-FR feed load failed {Start:o}-{End:o}", windowStart, windowEnd);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (items.Count == 0)
{
await UpdateCursorAsync(cursor.WithLastPublished(windowEnd), cancellationToken).ConfigureAwait(false);
return;
}
var pendingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
var maxPublished = cursor.LastPublished ?? DateTimeOffset.MinValue;
foreach (var item in items)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, item.DetailUri.ToString(), cancellationToken).ConfigureAwait(false);
var request = new SourceFetchRequest(CertFrOptions.HttpClientName, SourceName, item.DetailUri)
{
Metadata = CertFrDocumentMetadata.CreateMetadata(item),
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[] { "text/html", "application/xhtml+xml", "text/plain;q=0.5" },
};
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (result.IsNotModified || !result.IsSuccess || result.Document is null)
{
if (item.Published > maxPublished)
{
maxPublished = item.Published;
}
continue;
}
if (existing is not null
&& string.Equals(existing.Sha256, result.Document.Sha256, StringComparison.OrdinalIgnoreCase)
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal))
{
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
if (item.Published > maxPublished)
{
maxPublished = item.Published;
}
continue;
}
if (!pendingDocuments.Contains(result.Document.Id))
{
pendingDocuments.Add(result.Document.Id);
}
if (item.Published > maxPublished)
{
maxPublished = item.Published;
}
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Cert-FR fetch failed for {Uri}", item.DetailUri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
}
if (maxPublished == DateTimeOffset.MinValue)
{
maxPublished = cursor.LastPublished ?? windowEnd;
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithLastPublished(maxPublished);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var pendingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("Cert-FR document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
CertFrDocumentMetadata metadata;
try
{
metadata = CertFrDocumentMetadata.FromDocument(document);
}
catch (Exception ex)
{
_logger.LogError(ex, "Cert-FR metadata parse failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
CertFrDto dto;
try
{
var content = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
var html = System.Text.Encoding.UTF8.GetString(content);
dto = CertFrParser.Parse(html, metadata);
}
catch (Exception ex)
{
_logger.LogError(ex, "Cert-FR parse failed for advisory {AdvisoryId} ({Uri})", metadata.AdvisoryId, document.Uri);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var json = JsonSerializer.Serialize(dto, SerializerOptions);
var payload = BsonDocument.Parse(json);
var validatedAt = _timeProvider.GetUtcNow();
var existingDto = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
var dtoRecord = existingDto is null
? new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "certfr.detail.v1", payload, validatedAt)
: existingDto with
{
Payload = payload,
SchemaVersion = "certfr.detail.v1",
ValidatedAt = validatedAt,
};
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
CertFrDto? dto;
try
{
var json = dtoRecord.Payload.ToJson();
dto = JsonSerializer.Deserialize<CertFrDto>(json, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "Cert-FR DTO deserialization failed for document {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (dto is null)
{
_logger.LogWarning("Cert-FR DTO payload deserialized as null for document {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var mappedAt = _timeProvider.GetUtcNow();
var advisory = CertFrMapper.Map(dto, SourceName, mappedAt);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<CertFrCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var record = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return CertFrCursor.FromBson(record?.Cursor);
}
private async Task UpdateCursorAsync(CertFrCursor cursor, CancellationToken cancellationToken)
{
var completedAt = _timeProvider.GetUtcNow();
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), completedAt, cancellationToken).ConfigureAwait(false);
}
}


@@ -1,370 +1,370 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.CertIn.Configuration;
using StellaOps.Concelier.Connector.CertIn.Internal;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.CertIn;
public sealed class CertInConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.General)
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
WriteIndented = false,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly CertInClient _client;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly CertInOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<CertInConnector> _logger;
public CertInConnector(
CertInClient client,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<CertInOptions> options,
TimeProvider? timeProvider,
ILogger<CertInConnector> logger)
{
_client = client ?? throw new ArgumentNullException(nameof(client));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => CertInConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var windowStart = cursor.LastPublished.HasValue
? cursor.LastPublished.Value - _options.WindowOverlap
: now - _options.WindowSize;
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var maxPublished = cursor.LastPublished ?? DateTimeOffset.MinValue;
for (var page = 1; page <= _options.MaxPagesPerFetch; page++)
{
IReadOnlyList<CertInListingItem> listings;
try
{
listings = await _client.GetListingsAsync(page, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "CERT-In listings fetch failed for page {Page}", page);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (listings.Count == 0)
{
break;
}
foreach (var listing in listings.OrderByDescending(static item => item.Published))
{
if (listing.Published < windowStart)
{
page = _options.MaxPagesPerFetch + 1;
break;
}
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["certin.advisoryId"] = listing.AdvisoryId,
["certin.title"] = listing.Title,
["certin.link"] = listing.DetailUri.ToString(),
["certin.published"] = listing.Published.ToString("O")
};
if (!string.IsNullOrWhiteSpace(listing.Summary))
{
metadata["certin.summary"] = listing.Summary!;
}
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, listing.DetailUri.ToString(), cancellationToken).ConfigureAwait(false);
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(
new SourceFetchRequest(CertInOptions.HttpClientName, SourceName, listing.DetailUri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[] { "text/html", "application/xhtml+xml", "text/plain;q=0.5" },
},
cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "CERT-In fetch failed for {Uri}", listing.DetailUri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(3), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
if (existing is not null
&& string.Equals(existing.Sha256, result.Document.Sha256, StringComparison.OrdinalIgnoreCase)
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal))
{
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
continue;
}
pendingDocuments.Add(result.Document.Id);
if (listing.Published > maxPublished)
{
maxPublished = listing.Published;
}
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithLastPublished(maxPublished == DateTimeOffset.MinValue ? cursor.LastPublished : maxPublished);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("CERT-In document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
if (!TryDeserializeListing(document.Metadata, out var listing))
{
_logger.LogWarning("CERT-In metadata missing for {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to download raw CERT-In document {DocumentId}", document.Id);
throw;
}
var dto = CertInDetailParser.Parse(listing, rawBytes);
var payload = BsonDocument.Parse(JsonSerializer.Serialize(dto, SerializerOptions));
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "certin.v1", payload, _timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var dtoJson = dtoRecord.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
CertInAdvisoryDto dto;
try
{
dto = JsonSerializer.Deserialize<CertInAdvisoryDto>(dtoJson, SerializerOptions)
?? throw new InvalidOperationException("Deserialized CERT-In DTO is null.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize CERT-In DTO for {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var advisory = MapAdvisory(dto, document, dtoRecord);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private Advisory MapAdvisory(CertInAdvisoryDto dto, DocumentRecord document, DtoRecord dtoRecord)
{
var fetchProvenance = new AdvisoryProvenance(SourceName, "document", document.Uri, document.FetchedAt);
var mappingProvenance = new AdvisoryProvenance(SourceName, "mapping", dto.AdvisoryId, dtoRecord.ValidatedAt);
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
dto.AdvisoryId,
};
foreach (var cve in dto.CveIds)
{
aliases.Add(cve);
}
var references = new List<AdvisoryReference>();
try
{
references.Add(new AdvisoryReference(
dto.Link,
"advisory",
"cert-in",
null,
new AdvisoryProvenance(SourceName, "reference", dto.Link, dtoRecord.ValidatedAt)));
}
catch (ArgumentException)
{
_logger.LogWarning("Invalid CERT-In link {Link} for advisory {AdvisoryId}", dto.Link, dto.AdvisoryId);
}
foreach (var cve in dto.CveIds)
{
var url = $"https://www.cve.org/CVERecord?id={cve}";
try
{
references.Add(new AdvisoryReference(
url,
"advisory",
cve,
null,
new AdvisoryProvenance(SourceName, "reference", url, dtoRecord.ValidatedAt)));
}
catch (ArgumentException)
{
                // Ignore invalid URLs.
}
}
foreach (var link in dto.ReferenceLinks)
{
try
{
references.Add(new AdvisoryReference(
link,
"reference",
null,
null,
new AdvisoryProvenance(SourceName, "reference", link, dtoRecord.ValidatedAt)));
}
catch (ArgumentException)
{
                // Ignore invalid URLs.
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.CertIn.Configuration;
using StellaOps.Concelier.Connector.CertIn.Internal;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.CertIn;
public sealed class CertInConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.General)
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
WriteIndented = false,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly CertInClient _client;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly CertInOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<CertInConnector> _logger;
public CertInConnector(
CertInClient client,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<CertInOptions> options,
TimeProvider? timeProvider,
ILogger<CertInConnector> logger)
{
_client = client ?? throw new ArgumentNullException(nameof(client));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => CertInConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var windowStart = cursor.LastPublished.HasValue
? cursor.LastPublished.Value - _options.WindowOverlap
: now - _options.WindowSize;
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var maxPublished = cursor.LastPublished ?? DateTimeOffset.MinValue;
for (var page = 1; page <= _options.MaxPagesPerFetch; page++)
{
IReadOnlyList<CertInListingItem> listings;
try
{
listings = await _client.GetListingsAsync(page, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "CERT-In listings fetch failed for page {Page}", page);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (listings.Count == 0)
{
break;
}
foreach (var listing in listings.OrderByDescending(static item => item.Published))
{
if (listing.Published < windowStart)
{
page = _options.MaxPagesPerFetch + 1;
break;
}
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["certin.advisoryId"] = listing.AdvisoryId,
["certin.title"] = listing.Title,
["certin.link"] = listing.DetailUri.ToString(),
["certin.published"] = listing.Published.ToString("O")
};
if (!string.IsNullOrWhiteSpace(listing.Summary))
{
metadata["certin.summary"] = listing.Summary!;
}
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, listing.DetailUri.ToString(), cancellationToken).ConfigureAwait(false);
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(
new SourceFetchRequest(CertInOptions.HttpClientName, SourceName, listing.DetailUri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[] { "text/html", "application/xhtml+xml", "text/plain;q=0.5" },
},
cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "CERT-In fetch failed for {Uri}", listing.DetailUri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(3), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
if (existing is not null
&& string.Equals(existing.Sha256, result.Document.Sha256, StringComparison.OrdinalIgnoreCase)
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal))
{
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
continue;
}
pendingDocuments.Add(result.Document.Id);
if (listing.Published > maxPublished)
{
maxPublished = listing.Published;
}
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithLastPublished(maxPublished == DateTimeOffset.MinValue ? cursor.LastPublished : maxPublished);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("CERT-In document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
if (!TryDeserializeListing(document.Metadata, out var listing))
{
_logger.LogWarning("CERT-In metadata missing for {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to download raw CERT-In document {DocumentId}", document.Id);
throw;
}
var dto = CertInDetailParser.Parse(listing, rawBytes);
var payload = BsonDocument.Parse(JsonSerializer.Serialize(dto, SerializerOptions));
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "certin.v1", payload, _timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var dtoJson = dtoRecord.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
CertInAdvisoryDto dto;
try
{
dto = JsonSerializer.Deserialize<CertInAdvisoryDto>(dtoJson, SerializerOptions)
?? throw new InvalidOperationException("Deserialized CERT-In DTO is null.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize CERT-In DTO for {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var advisory = MapAdvisory(dto, document, dtoRecord);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private Advisory MapAdvisory(CertInAdvisoryDto dto, DocumentRecord document, DtoRecord dtoRecord)
{
var fetchProvenance = new AdvisoryProvenance(SourceName, "document", document.Uri, document.FetchedAt);
var mappingProvenance = new AdvisoryProvenance(SourceName, "mapping", dto.AdvisoryId, dtoRecord.ValidatedAt);
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
dto.AdvisoryId,
};
foreach (var cve in dto.CveIds)
{
aliases.Add(cve);
}
var references = new List<AdvisoryReference>();
try
{
references.Add(new AdvisoryReference(
dto.Link,
"advisory",
"cert-in",
null,
new AdvisoryProvenance(SourceName, "reference", dto.Link, dtoRecord.ValidatedAt)));
}
catch (ArgumentException)
{
_logger.LogWarning("Invalid CERT-In link {Link} for advisory {AdvisoryId}", dto.Link, dto.AdvisoryId);
}
foreach (var cve in dto.CveIds)
{
var url = $"https://www.cve.org/CVERecord?id={cve}";
try
{
references.Add(new AdvisoryReference(
url,
"advisory",
cve,
null,
new AdvisoryProvenance(SourceName, "reference", url, dtoRecord.ValidatedAt)));
}
catch (ArgumentException)
{
                // Ignore invalid URLs.
}
}
foreach (var link in dto.ReferenceLinks)
{
try
{
references.Add(new AdvisoryReference(
link,
"reference",
null,
null,
new AdvisoryProvenance(SourceName, "reference", link, dtoRecord.ValidatedAt)));
}
catch (ArgumentException)
{
                // Ignore invalid URLs.
}
}
var affectedPackages = dto.VendorNames.Select(vendor =>
{
var provenance = new AdvisoryProvenance(SourceName, "affected", vendor, dtoRecord.ValidatedAt);
@@ -398,65 +398,65 @@ public sealed class CertInConnector : IFeedConnector
provenance: new[] { provenance });
})
.ToArray();
return new Advisory(
dto.AdvisoryId,
dto.Title,
dto.Summary ?? dto.Content,
language: "en",
published: dto.Published,
modified: dto.Published,
severity: dto.Severity,
exploitKnown: false,
aliases: aliases,
references: references,
affectedPackages: affectedPackages,
cvssMetrics: Array.Empty<CvssMetric>(),
provenance: new[] { fetchProvenance, mappingProvenance });
}
private async Task<CertInCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? CertInCursor.Empty : CertInCursor.FromBson(state.Cursor);
}
private Task UpdateCursorAsync(CertInCursor cursor, CancellationToken cancellationToken)
{
return _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), _timeProvider.GetUtcNow(), cancellationToken);
}
private static bool TryDeserializeListing(IReadOnlyDictionary<string, string>? metadata, out CertInListingItem listing)
{
listing = null!;
if (metadata is null)
{
return false;
}
if (!metadata.TryGetValue("certin.advisoryId", out var advisoryId))
{
return false;
}
if (!metadata.TryGetValue("certin.title", out var title))
{
return false;
}
if (!metadata.TryGetValue("certin.link", out var link) || !Uri.TryCreate(link, UriKind.Absolute, out var detailUri))
{
return false;
}
if (!metadata.TryGetValue("certin.published", out var publishedText) || !DateTimeOffset.TryParse(publishedText, out var published))
{
return false;
}
metadata.TryGetValue("certin.summary", out var summary);
listing = new CertInListingItem(advisoryId, title, detailUri, published.ToUniversalTime(), summary);
return true;
}
}


@@ -3,6 +3,7 @@ using System.Net.Http;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Options;
using StellaOps.Concelier.Connector.Common.Xml;
using StellaOps.Concelier.Core.Aoc;
@@ -169,7 +170,7 @@ public static class ServiceCollectionExtensions
services.AddSingleton<Fetch.IJitterSource, Fetch.CryptoJitterSource>();
services.AddConcelierAocGuards();
services.AddConcelierLinksetMappers();
services.AddSingleton<IDocumentStore, InMemoryDocumentStore>();
services.TryAddSingleton<IDocumentStore, InMemoryDocumentStore>();
services.AddSingleton<Fetch.RawDocumentStorage>();
services.AddSingleton<Fetch.SourceFetchService>();


@@ -5,16 +5,16 @@ using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Cryptography;
namespace StellaOps.Concelier.Connector.Common.State;
/// <summary>
/// Persists raw documents and cursor state for connectors that require manual seeding.
/// </summary>
public sealed class SourceStateSeedProcessor
{
private readonly IDocumentStore _documentStore;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly ISourceStateRepository _stateRepository;
private readonly TimeProvider _timeProvider;
private readonly ILogger<SourceStateSeedProcessor> _logger;
@@ -35,298 +35,298 @@ public sealed class SourceStateSeedProcessor
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? NullLogger<SourceStateSeedProcessor>.Instance;
}
public async Task<SourceStateSeedResult> ProcessAsync(SourceStateSeedSpecification specification, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(specification);
ArgumentException.ThrowIfNullOrEmpty(specification.Source);
var completedAt = specification.CompletedAt ?? _timeProvider.GetUtcNow();
var documentIds = new List<Guid>();
var pendingDocumentIds = new HashSet<Guid>();
var pendingMappingIds = new HashSet<Guid>();
var knownAdvisories = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
AppendRange(knownAdvisories, specification.KnownAdvisories);
if (specification.Cursor is { } cursorSeed)
{
AppendRange(pendingDocumentIds, cursorSeed.PendingDocuments);
AppendRange(pendingMappingIds, cursorSeed.PendingMappings);
AppendRange(knownAdvisories, cursorSeed.KnownAdvisories);
}
foreach (var document in specification.Documents ?? Array.Empty<SourceStateSeedDocument>())
{
cancellationToken.ThrowIfCancellationRequested();
await ProcessDocumentAsync(specification.Source, document, completedAt, documentIds, pendingDocumentIds, pendingMappingIds, knownAdvisories, cancellationToken).ConfigureAwait(false);
}
var state = await _stateRepository.TryGetAsync(specification.Source, cancellationToken).ConfigureAwait(false);
var cursor = state?.Cursor ?? new BsonDocument();
var newlyPendingDocuments = MergeGuidArray(cursor, "pendingDocuments", pendingDocumentIds);
var newlyPendingMappings = MergeGuidArray(cursor, "pendingMappings", pendingMappingIds);
var newlyKnownAdvisories = MergeStringArray(cursor, "knownAdvisories", knownAdvisories);
if (specification.Cursor is { } cursorSpec)
{
if (cursorSpec.LastModifiedCursor.HasValue)
{
cursor["lastModifiedCursor"] = cursorSpec.LastModifiedCursor.Value.UtcDateTime;
}
if (cursorSpec.LastFetchAt.HasValue)
{
cursor["lastFetchAt"] = cursorSpec.LastFetchAt.Value.UtcDateTime;
}
if (cursorSpec.Additional is not null)
{
foreach (var kvp in cursorSpec.Additional)
{
cursor[kvp.Key] = kvp.Value;
}
}
}
cursor["lastSeededAt"] = completedAt.UtcDateTime;
await _stateRepository.UpdateCursorAsync(specification.Source, cursor, completedAt, cancellationToken).ConfigureAwait(false);
_logger.LogInformation(
"Seeded {Documents} document(s) for {Source}. pendingDocuments+= {PendingDocuments}, pendingMappings+= {PendingMappings}, knownAdvisories+= {KnownAdvisories}",
documentIds.Count,
specification.Source,
newlyPendingDocuments.Count,
newlyPendingMappings.Count,
newlyKnownAdvisories.Count);
return new SourceStateSeedResult(
DocumentsProcessed: documentIds.Count,
PendingDocumentsAdded: newlyPendingDocuments.Count,
PendingMappingsAdded: newlyPendingMappings.Count,
DocumentIds: documentIds.AsReadOnly(),
PendingDocumentIds: newlyPendingDocuments,
PendingMappingIds: newlyPendingMappings,
KnownAdvisoriesAdded: newlyKnownAdvisories,
CompletedAt: completedAt);
}
private async Task ProcessDocumentAsync(
string source,
SourceStateSeedDocument document,
DateTimeOffset completedAt,
List<Guid> documentIds,
HashSet<Guid> pendingDocumentIds,
HashSet<Guid> pendingMappingIds,
HashSet<string> knownAdvisories,
CancellationToken cancellationToken)
{
if (document is null)
{
throw new ArgumentNullException(nameof(document));
}
ArgumentException.ThrowIfNullOrEmpty(document.Uri);
if (document.Content is not { Length: > 0 })
{
throw new InvalidOperationException($"Seed entry for '{document.Uri}' is missing content bytes.");
}
var payload = new byte[document.Content.Length];
Buffer.BlockCopy(document.Content, 0, payload, 0, document.Content.Length);
if (!document.Uri.Contains("://", StringComparison.Ordinal))
{
_logger.LogWarning("Seed document URI '{Uri}' does not appear to be absolute.", document.Uri);
}
var contentHash = _hash.ComputeHashHex(payload, HashAlgorithms.Sha256);
var existing = await _documentStore.FindBySourceAndUriAsync(source, document.Uri, cancellationToken).ConfigureAwait(false);
if (existing?.PayloadId is { } oldGridId)
{
await _rawDocumentStorage.DeleteAsync(oldGridId, cancellationToken).ConfigureAwait(false);
}
var gridId = await _rawDocumentStorage.UploadAsync(
source,
document.Uri,
payload,
document.ContentType,
document.ExpiresAt,
cancellationToken)
.ConfigureAwait(false);
var headers = CloneDictionary(document.Headers);
if (!string.IsNullOrWhiteSpace(document.ContentType))
{
headers ??= new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
if (!headers.ContainsKey("content-type"))
{
headers["content-type"] = document.ContentType!;
}
}
var metadata = CloneDictionary(document.Metadata);
var record = new DocumentRecord(
document.DocumentId ?? existing?.Id ?? Guid.NewGuid(),
source,
document.Uri,
document.FetchedAt ?? completedAt,
contentHash,
string.IsNullOrWhiteSpace(document.Status) ? DocumentStatuses.PendingParse : document.Status,
document.ContentType,
headers,
metadata,
document.Etag,
document.LastModified,
gridId,
document.ExpiresAt);
var upserted = await _documentStore.UpsertAsync(record, cancellationToken).ConfigureAwait(false);
documentIds.Add(upserted.Id);
if (document.AddToPendingDocuments)
{
pendingDocumentIds.Add(upserted.Id);
}
if (document.AddToPendingMappings)
{
pendingMappingIds.Add(upserted.Id);
}
AppendRange(knownAdvisories, document.KnownIdentifiers);
}
private static Dictionary<string, string>? CloneDictionary(IReadOnlyDictionary<string, string>? values)
{
if (values is null || values.Count == 0)
{
return null;
}
return new Dictionary<string, string>(values, StringComparer.OrdinalIgnoreCase);
}
private static IReadOnlyCollection<Guid> MergeGuidArray(BsonDocument cursor, string field, IReadOnlyCollection<Guid> additions)
{
if (additions.Count == 0)
{
return Array.Empty<Guid>();
}
var existing = cursor.TryGetValue(field, out var value) && value is BsonArray existingArray
? existingArray.Select(AsGuid).Where(static g => g != Guid.Empty).ToHashSet()
: new HashSet<Guid>();
var newlyAdded = new List<Guid>();
foreach (var guid in additions)
{
if (guid == Guid.Empty)
{
continue;
}
if (existing.Add(guid))
{
newlyAdded.Add(guid);
}
}
if (existing.Count > 0)
{
cursor[field] = new BsonArray(existing
.Select(static g => g.ToString("D"))
.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase));
}
return newlyAdded.AsReadOnly();
}
private static IReadOnlyCollection<string> MergeStringArray(BsonDocument cursor, string field, IReadOnlyCollection<string> additions)
{
if (additions.Count == 0)
{
return Array.Empty<string>();
}
var existing = cursor.TryGetValue(field, out var value) && value is BsonArray existingArray
? existingArray.Select(static v => v?.AsString ?? string.Empty)
.Where(static s => !string.IsNullOrWhiteSpace(s))
.ToHashSet(StringComparer.OrdinalIgnoreCase)
: new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var newlyAdded = new List<string>();
foreach (var entry in additions)
{
if (string.IsNullOrWhiteSpace(entry))
{
continue;
}
var normalized = entry.Trim();
if (existing.Add(normalized))
{
newlyAdded.Add(normalized);
}
}
if (existing.Count > 0)
{
cursor[field] = new BsonArray(existing
.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase));
}
return newlyAdded.AsReadOnly();
}
private static Guid AsGuid(BsonValue value)
{
if (value is null)
{
return Guid.Empty;
}
return Guid.TryParse(value.ToString(), out var parsed) ? parsed : Guid.Empty;
}
private static void AppendRange(HashSet<Guid> target, IReadOnlyCollection<Guid>? values)
{
if (values is null)
{
return;
}
foreach (var guid in values)
{
if (guid != Guid.Empty)
{
target.Add(guid);
}
}
}
private static void AppendRange(HashSet<string> target, IReadOnlyCollection<string>? values)
{
if (values is null)
{
return;
}
foreach (var value in values)
{
if (string.IsNullOrWhiteSpace(value))
{
continue;
}
target.Add(value.Trim());
}
}
}

View File

@@ -1,434 +1,434 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Distro.RedHat.Configuration;
using StellaOps.Concelier.Connector.Distro.RedHat.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Distro.RedHat;
public sealed class RedHatConnector : IFeedConnector
{
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly ILogger<RedHatConnector> _logger;
private readonly RedHatOptions _options;
private readonly TimeProvider _timeProvider;
public RedHatConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<RedHatOptions> options,
TimeProvider? timeProvider,
ILogger<RedHatConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => RedHatConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var baseline = cursor.LastReleasedOn ?? now - _options.InitialBackfill;
var overlap = _options.Overlap > TimeSpan.Zero ? _options.Overlap : TimeSpan.Zero;
var afterThreshold = baseline - overlap;
if (afterThreshold < DateTimeOffset.UnixEpoch)
{
afterThreshold = DateTimeOffset.UnixEpoch;
}
ProvenanceDiagnostics.ReportResumeWindow(SourceName, afterThreshold, _logger);
var processedSet = new HashSet<string>(cursor.ProcessedAdvisoryIds, StringComparer.OrdinalIgnoreCase);
var newSummaries = new List<RedHatSummaryItem>();
var stopDueToOlderData = false;
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
for (var page = 1; page <= _options.MaxPagesPerFetch; page++)
{
var summaryUri = BuildSummaryUri(afterThreshold, page);
var summaryKey = summaryUri.ToString();
touchedResources.Add(summaryKey);
var cachedSummary = cursor.TryGetFetchCache(summaryKey);
var summaryMetadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["page"] = page.ToString(CultureInfo.InvariantCulture),
["type"] = "summary"
};
var summaryRequest = new SourceFetchRequest(RedHatOptions.HttpClientName, SourceName, summaryUri)
{
Metadata = summaryMetadata,
ETag = cachedSummary?.ETag,
LastModified = cachedSummary?.LastModified,
TimeoutOverride = _options.FetchTimeout,
};
SourceFetchContentResult summaryResult;
try
{
summaryResult = await _fetchService.FetchContentAsync(summaryRequest, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat Hydra summary fetch failed for {Uri}", summaryUri);
throw;
}
if (summaryResult.IsNotModified)
{
if (page == 1)
{
break;
}
continue;
}
if (!summaryResult.IsSuccess || summaryResult.Content is null)
{
continue;
}
cursor = cursor.WithFetchCache(summaryKey, summaryResult.ETag, summaryResult.LastModified);
using var document = JsonDocument.Parse(summaryResult.Content);
if (document.RootElement.ValueKind != JsonValueKind.Array)
{
_logger.LogWarning(
"Red Hat Hydra summary response had unexpected payload kind {Kind} for {Uri}",
document.RootElement.ValueKind,
summaryUri);
break;
}
var pageCount = 0;
foreach (var element in document.RootElement.EnumerateArray())
{
if (!RedHatSummaryItem.TryParse(element, out var summary))
{
continue;
}
pageCount++;
if (cursor.LastReleasedOn.HasValue)
{
if (summary.ReleasedOn < cursor.LastReleasedOn.Value - overlap)
{
stopDueToOlderData = true;
break;
}
if (summary.ReleasedOn < cursor.LastReleasedOn.Value)
{
stopDueToOlderData = true;
break;
}
if (summary.ReleasedOn == cursor.LastReleasedOn.Value && processedSet.Contains(summary.AdvisoryId))
{
continue;
}
}
newSummaries.Add(summary);
processedSet.Add(summary.AdvisoryId);
if (newSummaries.Count >= _options.MaxAdvisoriesPerFetch)
{
break;
}
}
if (newSummaries.Count >= _options.MaxAdvisoriesPerFetch || stopDueToOlderData)
{
break;
}
if (pageCount < _options.PageSize)
{
break;
}
}
if (newSummaries.Count == 0)
{
return;
}
newSummaries.Sort(static (left, right) =>
{
var compare = left.ReleasedOn.CompareTo(right.ReleasedOn);
return compare != 0
? compare
: string.CompareOrdinal(left.AdvisoryId, right.AdvisoryId);
});
var pendingDocuments = new HashSet<Guid>(cursor.PendingDocuments);
foreach (var summary in newSummaries)
{
var resourceUri = summary.ResourceUri;
var resourceKey = resourceUri.ToString();
touchedResources.Add(resourceKey);
var cached = cursor.TryGetFetchCache(resourceKey);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["advisoryId"] = summary.AdvisoryId,
["releasedOn"] = summary.ReleasedOn.ToString("O", CultureInfo.InvariantCulture)
};
var request = new SourceFetchRequest(RedHatOptions.HttpClientName, SourceName, resourceUri)
{
Metadata = metadata,
ETag = cached?.ETag,
LastModified = cached?.LastModified,
TimeoutOverride = _options.FetchTimeout,
};
try
{
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (result.IsNotModified)
{
continue;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
pendingDocuments.Add(result.Document.Id);
cursor = cursor.WithFetchCache(resourceKey, result.Document.Etag, result.Document.LastModified);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat Hydra advisory fetch failed for {Uri}", resourceUri);
throw;
}
}
var maxRelease = newSummaries.Max(static item => item.ReleasedOn);
var idsForMaxRelease = newSummaries
.Where(item => item.ReleasedOn == maxRelease)
.Select(item => item.AdvisoryId)
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
RedHatCursor updated;
if (cursor.LastReleasedOn.HasValue && maxRelease == cursor.LastReleasedOn.Value)
{
updated = cursor
.WithPendingDocuments(pendingDocuments)
.AddProcessedAdvisories(idsForMaxRelease)
.PruneFetchCache(touchedResources);
}
else
{
updated = cursor
.WithPendingDocuments(pendingDocuments)
.WithLastReleased(maxRelease, idsForMaxRelease)
.PruneFetchCache(touchedResources);
}
await UpdateCursorAsync(updated, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingFetch = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
DocumentRecord? document = null;
try
{
document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("Red Hat document {DocumentId} missing GridFS content; skipping", document.Id);
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var rawBytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
using var jsonDocument = JsonDocument.Parse(rawBytes);
var sanitized = JsonSerializer.Serialize(jsonDocument.RootElement);
var payload = BsonDocument.Parse(sanitized);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
"redhat.csaf.v2",
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingFetch.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
catch (Exception ex)
{
var uri = document?.Uri ?? documentId.ToString();
_logger.LogError(ex, "Red Hat CSAF parse failed for {Uri}", uri);
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingFetch)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
try
{
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var json = dto.Payload.ToJson(new JsonWriterSettings
{
OutputMode = JsonOutputMode.RelaxedExtendedJson,
});
using var jsonDocument = JsonDocument.Parse(json);
var advisory = RedHatMapper.Map(SourceName, dto, document, jsonDocument);
if (advisory is null)
{
pendingMappings.Remove(documentId);
continue;
}
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(documentId, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat map failed for document {DocumentId}", documentId);
}
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<RedHatCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var record = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return RedHatCursor.FromBsonDocument(record?.Cursor);
}
private async Task UpdateCursorAsync(RedHatCursor cursor, CancellationToken cancellationToken)
{
var completedAt = _timeProvider.GetUtcNow();
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), completedAt, cancellationToken).ConfigureAwait(false);
}
private Uri BuildSummaryUri(DateTimeOffset after, int page)
{
var builder = new UriBuilder(_options.BaseEndpoint);
var basePath = builder.Path?.TrimEnd('/') ?? string.Empty;
var summaryPath = _options.SummaryPath.TrimStart('/');
builder.Path = string.IsNullOrEmpty(basePath)
? $"/{summaryPath}"
: $"{basePath}/{summaryPath}";
var parameters = new Dictionary<string, string>(StringComparer.Ordinal)
{
["after"] = after.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture),
["per_page"] = _options.PageSize.ToString(CultureInfo.InvariantCulture),
["page"] = page.ToString(CultureInfo.InvariantCulture)
};
builder.Query = string.Join('&', parameters.Select(static kvp =>
$"{Uri.EscapeDataString(kvp.Key)}={Uri.EscapeDataString(kvp.Value)}"));
return builder.Uri;
}
}
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Distro.RedHat.Configuration;
using StellaOps.Concelier.Connector.Distro.RedHat.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Distro.RedHat;
public sealed class RedHatConnector : IFeedConnector
{
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly ILogger<RedHatConnector> _logger;
private readonly RedHatOptions _options;
private readonly TimeProvider _timeProvider;
public RedHatConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<RedHatOptions> options,
TimeProvider? timeProvider,
ILogger<RedHatConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => RedHatConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var baseline = cursor.LastReleasedOn ?? now - _options.InitialBackfill;
var overlap = _options.Overlap > TimeSpan.Zero ? _options.Overlap : TimeSpan.Zero;
var afterThreshold = baseline - overlap;
if (afterThreshold < DateTimeOffset.UnixEpoch)
{
afterThreshold = DateTimeOffset.UnixEpoch;
}
ProvenanceDiagnostics.ReportResumeWindow(SourceName, afterThreshold, _logger);
var processedSet = new HashSet<string>(cursor.ProcessedAdvisoryIds, StringComparer.OrdinalIgnoreCase);
var newSummaries = new List<RedHatSummaryItem>();
var stopDueToOlderData = false;
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
for (var page = 1; page <= _options.MaxPagesPerFetch; page++)
{
var summaryUri = BuildSummaryUri(afterThreshold, page);
var summaryKey = summaryUri.ToString();
touchedResources.Add(summaryKey);
var cachedSummary = cursor.TryGetFetchCache(summaryKey);
var summaryMetadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["page"] = page.ToString(CultureInfo.InvariantCulture),
["type"] = "summary"
};
var summaryRequest = new SourceFetchRequest(RedHatOptions.HttpClientName, SourceName, summaryUri)
{
Metadata = summaryMetadata,
ETag = cachedSummary?.ETag,
LastModified = cachedSummary?.LastModified,
TimeoutOverride = _options.FetchTimeout,
};
SourceFetchContentResult summaryResult;
try
{
summaryResult = await _fetchService.FetchContentAsync(summaryRequest, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat Hydra summary fetch failed for {Uri}", summaryUri);
throw;
}
if (summaryResult.IsNotModified)
{
if (page == 1)
{
break;
}
continue;
}
if (!summaryResult.IsSuccess || summaryResult.Content is null)
{
continue;
}
cursor = cursor.WithFetchCache(summaryKey, summaryResult.ETag, summaryResult.LastModified);
using var document = JsonDocument.Parse(summaryResult.Content);
if (document.RootElement.ValueKind != JsonValueKind.Array)
{
_logger.LogWarning(
"Red Hat Hydra summary response had unexpected payload kind {Kind} for {Uri}",
document.RootElement.ValueKind,
summaryUri);
break;
}
var pageCount = 0;
foreach (var element in document.RootElement.EnumerateArray())
{
if (!RedHatSummaryItem.TryParse(element, out var summary))
{
continue;
}
pageCount++;
if (cursor.LastReleasedOn.HasValue)
{
// Summaries are returned newest-first; once an entry predates the cursor,
// the remaining pages have already been processed.
if (summary.ReleasedOn < cursor.LastReleasedOn.Value)
{
stopDueToOlderData = true;
break;
}
if (summary.ReleasedOn == cursor.LastReleasedOn.Value && processedSet.Contains(summary.AdvisoryId))
{
continue;
}
}
newSummaries.Add(summary);
processedSet.Add(summary.AdvisoryId);
if (newSummaries.Count >= _options.MaxAdvisoriesPerFetch)
{
break;
}
}
if (newSummaries.Count >= _options.MaxAdvisoriesPerFetch || stopDueToOlderData)
{
break;
}
if (pageCount < _options.PageSize)
{
break;
}
}
if (newSummaries.Count == 0)
{
return;
}
newSummaries.Sort(static (left, right) =>
{
var compare = left.ReleasedOn.CompareTo(right.ReleasedOn);
return compare != 0
? compare
: string.CompareOrdinal(left.AdvisoryId, right.AdvisoryId);
});
var pendingDocuments = new HashSet<Guid>(cursor.PendingDocuments);
foreach (var summary in newSummaries)
{
var resourceUri = summary.ResourceUri;
var resourceKey = resourceUri.ToString();
touchedResources.Add(resourceKey);
var cached = cursor.TryGetFetchCache(resourceKey);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["advisoryId"] = summary.AdvisoryId,
["releasedOn"] = summary.ReleasedOn.ToString("O", CultureInfo.InvariantCulture)
};
var request = new SourceFetchRequest(RedHatOptions.HttpClientName, SourceName, resourceUri)
{
Metadata = metadata,
ETag = cached?.ETag,
LastModified = cached?.LastModified,
TimeoutOverride = _options.FetchTimeout,
};
try
{
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (result.IsNotModified)
{
continue;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
pendingDocuments.Add(result.Document.Id);
cursor = cursor.WithFetchCache(resourceKey, result.Document.Etag, result.Document.LastModified);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat Hydra advisory fetch failed for {Uri}", resourceUri);
throw;
}
}
var maxRelease = newSummaries.Max(static item => item.ReleasedOn);
var idsForMaxRelease = newSummaries
.Where(item => item.ReleasedOn == maxRelease)
.Select(item => item.AdvisoryId)
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
RedHatCursor updated;
if (cursor.LastReleasedOn.HasValue && maxRelease == cursor.LastReleasedOn.Value)
{
updated = cursor
.WithPendingDocuments(pendingDocuments)
.AddProcessedAdvisories(idsForMaxRelease)
.PruneFetchCache(touchedResources);
}
else
{
updated = cursor
.WithPendingDocuments(pendingDocuments)
.WithLastReleased(maxRelease, idsForMaxRelease)
.PruneFetchCache(touchedResources);
}
await UpdateCursorAsync(updated, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingFetch = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
DocumentRecord? document = null;
try
{
document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("Red Hat document {DocumentId} missing payload content; skipping", document.Id);
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var rawBytes = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
using var jsonDocument = JsonDocument.Parse(rawBytes);
var sanitized = JsonSerializer.Serialize(jsonDocument.RootElement);
var payload = BsonDocument.Parse(sanitized);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
"redhat.csaf.v2",
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingFetch.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
catch (Exception ex)
{
var uri = document?.Uri ?? documentId.ToString();
_logger.LogError(ex, "Red Hat CSAF parse failed for {Uri}", uri);
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingFetch)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
try
{
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var json = dto.Payload.ToJson(new JsonWriterSettings
{
OutputMode = JsonOutputMode.RelaxedExtendedJson,
});
using var jsonDocument = JsonDocument.Parse(json);
var advisory = RedHatMapper.Map(SourceName, dto, document, jsonDocument);
if (advisory is null)
{
pendingMappings.Remove(documentId);
continue;
}
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(documentId, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat map failed for document {DocumentId}", documentId);
}
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<RedHatCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var record = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return RedHatCursor.FromBsonDocument(record?.Cursor);
}
private async Task UpdateCursorAsync(RedHatCursor cursor, CancellationToken cancellationToken)
{
var completedAt = _timeProvider.GetUtcNow();
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), completedAt, cancellationToken).ConfigureAwait(false);
}
private Uri BuildSummaryUri(DateTimeOffset after, int page)
{
var builder = new UriBuilder(_options.BaseEndpoint);
var basePath = builder.Path?.TrimEnd('/') ?? string.Empty;
var summaryPath = _options.SummaryPath.TrimStart('/');
builder.Path = string.IsNullOrEmpty(basePath)
? $"/{summaryPath}"
: $"{basePath}/{summaryPath}";
var parameters = new Dictionary<string, string>(StringComparer.Ordinal)
{
["after"] = after.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture),
["per_page"] = _options.PageSize.ToString(CultureInfo.InvariantCulture),
["page"] = page.ToString(CultureInfo.InvariantCulture)
};
builder.Query = string.Join('&', parameters.Select(static kvp =>
$"{Uri.EscapeDataString(kvp.Key)}={Uri.EscapeDataString(kvp.Value)}"));
return builder.Uri;
}
}
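The `BuildSummaryUri` helper above composes the Hydra summary URL deterministically: base path and summary path joined without duplicate slashes, then `after`, `per_page`, and `page` percent-encoded and `&`-joined in a fixed order. A rough Python sketch of the same composition — the endpoint and path values below are illustrative, not the connector's configured defaults:

```python
from datetime import date
from urllib.parse import quote, urlsplit, urlunsplit


def build_summary_uri(base_endpoint: str, summary_path: str,
                      after: date, page: int, page_size: int) -> str:
    """Mirror of BuildSummaryUri: join paths, then emit an ordered,
    percent-encoded query string (after, per_page, page)."""
    scheme, netloc, base_path, _, _ = urlsplit(base_endpoint)
    trimmed = base_path.rstrip("/")
    suffix = summary_path.lstrip("/")
    path = f"{trimmed}/{suffix}" if trimmed else f"/{suffix}"
    params = {
        "after": after.strftime("%Y-%m-%d"),
        "per_page": str(page_size),
        "page": str(page),
    }
    query = "&".join(f"{quote(k, safe='')}={quote(v, safe='')}"
                     for k, v in params.items())
    return urlunsplit((scheme, netloc, path, query, ""))
```

Because the parameter order and encoding are fixed, repeated fetches for the same window produce byte-identical URLs, which is what lets the ETag/Last-Modified cache keyed on `summaryUri.ToString()` work.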


@@ -1,384 +1,384 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Ics.Kaspersky.Configuration;
using StellaOps.Concelier.Connector.Ics.Kaspersky.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Ics.Kaspersky;
public sealed class KasperskyConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.General)
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
WriteIndented = false,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly KasperskyFeedClient _feedClient;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly KasperskyOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<KasperskyConnector> _logger;
public KasperskyConnector(
KasperskyFeedClient feedClient,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<KasperskyOptions> options,
TimeProvider? timeProvider,
ILogger<KasperskyConnector> logger)
{
_feedClient = feedClient ?? throw new ArgumentNullException(nameof(feedClient));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => KasperskyConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var windowStart = cursor.LastPublished.HasValue
? cursor.LastPublished.Value - _options.WindowOverlap
: now - _options.WindowSize;
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var maxPublished = cursor.LastPublished ?? DateTimeOffset.MinValue;
var cursorState = cursor;
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
for (var page = 1; page <= _options.MaxPagesPerFetch; page++)
{
IReadOnlyList<KasperskyFeedItem> items;
try
{
items = await _feedClient.GetItemsAsync(page, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to load Kaspersky ICS feed page {Page}", page);
await _stateRepository.MarkFailureAsync(
SourceName,
now,
TimeSpan.FromMinutes(5),
ex.Message,
cancellationToken).ConfigureAwait(false);
throw;
}
if (items.Count == 0)
{
break;
}
foreach (var item in items)
{
if (item.Published < windowStart)
{
// Item predates the fetch window; bump the page counter so the outer paging loop exits too.
page = _options.MaxPagesPerFetch + 1;
break;
}
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["kaspersky.title"] = item.Title,
["kaspersky.link"] = item.Link.ToString(),
["kaspersky.published"] = item.Published.ToString("O"),
};
if (!string.IsNullOrWhiteSpace(item.Summary))
{
metadata["kaspersky.summary"] = item.Summary!;
}
var slug = ExtractSlug(item.Link);
if (!string.IsNullOrWhiteSpace(slug))
{
metadata["kaspersky.slug"] = slug;
}
var resourceKey = item.Link.ToString();
touchedResources.Add(resourceKey);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, resourceKey, cancellationToken).ConfigureAwait(false);
var fetchRequest = new SourceFetchRequest(KasperskyOptions.HttpClientName, SourceName, item.Link)
{
Metadata = metadata,
};
if (cursorState.TryGetFetchMetadata(resourceKey, out var cachedFetch))
{
fetchRequest = fetchRequest with
{
ETag = cachedFetch.ETag,
LastModified = cachedFetch.LastModified,
};
}
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(fetchRequest, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to fetch Kaspersky advisory {Link}", item.Link);
await _stateRepository.MarkFailureAsync(
SourceName,
_timeProvider.GetUtcNow(),
TimeSpan.FromMinutes(5),
ex.Message,
cancellationToken).ConfigureAwait(false);
throw;
}
if (result.IsNotModified)
{
continue;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
if (existing is not null
&& string.Equals(existing.Sha256, result.Document.Sha256, StringComparison.OrdinalIgnoreCase)
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal))
{
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
cursorState = cursorState.WithFetchMetadata(resourceKey, result.Document.Etag, result.Document.LastModified);
if (item.Published > maxPublished)
{
maxPublished = item.Published;
}
continue;
}
pendingDocuments.Add(result.Document.Id);
cursorState = cursorState.WithFetchMetadata(resourceKey, result.Document.Etag, result.Document.LastModified);
if (item.Published > maxPublished)
{
maxPublished = item.Published;
}
}
}
cursorState = cursorState.PruneFetchCache(touchedResources);
var updatedCursor = cursorState
.WithPendingDocuments(pendingDocuments)
.WithLastPublished(maxPublished == DateTimeOffset.MinValue ? cursor.LastPublished : maxPublished);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("Kaspersky document {DocumentId} missing GridFS content", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
var metadata = document.Metadata ?? new Dictionary<string, string>();
var title = metadata.TryGetValue("kaspersky.title", out var titleValue) ? titleValue : document.Uri;
var link = metadata.TryGetValue("kaspersky.link", out var linkValue) ? linkValue : document.Uri;
var published = metadata.TryGetValue("kaspersky.published", out var publishedValue) && DateTimeOffset.TryParse(publishedValue, out var parsedPublished)
? parsedPublished.ToUniversalTime()
: document.FetchedAt;
var summary = metadata.TryGetValue("kaspersky.summary", out var summaryValue) ? summaryValue : null;
var slug = metadata.TryGetValue("kaspersky.slug", out var slugValue) ? slugValue : ExtractSlug(new Uri(link, UriKind.Absolute));
var advisoryKey = string.IsNullOrWhiteSpace(slug) ? Guid.NewGuid().ToString("N") : slug;
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed downloading raw Kaspersky document {DocumentId}", document.Id);
throw;
}
var dto = KasperskyAdvisoryParser.Parse(advisoryKey, title, link, published, summary, rawBytes);
var payload = BsonDocument.Parse(JsonSerializer.Serialize(dto, SerializerOptions));
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "ics.kaspersky/1", payload, _timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
_logger.LogWarning("Skipping Kaspersky mapping for {DocumentId}: DTO or document missing", documentId);
pendingMappings.Remove(documentId);
continue;
}
var dtoJson = dto.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
KasperskyAdvisoryDto advisoryDto;
try
{
advisoryDto = JsonSerializer.Deserialize<KasperskyAdvisoryDto>(dtoJson, SerializerOptions)
?? throw new InvalidOperationException("Deserialized DTO was null.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize Kaspersky DTO for {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var fetchProvenance = new AdvisoryProvenance(SourceName, "document", document.Uri, document.FetchedAt);
var mappingProvenance = new AdvisoryProvenance(SourceName, "mapping", advisoryDto.AdvisoryKey, dto.ValidatedAt);
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
advisoryDto.AdvisoryKey,
};
foreach (var cve in advisoryDto.CveIds)
{
aliases.Add(cve);
}
var references = new List<AdvisoryReference>();
try
{
references.Add(new AdvisoryReference(
advisoryDto.Link,
"advisory",
"kaspersky-ics",
null,
new AdvisoryProvenance(SourceName, "reference", advisoryDto.Link, dto.ValidatedAt)));
}
catch (ArgumentException)
{
_logger.LogWarning("Invalid advisory link {Link} for {AdvisoryKey}", advisoryDto.Link, advisoryDto.AdvisoryKey);
}
foreach (var cve in advisoryDto.CveIds)
{
var url = $"https://www.cve.org/CVERecord?id={cve}";
try
{
references.Add(new AdvisoryReference(
url,
"advisory",
cve,
null,
new AdvisoryProvenance(SourceName, "reference", url, dto.ValidatedAt)));
}
catch (ArgumentException)
{
// Ignore CVE identifiers that produce an invalid reference URL.
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Ics.Kaspersky.Configuration;
using StellaOps.Concelier.Connector.Ics.Kaspersky.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Ics.Kaspersky;
public sealed class KasperskyConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.General)
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
WriteIndented = false,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly KasperskyFeedClient _feedClient;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly KasperskyOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<KasperskyConnector> _logger;
public KasperskyConnector(
KasperskyFeedClient feedClient,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<KasperskyOptions> options,
TimeProvider? timeProvider,
ILogger<KasperskyConnector> logger)
{
_feedClient = feedClient ?? throw new ArgumentNullException(nameof(feedClient));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => KasperskyConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var windowStart = cursor.LastPublished.HasValue
? cursor.LastPublished.Value - _options.WindowOverlap
: now - _options.WindowSize;
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var maxPublished = cursor.LastPublished ?? DateTimeOffset.MinValue;
var cursorState = cursor;
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
for (var page = 1; page <= _options.MaxPagesPerFetch; page++)
{
IReadOnlyList<KasperskyFeedItem> items;
try
{
items = await _feedClient.GetItemsAsync(page, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to load Kaspersky ICS feed page {Page}", page);
await _stateRepository.MarkFailureAsync(
SourceName,
now,
TimeSpan.FromMinutes(5),
ex.Message,
cancellationToken).ConfigureAwait(false);
throw;
}
if (items.Count == 0)
{
break;
}
foreach (var item in items)
{
if (item.Published < windowStart)
{
// Item predates the fetch window; bump the page counter so the outer paging loop exits too.
page = _options.MaxPagesPerFetch + 1;
break;
}
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["kaspersky.title"] = item.Title,
["kaspersky.link"] = item.Link.ToString(),
["kaspersky.published"] = item.Published.ToString("O"),
};
if (!string.IsNullOrWhiteSpace(item.Summary))
{
metadata["kaspersky.summary"] = item.Summary!;
}
var slug = ExtractSlug(item.Link);
if (!string.IsNullOrWhiteSpace(slug))
{
metadata["kaspersky.slug"] = slug;
}
var resourceKey = item.Link.ToString();
touchedResources.Add(resourceKey);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, resourceKey, cancellationToken).ConfigureAwait(false);
var fetchRequest = new SourceFetchRequest(KasperskyOptions.HttpClientName, SourceName, item.Link)
{
Metadata = metadata,
};
if (cursorState.TryGetFetchMetadata(resourceKey, out var cachedFetch))
{
fetchRequest = fetchRequest with
{
ETag = cachedFetch.ETag,
LastModified = cachedFetch.LastModified,
};
}
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(fetchRequest, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to fetch Kaspersky advisory {Link}", item.Link);
await _stateRepository.MarkFailureAsync(
SourceName,
_timeProvider.GetUtcNow(),
TimeSpan.FromMinutes(5),
ex.Message,
cancellationToken).ConfigureAwait(false);
throw;
}
if (result.IsNotModified)
{
continue;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
if (existing is not null
&& string.Equals(existing.Sha256, result.Document.Sha256, StringComparison.OrdinalIgnoreCase)
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal))
{
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
cursorState = cursorState.WithFetchMetadata(resourceKey, result.Document.Etag, result.Document.LastModified);
if (item.Published > maxPublished)
{
maxPublished = item.Published;
}
continue;
}
pendingDocuments.Add(result.Document.Id);
cursorState = cursorState.WithFetchMetadata(resourceKey, result.Document.Etag, result.Document.LastModified);
if (item.Published > maxPublished)
{
maxPublished = item.Published;
}
}
}
cursorState = cursorState.PruneFetchCache(touchedResources);
var updatedCursor = cursorState
.WithPendingDocuments(pendingDocuments)
.WithLastPublished(maxPublished == DateTimeOffset.MinValue ? cursor.LastPublished : maxPublished);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("Kaspersky document {DocumentId} missing payload content", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
var metadata = document.Metadata ?? new Dictionary<string, string>();
var title = metadata.TryGetValue("kaspersky.title", out var titleValue) ? titleValue : document.Uri;
var link = metadata.TryGetValue("kaspersky.link", out var linkValue) ? linkValue : document.Uri;
var published = metadata.TryGetValue("kaspersky.published", out var publishedValue) && DateTimeOffset.TryParse(publishedValue, out var parsedPublished)
? parsedPublished.ToUniversalTime()
: document.FetchedAt;
var summary = metadata.TryGetValue("kaspersky.summary", out var summaryValue) ? summaryValue : null;
var slug = metadata.TryGetValue("kaspersky.slug", out var slugValue) ? slugValue : ExtractSlug(new Uri(link, UriKind.Absolute));
var advisoryKey = string.IsNullOrWhiteSpace(slug) ? Guid.NewGuid().ToString("N") : slug;
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed downloading raw Kaspersky document {DocumentId}", document.Id);
throw;
}
var dto = KasperskyAdvisoryParser.Parse(advisoryKey, title, link, published, summary, rawBytes);
var payload = BsonDocument.Parse(JsonSerializer.Serialize(dto, SerializerOptions));
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "ics.kaspersky/1", payload, _timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
_logger.LogWarning("Skipping Kaspersky mapping for {DocumentId}: DTO or document missing", documentId);
pendingMappings.Remove(documentId);
continue;
}
var dtoJson = dto.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
KasperskyAdvisoryDto advisoryDto;
try
{
advisoryDto = JsonSerializer.Deserialize<KasperskyAdvisoryDto>(dtoJson, SerializerOptions)
?? throw new InvalidOperationException("Deserialized DTO was null.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize Kaspersky DTO for {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var fetchProvenance = new AdvisoryProvenance(SourceName, "document", document.Uri, document.FetchedAt);
var mappingProvenance = new AdvisoryProvenance(SourceName, "mapping", advisoryDto.AdvisoryKey, dto.ValidatedAt);
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
advisoryDto.AdvisoryKey,
};
foreach (var cve in advisoryDto.CveIds)
{
aliases.Add(cve);
}
var references = new List<AdvisoryReference>();
try
{
references.Add(new AdvisoryReference(
advisoryDto.Link,
"advisory",
"kaspersky-ics",
null,
new AdvisoryProvenance(SourceName, "reference", advisoryDto.Link, dto.ValidatedAt)));
}
catch (ArgumentException)
{
_logger.LogWarning("Invalid advisory link {Link} for {AdvisoryKey}", advisoryDto.Link, advisoryDto.AdvisoryKey);
}
foreach (var cve in advisoryDto.CveIds)
{
var url = $"https://www.cve.org/CVERecord?id={cve}";
try
{
references.Add(new AdvisoryReference(
url,
"advisory",
cve,
null,
new AdvisoryProvenance(SourceName, "reference", url, dto.ValidatedAt)));
}
catch (ArgumentException)
{
// ignore malformed reference URLs
}
}
var affectedPackages = new List<AffectedPackage>();
foreach (var vendor in advisoryDto.VendorNames)
{
@@ -413,52 +413,52 @@ public sealed class KasperskyConnector : IFeedConnector
statuses: Array.Empty<AffectedPackageStatus>(),
provenance: provenance));
}
var advisory = new Advisory(
advisoryDto.AdvisoryKey,
advisoryDto.Title,
advisoryDto.Summary ?? advisoryDto.Content,
language: "en",
published: advisoryDto.Published,
modified: advisoryDto.Published,
severity: null,
exploitKnown: false,
aliases: aliases,
references: references,
affectedPackages: affectedPackages,
cvssMetrics: Array.Empty<CvssMetric>(),
provenance: new[] { fetchProvenance, mappingProvenance });
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<KasperskyCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? KasperskyCursor.Empty : KasperskyCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(KasperskyCursor cursor, CancellationToken cancellationToken)
{
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
private static string? ExtractSlug(Uri link)
{
var segments = link.Segments;
if (segments.Length == 0)
{
return null;
}
var last = segments[^1].Trim('/');
return string.IsNullOrWhiteSpace(last) && segments.Length > 1 ? segments[^2].Trim('/') : last;
}
}

@@ -1,325 +1,325 @@
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Jvn.Configuration;
using StellaOps.Concelier.Connector.Jvn.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.JpFlags;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Jvn;
public sealed class JvnConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.General)
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
WriteIndented = false,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly MyJvnClient _client;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly IJpFlagStore _jpFlagStore;
private readonly ISourceStateRepository _stateRepository;
private readonly TimeProvider _timeProvider;
private readonly JvnOptions _options;
private readonly ILogger<JvnConnector> _logger;
public JvnConnector(
MyJvnClient client,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
IJpFlagStore jpFlagStore,
ISourceStateRepository stateRepository,
IOptions<JvnOptions> options,
TimeProvider? timeProvider,
ILogger<JvnConnector> logger)
{
_client = client ?? throw new ArgumentNullException(nameof(client));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_jpFlagStore = jpFlagStore ?? throw new ArgumentNullException(nameof(jpFlagStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => JvnConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var windowEnd = now;
var defaultWindowStart = windowEnd - _options.WindowSize;
var windowStart = cursor.LastCompletedWindowEnd.HasValue
? cursor.LastCompletedWindowEnd.Value - _options.WindowOverlap
: defaultWindowStart;
if (windowStart < defaultWindowStart)
{
windowStart = defaultWindowStart;
}
if (windowStart >= windowEnd)
{
windowStart = windowEnd - TimeSpan.FromHours(1);
}
_logger.LogInformation("JVN fetch window {WindowStart:o} - {WindowEnd:o}", windowStart, windowEnd);
IReadOnlyList<JvnOverviewItem> overviewItems;
try
{
overviewItems = await _client.GetOverviewAsync(windowStart, windowEnd, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to retrieve JVN overview between {Start:o} and {End:o}", windowStart, windowEnd);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
_logger.LogInformation("JVN overview returned {Count} items", overviewItems.Count);
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
foreach (var item in overviewItems)
{
cancellationToken.ThrowIfCancellationRequested();
var detailUri = _client.BuildDetailUri(item.VulnerabilityId);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["jvn.vulnId"] = item.VulnerabilityId,
["jvn.detailUrl"] = detailUri.ToString(),
};
if (item.DateFirstPublished.HasValue)
{
metadata["jvn.firstPublished"] = item.DateFirstPublished.Value.ToString("O");
}
if (item.DateLastUpdated.HasValue)
{
metadata["jvn.lastUpdated"] = item.DateLastUpdated.Value.ToString("O");
}
var result = await _fetchService.FetchAsync(
new SourceFetchRequest(JvnOptions.HttpClientName, SourceName, detailUri)
{
Metadata = metadata
},
cancellationToken).ConfigureAwait(false);
if (!result.IsSuccess || result.Document is null)
{
if (!result.IsNotModified)
{
_logger.LogWarning("JVN fetch for {Uri} returned status {Status}", detailUri, result.StatusCode);
}
continue;
}
_logger.LogDebug("JVN fetched document {DocumentId}", result.Document.Id);
pendingDocuments.Add(result.Document.Id);
}
var updatedCursor = cursor
.WithWindow(windowStart, windowEnd)
.WithCompletedWindow(windowEnd)
.WithPendingDocuments(pendingDocuments);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
_logger.LogDebug("JVN parse pending documents: {PendingCount}", cursor.PendingDocuments.Count);
Console.WriteLine($"JVN parse pending count: {cursor.PendingDocuments.Count}");
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
_logger.LogDebug("JVN parsing document {DocumentId}", documentId);
Console.WriteLine($"JVN parsing document {documentId}");
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
_logger.LogWarning("JVN document {DocumentId} no longer exists; skipping", documentId);
remainingDocuments.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("JVN document {DocumentId} is missing GridFS content; marking as failed", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Unable to download raw JVN document {DocumentId}", document.Id);
throw;
}
JvnDetailDto detail;
try
{
detail = JvnDetailParser.Parse(rawBytes, document.Uri);
}
catch (JvnSchemaValidationException ex)
{
Console.WriteLine($"JVN schema validation exception: {ex.Message}");
_logger.LogWarning(ex, "JVN schema validation failed for document {DocumentId} ({Uri})", document.Id, document.Uri);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
throw;
}
var sanitizedJson = JsonSerializer.Serialize(detail, SerializerOptions);
var payload = BsonDocument.Parse(sanitizedJson);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
JvnConstants.DtoSchemaVersion,
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
Console.WriteLine($"Added mapping for {documentId}");
_logger.LogDebug("JVN parsed document {DocumentId}", documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
_logger.LogDebug("JVN map pending mappings: {PendingCount}", cursor.PendingMappings.Count);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
_logger.LogWarning("Skipping JVN mapping for {DocumentId}: DTO or document missing", documentId);
pendingMappings.Remove(documentId);
continue;
}
var dtoJson = dto.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
JvnDetailDto detail;
try
{
detail = JsonSerializer.Deserialize<JvnDetailDto>(dtoJson, SerializerOptions)
?? throw new InvalidOperationException("Deserialized DTO was null.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize JVN DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var (advisory, flag) = JvnAdvisoryMapper.Map(detail, document, dto, _timeProvider);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _jpFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
_logger.LogDebug("JVN mapped document {DocumentId}", documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<JvnCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? JvnCursor.Empty : JvnCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(JvnCursor cursor, CancellationToken cancellationToken)
{
var cursorDocument = cursor.ToBsonDocument();
await _stateRepository.UpdateCursorAsync(SourceName, cursorDocument, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
}
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Jvn.Configuration;
using StellaOps.Concelier.Connector.Jvn.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.JpFlags;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Jvn;
public sealed class JvnConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.General)
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
WriteIndented = false,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly MyJvnClient _client;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly IJpFlagStore _jpFlagStore;
private readonly ISourceStateRepository _stateRepository;
private readonly TimeProvider _timeProvider;
private readonly JvnOptions _options;
private readonly ILogger<JvnConnector> _logger;
public JvnConnector(
MyJvnClient client,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
IJpFlagStore jpFlagStore,
ISourceStateRepository stateRepository,
IOptions<JvnOptions> options,
TimeProvider? timeProvider,
ILogger<JvnConnector> logger)
{
_client = client ?? throw new ArgumentNullException(nameof(client));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_jpFlagStore = jpFlagStore ?? throw new ArgumentNullException(nameof(jpFlagStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => JvnConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var windowEnd = now;
var defaultWindowStart = windowEnd - _options.WindowSize;
var windowStart = cursor.LastCompletedWindowEnd.HasValue
? cursor.LastCompletedWindowEnd.Value - _options.WindowOverlap
: defaultWindowStart;
if (windowStart < defaultWindowStart)
{
windowStart = defaultWindowStart;
}
if (windowStart >= windowEnd)
{
windowStart = windowEnd - TimeSpan.FromHours(1);
}
_logger.LogInformation("JVN fetch window {WindowStart:o} - {WindowEnd:o}", windowStart, windowEnd);
IReadOnlyList<JvnOverviewItem> overviewItems;
try
{
overviewItems = await _client.GetOverviewAsync(windowStart, windowEnd, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to retrieve JVN overview between {Start:o} and {End:o}", windowStart, windowEnd);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
_logger.LogInformation("JVN overview returned {Count} items", overviewItems.Count);
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
foreach (var item in overviewItems)
{
cancellationToken.ThrowIfCancellationRequested();
var detailUri = _client.BuildDetailUri(item.VulnerabilityId);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["jvn.vulnId"] = item.VulnerabilityId,
["jvn.detailUrl"] = detailUri.ToString(),
};
if (item.DateFirstPublished.HasValue)
{
metadata["jvn.firstPublished"] = item.DateFirstPublished.Value.ToString("O");
}
if (item.DateLastUpdated.HasValue)
{
metadata["jvn.lastUpdated"] = item.DateLastUpdated.Value.ToString("O");
}
var result = await _fetchService.FetchAsync(
new SourceFetchRequest(JvnOptions.HttpClientName, SourceName, detailUri)
{
Metadata = metadata
},
cancellationToken).ConfigureAwait(false);
if (!result.IsSuccess || result.Document is null)
{
if (!result.IsNotModified)
{
_logger.LogWarning("JVN fetch for {Uri} returned status {Status}", detailUri, result.StatusCode);
}
continue;
}
_logger.LogDebug("JVN fetched document {DocumentId}", result.Document.Id);
pendingDocuments.Add(result.Document.Id);
}
var updatedCursor = cursor
.WithWindow(windowStart, windowEnd)
.WithCompletedWindow(windowEnd)
.WithPendingDocuments(pendingDocuments);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
_logger.LogDebug("JVN parse pending documents: {PendingCount}", cursor.PendingDocuments.Count);
Console.WriteLine($"JVN parse pending count: {cursor.PendingDocuments.Count}");
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
_logger.LogDebug("JVN parsing document {DocumentId}", documentId);
Console.WriteLine($"JVN parsing document {documentId}");
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
_logger.LogWarning("JVN document {DocumentId} no longer exists; skipping", documentId);
remainingDocuments.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("JVN document {DocumentId} is missing stored payload content; marking as failed", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Unable to download raw JVN document {DocumentId}", document.Id);
throw;
}
JvnDetailDto detail;
try
{
detail = JvnDetailParser.Parse(rawBytes, document.Uri);
}
catch (JvnSchemaValidationException ex)
{
Console.WriteLine($"JVN schema validation exception: {ex.Message}");
_logger.LogWarning(ex, "JVN schema validation failed for document {DocumentId} ({Uri})", document.Id, document.Uri);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
throw;
}
var sanitizedJson = JsonSerializer.Serialize(detail, SerializerOptions);
var payload = BsonDocument.Parse(sanitizedJson);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
JvnConstants.DtoSchemaVersion,
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
Console.WriteLine($"Added mapping for {documentId}");
_logger.LogDebug("JVN parsed document {DocumentId}", documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
_logger.LogDebug("JVN map pending mappings: {PendingCount}", cursor.PendingMappings.Count);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
_logger.LogWarning("Skipping JVN mapping for {DocumentId}: DTO or document missing", documentId);
pendingMappings.Remove(documentId);
continue;
}
var dtoJson = dto.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
JvnDetailDto detail;
try
{
detail = JsonSerializer.Deserialize<JvnDetailDto>(dtoJson, SerializerOptions)
?? throw new InvalidOperationException("Deserialized DTO was null.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize JVN DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var (advisory, flag) = JvnAdvisoryMapper.Map(detail, document, dto, _timeProvider);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _jpFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
_logger.LogDebug("JVN mapped document {DocumentId}", documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<JvnCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? JvnCursor.Empty : JvnCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(JvnCursor cursor, CancellationToken cancellationToken)
{
var cursorDocument = cursor.ToBsonDocument();
await _stateRepository.UpdateCursorAsync(SourceName, cursorDocument, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
}

@@ -1,441 +1,441 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Common.Json;
using StellaOps.Concelier.Connector.Kev.Configuration;
using StellaOps.Concelier.Connector.Kev.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Kev;
public sealed class KevConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
PropertyNameCaseInsensitive = true,
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
};
private const string SchemaVersion = "kev.catalog.v1";
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly KevOptions _options;
private readonly IJsonSchemaValidator _schemaValidator;
private readonly TimeProvider _timeProvider;
private readonly ILogger<KevConnector> _logger;
private readonly KevDiagnostics _diagnostics;
public KevConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<KevOptions> options,
IJsonSchemaValidator schemaValidator,
KevDiagnostics diagnostics,
TimeProvider? timeProvider,
ILogger<KevConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_schemaValidator = schemaValidator ?? throw new ArgumentNullException(nameof(schemaValidator));
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => KevConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
try
{
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, _options.FeedUri.ToString(), cancellationToken).ConfigureAwait(false);
var request = new SourceFetchRequest(
KevOptions.HttpClientName,
SourceName,
_options.FeedUri)
{
Metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["kev.cursor.catalogVersion"] = cursor.CatalogVersion ?? string.Empty,
["kev.cursor.catalogReleased"] = cursor.CatalogReleased?.ToString("O") ?? string.Empty,
},
ETag = existing?.Etag,
LastModified = existing?.LastModified,
TimeoutOverride = _options.RequestTimeout,
AcceptHeaders = new[] { "application/json", "text/json" },
};
_diagnostics.FetchAttempt();
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (result.IsNotModified)
{
_diagnostics.FetchUnchanged();
_logger.LogInformation(
"KEV catalog not modified (catalogVersion={CatalogVersion}, etag={Etag})",
cursor.CatalogVersion ?? "(unknown)",
existing?.Etag ?? "(none)");
await UpdateCursorAsync(cursor, cancellationToken).ConfigureAwait(false);
return;
}
if (!result.IsSuccess || result.Document is null)
{
_diagnostics.FetchFailure();
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), "KEV feed returned no content.", cancellationToken).ConfigureAwait(false);
return;
}
_diagnostics.FetchSuccess();
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var pendingDocumentsBefore = pendingDocuments.Count;
var pendingMappingsBefore = pendingMappings.Count;
pendingDocuments.Add(result.Document.Id);
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings);
var document = result.Document;
var lastModified = document.LastModified?.ToUniversalTime().ToString("O") ?? "(unknown)";
_logger.LogInformation(
"Fetched KEV catalog document {DocumentId} (etag={Etag}, lastModified={LastModified}) pendingDocuments={PendingDocumentsBefore}->{PendingDocumentsAfter} pendingMappings={PendingMappingsBefore}->{PendingMappingsAfter}",
document.Id,
document.Etag ?? "(none)",
lastModified,
pendingDocumentsBefore,
pendingDocuments.Count,
pendingMappingsBefore,
pendingMappings.Count);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "KEV fetch failed for {Uri}", _options.FeedUri);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var latestCatalogVersion = cursor.CatalogVersion;
var latestCatalogReleased = cursor.CatalogReleased;
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_diagnostics.ParseFailure("missingPayload", cursor.CatalogVersion);
_logger.LogWarning("KEV document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.ParseFailure("download", cursor.CatalogVersion);
_logger.LogError(ex, "KEV parse failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
KevCatalogDto? catalog = null;
string? catalogVersion = null;
try
{
using var jsonDocument = JsonDocument.Parse(rawBytes);
catalogVersion = TryGetCatalogVersion(jsonDocument.RootElement);
_schemaValidator.Validate(jsonDocument, KevSchemaProvider.Schema, document.Uri);
catalog = jsonDocument.RootElement.Deserialize<KevCatalogDto>(SerializerOptions);
}
catch (JsonSchemaValidationException ex)
{
_diagnostics.ParseFailure("schema", catalogVersion);
_logger.LogWarning(ex, "KEV schema validation failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
catch (JsonException ex)
{
_diagnostics.ParseFailure("invalidJson", catalogVersion);
_logger.LogError(ex, "KEV JSON parsing failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
catch (Exception ex)
{
_diagnostics.ParseFailure("deserialize", catalogVersion);
_logger.LogError(ex, "KEV catalog deserialization failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (catalog is null)
{
_diagnostics.ParseFailure("emptyCatalog", catalogVersion);
_logger.LogWarning("KEV catalog payload was empty for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var entryCount = catalog.Vulnerabilities?.Count ?? 0;
var released = catalog.DateReleased?.ToUniversalTime();
RecordCatalogAnomalies(catalog);
try
{
var payloadJson = JsonSerializer.Serialize(catalog, SerializerOptions);
var payload = BsonDocument.Parse(payloadJson);
_logger.LogInformation(
"Parsed KEV catalog document {DocumentId} (version={CatalogVersion}, released={Released}, entries={EntryCount})",
document.Id,
catalog.CatalogVersion ?? "(unknown)",
released,
entryCount);
_diagnostics.CatalogParsed(catalog.CatalogVersion, entryCount);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
SchemaVersion,
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Add(document.Id);
latestCatalogVersion = catalog.CatalogVersion ?? latestCatalogVersion;
latestCatalogReleased = catalog.DateReleased ?? latestCatalogReleased;
}
catch (Exception ex)
{
_logger.LogError(ex, "KEV DTO persistence failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings)
.WithCatalogMetadata(latestCatalogVersion, latestCatalogReleased);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToHashSet();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
KevCatalogDto? catalog;
try
{
var dtoJson = dtoRecord.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
catalog = JsonSerializer.Deserialize<KevCatalogDto>(dtoJson, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "KEV mapping: failed to deserialize DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (catalog is null)
{
_logger.LogWarning("KEV mapping: DTO payload was empty for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var feedUri = TryParseUri(document.Uri) ?? _options.FeedUri;
var advisories = KevMapper.Map(catalog, SourceName, feedUri, document.FetchedAt, dtoRecord.ValidatedAt);
var entryCount = catalog.Vulnerabilities?.Count ?? 0;
var mappedCount = advisories.Count;
var skippedCount = Math.Max(0, entryCount - mappedCount);
_logger.LogInformation(
"Mapped {MappedCount}/{EntryCount} KEV advisories from catalog version {CatalogVersion} (skipped={SkippedCount})",
mappedCount,
entryCount,
catalog.CatalogVersion ?? "(unknown)",
skippedCount);
_diagnostics.AdvisoriesMapped(catalog.CatalogVersion, mappedCount);
foreach (var advisory in advisories)
{
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
}
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<KevCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? KevCursor.Empty : KevCursor.FromBson(state.Cursor);
}
private Task UpdateCursorAsync(KevCursor cursor, CancellationToken cancellationToken)
{
return _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), _timeProvider.GetUtcNow(), cancellationToken);
}
private void RecordCatalogAnomalies(KevCatalogDto catalog)
{
ArgumentNullException.ThrowIfNull(catalog);
var version = catalog.CatalogVersion;
var vulnerabilities = catalog.Vulnerabilities ?? Array.Empty<KevVulnerabilityDto>();
if (catalog.Count != vulnerabilities.Count)
{
_diagnostics.RecordAnomaly("countMismatch", version);
}
foreach (var entry in vulnerabilities)
{
if (entry is null)
{
_diagnostics.RecordAnomaly("nullEntry", version);
continue;
}
if (string.IsNullOrWhiteSpace(entry.CveId))
{
_diagnostics.RecordAnomaly("missingCveId", version);
}
}
}
private static string? TryGetCatalogVersion(JsonElement root)
{
if (root.ValueKind != JsonValueKind.Object)
{
return null;
}
if (root.TryGetProperty("catalogVersion", out var versionElement) && versionElement.ValueKind == JsonValueKind.String)
{
return versionElement.GetString();
}
return null;
}
private static Uri? TryParseUri(string? value)
=> Uri.TryCreate(value, UriKind.Absolute, out var uri) ? uri : null;
}
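
Both the before and after listings share the same cursor-driven Fetch → Parse → Map pipeline: each stage loads the cursor from source state, drains its pending set, and persists an updated copy before returning. A minimal sketch of that cursor shape, with illustrative names only (the real type is `KevCursor` in `Connector.Kev.Internal`, which additionally round-trips through BSON via `FromBson`/`ToBsonDocument`):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only — not the real KevCursor. It shows the
// immutable With* update pattern the connector code above depends on:
// each stage derives a new cursor rather than mutating shared state.
public sealed record CursorSketch(
    IReadOnlyCollection<Guid> PendingDocuments,
    IReadOnlyCollection<Guid> PendingMappings,
    string? CatalogVersion,
    DateTimeOffset? CatalogReleased)
{
    public static CursorSketch Empty { get; } =
        new(Array.Empty<Guid>(), Array.Empty<Guid>(), null, null);

    public CursorSketch WithPendingDocuments(IEnumerable<Guid> ids)
        => this with { PendingDocuments = ids.Distinct().ToArray() };

    public CursorSketch WithPendingMappings(IEnumerable<Guid> ids)
        => this with { PendingMappings = ids.Distinct().ToArray() };

    public CursorSketch WithCatalogMetadata(string? version, DateTimeOffset? released)
        => this with { CatalogVersion = version, CatalogReleased = released };
}
```

Because the cursor is only persisted after each stage completes, a crash mid-stage re-processes the still-pending documents on the next run instead of losing them — the per-document `UpdateStatusAsync` calls make that replay idempotent.
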
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Common.Json;
using StellaOps.Concelier.Connector.Kev.Configuration;
using StellaOps.Concelier.Connector.Kev.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Kev;
public sealed class KevConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
PropertyNameCaseInsensitive = true,
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
};
private const string SchemaVersion = "kev.catalog.v1";
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly KevOptions _options;
private readonly IJsonSchemaValidator _schemaValidator;
private readonly TimeProvider _timeProvider;
private readonly ILogger<KevConnector> _logger;
private readonly KevDiagnostics _diagnostics;
public KevConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<KevOptions> options,
IJsonSchemaValidator schemaValidator,
KevDiagnostics diagnostics,
TimeProvider? timeProvider,
ILogger<KevConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_schemaValidator = schemaValidator ?? throw new ArgumentNullException(nameof(schemaValidator));
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => KevConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
try
{
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, _options.FeedUri.ToString(), cancellationToken).ConfigureAwait(false);
var request = new SourceFetchRequest(
KevOptions.HttpClientName,
SourceName,
_options.FeedUri)
{
Metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["kev.cursor.catalogVersion"] = cursor.CatalogVersion ?? string.Empty,
["kev.cursor.catalogReleased"] = cursor.CatalogReleased?.ToString("O") ?? string.Empty,
},
ETag = existing?.Etag,
LastModified = existing?.LastModified,
TimeoutOverride = _options.RequestTimeout,
AcceptHeaders = new[] { "application/json", "text/json" },
};
_diagnostics.FetchAttempt();
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (result.IsNotModified)
{
_diagnostics.FetchUnchanged();
_logger.LogInformation(
"KEV catalog not modified (catalogVersion={CatalogVersion}, etag={Etag})",
cursor.CatalogVersion ?? "(unknown)",
existing?.Etag ?? "(none)");
await UpdateCursorAsync(cursor, cancellationToken).ConfigureAwait(false);
return;
}
if (!result.IsSuccess || result.Document is null)
{
_diagnostics.FetchFailure();
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), "KEV feed returned no content.", cancellationToken).ConfigureAwait(false);
return;
}
_diagnostics.FetchSuccess();
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var pendingDocumentsBefore = pendingDocuments.Count;
var pendingMappingsBefore = pendingMappings.Count;
pendingDocuments.Add(result.Document.Id);
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings);
var document = result.Document;
var lastModified = document.LastModified?.ToUniversalTime().ToString("O") ?? "(unknown)";
_logger.LogInformation(
"Fetched KEV catalog document {DocumentId} (etag={Etag}, lastModified={LastModified}) pendingDocuments={PendingDocumentsBefore}->{PendingDocumentsAfter} pendingMappings={PendingMappingsBefore}->{PendingMappingsAfter}",
document.Id,
document.Etag ?? "(none)",
lastModified,
pendingDocumentsBefore,
pendingDocuments.Count,
pendingMappingsBefore,
pendingMappings.Count);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "KEV fetch failed for {Uri}", _options.FeedUri);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var latestCatalogVersion = cursor.CatalogVersion;
var latestCatalogReleased = cursor.CatalogReleased;
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_diagnostics.ParseFailure("missingPayload", cursor.CatalogVersion);
_logger.LogWarning("KEV document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.ParseFailure("download", cursor.CatalogVersion);
_logger.LogError(ex, "KEV parse failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
KevCatalogDto? catalog = null;
string? catalogVersion = null;
try
{
using var jsonDocument = JsonDocument.Parse(rawBytes);
catalogVersion = TryGetCatalogVersion(jsonDocument.RootElement);
_schemaValidator.Validate(jsonDocument, KevSchemaProvider.Schema, document.Uri);
catalog = jsonDocument.RootElement.Deserialize<KevCatalogDto>(SerializerOptions);
}
catch (JsonSchemaValidationException ex)
{
_diagnostics.ParseFailure("schema", catalogVersion);
_logger.LogWarning(ex, "KEV schema validation failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
catch (JsonException ex)
{
_diagnostics.ParseFailure("invalidJson", catalogVersion);
_logger.LogError(ex, "KEV JSON parsing failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
catch (Exception ex)
{
_diagnostics.ParseFailure("deserialize", catalogVersion);
_logger.LogError(ex, "KEV catalog deserialization failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (catalog is null)
{
_diagnostics.ParseFailure("emptyCatalog", catalogVersion);
_logger.LogWarning("KEV catalog payload was empty for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var entryCount = catalog.Vulnerabilities?.Count ?? 0;
var released = catalog.DateReleased?.ToUniversalTime();
RecordCatalogAnomalies(catalog);
try
{
var payloadJson = JsonSerializer.Serialize(catalog, SerializerOptions);
var payload = BsonDocument.Parse(payloadJson);
_logger.LogInformation(
"Parsed KEV catalog document {DocumentId} (version={CatalogVersion}, released={Released}, entries={EntryCount})",
document.Id,
catalog.CatalogVersion ?? "(unknown)",
released,
entryCount);
_diagnostics.CatalogParsed(catalog.CatalogVersion, entryCount);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
SchemaVersion,
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Add(document.Id);
latestCatalogVersion = catalog.CatalogVersion ?? latestCatalogVersion;
latestCatalogReleased = catalog.DateReleased ?? latestCatalogReleased;
}
catch (Exception ex)
{
_logger.LogError(ex, "KEV DTO persistence failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings)
.WithCatalogMetadata(latestCatalogVersion, latestCatalogReleased);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToHashSet();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
KevCatalogDto? catalog;
try
{
var dtoJson = dtoRecord.Payload.ToJson(new MongoDB.Bson.IO.JsonWriterSettings
{
OutputMode = MongoDB.Bson.IO.JsonOutputMode.RelaxedExtendedJson,
});
catalog = JsonSerializer.Deserialize<KevCatalogDto>(dtoJson, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "KEV mapping: failed to deserialize DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (catalog is null)
{
_logger.LogWarning("KEV mapping: DTO payload was empty for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var feedUri = TryParseUri(document.Uri) ?? _options.FeedUri;
var advisories = KevMapper.Map(catalog, SourceName, feedUri, document.FetchedAt, dtoRecord.ValidatedAt);
var entryCount = catalog.Vulnerabilities?.Count ?? 0;
var mappedCount = advisories.Count;
var skippedCount = Math.Max(0, entryCount - mappedCount);
_logger.LogInformation(
"Mapped {MappedCount}/{EntryCount} KEV advisories from catalog version {CatalogVersion} (skipped={SkippedCount})",
mappedCount,
entryCount,
catalog.CatalogVersion ?? "(unknown)",
skippedCount);
_diagnostics.AdvisoriesMapped(catalog.CatalogVersion, mappedCount);
foreach (var advisory in advisories)
{
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
}
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<KevCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? KevCursor.Empty : KevCursor.FromBson(state.Cursor);
}
private Task UpdateCursorAsync(KevCursor cursor, CancellationToken cancellationToken)
{
return _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), _timeProvider.GetUtcNow(), cancellationToken);
}
private void RecordCatalogAnomalies(KevCatalogDto catalog)
{
ArgumentNullException.ThrowIfNull(catalog);
var version = catalog.CatalogVersion;
var vulnerabilities = catalog.Vulnerabilities ?? Array.Empty<KevVulnerabilityDto>();
if (catalog.Count != vulnerabilities.Count)
{
_diagnostics.RecordAnomaly("countMismatch", version);
}
foreach (var entry in vulnerabilities)
{
if (entry is null)
{
_diagnostics.RecordAnomaly("nullEntry", version);
continue;
}
if (string.IsNullOrWhiteSpace(entry.CveId))
{
_diagnostics.RecordAnomaly("missingCveId", version);
}
}
}
private static string? TryGetCatalogVersion(JsonElement root)
{
if (root.ValueKind != JsonValueKind.Object)
{
return null;
}
if (root.TryGetProperty("catalogVersion", out var versionElement) && versionElement.ValueKind == JsonValueKind.String)
{
return versionElement.GetString();
}
return null;
}
private static Uri? TryParseUri(string? value)
=> Uri.TryCreate(value, UriKind.Absolute, out var uri) ? uri : null;
}
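
`FetchAsync` above relies on conditional HTTP requests: the previously stored document's `ETag`/`LastModified` are echoed back on the next poll so an unchanged catalog short-circuits with `result.IsNotModified`. A hedged sketch of that handshake — the connector actually routes through `SourceFetchService`, so every name below is an assumption used for illustration, not the service's real API:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading;
using System.Threading.Tasks;

// Illustrative sketch of the conditional-GET contract FetchAsync depends
// on when it passes existing?.Etag / existing?.LastModified.
public static class ConditionalFetchSketch
{
    public static async Task<byte[]?> FetchIfChangedAsync(
        HttpClient http,
        Uri feedUri,
        string? previousEtag,
        DateTimeOffset? previousLastModified,
        CancellationToken cancellationToken)
    {
        using var request = new HttpRequestMessage(HttpMethod.Get, feedUri);
        if (previousEtag is not null)
        {
            // Assumes the stored ETag keeps the quotes the server returned.
            request.Headers.IfNoneMatch.Add(new EntityTagHeaderValue(previousEtag));
        }
        if (previousLastModified is not null)
        {
            request.Headers.IfModifiedSince = previousLastModified;
        }

        using var response = await http.SendAsync(request, cancellationToken).ConfigureAwait(false);
        if (response.StatusCode == HttpStatusCode.NotModified)
        {
            // Mirrors result.IsNotModified above: keep the cursor, skip parse/map.
            return null;
        }

        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsByteArrayAsync(cancellationToken).ConfigureAwait(false);
    }
}
```

For a catalog polled on a schedule, the 304 path is the common case, which is why the connector counts it separately (`FetchUnchanged`) and still refreshes the cursor timestamp without re-parsing.
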


@@ -1,136 +1,136 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Kisa.Configuration;
using StellaOps.Concelier.Connector.Kisa.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Kisa;
public sealed class KisaConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
PropertyNameCaseInsensitive = true,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly KisaFeedClient _feedClient;
private readonly KisaDetailParser _detailParser;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly KisaOptions _options;
private readonly KisaDiagnostics _diagnostics;
private readonly TimeProvider _timeProvider;
private readonly ILogger<KisaConnector> _logger;
public KisaConnector(
KisaFeedClient feedClient,
KisaDetailParser detailParser,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<KisaOptions> options,
KisaDiagnostics diagnostics,
TimeProvider? timeProvider,
ILogger<KisaConnector> logger)
{
_feedClient = feedClient ?? throw new ArgumentNullException(nameof(feedClient));
_detailParser = detailParser ?? throw new ArgumentNullException(nameof(detailParser));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => KisaConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
_diagnostics.FeedAttempt();
IReadOnlyList<KisaFeedItem> items;
try
{
items = await _feedClient.LoadAsync(cancellationToken).ConfigureAwait(false);
_diagnostics.FeedSuccess(items.Count);
if (items.Count > 0)
{
_logger.LogInformation("KISA feed returned {ItemCount} advisories", items.Count);
}
else
{
_logger.LogDebug("KISA feed returned no advisories");
}
}
catch (Exception ex)
{
_diagnostics.FeedFailure(ex.GetType().Name);
_logger.LogError(ex, "KISA feed fetch failed");
await _stateRepository.MarkFailureAsync(SourceName, now, _options.FailureBackoff, ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (items.Count == 0)
{
await UpdateCursorAsync(cursor.WithLastFetch(now), cancellationToken).ConfigureAwait(false);
return;
}
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var knownIds = new HashSet<string>(cursor.KnownIds, StringComparer.OrdinalIgnoreCase);
var processed = 0;
var latestPublished = cursor.LastPublished ?? DateTimeOffset.MinValue;
foreach (var item in items.OrderByDescending(static i => i.Published))
{
cancellationToken.ThrowIfCancellationRequested();
if (knownIds.Contains(item.AdvisoryId))
{
continue;
}
if (processed >= _options.MaxAdvisoriesPerFetch)
{
break;
}
var category = item.Category;
_diagnostics.DetailAttempt(category);
try
{
var detailUri = item.DetailPageUri;
@@ -149,125 +149,125 @@ public sealed class KisaConnector : IFeedConnector
LastModified = existing?.LastModified,
TimeoutOverride = _options.RequestTimeout,
};
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (result.IsNotModified)
{
_diagnostics.DetailUnchanged(category);
_logger.LogDebug("KISA detail {Idx} unchanged ({Category})", item.AdvisoryId, category ?? "unknown");
knownIds.Add(item.AdvisoryId);
continue;
}
if (!result.IsSuccess || result.Document is null)
{
_diagnostics.DetailFailure(category, "empty-document");
_logger.LogWarning("KISA detail fetch returned no document for {Idx}", item.AdvisoryId);
continue;
}
pendingDocuments.Add(result.Document.Id);
pendingMappings.Remove(result.Document.Id);
knownIds.Add(item.AdvisoryId);
processed++;
_diagnostics.DetailSuccess(category);
_logger.LogInformation(
"KISA fetched detail for {Idx} (documentId={DocumentId}, category={Category})",
item.AdvisoryId,
result.Document.Id,
category ?? "unknown");
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
}
catch (Exception ex)
{
_diagnostics.DetailFailure(category, ex.GetType().Name);
_logger.LogError(ex, "KISA detail fetch failed for {Idx}", item.AdvisoryId);
await _stateRepository.MarkFailureAsync(SourceName, now, _options.FailureBackoff, ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (item.Published > latestPublished)
{
latestPublished = item.Published;
_diagnostics.CursorAdvanced();
_logger.LogDebug("KISA advanced published cursor to {Published:O}", latestPublished);
}
}
var trimmedKnown = knownIds.Count > _options.MaxKnownAdvisories
? knownIds.OrderByDescending(id => id, StringComparer.OrdinalIgnoreCase)
.Take(_options.MaxKnownAdvisories)
.ToArray()
: knownIds.ToArray();
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithKnownIds(trimmedKnown)
.WithLastPublished(latestPublished == DateTimeOffset.MinValue ? cursor.LastPublished : latestPublished)
.WithLastFetch(now);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
_logger.LogInformation("KISA fetch stored {Processed} new documents (knownIds={KnownCount})", processed, trimmedKnown.Length);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var now = _timeProvider.GetUtcNow();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
_diagnostics.ParseFailure(null, "document-missing");
_logger.LogWarning("KISA document {DocumentId} missing during parse", documentId);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var category = GetCategory(document);
if (!document.PayloadId.HasValue)
{
_diagnostics.ParseFailure(category, "missing-gridfs");
_logger.LogWarning("KISA document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
_diagnostics.ParseAttempt(category);
byte[] payload;
try
{
payload = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.ParseFailure(category, "download");
_logger.LogError(ex, "KISA unable to download document {DocumentId}", document.Id);
throw;
}
KisaParsedAdvisory parsed;
try
{
@@ -279,28 +279,28 @@ public sealed class KisaConnector : IFeedConnector
{
_diagnostics.ParseFailure(category, "parse");
_logger.LogError(ex, "KISA failed to parse detail {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
_diagnostics.ParseSuccess(category);
_logger.LogDebug("KISA parsed detail for {DocumentId} ({Category})", document.Id, category ?? "unknown");
var dtoBson = BsonDocument.Parse(JsonSerializer.Serialize(parsed, SerializerOptions));
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "kisa.detail.v1", dtoBson, now);
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Add(document.Id);
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private static Uri? TryGetUri(IReadOnlyDictionary<string, string>? metadata, string key)
@@ -318,107 +318,107 @@ public sealed class KisaConnector : IFeedConnector
return Uri.TryCreate(value, UriKind.Absolute, out var uri) ? uri : null;
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToHashSet();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
_diagnostics.MapFailure(null, "document-missing");
_logger.LogWarning("KISA document {DocumentId} missing during map", documentId);
pendingMappings.Remove(documentId);
continue;
}
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null)
{
_diagnostics.MapFailure(null, "dto-missing");
_logger.LogWarning("KISA DTO missing for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
KisaParsedAdvisory? parsed;
try
{
parsed = JsonSerializer.Deserialize<KisaParsedAdvisory>(dtoRecord.Payload.ToJson(), SerializerOptions);
}
catch (Exception ex)
{
_diagnostics.MapFailure(null, "dto-deserialize");
_logger.LogError(ex, "KISA failed to deserialize DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (parsed is null)
{
_diagnostics.MapFailure(null, "dto-null");
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
try
{
var advisory = KisaMapper.Map(parsed, document, dtoRecord.ValidatedAt);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
_diagnostics.MapSuccess(parsed.Severity);
_logger.LogInformation("KISA mapped advisory {AdvisoryId} (severity={Severity})", parsed.AdvisoryId, parsed.Severity ?? "unknown");
}
catch (Exception ex)
{
_diagnostics.MapFailure(parsed.Severity, "map");
_logger.LogError(ex, "KISA mapping failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private static string? GetCategory(DocumentRecord document)
{
if (document.Metadata is null)
{
return null;
}
return document.Metadata.TryGetValue("kisa.category", out var category)
? category
: null;
}
private async Task<KisaCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? KisaCursor.Empty : KisaCursor.FromBson(state.Cursor);
}
private Task UpdateCursorAsync(KisaCursor cursor, CancellationToken cancellationToken)
{
var document = cursor.ToBsonDocument();
var completedAt = cursor.LastFetchAt ?? _timeProvider.GetUtcNow();
return _stateRepository.UpdateCursorAsync(SourceName, document, completedAt, cancellationToken);
}
}


@@ -1,43 +1,43 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Osv.Configuration;
using StellaOps.Concelier.Connector.Osv.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
using StellaOps.Cryptography;
namespace StellaOps.Concelier.Connector.Osv;
public sealed class OsvConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
PropertyNameCaseInsensitive = true,
};
private readonly IHttpClientFactory _httpClientFactory;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
@@ -46,10 +46,10 @@ public sealed class OsvConnector : IFeedConnector
private readonly ILogger<OsvConnector> _logger;
private readonly OsvDiagnostics _diagnostics;
private readonly ICryptoHash _hash;
public OsvConnector(
IHttpClientFactory httpClientFactory,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
@@ -73,197 +73,197 @@ public sealed class OsvConnector : IFeedConnector
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => OsvConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var cursorState = cursor;
var remainingCapacity = _options.MaxAdvisoriesPerFetch;
foreach (var ecosystem in _options.Ecosystems)
{
if (remainingCapacity <= 0)
{
break;
}
cancellationToken.ThrowIfCancellationRequested();
try
{
var result = await FetchEcosystemAsync(
ecosystem,
cursorState,
pendingDocuments,
now,
remainingCapacity,
cancellationToken).ConfigureAwait(false);
cursorState = result.Cursor;
remainingCapacity -= result.NewDocuments;
}
catch (Exception ex)
{
_logger.LogError(ex, "OSV fetch failed for ecosystem {Ecosystem}", ecosystem);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
}
cursorState = cursorState
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(cursor.PendingMappings);
await UpdateCursorAsync(cursorState, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("OSV document {DocumentId} missing GridFS content", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
byte[] bytes;
try
{
bytes = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Unable to download OSV raw document {DocumentId}", document.Id);
throw;
}
OsvVulnerabilityDto? dto;
try
{
dto = JsonSerializer.Deserialize<OsvVulnerabilityDto>(bytes, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogWarning(ex, "Failed to deserialize OSV document {DocumentId} ({Uri})", document.Id, document.Uri);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
if (dto is null || string.IsNullOrWhiteSpace(dto.Id))
{
_logger.LogWarning("OSV document {DocumentId} produced empty payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
var sanitized = JsonSerializer.Serialize(dto, SerializerOptions);
var payload = MongoDB.Bson.BsonDocument.Parse(sanitized);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
"osv.v1",
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var payloadJson = dto.Payload.ToJson(new JsonWriterSettings
{
OutputMode = JsonOutputMode.RelaxedExtendedJson,
});
OsvVulnerabilityDto? osvDto;
try
{
osvDto = JsonSerializer.Deserialize<OsvVulnerabilityDto>(payloadJson, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize OSV DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (osvDto is null || string.IsNullOrWhiteSpace(osvDto.Id))
{
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var ecosystem = document.Metadata is not null && document.Metadata.TryGetValue("osv.ecosystem", out var ecosystemValue)
? ecosystemValue
: "unknown";
@@ -289,232 +289,232 @@ public sealed class OsvConnector : IFeedConnector
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<OsvCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? OsvCursor.Empty : OsvCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(OsvCursor cursor, CancellationToken cancellationToken)
{
var document = cursor.ToBsonDocument();
await _stateRepository.UpdateCursorAsync(SourceName, document, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
private async Task<(OsvCursor Cursor, int NewDocuments)> FetchEcosystemAsync(
string ecosystem,
OsvCursor cursor,
HashSet<Guid> pendingDocuments,
DateTimeOffset now,
int remainingCapacity,
CancellationToken cancellationToken)
{
var client = _httpClientFactory.CreateClient(OsvOptions.HttpClientName);
client.Timeout = _options.HttpTimeout;
var archiveUri = BuildArchiveUri(ecosystem);
using var request = new HttpRequestMessage(HttpMethod.Get, archiveUri);
if (cursor.TryGetArchiveMetadata(ecosystem, out var archiveMetadata))
{
if (!string.IsNullOrWhiteSpace(archiveMetadata.ETag))
{
request.Headers.TryAddWithoutValidation("If-None-Match", archiveMetadata.ETag);
}
if (archiveMetadata.LastModified.HasValue)
{
request.Headers.IfModifiedSince = archiveMetadata.LastModified.Value;
}
}
using var response = await client.SendAsync(request, HttpCompletionOption.ResponseHeadersRead, cancellationToken).ConfigureAwait(false);
if (response.StatusCode == HttpStatusCode.NotModified)
{
return (cursor, 0);
}
response.EnsureSuccessStatusCode();
await using var archiveStream = await response.Content.ReadAsStreamAsync(cancellationToken).ConfigureAwait(false);
using var archive = new ZipArchive(archiveStream, ZipArchiveMode.Read, leaveOpen: false);
var existingLastModified = cursor.GetLastModified(ecosystem);
var processedIdsSet = cursor.ProcessedIdsByEcosystem.TryGetValue(ecosystem, out var processedIds)
? new HashSet<string>(processedIds, StringComparer.OrdinalIgnoreCase)
: new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var currentMaxModified = existingLastModified ?? DateTimeOffset.MinValue;
var currentProcessedIds = new HashSet<string>(processedIdsSet, StringComparer.OrdinalIgnoreCase);
var processedUpdated = false;
var newDocuments = 0;
var minimumModified = existingLastModified.HasValue
? existingLastModified.Value - _options.ModifiedTolerance
: now - _options.InitialBackfill;
ProvenanceDiagnostics.ReportResumeWindow(SourceName, minimumModified, _logger);
foreach (var entry in archive.Entries)
{
if (remainingCapacity <= 0)
{
break;
}
cancellationToken.ThrowIfCancellationRequested();
if (!entry.FullName.EndsWith(".json", StringComparison.OrdinalIgnoreCase))
{
continue;
}
await using var entryStream = entry.Open();
using var memory = new MemoryStream();
await entryStream.CopyToAsync(memory, cancellationToken).ConfigureAwait(false);
var bytes = memory.ToArray();
OsvVulnerabilityDto? dto;
try
{
dto = JsonSerializer.Deserialize<OsvVulnerabilityDto>(bytes, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogWarning(ex, "Failed to parse OSV entry {Entry} for ecosystem {Ecosystem}", entry.FullName, ecosystem);
continue;
}
if (dto is null || string.IsNullOrWhiteSpace(dto.Id))
{
continue;
}
var modified = (dto.Modified ?? dto.Published ?? DateTimeOffset.MinValue).ToUniversalTime();
if (modified < minimumModified)
{
continue;
}
if (existingLastModified.HasValue && modified < existingLastModified.Value - _options.ModifiedTolerance)
{
continue;
}
if (modified < currentMaxModified - _options.ModifiedTolerance)
{
continue;
}
if (modified == currentMaxModified && currentProcessedIds.Contains(dto.Id))
{
continue;
}
var documentUri = BuildDocumentUri(ecosystem, dto.Id);
var sha256 = _hash.ComputeHashHex(bytes, HashAlgorithms.Sha256);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, documentUri, cancellationToken).ConfigureAwait(false);
if (existing is not null && string.Equals(existing.Sha256, sha256, StringComparison.OrdinalIgnoreCase))
{
continue;
}
var gridFsId = await _rawDocumentStorage.UploadAsync(SourceName, documentUri, bytes, "application/json", null, cancellationToken).ConfigureAwait(false);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["osv.ecosystem"] = ecosystem,
["osv.id"] = dto.Id,
["osv.modified"] = modified.ToString("O"),
};
var recordId = existing?.Id ?? Guid.NewGuid();
var record = new DocumentRecord(
recordId,
SourceName,
documentUri,
_timeProvider.GetUtcNow(),
sha256,
DocumentStatuses.PendingParse,
"application/json",
Headers: null,
Metadata: metadata,
Etag: null,
LastModified: modified,
PayloadId: gridFsId,
ExpiresAt: null);
var upserted = await _documentStore.UpsertAsync(record, cancellationToken).ConfigureAwait(false);
pendingDocuments.Add(upserted.Id);
newDocuments++;
remainingCapacity--;
if (modified > currentMaxModified)
{
currentMaxModified = modified;
currentProcessedIds = new HashSet<string>(StringComparer.OrdinalIgnoreCase) { dto.Id };
processedUpdated = true;
}
else if (modified == currentMaxModified)
{
currentProcessedIds.Add(dto.Id);
processedUpdated = true;
}
if (_options.RequestDelay > TimeSpan.Zero)
{
try
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
catch (TaskCanceledException)
{
break;
}
}
}
if (processedUpdated && currentMaxModified != DateTimeOffset.MinValue)
{
cursor = cursor.WithLastModified(ecosystem, currentMaxModified, currentProcessedIds);
}
else if (processedUpdated && existingLastModified.HasValue)
{
cursor = cursor.WithLastModified(ecosystem, existingLastModified.Value, currentProcessedIds);
}
var etag = response.Headers.ETag?.Tag;
var lastModifiedHeader = response.Content.Headers.LastModified;
cursor = cursor.WithArchiveMetadata(ecosystem, etag, lastModifiedHeader);
return (cursor, newDocuments);
}
private Uri BuildArchiveUri(string ecosystem)
{
var trimmed = ecosystem.Trim('/');
var baseUri = _options.BaseUri;
var builder = new UriBuilder(baseUri);
var path = builder.Path;
if (!path.EndsWith('/'))
{
path += "/";
}
path += $"{trimmed}/{_options.ArchiveFileName}";
builder.Path = path;
return builder.Uri;
}
private static string BuildDocumentUri(string ecosystem, string vulnerabilityId)
{
var safeId = vulnerabilityId.Replace(' ', '-');
return $"https://osv-vulnerabilities.storage.googleapis.com/{ecosystem}/{safeId}.json";
}
}
@@ -1,439 +1,439 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Vndr.Apple.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.PsirtFlags;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Vndr.Apple;
public sealed class AppleConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
PropertyNameCaseInsensitive = true,
};
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly IPsirtFlagStore _psirtFlagStore;
private readonly ISourceStateRepository _stateRepository;
private readonly AppleOptions _options;
private readonly AppleDiagnostics _diagnostics;
private readonly TimeProvider _timeProvider;
private readonly ILogger<AppleConnector> _logger;
public AppleConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
IPsirtFlagStore psirtFlagStore,
ISourceStateRepository stateRepository,
AppleDiagnostics diagnostics,
IOptions<AppleOptions> options,
TimeProvider? timeProvider,
ILogger<AppleConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_psirtFlagStore = psirtFlagStore ?? throw new ArgumentNullException(nameof(psirtFlagStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => VndrAppleConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var processedIds = cursor.ProcessedIds.ToHashSet(StringComparer.OrdinalIgnoreCase);
var maxPosted = cursor.LastPosted ?? DateTimeOffset.MinValue;
var baseline = cursor.LastPosted?.Add(-_options.ModifiedTolerance) ?? _timeProvider.GetUtcNow().Add(-_options.InitialBackfill);
SourceFetchContentResult indexResult;
try
{
var request = new SourceFetchRequest(AppleOptions.HttpClientName, SourceName, _options.SoftwareLookupUri!)
{
AcceptHeaders = new[] { "application/json", "application/vnd.apple.security+json;q=0.9" },
};
indexResult = await _fetchService.FetchContentAsync(request, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "Apple software index fetch failed from {Uri}", _options.SoftwareLookupUri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (!indexResult.IsSuccess || indexResult.Content is null)
{
if (indexResult.IsNotModified)
{
_diagnostics.FetchUnchanged();
}
await UpdateCursorAsync(cursor, cancellationToken).ConfigureAwait(false);
return;
}
var indexEntries = AppleIndexParser.Parse(indexResult.Content, _options.AdvisoryBaseUri!);
if (indexEntries.Count == 0)
{
await UpdateCursorAsync(cursor, cancellationToken).ConfigureAwait(false);
return;
}
var allowlist = _options.AdvisoryAllowlist;
var blocklist = _options.AdvisoryBlocklist;
var ordered = indexEntries
.Where(entry => ShouldInclude(entry, allowlist, blocklist))
.OrderBy(entry => entry.PostingDate)
.ThenBy(entry => entry.ArticleId, StringComparer.OrdinalIgnoreCase)
.ToArray();
foreach (var entry in ordered)
{
cancellationToken.ThrowIfCancellationRequested();
if (entry.PostingDate < baseline)
{
continue;
}
if (cursor.LastPosted.HasValue
&& entry.PostingDate <= cursor.LastPosted.Value
&& processedIds.Contains(entry.UpdateId))
{
continue;
}
var metadata = BuildMetadata(entry);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, entry.DetailUri.ToString(), cancellationToken).ConfigureAwait(false);
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(
new SourceFetchRequest(AppleOptions.HttpClientName, SourceName, entry.DetailUri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[]
{
"text/html",
"application/xhtml+xml",
"text/plain;q=0.5"
},
},
cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "Apple advisory fetch failed for {Uri}", entry.DetailUri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (result.StatusCode == HttpStatusCode.NotModified)
{
_diagnostics.FetchUnchanged();
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
_diagnostics.FetchItem();
pendingDocuments.Add(result.Document.Id);
processedIds.Add(entry.UpdateId);
if (entry.PostingDate > maxPosted)
{
maxPosted = entry.PostingDate;
}
}
var updated = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithLastPosted(maxPosted == DateTimeOffset.MinValue ? cursor.LastPosted ?? DateTimeOffset.MinValue : maxPosted, processedIds);
await UpdateCursorAsync(updated, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_diagnostics.ParseFailure();
_logger.LogWarning("Apple document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
AppleDetailDto dto;
try
{
var content = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
var html = System.Text.Encoding.UTF8.GetString(content);
var entry = RehydrateIndexEntry(document);
dto = AppleDetailParser.Parse(html, entry);
}
catch (Exception ex)
{
_diagnostics.ParseFailure();
_logger.LogError(ex, "Apple parse failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var json = JsonSerializer.Serialize(dto, SerializerOptions);
var payload = BsonDocument.Parse(json);
var validatedAt = _timeProvider.GetUtcNow();
var existingDto = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
var dtoRecord = existingDto is null
? new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "apple.security.update.v1", payload, validatedAt)
: existingDto with
{
Payload = payload,
SchemaVersion = "apple.security.update.v1",
ValidatedAt = validatedAt,
};
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Add(document.Id);
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToHashSet();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null)
{
pendingMappings.Remove(documentId);
continue;
}
AppleDetailDto dto;
try
{
dto = JsonSerializer.Deserialize<AppleDetailDto>(dtoRecord.Payload.ToJson(), SerializerOptions)
?? throw new InvalidOperationException("Unable to deserialize Apple DTO.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Apple DTO deserialization failed for document {DocumentId}", document.Id);
pendingMappings.Remove(documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
continue;
}
var (advisory, flag) = AppleMapper.Map(dto, document, dtoRecord);
_diagnostics.MapAffectedCount(advisory.AffectedPackages.Length);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
if (flag is not null)
{
await _psirtFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
}
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private AppleIndexEntry RehydrateIndexEntry(DocumentRecord document)
{
var metadata = document.Metadata ?? new Dictionary<string, string>(StringComparer.Ordinal);
metadata.TryGetValue("apple.articleId", out var articleId);
metadata.TryGetValue("apple.updateId", out var updateId);
metadata.TryGetValue("apple.title", out var title);
metadata.TryGetValue("apple.postingDate", out var postingDateRaw);
metadata.TryGetValue("apple.detailUri", out var detailUriRaw);
metadata.TryGetValue("apple.rapidResponse", out var rapidRaw);
metadata.TryGetValue("apple.products", out var productsJson);
if (!DateTimeOffset.TryParse(postingDateRaw, out var postingDate))
{
postingDate = document.FetchedAt;
}
var detailUri = !string.IsNullOrWhiteSpace(detailUriRaw) && Uri.TryCreate(detailUriRaw, UriKind.Absolute, out var parsedUri)
? parsedUri
: new Uri(_options.AdvisoryBaseUri!, articleId ?? document.Uri);
var rapid = string.Equals(rapidRaw, "true", StringComparison.OrdinalIgnoreCase);
var products = DeserializeProducts(productsJson);
return new AppleIndexEntry(
UpdateId: string.IsNullOrWhiteSpace(updateId) ? articleId ?? document.Uri : updateId,
ArticleId: articleId ?? document.Uri,
Title: title ?? (document.Metadata is not null && document.Metadata.TryGetValue("apple.originalTitle", out var originalTitle) ? originalTitle : null) ?? "Apple Security Update",
PostingDate: postingDate.ToUniversalTime(),
DetailUri: detailUri,
Products: products,
IsRapidSecurityResponse: rapid);
}
private static IReadOnlyList<AppleIndexProduct> DeserializeProducts(string? json)
{
if (string.IsNullOrWhiteSpace(json))
{
return Array.Empty<AppleIndexProduct>();
}
try
{
var products = JsonSerializer.Deserialize<List<AppleIndexProduct>>(json, SerializerOptions);
return products is { Count: > 0 } ? products : Array.Empty<AppleIndexProduct>();
}
catch (JsonException)
{
return Array.Empty<AppleIndexProduct>();
}
}
private static Dictionary<string, string> BuildMetadata(AppleIndexEntry entry)
{
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["apple.articleId"] = entry.ArticleId,
["apple.updateId"] = entry.UpdateId,
["apple.title"] = entry.Title,
["apple.postingDate"] = entry.PostingDate.ToString("O"),
["apple.detailUri"] = entry.DetailUri.ToString(),
["apple.rapidResponse"] = entry.IsRapidSecurityResponse ? "true" : "false",
["apple.products"] = JsonSerializer.Serialize(entry.Products, SerializerOptions),
};
return metadata;
}
private static bool ShouldInclude(AppleIndexEntry entry, IReadOnlyCollection<string> allowlist, IReadOnlyCollection<string> blocklist)
{
if (allowlist.Count > 0 && !allowlist.Contains(entry.ArticleId))
{
return false;
}
if (blocklist.Count > 0 && blocklist.Contains(entry.ArticleId))
{
return false;
}
return true;
}
private async Task<AppleCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? AppleCursor.Empty : AppleCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(AppleCursor cursor, CancellationToken cancellationToken)
{
var document = cursor.ToBson();
await _stateRepository.UpdateCursorAsync(SourceName, document, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
}
{
continue;
}
var metadata = BuildMetadata(entry);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, entry.DetailUri.ToString(), cancellationToken).ConfigureAwait(false);
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(
new SourceFetchRequest(AppleOptions.HttpClientName, SourceName, entry.DetailUri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[]
{
"text/html",
"application/xhtml+xml",
"text/plain;q=0.5"
},
},
cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "Apple advisory fetch failed for {Uri}", entry.DetailUri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (result.StatusCode == HttpStatusCode.NotModified)
{
_diagnostics.FetchUnchanged();
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
_diagnostics.FetchItem();
pendingDocuments.Add(result.Document.Id);
processedIds.Add(entry.UpdateId);
if (entry.PostingDate > maxPosted)
{
maxPosted = entry.PostingDate;
}
}
var updated = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithLastPosted(maxPosted == DateTimeOffset.MinValue ? cursor.LastPosted ?? DateTimeOffset.MinValue : maxPosted, processedIds);
await UpdateCursorAsync(updated, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_diagnostics.ParseFailure();
_logger.LogWarning("Apple document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
AppleDetailDto dto;
try
{
var content = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
var html = System.Text.Encoding.UTF8.GetString(content);
var entry = RehydrateIndexEntry(document);
dto = AppleDetailParser.Parse(html, entry);
}
catch (Exception ex)
{
_diagnostics.ParseFailure();
_logger.LogError(ex, "Apple parse failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var json = JsonSerializer.Serialize(dto, SerializerOptions);
var payload = BsonDocument.Parse(json);
var validatedAt = _timeProvider.GetUtcNow();
var existingDto = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
var dtoRecord = existingDto is null
? new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "apple.security.update.v1", payload, validatedAt)
: existingDto with
{
Payload = payload,
SchemaVersion = "apple.security.update.v1",
ValidatedAt = validatedAt,
};
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
pendingMappings.Add(document.Id);
}
var updatedCursor = cursor
.WithPendingDocuments(remainingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToHashSet();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null)
{
pendingMappings.Remove(documentId);
continue;
}
AppleDetailDto dto;
try
{
dto = JsonSerializer.Deserialize<AppleDetailDto>(dtoRecord.Payload.ToJson(), SerializerOptions)
?? throw new InvalidOperationException("Unable to deserialize Apple DTO.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Apple DTO deserialization failed for document {DocumentId}", document.Id);
pendingMappings.Remove(documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
continue;
}
var (advisory, flag) = AppleMapper.Map(dto, document, dtoRecord);
_diagnostics.MapAffectedCount(advisory.AffectedPackages.Length);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
if (flag is not null)
{
await _psirtFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
}
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private AppleIndexEntry RehydrateIndexEntry(DocumentRecord document)
{
var metadata = document.Metadata ?? new Dictionary<string, string>(StringComparer.Ordinal);
metadata.TryGetValue("apple.articleId", out var articleId);
metadata.TryGetValue("apple.updateId", out var updateId);
metadata.TryGetValue("apple.title", out var title);
metadata.TryGetValue("apple.postingDate", out var postingDateRaw);
metadata.TryGetValue("apple.detailUri", out var detailUriRaw);
metadata.TryGetValue("apple.rapidResponse", out var rapidRaw);
metadata.TryGetValue("apple.products", out var productsJson);
if (!DateTimeOffset.TryParse(postingDateRaw, out var postingDate))
{
postingDate = document.FetchedAt;
}
var detailUri = !string.IsNullOrWhiteSpace(detailUriRaw) && Uri.TryCreate(detailUriRaw, UriKind.Absolute, out var parsedUri)
? parsedUri
: new Uri(_options.AdvisoryBaseUri!, articleId ?? document.Uri);
var rapid = string.Equals(rapidRaw, "true", StringComparison.OrdinalIgnoreCase);
var products = DeserializeProducts(productsJson);
return new AppleIndexEntry(
UpdateId: string.IsNullOrWhiteSpace(updateId) ? articleId ?? document.Uri : updateId,
ArticleId: articleId ?? document.Uri,
            Title: title ?? (metadata.TryGetValue("apple.originalTitle", out var originalTitle) ? originalTitle : null) ?? "Apple Security Update",
PostingDate: postingDate.ToUniversalTime(),
DetailUri: detailUri,
Products: products,
IsRapidSecurityResponse: rapid);
}
private static IReadOnlyList<AppleIndexProduct> DeserializeProducts(string? json)
{
if (string.IsNullOrWhiteSpace(json))
{
return Array.Empty<AppleIndexProduct>();
}
try
{
var products = JsonSerializer.Deserialize<List<AppleIndexProduct>>(json, SerializerOptions);
return products is { Count: > 0 } ? products : Array.Empty<AppleIndexProduct>();
}
catch (JsonException)
{
return Array.Empty<AppleIndexProduct>();
}
}
private static Dictionary<string, string> BuildMetadata(AppleIndexEntry entry)
{
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["apple.articleId"] = entry.ArticleId,
["apple.updateId"] = entry.UpdateId,
["apple.title"] = entry.Title,
["apple.postingDate"] = entry.PostingDate.ToString("O"),
["apple.detailUri"] = entry.DetailUri.ToString(),
["apple.rapidResponse"] = entry.IsRapidSecurityResponse ? "true" : "false",
["apple.products"] = JsonSerializer.Serialize(entry.Products, SerializerOptions),
};
return metadata;
}
private static bool ShouldInclude(AppleIndexEntry entry, IReadOnlyCollection<string> allowlist, IReadOnlyCollection<string> blocklist)
{
if (allowlist.Count > 0 && !allowlist.Contains(entry.ArticleId))
{
return false;
}
if (blocklist.Count > 0 && blocklist.Contains(entry.ArticleId))
{
return false;
}
return true;
}
private async Task<AppleCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? AppleCursor.Empty : AppleCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(AppleCursor cursor, CancellationToken cancellationToken)
{
var document = cursor.ToBson();
await _stateRepository.UpdateCursorAsync(SourceName, document, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
}

using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Common.Json;
using StellaOps.Concelier.Connector.Vndr.Chromium.Configuration;
using StellaOps.Concelier.Connector.Vndr.Chromium.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.PsirtFlags;
using StellaOps.Plugin;
using Json.Schema;
namespace StellaOps.Concelier.Connector.Vndr.Chromium;
public sealed class ChromiumConnector : IFeedConnector
{
private static readonly JsonSchema Schema = ChromiumSchemaProvider.Schema;
private static readonly JsonSerializerOptions SerializerOptions = new()
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly ChromiumFeedLoader _feedLoader;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly IPsirtFlagStore _psirtFlagStore;
private readonly ISourceStateRepository _stateRepository;
private readonly IJsonSchemaValidator _schemaValidator;
private readonly ChromiumOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ChromiumDiagnostics _diagnostics;
private readonly ILogger<ChromiumConnector> _logger;
public ChromiumConnector(
ChromiumFeedLoader feedLoader,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
IPsirtFlagStore psirtFlagStore,
ISourceStateRepository stateRepository,
IJsonSchemaValidator schemaValidator,
IOptions<ChromiumOptions> options,
TimeProvider? timeProvider,
ChromiumDiagnostics diagnostics,
ILogger<ChromiumConnector> logger)
{
_feedLoader = feedLoader ?? throw new ArgumentNullException(nameof(feedLoader));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_psirtFlagStore = psirtFlagStore ?? throw new ArgumentNullException(nameof(psirtFlagStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_schemaValidator = schemaValidator ?? throw new ArgumentNullException(nameof(schemaValidator));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => VndrChromiumConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var (windowStart, windowEnd) = CalculateWindow(cursor, now);
ProvenanceDiagnostics.ReportResumeWindow(SourceName, windowStart, _logger);
IReadOnlyList<ChromiumFeedEntry> feedEntries;
_diagnostics.FetchAttempt();
try
{
feedEntries = await _feedLoader.LoadAsync(windowStart, windowEnd, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Chromium feed load failed {Start}-{End}", windowStart, windowEnd);
_diagnostics.FetchFailure();
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
var fetchCache = new Dictionary<string, ChromiumFetchCacheEntry>(cursor.FetchCache, StringComparer.Ordinal);
var touchedResources = new HashSet<string>(StringComparer.Ordinal);
var candidates = feedEntries
.Where(static entry => entry.IsSecurityUpdate())
.OrderBy(static entry => entry.Published)
.ToArray();
if (candidates.Length == 0)
{
var untouched = cursor
.WithLastPublished(cursor.LastPublished ?? windowEnd)
.WithFetchCache(fetchCache);
await UpdateCursorAsync(untouched, cancellationToken).ConfigureAwait(false);
return;
}
var pendingDocuments = cursor.PendingDocuments.ToList();
var maxPublished = cursor.LastPublished;
foreach (var entry in candidates)
{
try
{
var cacheKey = entry.DetailUri.ToString();
touchedResources.Add(cacheKey);
var metadata = ChromiumDocumentMetadata.CreateMetadata(entry.PostId, entry.Title, entry.Published, entry.Updated, entry.Summary);
var request = new SourceFetchRequest(ChromiumOptions.HttpClientName, SourceName, entry.DetailUri)
{
Metadata = metadata,
AcceptHeaders = new[] { "text/html", "application/xhtml+xml", "text/plain;q=0.5" },
};
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (!result.IsSuccess || result.Document is null)
{
continue;
}
if (cursor.TryGetFetchCache(cacheKey, out var cached) && string.Equals(cached.Sha256, result.Document.Sha256, StringComparison.OrdinalIgnoreCase))
{
_diagnostics.FetchUnchanged();
fetchCache[cacheKey] = new ChromiumFetchCacheEntry(result.Document.Sha256);
await _documentStore.UpdateStatusAsync(result.Document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
if (!maxPublished.HasValue || entry.Published > maxPublished)
{
maxPublished = entry.Published;
}
continue;
}
_diagnostics.FetchDocument();
if (!pendingDocuments.Contains(result.Document.Id))
{
pendingDocuments.Add(result.Document.Id);
}
if (!maxPublished.HasValue || entry.Published > maxPublished)
{
maxPublished = entry.Published;
}
fetchCache[cacheKey] = new ChromiumFetchCacheEntry(result.Document.Sha256);
}
catch (Exception ex)
{
_logger.LogError(ex, "Chromium fetch failed for {Uri}", entry.DetailUri);
_diagnostics.FetchFailure();
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
}
if (touchedResources.Count > 0)
{
var keysToRemove = fetchCache.Keys.Where(key => !touchedResources.Contains(key)).ToArray();
foreach (var key in keysToRemove)
{
fetchCache.Remove(key);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(cursor.PendingMappings)
.WithLastPublished(maxPublished ?? cursor.LastPublished ?? windowEnd)
.WithFetchCache(fetchCache);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var pendingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("Chromium document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
ChromiumDto dto;
try
{
var metadata = ChromiumDocumentMetadata.FromDocument(document);
var content = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
var html = Encoding.UTF8.GetString(content);
dto = ChromiumParser.Parse(html, metadata);
}
catch (Exception ex)
{
_logger.LogError(ex, "Chromium parse failed for {Uri}", document.Uri);
_diagnostics.ParseFailure();
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var json = JsonSerializer.Serialize(dto, SerializerOptions);
using var jsonDocument = JsonDocument.Parse(json);
try
{
_schemaValidator.Validate(jsonDocument, Schema, dto.PostId);
}
catch (StellaOps.Concelier.Connector.Common.Json.JsonSchemaValidationException ex)
{
_logger.LogError(ex, "Chromium schema validation failed for {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var payload = BsonDocument.Parse(json);
var existingDto = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
var validatedAt = _timeProvider.GetUtcNow();
var dtoRecord = existingDto is null
? new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "chromium.post.v1", payload, validatedAt)
: existingDto with
{
Payload = payload,
SchemaVersion = "chromium.post.v1",
ValidatedAt = validatedAt,
};
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
_diagnostics.ParseSuccess();
pendingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var json = dtoRecord.Payload.ToJson(new JsonWriterSettings { OutputMode = JsonOutputMode.RelaxedExtendedJson });
var dto = JsonSerializer.Deserialize<ChromiumDto>(json, SerializerOptions);
if (dto is null)
{
_logger.LogWarning("Chromium DTO deserialization failed for {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var recordedAt = _timeProvider.GetUtcNow();
var (advisory, flag) = ChromiumMapper.Map(dto, SourceName, recordedAt);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _psirtFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
_diagnostics.MapSuccess();
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<ChromiumCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var record = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return ChromiumCursor.FromBsonDocument(record?.Cursor);
}
private async Task UpdateCursorAsync(ChromiumCursor cursor, CancellationToken cancellationToken)
{
var completedAt = _timeProvider.GetUtcNow();
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), completedAt, cancellationToken).ConfigureAwait(false);
}
private (DateTimeOffset start, DateTimeOffset end) CalculateWindow(ChromiumCursor cursor, DateTimeOffset now)
{
var lastPublished = cursor.LastPublished ?? now - _options.InitialBackfill;
var start = lastPublished - _options.WindowOverlap;
var backfill = now - _options.InitialBackfill;
if (start < backfill)
{
start = backfill;
}
var end = now;
if (end <= start)
{
end = start.AddHours(1);
}
return (start, end);
}
}
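The `CalculateWindow` helper above clamps the fetch window to the configured backfill horizon while re-reading a small overlap behind the cursor. A minimal standalone sketch of the same arithmetic (the option names `initialBackfill` and `windowOverlap` stand in for the connector's `ChromiumOptions` values and are assumptions here):

```csharp
using System;

// Sketch of the sliding-window calculation used by CalculateWindow.
static (DateTimeOffset Start, DateTimeOffset End) CalculateWindow(
    DateTimeOffset? lastPublished,
    DateTimeOffset now,
    TimeSpan initialBackfill,
    TimeSpan windowOverlap)
{
    // Re-read a small overlap behind the last published entry so that
    // late-arriving posts near the boundary are not missed.
    var start = (lastPublished ?? now - initialBackfill) - windowOverlap;

    // Never reach further back than the backfill horizon.
    var backfill = now - initialBackfill;
    if (start < backfill)
    {
        start = backfill;
    }

    // Guarantee a non-empty window even if the cursor is ahead of "now".
    var end = now <= start ? start.AddHours(1) : now;
    return (start, end);
}
```

The overlap trades a little duplicate fetching for resilience against feed entries published out of order near the window edge; the fetch cache then deduplicates unchanged documents by SHA-256.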


@@ -163,7 +163,7 @@ public sealed class CiscoConnector : IFeedConnector
BuildMetadata(advisory),
Etag: null,
LastModified: advisory.LastUpdated ?? advisory.FirstPublished ?? now,
- GridFsId: gridFsId,
+ PayloadId: gridFsId,
ExpiresAt: null);
var upserted = await _documentStore.UpsertAsync(record, cancellationToken).ConfigureAwait(false);
@@ -259,7 +259,7 @@ public sealed class CiscoConnector : IFeedConnector
continue;
}
- if (!document.GridFsId.HasValue)
+ if (!document.PayloadId.HasValue)
{
_diagnostics.ParseFailure();
_logger.LogWarning("Cisco document {DocumentId} missing GridFS payload", documentId);
@@ -273,7 +273,7 @@ public sealed class CiscoConnector : IFeedConnector
byte[] payload;
try
{
- payload = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
+ payload = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{


@@ -133,7 +133,7 @@ public sealed class MsrcConnector : IFeedConnector
}
_diagnostics.DetailFetchAttempt();
- if (existing?.GridFsId is { } oldGridId)
+ if (existing?.PayloadId is { } oldGridId)
{
await _rawDocumentStorage.DeleteAsync(oldGridId, cancellationToken).ConfigureAwait(false);
}
@@ -238,7 +238,7 @@ public sealed class MsrcConnector : IFeedConnector
continue;
}
- if (!document.GridFsId.HasValue)
+ if (!document.PayloadId.HasValue)
{
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
@@ -250,7 +250,7 @@ public sealed class MsrcConnector : IFeedConnector
byte[] payload;
try
{
- payload = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
+ payload = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{

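The hunks above mechanically rename `GridFsId` to `PayloadId` on the stored document record. A minimal sketch of why the rename stays mechanical with a positional record (the record shape here is hypothetical and much smaller than the real `DocumentRecord`):

```csharp
using System;

// Hypothetical before/after of the storage record. Because only the
// property name changes, every call site updates the same way:
// `with` initializers, `.HasValue` checks, and `.Value` reads.
public sealed record DocumentRecord(Guid Id, Guid? PayloadId /* was GridFsId */);

static bool HasPayload(DocumentRecord document)
    => document.PayloadId.HasValue; // previously document.GridFsId.HasValue
```

Keeping the nullable `Guid?` shape means connectors that never uploaded a payload (or whose upload failed) still round-trip cleanly through the status transitions.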

@@ -1,366 +1,366 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Vndr.Oracle.Configuration;
using StellaOps.Concelier.Connector.Vndr.Oracle.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.PsirtFlags;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Vndr.Oracle;
public sealed class OracleConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new()
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly IPsirtFlagStore _psirtFlagStore;
private readonly ISourceStateRepository _stateRepository;
private readonly OracleCalendarFetcher _calendarFetcher;
private readonly OracleOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<OracleConnector> _logger;
public OracleConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
IPsirtFlagStore psirtFlagStore,
ISourceStateRepository stateRepository,
OracleCalendarFetcher calendarFetcher,
IOptions<OracleOptions> options,
TimeProvider? timeProvider,
ILogger<OracleConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_psirtFlagStore = psirtFlagStore ?? throw new ArgumentNullException(nameof(psirtFlagStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_calendarFetcher = calendarFetcher ?? throw new ArgumentNullException(nameof(calendarFetcher));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => VndrOracleConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var pendingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
var fetchCache = new Dictionary<string, OracleFetchCacheEntry>(cursor.FetchCache, StringComparer.OrdinalIgnoreCase);
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var now = _timeProvider.GetUtcNow();
var advisoryUris = await ResolveAdvisoryUrisAsync(cancellationToken).ConfigureAwait(false);
foreach (var uri in advisoryUris)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
var cacheKey = uri.AbsoluteUri;
touchedResources.Add(cacheKey);
var advisoryId = DeriveAdvisoryId(uri);
var title = advisoryId.Replace('-', ' ');
var published = now;
var metadata = OracleDocumentMetadata.CreateMetadata(advisoryId, title, published);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, uri.ToString(), cancellationToken).ConfigureAwait(false);
var request = new SourceFetchRequest(OracleOptions.HttpClientName, SourceName, uri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[] { "text/html", "application/xhtml+xml", "text/plain;q=0.5" },
};
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (!result.IsSuccess || result.Document is null)
{
continue;
}
var cacheEntry = OracleFetchCacheEntry.FromDocument(result.Document);
if (existing is not null
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal)
&& cursor.TryGetFetchCache(cacheKey, out var cached)
&& cached.Matches(result.Document))
{
_logger.LogDebug("Oracle advisory {AdvisoryId} unchanged; skipping parse/map", advisoryId);
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(result.Document.Id);
pendingMappings.Remove(result.Document.Id);
fetchCache[cacheKey] = cacheEntry;
continue;
}
fetchCache[cacheKey] = cacheEntry;
if (!pendingDocuments.Contains(result.Document.Id))
{
pendingDocuments.Add(result.Document.Id);
}
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Oracle fetch failed for {Uri}", uri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
}
if (fetchCache.Count > 0 && touchedResources.Count > 0)
{
var stale = fetchCache.Keys.Where(key => !touchedResources.Contains(key)).ToArray();
foreach (var key in stale)
{
fetchCache.Remove(key);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithFetchCache(fetchCache)
.WithLastProcessed(now);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var pendingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("Oracle document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
OracleDto dto;
try
{
var metadata = OracleDocumentMetadata.FromDocument(document);
var content = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
var html = System.Text.Encoding.UTF8.GetString(content);
dto = OracleParser.Parse(html, metadata);
}
catch (Exception ex)
{
_logger.LogError(ex, "Oracle parse failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!OracleDtoValidator.TryNormalize(dto, out var normalized, out var validationError))
{
_logger.LogWarning("Oracle validation failed for document {DocumentId}: {Reason}", document.Id, validationError ?? "unknown");
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
dto = normalized;
var json = JsonSerializer.Serialize(dto, SerializerOptions);
var payload = BsonDocument.Parse(json);
var validatedAt = _timeProvider.GetUtcNow();
var existingDto = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
var dtoRecord = existingDto is null
? new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "oracle.advisory.v1", payload, validatedAt)
: existingDto with
{
Payload = payload,
SchemaVersion = "oracle.advisory.v1",
ValidatedAt = validatedAt,
};
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
OracleDto? dto;
try
{
var json = dtoRecord.Payload.ToJson();
dto = JsonSerializer.Deserialize<OracleDto>(json, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "Oracle DTO deserialization failed for document {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (dto is null)
{
_logger.LogWarning("Oracle DTO payload deserialized as null for document {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var mappedAt = _timeProvider.GetUtcNow();
var (advisory, flag) = OracleMapper.Map(dto, document, dtoRecord, SourceName, mappedAt);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _psirtFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<OracleCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var record = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return OracleCursor.FromBson(record?.Cursor);
}
private async Task UpdateCursorAsync(OracleCursor cursor, CancellationToken cancellationToken)
{
var completedAt = _timeProvider.GetUtcNow();
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), completedAt, cancellationToken).ConfigureAwait(false);
}
private async Task<IReadOnlyCollection<Uri>> ResolveAdvisoryUrisAsync(CancellationToken cancellationToken)
{
var uris = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
foreach (var uri in _options.AdvisoryUris)
{
if (uri is not null)
{
uris.Add(uri.AbsoluteUri);
}
}
var calendarUris = await _calendarFetcher.GetAdvisoryUrisAsync(cancellationToken).ConfigureAwait(false);
foreach (var uri in calendarUris)
{
uris.Add(uri.AbsoluteUri);
}
return uris
.Select(static value => new Uri(value, UriKind.Absolute))
.OrderBy(static value => value.AbsoluteUri, StringComparer.OrdinalIgnoreCase)
.ToArray();
}
private static string DeriveAdvisoryId(Uri uri)
{
var segments = uri.Segments;
if (segments.Length == 0)
{
return uri.AbsoluteUri;
}
var slug = segments[^1].Trim('/');
if (string.IsNullOrWhiteSpace(slug))
{
return uri.AbsoluteUri;
}
return slug.Replace('.', '-');
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Vndr.Oracle.Configuration;
using StellaOps.Concelier.Connector.Vndr.Oracle.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.PsirtFlags;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Vndr.Oracle;
public sealed class OracleConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new()
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly IPsirtFlagStore _psirtFlagStore;
private readonly ISourceStateRepository _stateRepository;
private readonly OracleCalendarFetcher _calendarFetcher;
private readonly OracleOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<OracleConnector> _logger;
public OracleConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
IPsirtFlagStore psirtFlagStore,
ISourceStateRepository stateRepository,
OracleCalendarFetcher calendarFetcher,
IOptions<OracleOptions> options,
TimeProvider? timeProvider,
ILogger<OracleConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_psirtFlagStore = psirtFlagStore ?? throw new ArgumentNullException(nameof(psirtFlagStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_calendarFetcher = calendarFetcher ?? throw new ArgumentNullException(nameof(calendarFetcher));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => VndrOracleConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var pendingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
var fetchCache = new Dictionary<string, OracleFetchCacheEntry>(cursor.FetchCache, StringComparer.OrdinalIgnoreCase);
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var now = _timeProvider.GetUtcNow();
var advisoryUris = await ResolveAdvisoryUrisAsync(cancellationToken).ConfigureAwait(false);
foreach (var uri in advisoryUris)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
var cacheKey = uri.AbsoluteUri;
touchedResources.Add(cacheKey);
var advisoryId = DeriveAdvisoryId(uri);
var title = advisoryId.Replace('-', ' ');
var published = now;
var metadata = OracleDocumentMetadata.CreateMetadata(advisoryId, title, published);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, uri.ToString(), cancellationToken).ConfigureAwait(false);
var request = new SourceFetchRequest(OracleOptions.HttpClientName, SourceName, uri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[] { "text/html", "application/xhtml+xml", "text/plain;q=0.5" },
};
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (!result.IsSuccess || result.Document is null)
{
continue;
}
var cacheEntry = OracleFetchCacheEntry.FromDocument(result.Document);
if (existing is not null
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal)
&& cursor.TryGetFetchCache(cacheKey, out var cached)
&& cached.Matches(result.Document))
{
_logger.LogDebug("Oracle advisory {AdvisoryId} unchanged; skipping parse/map", advisoryId);
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(result.Document.Id);
pendingMappings.Remove(result.Document.Id);
fetchCache[cacheKey] = cacheEntry;
continue;
}
fetchCache[cacheKey] = cacheEntry;
if (!pendingDocuments.Contains(result.Document.Id))
{
pendingDocuments.Add(result.Document.Id);
}
if (_options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
}
catch (Exception ex)
{
_logger.LogError(ex, "Oracle fetch failed for {Uri}", uri);
await _stateRepository.MarkFailureAsync(SourceName, _timeProvider.GetUtcNow(), TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
}
if (fetchCache.Count > 0 && touchedResources.Count > 0)
{
var stale = fetchCache.Keys.Where(key => !touchedResources.Contains(key)).ToArray();
foreach (var key in stale)
{
fetchCache.Remove(key);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithFetchCache(fetchCache)
.WithLastProcessed(now);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var pendingDocuments = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("Oracle document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
OracleDto dto;
try
{
var metadata = OracleDocumentMetadata.FromDocument(document);
var content = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
var html = System.Text.Encoding.UTF8.GetString(content);
dto = OracleParser.Parse(html, metadata);
}
catch (Exception ex)
{
_logger.LogError(ex, "Oracle parse failed for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!OracleDtoValidator.TryNormalize(dto, out var normalized, out var validationError))
{
_logger.LogWarning("Oracle validation failed for document {DocumentId}: {Reason}", document.Id, validationError ?? "unknown");
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
dto = normalized;
var json = JsonSerializer.Serialize(dto, SerializerOptions);
var payload = BsonDocument.Parse(json);
var validatedAt = _timeProvider.GetUtcNow();
var existingDto = await _dtoStore.FindByDocumentIdAsync(document.Id, cancellationToken).ConfigureAwait(false);
var dtoRecord = existingDto is null
? new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "oracle.advisory.v1", payload, validatedAt)
: existingDto with
{
Payload = payload,
SchemaVersion = "oracle.advisory.v1",
ValidatedAt = validatedAt,
};
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
pendingDocuments.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
OracleDto? dto;
try
{
var json = dtoRecord.Payload.ToJson();
dto = JsonSerializer.Deserialize<OracleDto>(json, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "Oracle DTO deserialization failed for document {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (dto is null)
{
_logger.LogWarning("Oracle DTO payload deserialized as null for document {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var mappedAt = _timeProvider.GetUtcNow();
var (advisory, flag) = OracleMapper.Map(dto, document, dtoRecord, SourceName, mappedAt);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _psirtFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<OracleCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var record = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return OracleCursor.FromBson(record?.Cursor);
}
private async Task UpdateCursorAsync(OracleCursor cursor, CancellationToken cancellationToken)
{
var completedAt = _timeProvider.GetUtcNow();
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), completedAt, cancellationToken).ConfigureAwait(false);
}
private async Task<IReadOnlyCollection<Uri>> ResolveAdvisoryUrisAsync(CancellationToken cancellationToken)
{
var uris = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
foreach (var uri in _options.AdvisoryUris)
{
if (uri is not null)
{
uris.Add(uri.AbsoluteUri);
}
}
var calendarUris = await _calendarFetcher.GetAdvisoryUrisAsync(cancellationToken).ConfigureAwait(false);
foreach (var uri in calendarUris)
{
uris.Add(uri.AbsoluteUri);
}
return uris
.Select(static value => new Uri(value, UriKind.Absolute))
.OrderBy(static value => value.AbsoluteUri, StringComparer.OrdinalIgnoreCase)
.ToArray();
}
private static string DeriveAdvisoryId(Uri uri)
{
var segments = uri.Segments;
if (segments.Length == 0)
{
return uri.AbsoluteUri;
}
var slug = segments[^1].Trim('/');
if (string.IsNullOrWhiteSpace(slug))
{
return uri.AbsoluteUri;
}
return slug.Replace('.', '-');
}
}
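The Oracle connector above derives a stable advisory identifier from the last path segment of the advisory URL, folding dots into dashes. A standalone sketch of that slug logic (the console harness around the helper is illustrative and not part of the connector):

```csharp
using System;

static string DeriveAdvisoryId(Uri uri)
{
    // Fall back to the full URI when no usable path segment exists.
    var segments = uri.Segments;
    if (segments.Length == 0)
    {
        return uri.AbsoluteUri;
    }
    var slug = segments[^1].Trim('/');
    if (string.IsNullOrWhiteSpace(slug))
    {
        return uri.AbsoluteUri;
    }
    // Dots become dashes so the slug is safe to use as an identifier.
    return slug.Replace('.', '-');
}

Console.WriteLine(DeriveAdvisoryId(new Uri("https://www.oracle.com/security-alerts/cpujan2025.html")));
// prints: cpujan2025-html
```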


@@ -1,454 +1,454 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Vndr.Vmware.Configuration;
using StellaOps.Concelier.Connector.Vndr.Vmware.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.PsirtFlags;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Vndr.Vmware;
public sealed class VmwareConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
PropertyNameCaseInsensitive = true,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly IHttpClientFactory _httpClientFactory;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly IPsirtFlagStore _psirtFlagStore;
private readonly VmwareOptions _options;
private readonly TimeProvider _timeProvider;
private readonly VmwareDiagnostics _diagnostics;
private readonly ILogger<VmwareConnector> _logger;
public VmwareConnector(
IHttpClientFactory httpClientFactory,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IPsirtFlagStore psirtFlagStore,
IOptions<VmwareOptions> options,
TimeProvider? timeProvider,
VmwareDiagnostics diagnostics,
ILogger<VmwareConnector> logger)
{
_httpClientFactory = httpClientFactory ?? throw new ArgumentNullException(nameof(httpClientFactory));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_psirtFlagStore = psirtFlagStore ?? throw new ArgumentNullException(nameof(psirtFlagStore));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => VmwareConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var fetchCache = new Dictionary<string, VmwareFetchCacheEntry>(cursor.FetchCache, StringComparer.OrdinalIgnoreCase);
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var remainingCapacity = _options.MaxAdvisoriesPerFetch;
IReadOnlyList<VmwareIndexItem> indexItems;
try
{
indexItems = await FetchIndexAsync(cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "Failed to retrieve VMware advisory index");
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (indexItems.Count == 0)
{
return;
}
var orderedItems = indexItems
.Where(static item => !string.IsNullOrWhiteSpace(item.Id) && !string.IsNullOrWhiteSpace(item.DetailUrl))
.OrderBy(static item => item.Modified ?? DateTimeOffset.MinValue)
.ThenBy(static item => item.Id, StringComparer.OrdinalIgnoreCase)
.ToArray();
var baseline = cursor.LastModified ?? now - _options.InitialBackfill;
var resumeStart = baseline - _options.ModifiedTolerance;
ProvenanceDiagnostics.ReportResumeWindow(SourceName, resumeStart, _logger);
var processedIds = new HashSet<string>(cursor.ProcessedIds, StringComparer.OrdinalIgnoreCase);
var maxModified = cursor.LastModified ?? DateTimeOffset.MinValue;
var processedUpdated = false;
foreach (var item in orderedItems)
{
if (remainingCapacity <= 0)
{
break;
}
cancellationToken.ThrowIfCancellationRequested();
var modified = (item.Modified ?? DateTimeOffset.MinValue).ToUniversalTime();
if (modified < baseline - _options.ModifiedTolerance)
{
continue;
}
if (cursor.LastModified.HasValue && modified < cursor.LastModified.Value - _options.ModifiedTolerance)
{
continue;
}
if (modified == cursor.LastModified && cursor.ProcessedIds.Contains(item.Id, StringComparer.OrdinalIgnoreCase))
{
continue;
}
if (!Uri.TryCreate(item.DetailUrl, UriKind.Absolute, out var detailUri))
{
_logger.LogWarning("VMware advisory {AdvisoryId} has invalid detail URL {Url}", item.Id, item.DetailUrl);
continue;
}
var cacheKey = detailUri.AbsoluteUri;
touchedResources.Add(cacheKey);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, cacheKey, cancellationToken).ConfigureAwait(false);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["vmware.id"] = item.Id,
["vmware.modified"] = modified.ToString("O"),
};
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(
new SourceFetchRequest(VmwareOptions.HttpClientName, SourceName, detailUri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[] { "application/json" },
},
cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "Failed to fetch VMware advisory {AdvisoryId}", item.Id);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (result.IsNotModified)
{
_diagnostics.FetchUnchanged();
if (existing is not null)
{
fetchCache[cacheKey] = VmwareFetchCacheEntry.FromDocument(existing);
pendingDocuments.Remove(existing.Id);
pendingMappings.Remove(existing.Id);
_logger.LogInformation("VMware advisory {AdvisoryId} returned 304 Not Modified", item.Id);
}
continue;
}
if (!result.IsSuccess || result.Document is null)
{
_diagnostics.FetchFailure();
continue;
}
remainingCapacity--;
if (modified > maxModified)
{
maxModified = modified;
processedIds.Clear();
processedUpdated = true;
}
if (modified == maxModified)
{
processedIds.Add(item.Id);
processedUpdated = true;
}
var cacheEntry = VmwareFetchCacheEntry.FromDocument(result.Document);
if (existing is not null
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal)
&& cursor.TryGetFetchCache(cacheKey, out var cachedEntry)
&& cachedEntry.Matches(result.Document))
{
_diagnostics.FetchUnchanged();
fetchCache[cacheKey] = cacheEntry;
pendingDocuments.Remove(result.Document.Id);
pendingMappings.Remove(result.Document.Id);
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
_logger.LogInformation("VMware advisory {AdvisoryId} unchanged; skipping reprocessing", item.Id);
continue;
}
_diagnostics.FetchItem();
fetchCache[cacheKey] = cacheEntry;
pendingDocuments.Add(result.Document.Id);
_logger.LogInformation(
"VMware advisory {AdvisoryId} fetched (documentId={DocumentId}, sha256={Sha})",
item.Id,
result.Document.Id,
result.Document.Sha256);
if (_options.RequestDelay > TimeSpan.Zero)
{
try
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
catch (TaskCanceledException)
{
break;
}
}
}
if (fetchCache.Count > 0 && touchedResources.Count > 0)
{
var stale = fetchCache.Keys.Where(key => !touchedResources.Contains(key)).ToArray();
foreach (var key in stale)
{
fetchCache.Remove(key);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithFetchCache(fetchCache);
if (processedUpdated)
{
updatedCursor = updatedCursor.WithLastModified(maxModified, processedIds);
}
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remaining = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remaining.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("VMware document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
_diagnostics.ParseFailure();
continue;
}
byte[] bytes;
try
{
bytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed downloading VMware document {DocumentId}", document.Id);
throw;
}
VmwareDetailDto? detail;
try
{
detail = JsonSerializer.Deserialize<VmwareDetailDto>(bytes, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogWarning(ex, "Failed to deserialize VMware advisory {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
_diagnostics.ParseFailure();
continue;
}
if (detail is null || string.IsNullOrWhiteSpace(detail.AdvisoryId))
{
_logger.LogWarning("VMware advisory document {DocumentId} contained empty payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
_diagnostics.ParseFailure();
continue;
}
var sanitized = JsonSerializer.Serialize(detail, SerializerOptions);
var payload = MongoDB.Bson.BsonDocument.Parse(sanitized);
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "vmware.v1", payload, _timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remaining)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var json = dto.Payload.ToJson(new JsonWriterSettings
{
OutputMode = JsonOutputMode.RelaxedExtendedJson,
});
VmwareDetailDto? detail;
try
{
detail = JsonSerializer.Deserialize<VmwareDetailDto>(json, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize VMware DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (detail is null || string.IsNullOrWhiteSpace(detail.AdvisoryId))
{
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var (advisory, flag) = VmwareMapper.Map(detail, document, dto);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _psirtFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
_diagnostics.MapAffectedCount(advisory.AffectedPackages.Length);
_logger.LogInformation(
"VMware advisory {AdvisoryId} mapped with {AffectedCount} affected packages",
detail.AdvisoryId,
advisory.AffectedPackages.Length);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<IReadOnlyList<VmwareIndexItem>> FetchIndexAsync(CancellationToken cancellationToken)
{
var client = _httpClientFactory.CreateClient(VmwareOptions.HttpClientName);
using var response = await client.GetAsync(_options.IndexUri, cancellationToken).ConfigureAwait(false);
response.EnsureSuccessStatusCode();
await using var stream = await response.Content.ReadAsStreamAsync(cancellationToken).ConfigureAwait(false);
var items = await JsonSerializer.DeserializeAsync<IReadOnlyList<VmwareIndexItem>>(stream, SerializerOptions, cancellationToken).ConfigureAwait(false);
return items ?? Array.Empty<VmwareIndexItem>();
}
private async Task<VmwareCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? VmwareCursor.Empty : VmwareCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(VmwareCursor cursor, CancellationToken cancellationToken)
{
var document = cursor.ToBsonDocument();
await _stateRepository.UpdateCursorAsync(SourceName, document, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Vndr.Vmware.Configuration;
using StellaOps.Concelier.Connector.Vndr.Vmware.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Storage.Mongo.PsirtFlags;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Vndr.Vmware;
public sealed class VmwareConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
PropertyNameCaseInsensitive = true,
DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
};
private readonly IHttpClientFactory _httpClientFactory;
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly IPsirtFlagStore _psirtFlagStore;
private readonly VmwareOptions _options;
private readonly TimeProvider _timeProvider;
private readonly VmwareDiagnostics _diagnostics;
private readonly ILogger<VmwareConnector> _logger;
public VmwareConnector(
IHttpClientFactory httpClientFactory,
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IPsirtFlagStore psirtFlagStore,
IOptions<VmwareOptions> options,
TimeProvider? timeProvider,
VmwareDiagnostics diagnostics,
ILogger<VmwareConnector> logger)
{
_httpClientFactory = httpClientFactory ?? throw new ArgumentNullException(nameof(httpClientFactory));
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_psirtFlagStore = psirtFlagStore ?? throw new ArgumentNullException(nameof(psirtFlagStore));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => VmwareConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var fetchCache = new Dictionary<string, VmwareFetchCacheEntry>(cursor.FetchCache, StringComparer.OrdinalIgnoreCase);
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var remainingCapacity = _options.MaxAdvisoriesPerFetch;
IReadOnlyList<VmwareIndexItem> indexItems;
try
{
indexItems = await FetchIndexAsync(cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "Failed to retrieve VMware advisory index");
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(10), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (indexItems.Count == 0)
{
return;
}
var orderedItems = indexItems
.Where(static item => !string.IsNullOrWhiteSpace(item.Id) && !string.IsNullOrWhiteSpace(item.DetailUrl))
.OrderBy(static item => item.Modified ?? DateTimeOffset.MinValue)
.ThenBy(static item => item.Id, StringComparer.OrdinalIgnoreCase)
.ToArray();
var baseline = cursor.LastModified ?? now - _options.InitialBackfill;
var resumeStart = baseline - _options.ModifiedTolerance;
ProvenanceDiagnostics.ReportResumeWindow(SourceName, resumeStart, _logger);
var processedIds = new HashSet<string>(cursor.ProcessedIds, StringComparer.OrdinalIgnoreCase);
var maxModified = cursor.LastModified ?? DateTimeOffset.MinValue;
var processedUpdated = false;
foreach (var item in orderedItems)
{
if (remainingCapacity <= 0)
{
break;
}
cancellationToken.ThrowIfCancellationRequested();
var modified = (item.Modified ?? DateTimeOffset.MinValue).ToUniversalTime();
if (modified < baseline - _options.ModifiedTolerance)
{
continue;
}
if (cursor.LastModified.HasValue && modified < cursor.LastModified.Value - _options.ModifiedTolerance)
{
continue;
}
if (modified == cursor.LastModified && cursor.ProcessedIds.Contains(item.Id, StringComparer.OrdinalIgnoreCase))
{
continue;
}
if (!Uri.TryCreate(item.DetailUrl, UriKind.Absolute, out var detailUri))
{
_logger.LogWarning("VMware advisory {AdvisoryId} has invalid detail URL {Url}", item.Id, item.DetailUrl);
continue;
}
var cacheKey = detailUri.AbsoluteUri;
touchedResources.Add(cacheKey);
var existing = await _documentStore.FindBySourceAndUriAsync(SourceName, cacheKey, cancellationToken).ConfigureAwait(false);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["vmware.id"] = item.Id,
["vmware.modified"] = modified.ToString("O"),
};
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(
new SourceFetchRequest(VmwareOptions.HttpClientName, SourceName, detailUri)
{
Metadata = metadata,
ETag = existing?.Etag,
LastModified = existing?.LastModified,
AcceptHeaders = new[] { "application/json" },
},
cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.FetchFailure();
_logger.LogError(ex, "Failed to fetch VMware advisory {AdvisoryId}", item.Id);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (result.IsNotModified)
{
_diagnostics.FetchUnchanged();
if (existing is not null)
{
fetchCache[cacheKey] = VmwareFetchCacheEntry.FromDocument(existing);
pendingDocuments.Remove(existing.Id);
pendingMappings.Remove(existing.Id);
_logger.LogInformation("VMware advisory {AdvisoryId} returned 304 Not Modified", item.Id);
}
continue;
}
if (!result.IsSuccess || result.Document is null)
{
_diagnostics.FetchFailure();
continue;
}
remainingCapacity--;
if (modified > maxModified)
{
maxModified = modified;
processedIds.Clear();
processedUpdated = true;
}
if (modified == maxModified)
{
processedIds.Add(item.Id);
processedUpdated = true;
}
var cacheEntry = VmwareFetchCacheEntry.FromDocument(result.Document);
if (existing is not null
&& string.Equals(existing.Status, DocumentStatuses.Mapped, StringComparison.Ordinal)
&& cursor.TryGetFetchCache(cacheKey, out var cachedEntry)
&& cachedEntry.Matches(result.Document))
{
_diagnostics.FetchUnchanged();
fetchCache[cacheKey] = cacheEntry;
pendingDocuments.Remove(result.Document.Id);
pendingMappings.Remove(result.Document.Id);
await _documentStore.UpdateStatusAsync(result.Document.Id, existing.Status, cancellationToken).ConfigureAwait(false);
_logger.LogInformation("VMware advisory {AdvisoryId} unchanged; skipping reprocessing", item.Id);
continue;
}
_diagnostics.FetchItem();
fetchCache[cacheKey] = cacheEntry;
pendingDocuments.Add(result.Document.Id);
_logger.LogInformation(
"VMware advisory {AdvisoryId} fetched (documentId={DocumentId}, sha256={Sha})",
item.Id,
result.Document.Id,
result.Document.Sha256);
if (_options.RequestDelay > TimeSpan.Zero)
{
try
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
catch (TaskCanceledException)
{
break;
}
}
}
if (fetchCache.Count > 0 && touchedResources.Count > 0)
{
var stale = fetchCache.Keys.Where(key => !touchedResources.Contains(key)).ToArray();
foreach (var key in stale)
{
fetchCache.Remove(key);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithFetchCache(fetchCache);
if (processedUpdated)
{
updatedCursor = updatedCursor.WithLastModified(maxModified, processedIds);
}
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remaining = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remaining.Remove(documentId);
continue;
}
if (!document.PayloadId.HasValue)
{
_logger.LogWarning("VMware document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
_diagnostics.ParseFailure();
continue;
}
byte[] bytes;
try
{
bytes = await _rawDocumentStorage.DownloadAsync(document.PayloadId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed downloading VMware document {DocumentId}", document.Id);
throw;
}
VmwareDetailDto? detail;
try
{
detail = JsonSerializer.Deserialize<VmwareDetailDto>(bytes, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogWarning(ex, "Failed to deserialize VMware advisory {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
_diagnostics.ParseFailure();
continue;
}
if (detail is null || string.IsNullOrWhiteSpace(detail.AdvisoryId))
{
_logger.LogWarning("VMware advisory document {DocumentId} contained empty payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
_diagnostics.ParseFailure();
continue;
}
var sanitized = JsonSerializer.Serialize(detail, SerializerOptions);
var payload = MongoDB.Bson.BsonDocument.Parse(sanitized);
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, "vmware.v1", payload, _timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remaining)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var json = dto.Payload.ToJson(new JsonWriterSettings
{
OutputMode = JsonOutputMode.RelaxedExtendedJson,
});
VmwareDetailDto? detail;
try
{
detail = JsonSerializer.Deserialize<VmwareDetailDto>(json, SerializerOptions);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize VMware DTO for document {DocumentId}", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
if (detail is null || string.IsNullOrWhiteSpace(detail.AdvisoryId))
{
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var (advisory, flag) = VmwareMapper.Map(detail, document, dto);
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _psirtFlagStore.UpsertAsync(flag, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
_diagnostics.MapAffectedCount(advisory.AffectedPackages.Length);
_logger.LogInformation(
"VMware advisory {AdvisoryId} mapped with {AffectedCount} affected packages",
detail.AdvisoryId,
advisory.AffectedPackages.Length);
pendingMappings.Remove(documentId);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<IReadOnlyList<VmwareIndexItem>> FetchIndexAsync(CancellationToken cancellationToken)
{
var client = _httpClientFactory.CreateClient(VmwareOptions.HttpClientName);
using var response = await client.GetAsync(_options.IndexUri, cancellationToken).ConfigureAwait(false);
response.EnsureSuccessStatusCode();
await using var stream = await response.Content.ReadAsStreamAsync(cancellationToken).ConfigureAwait(false);
var items = await JsonSerializer.DeserializeAsync<IReadOnlyList<VmwareIndexItem>>(stream, SerializerOptions, cancellationToken).ConfigureAwait(false);
return items ?? Array.Empty<VmwareIndexItem>();
}
private async Task<VmwareCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? VmwareCursor.Empty : VmwareCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(VmwareCursor cursor, CancellationToken cancellationToken)
{
var document = cursor.ToBsonDocument();
await _stateRepository.UpdateCursorAsync(SourceName, document, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
}
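Near the end of `FetchAsync`, cache entries for resources not seen in the current index pass are pruned so the cursor's fetch cache cannot grow without bound. A minimal sketch of that pruning step (the URLs and ETag values here are made up for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

var fetchCache = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    ["https://example.test/advisory-a"] = "etag-a",
    ["https://example.test/advisory-b"] = "etag-b",
};

// Resources actually encountered while walking the index this run.
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
    "https://example.test/advisory-a",
};

// Drop cache entries for anything the index no longer references.
var stale = fetchCache.Keys.Where(key => !touchedResources.Contains(key)).ToArray();
foreach (var key in stale)
{
    fetchCache.Remove(key);
}

Console.WriteLine(string.Join(",", fetchCache.Keys));
// prints: https://example.test/advisory-a
```

Snapshotting the stale keys into an array before removing them avoids mutating the dictionary while its key collection is being enumerated, which mirrors the connector's own pattern.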


@@ -10,6 +10,7 @@
<ItemGroup>
<ProjectReference Include="..\StellaOps.Concelier.Models\StellaOps.Concelier.Models.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Normalization\StellaOps.Concelier.Normalization.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj" />
<ProjectReference Include="../../../__Libraries/StellaOps.Plugin/StellaOps.Plugin.csproj" />
<ProjectReference Include="../../../__Libraries/StellaOps.DependencyInjection/StellaOps.DependencyInjection.csproj" />
<ProjectReference Include="../../../__Libraries/StellaOps.Cryptography/StellaOps.Cryptography.csproj" />
@@ -20,4 +21,4 @@
<PackageReference Include="Microsoft.Extensions.Options" Version="10.0.0" />
<PackageReference Include="Microsoft.Extensions.Options.ConfigurationExtensions" Version="10.0.0" />
</ItemGroup>
</Project>


@@ -10,6 +10,7 @@
<ItemGroup>
<ProjectReference Include="..\StellaOps.Concelier.Exporter.Json\StellaOps.Concelier.Exporter.Json.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Models\StellaOps.Concelier.Models.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj" />
<ProjectReference Include="../../../__Libraries/StellaOps.DependencyInjection/StellaOps.DependencyInjection.csproj" />
<ProjectReference Include="../../../__Libraries/StellaOps.Plugin/StellaOps.Plugin.csproj" />
</ItemGroup>
@@ -18,4 +19,4 @@
<PackageReference Include="Microsoft.Extensions.Options" Version="10.0.0" />
<PackageReference Include="Microsoft.Extensions.Options.ConfigurationExtensions" Version="10.0.0" />
</ItemGroup>
</Project>


@@ -24,6 +24,8 @@ namespace MongoDB.Bson
{
protected readonly object? _value;
public BsonValue(object? value) => _value = value;
internal object? RawValue => _value;
public static BsonValue Create(object? value) => BsonDocument.WrapExternal(value);
public virtual BsonType BsonType => _value switch
{
null => BsonType.Null,
@@ -59,12 +61,24 @@ namespace MongoDB.Bson
public class BsonInt64 : BsonValue { public BsonInt64(long value) : base(value) { } }
public class BsonDouble : BsonValue { public BsonDouble(double value) : base(value) { } }
public class BsonDateTime : BsonValue { public BsonDateTime(DateTime value) : base(value) { } }
public class BsonNull : BsonValue
{
private BsonNull() : base(null) { }
public static BsonNull Value { get; } = new();
}
public class BsonArray : BsonValue, IEnumerable<BsonValue>
{
private readonly List<BsonValue> _items = new();
public BsonArray() : base(null) { }
public BsonArray(IEnumerable<BsonValue> values) : this() => _items.AddRange(values);
public BsonArray(IEnumerable<object?> values) : this()
{
foreach (var value in values)
{
_items.Add(BsonDocument.WrapExternal(value));
}
}
public void Add(BsonValue value) => _items.Add(value);
public IEnumerator<BsonValue> GetEnumerator() => _items.GetEnumerator();
IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
@@ -93,6 +107,8 @@ namespace MongoDB.Bson
_ => new BsonValue(value)
};
internal static BsonValue WrapExternal(object? value) => Wrap(value);
public BsonValue this[string key]
{
get => _values[key];
@@ -104,6 +120,7 @@ namespace MongoDB.Bson
public bool TryGetValue(string key, out BsonValue value) => _values.TryGetValue(key, out value!);
public void Add(string key, BsonValue value) => _values[key] = value;
public void Add(string key, object? value) => _values[key] = Wrap(value);
public IEnumerator<KeyValuePair<string, BsonValue>> GetEnumerator() => _values.GetEnumerator();
IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
@@ -156,7 +173,7 @@ namespace MongoDB.Bson
{
BsonDocument doc => doc._values.ToDictionary(kvp => kvp.Key, kvp => Unwrap(kvp.Value)),
BsonArray array => array.Select(Unwrap).ToArray(),
_ => value._value
_ => value.RawValue
};
}
}


@@ -1,4 +1,5 @@
using System.Collections.Concurrent;
using System.IO;
using StellaOps.Concelier.Models;
namespace StellaOps.Concelier.Storage.Mongo
@@ -33,8 +34,9 @@ namespace StellaOps.Concelier.Storage.Mongo
IReadOnlyDictionary<string, string>? Metadata = null,
string? Etag = null,
DateTimeOffset? LastModified = null,
MongoDB.Bson.ObjectId? GridFsId = null,
DateTimeOffset? ExpiresAt = null);
Guid? PayloadId = null,
DateTimeOffset? ExpiresAt = null,
byte[]? Payload = null);
public interface IDocumentStore
{
@@ -85,7 +87,7 @@ namespace StellaOps.Concelier.Storage.Mongo
Guid DocumentId,
string SourceName,
string Format,
MongoDB.Bson.BsonDocument Payload,
string Payload,
DateTimeOffset CreatedAt);
public interface IDtoStore
@@ -113,40 +115,40 @@ namespace StellaOps.Concelier.Storage.Mongo
public sealed class RawDocumentStorage
{
private readonly ConcurrentDictionary<MongoDB.Bson.ObjectId, byte[]> _blobs = new();
private readonly ConcurrentDictionary<Guid, byte[]> _blobs = new();
public Task<MongoDB.Bson.ObjectId> UploadAsync(string sourceName, string uri, byte[] content, string? contentType, DateTimeOffset? expiresAt, CancellationToken cancellationToken)
public Task<Guid> UploadAsync(string sourceName, string uri, byte[] content, string? contentType, DateTimeOffset? expiresAt, CancellationToken cancellationToken)
{
var id = MongoDB.Bson.ObjectId.GenerateNewId();
var id = Guid.NewGuid();
_blobs[id] = content.ToArray();
return Task.FromResult(id);
}
public Task<MongoDB.Bson.ObjectId> UploadAsync(string sourceName, string uri, byte[] content, string? contentType, CancellationToken cancellationToken)
public Task<Guid> UploadAsync(string sourceName, string uri, byte[] content, string? contentType, CancellationToken cancellationToken)
=> UploadAsync(sourceName, uri, content, contentType, null, cancellationToken);
public Task<byte[]> DownloadAsync(MongoDB.Bson.ObjectId id, CancellationToken cancellationToken)
public Task<byte[]> DownloadAsync(Guid id, CancellationToken cancellationToken)
{
if (_blobs.TryGetValue(id, out var bytes))
{
return Task.FromResult(bytes);
}
throw new MongoDB.Driver.GridFSFileNotFoundException($"Blob {id} not found.");
throw new FileNotFoundException($"Blob {id} not found.");
}
public Task DeleteAsync(MongoDB.Bson.ObjectId id, CancellationToken cancellationToken)
public Task DeleteAsync(Guid id, CancellationToken cancellationToken)
{
_blobs.TryRemove(id, out _);
return Task.CompletedTask;
}
}
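
`RawDocumentStorage` now keys blobs by `Guid` rather than `MongoDB.Bson.ObjectId`, and a missing blob surfaces as a plain `FileNotFoundException`. A minimal round-trip sketch against the class above (illustrative only; it assumes the surrounding namespace is imported):

```csharp
// Illustrative sketch of the Guid-keyed in-memory blob store defined above.
var storage = new RawDocumentStorage();
byte[] payload = System.Text.Encoding.UTF8.GetBytes("{\"id\":\"TEST-001\"}");

// UploadAsync returns the generated Guid that callers persist as DocumentRecord.PayloadId.
Guid blobId = await storage.UploadAsync(
    "cccs", "https://example.test/advisory", payload, "application/json", CancellationToken.None);

byte[] roundTripped = await storage.DownloadAsync(blobId, CancellationToken.None);

// After deletion, DownloadAsync for the same id throws FileNotFoundException.
await storage.DeleteAsync(blobId, CancellationToken.None);
```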
public sealed record SourceStateRecord(string SourceName, MongoDB.Bson.BsonDocument? Cursor, DateTimeOffset UpdatedAt);
public sealed record SourceStateRecord(string SourceName, string? CursorJson, DateTimeOffset UpdatedAt);
public interface ISourceStateRepository
{
Task<SourceStateRecord?> TryGetAsync(string sourceName, CancellationToken cancellationToken);
Task UpdateCursorAsync(string sourceName, MongoDB.Bson.BsonDocument cursor, DateTimeOffset completedAt, CancellationToken cancellationToken);
Task UpdateCursorAsync(string sourceName, string cursorJson, DateTimeOffset completedAt, CancellationToken cancellationToken);
Task MarkFailureAsync(string sourceName, DateTimeOffset now, TimeSpan backoff, string reason, CancellationToken cancellationToken);
}
@@ -160,9 +162,9 @@ namespace StellaOps.Concelier.Storage.Mongo
return Task.FromResult<SourceStateRecord?>(record);
}
public Task UpdateCursorAsync(string sourceName, MongoDB.Bson.BsonDocument cursor, DateTimeOffset completedAt, CancellationToken cancellationToken)
public Task UpdateCursorAsync(string sourceName, string cursorJson, DateTimeOffset completedAt, CancellationToken cancellationToken)
{
_states[sourceName] = new SourceStateRecord(sourceName, cursor.DeepClone(), completedAt);
_states[sourceName] = new SourceStateRecord(sourceName, cursorJson, completedAt);
return Task.CompletedTask;
}
@@ -174,6 +176,53 @@ namespace StellaOps.Concelier.Storage.Mongo
}
}
namespace StellaOps.Concelier.Storage.Mongo.Advisories
{
public interface IAdvisoryStore
{
Task UpsertAsync(Advisory advisory, CancellationToken cancellationToken);
Task<Advisory?> FindAsync(string advisoryKey, CancellationToken cancellationToken);
Task<IReadOnlyList<Advisory>> GetRecentAsync(int limit, CancellationToken cancellationToken);
IAsyncEnumerable<Advisory> StreamAsync(CancellationToken cancellationToken);
}
public sealed class InMemoryAdvisoryStore : IAdvisoryStore
{
private readonly ConcurrentDictionary<string, Advisory> _advisories = new(StringComparer.OrdinalIgnoreCase);
public Task UpsertAsync(Advisory advisory, CancellationToken cancellationToken)
{
_advisories[advisory.AdvisoryKey] = advisory;
return Task.CompletedTask;
}
public Task<Advisory?> FindAsync(string advisoryKey, CancellationToken cancellationToken)
{
_advisories.TryGetValue(advisoryKey, out var advisory);
return Task.FromResult<Advisory?>(advisory);
}
public Task<IReadOnlyList<Advisory>> GetRecentAsync(int limit, CancellationToken cancellationToken)
{
var result = _advisories.Values
.OrderByDescending(a => a.Modified ?? a.Published ?? DateTimeOffset.MinValue)
.Take(limit)
.ToArray();
return Task.FromResult<IReadOnlyList<Advisory>>(result);
}
public async IAsyncEnumerable<Advisory> StreamAsync([System.Runtime.CompilerServices.EnumeratorCancellation] CancellationToken cancellationToken)
{
foreach (var advisory in _advisories.Values.OrderBy(a => a.AdvisoryKey, StringComparer.OrdinalIgnoreCase))
{
cancellationToken.ThrowIfCancellationRequested();
yield return advisory;
await Task.Yield();
}
}
}
}
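
A hedged consumption sketch of the advisory store contract above (advisory construction is elided, since the `Advisory` constructor is not part of this diff):

```csharp
// Illustrative only: stream advisories in deterministic order.
IAdvisoryStore advisories = new InMemoryAdvisoryStore();

await foreach (Advisory advisory in advisories.StreamAsync(CancellationToken.None))
{
    // StreamAsync yields in case-insensitive AdvisoryKey order;
    // GetRecentAsync instead sorts by Modified ?? Published, newest first.
    Console.WriteLine(advisory.AdvisoryKey);
}
```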
namespace StellaOps.Concelier.Storage.Mongo.Aliases
{
public sealed record AliasRecord(string AdvisoryKey, string Scheme, string Value);
@@ -192,13 +241,13 @@ namespace StellaOps.Concelier.Storage.Mongo.Aliases
public Task<IReadOnlyList<AliasRecord>> GetByAdvisoryAsync(string advisoryKey, CancellationToken cancellationToken)
{
_byAdvisory.TryGetValue(advisoryKey, out var records);
return Task.FromResult<IReadOnlyList<AliasRecord>>(records ?? Array.Empty<AliasRecord>());
return Task.FromResult<IReadOnlyList<AliasRecord>>(records ?? (IReadOnlyList<AliasRecord>)Array.Empty<AliasRecord>());
}
public Task<IReadOnlyList<AliasRecord>> GetByAliasAsync(string scheme, string value, CancellationToken cancellationToken)
{
_byAlias.TryGetValue((scheme, value), out var records);
return Task.FromResult<IReadOnlyList<AliasRecord>>(records ?? Array.Empty<AliasRecord>());
return Task.FromResult<IReadOnlyList<AliasRecord>>(records ?? (IReadOnlyList<AliasRecord>)Array.Empty<AliasRecord>());
}
}
}
@@ -286,10 +335,10 @@ namespace StellaOps.Concelier.Storage.Mongo.Exporting
id,
cursor ?? digest,
digest,
lastDeltaDigest: null,
baseExportId: resetBaseline ? exportId : null,
baseDigest: resetBaseline ? digest : null,
targetRepository,
LastDeltaDigest: null,
BaseExportId: resetBaseline ? exportId : null,
BaseDigest: resetBaseline ? digest : null,
TargetRepository: targetRepository,
manifest,
exporterVersion,
_timeProvider.GetUtcNow());
@@ -307,11 +356,11 @@ namespace StellaOps.Concelier.Storage.Mongo.Exporting
var record = new ExportStateRecord(
id,
cursor ?? deltaDigest,
lastFullDigest: null,
lastDeltaDigest: deltaDigest,
baseExportId: null,
baseDigest: null,
targetRepository: null,
LastFullDigest: null,
LastDeltaDigest: deltaDigest,
BaseExportId: null,
BaseDigest: null,
TargetRepository: null,
manifest,
exporterVersion,
_timeProvider.GetUtcNow());


@@ -0,0 +1,88 @@
using System.Text.Json;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Postgres.Models;
using StellaOps.Concelier.Storage.Postgres.Repositories;
namespace StellaOps.Concelier.Storage.Postgres;
/// <summary>
/// Postgres-backed implementation that satisfies the legacy IDocumentStore contract.
/// </summary>
public sealed class PostgresDocumentStore : IDocumentStore
{
private readonly IDocumentRepository _repository;
private readonly ISourceRepository _sourceRepository;
private readonly JsonSerializerOptions _json = new(JsonSerializerDefaults.Web);
public PostgresDocumentStore(IDocumentRepository repository, ISourceRepository sourceRepository)
{
_repository = repository ?? throw new ArgumentNullException(nameof(repository));
_sourceRepository = sourceRepository ?? throw new ArgumentNullException(nameof(sourceRepository));
}
public async Task<DocumentRecord?> FindAsync(Guid id, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
{
var row = await _repository.FindAsync(id, cancellationToken).ConfigureAwait(false);
return row is null ? null : Map(row);
}
public async Task<DocumentRecord?> FindBySourceAndUriAsync(string sourceName, string uri, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
{
var row = await _repository.FindBySourceAndUriAsync(sourceName, uri, cancellationToken).ConfigureAwait(false);
return row is null ? null : Map(row);
}
public async Task<DocumentRecord> UpsertAsync(DocumentRecord record, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
{
// Ensure source exists
var source = await _sourceRepository.GetByNameAsync(record.SourceName, cancellationToken).ConfigureAwait(false)
?? throw new InvalidOperationException($"Source '{record.SourceName}' not provisioned.");
var entity = new DocumentRecordEntity(
Id: record.Id == Guid.Empty ? Guid.NewGuid() : record.Id,
SourceId: source.Id,
SourceName: record.SourceName,
Uri: record.Uri,
Sha256: record.Sha256,
Status: record.Status,
ContentType: record.ContentType,
HeadersJson: record.Headers is null ? null : JsonSerializer.Serialize(record.Headers, _json),
MetadataJson: record.Metadata is null ? null : JsonSerializer.Serialize(record.Metadata, _json),
Etag: record.Etag,
LastModified: record.LastModified,
Payload: Array.Empty<byte>(), // payload handled via RawDocumentStorage; keep pointer zero-length here
CreatedAt: record.CreatedAt,
UpdatedAt: DateTimeOffset.UtcNow,
ExpiresAt: record.ExpiresAt);
var saved = await _repository.UpsertAsync(entity, cancellationToken).ConfigureAwait(false);
return Map(saved);
}
public async Task UpdateStatusAsync(Guid id, string status, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null)
{
await _repository.UpdateStatusAsync(id, status, cancellationToken).ConfigureAwait(false);
}
private DocumentRecord Map(DocumentRecordEntity row)
{
return new DocumentRecord(
row.Id,
row.SourceName,
row.Uri,
row.CreatedAt,
row.Sha256,
row.Status,
row.ContentType,
row.HeadersJson is null
? null
: JsonSerializer.Deserialize<Dictionary<string, string>>(row.HeadersJson, _json),
row.MetadataJson is null
? null
: JsonSerializer.Deserialize<Dictionary<string, string>>(row.MetadataJson, _json),
row.Etag,
row.LastModified,
PayloadId: null,
ExpiresAt: row.ExpiresAt);
}
}
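
Consumers keep talking to the legacy `IDocumentStore` contract; only the DI registration changes. A hedged usage sketch (service-provider wiring assumed; the `DocumentRecord` positional order follows the record definition in this diff):

```csharp
// Illustrative sketch: upserting through the Postgres-backed legacy contract.
IDocumentStore store = serviceProvider.GetRequiredService<IDocumentStore>();

var record = new DocumentRecord(
    Guid.NewGuid(),
    "cccs",
    "https://example.test/advisory",
    DateTimeOffset.UtcNow,
    "sha-test",
    DocumentStatuses.PendingParse,
    "application/json",
    Headers: null,
    Metadata: null,
    Etag: null,
    LastModified: null,
    PayloadId: null);

// Throws InvalidOperationException when the "cccs" source row is not provisioned yet.
DocumentRecord saved = await store.UpsertAsync(record, CancellationToken.None);
await store.UpdateStatusAsync(saved.Id, DocumentStatuses.PendingMap, CancellationToken.None);
```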


@@ -0,0 +1,23 @@
-- Concelier Postgres Migration 004: Source documents and payload storage (Mongo replacement)
CREATE TABLE IF NOT EXISTS concelier.source_documents (
id UUID NOT NULL,
source_id UUID NOT NULL,
source_name TEXT NOT NULL,
uri TEXT NOT NULL,
sha256 TEXT NOT NULL,
status TEXT NOT NULL,
content_type TEXT,
headers_json JSONB,
metadata_json JSONB,
etag TEXT,
last_modified TIMESTAMPTZ,
payload BYTEA NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
expires_at TIMESTAMPTZ,
CONSTRAINT pk_source_documents PRIMARY KEY (source_name, uri)
);
CREATE INDEX IF NOT EXISTS idx_source_documents_source_id ON concelier.source_documents(source_id);
CREATE INDEX IF NOT EXISTS idx_source_documents_status ON concelier.source_documents(status);


@@ -0,0 +1,18 @@
namespace StellaOps.Concelier.Storage.Postgres.Models;
public sealed record DocumentRecordEntity(
Guid Id,
Guid SourceId,
string SourceName,
string Uri,
string Sha256,
string Status,
string? ContentType,
string? HeadersJson,
string? MetadataJson,
string? Etag,
DateTimeOffset? LastModified,
byte[] Payload,
DateTimeOffset CreatedAt,
DateTimeOffset UpdatedAt,
DateTimeOffset? ExpiresAt);


@@ -0,0 +1,125 @@
using System.Text.Json;
using Dapper;
using StellaOps.Concelier.Storage.Postgres.Models;
using StellaOps.Infrastructure.Postgres;
using StellaOps.Infrastructure.Postgres.Connections;
namespace StellaOps.Concelier.Storage.Postgres.Repositories;
public interface IDocumentRepository
{
Task<DocumentRecordEntity?> FindAsync(Guid id, CancellationToken cancellationToken);
Task<DocumentRecordEntity?> FindBySourceAndUriAsync(string sourceName, string uri, CancellationToken cancellationToken);
Task<DocumentRecordEntity> UpsertAsync(DocumentRecordEntity record, CancellationToken cancellationToken);
Task UpdateStatusAsync(Guid id, string status, CancellationToken cancellationToken);
}
public sealed class DocumentRepository : RepositoryBase<ConcelierDataSource>, IDocumentRepository
{
private readonly JsonSerializerOptions _json = new(JsonSerializerDefaults.Web);
public DocumentRepository(ConcelierDataSource dataSource, ILogger<DocumentRepository> logger)
: base(dataSource, logger)
{
}
public async Task<DocumentRecordEntity?> FindAsync(Guid id, CancellationToken cancellationToken)
{
const string sql = """
SELECT * FROM concelier.source_documents
WHERE id = @Id
LIMIT 1;
""";
await using var conn = await DataSource.OpenSystemConnectionAsync(cancellationToken);
var row = await conn.QuerySingleOrDefaultAsync(sql, new { Id = id });
return row is null ? null : Map(row);
}
public async Task<DocumentRecordEntity?> FindBySourceAndUriAsync(string sourceName, string uri, CancellationToken cancellationToken)
{
const string sql = """
SELECT * FROM concelier.source_documents
WHERE source_name = @SourceName AND uri = @Uri
LIMIT 1;
""";
await using var conn = await DataSource.OpenSystemConnectionAsync(cancellationToken);
var row = await conn.QuerySingleOrDefaultAsync(sql, new { SourceName = sourceName, Uri = uri });
return row is null ? null : Map(row);
}
public async Task<DocumentRecordEntity> UpsertAsync(DocumentRecordEntity record, CancellationToken cancellationToken)
{
const string sql = """
INSERT INTO concelier.source_documents (
id, source_id, source_name, uri, sha256, status, content_type,
headers_json, metadata_json, etag, last_modified, payload, created_at, updated_at, expires_at)
VALUES (
@Id, @SourceId, @SourceName, @Uri, @Sha256, @Status, @ContentType,
@HeadersJson, @MetadataJson, @Etag, @LastModified, @Payload, @CreatedAt, @UpdatedAt, @ExpiresAt)
ON CONFLICT (source_name, uri) DO UPDATE SET
sha256 = EXCLUDED.sha256,
status = EXCLUDED.status,
content_type = EXCLUDED.content_type,
headers_json = EXCLUDED.headers_json,
metadata_json = EXCLUDED.metadata_json,
etag = EXCLUDED.etag,
last_modified = EXCLUDED.last_modified,
payload = EXCLUDED.payload,
updated_at = EXCLUDED.updated_at,
expires_at = EXCLUDED.expires_at
RETURNING *;
""";
await using var conn = await DataSource.OpenSystemConnectionAsync(cancellationToken);
var row = await conn.QuerySingleAsync(sql, new
{
record.Id,
record.SourceId,
record.SourceName,
record.Uri,
record.Sha256,
record.Status,
record.ContentType,
record.HeadersJson,
record.MetadataJson,
record.Etag,
record.LastModified,
record.Payload,
record.CreatedAt,
record.UpdatedAt,
record.ExpiresAt
});
return Map(row);
}
public async Task UpdateStatusAsync(Guid id, string status, CancellationToken cancellationToken)
{
const string sql = """
UPDATE concelier.source_documents
SET status = @Status, updated_at = NOW()
WHERE id = @Id;
""";
await using var conn = await DataSource.OpenSystemConnectionAsync(cancellationToken);
await conn.ExecuteAsync(sql, new { Id = id, Status = status });
}
private DocumentRecordEntity Map(dynamic row)
{
return new DocumentRecordEntity(
row.id,
row.source_id,
row.source_name,
row.uri,
row.sha256,
row.status,
(string?)row.content_type,
(string?)row.headers_json,
(string?)row.metadata_json,
(string?)row.etag,
(DateTimeOffset?)row.last_modified,
(byte[])row.payload,
DateTime.SpecifyKind(row.created_at, DateTimeKind.Utc),
DateTime.SpecifyKind(row.updated_at, DateTimeKind.Utc),
row.expires_at is null ? null : DateTime.SpecifyKind(row.expires_at, DateTimeKind.Utc));
}
}


@@ -4,6 +4,7 @@ using StellaOps.Concelier.Storage.Postgres.Repositories;
using StellaOps.Infrastructure.Postgres;
using StellaOps.Infrastructure.Postgres.Options;
using StellaOps.Concelier.Core.Linksets;
using StellaOps.Concelier.Storage.Mongo;
namespace StellaOps.Concelier.Storage.Postgres;
@@ -38,11 +39,13 @@ public static class ServiceCollectionExtensions
services.AddScoped<IAdvisoryWeaknessRepository, AdvisoryWeaknessRepository>();
services.AddScoped<IKevFlagRepository, KevFlagRepository>();
services.AddScoped<ISourceStateRepository, SourceStateRepository>();
services.AddScoped<IDocumentRepository, DocumentRepository>();
services.AddScoped<IFeedSnapshotRepository, FeedSnapshotRepository>();
services.AddScoped<IAdvisorySnapshotRepository, AdvisorySnapshotRepository>();
services.AddScoped<IMergeEventRepository, MergeEventRepository>();
services.AddScoped<IAdvisoryLinksetStore, AdvisoryLinksetCacheRepository>();
services.AddScoped<IAdvisoryLinksetLookup>(sp => sp.GetRequiredService<IAdvisoryLinksetStore>());
services.AddScoped<IDocumentStore, PostgresDocumentStore>();
return services;
}
@@ -71,11 +74,13 @@ public static class ServiceCollectionExtensions
services.AddScoped<IAdvisoryWeaknessRepository, AdvisoryWeaknessRepository>();
services.AddScoped<IKevFlagRepository, KevFlagRepository>();
services.AddScoped<ISourceStateRepository, SourceStateRepository>();
services.AddScoped<IDocumentRepository, DocumentRepository>();
services.AddScoped<IFeedSnapshotRepository, FeedSnapshotRepository>();
services.AddScoped<IAdvisorySnapshotRepository, AdvisorySnapshotRepository>();
services.AddScoped<IMergeEventRepository, MergeEventRepository>();
services.AddScoped<IAdvisoryLinksetStore, AdvisoryLinksetCacheRepository>();
services.AddScoped<IAdvisoryLinksetLookup>(sp => sp.GetRequiredService<IAdvisoryLinksetStore>());
services.AddScoped<IDocumentStore, PostgresDocumentStore>();
return services;
}


@@ -10,6 +10,11 @@
<RootNamespace>StellaOps.Concelier.Storage.Postgres</RootNamespace>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Dapper" Version="2.1.35" />
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0" />
</ItemGroup>
<ItemGroup>
<EmbeddedResource Include="Migrations\**\*.sql" LogicalName="%(RecursiveDir)%(Filename)%(Extension)" />
</ItemGroup>
@@ -25,6 +30,7 @@
<ItemGroup>
<ProjectReference Include="..\..\..\__Libraries\StellaOps.Infrastructure.Postgres\StellaOps.Infrastructure.Postgres.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Core\StellaOps.Concelier.Core.csproj" />
<ProjectReference Include="..\..\..\__Libraries\StellaOps.DependencyInjection\StellaOps.DependencyInjection.csproj" />
</ItemGroup>
</Project>


@@ -6,36 +6,36 @@ using StellaOps.Concelier.Connector.Common.Html;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Storage.Mongo.Documents;
using Xunit;
namespace StellaOps.Concelier.Connector.Cccs.Tests.Internal;
public sealed class CccsMapperTests
{
[Fact]
public void Map_CreatesCanonicalAdvisory()
{
var raw = CccsHtmlParserTests.LoadFixture<CccsRawAdvisoryDocument>("cccs-raw-advisory.json");
var dto = new CccsHtmlParser(new HtmlContentSanitizer()).Parse(raw);
var document = new DocumentRecord(
Guid.NewGuid(),
CccsConnectorPlugin.SourceName,
dto.CanonicalUrl,
DateTimeOffset.UtcNow,
"sha-test",
DocumentStatuses.PendingMap,
"application/json",
Headers: null,
Metadata: null,
Etag: null,
LastModified: dto.Modified,
GridFsId: null);
var recordedAt = DateTimeOffset.Parse("2025-08-12T00:00:00Z");
var advisory = CccsMapper.Map(dto, document, recordedAt);
advisory.AdvisoryKey.Should().Be("TEST-001");
advisory.Title.Should().Be(dto.Title);
advisory.Aliases.Should().Contain(new[] { "TEST-001", "CVE-2020-1234", "CVE-2021-9999" });
namespace StellaOps.Concelier.Connector.Cccs.Tests.Internal;
public sealed class CccsMapperTests
{
[Fact]
public void Map_CreatesCanonicalAdvisory()
{
var raw = CccsHtmlParserTests.LoadFixture<CccsRawAdvisoryDocument>("cccs-raw-advisory.json");
var dto = new CccsHtmlParser(new HtmlContentSanitizer()).Parse(raw);
var document = new DocumentRecord(
Guid.NewGuid(),
CccsConnectorPlugin.SourceName,
dto.CanonicalUrl,
DateTimeOffset.UtcNow,
"sha-test",
DocumentStatuses.PendingMap,
"application/json",
Headers: null,
Metadata: null,
Etag: null,
LastModified: dto.Modified,
PayloadId: null);
var recordedAt = DateTimeOffset.Parse("2025-08-12T00:00:00Z");
var advisory = CccsMapper.Map(dto, document, recordedAt);
advisory.AdvisoryKey.Should().Be("TEST-001");
advisory.Title.Should().Be(dto.Title);
advisory.Aliases.Should().Contain(new[] { "TEST-001", "CVE-2020-1234", "CVE-2021-9999" });
advisory.References.Should().Contain(reference => reference.Url == dto.CanonicalUrl && reference.Kind == "details");
advisory.References.Should().Contain(reference => reference.Url == "https://example.com/details");
advisory.AffectedPackages.Should().HaveCount(2);


@@ -1,118 +1,118 @@
using System;
using System.Globalization;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.CertCc.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using Xunit;
namespace StellaOps.Concelier.Connector.CertCc.Tests.Internal;
public sealed class CertCcMapperTests
{
private static readonly DateTimeOffset PublishedAt = DateTimeOffset.Parse("2025-10-03T11:35:31Z", CultureInfo.InvariantCulture);
[Fact]
public void Map_ProducesCanonicalAdvisoryWithVendorPrimitives()
{
const string vendorStatement =
"The issue is confirmed, and here is the patch list\n\n" +
"V3912/V3910/V2962/V1000B\t4.4.3.6/4.4.5.1\n" +
"V2927/V2865/V2866\t4.5.1\n" +
"V2765/V2766/V2763/V2135\t4.5.1";
var vendor = new CertCcVendorDto(
"DrayTek Corporation",
ContactDate: PublishedAt.AddDays(-10),
StatementDate: PublishedAt.AddDays(-5),
Updated: PublishedAt,
Statement: vendorStatement,
Addendum: null,
References: new[] { "https://www.draytek.com/support/resources?type=version" });
var vendorStatus = new CertCcVendorStatusDto(
Vendor: "DrayTek Corporation",
CveId: "CVE-2025-10547",
Status: "Affected",
Statement: null,
References: Array.Empty<string>(),
DateAdded: PublishedAt,
DateUpdated: PublishedAt);
var vulnerability = new CertCcVulnerabilityDto(
CveId: "CVE-2025-10547",
Description: null,
DateAdded: PublishedAt,
DateUpdated: PublishedAt);
var metadata = new CertCcNoteMetadata(
VuId: "VU#294418",
IdNumber: "294418",
Title: "Vigor routers running DrayOS RCE via EasyVPN",
Overview: "Overview",
Summary: "Summary",
Published: PublishedAt,
Updated: PublishedAt.AddMinutes(5),
Created: PublishedAt,
Revision: 2,
CveIds: new[] { "CVE-2025-10547" },
PublicUrls: new[]
{
"https://www.draytek.com/about/security-advisory/use-of-uninitialized-variable-vulnerabilities/",
"https://www.draytek.com/support/resources?type=version"
},
PrimaryUrl: "https://www.kb.cert.org/vuls/id/294418/");
var dto = new CertCcNoteDto(
metadata,
Vendors: new[] { vendor },
VendorStatuses: new[] { vendorStatus },
Vulnerabilities: new[] { vulnerability });
var document = new DocumentRecord(
Guid.NewGuid(),
"cert-cc",
"https://www.kb.cert.org/vuls/id/294418/",
PublishedAt,
Sha256: new string('0', 64),
Status: "pending-map",
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: null,
LastModified: PublishedAt,
GridFsId: null);
var dtoRecord = new DtoRecord(
Id: Guid.NewGuid(),
DocumentId: document.Id,
SourceName: "cert-cc",
SchemaVersion: "certcc.vince.note.v1",
Payload: new BsonDocument(),
ValidatedAt: PublishedAt.AddMinutes(1));
var advisory = CertCcMapper.Map(dto, document, dtoRecord, "cert-cc");
Assert.Equal("certcc/vu-294418", advisory.AdvisoryKey);
Assert.Contains("VU#294418", advisory.Aliases);
Assert.Contains("CVE-2025-10547", advisory.Aliases);
Assert.Equal("en", advisory.Language);
Assert.Equal(PublishedAt, advisory.Published);
Assert.Contains(advisory.References, reference => reference.Url.Contains("/vuls/id/294418", StringComparison.OrdinalIgnoreCase));
var affected = Assert.Single(advisory.AffectedPackages);
Assert.Equal("vendor", affected.Type);
Assert.Equal("DrayTek Corporation", affected.Identifier);
Assert.Contains(affected.Statuses, status => status.Status == AffectedPackageStatusCatalog.Affected);
var range = Assert.Single(affected.VersionRanges);
Assert.NotNull(range.Primitives);
Assert.NotNull(range.Primitives!.VendorExtensions);
Assert.Contains(range.Primitives.VendorExtensions!, kvp => kvp.Key == "certcc.vendor.patches");
Assert.NotEmpty(affected.NormalizedVersions);
Assert.Contains(affected.NormalizedVersions, rule => rule.Scheme == "certcc.vendor" && rule.Value == "4.5.1");
}
}
using System;
using System.Globalization;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.CertCc.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using Xunit;
namespace StellaOps.Concelier.Connector.CertCc.Tests.Internal;
public sealed class CertCcMapperTests
{
private static readonly DateTimeOffset PublishedAt = DateTimeOffset.Parse("2025-10-03T11:35:31Z", CultureInfo.InvariantCulture);
[Fact]
public void Map_ProducesCanonicalAdvisoryWithVendorPrimitives()
{
const string vendorStatement =
"The issue is confirmed, and here is the patch list\n\n" +
"V3912/V3910/V2962/V1000B\t4.4.3.6/4.4.5.1\n" +
"V2927/V2865/V2866\t4.5.1\n" +
"V2765/V2766/V2763/V2135\t4.5.1";
var vendor = new CertCcVendorDto(
"DrayTek Corporation",
ContactDate: PublishedAt.AddDays(-10),
StatementDate: PublishedAt.AddDays(-5),
Updated: PublishedAt,
Statement: vendorStatement,
Addendum: null,
References: new[] { "https://www.draytek.com/support/resources?type=version" });
var vendorStatus = new CertCcVendorStatusDto(
Vendor: "DrayTek Corporation",
CveId: "CVE-2025-10547",
Status: "Affected",
Statement: null,
References: Array.Empty<string>(),
DateAdded: PublishedAt,
DateUpdated: PublishedAt);
var vulnerability = new CertCcVulnerabilityDto(
CveId: "CVE-2025-10547",
Description: null,
DateAdded: PublishedAt,
DateUpdated: PublishedAt);
var metadata = new CertCcNoteMetadata(
VuId: "VU#294418",
IdNumber: "294418",
Title: "Vigor routers running DrayOS RCE via EasyVPN",
Overview: "Overview",
Summary: "Summary",
Published: PublishedAt,
Updated: PublishedAt.AddMinutes(5),
Created: PublishedAt,
Revision: 2,
CveIds: new[] { "CVE-2025-10547" },
PublicUrls: new[]
{
"https://www.draytek.com/about/security-advisory/use-of-uninitialized-variable-vulnerabilities/",
"https://www.draytek.com/support/resources?type=version"
},
PrimaryUrl: "https://www.kb.cert.org/vuls/id/294418/");
var dto = new CertCcNoteDto(
metadata,
Vendors: new[] { vendor },
VendorStatuses: new[] { vendorStatus },
Vulnerabilities: new[] { vulnerability });
var document = new DocumentRecord(
Guid.NewGuid(),
"cert-cc",
"https://www.kb.cert.org/vuls/id/294418/",
PublishedAt,
Sha256: new string('0', 64),
Status: "pending-map",
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: null,
LastModified: PublishedAt,
PayloadId: null);
var dtoRecord = new DtoRecord(
Id: Guid.NewGuid(),
DocumentId: document.Id,
SourceName: "cert-cc",
SchemaVersion: "certcc.vince.note.v1",
Payload: new BsonDocument(),
ValidatedAt: PublishedAt.AddMinutes(1));
var advisory = CertCcMapper.Map(dto, document, dtoRecord, "cert-cc");
Assert.Equal("certcc/vu-294418", advisory.AdvisoryKey);
Assert.Contains("VU#294418", advisory.Aliases);
Assert.Contains("CVE-2025-10547", advisory.Aliases);
Assert.Equal("en", advisory.Language);
Assert.Equal(PublishedAt, advisory.Published);
Assert.Contains(advisory.References, reference => reference.Url.Contains("/vuls/id/294418", StringComparison.OrdinalIgnoreCase));
var affected = Assert.Single(advisory.AffectedPackages);
Assert.Equal("vendor", affected.Type);
Assert.Equal("DrayTek Corporation", affected.Identifier);
Assert.Contains(affected.Statuses, status => status.Status == AffectedPackageStatusCatalog.Affected);
var range = Assert.Single(affected.VersionRanges);
Assert.NotNull(range.Primitives);
Assert.NotNull(range.Primitives!.VendorExtensions);
Assert.Contains(range.Primitives.VendorExtensions!, kvp => kvp.Key == "certcc.vendor.patches");
Assert.NotEmpty(affected.NormalizedVersions);
Assert.Contains(affected.NormalizedVersions, rule => rule.Scheme == "certcc.vendor" && rule.Value == "4.5.1");
}
}


@@ -93,7 +93,7 @@ public sealed class SourceStateSeedProcessorTests : IAsyncLifetime
Assert.Equal(documentId, storedDocument!.Id);
Assert.Equal("application/json", storedDocument.ContentType);
Assert.Equal(DocumentStatuses.PendingParse, storedDocument.Status);
Assert.NotNull(storedDocument.GridFsId);
Assert.NotNull(storedDocument.PayloadId);
Assert.NotNull(storedDocument.Headers);
Assert.Equal("true", storedDocument.Headers!["X-Test"]);
Assert.NotNull(storedDocument.Metadata);
@@ -153,7 +153,7 @@ public sealed class SourceStateSeedProcessorTests : IAsyncLifetime
CancellationToken.None);
Assert.NotNull(existingRecord);
var previousGridId = existingRecord!.GridFsId;
var previousGridId = existingRecord!.PayloadId;
Assert.NotNull(previousGridId);
var filesCollection = _database.GetCollection<BsonDocument>("documents.files");
@@ -189,8 +189,8 @@ public sealed class SourceStateSeedProcessorTests : IAsyncLifetime
Assert.NotNull(refreshedRecord);
Assert.Equal(documentId, refreshedRecord!.Id);
Assert.NotNull(refreshedRecord.GridFsId);
Assert.NotEqual(previousGridId, refreshedRecord.GridFsId);
Assert.NotNull(refreshedRecord.PayloadId);
Assert.NotEqual(previousGridId, refreshedRecord.PayloadId);
var files = await filesCollection.Find(FilterDefinition<BsonDocument>.Empty).ToListAsync();
Assert.Single(files);


@@ -1,82 +1,82 @@
using System;
using Xunit;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Distro.Debian;
using StellaOps.Concelier.Connector.Distro.Debian.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
namespace StellaOps.Concelier.Connector.Distro.Debian.Tests;
public sealed class DebianMapperTests
{
[Fact]
public void Map_BuildsRangePrimitives_ForResolvedPackage()
{
var dto = new DebianAdvisoryDto(
AdvisoryId: "DSA-2024-123",
SourcePackage: "openssl",
Title: "Openssl security update",
Description: "Fixes multiple issues.",
CveIds: new[] { "CVE-2024-1000", "CVE-2024-1001" },
Packages: new[]
{
new DebianPackageStateDto(
Package: "openssl",
Release: "bullseye",
Status: "resolved",
IntroducedVersion: "1:1.1.1n-0+deb11u2",
FixedVersion: "1:1.1.1n-0+deb11u5",
LastAffectedVersion: null,
Published: new DateTimeOffset(2024, 9, 1, 0, 0, 0, TimeSpan.Zero)),
new DebianPackageStateDto(
Package: "openssl",
Release: "bookworm",
Status: "open",
IntroducedVersion: null,
FixedVersion: null,
LastAffectedVersion: null,
Published: null)
},
References: new[]
{
new DebianReferenceDto(
Url: "https://security-tracker.debian.org/tracker/DSA-2024-123",
Kind: "advisory",
Title: "Debian Security Advisory 2024-123"),
});
var document = new DocumentRecord(
Id: Guid.NewGuid(),
SourceName: DebianConnectorPlugin.SourceName,
Uri: "https://security-tracker.debian.org/tracker/DSA-2024-123",
FetchedAt: new DateTimeOffset(2024, 9, 1, 1, 0, 0, TimeSpan.Zero),
Sha256: "sha",
Status: "Fetched",
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: null,
LastModified: null,
PayloadId: null);
Advisory advisory = DebianMapper.Map(dto, document, new DateTimeOffset(2024, 9, 1, 2, 0, 0, TimeSpan.Zero));
Assert.Equal("DSA-2024-123", advisory.AdvisoryKey);
Assert.Contains("CVE-2024-1000", advisory.Aliases);
Assert.Contains("CVE-2024-1001", advisory.Aliases);
var resolvedPackage = Assert.Single(advisory.AffectedPackages, p => p.Platform == "bullseye");
var range = Assert.Single(resolvedPackage.VersionRanges);
Assert.Equal("evr", range.RangeKind);
Assert.Equal("1:1.1.1n-0+deb11u2", range.IntroducedVersion);
Assert.Equal("1:1.1.1n-0+deb11u5", range.FixedVersion);
Assert.NotNull(range.Primitives);
var evr = range.Primitives!.Evr;
Assert.NotNull(evr);
Assert.NotNull(evr!.Introduced);
Assert.Equal(1, evr.Introduced!.Epoch);
Assert.Equal("1.1.1n", evr.Introduced.UpstreamVersion);
Assert.Equal("0+deb11u2", evr.Introduced.Revision);
Assert.NotNull(evr.Fixed);
Assert.Equal(1, evr.Fixed!.Epoch);
Assert.Equal("1.1.1n", evr.Fixed.UpstreamVersion);
@@ -94,5 +94,5 @@ public sealed class DebianMapperTests
var openPackage = Assert.Single(advisory.AffectedPackages, p => p.Platform == "bookworm");
Assert.Empty(openPackage.VersionRanges);
Assert.Empty(openPackage.NormalizedVersions);
}
}

View File

@@ -1,47 +1,47 @@
using System;
using System.Collections.Generic;
using System.IO;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Distro.Suse;
using StellaOps.Concelier.Connector.Distro.Suse.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
using Xunit;
namespace StellaOps.Concelier.Connector.Distro.Suse.Tests;
public sealed class SuseMapperTests
{
[Fact]
public void Map_BuildsNevraRangePrimitives()
{
var json = File.ReadAllText(Path.Combine(AppContext.BaseDirectory, "Source", "Distro", "Suse", "Fixtures", "suse-su-2025_0001-1.json"));
var dto = SuseCsafParser.Parse(json);
var document = new DocumentRecord(
Guid.NewGuid(),
SuseConnectorPlugin.SourceName,
"https://ftp.suse.com/pub/projects/security/csaf/suse-su-2025_0001-1.json",
DateTimeOffset.UtcNow,
"sha256",
DocumentStatuses.PendingParse,
"application/json",
Headers: null,
Metadata: new Dictionary<string, string>(StringComparer.Ordinal)
{
["suse.id"] = dto.AdvisoryId
},
Etag: "adv-1",
LastModified: DateTimeOffset.UtcNow,
PayloadId: ObjectId.Empty);
var mapped = SuseMapper.Map(dto, document, DateTimeOffset.UtcNow);
Assert.Equal(dto.AdvisoryId, mapped.AdvisoryKey);
var package = Assert.Single(mapped.AffectedPackages);
Assert.Equal(AffectedPackageTypes.Rpm, package.Type);
var range = Assert.Single(package.VersionRanges);
Assert.Equal("nevra", range.RangeKind);
Assert.NotNull(range.Primitives);
Assert.NotNull(range.Primitives!.Nevra);

View File

@@ -1,94 +1,94 @@
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Ghsa.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
namespace StellaOps.Concelier.Connector.Ghsa.Tests;
public sealed class GhsaConflictFixtureTests
{
[Fact]
public void ConflictFixture_MatchesSnapshot()
{
var recordedAt = new DateTimeOffset(2025, 3, 4, 8, 30, 0, TimeSpan.Zero);
var document = new DocumentRecord(
Id: Guid.Parse("2f5c4d67-fcac-4ec9-a8d4-8a9c5a6d0fc9"),
SourceName: GhsaConnectorPlugin.SourceName,
Uri: "https://github.com/advisories/GHSA-qqqq-wwww-eeee",
FetchedAt: new DateTimeOffset(2025, 3, 3, 18, 0, 0, TimeSpan.Zero),
Sha256: "sha256-ghsa-conflict-fixture",
Status: "completed",
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: "\"etag-ghsa-conflict\"",
LastModified: new DateTimeOffset(2025, 3, 3, 18, 0, 0, TimeSpan.Zero),
PayloadId: null);
var dto = new GhsaRecordDto
{
GhsaId = "GHSA-qqqq-wwww-eeee",
Summary = "Container escape in conflict-package",
Description = "Container escape vulnerability allowing privilege escalation in conflict-package.",
Severity = "HIGH",
PublishedAt = new DateTimeOffset(2025, 2, 25, 0, 0, 0, TimeSpan.Zero),
UpdatedAt = new DateTimeOffset(2025, 3, 2, 12, 0, 0, TimeSpan.Zero),
Aliases = new[] { "GHSA-qqqq-wwww-eeee", "CVE-2025-4242" },
References = new[]
{
new GhsaReferenceDto
{
Url = "https://github.com/advisories/GHSA-qqqq-wwww-eeee",
Type = "ADVISORY"
},
new GhsaReferenceDto
{
Url = "https://github.com/conflict/package/releases/tag/v1.4.0",
Type = "FIX"
}
},
Affected = new[]
{
new GhsaAffectedDto
{
PackageName = "conflict/package",
Ecosystem = "npm",
VulnerableRange = "< 1.4.0",
PatchedVersion = "1.4.0"
}
},
Credits = new[]
{
new GhsaCreditDto
{
Type = "reporter",
Name = "security-researcher",
Login = "sec-researcher",
ProfileUrl = "https://github.com/sec-researcher"
},
new GhsaCreditDto
{
Type = "remediation_developer",
Name = "maintainer-team",
Login = "conflict-maintainer",
ProfileUrl = "https://github.com/conflict/package"
}
}
};
var advisory = GhsaMapper.Map(dto, document, recordedAt);
Assert.Equal("ghsa:severity/high", advisory.CanonicalMetricId);
Assert.True(advisory.CvssMetrics.IsEmpty);
var snapshot = SnapshotSerializer.ToSnapshot(advisory).Replace("\r\n", "\n").TrimEnd();
var expectedPath = Path.Combine(AppContext.BaseDirectory, "Fixtures", "conflict-ghsa.canonical.json");
var expected = File.ReadAllText(expectedPath).Replace("\r\n", "\n").TrimEnd();
if (!string.Equals(expected, snapshot, StringComparison.Ordinal))
{
var actualPath = Path.Combine(AppContext.BaseDirectory, "Fixtures", "conflict-ghsa.canonical.actual.json");
File.WriteAllText(actualPath, snapshot);
}
Assert.Equal(expected, snapshot);
}
}

View File

@@ -21,7 +21,7 @@ public sealed class GhsaMapperTests
Metadata: null,
Etag: "\"etag-ghsa-fallback\"",
LastModified: recordedAt.AddHours(-3),
-GridFsId: null);
+PayloadId: null);
var dto = new GhsaRecordDto
{

View File

@@ -1,103 +1,103 @@
using System.Text.Json;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Nvd.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
namespace StellaOps.Concelier.Connector.Nvd.Tests;
public sealed class NvdConflictFixtureTests
{
[Fact]
public void ConflictFixture_MatchesSnapshot()
{
const string payload = """
{
"vulnerabilities": [
{
"cve": {
"id": "CVE-2025-4242",
"published": "2025-03-01T10:15:00Z",
"lastModified": "2025-03-03T09:45:00Z",
"descriptions": [
{ "lang": "en", "value": "NVD baseline summary for conflict-package allowing container escape." }
],
"references": [
{
"url": "https://nvd.nist.gov/vuln/detail/CVE-2025-4242",
"source": "NVD",
"tags": ["Vendor Advisory"]
}
],
"weaknesses": [
{
"description": [
{ "lang": "en", "value": "CWE-269" }
]
}
],
"metrics": {
"cvssMetricV31": [
{
"cvssData": {
"vectorString": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
"baseScore": 9.8,
"baseSeverity": "CRITICAL"
},
"exploitabilityScore": 3.9,
"impactScore": 5.9
}
]
},
"configurations": {
"nodes": [
{
"cpeMatch": [
{
"criteria": "cpe:2.3:a:conflict:package:1.0:*:*:*:*:*:*:*",
"vulnerable": true,
"versionStartIncluding": "1.0",
"versionEndExcluding": "1.4"
}
]
}
]
}
}
}
]
}
""";
using var document = JsonDocument.Parse(payload);
var sourceDocument = new DocumentRecord(
Id: Guid.Parse("1a6a0700-2dd0-4f69-bb37-64ca77e51c91"),
SourceName: NvdConnectorPlugin.SourceName,
Uri: "https://services.nvd.nist.gov/rest/json/cve/2.0?cveId=CVE-2025-4242",
FetchedAt: new DateTimeOffset(2025, 3, 3, 10, 0, 0, TimeSpan.Zero),
Sha256: "sha256-nvd-conflict-fixture",
Status: "completed",
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: "\"etag-nvd-conflict\"",
LastModified: new DateTimeOffset(2025, 3, 3, 9, 45, 0, TimeSpan.Zero),
PayloadId: null);
var advisories = NvdMapper.Map(document, sourceDocument, new DateTimeOffset(2025, 3, 4, 2, 0, 0, TimeSpan.Zero));
var advisory = Assert.Single(advisories);
var snapshot = SnapshotSerializer.ToSnapshot(advisory).Replace("\r\n", "\n").TrimEnd();
var expectedPath = Path.Combine(AppContext.BaseDirectory, "Nvd", "Fixtures", "conflict-nvd.canonical.json");
var expected = File.ReadAllText(expectedPath).Replace("\r\n", "\n").TrimEnd();
if (!string.Equals(expected, snapshot, StringComparison.Ordinal))
{
var actualPath = Path.Combine(AppContext.BaseDirectory, "Nvd", "Fixtures", "conflict-nvd.canonical.actual.json");
Directory.CreateDirectory(Path.GetDirectoryName(actualPath)!);
File.WriteAllText(actualPath, snapshot);
}
Assert.Equal(expected, snapshot);
}
}

View File

@@ -1,118 +1,118 @@
using System.Text.Json;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Osv.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
namespace StellaOps.Concelier.Connector.Osv.Tests;
public sealed class OsvConflictFixtureTests
{
[Fact]
public void ConflictFixture_MatchesSnapshot()
{
using var databaseSpecificDoc = JsonDocument.Parse("""{"severity":"medium"}""");
var dto = new OsvVulnerabilityDto
{
Id = "OSV-2025-4242",
Summary = "Container escape for conflict-package",
Details = "OSV captures the latest container escape details including patched version metadata.",
Aliases = new[] { "CVE-2025-4242", "GHSA-qqqq-wwww-eeee" },
Published = new DateTimeOffset(2025, 2, 28, 0, 0, 0, TimeSpan.Zero),
Modified = new DateTimeOffset(2025, 3, 6, 12, 0, 0, TimeSpan.Zero),
Severity = new[]
{
new OsvSeverityDto
{
Type = "CVSS_V3",
Score = "CVSS:3.1/AV:N/AC:H/PR:L/UI:R/S:U/C:L/I:L/A:L"
}
},
References = new[]
{
new OsvReferenceDto
{
Type = "ADVISORY",
Url = "https://osv.dev/vulnerability/OSV-2025-4242"
},
new OsvReferenceDto
{
Type = "FIX",
Url = "https://github.com/conflict/package/commit/abcdef1234567890"
}
},
Credits = new[]
{
new OsvCreditDto
{
Name = "osv-reporter",
Type = "reporter",
Contact = new[] { "mailto:osv-reporter@example.com" }
}
},
Affected = new[]
{
new OsvAffectedPackageDto
{
Package = new OsvPackageDto
{
Ecosystem = "npm",
Name = "conflict/package"
},
Ranges = new[]
{
new OsvRangeDto
{
Type = "SEMVER",
Events = new[]
{
new OsvEventDto { Introduced = "1.0.0" },
new OsvEventDto { LastAffected = "1.4.2" },
new OsvEventDto { Fixed = "1.5.0" }
}
}
}
}
},
DatabaseSpecific = databaseSpecificDoc.RootElement.Clone()
};
var document = new DocumentRecord(
Id: Guid.Parse("8dd2b0fe-a5f5-4b3b-9f5c-0f3aad6fb6ce"),
SourceName: OsvConnectorPlugin.SourceName,
Uri: "https://api.osv.dev/v1/vulns/OSV-2025-4242",
FetchedAt: new DateTimeOffset(2025, 3, 6, 11, 30, 0, TimeSpan.Zero),
Sha256: "sha256-osv-conflict-fixture",
Status: "completed",
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: "\"etag-osv-conflict\"",
LastModified: new DateTimeOffset(2025, 3, 6, 12, 0, 0, TimeSpan.Zero),
PayloadId: null);
var dtoRecord = new DtoRecord(
Id: Guid.Parse("6f7d5ce7-cb47-40a5-8b41-8ad022b5fd5c"),
DocumentId: document.Id,
SourceName: OsvConnectorPlugin.SourceName,
SchemaVersion: "osv.v1",
Payload: new BsonDocument("id", dto.Id),
ValidatedAt: new DateTimeOffset(2025, 3, 6, 12, 5, 0, TimeSpan.Zero));
var advisory = OsvMapper.Map(dto, document, dtoRecord, "npm");
var snapshot = SnapshotSerializer.ToSnapshot(advisory).Replace("\r\n", "\n").TrimEnd();
var expectedPath = Path.Combine(AppContext.BaseDirectory, "Fixtures", "conflict-osv.canonical.json");
var expected = File.ReadAllText(expectedPath).Replace("\r\n", "\n").TrimEnd();
if (!string.Equals(expected, snapshot, StringComparison.Ordinal))
{
var actualPath = Path.Combine(AppContext.BaseDirectory, "Fixtures", "conflict-osv.canonical.actual.json");
File.WriteAllText(actualPath, snapshot);
}
Assert.Equal(expected, snapshot);
}
}

View File

@@ -1,463 +1,463 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Common.Testing;
using StellaOps.Concelier.Connector.StellaOpsMirror.Internal;
using StellaOps.Concelier.Connector.StellaOpsMirror.Settings;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Concelier.Testing;
using StellaOps.Cryptography;
using StellaOps.Cryptography.DependencyInjection;
using StellaOps.Concelier.Models;
using Xunit;
namespace StellaOps.Concelier.Connector.StellaOpsMirror.Tests;
[Collection("mongo-fixture")]
public sealed class StellaOpsMirrorConnectorTests : IAsyncLifetime
{
private readonly MongoIntegrationFixture _fixture;
private readonly CannedHttpMessageHandler _handler;
public StellaOpsMirrorConnectorTests(MongoIntegrationFixture fixture)
{
_fixture = fixture;
_handler = new CannedHttpMessageHandler();
}
[Fact]
public async Task FetchAsync_PersistsMirrorArtifacts()
{
var manifestContent = "{\"domain\":\"primary\",\"files\":[]}";
var bundleContent = "{\"advisories\":[{\"id\":\"CVE-2025-0001\"}]}";
var manifestDigest = ComputeDigest(manifestContent);
var bundleDigest = ComputeDigest(bundleContent);
var index = BuildIndex(manifestDigest, Encoding.UTF8.GetByteCount(manifestContent), bundleDigest, Encoding.UTF8.GetByteCount(bundleContent), includeSignature: false);
await using var provider = await BuildServiceProviderAsync();
SeedResponses(index, manifestContent, bundleContent, signature: null);
var connector = provider.GetRequiredService<StellaOpsMirrorConnector>();
await connector.FetchAsync(provider, CancellationToken.None);
var documentStore = provider.GetRequiredService<IDocumentStore>();
var manifestUri = "https://mirror.test/mirror/primary/manifest.json";
var bundleUri = "https://mirror.test/mirror/primary/bundle.json";
var manifestDocument = await documentStore.FindBySourceAndUriAsync(StellaOpsMirrorConnector.Source, manifestUri, CancellationToken.None);
Assert.NotNull(manifestDocument);
Assert.Equal(DocumentStatuses.Mapped, manifestDocument!.Status);
Assert.Equal(NormalizeDigest(manifestDigest), manifestDocument.Sha256);
var bundleDocument = await documentStore.FindBySourceAndUriAsync(StellaOpsMirrorConnector.Source, bundleUri, CancellationToken.None);
Assert.NotNull(bundleDocument);
Assert.Equal(DocumentStatuses.PendingParse, bundleDocument!.Status);
Assert.Equal(NormalizeDigest(bundleDigest), bundleDocument.Sha256);
var rawStorage = provider.GetRequiredService<RawDocumentStorage>();
Assert.NotNull(manifestDocument.GridFsId);
Assert.NotNull(bundleDocument.GridFsId);
var manifestBytes = await rawStorage.DownloadAsync(manifestDocument.GridFsId!.Value, CancellationToken.None);
var bundleBytes = await rawStorage.DownloadAsync(bundleDocument.GridFsId!.Value, CancellationToken.None);
Assert.Equal(manifestContent, Encoding.UTF8.GetString(manifestBytes));
Assert.Equal(bundleContent, Encoding.UTF8.GetString(bundleBytes));
var stateRepository = provider.GetRequiredService<ISourceStateRepository>();
var state = await stateRepository.TryGetAsync(StellaOpsMirrorConnector.Source, CancellationToken.None);
Assert.NotNull(state);
var cursorDocument = state!.Cursor ?? new BsonDocument();
var digestValue = cursorDocument.TryGetValue("bundleDigest", out var digestBson) ? digestBson.AsString : string.Empty;
using StellaOps.Concelier.Models;
using Xunit;
namespace StellaOps.Concelier.Connector.StellaOpsMirror.Tests;
[Collection("mongo-fixture")]
public sealed class StellaOpsMirrorConnectorTests : IAsyncLifetime
{
private readonly MongoIntegrationFixture _fixture;
private readonly CannedHttpMessageHandler _handler;
public StellaOpsMirrorConnectorTests(MongoIntegrationFixture fixture)
{
_fixture = fixture;
_handler = new CannedHttpMessageHandler();
}
[Fact]
public async Task FetchAsync_PersistsMirrorArtifacts()
{
var manifestContent = "{\"domain\":\"primary\",\"files\":[]}";
var bundleContent = "{\"advisories\":[{\"id\":\"CVE-2025-0001\"}]}";
var manifestDigest = ComputeDigest(manifestContent);
var bundleDigest = ComputeDigest(bundleContent);
var index = BuildIndex(manifestDigest, Encoding.UTF8.GetByteCount(manifestContent), bundleDigest, Encoding.UTF8.GetByteCount(bundleContent), includeSignature: false);
await using var provider = await BuildServiceProviderAsync();
SeedResponses(index, manifestContent, bundleContent, signature: null);
var connector = provider.GetRequiredService<StellaOpsMirrorConnector>();
await connector.FetchAsync(provider, CancellationToken.None);
var documentStore = provider.GetRequiredService<IDocumentStore>();
var manifestUri = "https://mirror.test/mirror/primary/manifest.json";
var bundleUri = "https://mirror.test/mirror/primary/bundle.json";
var manifestDocument = await documentStore.FindBySourceAndUriAsync(StellaOpsMirrorConnector.Source, manifestUri, CancellationToken.None);
Assert.NotNull(manifestDocument);
Assert.Equal(DocumentStatuses.Mapped, manifestDocument!.Status);
Assert.Equal(NormalizeDigest(manifestDigest), manifestDocument.Sha256);
var bundleDocument = await documentStore.FindBySourceAndUriAsync(StellaOpsMirrorConnector.Source, bundleUri, CancellationToken.None);
Assert.NotNull(bundleDocument);
Assert.Equal(DocumentStatuses.PendingParse, bundleDocument!.Status);
Assert.Equal(NormalizeDigest(bundleDigest), bundleDocument.Sha256);
var rawStorage = provider.GetRequiredService<RawDocumentStorage>();
Assert.NotNull(manifestDocument.PayloadId);
Assert.NotNull(bundleDocument.PayloadId);
var manifestBytes = await rawStorage.DownloadAsync(manifestDocument.PayloadId!.Value, CancellationToken.None);
var bundleBytes = await rawStorage.DownloadAsync(bundleDocument.PayloadId!.Value, CancellationToken.None);
Assert.Equal(manifestContent, Encoding.UTF8.GetString(manifestBytes));
Assert.Equal(bundleContent, Encoding.UTF8.GetString(bundleBytes));
var stateRepository = provider.GetRequiredService<ISourceStateRepository>();
var state = await stateRepository.TryGetAsync(StellaOpsMirrorConnector.Source, CancellationToken.None);
Assert.NotNull(state);
var cursorDocument = state!.Cursor ?? new BsonDocument();
var digestValue = cursorDocument.TryGetValue("bundleDigest", out var digestBson) ? digestBson.AsString : string.Empty;
Assert.Equal(NormalizeDigest(bundleDigest), NormalizeDigest(digestValue));
var pendingDocumentsArray = cursorDocument.TryGetValue("pendingDocuments", out var pendingDocsBson) && pendingDocsBson is BsonArray pendingArray
? pendingArray
: new BsonArray();
Assert.Single(pendingDocumentsArray);
var pendingDocumentId = Guid.Parse(pendingDocumentsArray[0].AsString);
Assert.Equal(bundleDocument.Id, pendingDocumentId);
var pendingMappingsArray = cursorDocument.TryGetValue("pendingMappings", out var pendingMappingsBson) && pendingMappingsBson is BsonArray mappingsArray
? mappingsArray
: new BsonArray();
Assert.Empty(pendingMappingsArray);
}
[Fact]
public async Task FetchAsync_TamperedSignatureThrows()
{
var manifestContent = "{\"domain\":\"primary\"}";
var bundleContent = "{\"advisories\":[{\"id\":\"CVE-2025-0002\"}]}";
var manifestDigest = ComputeDigest(manifestContent);
var bundleDigest = ComputeDigest(bundleContent);
var index = BuildIndex(manifestDigest, Encoding.UTF8.GetByteCount(manifestContent), bundleDigest, Encoding.UTF8.GetByteCount(bundleContent), includeSignature: true);
await using var provider = await BuildServiceProviderAsync(options =>
{
options.Signature.Enabled = true;
options.Signature.KeyId = "mirror-key";
options.Signature.Provider = "default";
});
var defaultProvider = provider.GetRequiredService<DefaultCryptoProvider>();
var signingKey = CreateSigningKey("mirror-key");
defaultProvider.UpsertSigningKey(signingKey);
var (signatureValue, _) = CreateDetachedJws(signingKey, bundleContent);
// Tamper with signature so verification fails.
var tamperedSignature = signatureValue.Replace('a', 'b');
SeedResponses(index, manifestContent, bundleContent, tamperedSignature);
var connector = provider.GetRequiredService<StellaOpsMirrorConnector>();
await Assert.ThrowsAsync<InvalidOperationException>(() => connector.FetchAsync(provider, CancellationToken.None));
var stateRepository = provider.GetRequiredService<ISourceStateRepository>();
var state = await stateRepository.TryGetAsync(StellaOpsMirrorConnector.Source, CancellationToken.None);
Assert.NotNull(state);
Assert.True(state!.FailCount >= 1);
Assert.False(state.Cursor.TryGetValue("bundleDigest", out _));
}
[Fact]
public async Task FetchAsync_SignatureKeyMismatchThrows()
{
var manifestContent = "{\"domain\":\"primary\"}";
var bundleContent = "{\"advisories\":[{\"id\":\"CVE-2025-0003\"}]}";
var manifestDigest = ComputeDigest(manifestContent);
var bundleDigest = ComputeDigest(bundleContent);
var index = BuildIndex(
manifestDigest,
Encoding.UTF8.GetByteCount(manifestContent),
bundleDigest,
Encoding.UTF8.GetByteCount(bundleContent),
includeSignature: true,
signatureKeyId: "unexpected-key",
signatureProvider: "default");
var signingKey = CreateSigningKey("unexpected-key");
var (signatureValue, _) = CreateDetachedJws(signingKey, bundleContent);
await using var provider = await BuildServiceProviderAsync(options =>
{
options.Signature.Enabled = true;
options.Signature.KeyId = "mirror-key";
options.Signature.Provider = "default";
});
SeedResponses(index, manifestContent, bundleContent, signatureValue);
var connector = provider.GetRequiredService<StellaOpsMirrorConnector>();
await Assert.ThrowsAsync<InvalidOperationException>(() => connector.FetchAsync(provider, CancellationToken.None));
}
[Fact]
public async Task FetchAsync_VerifiesSignatureUsingFallbackPublicKey()
{
var manifestContent = "{\"domain\":\"primary\"}";
var bundleContent = "{\"advisories\":[{\"id\":\"CVE-2025-0004\"}]}";
var manifestDigest = ComputeDigest(manifestContent);
var bundleDigest = ComputeDigest(bundleContent);
var index = BuildIndex(manifestDigest, Encoding.UTF8.GetByteCount(manifestContent), bundleDigest, Encoding.UTF8.GetByteCount(bundleContent), includeSignature: true);
var signingKey = CreateSigningKey("mirror-key");
var (signatureValue, _) = CreateDetachedJws(signingKey, bundleContent);
var publicKeyPath = WritePublicKeyPem(signingKey);
await using var provider = await BuildServiceProviderAsync(options =>
{
options.Signature.Enabled = true;
options.Signature.KeyId = "mirror-key";
options.Signature.Provider = "default";
options.Signature.PublicKeyPath = publicKeyPath;
});
try
{
SeedResponses(index, manifestContent, bundleContent, signatureValue);
var connector = provider.GetRequiredService<StellaOpsMirrorConnector>();
await connector.FetchAsync(provider, CancellationToken.None);
var stateRepository = provider.GetRequiredService<ISourceStateRepository>();
var state = await stateRepository.TryGetAsync(StellaOpsMirrorConnector.Source, CancellationToken.None);
Assert.NotNull(state);
Assert.Equal(0, state!.FailCount);
}
finally
{
if (File.Exists(publicKeyPath))
{
File.Delete(publicKeyPath);
}
}
}
[Fact]
public async Task FetchAsync_DigestMismatchMarksFailure()
{
var manifestExpected = "{\"domain\":\"primary\"}";
var manifestTampered = "{\"domain\":\"tampered\"}";
var bundleContent = "{\"advisories\":[{\"id\":\"CVE-2025-0005\"}]}";
var manifestDigest = ComputeDigest(manifestExpected);
var bundleDigest = ComputeDigest(bundleContent);
var index = BuildIndex(manifestDigest, Encoding.UTF8.GetByteCount(manifestExpected), bundleDigest, Encoding.UTF8.GetByteCount(bundleContent), includeSignature: false);
await using var provider = await BuildServiceProviderAsync();
SeedResponses(index, manifestTampered, bundleContent, signature: null);
var connector = provider.GetRequiredService<StellaOpsMirrorConnector>();
await Assert.ThrowsAsync<InvalidOperationException>(() => connector.FetchAsync(provider, CancellationToken.None));
var stateRepository = provider.GetRequiredService<ISourceStateRepository>();
var state = await stateRepository.TryGetAsync(StellaOpsMirrorConnector.Source, CancellationToken.None);
Assert.NotNull(state);
var cursor = state!.Cursor ?? new BsonDocument();
Assert.True(state.FailCount >= 1);
Assert.False(cursor.Contains("bundleDigest"));
}
[Fact]
public void ParseAndMap_PersistAdvisoriesFromBundle()
{
var bundleDocument = SampleData.CreateBundle();
var bundleJson = CanonicalJsonSerializer.SerializeIndented(bundleDocument);
var normalizedFixture = FixtureLoader.Read(SampleData.BundleFixture).TrimEnd();
Assert.Equal(normalizedFixture, FixtureLoader.Normalize(bundleJson).TrimEnd());
var advisories = MirrorAdvisoryMapper.Map(bundleDocument);
Assert.Single(advisories);
var advisory = advisories[0];
var expectedAdvisoryJson = FixtureLoader.Read(SampleData.AdvisoryFixture).TrimEnd();
var mappedJson = CanonicalJsonSerializer.SerializeIndented(advisory);
Assert.Equal(expectedAdvisoryJson, FixtureLoader.Normalize(mappedJson).TrimEnd());
// AdvisoryStore integration validated elsewhere; ensure canonical serialization is stable.
}
public Task InitializeAsync() => Task.CompletedTask;
public Task DisposeAsync()
{
_handler.Clear();
return Task.CompletedTask;
}
private async Task<ServiceProvider> BuildServiceProviderAsync(Action<StellaOpsMirrorConnectorOptions>? configureOptions = null)
{
await _fixture.Client.DropDatabaseAsync(_fixture.Database.DatabaseNamespace.DatabaseName);
_handler.Clear();
var services = new ServiceCollection();
services.AddLogging(builder => builder.AddProvider(NullLoggerProvider.Instance));
services.AddSingleton(_handler);
services.AddSingleton(TimeProvider.System);
services.AddMongoStorage(options =>
{
options.ConnectionString = _fixture.Runner.ConnectionString;
options.DatabaseName = _fixture.Database.DatabaseNamespace.DatabaseName;
options.CommandTimeout = TimeSpan.FromSeconds(5);
});
services.AddStellaOpsCrypto();
var configuration = new ConfigurationBuilder()
.AddInMemoryCollection(new Dictionary<string, string?>
{
["concelier:sources:stellaopsMirror:baseAddress"] = "https://mirror.test/",
["concelier:sources:stellaopsMirror:domainId"] = "primary",
["concelier:sources:stellaopsMirror:indexPath"] = "/concelier/exports/index.json",
})
.Build();
var routine = new StellaOpsMirrorDependencyInjectionRoutine();
routine.Register(services, configuration);
if (configureOptions is not null)
{
services.PostConfigure(configureOptions);
}
services.Configure<HttpClientFactoryOptions>("stellaops-mirror", builder =>
{
builder.HttpMessageHandlerBuilderActions.Add(options =>
{
options.PrimaryHandler = _handler;
});
});
var provider = services.BuildServiceProvider();
var bootstrapper = provider.GetRequiredService<MongoBootstrapper>();
await bootstrapper.InitializeAsync(CancellationToken.None);
return provider;
}
private void SeedResponses(string indexJson, string manifestContent, string bundleContent, string? signature)
{
var baseUri = new Uri("https://mirror.test");
_handler.AddResponse(HttpMethod.Get, new Uri(baseUri, "/concelier/exports/index.json"), () => CreateJsonResponse(indexJson));
_handler.AddResponse(HttpMethod.Get, new Uri(baseUri, "mirror/primary/manifest.json"), () => CreateJsonResponse(manifestContent));
_handler.AddResponse(HttpMethod.Get, new Uri(baseUri, "mirror/primary/bundle.json"), () => CreateJsonResponse(bundleContent));
if (signature is not null)
{
_handler.AddResponse(HttpMethod.Get, new Uri(baseUri, "mirror/primary/bundle.json.jws"), () => new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new StringContent(signature, Encoding.UTF8, "application/jose+json"),
});
}
}
private static HttpResponseMessage CreateJsonResponse(string content)
=> new(HttpStatusCode.OK)
{
Content = new StringContent(content, Encoding.UTF8, "application/json"),
};
private static string BuildIndex(
string manifestDigest,
int manifestBytes,
string bundleDigest,
int bundleBytes,
bool includeSignature,
string signatureKeyId = "mirror-key",
string signatureProvider = "default")
{
var index = new
{
schemaVersion = 1,
generatedAt = new DateTimeOffset(2025, 10, 19, 12, 0, 0, TimeSpan.Zero),
targetRepository = "repo",
domains = new[]
{
new
{
domainId = "primary",
displayName = "Primary",
advisoryCount = 1,
manifest = new
{
path = "mirror/primary/manifest.json",
sizeBytes = manifestBytes,
digest = manifestDigest,
signature = (object?)null,
},
bundle = new
{
path = "mirror/primary/bundle.json",
sizeBytes = bundleBytes,
digest = bundleDigest,
signature = includeSignature
? new
{
path = "mirror/primary/bundle.json.jws",
algorithm = "ES256",
keyId = signatureKeyId,
provider = signatureProvider,
signedAt = new DateTimeOffset(2025, 10, 19, 12, 0, 0, TimeSpan.Zero),
}
: null,
},
sources = Array.Empty<object>(),
}
}
};
return JsonSerializer.Serialize(index, new JsonSerializerOptions
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
WriteIndented = false,
});
}
private static string ComputeDigest(string content)
{
var bytes = Encoding.UTF8.GetBytes(content);
var hash = SHA256.HashData(bytes);
return "sha256:" + Convert.ToHexString(hash).ToLowerInvariant();
}
private static string NormalizeDigest(string digest)
=> digest.StartsWith("sha256:", StringComparison.OrdinalIgnoreCase) ? digest[7..] : digest;
private static CryptoSigningKey CreateSigningKey(string keyId)
{
using var ecdsa = ECDsa.Create(ECCurve.NamedCurves.nistP256);
var parameters = ecdsa.ExportParameters(includePrivateParameters: true);
return new CryptoSigningKey(new CryptoKeyReference(keyId), SignatureAlgorithms.Es256, in parameters, DateTimeOffset.UtcNow);
}
private static string WritePublicKeyPem(CryptoSigningKey signingKey)
{
ArgumentNullException.ThrowIfNull(signingKey);
var path = Path.Combine(Path.GetTempPath(), $"stellaops-mirror-{Guid.NewGuid():N}.pem");
using var ecdsa = ECDsa.Create(signingKey.PublicParameters);
var publicKeyInfo = ecdsa.ExportSubjectPublicKeyInfo();
var pem = PemEncoding.Write("PUBLIC KEY", publicKeyInfo);
File.WriteAllText(path, pem);
return path;
}
private static (string Signature, DateTimeOffset SignedAt) CreateDetachedJws(CryptoSigningKey signingKey, string payload)
{
var provider = new DefaultCryptoProvider();
provider.UpsertSigningKey(signingKey);
var signer = provider.GetSigner(SignatureAlgorithms.Es256, signingKey.Reference);
var header = new Dictionary<string, object?>
{
["alg"] = SignatureAlgorithms.Es256,
["kid"] = signingKey.Reference.KeyId,
["provider"] = provider.Name,
["typ"] = "application/vnd.stellaops.concelier.mirror-bundle+jws",
["b64"] = false,
["crit"] = new[] { "b64" }
};
var headerJson = JsonSerializer.Serialize(header);
var encodedHeader = Microsoft.IdentityModel.Tokens.Base64UrlEncoder.Encode(headerJson);
var payloadBytes = Encoding.UTF8.GetBytes(payload);
var signingInput = BuildSigningInput(encodedHeader, payloadBytes);
var signatureBytes = signer.SignAsync(signingInput, CancellationToken.None).GetAwaiter().GetResult();
var encodedSignature = Microsoft.IdentityModel.Tokens.Base64UrlEncoder.Encode(signatureBytes);
return (string.Concat(encodedHeader, "..", encodedSignature), DateTimeOffset.UtcNow);
}
private static ReadOnlyMemory<byte> BuildSigningInput(string encodedHeader, ReadOnlySpan<byte> payload)
{
var headerBytes = Encoding.ASCII.GetBytes(encodedHeader);
var buffer = new byte[headerBytes.Length + 1 + payload.Length];
headerBytes.CopyTo(buffer, 0);
buffer[headerBytes.Length] = (byte)'.';
payload.CopyTo(buffer.AsSpan(headerBytes.Length + 1));
return buffer;
}
}
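The `CreateDetachedJws`/`BuildSigningInput` helpers above implement the RFC 7797 unencoded-payload option: the signing input is `ASCII(BASE64URL(header)) || '.' || payload`, and the detached compact serialization drops the middle segment (`header..signature`). A minimal Python sketch of that construction (illustrative only; the signature value here is a placeholder, not a real ES256 signature):

```python
import base64
import json

def b64url(data: bytes) -> str:
    # Base64url without padding, as used for JWS segments.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

header = {"alg": "ES256", "kid": "mirror-key", "b64": False, "crit": ["b64"]}
payload = b'{"advisories":[]}'

encoded_header = b64url(json.dumps(header, separators=(",", ":")).encode("utf-8"))
# RFC 7797 (b64=false): sign over ASCII(encoded header) || '.' || raw payload bytes.
signing_input = encoded_header.encode("ascii") + b"." + payload
# Detached compact serialization omits the payload segment entirely: header..signature
detached = encoded_header + ".." + "<signature-placeholder>"
```

This mirrors why the tests fetch `bundle.json` and `bundle.json.jws` separately: the payload travels out of band and is spliced back into the signing input at verification time.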
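`ComputeDigest` and `NormalizeDigest` fix the digest convention used throughout these tests: `sha256:` plus lowercase hex of the UTF-8 bytes, with normalization stripping the prefix case-insensitively. An equivalent Python sketch:

```python
import hashlib

def compute_digest(content: str) -> str:
    # "sha256:" + lowercase hex of the UTF-8 bytes, matching ComputeDigest.
    return "sha256:" + hashlib.sha256(content.encode("utf-8")).hexdigest()

def normalize_digest(digest: str) -> str:
    # Strip the "sha256:" prefix case-insensitively, matching NormalizeDigest.
    return digest[7:] if digest.lower().startswith("sha256:") else digest

digest = compute_digest('{"domain":"primary"}')
```

Normalizing on both sides of a comparison, as the assertions above do, makes the checks insensitive to whether a stored value kept the prefix.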

View File

@@ -1,36 +1,36 @@
using System;
using System.Collections.Generic;
using System.Linq;
using FluentAssertions;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Vndr.Cisco;
using StellaOps.Concelier.Connector.Vndr.Cisco.Internal;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using Xunit;
namespace StellaOps.Concelier.Connector.Vndr.Cisco.Tests;
public sealed class CiscoMapperTests
{
[Fact]
public void Map_ProducesCanonicalAdvisory()
{
var published = new DateTimeOffset(2025, 10, 1, 0, 0, 0, TimeSpan.Zero);
var updated = published.AddDays(1);
var dto = new CiscoAdvisoryDto(
AdvisoryId: "CISCO-SA-TEST",
Title: "Test Advisory",
Summary: "Sample summary",
Severity: "High",
Published: published,
Updated: updated,
PublicationUrl: "https://example.com/advisory",
CsafUrl: "https://sec.cloudapps.cisco.com/csaf/test.json",
CvrfUrl: "https://example.com/cvrf.xml",
CvssBaseScore: 9.8,
Cves: new List<string> { "CVE-2024-0001" },
BugIds: new List<string> { "BUG123" },
@@ -39,31 +39,31 @@ public sealed class CiscoMapperTests
new("Cisco Widget", "PID-1", "1.2.3", new [] { AffectedPackageStatusCatalog.KnownAffected }),
new("Cisco Router", "PID-2", ">=1.0.0 <1.4.0", new [] { AffectedPackageStatusCatalog.KnownAffected })
});
var document = new DocumentRecord(
Id: Guid.NewGuid(),
SourceName: VndrCiscoConnectorPlugin.SourceName,
Uri: "https://api.cisco.com/security/advisories/v2/advisories/CISCO-SA-TEST",
FetchedAt: published,
Sha256: "abc123",
Status: DocumentStatuses.PendingMap,
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: null,
LastModified: updated,
PayloadId: null);
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, VndrCiscoConnectorPlugin.SourceName, "cisco.dto.test", new BsonDocument(), updated);
var advisory = CiscoMapper.Map(dto, document, dtoRecord);
advisory.AdvisoryKey.Should().Be("CISCO-SA-TEST");
advisory.Title.Should().Be("Test Advisory");
advisory.Severity.Should().Be("high");
advisory.Aliases.Should().Contain(new[] { "CISCO-SA-TEST", "CVE-2024-0001", "BUG123" });
advisory.References.Should().Contain(reference => reference.Url == "https://example.com/advisory");
advisory.References.Should().Contain(reference => reference.Url == "https://sec.cloudapps.cisco.com/csaf/test.json");
advisory.AffectedPackages.Should().HaveCount(2);
var package = advisory.AffectedPackages.Single(p => p.Identifier == "Cisco Widget");

View File

@@ -0,0 +1,327 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using Microsoft.AspNetCore.Http.HttpResults;
using Microsoft.AspNetCore.Mvc;
using StellaOps.Auth.Abstractions;
using StellaOps.Attestor.Envelope;
using StellaOps.Policy.Engine.Services;
using StellaOps.Policy.Scoring;
using StellaOps.Policy.Scoring.Engine;
using StellaOps.Policy.Scoring.Receipts;
namespace StellaOps.Policy.Engine.Endpoints;
/// <summary>
/// Minimal API surface for CVSS v4.0 score receipts (create, read, amend, history).
/// </summary>
internal static class CvssReceiptEndpoints
{
public static IEndpointRouteBuilder MapCvssReceipts(this IEndpointRouteBuilder endpoints)
{
var group = endpoints.MapGroup("/api/cvss")
.RequireAuthorization()
.WithTags("CVSS Receipts");
group.MapPost("/receipts", CreateReceipt)
.WithName("CreateCvssReceipt")
.WithSummary("Create a CVSS v4.0 receipt with deterministic hashing and optional DSSE attestation.")
.Produces<CvssScoreReceipt>(StatusCodes.Status201Created)
.Produces<ProblemHttpResult>(StatusCodes.Status400BadRequest)
.Produces<ProblemHttpResult>(StatusCodes.Status401Unauthorized);
group.MapGet("/receipts/{receiptId}", GetReceipt)
.WithName("GetCvssReceipt")
.WithSummary("Retrieve a CVSS v4.0 receipt by ID.")
.Produces<CvssScoreReceipt>(StatusCodes.Status200OK)
.Produces<ProblemHttpResult>(StatusCodes.Status404NotFound);
group.MapPut("/receipts/{receiptId}/amend", AmendReceipt)
.WithName("AmendCvssReceipt")
.WithSummary("Append an amendment entry to a CVSS receipt history and optionally re-sign.")
.Produces<CvssScoreReceipt>(StatusCodes.Status200OK)
.Produces<ProblemHttpResult>(StatusCodes.Status400BadRequest)
.Produces<ProblemHttpResult>(StatusCodes.Status404NotFound);
group.MapGet("/receipts/{receiptId}/history", GetReceiptHistory)
.WithName("GetCvssReceiptHistory")
.WithSummary("Return the ordered amendment history for a CVSS receipt.")
.Produces<IReadOnlyList<ReceiptHistoryEntry>>(StatusCodes.Status200OK)
.Produces<ProblemHttpResult>(StatusCodes.Status404NotFound);
group.MapGet("/policies", ListPolicies)
.WithName("ListCvssPolicies")
.WithSummary("List available CVSS policies configured on this host.")
.Produces<IReadOnlyList<CvssPolicy>>(StatusCodes.Status200OK);
return endpoints;
}
private static async Task<IResult> CreateReceipt(
HttpContext context,
[FromBody] CreateCvssReceiptRequest request,
IReceiptBuilder receiptBuilder,
CancellationToken cancellationToken)
{
var scopeResult = ScopeAuthorization.RequireScope(context, StellaOpsScopes.PolicyRun);
if (scopeResult is not null)
{
return scopeResult;
}
if (request is null)
{
return Results.BadRequest(new ProblemDetails
{
Title = "Request body required.",
Status = StatusCodes.Status400BadRequest
});
}
if (request.Policy is null || string.IsNullOrWhiteSpace(request.Policy.Hash))
{
return Results.BadRequest(new ProblemDetails
{
Title = "Policy hash required",
Detail = "CvssPolicy with a deterministic hash must be supplied.",
Status = StatusCodes.Status400BadRequest
});
}
var tenantId = ResolveTenantId(context);
if (string.IsNullOrWhiteSpace(tenantId))
{
return Results.BadRequest(new ProblemDetails
{
Title = "Tenant required",
Detail = "Specify tenant via X-Tenant-Id header or tenant_id claim.",
Status = StatusCodes.Status400BadRequest
});
}
var actor = ResolveActorId(context) ?? request.CreatedBy ?? "system";
var createdAt = request.CreatedAt ?? DateTimeOffset.UtcNow;
var createRequest = new CreateReceiptRequest
{
TenantId = tenantId,
VulnerabilityId = request.VulnerabilityId,
CreatedBy = actor,
CreatedAt = createdAt,
Policy = request.Policy,
BaseMetrics = request.BaseMetrics,
ThreatMetrics = request.ThreatMetrics,
EnvironmentalMetrics = request.EnvironmentalMetrics ?? request.Policy.DefaultEnvironmentalMetrics,
SupplementalMetrics = request.SupplementalMetrics,
Evidence = request.Evidence?.ToImmutableList() ?? ImmutableList<CvssEvidenceItem>.Empty,
SigningKey = request.SigningKey
};
try
{
var receipt = await receiptBuilder.CreateAsync(createRequest, cancellationToken).ConfigureAwait(false);
return Results.Created($"/api/cvss/receipts/{receipt.ReceiptId}", receipt);
}
catch (Exception ex) when (ex is InvalidOperationException or ArgumentException)
{
return Results.BadRequest(new ProblemDetails
{
Title = "Failed to create CVSS receipt",
Detail = ex.Message,
Status = StatusCodes.Status400BadRequest
});
}
}
private static async Task<IResult> GetReceipt(
HttpContext context,
[FromRoute] string receiptId,
IReceiptRepository repository,
CancellationToken cancellationToken)
{
var scopeResult = ScopeAuthorization.RequireScope(context, StellaOpsScopes.FindingsRead);
if (scopeResult is not null)
{
return scopeResult;
}
var tenantId = ResolveTenantId(context);
if (string.IsNullOrWhiteSpace(tenantId))
{
return Results.BadRequest(new ProblemDetails
{
Title = "Tenant required",
Detail = "Specify tenant via X-Tenant-Id header or tenant_id claim.",
Status = StatusCodes.Status400BadRequest
});
}
var receipt = await repository.GetAsync(tenantId, receiptId, cancellationToken).ConfigureAwait(false);
if (receipt is null)
{
return Results.NotFound(new ProblemDetails
{
Title = "Receipt not found",
Detail = $"CVSS receipt '{receiptId}' was not found.",
Status = StatusCodes.Status404NotFound
});
}
return Results.Ok(receipt);
}
private static async Task<IResult> AmendReceipt(
HttpContext context,
[FromRoute] string receiptId,
[FromBody] AmendCvssReceiptRequest request,
IReceiptHistoryService historyService,
CancellationToken cancellationToken)
{
var scopeResult = ScopeAuthorization.RequireScope(context, StellaOpsScopes.PolicyRun);
if (scopeResult is not null)
{
return scopeResult;
}
if (request is null)
{
return Results.BadRequest(new ProblemDetails
{
Title = "Request body required.",
Status = StatusCodes.Status400BadRequest
});
}
var tenantId = ResolveTenantId(context);
if (string.IsNullOrWhiteSpace(tenantId))
{
return Results.BadRequest(new ProblemDetails
{
Title = "Tenant required",
Detail = "Specify tenant via X-Tenant-Id header or tenant_id claim.",
Status = StatusCodes.Status400BadRequest
});
}
var actor = ResolveActorId(context) ?? request.Actor ?? "system";
var amend = new AmendReceiptRequest
{
ReceiptId = receiptId,
TenantId = tenantId,
Actor = actor,
Field = request.Field,
PreviousValue = request.PreviousValue,
NewValue = request.NewValue,
Reason = request.Reason,
ReferenceUri = request.ReferenceUri,
SigningKey = request.SigningKey
};
try
{
var amended = await historyService.AmendAsync(amend, cancellationToken).ConfigureAwait(false);
return Results.Ok(amended);
}
catch (InvalidOperationException ex)
{
return Results.NotFound(new ProblemDetails
{
Title = "Receipt not found",
Detail = ex.Message,
Status = StatusCodes.Status404NotFound
});
}
catch (Exception ex) when (ex is ArgumentException)
{
return Results.BadRequest(new ProblemDetails
{
Title = "Failed to amend receipt",
Detail = ex.Message,
Status = StatusCodes.Status400BadRequest
});
}
}
private static async Task<IResult> GetReceiptHistory(
HttpContext context,
[FromRoute] string receiptId,
IReceiptRepository repository,
CancellationToken cancellationToken)
{
var scopeResult = ScopeAuthorization.RequireScope(context, StellaOpsScopes.FindingsRead);
if (scopeResult is not null)
{
return scopeResult;
}
var tenantId = ResolveTenantId(context);
if (string.IsNullOrWhiteSpace(tenantId))
{
return Results.BadRequest(new ProblemDetails
{
Title = "Tenant required",
Detail = "Specify tenant via X-Tenant-Id header or tenant_id claim.",
Status = StatusCodes.Status400BadRequest
});
}
var receipt = await repository.GetAsync(tenantId, receiptId, cancellationToken).ConfigureAwait(false);
if (receipt is null)
{
return Results.NotFound(new ProblemDetails
{
Title = "Receipt not found",
Detail = $"CVSS receipt '{receiptId}' was not found.",
Status = StatusCodes.Status404NotFound
});
}
var orderedHistory = receipt.History
.OrderBy(h => h.Timestamp)
.ToList();
return Results.Ok(orderedHistory);
}
private static IResult ListPolicies()
=> Results.Ok(Array.Empty<CvssPolicy>());
private static string? ResolveTenantId(HttpContext context)
{
if (context.Request.Headers.TryGetValue("X-Tenant-Id", out var tenantHeader) &&
!string.IsNullOrWhiteSpace(tenantHeader))
{
return tenantHeader.ToString();
}
return context.User?.FindFirst("tenant_id")?.Value;
}
private static string? ResolveActorId(HttpContext context)
{
var user = context.User;
return user?.FindFirst(System.Security.Claims.ClaimTypes.NameIdentifier)?.Value
?? user?.FindFirst("sub")?.Value;
}
}
internal sealed record CreateCvssReceiptRequest(
string VulnerabilityId,
CvssPolicy Policy,
CvssBaseMetrics BaseMetrics,
CvssThreatMetrics? ThreatMetrics,
CvssEnvironmentalMetrics? EnvironmentalMetrics,
CvssSupplementalMetrics? SupplementalMetrics,
IReadOnlyList<CvssEvidenceItem>? Evidence,
EnvelopeKey? SigningKey,
string? CreatedBy,
DateTimeOffset? CreatedAt);
internal sealed record AmendCvssReceiptRequest(
string Field,
string? PreviousValue,
string? NewValue,
string Reason,
string? ReferenceUri,
EnvelopeKey? SigningKey,
string? Actor);
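The request records above imply the JSON bodies a client would POST to `/api/cvss/receipts` and PUT to `/api/cvss/receipts/{id}/amend`. A hedged Python sketch of those payloads (field names follow the records; every value, including the policy hash and metric vector, is hypothetical):

```python
import json

# Illustrative body for CreateCvssReceiptRequest; values are placeholders only.
create_body = {
    "vulnerabilityId": "CVE-2025-0001",
    "policy": {"hash": "sha256:abc...", "name": "default-v4"},
    "baseMetrics": {"AV": "N", "AC": "L", "AT": "N", "PR": "N", "UI": "N"},
    "threatMetrics": None,
    "environmentalMetrics": None,
    "supplementalMetrics": None,
    "evidence": [],
    "signingKey": None,
    "createdBy": "analyst@example.test",
    "createdAt": "2025-12-07T00:00:00Z",
}

# Illustrative body for AmendCvssReceiptRequest.
amend_body = {
    "field": "environmentalMetrics.MAV",
    "previousValue": "N",
    "newValue": "A",
    "reason": "Asset is reachable only from the admin VLAN.",
    "referenceUri": None,
    "signingKey": None,
    "actor": "analyst@example.test",
}

# Serialising with sorted keys and no whitespace keeps the payload byte-stable,
# which matters when receipts are hashed deterministically server-side.
canonical = json.dumps(create_body, sort_keys=True, separators=(",", ":"))
```

The server rejects a create request whose `policy.hash` is missing, so the policy object (with its deterministic hash) is the one field a client cannot omit.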

View File

@@ -1,15 +1,27 @@
using StellaOps.Policy.Gateway.Contracts;
using StellaOps.Policy.Gateway.Infrastructure;
using StellaOps.Policy.Scoring;
using StellaOps.Policy.Scoring.Receipts;
namespace StellaOps.Policy.Gateway.Clients;
internal interface IPolicyEngineClient
{
Task<PolicyEngineResponse<IReadOnlyList<PolicyPackSummaryDto>>> ListPolicyPacksAsync(GatewayForwardingContext? forwardingContext, CancellationToken cancellationToken);
Task<PolicyEngineResponse<PolicyPackDto>> CreatePolicyPackAsync(GatewayForwardingContext? forwardingContext, CreatePolicyPackRequest request, CancellationToken cancellationToken);
Task<PolicyEngineResponse<PolicyRevisionDto>> CreatePolicyRevisionAsync(GatewayForwardingContext? forwardingContext, string packId, CreatePolicyRevisionRequest request, CancellationToken cancellationToken);
Task<PolicyEngineResponse<PolicyRevisionActivationDto>> ActivatePolicyRevisionAsync(GatewayForwardingContext? forwardingContext, string packId, int version, ActivatePolicyRevisionRequest request, CancellationToken cancellationToken);
Task<PolicyEngineResponse<CvssScoreReceipt>> CreateCvssReceiptAsync(GatewayForwardingContext? forwardingContext, CreateCvssReceiptRequest request, CancellationToken cancellationToken);
Task<PolicyEngineResponse<CvssScoreReceipt>> GetCvssReceiptAsync(GatewayForwardingContext? forwardingContext, string receiptId, CancellationToken cancellationToken);
Task<PolicyEngineResponse<CvssScoreReceipt>> AmendCvssReceiptAsync(GatewayForwardingContext? forwardingContext, string receiptId, AmendCvssReceiptRequest request, CancellationToken cancellationToken);
Task<PolicyEngineResponse<IReadOnlyList<ReceiptHistoryEntry>>> GetCvssReceiptHistoryAsync(GatewayForwardingContext? forwardingContext, string receiptId, CancellationToken cancellationToken);
Task<PolicyEngineResponse<IReadOnlyList<CvssPolicy>>> ListCvssPoliciesAsync(GatewayForwardingContext? forwardingContext, CancellationToken cancellationToken);
}

View File

@@ -5,13 +5,15 @@ using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Policy.Gateway.Contracts;
using StellaOps.Policy.Gateway.Infrastructure;
using StellaOps.Policy.Gateway.Options;
using StellaOps.Policy.Gateway.Services;
using StellaOps.Policy.Scoring;
using StellaOps.Policy.Scoring.Receipts;
namespace StellaOps.Policy.Gateway.Clients;
@@ -85,18 +87,73 @@ internal sealed class PolicyEngineClient : IPolicyEngineClient
request,
cancellationToken);
public Task<PolicyEngineResponse<PolicyRevisionActivationDto>> ActivatePolicyRevisionAsync(
GatewayForwardingContext? forwardingContext,
string packId,
int version,
ActivatePolicyRevisionRequest request,
CancellationToken cancellationToken)
=> SendAsync<PolicyRevisionActivationDto>(
HttpMethod.Post,
$"api/policy/packs/{Uri.EscapeDataString(packId)}/revisions/{version}:activate",
forwardingContext,
request,
cancellationToken);
public Task<PolicyEngineResponse<CvssScoreReceipt>> CreateCvssReceiptAsync(
GatewayForwardingContext? forwardingContext,
CreateCvssReceiptRequest request,
CancellationToken cancellationToken)
=> SendAsync<CvssScoreReceipt>(
HttpMethod.Post,
"api/cvss/receipts",
forwardingContext,
request,
cancellationToken);
public Task<PolicyEngineResponse<CvssScoreReceipt>> GetCvssReceiptAsync(
GatewayForwardingContext? forwardingContext,
string receiptId,
CancellationToken cancellationToken)
=> SendAsync<CvssScoreReceipt>(
HttpMethod.Get,
$"api/cvss/receipts/{Uri.EscapeDataString(receiptId)}",
forwardingContext,
content: null,
cancellationToken);
public Task<PolicyEngineResponse<CvssScoreReceipt>> AmendCvssReceiptAsync(
GatewayForwardingContext? forwardingContext,
string receiptId,
AmendCvssReceiptRequest request,
CancellationToken cancellationToken)
=> SendAsync<CvssScoreReceipt>(
HttpMethod.Put,
$"api/cvss/receipts/{Uri.EscapeDataString(receiptId)}/amend",
forwardingContext,
request,
cancellationToken);
public Task<PolicyEngineResponse<IReadOnlyList<ReceiptHistoryEntry>>> GetCvssReceiptHistoryAsync(
GatewayForwardingContext? forwardingContext,
string receiptId,
CancellationToken cancellationToken)
=> SendAsync<IReadOnlyList<ReceiptHistoryEntry>>(
HttpMethod.Get,
$"api/cvss/receipts/{Uri.EscapeDataString(receiptId)}/history",
forwardingContext,
content: null,
cancellationToken);
public Task<PolicyEngineResponse<IReadOnlyList<CvssPolicy>>> ListCvssPoliciesAsync(
GatewayForwardingContext? forwardingContext,
CancellationToken cancellationToken)
=> SendAsync<IReadOnlyList<CvssPolicy>>(
HttpMethod.Get,
"api/cvss/policies",
forwardingContext,
content: null,
cancellationToken);
private async Task<PolicyEngineResponse<TSuccess>> SendAsync<TSuccess>(
HttpMethod method,

View File

@@ -0,0 +1,33 @@
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using StellaOps.Attestor.Envelope;
using StellaOps.Policy.Scoring;
using StellaOps.Policy.Scoring.Receipts;
namespace StellaOps.Policy.Gateway.Contracts;
public sealed record CreateCvssReceiptRequest(
[Required] string VulnerabilityId,
[Required] CvssPolicy Policy,
[Required] CvssBaseMetrics BaseMetrics,
CvssThreatMetrics? ThreatMetrics,
CvssEnvironmentalMetrics? EnvironmentalMetrics,
CvssSupplementalMetrics? SupplementalMetrics,
IReadOnlyList<CvssEvidenceItem>? Evidence,
EnvelopeKey? SigningKey,
string? CreatedBy,
DateTimeOffset? CreatedAt);
public sealed record AmendCvssReceiptRequest(
[Required] string Field,
string? PreviousValue,
string? NewValue,
[Required] string Reason,
string? ReferenceUri,
EnvelopeKey? SigningKey,
string? Actor);
public sealed record CvssReceiptHistoryResponse(
string ReceiptId,
IReadOnlyList<ReceiptHistoryEntry> History);
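The `[Required]` annotations on the gateway contracts above mean model binding rejects a create request without `VulnerabilityId`, `Policy`, and `BaseMetrics`, and an amend request without `Field` and `Reason`. A small Python sketch of that validation (the validator itself is hypothetical; only the field names come from the records):

```python
# Required fields mirror the [Required] attributes on the gateway contracts.
CREATE_REQUIRED = ("vulnerabilityId", "policy", "baseMetrics")
AMEND_REQUIRED = ("field", "reason")

def missing_fields(body: dict, required: tuple) -> list:
    # Treat absent, None, and empty-string values as missing, like DataAnnotations does.
    return [name for name in required if body.get(name) in (None, "")]
```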

View File

@@ -279,11 +279,11 @@ policyPacks.MapPost("/{packId}/revisions", async Task<IResult> (
})
.RequireAuthorization(policy => policy.RequireStellaOpsScopes(StellaOpsScopes.PolicyAuthor));
policyPacks.MapPost("/{packId}/revisions/{version:int}:activate", async Task<IResult> (
HttpContext context,
string packId,
int version,
ActivatePolicyRevisionRequest request,
IPolicyEngineClient client,
PolicyEngineTokenProvider tokenProvider,
PolicyGatewayMetrics metrics,
@@ -330,13 +330,144 @@ policyPacks.MapPost("/{packId}/revisions/{version:int}:activate", async Task<IRe
var logger = loggerFactory.CreateLogger("StellaOps.Policy.Gateway.Activation");
LogActivation(logger, packId, version, outcome, source, response.StatusCode);
return response.ToMinimalResult();
})
.RequireAuthorization(policy => policy.RequireStellaOpsScopes(
StellaOpsScopes.PolicyOperate,
StellaOpsScopes.PolicyActivate));
var cvss = app.MapGroup("/api/cvss")
.WithTags("CVSS Receipts");
cvss.MapPost("/receipts", async Task<IResult>(
HttpContext context,
CreateCvssReceiptRequest request,
IPolicyEngineClient client,
PolicyEngineTokenProvider tokenProvider,
CancellationToken cancellationToken) =>
{
if (request is null)
{
return Results.BadRequest(new ProblemDetails
{
Title = "Request body required.",
Status = StatusCodes.Status400BadRequest
});
}
GatewayForwardingContext? forwardingContext = null;
if (GatewayForwardingContext.TryCreate(context, out var callerContext))
{
forwardingContext = callerContext;
}
else if (!tokenProvider.IsEnabled)
{
return Results.Unauthorized();
}
var response = await client.CreateCvssReceiptAsync(forwardingContext, request, cancellationToken).ConfigureAwait(false);
return response.ToMinimalResult();
})
.RequireAuthorization(policy => policy.RequireStellaOpsScopes(StellaOpsScopes.PolicyRun));
cvss.MapGet("/receipts/{receiptId}", async Task<IResult>(
HttpContext context,
string receiptId,
IPolicyEngineClient client,
PolicyEngineTokenProvider tokenProvider,
CancellationToken cancellationToken) =>
{
GatewayForwardingContext? forwardingContext = null;
if (GatewayForwardingContext.TryCreate(context, out var callerContext))
{
forwardingContext = callerContext;
}
else if (!tokenProvider.IsEnabled)
{
return Results.Unauthorized();
}
var response = await client.GetCvssReceiptAsync(forwardingContext, receiptId, cancellationToken).ConfigureAwait(false);
return response.ToMinimalResult();
})
.RequireAuthorization(policy => policy.RequireStellaOpsScopes(StellaOpsScopes.FindingsRead));
cvss.MapPut("/receipts/{receiptId}/amend", async Task<IResult>(
HttpContext context,
string receiptId,
AmendCvssReceiptRequest request,
IPolicyEngineClient client,
PolicyEngineTokenProvider tokenProvider,
CancellationToken cancellationToken) =>
{
if (request is null)
{
return Results.BadRequest(new ProblemDetails
{
Title = "Request body required.",
Status = StatusCodes.Status400BadRequest
});
}
GatewayForwardingContext? forwardingContext = null;
if (GatewayForwardingContext.TryCreate(context, out var callerContext))
{
forwardingContext = callerContext;
}
else if (!tokenProvider.IsEnabled)
{
return Results.Unauthorized();
}
var response = await client.AmendCvssReceiptAsync(forwardingContext, receiptId, request, cancellationToken).ConfigureAwait(false);
return response.ToMinimalResult();
})
.RequireAuthorization(policy => policy.RequireStellaOpsScopes(StellaOpsScopes.PolicyRun));
cvss.MapGet("/receipts/{receiptId}/history", async Task<IResult>(
HttpContext context,
string receiptId,
IPolicyEngineClient client,
PolicyEngineTokenProvider tokenProvider,
CancellationToken cancellationToken) =>
{
GatewayForwardingContext? forwardingContext = null;
if (GatewayForwardingContext.TryCreate(context, out var callerContext))
{
forwardingContext = callerContext;
}
else if (!tokenProvider.IsEnabled)
{
return Results.Unauthorized();
}
var response = await client.GetCvssReceiptHistoryAsync(forwardingContext, receiptId, cancellationToken).ConfigureAwait(false);
return response.ToMinimalResult();
})
.RequireAuthorization(policy => policy.RequireStellaOpsScopes(StellaOpsScopes.FindingsRead));
cvss.MapGet("/policies", async Task<IResult>(
HttpContext context,
IPolicyEngineClient client,
PolicyEngineTokenProvider tokenProvider,
CancellationToken cancellationToken) =>
{
GatewayForwardingContext? forwardingContext = null;
if (GatewayForwardingContext.TryCreate(context, out var callerContext))
{
forwardingContext = callerContext;
}
else if (!tokenProvider.IsEnabled)
{
return Results.Unauthorized();
}
var response = await client.ListCvssPoliciesAsync(forwardingContext, cancellationToken).ConfigureAwait(false);
return response.ToMinimalResult();
})
.RequireAuthorization(policy => policy.RequireStellaOpsScopes(StellaOpsScopes.FindingsRead));
app.Run();
static IAsyncPolicy<HttpResponseMessage> CreateAuthorityRetryPolicy(IServiceProvider provider)
{

View File

@@ -16,6 +16,7 @@
<ProjectReference Include="../../Authority/StellaOps.Authority/StellaOps.Auth.Client/StellaOps.Auth.Client.csproj" />
<ProjectReference Include="../../Authority/StellaOps.Authority/StellaOps.Auth.ServerIntegration/StellaOps.Auth.ServerIntegration.csproj" />
<ProjectReference Include="../../AirGap/StellaOps.AirGap.Policy/StellaOps.AirGap.Policy/StellaOps.AirGap.Policy.csproj" />
<ProjectReference Include="../StellaOps.Policy.Scoring/StellaOps.Policy.Scoring.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Http.Polly" Version="10.0.0" />

View File

@@ -12,7 +12,7 @@ import jsPDF from './jspdf.stub';
imports: [CommonModule],
changeDetection: ChangeDetectionStrategy.OnPush,
template: `
<section class="expl" aria-busy="{{ loading }}">
<section class="expl" [attr.aria-busy]="loading">
<header class="expl__header" *ngIf="result">
<div>
<p class="expl__eyebrow">Policy Studio · Explain</p>

View File

@@ -9,7 +9,7 @@ import { ActivatedRoute } from '@angular/router';
imports: [CommonModule, ReactiveFormsModule],
changeDetection: ChangeDetectionStrategy.OnPush,
template: `
<section class="rb" aria-busy="false">
<section class="rb" [attr.aria-busy]="false">
<header class="rb__header">
<div>
<p class="rb__eyebrow">Policy Studio · Rule Builder</p>

View File

@@ -7,7 +7,7 @@
</PropertyGroup>
<ItemGroup>
<PackageReference Include="MongoDB.Driver" Version="3.5.0" />
<ProjectReference Include="..\..\Concelier\__Libraries\StellaOps.Concelier.Models\StellaOps.Concelier.Models.csproj" />
</ItemGroup>
</Project>