Rename Concelier Source modules to Connector

This commit is contained in:
master
2025-10-18 20:11:18 +03:00
parent 89ede53cc3
commit 052da7a7d0
789 changed files with 1489 additions and 1489 deletions


@@ -0,0 +1,27 @@
# AGENTS
## Role
Red Hat distro connector (Security Data API and OVAL) providing authoritative OS package ranges (RPM NEVRA) and RHSA metadata; overrides generic registry ranges during merge.
## Scope
- Fetch Security Data JSON (for example CSAF/CVRF) via Hydra; window requests by the `last_modified`/`after` cursor; optionally ingest OVAL definitions.
- Validate payloads; parse advisories, CVEs, affected packages; materialize NEVRA and CPE records.
- Map to canonical advisories with affected Type=rpm/cpe, fixedBy NEVRA, RHSA aliasing; persist provenance indicating oval/package.nevra.
## Participants
- Connector.Common (HTTP, throttling, validators).
- Storage.Mongo (document, dto, advisory, alias, affected, reference, source_state).
- Models (canonical Affected with NEVRA).
- Core/WebService job kinds (`source:redhat:fetch|parse|map`) are already registered.
- Merge engine to enforce distro precedence (OVAL/PSIRT outranks NVD).
## Interfaces & contracts
- Aliases: RHSA-YYYY:NNNN, CVE ids; references include RHSA pages, errata, OVAL links.
- Affected: rpm (Identifier=NEVRA key) and cpe entries; versions include introduced/fixed/fixedBy; platforms mark RHEL streams.
- Provenance: kind="oval" or "package.nevra" as applicable; value=definition id or package.
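The rpm contract above keys affected entries by NEVRA (`name-epoch:version-release.arch`). As a rough illustration of how such a key decomposes (this is a hypothetical helper, not the connector's actual parser):

```csharp
using System;

// Hypothetical sketch: split a NEVRA string into its parts by scanning from
// the right (arch, release, version) so dashes inside the name survive.
static (string Name, int Epoch, string Version, string Release, string Arch) ParseNevra(string nevra)
{
    // Architecture is the suffix after the last '.', e.g. "x86_64".
    var archDot = nevra.LastIndexOf('.');
    var arch = nevra[(archDot + 1)..];
    var rest = nevra[..archDot];

    // Release is the segment after the last '-'.
    var relDash = rest.LastIndexOf('-');
    var release = rest[(relDash + 1)..];
    rest = rest[..relDash];

    // Version (with optional "epoch:" prefix) follows the next '-' from the right.
    var verDash = rest.LastIndexOf('-');
    var evr = rest[(verDash + 1)..];
    var name = rest[..verDash];

    var colon = evr.IndexOf(':');
    var epoch = colon > 0 ? int.Parse(evr[..colon]) : 0;
    var version = colon > 0 ? evr[(colon + 1)..] : evr;

    return (name, epoch, version, release, arch);
}

var parsed = ParseNevra("kernel-0:4.18.0-513.5.1.el8_9.x86_64");
Console.WriteLine($"{parsed.Name} {parsed.Epoch} {parsed.Version} {parsed.Release} {parsed.Arch}");
// kernel 0 4.18.0 513.5.1.el8_9 x86_64
```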
## In/Out of scope
In: authoritative rpm ranges, RHSA mapping, OVAL interpretation, watermarking.
Out: building RPM artifacts; cross-distro reconciliation beyond Red Hat.
## Observability & security expectations
- Metrics: SourceDiagnostics publishes `concelier.source.http.*` counters/histograms tagged `concelier.source=redhat`, capturing fetch volumes, parse/OVAL failures, and map affected counts without bespoke metric names.
- Logs: cursor bounds, advisory ids, NEVRA counts; allowlist Red Hat endpoints.
## Tests
- Author and review coverage in `../StellaOps.Concelier.Connector.Distro.RedHat.Tests`.
- Shared fixtures (e.g., `MongoIntegrationFixture`, `ConnectorTestHarness`) live in `../StellaOps.Concelier.Testing`.
- Keep fixtures deterministic; match new cases to real-world advisories or regression scenarios.


@@ -0,0 +1,25 @@
# RHSA Fixture Diffs for Conflict Resolver (Sprint 1)
_Status date: 2025-10-11_
The Red Hat connector fixtures were re-baselined after the model helper rollout so that the conflict resolver receives the canonical payload shape expected for range reconciliation.
## Key schema deltas
- `affectedPackages[]` now emits the `type` field ahead of the identifier and always carries a `normalizedVersions` array (empty for NEVRA/CPE today) alongside existing `versionRanges`.
- All nested `provenance` objects (package ranges, statuses, advisory-level metadata, references) now serialize in canonical order `source`, `kind`, `value`, `decisionReason`, `recordedAt`, `fieldMask` to align with `AdvisoryProvenance` equality used by the conflict resolver.
- `decisionReason` is now always present (serialized as `null`) on provenance payloads so future precedence decisions can annotate overrides without another fixture bump.
## Impact on conflict resolver
- Range merge logic must accept an optional `normalizedVersions` array even when it is empty; RPM reconciliation continues to rely on NEVRA primitives (`rangeKind: "nevra"`).
- Provenance comparisons should treat the new property ordering and `decisionReason` field as canonical; older snapshots that lacked these fields are obsolete.
- Advisory/reference provenance now matches the structure that merge emits, so deterministic hashing of resolver inputs will remain stable across connectors.
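Concretely, a provenance object in the re-baselined fixtures serializes in the canonical order described above; the values below are illustrative, not taken from a real snapshot:

```json
{
  "source": "redhat",
  "kind": "package.nevra",
  "value": "kernel-0:4.18.0-513.5.1.el8_9.x86_64",
  "decisionReason": null,
  "recordedAt": "2025-10-11T00:00:00+00:00",
  "fieldMask": []
}
```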
## Updated goldens
- `src/StellaOps.Concelier.Connector.Distro.RedHat.Tests/RedHat/Fixtures/rhsa-2025-0001.snapshot.json`
- `src/StellaOps.Concelier.Connector.Distro.RedHat.Tests/RedHat/Fixtures/rhsa-2025-0002.snapshot.json`
- `src/StellaOps.Concelier.Connector.Distro.RedHat.Tests/RedHat/Fixtures/rhsa-2025-0003.snapshot.json`
Keep these notes in sync with any future provenance or normalized-rule updates so the conflict resolver team can reason about fixture-driven regressions.


@@ -0,0 +1,97 @@
namespace StellaOps.Concelier.Connector.Distro.RedHat.Configuration;
public sealed class RedHatOptions
{
/// <summary>
/// Name of the HttpClient registered for Red Hat Hydra requests.
/// </summary>
public const string HttpClientName = "redhat-hydra";
/// <summary>
/// Base API endpoint for Hydra security data requests.
/// </summary>
public Uri BaseEndpoint { get; set; } = new("https://access.redhat.com/hydra/rest/securitydata");
/// <summary>
/// Relative path for the advisory listing endpoint (returns summary rows with resource_url values).
/// </summary>
public string SummaryPath { get; set; } = "csaf.json";
/// <summary>
/// Number of summary rows requested per page when scanning for new advisories.
/// </summary>
public int PageSize { get; set; } = 200;
/// <summary>
/// Maximum number of summary pages to inspect within one fetch invocation.
/// </summary>
public int MaxPagesPerFetch { get; set; } = 5;
/// <summary>
/// Upper bound on individual advisories fetched per invocation (guards against unbounded catch-up floods).
/// </summary>
public int MaxAdvisoriesPerFetch { get; set; } = 800;
/// <summary>
/// Initial look-back window applied when no watermark exists (Red Hat publishes extensive history; we default to 30 days).
/// </summary>
public TimeSpan InitialBackfill { get; set; } = TimeSpan.FromDays(30);
/// <summary>
/// Optional overlap period re-scanned on each run to pick up late-published advisories.
/// </summary>
public TimeSpan Overlap { get; set; } = TimeSpan.FromDays(1);
/// <summary>
/// Timeout applied to individual Hydra document fetches.
/// </summary>
public TimeSpan FetchTimeout { get; set; } = TimeSpan.FromSeconds(60);
/// <summary>
/// Custom user-agent presented to Red Hat endpoints (kept short to satisfy Jetty header limits).
/// </summary>
public string UserAgent { get; set; } = "StellaOps.Concelier.RedHat/1.0";
public void Validate()
{
if (BaseEndpoint is null || !BaseEndpoint.IsAbsoluteUri)
{
throw new InvalidOperationException("Red Hat Hydra base endpoint must be an absolute URI.");
}
if (string.IsNullOrWhiteSpace(SummaryPath))
{
throw new InvalidOperationException("Red Hat Hydra summary path must be configured.");
}
if (PageSize <= 0)
{
throw new InvalidOperationException("Red Hat Hydra page size must be positive.");
}
if (MaxPagesPerFetch <= 0)
{
throw new InvalidOperationException("Red Hat Hydra max pages per fetch must be positive.");
}
if (MaxAdvisoriesPerFetch <= 0)
{
throw new InvalidOperationException("Red Hat Hydra max advisories per fetch must be positive.");
}
if (InitialBackfill <= TimeSpan.Zero)
{
throw new InvalidOperationException("Red Hat Hydra initial backfill must be positive.");
}
if (Overlap < TimeSpan.Zero)
{
throw new InvalidOperationException("Red Hat Hydra overlap cannot be negative.");
}
if (FetchTimeout <= TimeSpan.Zero)
{
throw new InvalidOperationException("Red Hat Hydra fetch timeout must be positive.");
}
}
}
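A minimal usage sketch of the class above (the configured value is hypothetical) showing how `Validate()` fails fast before any fetch work starts:

```csharp
using System;

// Illustrative only: an out-of-range setting should surface immediately.
var options = new RedHatOptions
{
    PageSize = 0, // invalid: must be positive
};

try
{
    options.Validate();
    Console.WriteLine("options valid");
}
catch (InvalidOperationException ex)
{
    Console.WriteLine(ex.Message); // Red Hat Hydra page size must be positive.
}
```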


@@ -0,0 +1,177 @@
using System;
using System.Collections.Generic;
using System.Text.Json.Serialization;
namespace StellaOps.Concelier.Connector.Distro.RedHat.Internal.Models;
internal sealed class RedHatCsafEnvelope
{
[JsonPropertyName("document")]
public RedHatDocumentSection? Document { get; init; }
[JsonPropertyName("product_tree")]
public RedHatProductTree? ProductTree { get; init; }
[JsonPropertyName("vulnerabilities")]
public IReadOnlyList<RedHatVulnerability>? Vulnerabilities { get; init; }
}
internal sealed class RedHatDocumentSection
{
[JsonPropertyName("aggregate_severity")]
public RedHatAggregateSeverity? AggregateSeverity { get; init; }
[JsonPropertyName("lang")]
public string? Lang { get; init; }
[JsonPropertyName("notes")]
public IReadOnlyList<RedHatDocumentNote>? Notes { get; init; }
[JsonPropertyName("references")]
public IReadOnlyList<RedHatReference>? References { get; init; }
[JsonPropertyName("title")]
public string? Title { get; init; }
[JsonPropertyName("tracking")]
public RedHatTracking? Tracking { get; init; }
}
internal sealed class RedHatAggregateSeverity
{
[JsonPropertyName("text")]
public string? Text { get; init; }
}
internal sealed class RedHatDocumentNote
{
[JsonPropertyName("category")]
public string? Category { get; init; }
[JsonPropertyName("text")]
public string? Text { get; init; }
public bool CategoryEquals(string value)
=> !string.IsNullOrWhiteSpace(Category)
&& string.Equals(Category, value, StringComparison.OrdinalIgnoreCase);
}
internal sealed class RedHatTracking
{
[JsonPropertyName("id")]
public string? Id { get; init; }
[JsonPropertyName("initial_release_date")]
public DateTimeOffset? InitialReleaseDate { get; init; }
[JsonPropertyName("current_release_date")]
public DateTimeOffset? CurrentReleaseDate { get; init; }
}
internal sealed class RedHatReference
{
[JsonPropertyName("category")]
public string? Category { get; init; }
[JsonPropertyName("summary")]
public string? Summary { get; init; }
[JsonPropertyName("url")]
public string? Url { get; init; }
}
internal sealed class RedHatProductTree
{
[JsonPropertyName("branches")]
public IReadOnlyList<RedHatProductBranch>? Branches { get; init; }
}
internal sealed class RedHatProductBranch
{
[JsonPropertyName("category")]
public string? Category { get; init; }
[JsonPropertyName("name")]
public string? Name { get; init; }
[JsonPropertyName("product")]
public RedHatProductNodeInfo? Product { get; init; }
[JsonPropertyName("branches")]
public IReadOnlyList<RedHatProductBranch>? Branches { get; init; }
}
internal sealed class RedHatProductNodeInfo
{
[JsonPropertyName("name")]
public string? Name { get; init; }
[JsonPropertyName("product_id")]
public string? ProductId { get; init; }
[JsonPropertyName("product_identification_helper")]
public RedHatProductIdentificationHelper? ProductIdentificationHelper { get; init; }
}
internal sealed class RedHatProductIdentificationHelper
{
[JsonPropertyName("cpe")]
public string? Cpe { get; init; }
[JsonPropertyName("purl")]
public string? Purl { get; init; }
}
internal sealed class RedHatVulnerability
{
[JsonPropertyName("cve")]
public string? Cve { get; init; }
[JsonPropertyName("references")]
public IReadOnlyList<RedHatReference>? References { get; init; }
[JsonPropertyName("scores")]
public IReadOnlyList<RedHatVulnerabilityScore>? Scores { get; init; }
[JsonPropertyName("product_status")]
public RedHatProductStatus? ProductStatus { get; init; }
}
internal sealed class RedHatVulnerabilityScore
{
[JsonPropertyName("cvss_v3")]
public RedHatCvssV3? CvssV3 { get; init; }
}
internal sealed class RedHatCvssV3
{
[JsonPropertyName("baseScore")]
public double? BaseScore { get; init; }
[JsonPropertyName("baseSeverity")]
public string? BaseSeverity { get; init; }
[JsonPropertyName("vectorString")]
public string? VectorString { get; init; }
[JsonPropertyName("version")]
public string? Version { get; init; }
}
internal sealed class RedHatProductStatus
{
[JsonPropertyName("fixed")]
public IReadOnlyList<string>? Fixed { get; init; }
[JsonPropertyName("first_fixed")]
public IReadOnlyList<string>? FirstFixed { get; init; }
[JsonPropertyName("known_affected")]
public IReadOnlyList<string>? KnownAffected { get; init; }
[JsonPropertyName("known_not_affected")]
public IReadOnlyList<string>? KnownNotAffected { get; init; }
[JsonPropertyName("under_investigation")]
public IReadOnlyList<string>? UnderInvestigation { get; init; }
}
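A deserialization sketch for the envelope types above, using a trimmed, hypothetical CSAF fragment (not a real advisory) and assuming same-assembly access to the internal models:

```csharp
using System;
using System.Text.Json;

// Property names map via the [JsonPropertyName] attributes on the models.
const string json = """
{
  "document": {
    "title": "Important: kernel security update",
    "tracking": { "id": "RHSA-2025:0001", "initial_release_date": "2025-01-07T00:00:00Z" }
  },
  "vulnerabilities": [
    { "cve": "CVE-2025-0001", "product_status": { "fixed": ["AppStream-9.4:kernel-0:5.14.0-1.el9.x86_64"] } }
  ]
}
""";

var envelope = JsonSerializer.Deserialize<RedHatCsafEnvelope>(json,
    new JsonSerializerOptions { PropertyNameCaseInsensitive = true });

Console.WriteLine(envelope?.Document?.Tracking?.Id);  // RHSA-2025:0001
Console.WriteLine(envelope?.Vulnerabilities?[0].Cve); // CVE-2025-0001
```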


@@ -0,0 +1,254 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
namespace StellaOps.Concelier.Connector.Distro.RedHat.Internal;
internal sealed record RedHatCursor(
DateTimeOffset? LastReleasedOn,
IReadOnlyCollection<string> ProcessedAdvisoryIds,
IReadOnlyCollection<Guid> PendingDocuments,
IReadOnlyCollection<Guid> PendingMappings,
IReadOnlyDictionary<string, RedHatCachedFetchMetadata> FetchCache)
{
private static readonly IReadOnlyCollection<string> EmptyStringList = Array.Empty<string>();
private static readonly IReadOnlyCollection<Guid> EmptyGuidList = Array.Empty<Guid>();
private static readonly IReadOnlyDictionary<string, RedHatCachedFetchMetadata> EmptyCache =
new Dictionary<string, RedHatCachedFetchMetadata>(StringComparer.OrdinalIgnoreCase);
public static RedHatCursor Empty { get; } = new(null, EmptyStringList, EmptyGuidList, EmptyGuidList, EmptyCache);
public static RedHatCursor FromBsonDocument(BsonDocument? document)
{
if (document is null || document.ElementCount == 0)
{
return Empty;
}
DateTimeOffset? lastReleased = null;
if (document.TryGetValue("lastReleasedOn", out var lastReleasedValue))
{
lastReleased = ReadDateTimeOffset(lastReleasedValue);
}
var processed = ReadStringSet(document, "processedAdvisories");
var pendingDocuments = ReadGuidSet(document, "pendingDocuments");
var pendingMappings = ReadGuidSet(document, "pendingMappings");
var fetchCache = ReadFetchCache(document);
return new RedHatCursor(lastReleased, processed, pendingDocuments, pendingMappings, fetchCache);
}
public BsonDocument ToBsonDocument()
{
var document = new BsonDocument();
if (LastReleasedOn.HasValue)
{
document["lastReleasedOn"] = LastReleasedOn.Value.UtcDateTime;
}
document["processedAdvisories"] = new BsonArray(ProcessedAdvisoryIds);
document["pendingDocuments"] = new BsonArray(PendingDocuments.Select(id => id.ToString()));
document["pendingMappings"] = new BsonArray(PendingMappings.Select(id => id.ToString()));
var cacheArray = new BsonArray();
foreach (var (key, metadata) in FetchCache)
{
var cacheDoc = new BsonDocument
{
["uri"] = key
};
if (!string.IsNullOrWhiteSpace(metadata.ETag))
{
cacheDoc["etag"] = metadata.ETag;
}
if (metadata.LastModified.HasValue)
{
cacheDoc["lastModified"] = metadata.LastModified.Value.UtcDateTime;
}
cacheArray.Add(cacheDoc);
}
document["fetchCache"] = cacheArray;
return document;
}
public RedHatCursor WithLastReleased(DateTimeOffset? releasedOn, IEnumerable<string> advisoryIds)
{
var normalizedIds = advisoryIds?.Where(static id => !string.IsNullOrWhiteSpace(id))
.Select(static id => id.Trim())
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray() ?? Array.Empty<string>();
return this with
{
LastReleasedOn = releasedOn,
ProcessedAdvisoryIds = normalizedIds
};
}
public RedHatCursor AddProcessedAdvisories(IEnumerable<string> advisoryIds)
{
if (advisoryIds is null)
{
return this;
}
var set = new HashSet<string>(ProcessedAdvisoryIds, StringComparer.OrdinalIgnoreCase);
foreach (var id in advisoryIds)
{
if (!string.IsNullOrWhiteSpace(id))
{
set.Add(id.Trim());
}
}
return this with { ProcessedAdvisoryIds = set.ToArray() };
}
public RedHatCursor WithPendingDocuments(IEnumerable<Guid> ids)
{
var list = ids?.Distinct().ToArray() ?? Array.Empty<Guid>();
return this with { PendingDocuments = list };
}
public RedHatCursor WithPendingMappings(IEnumerable<Guid> ids)
{
var list = ids?.Distinct().ToArray() ?? Array.Empty<Guid>();
return this with { PendingMappings = list };
}
public RedHatCursor WithFetchCache(string requestUri, string? etag, DateTimeOffset? lastModified)
{
var cache = new Dictionary<string, RedHatCachedFetchMetadata>(FetchCache, StringComparer.OrdinalIgnoreCase)
{
[requestUri] = new RedHatCachedFetchMetadata(etag, lastModified)
};
return this with { FetchCache = cache };
}
public RedHatCursor PruneFetchCache(IEnumerable<string> keepUris)
{
if (FetchCache.Count == 0)
{
return this;
}
var keepSet = new HashSet<string>(keepUris ?? Array.Empty<string>(), StringComparer.OrdinalIgnoreCase);
if (keepSet.Count == 0)
{
return this with { FetchCache = EmptyCache };
}
var cache = new Dictionary<string, RedHatCachedFetchMetadata>(StringComparer.OrdinalIgnoreCase);
foreach (var uri in keepSet)
{
if (FetchCache.TryGetValue(uri, out var metadata))
{
cache[uri] = metadata;
}
}
return this with { FetchCache = cache };
}
public RedHatCachedFetchMetadata? TryGetFetchCache(string requestUri)
{
if (FetchCache.TryGetValue(requestUri, out var metadata))
{
return metadata;
}
return null;
}
private static IReadOnlyCollection<string> ReadStringSet(BsonDocument document, string field)
{
if (!document.TryGetValue(field, out var value) || value is not BsonArray array)
{
return EmptyStringList;
}
var results = new List<string>(array.Count);
foreach (var element in array)
{
if (element.BsonType == BsonType.String)
{
var str = element.AsString.Trim();
if (!string.IsNullOrWhiteSpace(str))
{
results.Add(str);
}
}
}
return results;
}
private static IReadOnlyCollection<Guid> ReadGuidSet(BsonDocument document, string field)
{
if (!document.TryGetValue(field, out var value) || value is not BsonArray array)
{
return EmptyGuidList;
}
var results = new List<Guid>(array.Count);
foreach (var element in array)
{
if (element.BsonType == BsonType.String && Guid.TryParse(element.AsString, out var guid))
{
results.Add(guid);
}
}
return results;
}
private static IReadOnlyDictionary<string, RedHatCachedFetchMetadata> ReadFetchCache(BsonDocument document)
{
if (!document.TryGetValue("fetchCache", out var value) || value is not BsonArray array || array.Count == 0)
{
return EmptyCache;
}
var results = new Dictionary<string, RedHatCachedFetchMetadata>(StringComparer.OrdinalIgnoreCase);
foreach (var element in array.OfType<BsonDocument>())
{
if (!element.TryGetValue("uri", out var uriValue) || uriValue.BsonType != BsonType.String)
{
continue;
}
var uri = uriValue.AsString;
var etag = element.TryGetValue("etag", out var etagValue) && etagValue.BsonType == BsonType.String
? etagValue.AsString
: null;
DateTimeOffset? lastModified = null;
if (element.TryGetValue("lastModified", out var lastModifiedValue))
{
lastModified = ReadDateTimeOffset(lastModifiedValue);
}
results[uri] = new RedHatCachedFetchMetadata(etag, lastModified);
}
return results;
}
private static DateTimeOffset? ReadDateTimeOffset(BsonValue value)
{
return value.BsonType switch
{
BsonType.DateTime => DateTime.SpecifyKind(value.ToUniversalTime(), DateTimeKind.Utc),
BsonType.String when DateTimeOffset.TryParse(value.AsString, out var parsed) => parsed.ToUniversalTime(),
_ => null,
};
}
}
internal sealed record RedHatCachedFetchMetadata(string? ETag, DateTimeOffset? LastModified);
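A round-trip sketch for the cursor above (values hypothetical; requires MongoDB.Bson and same-assembly access): the BSON document written to `source_state` should read back losslessly.

```csharp
using System;

var uri = "https://access.redhat.com/hydra/rest/securitydata/csaf/RHSA-2025:0001.json";

var cursor = RedHatCursor.Empty
    .WithLastReleased(DateTimeOffset.Parse("2025-10-11T00:00:00Z"), new[] { "RHSA-2025:0001" })
    .WithFetchCache(uri, etag: "\"abc123\"", lastModified: null);

// Serialize to BSON and rehydrate, as the fetch job would across runs.
var roundTripped = RedHatCursor.FromBsonDocument(cursor.ToBsonDocument());

Console.WriteLine(roundTripped.LastReleasedOn);             // 2025-10-11 UTC
Console.WriteLine(roundTripped.ProcessedAdvisoryIds.Count); // 1
Console.WriteLine(roundTripped.TryGetFetchCache(uri)?.ETag);
```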


@@ -0,0 +1,758 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Distro.RedHat.Internal.Models;
using StellaOps.Concelier.Normalization.Cvss;
using StellaOps.Concelier.Normalization.Distro;
using StellaOps.Concelier.Normalization.Identifiers;
using StellaOps.Concelier.Normalization.Text;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
namespace StellaOps.Concelier.Connector.Distro.RedHat.Internal;
internal static class RedHatMapper
{
private static readonly JsonSerializerOptions SerializerOptions = new()
{
PropertyNameCaseInsensitive = true,
NumberHandling = JsonNumberHandling.AllowReadingFromString,
};
public static Advisory? Map(string sourceName, DtoRecord dto, DocumentRecord document, JsonDocument payload)
{
ArgumentException.ThrowIfNullOrEmpty(sourceName);
ArgumentNullException.ThrowIfNull(dto);
ArgumentNullException.ThrowIfNull(document);
ArgumentNullException.ThrowIfNull(payload);
var csaf = JsonSerializer.Deserialize<RedHatCsafEnvelope>(payload.RootElement.GetRawText(), SerializerOptions);
var documentSection = csaf?.Document;
if (documentSection is null)
{
return null;
}
var tracking = documentSection.Tracking;
var advisoryKey = NormalizeId(tracking?.Id)
?? NormalizeId(TryGetMetadata(document, "advisoryId"))
?? NormalizeId(document.Uri)
?? dto.DocumentId.ToString();
var title = !string.IsNullOrWhiteSpace(documentSection.Title)
? DescriptionNormalizer.Normalize(new[] { new LocalizedText(documentSection.Title, documentSection.Lang) }).Text
: string.Empty;
if (string.IsNullOrEmpty(title))
{
title = advisoryKey;
}
var description = NormalizeSummary(documentSection);
var summary = string.IsNullOrEmpty(description.Text) ? null : description.Text;
var severity = NormalizeSeverity(documentSection.AggregateSeverity?.Text);
var published = tracking?.InitialReleaseDate;
var modified = tracking?.CurrentReleaseDate ?? published;
var language = description.Language;
var aliases = BuildAliases(advisoryKey, csaf);
var references = BuildReferences(sourceName, dto.ValidatedAt, documentSection, csaf);
var productIndex = RedHatProductIndex.Build(csaf.ProductTree);
var affectedPackages = BuildAffectedPackages(sourceName, dto.ValidatedAt, csaf, productIndex);
var cvssMetrics = BuildCvssMetrics(sourceName, dto.ValidatedAt, advisoryKey, csaf);
var provenance = new[]
{
new AdvisoryProvenance(sourceName, "advisory", advisoryKey, dto.ValidatedAt),
};
return new Advisory(
advisoryKey,
title,
summary,
language,
published,
modified,
severity,
exploitKnown: false,
aliases,
references,
affectedPackages,
cvssMetrics,
provenance);
}
private static IReadOnlyCollection<string> BuildAliases(string advisoryKey, RedHatCsafEnvelope csaf)
{
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
advisoryKey,
};
if (csaf.Vulnerabilities is not null)
{
foreach (var vulnerability in csaf.Vulnerabilities)
{
if (!string.IsNullOrWhiteSpace(vulnerability?.Cve))
{
aliases.Add(vulnerability!.Cve!.Trim());
}
}
}
return aliases;
}
private static NormalizedDescription NormalizeSummary(RedHatDocumentSection documentSection)
{
var summaryNotes = new List<LocalizedText>();
var otherNotes = new List<LocalizedText>();
if (documentSection.Notes is not null)
{
foreach (var note in documentSection.Notes)
{
if (note is null || string.IsNullOrWhiteSpace(note.Text))
{
continue;
}
var candidate = new LocalizedText(note.Text, documentSection.Lang);
if (note.CategoryEquals("summary"))
{
summaryNotes.Add(candidate);
}
else
{
otherNotes.Add(candidate);
}
}
}
var combined = summaryNotes.Count > 0
? summaryNotes.Concat(otherNotes).ToList()
: otherNotes;
return DescriptionNormalizer.Normalize(combined);
}
private static IReadOnlyCollection<AdvisoryReference> BuildReferences(
string sourceName,
DateTimeOffset recordedAt,
RedHatDocumentSection? documentSection,
RedHatCsafEnvelope csaf)
{
var references = new List<AdvisoryReference>();
if (documentSection is not null)
{
AppendReferences(sourceName, recordedAt, documentSection.References, references);
}
if (csaf.Vulnerabilities is not null)
{
foreach (var vulnerability in csaf.Vulnerabilities)
{
AppendReferences(sourceName, recordedAt, vulnerability?.References, references);
}
}
return NormalizeReferences(references);
}
private static void AppendReferences(string sourceName, DateTimeOffset recordedAt, IReadOnlyList<RedHatReference>? items, ICollection<AdvisoryReference> references)
{
if (items is null)
{
return;
}
foreach (var reference in items)
{
if (reference is null || string.IsNullOrWhiteSpace(reference.Url))
{
continue;
}
var url = reference.Url.Trim();
if (!Validation.LooksLikeHttpUrl(url))
{
continue;
}
var provenance = new AdvisoryProvenance(sourceName, "reference", url, recordedAt);
references.Add(new AdvisoryReference(url, reference.Category, null, reference.Summary, provenance));
}
}
private static IReadOnlyCollection<AdvisoryReference> NormalizeReferences(IReadOnlyCollection<AdvisoryReference> references)
{
if (references.Count == 0)
{
return Array.Empty<AdvisoryReference>();
}
var map = new Dictionary<string, AdvisoryReference>(StringComparer.OrdinalIgnoreCase);
foreach (var reference in references)
{
if (!map.TryGetValue(reference.Url, out var existing))
{
map[reference.Url] = reference;
continue;
}
map[reference.Url] = MergeReferences(existing, reference);
}
return map.Values
.OrderBy(static r => r.Kind is null ? 1 : 0)
.ThenBy(static r => r.Kind ?? string.Empty, StringComparer.OrdinalIgnoreCase)
.ThenBy(static r => r.Url, StringComparer.OrdinalIgnoreCase)
.ThenBy(static r => r.SourceTag ?? string.Empty, StringComparer.OrdinalIgnoreCase)
.ToArray();
}
private static AdvisoryReference MergeReferences(AdvisoryReference existing, AdvisoryReference candidate)
{
var kind = existing.Kind ?? candidate.Kind;
var sourceTag = existing.SourceTag ?? candidate.SourceTag;
var summary = ChoosePreferredSummary(existing.Summary, candidate.Summary);
var provenance = existing.Provenance.RecordedAt <= candidate.Provenance.RecordedAt
? existing.Provenance
: candidate.Provenance;
if (kind == existing.Kind
&& sourceTag == existing.SourceTag
&& summary == existing.Summary
&& provenance == existing.Provenance)
{
return existing;
}
if (kind == candidate.Kind
&& sourceTag == candidate.SourceTag
&& summary == candidate.Summary
&& provenance == candidate.Provenance)
{
return candidate;
}
return new AdvisoryReference(existing.Url, kind, sourceTag, summary, provenance);
}
private static string? ChoosePreferredSummary(string? left, string? right)
{
var leftValue = string.IsNullOrWhiteSpace(left) ? null : left;
var rightValue = string.IsNullOrWhiteSpace(right) ? null : right;
if (leftValue is null)
{
return rightValue;
}
if (rightValue is null)
{
return leftValue;
}
return leftValue.Length >= rightValue.Length ? leftValue : rightValue;
}
private static IReadOnlyCollection<AffectedPackage> BuildAffectedPackages(
string sourceName,
DateTimeOffset recordedAt,
RedHatCsafEnvelope csaf,
RedHatProductIndex productIndex)
{
var rpmPackages = new Dictionary<string, RedHatAffectedRpm>(StringComparer.OrdinalIgnoreCase);
var baseProducts = new Dictionary<string, RedHatProductStatusEntry>(StringComparer.OrdinalIgnoreCase);
var knownAffectedByBase = BuildKnownAffectedIndex(csaf);
if (csaf.Vulnerabilities is not null)
{
foreach (var vulnerability in csaf.Vulnerabilities)
{
if (vulnerability?.ProductStatus is null)
{
continue;
}
RegisterAll(vulnerability.ProductStatus.Fixed, RedHatProductStatuses.Fixed, productIndex, rpmPackages, baseProducts);
RegisterAll(vulnerability.ProductStatus.FirstFixed, RedHatProductStatuses.FirstFixed, productIndex, rpmPackages, baseProducts);
RegisterAll(vulnerability.ProductStatus.KnownAffected, RedHatProductStatuses.KnownAffected, productIndex, rpmPackages, baseProducts);
RegisterAll(vulnerability.ProductStatus.KnownNotAffected, RedHatProductStatuses.KnownNotAffected, productIndex, rpmPackages, baseProducts);
RegisterAll(vulnerability.ProductStatus.UnderInvestigation, RedHatProductStatuses.UnderInvestigation, productIndex, rpmPackages, baseProducts);
}
}
var affected = new List<AffectedPackage>(rpmPackages.Count + baseProducts.Count);
foreach (var rpm in rpmPackages.Values)
{
if (rpm.Statuses.Count == 0)
{
continue;
}
var ranges = new List<AffectedVersionRange>();
var statuses = new List<AffectedPackageStatus>();
var provenance = new AdvisoryProvenance(sourceName, "package.nevra", rpm.ProductId ?? rpm.Nevra, recordedAt);
var lastKnownAffected = knownAffectedByBase.TryGetValue(rpm.BaseProductId, out var candidate)
? candidate
: null;
if (!string.IsNullOrWhiteSpace(lastKnownAffected)
&& string.Equals(lastKnownAffected, rpm.Nevra, StringComparison.OrdinalIgnoreCase))
{
lastKnownAffected = null;
}
if (rpm.Statuses.Contains(RedHatProductStatuses.Fixed) || rpm.Statuses.Contains(RedHatProductStatuses.FirstFixed))
{
ranges.Add(new AffectedVersionRange(
"nevra",
introducedVersion: null,
fixedVersion: rpm.Nevra,
lastAffectedVersion: lastKnownAffected,
rangeExpression: null,
provenance: provenance,
primitives: BuildNevraPrimitives(null, rpm.Nevra, lastKnownAffected)));
}
if (!rpm.Statuses.Contains(RedHatProductStatuses.Fixed)
&& !rpm.Statuses.Contains(RedHatProductStatuses.FirstFixed)
&& rpm.Statuses.Contains(RedHatProductStatuses.KnownAffected))
{
ranges.Add(new AffectedVersionRange(
"nevra",
introducedVersion: null,
fixedVersion: null,
lastAffectedVersion: rpm.Nevra,
rangeExpression: null,
provenance: provenance,
primitives: BuildNevraPrimitives(null, null, rpm.Nevra)));
}
if (rpm.Statuses.Contains(RedHatProductStatuses.KnownNotAffected))
{
statuses.Add(new AffectedPackageStatus(RedHatProductStatuses.KnownNotAffected, provenance));
}
if (rpm.Statuses.Contains(RedHatProductStatuses.UnderInvestigation))
{
statuses.Add(new AffectedPackageStatus(RedHatProductStatuses.UnderInvestigation, provenance));
}
if (ranges.Count == 0 && statuses.Count == 0)
{
continue;
}
affected.Add(new AffectedPackage(
AffectedPackageTypes.Rpm,
rpm.Nevra,
rpm.Platform,
ranges,
statuses,
new[] { provenance }));
}
foreach (var baseEntry in baseProducts.Values)
{
if (baseEntry.Statuses.Count == 0)
{
continue;
}
var node = baseEntry.Node;
if (string.IsNullOrWhiteSpace(node.Cpe))
{
continue;
}
if (!IdentifierNormalizer.TryNormalizeCpe(node.Cpe, out var normalizedCpe))
{
continue;
}
var provenance = new AdvisoryProvenance(sourceName, "oval", node.ProductId, recordedAt);
var statuses = new List<AffectedPackageStatus>();
if (baseEntry.Statuses.Contains(RedHatProductStatuses.KnownAffected))
{
statuses.Add(new AffectedPackageStatus(RedHatProductStatuses.KnownAffected, provenance));
}
if (baseEntry.Statuses.Contains(RedHatProductStatuses.KnownNotAffected))
{
statuses.Add(new AffectedPackageStatus(RedHatProductStatuses.KnownNotAffected, provenance));
}
if (baseEntry.Statuses.Contains(RedHatProductStatuses.UnderInvestigation))
{
statuses.Add(new AffectedPackageStatus(RedHatProductStatuses.UnderInvestigation, provenance));
}
if (statuses.Count == 0)
{
continue;
}
affected.Add(new AffectedPackage(
AffectedPackageTypes.Cpe,
normalizedCpe!,
node.Name,
Array.Empty<AffectedVersionRange>(),
statuses,
new[] { provenance }));
}
return affected;
}
private static Dictionary<string, string> BuildKnownAffectedIndex(RedHatCsafEnvelope csaf)
{
var map = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
if (csaf.Vulnerabilities is null)
{
return map;
}
foreach (var vulnerability in csaf.Vulnerabilities)
{
var entries = vulnerability?.ProductStatus?.KnownAffected;
if (entries is null)
{
continue;
}
foreach (var entry in entries)
{
if (string.IsNullOrWhiteSpace(entry))
{
continue;
}
var colonIndex = entry.IndexOf(':');
if (colonIndex <= 0)
{
continue;
}
var baseId = entry[..colonIndex].Trim();
if (string.IsNullOrEmpty(baseId))
{
continue;
}
var candidate = NormalizeNevra(entry[(colonIndex + 1)..]);
if (!string.IsNullOrEmpty(candidate))
{
map[baseId] = candidate;
}
}
}
return map;
}
private static void RegisterAll(
IReadOnlyList<string>? entries,
string status,
RedHatProductIndex productIndex,
IDictionary<string, RedHatAffectedRpm> rpmPackages,
IDictionary<string, RedHatProductStatusEntry> baseProducts)
{
if (entries is null)
{
return;
}
foreach (var entry in entries)
{
RegisterProductStatus(entry, status, productIndex, rpmPackages, baseProducts);
}
}
private static void RegisterProductStatus(
string? rawEntry,
string status,
RedHatProductIndex productIndex,
IDictionary<string, RedHatAffectedRpm> rpmPackages,
IDictionary<string, RedHatProductStatusEntry> baseProducts)
{
if (string.IsNullOrWhiteSpace(rawEntry) || !IsActionableStatus(status))
{
return;
}
var entry = rawEntry.Trim();
var colonIndex = entry.IndexOf(':');
if (colonIndex <= 0 || colonIndex == entry.Length - 1)
{
if (productIndex.TryGetValue(entry, out var baseOnly))
{
var aggregate = baseProducts.TryGetValue(baseOnly.ProductId, out var existing)
? existing
: new RedHatProductStatusEntry(baseOnly);
aggregate.Statuses.Add(status);
baseProducts[baseOnly.ProductId] = aggregate;
}
return;
}
var baseId = entry[..colonIndex];
var packageId = entry[(colonIndex + 1)..];
if (productIndex.TryGetValue(baseId, out var baseNode))
{
var aggregate = baseProducts.TryGetValue(baseNode.ProductId, out var existing)
? existing
: new RedHatProductStatusEntry(baseNode);
aggregate.Statuses.Add(status);
baseProducts[baseNode.ProductId] = aggregate;
}
if (!productIndex.TryGetValue(packageId, out var packageNode))
{
return;
}
var nevra = NormalizeNevra(packageNode.Name ?? packageNode.ProductId);
if (string.IsNullOrEmpty(nevra))
{
return;
}
var platform = baseProducts.TryGetValue(baseId, out var baseEntry)
? baseEntry.Node.Name ?? baseId
: baseId;
var key = string.Join('|', nevra, platform ?? string.Empty);
if (!rpmPackages.TryGetValue(key, out var rpm))
{
rpm = new RedHatAffectedRpm(nevra, baseId, platform, packageNode.ProductId);
rpmPackages[key] = rpm;
}
rpm.Statuses.Add(status);
}
private static bool IsActionableStatus(string status)
{
return status.Equals(RedHatProductStatuses.Fixed, StringComparison.OrdinalIgnoreCase)
|| status.Equals(RedHatProductStatuses.FirstFixed, StringComparison.OrdinalIgnoreCase)
|| status.Equals(RedHatProductStatuses.KnownAffected, StringComparison.OrdinalIgnoreCase)
|| status.Equals(RedHatProductStatuses.KnownNotAffected, StringComparison.OrdinalIgnoreCase)
|| status.Equals(RedHatProductStatuses.UnderInvestigation, StringComparison.OrdinalIgnoreCase);
}
private static IReadOnlyCollection<CvssMetric> BuildCvssMetrics(
string sourceName,
DateTimeOffset recordedAt,
string advisoryKey,
RedHatCsafEnvelope csaf)
{
var metrics = new List<CvssMetric>();
if (csaf.Vulnerabilities is null)
{
return metrics;
}
foreach (var vulnerability in csaf.Vulnerabilities)
{
if (vulnerability?.Scores is null)
{
continue;
}
foreach (var score in vulnerability.Scores)
{
var cvss = score?.CvssV3;
if (cvss is null)
{
continue;
}
if (!CvssMetricNormalizer.TryNormalize(cvss.Version, cvss.VectorString, cvss.BaseScore, cvss.BaseSeverity, out var normalized))
{
continue;
}
var provenance = new AdvisoryProvenance(sourceName, "cvss", vulnerability.Cve ?? advisoryKey, recordedAt);
metrics.Add(normalized.ToModel(provenance));
}
}
return metrics;
}
private static string? NormalizeSeverity(string? value)
{
if (string.IsNullOrWhiteSpace(value))
{
return null;
}
return value.Trim().ToLowerInvariant() switch
{
"critical" => "critical",
"important" => "high",
"moderate" => "medium",
"low" => "low",
"none" => "none",
_ => value.Trim().ToLowerInvariant(),
};
}
private static string? TryGetMetadata(DocumentRecord document, string key)
{
if (document.Metadata is null)
{
return null;
}
return document.Metadata.TryGetValue(key, out var value) && !string.IsNullOrWhiteSpace(value)
? value.Trim()
: null;
}
private static RangePrimitives BuildNevraPrimitives(string? introduced, string? fixedVersion, string? lastAffected)
{
var primitive = new NevraPrimitive(
ParseNevraComponent(introduced),
ParseNevraComponent(fixedVersion),
ParseNevraComponent(lastAffected));
return new RangePrimitives(null, primitive, null, null);
}
private static NevraComponent? ParseNevraComponent(string? value)
{
if (string.IsNullOrWhiteSpace(value))
{
return null;
}
if (!Nevra.TryParse(value, out var parsed) || parsed is null)
{
return null;
}
return new NevraComponent(parsed.Name, parsed.Epoch, parsed.Version, parsed.Release, parsed.Architecture);
}
private static string? NormalizeId(string? value)
=> string.IsNullOrWhiteSpace(value) ? null : value.Trim();
private static string NormalizeNevra(string? value)
{
return string.IsNullOrWhiteSpace(value)
? string.Empty
: value.Trim();
}
}
internal sealed class RedHatAffectedRpm
{
public RedHatAffectedRpm(string nevra, string baseProductId, string? platform, string? productId)
{
Nevra = nevra;
BaseProductId = baseProductId;
Platform = platform;
ProductId = productId;
Statuses = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
}
public string Nevra { get; }
public string BaseProductId { get; }
public string? Platform { get; }
public string? ProductId { get; }
public HashSet<string> Statuses { get; }
}
internal sealed class RedHatProductStatusEntry
{
public RedHatProductStatusEntry(RedHatProductNode node)
{
Node = node;
Statuses = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
}
public RedHatProductNode Node { get; }
public HashSet<string> Statuses { get; }
}
internal static class RedHatProductStatuses
{
public const string Fixed = "fixed";
public const string FirstFixed = "first_fixed";
public const string KnownAffected = "known_affected";
public const string KnownNotAffected = "known_not_affected";
public const string UnderInvestigation = "under_investigation";
}
internal sealed class RedHatProductIndex
{
private readonly Dictionary<string, RedHatProductNode> _products;
private RedHatProductIndex(Dictionary<string, RedHatProductNode> products)
{
_products = products;
}
public static RedHatProductIndex Build(RedHatProductTree? tree)
{
var products = new Dictionary<string, RedHatProductNode>(StringComparer.OrdinalIgnoreCase);
if (tree?.Branches is not null)
{
foreach (var branch in tree.Branches)
{
Traverse(branch, products);
}
}
return new RedHatProductIndex(products);
}
public bool TryGetValue(string productId, out RedHatProductNode node)
=> _products.TryGetValue(productId, out node);
private static void Traverse(RedHatProductBranch? branch, IDictionary<string, RedHatProductNode> products)
{
if (branch is null)
{
return;
}
if (branch.Product is not null && !string.IsNullOrWhiteSpace(branch.Product.ProductId))
{
var id = branch.Product.ProductId.Trim();
products[id] = new RedHatProductNode(
id,
branch.Product.Name ?? branch.Name ?? id,
branch.Product.ProductIdentificationHelper?.Cpe,
branch.Product.ProductIdentificationHelper?.Purl);
}
if (branch.Branches is null)
{
return;
}
foreach (var child in branch.Branches)
{
Traverse(child, products);
}
}
}
internal sealed record RedHatProductNode(string ProductId, string? Name, string? Cpe, string? Purl);


@@ -0,0 +1,66 @@
using System;
using System.Text.Json;
namespace StellaOps.Concelier.Connector.Distro.RedHat.Internal;
internal readonly record struct RedHatSummaryItem(string AdvisoryId, DateTimeOffset ReleasedOn, Uri ResourceUri)
{
private static readonly string[] AdvisoryFields =
{
"RHSA",
"RHBA",
"RHEA",
"RHUI",
"RHBG",
"RHBO",
"advisory"
};
public static bool TryParse(JsonElement element, out RedHatSummaryItem item)
{
item = default;
string? advisoryId = null;
foreach (var field in AdvisoryFields)
{
if (element.TryGetProperty(field, out var advisoryProperty) && advisoryProperty.ValueKind == JsonValueKind.String)
{
var candidate = advisoryProperty.GetString();
if (!string.IsNullOrWhiteSpace(candidate))
{
advisoryId = candidate.Trim();
break;
}
}
}
if (string.IsNullOrWhiteSpace(advisoryId))
{
return false;
}
if (!element.TryGetProperty("released_on", out var releasedProperty) || releasedProperty.ValueKind != JsonValueKind.String)
{
return false;
}
if (!DateTimeOffset.TryParse(releasedProperty.GetString(), out var releasedOn))
{
return false;
}
if (!element.TryGetProperty("resource_url", out var resourceProperty) || resourceProperty.ValueKind != JsonValueKind.String)
{
return false;
}
var resourceValue = resourceProperty.GetString();
if (!Uri.TryCreate(resourceValue, UriKind.Absolute, out var resourceUri))
{
return false;
}
item = new RedHatSummaryItem(advisoryId!, releasedOn.ToUniversalTime(), resourceUri);
return true;
}
}
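A Hydra summary element that `TryParse` accepts has roughly the following shape. The field names (`RHSA`, `released_on`, `resource_url`) mirror the checks above; the concrete values in this sketch are illustrative, not taken from a real feed:

```csharp
using System;
using System.Text.Json;

// Illustrative Hydra summary element; values are made up, but the field
// names match what RedHatSummaryItem.TryParse looks for.
const string sample = """
{
  "RHSA": "RHSA-2025:0001",
  "released_on": "2025-01-07T00:00:00Z",
  "resource_url": "https://access.redhat.com/hydra/rest/securitydata/csaf/RHSA-2025:0001.json"
}
""";

using var doc = JsonDocument.Parse(sample);
var root = doc.RootElement;

// Same extraction order as TryParse: advisory id, release timestamp, resource URL.
var advisoryId = root.GetProperty("RHSA").GetString()!;
var releasedOn = DateTimeOffset.Parse(root.GetProperty("released_on").GetString()!).ToUniversalTime();
var resourceUri = new Uri(root.GetProperty("resource_url").GetString()!, UriKind.Absolute);

Console.WriteLine($"{advisoryId} {releasedOn:O} {resourceUri}");
```

Unlike this sketch, the real `TryParse` fails soft (returns `false`) on any missing or malformed field instead of throwing.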


@@ -0,0 +1,46 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Concelier.Core.Jobs;
namespace StellaOps.Concelier.Connector.Distro.RedHat;
internal static class RedHatJobKinds
{
public const string Fetch = "source:redhat:fetch";
public const string Parse = "source:redhat:parse";
public const string Map = "source:redhat:map";
}
internal sealed class RedHatFetchJob : IJob
{
private readonly RedHatConnector _connector;
public RedHatFetchJob(RedHatConnector connector)
=> _connector = connector ?? throw new ArgumentNullException(nameof(connector));
public Task ExecuteAsync(JobExecutionContext context, CancellationToken cancellationToken)
=> _connector.FetchAsync(context.Services, cancellationToken);
}
internal sealed class RedHatParseJob : IJob
{
private readonly RedHatConnector _connector;
public RedHatParseJob(RedHatConnector connector)
=> _connector = connector ?? throw new ArgumentNullException(nameof(connector));
public Task ExecuteAsync(JobExecutionContext context, CancellationToken cancellationToken)
=> _connector.ParseAsync(context.Services, cancellationToken);
}
internal sealed class RedHatMapJob : IJob
{
private readonly RedHatConnector _connector;
public RedHatMapJob(RedHatConnector connector)
=> _connector = connector ?? throw new ArgumentNullException(nameof(connector));
public Task ExecuteAsync(JobExecutionContext context, CancellationToken cancellationToken)
=> _connector.MapAsync(context.Services, cancellationToken);
}


@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Concelier.Connector.Distro.RedHat.Tests")]


@@ -0,0 +1,434 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Distro.RedHat.Configuration;
using StellaOps.Concelier.Connector.Distro.RedHat.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Distro.RedHat;
public sealed class RedHatConnector : IFeedConnector
{
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly ILogger<RedHatConnector> _logger;
private readonly RedHatOptions _options;
private readonly TimeProvider _timeProvider;
public RedHatConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<RedHatOptions> options,
TimeProvider? timeProvider,
ILogger<RedHatConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => RedHatConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var baseline = cursor.LastReleasedOn ?? now - _options.InitialBackfill;
var overlap = _options.Overlap > TimeSpan.Zero ? _options.Overlap : TimeSpan.Zero;
var afterThreshold = baseline - overlap;
if (afterThreshold < DateTimeOffset.UnixEpoch)
{
afterThreshold = DateTimeOffset.UnixEpoch;
}
ProvenanceDiagnostics.ReportResumeWindow(SourceName, afterThreshold, _logger);
var processedSet = new HashSet<string>(cursor.ProcessedAdvisoryIds, StringComparer.OrdinalIgnoreCase);
var newSummaries = new List<RedHatSummaryItem>();
var stopDueToOlderData = false;
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
for (var page = 1; page <= _options.MaxPagesPerFetch; page++)
{
var summaryUri = BuildSummaryUri(afterThreshold, page);
var summaryKey = summaryUri.ToString();
touchedResources.Add(summaryKey);
var cachedSummary = cursor.TryGetFetchCache(summaryKey);
var summaryMetadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["page"] = page.ToString(CultureInfo.InvariantCulture),
["type"] = "summary"
};
var summaryRequest = new SourceFetchRequest(RedHatOptions.HttpClientName, SourceName, summaryUri)
{
Metadata = summaryMetadata,
ETag = cachedSummary?.ETag,
LastModified = cachedSummary?.LastModified,
TimeoutOverride = _options.FetchTimeout,
};
SourceFetchContentResult summaryResult;
try
{
summaryResult = await _fetchService.FetchContentAsync(summaryRequest, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat Hydra summary fetch failed for {Uri}", summaryUri);
throw;
}
if (summaryResult.IsNotModified)
{
if (page == 1)
{
break;
}
continue;
}
if (!summaryResult.IsSuccess || summaryResult.Content is null)
{
continue;
}
cursor = cursor.WithFetchCache(summaryKey, summaryResult.ETag, summaryResult.LastModified);
using var document = JsonDocument.Parse(summaryResult.Content);
if (document.RootElement.ValueKind != JsonValueKind.Array)
{
_logger.LogWarning(
"Red Hat Hydra summary response had unexpected payload kind {Kind} for {Uri}",
document.RootElement.ValueKind,
summaryUri);
break;
}
var pageCount = 0;
foreach (var element in document.RootElement.EnumerateArray())
{
// Count every element, not just parseable ones, so a page containing a few
// unparseable entries is not mistaken for the final, short page.
pageCount++;
if (!RedHatSummaryItem.TryParse(element, out var summary))
{
continue;
}
if (cursor.LastReleasedOn.HasValue)
{
if (summary.ReleasedOn < cursor.LastReleasedOn.Value - overlap)
{
stopDueToOlderData = true;
break;
}
// Items inside the overlap window may already be processed; skip duplicates
// rather than stopping, so the overlap re-scan can still pick up late additions.
if (summary.ReleasedOn <= cursor.LastReleasedOn.Value && processedSet.Contains(summary.AdvisoryId))
{
continue;
}
}
newSummaries.Add(summary);
processedSet.Add(summary.AdvisoryId);
if (newSummaries.Count >= _options.MaxAdvisoriesPerFetch)
{
break;
}
}
if (newSummaries.Count >= _options.MaxAdvisoriesPerFetch || stopDueToOlderData)
{
break;
}
if (pageCount < _options.PageSize)
{
break;
}
}
if (newSummaries.Count == 0)
{
return;
}
newSummaries.Sort(static (left, right) =>
{
var compare = left.ReleasedOn.CompareTo(right.ReleasedOn);
return compare != 0
? compare
: string.CompareOrdinal(left.AdvisoryId, right.AdvisoryId);
});
var pendingDocuments = new HashSet<Guid>(cursor.PendingDocuments);
foreach (var summary in newSummaries)
{
var resourceUri = summary.ResourceUri;
var resourceKey = resourceUri.ToString();
touchedResources.Add(resourceKey);
var cached = cursor.TryGetFetchCache(resourceKey);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["advisoryId"] = summary.AdvisoryId,
["releasedOn"] = summary.ReleasedOn.ToString("O", CultureInfo.InvariantCulture)
};
var request = new SourceFetchRequest(RedHatOptions.HttpClientName, SourceName, resourceUri)
{
Metadata = metadata,
ETag = cached?.ETag,
LastModified = cached?.LastModified,
TimeoutOverride = _options.FetchTimeout,
};
try
{
var result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
if (result.IsNotModified)
{
continue;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
pendingDocuments.Add(result.Document.Id);
cursor = cursor.WithFetchCache(resourceKey, result.Document.Etag, result.Document.LastModified);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat Hydra advisory fetch failed for {Uri}", resourceUri);
throw;
}
}
var maxRelease = newSummaries.Max(static item => item.ReleasedOn);
var idsForMaxRelease = newSummaries
.Where(item => item.ReleasedOn == maxRelease)
.Select(item => item.AdvisoryId)
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
RedHatCursor updated;
if (cursor.LastReleasedOn.HasValue && maxRelease == cursor.LastReleasedOn.Value)
{
updated = cursor
.WithPendingDocuments(pendingDocuments)
.AddProcessedAdvisories(idsForMaxRelease)
.PruneFetchCache(touchedResources);
}
else
{
updated = cursor
.WithPendingDocuments(pendingDocuments)
.WithLastReleased(maxRelease, idsForMaxRelease)
.PruneFetchCache(touchedResources);
}
await UpdateCursorAsync(updated, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingFetch = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
DocumentRecord? document = null;
try
{
document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("Red Hat document {DocumentId} missing GridFS content; skipping", document.Id);
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
continue;
}
var rawBytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
using var jsonDocument = JsonDocument.Parse(rawBytes);
var sanitized = JsonSerializer.Serialize(jsonDocument.RootElement);
var payload = BsonDocument.Parse(sanitized);
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
"redhat.csaf.v2",
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingFetch.Remove(documentId);
if (!pendingMappings.Contains(documentId))
{
pendingMappings.Add(documentId);
}
}
catch (Exception ex)
{
var uri = document?.Uri ?? documentId.ToString();
_logger.LogError(ex, "Red Hat CSAF parse failed for {Uri}", uri);
remainingFetch.Remove(documentId);
pendingMappings.Remove(documentId);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remainingFetch)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
try
{
var dto = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dto is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
var json = dto.Payload.ToJson(new JsonWriterSettings
{
OutputMode = JsonOutputMode.RelaxedExtendedJson,
});
using var jsonDocument = JsonDocument.Parse(json);
var advisory = RedHatMapper.Map(SourceName, dto, document, jsonDocument);
if (advisory is null)
{
pendingMappings.Remove(documentId);
continue;
}
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(documentId, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
}
catch (Exception ex)
{
_logger.LogError(ex, "Red Hat map failed for document {DocumentId}", documentId);
}
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<RedHatCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var record = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return RedHatCursor.FromBsonDocument(record?.Cursor);
}
private async Task UpdateCursorAsync(RedHatCursor cursor, CancellationToken cancellationToken)
{
var completedAt = _timeProvider.GetUtcNow();
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), completedAt, cancellationToken).ConfigureAwait(false);
}
private Uri BuildSummaryUri(DateTimeOffset after, int page)
{
var builder = new UriBuilder(_options.BaseEndpoint);
var basePath = builder.Path?.TrimEnd('/') ?? string.Empty;
var summaryPath = _options.SummaryPath.TrimStart('/');
builder.Path = string.IsNullOrEmpty(basePath)
? $"/{summaryPath}"
: $"{basePath}/{summaryPath}";
var parameters = new Dictionary<string, string>(StringComparer.Ordinal)
{
["after"] = after.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture),
["per_page"] = _options.PageSize.ToString(CultureInfo.InvariantCulture),
["page"] = page.ToString(CultureInfo.InvariantCulture)
};
builder.Query = string.Join('&', parameters.Select(static kvp =>
$"{Uri.EscapeDataString(kvp.Key)}={Uri.EscapeDataString(kvp.Value)}"));
return builder.Uri;
}
}
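For reference, `BuildSummaryUri` yields Hydra summary URLs of the following shape. The base endpoint and page size below are assumptions for illustration; the real values come from `RedHatOptions.BaseEndpoint` and `RedHatOptions.PageSize`:

```csharp
using System;
using System.Globalization;

// Sketch of the URL BuildSummaryUri produces; host and path are assumed
// defaults, not read from configuration as in the real connector.
var builder = new UriBuilder("https://access.redhat.com/hydra/rest/securitydata");
builder.Path = builder.Path.TrimEnd('/') + "/csaf.json";

var after = new DateTimeOffset(2025, 1, 1, 0, 0, 0, TimeSpan.Zero);
builder.Query = string.Join('&',
    $"after={after.ToString("yyyy-MM-dd", CultureInfo.InvariantCulture)}",
    "per_page=100",
    "page=1");

Console.WriteLine(builder.Uri);
```

Note that the `after` cursor is serialized at day granularity (`yyyy-MM-dd`), which is why the connector also keeps a per-advisory processed set and an overlap window rather than relying on the timestamp alone.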


@@ -0,0 +1,19 @@
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Distro.RedHat;
public sealed class RedHatConnectorPlugin : IConnectorPlugin
{
public const string SourceName = "redhat";
public string Name => SourceName;
public bool IsAvailable(IServiceProvider services) => services is not null;
public IFeedConnector Create(IServiceProvider services)
{
ArgumentNullException.ThrowIfNull(services);
return ActivatorUtilities.CreateInstance<RedHatConnector>(services);
}
}


@@ -0,0 +1,54 @@
using System;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.DependencyInjection;
using StellaOps.Concelier.Core.Jobs;
using StellaOps.Concelier.Connector.Distro.RedHat.Configuration;
namespace StellaOps.Concelier.Connector.Distro.RedHat;
public sealed class RedHatDependencyInjectionRoutine : IDependencyInjectionRoutine
{
private const string ConfigurationSection = "concelier:sources:redhat";
private const string FetchCron = "0,15,30,45 * * * *";
private const string ParseCron = "5,20,35,50 * * * *";
private const string MapCron = "10,25,40,55 * * * *";
private static readonly TimeSpan FetchTimeout = TimeSpan.FromMinutes(12);
private static readonly TimeSpan ParseTimeout = TimeSpan.FromMinutes(15);
private static readonly TimeSpan MapTimeout = TimeSpan.FromMinutes(20);
private static readonly TimeSpan LeaseDuration = TimeSpan.FromMinutes(6);
public IServiceCollection Register(IServiceCollection services, IConfiguration configuration)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configuration);
services.AddRedHatConnector(options =>
{
configuration.GetSection(ConfigurationSection).Bind(options);
options.Validate();
});
var schedulerBuilder = new JobSchedulerBuilder(services);
schedulerBuilder
.AddJob<RedHatFetchJob>(
RedHatJobKinds.Fetch,
cronExpression: FetchCron,
timeout: FetchTimeout,
leaseDuration: LeaseDuration)
.AddJob<RedHatParseJob>(
RedHatJobKinds.Parse,
cronExpression: ParseCron,
timeout: ParseTimeout,
leaseDuration: LeaseDuration)
.AddJob<RedHatMapJob>(
RedHatJobKinds.Map,
cronExpression: MapCron,
timeout: MapTimeout,
leaseDuration: LeaseDuration);
return services;
}
}


@@ -0,0 +1,34 @@
using System;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;
using StellaOps.Concelier.Connector.Common.Http;
using StellaOps.Concelier.Connector.Distro.RedHat.Configuration;
namespace StellaOps.Concelier.Connector.Distro.RedHat;
public static class RedHatServiceCollectionExtensions
{
public static IServiceCollection AddRedHatConnector(this IServiceCollection services, Action<RedHatOptions> configure)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configure);
services.AddOptions<RedHatOptions>()
.Configure(configure)
.PostConfigure(static opts => opts.Validate());
services.AddSourceHttpClient(RedHatOptions.HttpClientName, (sp, httpOptions) =>
{
var options = sp.GetRequiredService<IOptions<RedHatOptions>>().Value;
httpOptions.BaseAddress = options.BaseEndpoint;
httpOptions.Timeout = options.FetchTimeout;
httpOptions.UserAgent = options.UserAgent;
httpOptions.AllowedHosts.Clear();
httpOptions.AllowedHosts.Add(options.BaseEndpoint.Host);
httpOptions.DefaultRequestHeaders["Accept"] = "application/json";
});
services.AddTransient<RedHatConnector>();
return services;
}
}


@@ -0,0 +1,15 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../StellaOps.Plugin/StellaOps.Plugin.csproj" />
<ProjectReference Include="../StellaOps.DependencyInjection/StellaOps.DependencyInjection.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Connector.Common\StellaOps.Concelier.Connector.Common.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Models\StellaOps.Concelier.Models.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Storage.Mongo\StellaOps.Concelier.Storage.Mongo.csproj" />
<ProjectReference Include="..\StellaOps.Concelier.Normalization\StellaOps.Concelier.Normalization.csproj" />
</ItemGroup>
</Project>


@@ -0,0 +1,16 @@
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|Hydra fetch with after= cursor|BE-Conn-RH|Source.Common|**DONE** windowed paging with overlap, ETag/Last-Modified persisted.|
|DTOs for Security Data + OVAL|BE-Conn-RH|Tests|**DONE** CSAF payloads serialized into `redhat.csaf.v2` DTOs.|
|NEVRA parser/comparer (complete)|BE-Conn-RH|Models|**DONE** parser/comparer shipped with coverage; add edge cases as needed.|
|Mapper to canonical rpm/cpe affected|BE-Conn-RH|Models|**DONE** maps fixed/known ranges, CPE provenance, status ranges.|
|Job scheduler registration aligns with Options pipeline|BE-Conn-RH|Core|**DONE** registered fetch/parse/map via JobSchedulerBuilder, preserving option overrides and tightening cron/timeouts.|
|Watermark persistence + resume|BE-Conn-RH|Storage.Mongo|**DONE** cursor updates via SourceStateRepository.|
|Precedence tests vs NVD|QA|Merge|**DONE** Added AffectedPackagePrecedenceResolver + tests ensuring Red Hat CPEs override NVD ranges.|
|Golden mapping fixtures|QA|Fixtures|**DONE** fixture validation test now snapshots RHSA-2025:0001/0002/0003 with env-driven regeneration.|
|Job scheduling defaults for source:redhat tasks|BE-Core|JobScheduler|**DONE** Cron windows + per-job timeouts defined for fetch/parse/map.|
|Express unaffected/investigation statuses without overloading range fields|BE-Conn-RH|Models|**DONE** Introduced AffectedPackageStatus collection and updated mapper/tests.|
|Reference dedupe & ordering in mapper|BE-Conn-RH|Models|**DONE** mapper consolidates by URL, merges metadata, deterministic ordering validated in tests.|
|Hydra summary fetch through SourceFetchService|BE-Conn-RH|Source.Common|**DONE** summary pages now fetched via SourceFetchService with cache + conditional headers.|
|Fixture validation sweep|QA|Testing|**DOING (2025-10-10)** Regenerate RHSA fixtures once mapper fixes land, review snapshot diffs, and update docs; blocked by outstanding range provenance patches.|
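The NEVRA parser/comparer row above refers to the shipped `Nevra` type in Models. As a minimal sketch of the splitting rules it implements (`name-[epoch:]version-release.arch`, parsed right to left), with names that are illustrative rather than the real API:

```csharp
using System;

// Illustrative NEVRA splitter: name-[epoch:]version-release.arch.
// The real connector uses StellaOps.Concelier.Models.Nevra; this only
// demonstrates the field boundaries, not the rpmvercmp-style comparison.
static (string Name, int Epoch, string Version, string Release, string Arch)? ParseNevra(string value)
{
    var archDot = value.LastIndexOf('.');           // arch is after the last dot
    if (archDot <= 0) return null;
    var arch = value[(archDot + 1)..];
    var rest = value[..archDot];

    var releaseDash = rest.LastIndexOf('-');        // release is after the last dash
    if (releaseDash <= 0) return null;
    var release = rest[(releaseDash + 1)..];
    rest = rest[..releaseDash];

    var versionDash = rest.LastIndexOf('-');        // version after the next dash; name is the remainder
    if (versionDash <= 0) return null;
    var version = rest[(versionDash + 1)..];
    var name = rest[..versionDash];

    var epoch = 0;                                  // optional "epoch:" prefix on the version
    var colon = version.IndexOf(':');
    if (colon > 0 && int.TryParse(version[..colon], out var parsedEpoch))
    {
        epoch = parsedEpoch;
        version = version[(colon + 1)..];
    }
    return (name, epoch, version, release, arch);
}

var parsed = ParseNevra("kernel-0:4.18.0-553.el8_10.x86_64");
if (parsed is { } n)
{
    Console.WriteLine($"{n.Name} epoch={n.Epoch} ver={n.Version} rel={n.Release} arch={n.Arch}");
    // prints: kernel epoch=0 ver=4.18.0 rel=553.el8_10 arch=x86_64
}
```

Parsing right to left matters because package names may themselves contain dashes (e.g. `kernel-rt`), while release and architecture are always the trailing components.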