Rename Concelier Source modules to Connector

commit 6524626230 (parent 0137856fdb), 2025-10-18 20:11:18 +03:00
789 changed files with 1489 additions and 1489 deletions

@@ -0,0 +1,39 @@
# AGENTS
## Role
Implement a connector for GitHub Security Advisories (GHSA) when we need to ingest GHSA content directly (instead of crosswalking via OSV/NVD).
## Scope
- Determine the optimal GHSA data source (GraphQL API, REST, or ecosystem export) and required authentication.
- Implement fetch logic with pagination, updated-since filtering, and cursor persistence.
- Parse GHSA records (identifiers, summaries, affected packages, versions, references, severity).
- Map advisories into canonical `Advisory` objects with aliases, references, affected packages, and range primitives.
- Provide deterministic fixtures and regression tests for the full pipeline.
## Participants
- `Connector.Common` (HTTP clients, fetch service, DTO storage).
- `Storage.Mongo` (raw/document/DTO/advisory stores and source state).
- `Concelier.Models` (canonical advisory types).
- `Concelier.Testing` (integration harness, snapshot helpers).
## Interfaces & Contracts
- Job kinds: `source:ghsa:fetch`, `source:ghsa:parse`, `source:ghsa:map` (see `GhsaJobKinds`).
- Support GitHub API authentication & rate limiting (token, retry/backoff).
- Alias set must include GHSA IDs and linked CVE IDs.
## In/Out of scope
In scope:
- Full GHSA connector implementation with range primitives and provenance instrumentation.
Out of scope:
- Repo-specific advisory ingest (handled via GitHub repo exports).
- Downstream ecosystem-specific enrichments.
## Observability & Security Expectations
- Log fetch pagination, throttling, and mapping stats.
- Handle GitHub API rate limits with exponential backoff and `Retry-After`.
- Sanitize/validate payloads before persistence.
## Tests
- Add `StellaOps.Concelier.Connector.Ghsa.Tests` with canned GraphQL/REST fixtures.
- Snapshot canonical advisories; allow fixture regeneration via an environment flag (see the sketch after this list).
- Confirm deterministic ordering/time normalisation.
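
A minimal sketch of that flow (xUnit assumed; the fixture paths, the `UPDATE_GHSA_FIXTURES` flag name, and the `CreateDocumentRecord`/`ToCanonicalJson` helpers are illustrative, not shipped conventions; internals are visible to the test project via `InternalsVisibleTo`):

```csharp
[Fact]
public void MapProducesDeterministicSnapshot()
{
    // Canned REST payload checked into the test project.
    var raw = File.ReadAllBytes("Fixtures/ghsa-detail.json");
    var dto = GhsaRecordParser.Parse(raw);

    // CreateDocumentRecord() is a hypothetical factory for the Mongo document stub.
    var advisory = GhsaMapper.Map(dto, CreateDocumentRecord(), DateTimeOffset.UnixEpoch);

    // ToCanonicalJson() stands in for whatever stable serializer the test harness provides.
    var actual = ToCanonicalJson(advisory);
    var snapshotPath = "Fixtures/ghsa-detail.snapshot.json";
    if (Environment.GetEnvironmentVariable("UPDATE_GHSA_FIXTURES") == "1")
    {
        File.WriteAllText(snapshotPath, actual);
    }

    Assert.Equal(File.ReadAllText(snapshotPath), actual);
}
```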

@@ -0,0 +1,75 @@
using System.Diagnostics.CodeAnalysis;
namespace StellaOps.Concelier.Connector.Ghsa.Configuration;
public sealed class GhsaOptions
{
public static string HttpClientName => "source.ghsa";
public Uri BaseEndpoint { get; set; } = new("https://api.github.com/", UriKind.Absolute);
public string ApiToken { get; set; } = string.Empty;
public int PageSize { get; set; } = 50;
public int MaxPagesPerFetch { get; set; } = 5;
public TimeSpan InitialBackfill { get; set; } = TimeSpan.FromDays(30);
public TimeSpan RequestDelay { get; set; } = TimeSpan.FromMilliseconds(200);
public TimeSpan FailureBackoff { get; set; } = TimeSpan.FromMinutes(5);
public int RateLimitWarningThreshold { get; set; } = 500;
public TimeSpan SecondaryRateLimitBackoff { get; set; } = TimeSpan.FromMinutes(2);
[MemberNotNull(nameof(BaseEndpoint), nameof(ApiToken))]
public void Validate()
{
if (BaseEndpoint is null || !BaseEndpoint.IsAbsoluteUri)
{
throw new InvalidOperationException("BaseEndpoint must be an absolute URI.");
}
if (string.IsNullOrWhiteSpace(ApiToken))
{
throw new InvalidOperationException("ApiToken must be provided.");
}
if (PageSize is < 1 or > 100)
{
throw new InvalidOperationException("PageSize must be between 1 and 100.");
}
if (MaxPagesPerFetch <= 0)
{
throw new InvalidOperationException("MaxPagesPerFetch must be positive.");
}
if (InitialBackfill < TimeSpan.Zero)
{
throw new InvalidOperationException("InitialBackfill cannot be negative.");
}
if (RequestDelay < TimeSpan.Zero)
{
throw new InvalidOperationException("RequestDelay cannot be negative.");
}
if (FailureBackoff <= TimeSpan.Zero)
{
throw new InvalidOperationException("FailureBackoff must be greater than zero.");
}
if (RateLimitWarningThreshold < 0)
{
throw new InvalidOperationException("RateLimitWarningThreshold cannot be negative.");
}
if (SecondaryRateLimitBackoff <= TimeSpan.Zero)
{
throw new InvalidOperationException("SecondaryRateLimitBackoff must be greater than zero.");
}
}
}
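
A quick illustration of the guardrails above, as a sketch (the token is a placeholder, not a real credential):

var options = new GhsaOptions();   // BaseEndpoint defaults to https://api.github.com/
// Calling options.Validate() now would throw: ApiToken defaults to empty.
options.ApiToken = "ghp_placeholder";
options.PageSize = 100;            // accepted; 0 or 101 would throw
options.Validate();                // passes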

@@ -0,0 +1,547 @@
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Ghsa.Configuration;
using StellaOps.Concelier.Connector.Ghsa.Internal;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Advisories;
using StellaOps.Concelier.Storage.Mongo.Documents;
using StellaOps.Concelier.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Ghsa;
public sealed class GhsaConnector : IFeedConnector
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
PropertyNameCaseInsensitive = true,
WriteIndented = false,
};
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly GhsaOptions _options;
private readonly GhsaDiagnostics _diagnostics;
private readonly TimeProvider _timeProvider;
private readonly ILogger<GhsaConnector> _logger;
private readonly object _rateLimitWarningLock = new();
private readonly Dictionary<(string Phase, string Resource), bool> _rateLimitWarnings = new();
public GhsaConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<GhsaOptions> options,
GhsaDiagnostics diagnostics,
TimeProvider? timeProvider,
ILogger<GhsaConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = (options ?? throw new ArgumentNullException(nameof(options))).Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_diagnostics = diagnostics ?? throw new ArgumentNullException(nameof(diagnostics));
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => GhsaConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var now = _timeProvider.GetUtcNow();
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var pendingDocuments = cursor.PendingDocuments.ToHashSet();
var pendingMappings = cursor.PendingMappings.ToHashSet();
var since = cursor.CurrentWindowStart ?? cursor.LastUpdatedExclusive ?? now - _options.InitialBackfill;
if (since > now)
{
since = now;
}
var until = cursor.CurrentWindowEnd ?? now;
if (until <= since)
{
until = since + TimeSpan.FromMinutes(1);
}
var page = cursor.NextPage <= 0 ? 1 : cursor.NextPage;
var pagesFetched = 0;
var hasMore = true;
var rateLimitHit = false;
DateTimeOffset? maxUpdated = cursor.LastUpdatedExclusive;
while (hasMore && pagesFetched < _options.MaxPagesPerFetch)
{
cancellationToken.ThrowIfCancellationRequested();
var listUri = BuildListUri(since, until, page, _options.PageSize);
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["since"] = since.ToString("O"),
["until"] = until.ToString("O"),
["page"] = page.ToString(CultureInfo.InvariantCulture),
["pageSize"] = _options.PageSize.ToString(CultureInfo.InvariantCulture),
};
SourceFetchContentResult listResult;
try
{
_diagnostics.FetchAttempt();
listResult = await _fetchService.FetchContentAsync(
new SourceFetchRequest(
GhsaOptions.HttpClientName,
SourceName,
listUri)
{
Metadata = metadata,
AcceptHeaders = new[] { "application/vnd.github+json" },
},
cancellationToken).ConfigureAwait(false);
}
catch (HttpRequestException ex)
{
_diagnostics.FetchFailure();
await _stateRepository.MarkFailureAsync(SourceName, now, _options.FailureBackoff, ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (listResult.IsNotModified)
{
_diagnostics.FetchUnchanged();
break;
}
if (!listResult.IsSuccess || listResult.Content is null)
{
_diagnostics.FetchFailure();
break;
}
var deferList = await ApplyRateLimitAsync(listResult.Headers, "list", cancellationToken).ConfigureAwait(false);
if (deferList)
{
rateLimitHit = true;
break;
}
var pageModel = GhsaListParser.Parse(listResult.Content, page, _options.PageSize);
foreach (var item in pageModel.Items)
{
cancellationToken.ThrowIfCancellationRequested();
var detailUri = BuildDetailUri(item.GhsaId);
var detailMetadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["ghsaId"] = item.GhsaId,
["page"] = page.ToString(CultureInfo.InvariantCulture),
["since"] = since.ToString("O"),
["until"] = until.ToString("O"),
};
SourceFetchResult detailResult;
try
{
detailResult = await _fetchService.FetchAsync(
new SourceFetchRequest(
GhsaOptions.HttpClientName,
SourceName,
detailUri)
{
Metadata = detailMetadata,
AcceptHeaders = new[] { "application/vnd.github+json" },
},
cancellationToken).ConfigureAwait(false);
}
catch (HttpRequestException ex)
{
_diagnostics.FetchFailure();
_logger.LogWarning(ex, "Failed fetching GHSA advisory {GhsaId}", item.GhsaId);
continue;
}
if (detailResult.IsNotModified)
{
_diagnostics.FetchUnchanged();
continue;
}
if (!detailResult.IsSuccess || detailResult.Document is null)
{
_diagnostics.FetchFailure();
continue;
}
_diagnostics.FetchDocument();
pendingDocuments.Add(detailResult.Document.Id);
pendingMappings.Add(detailResult.Document.Id);
var deferDetail = await ApplyRateLimitAsync(detailResult.Document.Headers, "detail", cancellationToken).ConfigureAwait(false);
if (deferDetail)
{
rateLimitHit = true;
break;
}
}
if (rateLimitHit)
{
break;
}
if (pageModel.MaxUpdated.HasValue)
{
if (!maxUpdated.HasValue || pageModel.MaxUpdated > maxUpdated)
{
maxUpdated = pageModel.MaxUpdated;
}
}
hasMore = pageModel.HasMorePages;
page = pageModel.NextPageCandidate;
pagesFetched++;
if (!rateLimitHit && hasMore && _options.RequestDelay > TimeSpan.Zero)
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings);
if (hasMore || rateLimitHit)
{
updatedCursor = updatedCursor
.WithCurrentWindowStart(since)
.WithCurrentWindowEnd(until)
.WithNextPage(page);
}
else
{
var nextSince = maxUpdated ?? until;
updatedCursor = updatedCursor
.WithLastUpdatedExclusive(nextSince)
.WithCurrentWindowStart(null)
.WithCurrentWindowEnd(null)
.WithNextPage(1);
}
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remainingDocuments = cursor.PendingDocuments.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remainingDocuments.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_diagnostics.ParseFailure();
_logger.LogWarning("GHSA document {DocumentId} missing GridFS content", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
byte[] rawBytes;
try
{
rawBytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_diagnostics.ParseFailure();
_logger.LogError(ex, "Unable to download GHSA raw document {DocumentId}", documentId);
throw;
}
GhsaRecordDto dto;
try
{
dto = GhsaRecordParser.Parse(rawBytes);
}
catch (JsonException ex)
{
_diagnostics.ParseQuarantine();
_logger.LogError(ex, "Malformed GHSA JSON for {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
continue;
}
var payload = BsonDocument.Parse(JsonSerializer.Serialize(dto, SerializerOptions));
var dtoRecord = new DtoRecord(
Guid.NewGuid(),
document.Id,
SourceName,
"ghsa/1.0",
payload,
_timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remainingDocuments.Remove(documentId);
_diagnostics.ParseSuccess();
}
var updatedCursor = cursor.WithPendingDocuments(remainingDocuments);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
_logger.LogWarning("Skipping GHSA mapping for {DocumentId}: DTO or document missing", documentId);
pendingMappings.Remove(documentId);
continue;
}
GhsaRecordDto dto;
try
{
dto = JsonSerializer.Deserialize<GhsaRecordDto>(dtoRecord.Payload.ToJson(), SerializerOptions)
?? throw new InvalidOperationException("Deserialized DTO was null.");
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize GHSA DTO for {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var advisory = GhsaMapper.Map(dto, document, dtoRecord.ValidatedAt);
if (advisory.CvssMetrics.IsEmpty && !string.IsNullOrWhiteSpace(advisory.CanonicalMetricId))
{
var fallbackSeverity = string.IsNullOrWhiteSpace(advisory.Severity)
? "unknown"
: advisory.Severity!;
_diagnostics.CanonicalMetricFallback(advisory.CanonicalMetricId!, fallbackSeverity);
if (_logger.IsEnabled(LogLevel.Debug))
{
_logger.LogDebug(
"GHSA {GhsaId} emitted canonical metric fallback {CanonicalMetricId} (severity {Severity})",
advisory.AdvisoryKey,
advisory.CanonicalMetricId,
fallbackSeverity);
}
}
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
_diagnostics.MapSuccess(1);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private static Uri BuildListUri(DateTimeOffset since, DateTimeOffset until, int page, int pageSize)
{
var query = $"updated_since={Uri.EscapeDataString(since.ToString("O"))}&updated_until={Uri.EscapeDataString(until.ToString("O"))}&page={page}&per_page={pageSize}";
return new Uri($"security/advisories?{query}", UriKind.Relative);
}
private static Uri BuildDetailUri(string ghsaId)
{
var encoded = Uri.EscapeDataString(ghsaId);
return new Uri($"security/advisories/{encoded}", UriKind.Relative);
}
private async Task<GhsaCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? GhsaCursor.Empty : GhsaCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(GhsaCursor cursor, CancellationToken cancellationToken)
{
await _stateRepository.UpdateCursorAsync(SourceName, cursor.ToBsonDocument(), _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
private bool ShouldLogRateLimitWarning(in GhsaRateLimitSnapshot snapshot, out bool recovered)
{
recovered = false;
if (!snapshot.Remaining.HasValue)
{
return false;
}
var key = (snapshot.Phase, snapshot.Resource ?? "global");
var warn = snapshot.Remaining.Value <= _options.RateLimitWarningThreshold;
lock (_rateLimitWarningLock)
{
var previouslyWarned = _rateLimitWarnings.TryGetValue(key, out var flagged) && flagged;
if (warn)
{
if (previouslyWarned)
{
return false;
}
_rateLimitWarnings[key] = true;
return true;
}
if (previouslyWarned)
{
_rateLimitWarnings.Remove(key);
recovered = true;
}
return false;
}
}
private static double? CalculateHeadroomPercentage(in GhsaRateLimitSnapshot snapshot)
{
if (!snapshot.Limit.HasValue || !snapshot.Remaining.HasValue)
{
return null;
}
var limit = snapshot.Limit.Value;
if (limit <= 0)
{
return null;
}
return (double)snapshot.Remaining.Value / limit * 100d;
}
private static string FormatHeadroom(double? headroomPct)
=> headroomPct.HasValue ? $" (headroom {headroomPct.Value:F1}%)" : string.Empty;
private async Task<bool> ApplyRateLimitAsync(IReadOnlyDictionary<string, string>? headers, string phase, CancellationToken cancellationToken)
{
var snapshot = GhsaRateLimitParser.TryParse(headers, _timeProvider.GetUtcNow(), phase);
if (snapshot is null || !snapshot.Value.HasData)
{
return false;
}
_diagnostics.RecordRateLimit(snapshot.Value);
var headroomPct = CalculateHeadroomPercentage(snapshot.Value);
if (ShouldLogRateLimitWarning(snapshot.Value, out var recovered))
{
var resetMessage = snapshot.Value.ResetAfter.HasValue
? $" (resets in {snapshot.Value.ResetAfter.Value:c})"
: snapshot.Value.ResetAt.HasValue ? $" (resets at {snapshot.Value.ResetAt.Value:O})" : string.Empty;
_logger.LogWarning(
"GHSA rate limit warning: remaining {Remaining} of {Limit} for {Phase} {Resource}{ResetMessage}{Headroom}",
snapshot.Value.Remaining,
snapshot.Value.Limit,
phase,
snapshot.Value.Resource ?? "global",
resetMessage,
FormatHeadroom(headroomPct));
}
else if (recovered)
{
_logger.LogInformation(
"GHSA rate limit recovered for {Phase} {Resource}: remaining {Remaining} of {Limit}{Headroom}",
phase,
snapshot.Value.Resource ?? "global",
snapshot.Value.Remaining,
snapshot.Value.Limit,
FormatHeadroom(headroomPct));
}
if (snapshot.Value.Remaining.HasValue && snapshot.Value.Remaining.Value <= 0)
{
_diagnostics.RateLimitExhausted(phase);
var delay = snapshot.Value.RetryAfter ?? snapshot.Value.ResetAfter ?? _options.SecondaryRateLimitBackoff;
if (delay > TimeSpan.Zero)
{
_logger.LogWarning(
"GHSA rate limit exhausted for {Phase} {Resource}; delaying {Delay}{Headroom}",
phase,
snapshot.Value.Resource ?? "global",
delay,
FormatHeadroom(headroomPct));
await Task.Delay(delay, cancellationToken).ConfigureAwait(false);
}
return true;
}
return false;
}
}
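
The fetch loop above persists a resumable cursor. A sketch of the lifecycle, with illustrative values only:

// A sweep interrupted by MaxPagesPerFetch or a rate limit keeps its window and page:
var interrupted = GhsaCursor.Empty
    .WithCurrentWindowStart(DateTimeOffset.Parse("2025-10-01T00:00:00Z"))
    .WithCurrentWindowEnd(DateTimeOffset.Parse("2025-10-02T00:00:00Z"))
    .WithNextPage(3);
// FetchAsync resumes at page 3 of the same [start, end] window.

// A completed sweep advances the high-water mark and resets the window:
var completed = interrupted
    .WithLastUpdatedExclusive(DateTimeOffset.Parse("2025-10-01T23:55:00Z"))
    .WithCurrentWindowStart(null)
    .WithCurrentWindowEnd(null)
    .WithNextPage(1);
// The next run opens a fresh window starting at lastUpdatedExclusive.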

@@ -0,0 +1,19 @@
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Plugin;
namespace StellaOps.Concelier.Connector.Ghsa;
public sealed class GhsaConnectorPlugin : IConnectorPlugin
{
public const string SourceName = "ghsa";
public string Name => SourceName;
public bool IsAvailable(IServiceProvider services) => services is not null;
public IFeedConnector Create(IServiceProvider services)
{
ArgumentNullException.ThrowIfNull(services);
return ActivatorUtilities.CreateInstance<GhsaConnector>(services);
}
}

@@ -0,0 +1,53 @@
using System;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.DependencyInjection;
using StellaOps.Concelier.Core.Jobs;
using StellaOps.Concelier.Connector.Ghsa.Configuration;
namespace StellaOps.Concelier.Connector.Ghsa;
public sealed class GhsaDependencyInjectionRoutine : IDependencyInjectionRoutine
{
private const string ConfigurationSection = "concelier:sources:ghsa";
private const string FetchCron = "1,11,21,31,41,51 * * * *";
private const string ParseCron = "3,13,23,33,43,53 * * * *";
private const string MapCron = "5,15,25,35,45,55 * * * *";
private static readonly TimeSpan FetchTimeout = TimeSpan.FromMinutes(6);
private static readonly TimeSpan ParseTimeout = TimeSpan.FromMinutes(5);
private static readonly TimeSpan MapTimeout = TimeSpan.FromMinutes(5);
private static readonly TimeSpan LeaseDuration = TimeSpan.FromMinutes(4);
public IServiceCollection Register(IServiceCollection services, IConfiguration configuration)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configuration);
services.AddGhsaConnector(options =>
{
configuration.GetSection(ConfigurationSection).Bind(options);
options.Validate();
});
var scheduler = new JobSchedulerBuilder(services);
scheduler
.AddJob<GhsaFetchJob>(
GhsaJobKinds.Fetch,
cronExpression: FetchCron,
timeout: FetchTimeout,
leaseDuration: LeaseDuration)
.AddJob<GhsaParseJob>(
GhsaJobKinds.Parse,
cronExpression: ParseCron,
timeout: ParseTimeout,
leaseDuration: LeaseDuration)
.AddJob<GhsaMapJob>(
GhsaJobKinds.Map,
cronExpression: MapCron,
timeout: MapTimeout,
leaseDuration: LeaseDuration);
return services;
}
}
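
The staggered cron minutes (fetch at :01, parse at :03, map at :05, every ten minutes) keep the three jobs from competing for leases. Binding happens against `concelier:sources:ghsa`, so a host can drive the connector with configuration shaped like this sketch (in-memory here; the token is a placeholder):

var configuration = new ConfigurationBuilder()
    .AddInMemoryCollection(new Dictionary<string, string?>
    {
        ["concelier:sources:ghsa:apiToken"] = "ghp_placeholder",
        ["concelier:sources:ghsa:pageSize"] = "100",
        ["concelier:sources:ghsa:maxPagesPerFetch"] = "10",
        ["concelier:sources:ghsa:initialBackfill"] = "90.00:00:00", // TimeSpan format: 90 days
    })
    .Build();

var services = new ServiceCollection();
new GhsaDependencyInjectionRoutine().Register(services, configuration);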

@@ -0,0 +1,37 @@
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;
using StellaOps.Concelier.Connector.Common.Http;
using StellaOps.Concelier.Connector.Ghsa.Configuration;
using StellaOps.Concelier.Connector.Ghsa.Internal;
namespace StellaOps.Concelier.Connector.Ghsa;
public static class GhsaServiceCollectionExtensions
{
public static IServiceCollection AddGhsaConnector(this IServiceCollection services, Action<GhsaOptions> configure)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configure);
services.AddOptions<GhsaOptions>()
.Configure(configure)
.PostConfigure(static opts => opts.Validate());
services.AddSourceHttpClient(GhsaOptions.HttpClientName, (sp, clientOptions) =>
{
var options = sp.GetRequiredService<IOptions<GhsaOptions>>().Value;
clientOptions.BaseAddress = options.BaseEndpoint;
clientOptions.Timeout = TimeSpan.FromSeconds(30);
clientOptions.UserAgent = "StellaOps.Concelier.Ghsa/1.0";
clientOptions.AllowedHosts.Clear();
clientOptions.AllowedHosts.Add(options.BaseEndpoint.Host);
clientOptions.DefaultRequestHeaders["Accept"] = "application/vnd.github+json";
clientOptions.DefaultRequestHeaders["Authorization"] = $"Bearer {options.ApiToken}";
clientOptions.DefaultRequestHeaders["X-GitHub-Api-Version"] = "2022-11-28";
});
services.AddSingleton<GhsaDiagnostics>();
services.AddTransient<GhsaConnector>();
return services;
}
}
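
Assuming `AddSourceHttpClient` layers on the standard `AddHttpClient` pipeline (which its shape suggests, though that is an assumption), direct registration and client resolution would look roughly like this:

var services = new ServiceCollection();
services.AddGhsaConnector(opts =>
{
    opts.ApiToken = Environment.GetEnvironmentVariable("GITHUB_TOKEN")
        ?? throw new InvalidOperationException("GITHUB_TOKEN is not set.");
});

using var provider = services.BuildServiceProvider();
var factory = provider.GetRequiredService<IHttpClientFactory>();
// The named client already carries BaseAddress, Accept, Authorization, and the API-version pin.
using var client = factory.CreateClient(GhsaOptions.HttpClientName); // "source.ghsa"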

@@ -0,0 +1,135 @@
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
internal sealed record GhsaCursor(
DateTimeOffset? LastUpdatedExclusive,
DateTimeOffset? CurrentWindowStart,
DateTimeOffset? CurrentWindowEnd,
int NextPage,
IReadOnlyCollection<Guid> PendingDocuments,
IReadOnlyCollection<Guid> PendingMappings)
{
private static readonly IReadOnlyCollection<Guid> EmptyGuidList = Array.Empty<Guid>();
public static GhsaCursor Empty { get; } = new(
null,
null,
null,
1,
EmptyGuidList,
EmptyGuidList);
public BsonDocument ToBsonDocument()
{
var document = new BsonDocument
{
["nextPage"] = NextPage,
["pendingDocuments"] = new BsonArray(PendingDocuments.Select(id => id.ToString())),
["pendingMappings"] = new BsonArray(PendingMappings.Select(id => id.ToString())),
};
if (LastUpdatedExclusive.HasValue)
{
document["lastUpdatedExclusive"] = LastUpdatedExclusive.Value.UtcDateTime;
}
if (CurrentWindowStart.HasValue)
{
document["currentWindowStart"] = CurrentWindowStart.Value.UtcDateTime;
}
if (CurrentWindowEnd.HasValue)
{
document["currentWindowEnd"] = CurrentWindowEnd.Value.UtcDateTime;
}
return document;
}
public static GhsaCursor FromBson(BsonDocument? document)
{
if (document is null || document.ElementCount == 0)
{
return Empty;
}
var lastUpdatedExclusive = document.TryGetValue("lastUpdatedExclusive", out var lastUpdated)
? ParseDate(lastUpdated)
: null;
var windowStart = document.TryGetValue("currentWindowStart", out var windowStartValue)
? ParseDate(windowStartValue)
: null;
var windowEnd = document.TryGetValue("currentWindowEnd", out var windowEndValue)
? ParseDate(windowEndValue)
: null;
var nextPage = document.TryGetValue("nextPage", out var nextPageValue) && nextPageValue.IsInt32
? Math.Max(1, nextPageValue.AsInt32)
: 1;
var pendingDocuments = ReadGuidArray(document, "pendingDocuments");
var pendingMappings = ReadGuidArray(document, "pendingMappings");
return new GhsaCursor(
lastUpdatedExclusive,
windowStart,
windowEnd,
nextPage,
pendingDocuments,
pendingMappings);
}
public GhsaCursor WithPendingDocuments(IEnumerable<Guid> ids)
=> this with { PendingDocuments = ids?.Distinct().ToArray() ?? EmptyGuidList };
public GhsaCursor WithPendingMappings(IEnumerable<Guid> ids)
=> this with { PendingMappings = ids?.Distinct().ToArray() ?? EmptyGuidList };
public GhsaCursor WithLastUpdatedExclusive(DateTimeOffset? timestamp)
=> this with { LastUpdatedExclusive = timestamp };
public GhsaCursor WithCurrentWindowStart(DateTimeOffset? timestamp)
=> this with { CurrentWindowStart = timestamp };
public GhsaCursor WithCurrentWindowEnd(DateTimeOffset? timestamp)
=> this with { CurrentWindowEnd = timestamp };
public GhsaCursor WithNextPage(int page)
=> this with { NextPage = page < 1 ? 1 : page };
private static DateTimeOffset? ParseDate(BsonValue value)
{
return value.BsonType switch
{
BsonType.DateTime => DateTime.SpecifyKind(value.ToUniversalTime(), DateTimeKind.Utc),
BsonType.String when DateTimeOffset.TryParse(value.AsString, out var parsed) => parsed.ToUniversalTime(),
_ => null,
};
}
private static IReadOnlyCollection<Guid> ReadGuidArray(BsonDocument document, string field)
{
if (!document.TryGetValue(field, out var value) || value is not BsonArray array)
{
return EmptyGuidList;
}
var results = new List<Guid>(array.Count);
foreach (var element in array)
{
if (element is null)
{
continue;
}
if (Guid.TryParse(element.ToString(), out var guid))
{
results.Add(guid);
}
}
return results;
}
}
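
One invariant worth pinning in tests: cursor state round-trips through its BSON form. A sketch:

var original = GhsaCursor.Empty
    .WithLastUpdatedExclusive(DateTimeOffset.Parse("2025-10-18T00:00:00Z"))
    .WithNextPage(2)
    .WithPendingDocuments(new[] { Guid.NewGuid() });

var roundTripped = GhsaCursor.FromBson(original.ToBsonDocument());
// Timestamps come back as UTC (stored as BSON DateTime), nextPage is clamped to >= 1,
// and pending IDs survive their GUID-string encoding; compare collections by content.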

@@ -0,0 +1,164 @@
using System.Collections.Generic;
using System.Diagnostics.Metrics;
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
public sealed class GhsaDiagnostics : IDisposable
{
private const string MeterName = "StellaOps.Concelier.Connector.Ghsa";
private const string MeterVersion = "1.0.0";
private readonly Meter _meter;
private readonly Counter<long> _fetchAttempts;
private readonly Counter<long> _fetchDocuments;
private readonly Counter<long> _fetchFailures;
private readonly Counter<long> _fetchUnchanged;
private readonly Counter<long> _parseSuccess;
private readonly Counter<long> _parseFailures;
private readonly Counter<long> _parseQuarantine;
private readonly Counter<long> _mapSuccess;
private readonly Histogram<long> _rateLimitRemaining;
private readonly Histogram<long> _rateLimitLimit;
private readonly Histogram<double> _rateLimitResetSeconds;
private readonly Histogram<double> _rateLimitHeadroomPct;
private readonly ObservableGauge<double> _rateLimitHeadroomGauge;
private readonly Counter<long> _rateLimitExhausted;
private readonly Counter<long> _canonicalMetricFallbacks;
private readonly object _rateLimitLock = new();
private GhsaRateLimitSnapshot? _lastRateLimitSnapshot;
private readonly Dictionary<(string Phase, string? Resource), GhsaRateLimitSnapshot> _rateLimitSnapshots = new();
public GhsaDiagnostics()
{
_meter = new Meter(MeterName, MeterVersion);
_fetchAttempts = _meter.CreateCounter<long>("ghsa.fetch.attempts", unit: "operations");
_fetchDocuments = _meter.CreateCounter<long>("ghsa.fetch.documents", unit: "documents");
_fetchFailures = _meter.CreateCounter<long>("ghsa.fetch.failures", unit: "operations");
_fetchUnchanged = _meter.CreateCounter<long>("ghsa.fetch.unchanged", unit: "operations");
_parseSuccess = _meter.CreateCounter<long>("ghsa.parse.success", unit: "documents");
_parseFailures = _meter.CreateCounter<long>("ghsa.parse.failures", unit: "documents");
_parseQuarantine = _meter.CreateCounter<long>("ghsa.parse.quarantine", unit: "documents");
_mapSuccess = _meter.CreateCounter<long>("ghsa.map.success", unit: "advisories");
_rateLimitRemaining = _meter.CreateHistogram<long>("ghsa.ratelimit.remaining", unit: "requests");
_rateLimitLimit = _meter.CreateHistogram<long>("ghsa.ratelimit.limit", unit: "requests");
_rateLimitResetSeconds = _meter.CreateHistogram<double>("ghsa.ratelimit.reset_seconds", unit: "s");
_rateLimitHeadroomPct = _meter.CreateHistogram<double>("ghsa.ratelimit.headroom_pct", unit: "percent");
_rateLimitHeadroomGauge = _meter.CreateObservableGauge("ghsa.ratelimit.headroom_pct_current", ObserveHeadroom, unit: "percent");
_rateLimitExhausted = _meter.CreateCounter<long>("ghsa.ratelimit.exhausted", unit: "events");
_canonicalMetricFallbacks = _meter.CreateCounter<long>("ghsa.map.canonical_metric_fallbacks", unit: "advisories");
}
public void FetchAttempt() => _fetchAttempts.Add(1);
public void FetchDocument() => _fetchDocuments.Add(1);
public void FetchFailure() => _fetchFailures.Add(1);
public void FetchUnchanged() => _fetchUnchanged.Add(1);
public void ParseSuccess() => _parseSuccess.Add(1);
public void ParseFailure() => _parseFailures.Add(1);
public void ParseQuarantine() => _parseQuarantine.Add(1);
public void MapSuccess(long count) => _mapSuccess.Add(count);
internal void RecordRateLimit(GhsaRateLimitSnapshot snapshot)
{
var tags = new KeyValuePair<string, object?>[]
{
new("phase", snapshot.Phase),
new("resource", snapshot.Resource ?? "unknown")
};
if (snapshot.Limit.HasValue)
{
_rateLimitLimit.Record(snapshot.Limit.Value, tags);
}
if (snapshot.Remaining.HasValue)
{
_rateLimitRemaining.Record(snapshot.Remaining.Value, tags);
}
if (snapshot.ResetAfter.HasValue)
{
_rateLimitResetSeconds.Record(snapshot.ResetAfter.Value.TotalSeconds, tags);
}
if (TryCalculateHeadroom(snapshot, out var headroom))
{
_rateLimitHeadroomPct.Record(headroom, tags);
}
lock (_rateLimitLock)
{
_lastRateLimitSnapshot = snapshot;
_rateLimitSnapshots[(snapshot.Phase, snapshot.Resource)] = snapshot;
}
}
internal void RateLimitExhausted(string phase)
=> _rateLimitExhausted.Add(1, new KeyValuePair<string, object?>("phase", phase));
public void CanonicalMetricFallback(string canonicalMetricId, string severity)
=> _canonicalMetricFallbacks.Add(
1,
new KeyValuePair<string, object?>("canonical_metric_id", canonicalMetricId),
new KeyValuePair<string, object?>("severity", severity),
new KeyValuePair<string, object?>("reason", "no_cvss"));
internal GhsaRateLimitSnapshot? GetLastRateLimitSnapshot()
{
lock (_rateLimitLock)
{
return _lastRateLimitSnapshot;
}
}
private IEnumerable<Measurement<double>> ObserveHeadroom()
{
// Snapshot under the lock, then materialize outside it so the lock is not
// held across the observable-gauge callback's enumeration.
GhsaRateLimitSnapshot[] snapshots;
lock (_rateLimitLock)
{
if (_rateLimitSnapshots.Count == 0)
{
return Array.Empty<Measurement<double>>();
}
snapshots = new GhsaRateLimitSnapshot[_rateLimitSnapshots.Count];
_rateLimitSnapshots.Values.CopyTo(snapshots, 0);
}
var measurements = new List<Measurement<double>>(snapshots.Length);
foreach (var snapshot in snapshots)
{
if (TryCalculateHeadroom(snapshot, out var headroom))
{
measurements.Add(new Measurement<double>(
headroom,
new KeyValuePair<string, object?>("phase", snapshot.Phase),
new KeyValuePair<string, object?>("resource", snapshot.Resource ?? "unknown")));
}
}
return measurements;
}
private static bool TryCalculateHeadroom(in GhsaRateLimitSnapshot snapshot, out double headroomPct)
{
headroomPct = 0;
if (!snapshot.Limit.HasValue || !snapshot.Remaining.HasValue)
{
return false;
}
var limit = snapshot.Limit.Value;
if (limit <= 0)
{
return false;
}
headroomPct = (double)snapshot.Remaining.Value / limit * 100d;
return true;
}
public void Dispose()
{
_meter.Dispose();
}
}
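
Because the counters go through System.Diagnostics.Metrics, a plain MeterListener can observe them in tests without any exporter. A minimal sketch:

using var diagnostics = new GhsaDiagnostics();
using var listener = new MeterListener();
listener.InstrumentPublished = (instrument, l) =>
{
    if (instrument.Meter.Name == "StellaOps.Concelier.Connector.Ghsa")
    {
        l.EnableMeasurementEvents(instrument);
    }
};
listener.SetMeasurementEventCallback<long>(
    (instrument, measurement, tags, state) => Console.WriteLine($"{instrument.Name}={measurement}"));
listener.Start();

diagnostics.FetchAttempt(); // observed as ghsa.fetch.attempts = 1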

@@ -0,0 +1,115 @@
using System.Collections.Generic;
using System.Globalization;
using System.Text.Json;
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
internal static class GhsaListParser
{
public static GhsaListPage Parse(ReadOnlySpan<byte> content, int currentPage, int pageSize)
{
using var document = JsonDocument.Parse(content.ToArray());
var root = document.RootElement;
var items = new List<GhsaListItem>();
DateTimeOffset? maxUpdated = null;
if (root.TryGetProperty("advisories", out var advisories) && advisories.ValueKind == JsonValueKind.Array)
{
foreach (var advisory in advisories.EnumerateArray())
{
if (advisory.ValueKind != JsonValueKind.Object)
{
continue;
}
var id = GetString(advisory, "ghsa_id");
if (string.IsNullOrWhiteSpace(id))
{
continue;
}
var updated = GetDate(advisory, "updated_at");
if (updated.HasValue && (!maxUpdated.HasValue || updated > maxUpdated))
{
maxUpdated = updated;
}
items.Add(new GhsaListItem(id, updated));
}
}
var hasMorePages = TryDetermineHasMore(root, currentPage, pageSize, items.Count, out var nextPage);
return new GhsaListPage(items, maxUpdated, hasMorePages, nextPage ?? currentPage + 1);
}
private static bool TryDetermineHasMore(JsonElement root, int currentPage, int pageSize, int itemCount, out int? nextPage)
{
nextPage = null;
if (root.TryGetProperty("pagination", out var pagination) && pagination.ValueKind == JsonValueKind.Object)
{
var hasNextPage = pagination.TryGetProperty("has_next_page", out var hasNext) && hasNext.ValueKind == JsonValueKind.True;
if (hasNextPage)
{
nextPage = currentPage + 1;
return true;
}
if (pagination.TryGetProperty("total_pages", out var totalPagesElement) && totalPagesElement.ValueKind == JsonValueKind.Number && totalPagesElement.TryGetInt32(out var totalPages))
{
if (currentPage < totalPages)
{
nextPage = currentPage + 1;
return true;
}
}
return false;
}
if (itemCount >= pageSize)
{
nextPage = currentPage + 1;
return true;
}
return false;
}
private static string? GetString(JsonElement element, string propertyName)
{
if (!element.TryGetProperty(propertyName, out var property))
{
return null;
}
return property.ValueKind switch
{
JsonValueKind.String => property.GetString(),
_ => null,
};
}
private static DateTimeOffset? GetDate(JsonElement element, string propertyName)
{
var value = GetString(element, propertyName);
if (string.IsNullOrWhiteSpace(value))
{
return null;
}
return DateTimeOffset.TryParse(value, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed)
? parsed.ToUniversalTime()
: null;
}
}
internal sealed record GhsaListPage(
IReadOnlyList<GhsaListItem> Items,
DateTimeOffset? MaxUpdated,
bool HasMorePages,
int NextPageCandidate);
internal sealed record GhsaListItem(string GhsaId, DateTimeOffset? UpdatedAt);
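
A concrete payload makes the pagination fallback order visible: explicit pagination metadata wins, and only in its absence does a full page imply more results. Sketch:

var json = """
{
  "advisories": [
    { "ghsa_id": "GHSA-aaaa-bbbb-cccc", "updated_at": "2025-10-18T12:00:00Z" }
  ],
  "pagination": { "has_next_page": true }
}
"""u8;

var page = GhsaListParser.Parse(json, currentPage: 1, pageSize: 50);
// page.HasMorePages == true and page.NextPageCandidate == 2 despite the single item,
// because the pagination object is authoritative. Without it, 1 < pageSize (50)
// would have meant "no more pages". page.MaxUpdated is 2025-10-18T12:00:00Z.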

@@ -0,0 +1,447 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Text;
using StellaOps.Concelier.Models;
using StellaOps.Concelier.Normalization.Cvss;
using StellaOps.Concelier.Normalization.SemVer;
using StellaOps.Concelier.Storage.Mongo.Documents;
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
internal static class GhsaMapper
{
private static readonly HashSet<string> SemVerEcosystems = new(StringComparer.OrdinalIgnoreCase)
{
"npm",
"maven",
"pip",
"rubygems",
"composer",
"nuget",
"go",
"cargo",
};
public static Advisory Map(GhsaRecordDto dto, DocumentRecord document, DateTimeOffset recordedAt)
{
ArgumentNullException.ThrowIfNull(dto);
ArgumentNullException.ThrowIfNull(document);
var fetchProvenance = new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"document",
document.Uri,
document.FetchedAt,
new[] { ProvenanceFieldMasks.Advisory });
var mapProvenance = new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"mapping",
dto.GhsaId,
recordedAt,
new[] { ProvenanceFieldMasks.Advisory });
var aliases = dto.Aliases
.Where(static alias => !string.IsNullOrWhiteSpace(alias))
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
var references = dto.References
.Select(reference => CreateReference(reference, recordedAt))
.Where(static reference => reference is not null)
.Cast<AdvisoryReference>()
.ToList();
var affected = CreateAffectedPackages(dto, recordedAt);
var credits = CreateCredits(dto.Credits, recordedAt);
var weaknesses = CreateWeaknesses(dto.Cwes, recordedAt);
var cvssMetrics = CreateCvssMetrics(dto.Cvss, recordedAt, out var cvssSeverity, out var canonicalMetricId);
var severityHint = SeverityNormalization.Normalize(dto.Severity);
var cvssSeverityHint = SeverityNormalization.Normalize(dto.Cvss?.Severity);
var severity = severityHint ?? cvssSeverity ?? cvssSeverityHint;
if (canonicalMetricId is null)
{
var fallbackSeverity = severityHint ?? cvssSeverityHint ?? cvssSeverity;
if (!string.IsNullOrWhiteSpace(fallbackSeverity))
{
canonicalMetricId = BuildSeverityCanonicalMetricId(fallbackSeverity);
}
}
var summary = dto.Summary ?? dto.Description;
var description = Validation.TrimToNull(dto.Description);
return new Advisory(
advisoryKey: dto.GhsaId,
title: dto.Summary ?? dto.GhsaId,
summary: summary,
language: "en",
published: dto.PublishedAt,
modified: dto.UpdatedAt ?? dto.PublishedAt,
severity: severity,
exploitKnown: false,
aliases: aliases,
credits: credits,
references: references,
affectedPackages: affected,
cvssMetrics: cvssMetrics,
provenance: new[] { fetchProvenance, mapProvenance },
description: description,
cwes: weaknesses,
canonicalMetricId: canonicalMetricId);
}
private static string BuildSeverityCanonicalMetricId(string severity)
=> $"{GhsaConnectorPlugin.SourceName}:severity/{severity}";
private static AdvisoryReference? CreateReference(GhsaReferenceDto reference, DateTimeOffset recordedAt)
{
if (string.IsNullOrWhiteSpace(reference.Url) || !Validation.LooksLikeHttpUrl(reference.Url))
{
return null;
}
var kind = reference.Type?.ToLowerInvariant();
return new AdvisoryReference(
reference.Url,
kind,
reference.Name,
summary: null,
provenance: new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"reference",
reference.Url,
recordedAt,
new[] { ProvenanceFieldMasks.References }));
}
private static IReadOnlyList<AffectedPackage> CreateAffectedPackages(GhsaRecordDto dto, DateTimeOffset recordedAt)
{
if (dto.Affected.Count == 0)
{
return Array.Empty<AffectedPackage>();
}
var packages = new List<AffectedPackage>(dto.Affected.Count);
foreach (var affected in dto.Affected)
{
var ecosystem = string.IsNullOrWhiteSpace(affected.Ecosystem) ? "unknown" : affected.Ecosystem.Trim();
var packageName = string.IsNullOrWhiteSpace(affected.PackageName) ? "unknown-package" : affected.PackageName.Trim();
var identifier = $"{ecosystem.ToLowerInvariant()}:{packageName}";
var provenance = new[]
{
new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"affected",
identifier,
recordedAt,
new[] { ProvenanceFieldMasks.AffectedPackages }),
};
var rangeKind = SemVerEcosystems.Contains(ecosystem) ? "semver" : "vendor";
var packageType = SemVerEcosystems.Contains(ecosystem) ? AffectedPackageTypes.SemVer : AffectedPackageTypes.Vendor;
var (ranges, normalizedVersions) = SemVerEcosystems.Contains(ecosystem)
? CreateSemVerVersionArtifacts(affected, identifier, ecosystem, packageName, recordedAt)
: CreateVendorVersionArtifacts(affected, rangeKind, identifier, ecosystem, packageName, recordedAt);
var statuses = new[]
{
new AffectedPackageStatus(
"affected",
new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"affected-status",
identifier,
recordedAt,
new[] { ProvenanceFieldMasks.PackageStatuses })),
};
packages.Add(new AffectedPackage(
packageType,
identifier,
platform: null,
versionRanges: ranges,
statuses: statuses,
provenance: provenance,
normalizedVersions: normalizedVersions));
}
return packages;
}
private static IReadOnlyList<AdvisoryCredit> CreateCredits(IReadOnlyList<GhsaCreditDto> credits, DateTimeOffset recordedAt)
{
if (credits.Count == 0)
{
return Array.Empty<AdvisoryCredit>();
}
var results = new List<AdvisoryCredit>(credits.Count);
foreach (var credit in credits)
{
var displayName = Validation.TrimToNull(credit.Name) ?? Validation.TrimToNull(credit.Login);
if (displayName is null)
{
continue;
}
var contacts = new List<string>();
if (!string.IsNullOrWhiteSpace(credit.ProfileUrl) && Validation.LooksLikeHttpUrl(credit.ProfileUrl))
{
contacts.Add(credit.ProfileUrl.Trim());
}
else if (!string.IsNullOrWhiteSpace(credit.Login))
{
contacts.Add($"https://github.com/{credit.Login.Trim()}");
}
var provenance = new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"credit",
displayName,
recordedAt,
new[] { ProvenanceFieldMasks.Credits });
results.Add(new AdvisoryCredit(displayName, credit.Type, contacts, provenance));
}
return results.Count == 0 ? Array.Empty<AdvisoryCredit>() : results;
}
private static IReadOnlyList<AdvisoryWeakness> CreateWeaknesses(IReadOnlyList<GhsaWeaknessDto> cwes, DateTimeOffset recordedAt)
{
if (cwes.Count == 0)
{
return Array.Empty<AdvisoryWeakness>();
}
var list = new List<AdvisoryWeakness>(cwes.Count);
foreach (var cwe in cwes)
{
if (cwe is null || string.IsNullOrWhiteSpace(cwe.CweId))
{
continue;
}
var identifier = cwe.CweId.Trim();
var provenance = new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"weakness",
identifier,
recordedAt,
new[] { ProvenanceFieldMasks.Weaknesses });
var provenanceArray = ImmutableArray.Create(provenance);
list.Add(new AdvisoryWeakness(
taxonomy: "cwe",
identifier: identifier,
name: Validation.TrimToNull(cwe.Name),
uri: BuildCweUrl(identifier),
provenance: provenanceArray));
}
return list.Count == 0 ? Array.Empty<AdvisoryWeakness>() : list;
}
private static IReadOnlyList<CvssMetric> CreateCvssMetrics(GhsaCvssDto? cvss, DateTimeOffset recordedAt, out string? severity, out string? canonicalMetricId)
{
severity = null;
canonicalMetricId = null;
if (cvss is null)
{
return Array.Empty<CvssMetric>();
}
var vector = Validation.TrimToNull(cvss.VectorString);
if (!CvssMetricNormalizer.TryNormalize(null, vector, cvss.Score, cvss.Severity, out var normalized))
{
return Array.Empty<CvssMetric>();
}
severity = normalized.BaseSeverity;
canonicalMetricId = $"{normalized.Version}|{normalized.Vector}";
var provenance = new AdvisoryProvenance(
GhsaConnectorPlugin.SourceName,
"cvss",
normalized.Vector,
recordedAt,
new[] { ProvenanceFieldMasks.CvssMetrics });
return new[]
{
normalized.ToModel(provenance),
};
}
private static string? BuildCweUrl(string? cweId)
{
if (string.IsNullOrWhiteSpace(cweId))
{
return null;
}
var trimmed = cweId.Trim();
var dashIndex = trimmed.IndexOf('-');
if (dashIndex < 0 || dashIndex == trimmed.Length - 1)
{
return null;
}
var digits = new StringBuilder();
for (var i = dashIndex + 1; i < trimmed.Length; i++)
{
var ch = trimmed[i];
if (char.IsDigit(ch))
{
digits.Append(ch);
}
}
return digits.Length == 0 ? null : $"https://cwe.mitre.org/data/definitions/{digits}.html";
}
private static (IReadOnlyList<AffectedVersionRange> Ranges, IReadOnlyList<NormalizedVersionRule> Normalized) CreateSemVerVersionArtifacts(
GhsaAffectedDto affected,
string identifier,
string ecosystem,
string packageName,
DateTimeOffset recordedAt)
{
var note = BuildNormalizedNote(identifier);
var results = SemVerRangeRuleBuilder.Build(affected.VulnerableRange, affected.PatchedVersion, note);
if (results.Count > 0)
{
var ranges = new List<AffectedVersionRange>(results.Count);
var normalized = new List<NormalizedVersionRule>(results.Count);
foreach (var result in results)
{
var primitive = result.Primitive;
var rangeExpression = ResolveRangeExpression(result.Expression, primitive.ConstraintExpression, affected.VulnerableRange);
ranges.Add(new AffectedVersionRange(
rangeKind: "semver",
introducedVersion: Validation.TrimToNull(primitive.Introduced),
fixedVersion: Validation.TrimToNull(primitive.Fixed),
lastAffectedVersion: Validation.TrimToNull(primitive.LastAffected),
rangeExpression: rangeExpression,
provenance: CreateRangeProvenance(identifier, recordedAt),
primitives: new RangePrimitives(
SemVer: primitive,
Nevra: null,
Evr: null,
VendorExtensions: CreateVendorExtensions(ecosystem, packageName))));
normalized.Add(result.NormalizedRule);
}
return (ranges.ToArray(), normalized.ToArray());
}
var fallbackRange = CreateFallbackRange("semver", affected, identifier, ecosystem, packageName, recordedAt);
if (fallbackRange is null)
{
return (Array.Empty<AffectedVersionRange>(), Array.Empty<NormalizedVersionRule>());
}
var fallbackRule = fallbackRange.ToNormalizedVersionRule(note);
var normalizedFallback = fallbackRule is null
? Array.Empty<NormalizedVersionRule>()
: new[] { fallbackRule };
return (new[] { fallbackRange }, normalizedFallback);
}
private static (IReadOnlyList<AffectedVersionRange> Ranges, IReadOnlyList<NormalizedVersionRule> Normalized) CreateVendorVersionArtifacts(
GhsaAffectedDto affected,
string rangeKind,
string identifier,
string ecosystem,
string packageName,
DateTimeOffset recordedAt)
{
var range = CreateFallbackRange(rangeKind, affected, identifier, ecosystem, packageName, recordedAt);
if (range is null)
{
return (Array.Empty<AffectedVersionRange>(), Array.Empty<NormalizedVersionRule>());
}
return (new[] { range }, Array.Empty<NormalizedVersionRule>());
}
private static AffectedVersionRange? CreateFallbackRange(
string rangeKind,
GhsaAffectedDto affected,
string identifier,
string ecosystem,
string packageName,
DateTimeOffset recordedAt)
{
var fixedVersion = Validation.TrimToNull(affected.PatchedVersion);
var rangeExpression = Validation.TrimToNull(affected.VulnerableRange);
if (fixedVersion is null && rangeExpression is null)
{
return null;
}
return new AffectedVersionRange(
rangeKind,
introducedVersion: null,
fixedVersion: fixedVersion,
lastAffectedVersion: null,
rangeExpression: rangeExpression,
provenance: CreateRangeProvenance(identifier, recordedAt),
primitives: new RangePrimitives(
SemVer: null,
Nevra: null,
Evr: null,
VendorExtensions: CreateVendorExtensions(ecosystem, packageName)));
}
private static AdvisoryProvenance CreateRangeProvenance(string identifier, DateTimeOffset recordedAt)
=> new(
GhsaConnectorPlugin.SourceName,
"affected-range",
identifier,
recordedAt,
new[] { ProvenanceFieldMasks.VersionRanges });
private static IReadOnlyDictionary<string, string> CreateVendorExtensions(string ecosystem, string packageName)
=> new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
["ecosystem"] = ecosystem,
["package"] = packageName,
};
private static string? BuildNormalizedNote(string identifier)
{
var trimmed = Validation.TrimToNull(identifier);
return trimmed is null ? null : $"ghsa:{trimmed}";
}
private static string? ResolveRangeExpression(string? parsedExpression, string? constraintExpression, string? fallbackExpression)
{
var parsed = Validation.TrimToNull(parsedExpression);
if (parsed is not null)
{
return parsed;
}
var constraint = Validation.TrimToNull(constraintExpression);
if (constraint is not null)
{
return constraint;
}
return Validation.TrimToNull(fallbackExpression);
}
}
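
A rough illustration of the ecosystem gate and the severity fallback above (the `CreateDocumentRecord` helper is a hypothetical stand-in for a DocumentRecord test factory; real assertions belong in the test project):

var dto = new GhsaRecordDto
{
    GhsaId = "GHSA-aaaa-bbbb-cccc",
    Severity = "high",
    Affected = new[]
    {
        new GhsaAffectedDto { Ecosystem = "npm", PackageName = "left-pad", VulnerableRange = "< 1.3.0", PatchedVersion = "1.3.0" },
        new GhsaAffectedDto { Ecosystem = "erlang", PackageName = "cowboy", VulnerableRange = "< 2.0.0" },
    },
};

var advisory = GhsaMapper.Map(dto, CreateDocumentRecord(), DateTimeOffset.UtcNow);
// npm is in SemVerEcosystems, so its package gets "semver" ranges (plus normalized
// rules when the range parses); erlang falls back to a single "vendor" range that
// preserves the raw expression. With no CVSS block, the canonical metric id falls
// back to "ghsa:severity/high".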

@@ -0,0 +1,111 @@
using System;
using System.Collections.Generic;
using System.Globalization;
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
internal static class GhsaRateLimitParser
{
public static GhsaRateLimitSnapshot? TryParse(IReadOnlyDictionary<string, string>? headers, DateTimeOffset now, string phase)
{
if (headers is null || headers.Count == 0)
{
return null;
}
string? resource = null;
long? limit = null;
long? remaining = null;
long? used = null;
DateTimeOffset? resetAt = null;
TimeSpan? resetAfter = null;
TimeSpan? retryAfter = null;
var hasData = false;
if (TryGet(headers, "X-RateLimit-Resource", out var resourceValue) && !string.IsNullOrWhiteSpace(resourceValue))
{
resource = resourceValue;
hasData = true;
}
if (TryParseLong(headers, "X-RateLimit-Limit", out var limitValue))
{
limit = limitValue;
hasData = true;
}
if (TryParseLong(headers, "X-RateLimit-Remaining", out var remainingValue))
{
remaining = remainingValue;
hasData = true;
}
if (TryParseLong(headers, "X-RateLimit-Used", out var usedValue))
{
used = usedValue;
hasData = true;
}
if (TryParseLong(headers, "X-RateLimit-Reset", out var resetValue))
{
resetAt = DateTimeOffset.FromUnixTimeSeconds(resetValue);
var delta = resetAt.Value - now;
if (delta > TimeSpan.Zero)
{
resetAfter = delta;
}
hasData = true;
}
if (TryGet(headers, "Retry-After", out var retryAfterValue) && !string.IsNullOrWhiteSpace(retryAfterValue))
{
if (double.TryParse(retryAfterValue, NumberStyles.Float, CultureInfo.InvariantCulture, out var seconds) && seconds > 0)
{
retryAfter = TimeSpan.FromSeconds(seconds);
}
else if (DateTimeOffset.TryParse(retryAfterValue, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal, out var retryAfterDate))
{
var delta = retryAfterDate - now;
if (delta > TimeSpan.Zero)
{
retryAfter = delta;
}
}
hasData = true;
}
if (!hasData)
{
return null;
}
return new GhsaRateLimitSnapshot(phase, resource, limit, remaining, used, resetAt, resetAfter, retryAfter);
}
private static bool TryGet(IReadOnlyDictionary<string, string> headers, string key, out string value)
{
foreach (var pair in headers)
{
if (pair.Key.Equals(key, StringComparison.OrdinalIgnoreCase))
{
value = pair.Value;
return true;
}
}
value = string.Empty;
return false;
}
private static bool TryParseLong(IReadOnlyDictionary<string, string> headers, string key, out long result)
{
result = 0;
if (TryGet(headers, key, out var value) && long.TryParse(value, NumberStyles.Integer, CultureInfo.InvariantCulture, out var parsed))
{
result = parsed;
return true;
}
return false;
}
}
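
The parser reads GitHub's standard rate-limit headers case-insensitively; for example (values illustrative):

var now = DateTimeOffset.Parse("2025-10-18T12:00:00Z");
var headers = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
{
    ["X-RateLimit-Limit"] = "5000",
    ["X-RateLimit-Remaining"] = "0",
    ["X-RateLimit-Reset"] = now.AddMinutes(7).ToUnixTimeSeconds().ToString(),
    ["Retry-After"] = "120",
};

var snapshot = GhsaRateLimitParser.TryParse(headers, now, phase: "list");
// Limit 5000, Remaining 0, ResetAfter roughly 7 minutes, RetryAfter 2 minutes.
// ApplyRateLimitAsync prefers RetryAfter over ResetAfter, so the connector would sleep 2 minutes.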

@@ -0,0 +1,23 @@
using System;
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
internal readonly record struct GhsaRateLimitSnapshot(
string Phase,
string? Resource,
long? Limit,
long? Remaining,
long? Used,
DateTimeOffset? ResetAt,
TimeSpan? ResetAfter,
TimeSpan? RetryAfter)
{
public bool HasData =>
Limit.HasValue ||
Remaining.HasValue ||
Used.HasValue ||
ResetAt.HasValue ||
ResetAfter.HasValue ||
RetryAfter.HasValue ||
!string.IsNullOrEmpty(Resource);
}

@@ -0,0 +1,75 @@
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
internal sealed record GhsaRecordDto
{
public string GhsaId { get; init; } = string.Empty;
public string? Summary { get; init; }
public string? Description { get; init; }
public string? Severity { get; init; }
public DateTimeOffset? PublishedAt { get; init; }
public DateTimeOffset? UpdatedAt { get; init; }
public IReadOnlyList<string> Aliases { get; init; } = Array.Empty<string>();
public IReadOnlyList<GhsaReferenceDto> References { get; init; } = Array.Empty<GhsaReferenceDto>();
public IReadOnlyList<GhsaAffectedDto> Affected { get; init; } = Array.Empty<GhsaAffectedDto>();
public IReadOnlyList<GhsaCreditDto> Credits { get; init; } = Array.Empty<GhsaCreditDto>();
public IReadOnlyList<GhsaWeaknessDto> Cwes { get; init; } = Array.Empty<GhsaWeaknessDto>();
public GhsaCvssDto? Cvss { get; init; }
}
internal sealed record GhsaReferenceDto
{
public string Url { get; init; } = string.Empty;
public string? Type { get; init; }
public string? Name { get; init; }
}
internal sealed record GhsaAffectedDto
{
public string PackageName { get; init; } = string.Empty;
public string Ecosystem { get; init; } = string.Empty;
public string? VulnerableRange { get; init; }
public string? PatchedVersion { get; init; }
}
internal sealed record GhsaCreditDto
{
public string? Type { get; init; }
public string? Name { get; init; }
public string? Login { get; init; }
public string? ProfileUrl { get; init; }
}
internal sealed record GhsaWeaknessDto
{
public string? CweId { get; init; }
public string? Name { get; init; }
}
internal sealed record GhsaCvssDto
{
public double? Score { get; init; }
public string? VectorString { get; init; }
public string? Severity { get; init; }
}

@@ -0,0 +1,269 @@
using System.Collections.Generic;
using System.Globalization;
using System.Text.Json;
namespace StellaOps.Concelier.Connector.Ghsa.Internal;
internal static class GhsaRecordParser
{
public static GhsaRecordDto Parse(ReadOnlySpan<byte> content)
{
using var document = JsonDocument.Parse(content.ToArray());
var root = document.RootElement;
var ghsaId = GetString(root, "ghsa_id") ?? throw new JsonException("ghsa_id missing");
var summary = GetString(root, "summary");
var description = GetString(root, "description");
var severity = GetString(root, "severity");
var publishedAt = GetDate(root, "published_at");
var updatedAt = GetDate(root, "updated_at") ?? publishedAt;
var aliases = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
ghsaId,
};
if (root.TryGetProperty("cve_ids", out var cveIds) && cveIds.ValueKind == JsonValueKind.Array)
{
foreach (var cve in cveIds.EnumerateArray())
{
if (cve.ValueKind == JsonValueKind.String && !string.IsNullOrWhiteSpace(cve.GetString()))
{
aliases.Add(cve.GetString()!);
}
}
}
var references = ParseReferences(root);
var affected = ParseAffected(root);
var credits = ParseCredits(root);
var cwes = ParseCwes(root);
var cvss = ParseCvss(root);
return new GhsaRecordDto
{
GhsaId = ghsaId,
Summary = summary,
Description = description,
Severity = severity,
PublishedAt = publishedAt,
UpdatedAt = updatedAt,
Aliases = aliases.ToArray(),
References = references,
Affected = affected,
Credits = credits,
Cwes = cwes,
Cvss = cvss,
};
}
private static IReadOnlyList<GhsaReferenceDto> ParseReferences(JsonElement root)
{
if (!root.TryGetProperty("references", out var references) || references.ValueKind != JsonValueKind.Array)
{
return Array.Empty<GhsaReferenceDto>();
}
var list = new List<GhsaReferenceDto>(references.GetArrayLength());
foreach (var reference in references.EnumerateArray())
{
if (reference.ValueKind != JsonValueKind.Object)
{
continue;
}
var url = GetString(reference, "url");
if (string.IsNullOrWhiteSpace(url))
{
continue;
}
list.Add(new GhsaReferenceDto
{
Url = url,
Type = GetString(reference, "type"),
Name = GetString(reference, "name"),
});
}
return list;
}
private static IReadOnlyList<GhsaAffectedDto> ParseAffected(JsonElement root)
{
if (!root.TryGetProperty("vulnerabilities", out var vulnerabilities) || vulnerabilities.ValueKind != JsonValueKind.Array)
{
return Array.Empty<GhsaAffectedDto>();
}
var list = new List<GhsaAffectedDto>(vulnerabilities.GetArrayLength());
foreach (var entry in vulnerabilities.EnumerateArray())
{
if (entry.ValueKind != JsonValueKind.Object)
{
continue;
}
var package = entry.TryGetProperty("package", out var packageElement) && packageElement.ValueKind == JsonValueKind.Object
? packageElement
: default;
var packageName = GetString(package, "name") ?? "unknown-package";
var ecosystem = GetString(package, "ecosystem") ?? "unknown";
var vulnerableRange = GetString(entry, "vulnerable_version_range");
string? patchedVersion = null;
if (entry.TryGetProperty("first_patched_version", out var patchedElement) && patchedElement.ValueKind == JsonValueKind.Object)
{
patchedVersion = GetString(patchedElement, "identifier");
}
list.Add(new GhsaAffectedDto
{
PackageName = packageName,
Ecosystem = ecosystem,
VulnerableRange = vulnerableRange,
PatchedVersion = patchedVersion,
});
}
return list;
}
private static IReadOnlyList<GhsaCreditDto> ParseCredits(JsonElement root)
{
if (!root.TryGetProperty("credits", out var credits) || credits.ValueKind != JsonValueKind.Array)
{
return Array.Empty<GhsaCreditDto>();
}
var list = new List<GhsaCreditDto>(credits.GetArrayLength());
foreach (var credit in credits.EnumerateArray())
{
if (credit.ValueKind != JsonValueKind.Object)
{
continue;
}
var type = GetString(credit, "type");
var name = GetString(credit, "name");
string? login = null;
string? profile = null;
if (credit.TryGetProperty("user", out var user) && user.ValueKind == JsonValueKind.Object)
{
login = GetString(user, "login");
profile = GetString(user, "html_url") ?? GetString(user, "url");
name ??= GetString(user, "name");
}
name ??= login;
if (string.IsNullOrWhiteSpace(name))
{
continue;
}
list.Add(new GhsaCreditDto
{
Type = type,
Name = name,
Login = login,
ProfileUrl = profile,
});
}
return list;
}
private static IReadOnlyList<GhsaWeaknessDto> ParseCwes(JsonElement root)
{
if (!root.TryGetProperty("cwes", out var cwes) || cwes.ValueKind != JsonValueKind.Array)
{
return Array.Empty<GhsaWeaknessDto>();
}
var list = new List<GhsaWeaknessDto>(cwes.GetArrayLength());
foreach (var entry in cwes.EnumerateArray())
{
if (entry.ValueKind != JsonValueKind.Object)
{
continue;
}
var cweId = GetString(entry, "cwe_id");
if (string.IsNullOrWhiteSpace(cweId))
{
continue;
}
list.Add(new GhsaWeaknessDto
{
CweId = cweId,
Name = GetString(entry, "name"),
});
}
return list.Count == 0 ? Array.Empty<GhsaWeaknessDto>() : list;
    }

    private static GhsaCvssDto? ParseCvss(JsonElement root)
{
if (!root.TryGetProperty("cvss", out var cvss) || cvss.ValueKind != JsonValueKind.Object)
{
return null;
}
double? score = null;
if (cvss.TryGetProperty("score", out var scoreElement) && scoreElement.ValueKind == JsonValueKind.Number)
{
score = scoreElement.GetDouble();
}
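        // Tolerate both snake_case ("vector_string") and camelCase ("vectorString") key casing.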
var vector = GetString(cvss, "vector_string") ?? GetString(cvss, "vectorString");
var severity = GetString(cvss, "severity");
if (score is null && string.IsNullOrWhiteSpace(vector) && string.IsNullOrWhiteSpace(severity))
{
return null;
}
return new GhsaCvssDto
{
Score = score,
VectorString = vector,
Severity = severity,
};
    }

    private static string? GetString(JsonElement element, string propertyName)
{
if (element.ValueKind != JsonValueKind.Object)
{
return null;
}
if (!element.TryGetProperty(propertyName, out var property))
{
return null;
}
return property.ValueKind switch
{
JsonValueKind.String => property.GetString(),
_ => null,
};
    }

    private static DateTimeOffset? GetDate(JsonElement element, string propertyName)
{
var value = GetString(element, propertyName);
if (string.IsNullOrWhiteSpace(value))
{
return null;
}
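        // AssumeUniversal plus ToUniversalTime keeps timestamps deterministic regardless
        // of whether the source payload carries an explicit offset.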
return DateTimeOffset.TryParse(value, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var parsed)
? parsed.ToUniversalTime()
: null;
}
}
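
For orientation between files: a minimal sketch of the vulnerability entry shape that the affected-package parsing above tolerates. The JSON is an illustrative fixture, not a captured GHSA response; only the field names mirror what the parser reads.

```csharp
using System;
using System.Text.Json;

// Hypothetical fixture shaped the way ParseAffected expects; values are made up.
const string sample = """
{
  "vulnerabilities": [
    {
      "package": { "ecosystem": "npm", "name": "example-pkg" },
      "vulnerable_version_range": ">= 1.2.0, < 1.2.5",
      "first_patched_version": { "identifier": "1.2.5" }
    }
  ]
}
""";

using var document = JsonDocument.Parse(sample);
var entry = document.RootElement.GetProperty("vulnerabilities")[0];

// The same fields the parser extracts, minus its defensive ValueKind guards.
Console.WriteLine(entry.GetProperty("package").GetProperty("name").GetString());                     // example-pkg
Console.WriteLine(entry.GetProperty("vulnerable_version_range").GetString());                        // >= 1.2.0, < 1.2.5
Console.WriteLine(entry.GetProperty("first_patched_version").GetProperty("identifier").GetString()); // 1.2.5
```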

View File

@@ -0,0 +1,43 @@
using StellaOps.Concelier.Core.Jobs;

namespace StellaOps.Concelier.Connector.Ghsa;

internal static class GhsaJobKinds
{
    public const string Fetch = "source:ghsa:fetch";
    public const string Parse = "source:ghsa:parse";
    public const string Map = "source:ghsa:map";
}

internal sealed class GhsaFetchJob : IJob
{
    private readonly GhsaConnector _connector;

    public GhsaFetchJob(GhsaConnector connector)
        => _connector = connector ?? throw new ArgumentNullException(nameof(connector));

    public Task ExecuteAsync(JobExecutionContext context, CancellationToken cancellationToken)
        => _connector.FetchAsync(context.Services, cancellationToken);
}

internal sealed class GhsaParseJob : IJob
{
    private readonly GhsaConnector _connector;

    public GhsaParseJob(GhsaConnector connector)
        => _connector = connector ?? throw new ArgumentNullException(nameof(connector));

    public Task ExecuteAsync(JobExecutionContext context, CancellationToken cancellationToken)
        => _connector.ParseAsync(context.Services, cancellationToken);
}

internal sealed class GhsaMapJob : IJob
{
    private readonly GhsaConnector _connector;

    public GhsaMapJob(GhsaConnector connector)
        => _connector = connector ?? throw new ArgumentNullException(nameof(connector));

    public Task ExecuteAsync(JobExecutionContext context, CancellationToken cancellationToken)
        => _connector.MapAsync(context.Services, cancellationToken);
}
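
The three wrappers above only delegate to `GhsaConnector`, so composition is the interesting part. A minimal sketch, assuming a standard `Microsoft.Extensions.DependencyInjection` container; the real registration (the DI extension and `JobSchedulerBuilder` noted in TASKS) lives elsewhere in this commit, so the lifetimes here are assumptions.

```csharp
using Microsoft.Extensions.DependencyInjection;

// Hypothetical composition root; the connector's own constructor dependencies
// (HTTP client, Mongo stores, GhsaOptions) are elided for brevity.
var services = new ServiceCollection();
services.AddSingleton<GhsaConnector>();   // one connector instance behind all three jobs
services.AddTransient<GhsaFetchJob>();    // "source:ghsa:fetch"
services.AddTransient<GhsaParseJob>();    // "source:ghsa:parse"
services.AddTransient<GhsaMapJob>();      // "source:ghsa:map"

using var provider = services.BuildServiceProvider();
```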

View File

@@ -0,0 +1,4 @@
using System.Runtime.CompilerServices;
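
// Expose internals to the fixture-regeneration tool and the connector test suite
// so canned GHSA payloads can exercise internal parser/mapper types directly.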
[assembly: InternalsVisibleTo("FixtureUpdater")]
[assembly: InternalsVisibleTo("StellaOps.Concelier.Connector.Ghsa.Tests")]

View File

@@ -0,0 +1,17 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <ProjectReference Include="../StellaOps.Plugin/StellaOps.Plugin.csproj" />
    <ProjectReference Include="../StellaOps.Concelier.Connector.Common/StellaOps.Concelier.Connector.Common.csproj" />
    <ProjectReference Include="../StellaOps.Concelier.Models/StellaOps.Concelier.Models.csproj" />
    <ProjectReference Include="../StellaOps.Concelier.Normalization/StellaOps.Concelier.Normalization.csproj" />
  </ItemGroup>

</Project>

View File

@@ -0,0 +1,19 @@
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|Select GHSA data source & auth model|BE-Conn-GHSA|Research|**DONE (2025-10-10)** Adopted GitHub Security Advisories REST (global) endpoint with bearer token + API version headers documented in `GhsaOptions`.|
|Fetch pipeline & state management|BE-Conn-GHSA|Source.Common, Storage.Mongo|**DONE (2025-10-10)** Implemented list/detail fetch using `GhsaCursor` (time window + page), resumable SourceState and backoff controls.|
|DTO & parser implementation|BE-Conn-GHSA|Source.Common|**DONE (2025-10-10)** Added `GhsaRecordParser`/DTOs extracting aliases, references, severity, vulnerable ranges, patched versions.|
|Canonical mapping & range primitives|BE-Conn-GHSA|Models|**DONE (2025-10-10)** `GhsaMapper` emits GHSA advisories with SemVer packages, vendor extensions (ecosystem/package) and deterministic references.<br>2025-10-11 research trail: upcoming normalized array should follow `[{"scheme":"semver","type":"range","min":"<min>","minInclusive":true,"max":"<max>","maxInclusive":false,"notes":"ghsa:GHSA-xxxx"}]`; include patched-only advisories as `lt`/`lte` when no explicit floor.|
|Deterministic fixtures & tests|QA|Testing|**DONE (2025-10-10)** New `StellaOps.Concelier.Connector.Ghsa.Tests` regression covers fetch/parse/map via canned GHSA fixtures and snapshot assertions.|
|Telemetry & documentation|DevEx|Docs|**DONE (2025-10-10)** Diagnostics meter (`ghsa.fetch.*`) wired; DI extension documents token/headers and job registrations.|
|GitHub quota monitoring & retries|BE-Conn-GHSA, Observability|Source.Common|**DONE (2025-10-12)** Rate-limit metrics/logs added, retry/backoff handles 403 secondary limits, and ops runbook documents dashboards + mitigation steps.|
|Production credential & scheduler rollout|Ops, BE-Conn-GHSA|Docs, WebService|**DONE (2025-10-12)** Scheduler defaults registered via `JobSchedulerBuilder`, credential provisioning documented (Compose/Helm samples), and staged backfill guidance captured in `docs/ops/concelier-ghsa-operations.md`.|
|FEEDCONN-GHSA-04-002 Conflict regression fixtures|BE-Conn-GHSA, QA|Merge `FEEDMERGE-ENGINE-04-001`|**DONE (2025-10-12)** Added `conflict-ghsa.canonical.json` + `GhsaConflictFixtureTests`; SemVer ranges and credits align with the merge precedence triple and are shareable with QA. Validation: `dotnet test src/StellaOps.Concelier.Connector.Ghsa.Tests/StellaOps.Concelier.Connector.Ghsa.Tests.csproj --filter GhsaConflictFixtureTests`.|
|FEEDCONN-GHSA-02-004 GHSA credits & ecosystem severity mapping|BE-Conn-GHSA|Models `FEEDMODELS-SCHEMA-01-002`|**DONE (2025-10-11)** Mapper emits advisory credits with provenance masks, fixtures assert role/contact ordering, and severity normalization remains unchanged.|
|FEEDCONN-GHSA-02-007 Credit parity regression fixtures|BE-Conn-GHSA, QA|Source.Nvd, Source.Osv|**DONE (2025-10-12)** Parity fixtures regenerated via `tools/FixtureUpdater`, normalized SemVer notes verified against GHSA/NVD/OSV snapshots, and the fixtures guide now documents the headroom checks.|
|FEEDCONN-GHSA-02-001 Normalized versions rollout|BE-Conn-GHSA|Models `FEEDMODELS-SCHEMA-01-003`, Normalization playbook|**DONE (2025-10-11)** GHSA mapper now emits SemVer primitives + normalized ranges, fixtures refreshed, connector tests passing; report logged via FEEDMERGE-COORD-02-900.|
|FEEDCONN-GHSA-02-005 Quota monitoring hardening|BE-Conn-GHSA, Observability|Source.Common metrics|**DONE (2025-10-12)** Diagnostics expose headroom histograms/gauges, warning logs dedupe below the configured threshold, and the ops runbook gained alerting and mitigation guidance.|
|FEEDCONN-GHSA-02-006 Scheduler rollout integration|BE-Conn-GHSA, Ops|Job scheduler|**DONE (2025-10-12)** Dependency routine tests assert cron/timeouts, and the runbook highlights cron overrides plus backoff toggles for staged rollouts.|
|FEEDCONN-GHSA-04-003 Description/CWE/metric parity rollout|BE-Conn-GHSA|Models, Core|**DONE (2025-10-15)** Mapper emits advisory description, CWE weaknesses, and canonical CVSS metric id with updated fixtures (`osv-ghsa.osv.json` parity suite) and connector regression covers the new fields. Reported completion to Merge coordination.|
|FEEDCONN-GHSA-04-004 Canonical metric fallback coverage|BE-Conn-GHSA|Models, Merge|**DONE (2025-10-16)** Ensure canonical metric ids remain populated when GitHub omits CVSS vectors/scores; add fixtures capturing severity-only advisories, document precedence with Merge, and emit analytics to track fallback usage.<br>2025-10-16: Mapper now emits `ghsa:severity/<level>` canonical ids when vectors are missing, diagnostics expose `ghsa.map.canonical_metric_fallbacks`, conflict/mapper fixtures updated, and runbook documents Merge precedence. Tests: `dotnet test src/StellaOps.Concelier.Connector.Ghsa.Tests/StellaOps.Concelier.Connector.Ghsa.Tests.csproj`.|