Some checks failed
Build Test Deploy / build-test (push) Has been cancelled
Build Test Deploy / docs (push) Has been cancelled
Build Test Deploy / deploy (push) Has been cancelled
Docs CI / lint-and-preview (push) Has been cancelled

This commit is contained in:
Vladimir Moushkov
2025-10-09 18:59:17 +03:00
parent 18b1922f60
commit d0c95cf328
277 changed files with 17449 additions and 595 deletions

TASKS.md (new file)

@@ -0,0 +1,13 @@
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|Merge identity graph & alias store|BE-Merge|Models, Storage.Mongo|**DONE** alias store/resolver, component builder, reconcile job, persistence + diagnostics endpoint landed.|
|OSV alias consolidation & per-ecosystem snapshots|BE-Conn-OSV, QA|Merge, Testing|**DONE** alias graph handles GHSA/CVE records and deterministic snapshots exist across ecosystems.|
|Oracle PSIRT pipeline completion|BE-Conn-Oracle|Source.Common, Core|**DONE** Oracle mapper now emits CVE aliases, vendor affected packages, patch references, and resume/backfill flow is covered by integration tests.|
|VMware connector observability & resume coverage|BE-Conn-VMware, QA|Source.Common, Storage.Mongo|**DONE** VMware diagnostics emit fetch/parse/map metrics, fetch dedupe uses hash cache, and integration test covers snapshot plus resume path.|
|Model provenance & range backlog|BE-Merge|Models|**DOING** VMware/Oracle/Chromium, NVD, Debian, SUSE, Ubuntu, and Adobe emit RangePrimitives (Debian EVR + SUSE NEVRA + Ubuntu EVR telemetry online; Adobe now reports `adobe.track/platform/priority/availability` telemetry with fixed-status provenance). Remaining connectors (Apple, etc.) still need structured primitives/EVR coverage.|
|Trivy DB exporter delta strategy|BE-Export|Exporters|**TODO** finish `ExportStateManager` delta reset and design incremental layer reuse for unchanged trees.|
|Red Hat fixture validation sweep|QA|Source.Distro.RedHat|**DOING** finalize RHSA fixture regeneration once connector regression fixes land.|
|JVN VULDEF schema update|BE-Conn-JVN, QA|Source.Jvn|**DONE** schema patched (vendor/product attrs, impact entries, err codes), parser tightened, fixtures/tests refreshed.|
|Build/test sweeps|QA|All modules|**DOING** targeted suites green (Models, VMware, Oracle, Chromium, JVN, Cert-In). Full solution run still fails due to `StellaOps.Feedser.Storage.Mongo.Tests/AdvisoryStorePerformanceTests` exceeding perf budget; rerun once budget or test adjusted.|
|OSV vs GHSA parity checks|QA, BE-Merge|Merge|**TODO** design diff detection between OSV and GHSA feeds to surface inconsistencies.|

TODOS.md (new file)

@@ -0,0 +1,36 @@
# Pending Task Backlog
> Last updated: 2025-10-09 (UTC)
## Common
- **Build/test sweeps (QA DOING)**
Full solution runs still fail the `StellaOps.Feedser.Storage.Mongo.Tests/AdvisoryStorePerformanceTests` budget. We need either to optimise the hot paths in `AdvisoryStore` for large advisory payloads or relax the perf thresholds with new baseline data. Once the bottleneck is addressed, rerun the full suite and capture metrics for the release checklist.
- **OSV vs GHSA parity checks (QA & BE-Merge TODO)**
Design and implement a diff detector comparing OSV advisories against GHSA records. The deliverable should flag mismatched aliases, missing affected ranges, or divergent severities, surface actionable telemetry/alerts, and include regression tests with canned OSV+GHSA fixtures.
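A hedged sketch of what such a detector could look like — the record shape and class names here are illustrative placeholders, not the actual Feedser canonical model:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative shapes only; the real canonical advisory model lives in StellaOps.Feedser.Models.
public sealed record AdvisorySnapshot(
    string Id,
    IReadOnlySet<string> Aliases,
    IReadOnlySet<string> AffectedRanges,
    string? Severity);

public static class OsvGhsaParity
{
    // Compares an OSV advisory with the GHSA record it aliases and yields human-readable findings.
    public static IEnumerable<string> Diff(AdvisorySnapshot osv, AdvisorySnapshot ghsa)
    {
        foreach (var alias in osv.Aliases.Except(ghsa.Aliases))
            yield return $"alias {alias} present in OSV but missing from GHSA";

        foreach (var range in osv.AffectedRanges.Except(ghsa.AffectedRanges))
            yield return $"affected range {range} present in OSV but missing from GHSA";

        if (!string.Equals(osv.Severity, ghsa.Severity, StringComparison.OrdinalIgnoreCase))
            yield return $"severity mismatch: OSV={osv.Severity ?? "none"} GHSA={ghsa.Severity ?? "none"}";
    }
}
```

Each finding would then feed the telemetry/alerting surface, and canned OSV+GHSA fixture pairs give the regression tests deterministic expected diffs.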
## Prerequisites
- **Range primitives for SemVer/EVR/NEVRA metadata (BE-Merge DOING)**
The core model supports range primitives, but several connectors (notably Apple, remaining vendor feeds, and older distro paths) still emit raw strings. We must extend those mappers to populate the structured envelopes (SemVer/EVR/NEVRA plus vendor extensions) and add fixture coverage so merge/export layers see consistent telemetry.
- **Provenance envelope field masks (BE-Merge DOING)**
Provenance needs richer categorisation (component category, severity bands, resume counters) and better dedupe metrics. Update the provenance model, extend diagnostics to emit the new tags, and refresh dashboards/tests to ensure determinism once additional metadata flows through.
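As a rough illustration of the structured envelopes the range-primitive work asks mappers to populate (field names are assumptions for this sketch, not the shipped model):

```csharp
using System.Collections.Generic;

// Assumed shapes for illustration; the shipped RangePrimitives model may differ.
public sealed record SemVerPrimitive(string? Introduced, string? Fixed, string? LastAffected);

public sealed record EvrPrimitive(int? Epoch, string UpstreamVersion, string? Revision);

public sealed record NevraPrimitive(string Name, int? Epoch, string Version, string Release, string Architecture);

public sealed record RangePrimitives(
    SemVerPrimitive? SemVer,
    EvrPrimitive? Evr,
    NevraPrimitive? Nevra,
    IReadOnlyDictionary<string, string>? VendorExtensions); // e.g. { ["ubuntu.pocket"] = "security" }
```

The point of the envelope is that merge/export layers can treat every connector's output uniformly instead of re-parsing raw version strings.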
## Implementations
- **Model provenance & range backlog (BE-Merge DOING)**
With Adobe/Ubuntu now emitting range primitives, focus on the remaining connectors (e.g., Apple, smaller vendor PSIRTs). Update their pipelines, regenerate goldens, and confirm `feedser.range.primitives` metrics reflect the added telemetry. The task closes when every high-priority source produces structured ranges with provenance.
- **Trivy DB exporter delta strategy (BE-Export TODO)**
Finalise the delta-reset story in `ExportStateManager`: define when to invalidate baselines, how to reuse unchanged layers, and document operator workflows. Implement planner logic for layer reuse, update exporter tests, and exercise a delta→full→delta sequence.
- **Red Hat fixture validation sweep (QA DOING)**
Regenerate RHSA fixtures with the latest connector output and make sure the regenerated snapshots align once the outstanding connector tweaks land. Blockers: connector regression fixes still in-flight; revisit once those merges stabilise to avoid churn.
- **Plan incremental/delta exports (BE-Export DOING)**
`TrivyDbExportPlanner` now captures changed files but does not yet reuse existing OCI layers. Extend the planner to build per-file manifests, teach the writer to skip untouched layers, and add delta-cycle tests covering file removals, additions, and checksum changes.
- **Scan execution & result upload workflow (DevEx/CLI & Ops Integrator DOING)**
`stella scan run`/`stella scan upload` need completion: support the remaining executor backends (dotnet/self-hosted/docker), capture structured run metadata, implement retry/backoff on uploads, and add integration tests exercising happy-path and failure retries. Update CLI docs once the workflow is stable.
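The layer-reuse decision described for the Trivy delta exporter above could be sketched like this (types and names are hypothetical, not the actual `TrivyDbExportPlanner` API):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical planner fragment: reuse an existing OCI layer only when the file's
// checksum is unchanged relative to the baseline full export.
public sealed record LayerPlan(string Path, string Checksum, bool ReuseExisting);

public static class DeltaLayerPlanner
{
    public static IReadOnlyList<LayerPlan> Plan(
        IReadOnlyDictionary<string, string> baselineChecksums,
        IReadOnlyDictionary<string, string> currentChecksums)
    {
        var plans = new List<LayerPlan>(currentChecksums.Count);
        foreach (var (path, checksum) in currentChecksums)
        {
            var unchanged = baselineChecksums.TryGetValue(path, out var previous)
                && string.Equals(previous, checksum, StringComparison.Ordinal);
            plans.Add(new LayerPlan(path, checksum, ReuseExisting: unchanged));
        }
        return plans;
    }
}
```

A delta→full→delta test sequence then only needs to assert which `ReuseExisting` flags flip across file additions, removals, and checksum changes.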


@@ -154,9 +154,12 @@ Each connector ships fixtures/tests under the matching `*.Tests` project.
* **Connector/exporter packages** each source/exporter can ship as a plug-in
assembly with its own options and HttpClient configuration, keeping the core
image minimal.
* **StellaOps CLI (agent)** new `StellaOps.Cli` module that exposes
`scanner`, `scan`, and `db` verbs (via System.CommandLine 2.0) to download
scanner container bundles, install them locally, execute scans against target
directories, automatically upload results, and trigger Feedser jobs (`db
fetch/merge/export`) aligned with the SBOM-first workflow described in
`AGENTS.md`.
* **Offline Kit** bundles Feedser plug-ins, JSON tree, Trivy DB, and export
manifests so air-gapped sites can load the latest vulnerability data without
outbound connectivity.


@@ -183,21 +183,79 @@ Validation errors come back as:
---
### 2.4 Attestation (Planned – Q1 2026)
```
POST /attest
```
| Param | Purpose |
| ----------- | ------------------------------------- |
| body (JSON) | SLSA v1.0 provenance doc |
| | Signed + stored in local Rekor mirror |
Returns `202 Accepted` and `Location: /attest/{id}` for async verify.
---
## 3 StellaOps CLI (`stellaops-cli`)
The new CLI is built on **System.CommandLine 2.0.0-beta5** and mirrors the Feedser backend REST API.
Configuration follows the same precedence chain everywhere:
1. Environment variables (e.g. `API_KEY`, `STELLAOPS_BACKEND_URL`, `StellaOps:ApiKey`)
2. `appsettings.json` → `appsettings.local.json`
3. `appsettings.yaml` → `appsettings.local.yaml`
4. Defaults (`ApiKey = ""`, `BackendUrl = ""`, cache folders under the current working directory)
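Expressed with `Microsoft.Extensions.Configuration`, that chain could look roughly like this — remember that later providers override earlier ones, so the lowest-precedence source is added first (YAML support assumes a third-party provider package; this is a sketch, not the shipped bootstrapper):

```csharp
using Microsoft.Extensions.Configuration;

var configuration = new ConfigurationBuilder()
    // 4. Defaults are supplied in code on the options type itself.
    // 3. YAML files would go here (requires a YAML configuration provider package).
    // 2. JSON files; the .local variant overrides the base file.
    .AddJsonFile("appsettings.json", optional: true, reloadOnChange: false)
    .AddJsonFile("appsettings.local.json", optional: true, reloadOnChange: false)
    // 1. Environment variables win over everything added before them.
    .AddEnvironmentVariables()
    .Build();

var apiKey = configuration["StellaOps:ApiKey"] ?? string.Empty;
```

Adding the environment-variable provider last is what gives `API_KEY`/`STELLAOPS_BACKEND_URL` the top spot in the precedence chain.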
| Command | Purpose | Key Flags / Arguments | Notes |
|---------|---------|-----------------------|-------|
| `stellaops-cli scanner download` | Fetch and install scanner container | `--channel <stable\|beta\|nightly>` (default `stable`)<br>`--output <path>`<br>`--overwrite`<br>`--no-install` | Saves artefact under `ScannerCacheDirectory`, verifies digest/signature, and executes `docker load` unless `--no-install` is supplied. |
| `stellaops-cli scan run` | Execute scanner container against a directory (auto-upload) | `--target <directory>` (required)<br>`--runner <docker\|dotnet\|self>` (default from config)<br>`--entry <image-or-entrypoint>`<br>`[scanner-args...]` | Runs the scanner, writes results into `ResultsDirectory`, and automatically uploads the artefact when the exit code is `0`. |
| `stellaops-cli scan upload` | Re-upload existing scan artefact | `--file <path>` | Useful for retries when automatic upload fails or when operating offline. |
| `stellaops-cli db fetch` | Trigger connector jobs | `--source <id>` (e.g. `redhat`, `osv`)<br>`--stage <fetch\|parse\|map>` (default `fetch`)<br>`--mode <resume\|init\|cursor>` | Translates to `POST /jobs/source:{source}:{stage}` with `trigger=cli` |
| `stellaops-cli db merge` | Run canonical merge reconcile | — | Calls `POST /jobs/merge:reconcile`; exit code `0` on acceptance, `1` on failures/conflicts |
| `stellaops-cli db export` | Kick JSON / Trivy exports | `--format <json\|trivy-db>` (default `json`)<br>`--delta` | Sets `{ delta = true }` parameter when requested |
| `stellaops-cli config show` | Display resolved configuration | — | Masks secret values; helpful for air-gapped installs |
**Logging & exit codes**
- Structured logging via `Microsoft.Extensions.Logging` with single-line console output (timestamps in UTC).
- `--verbose / -v` raises log level to `Debug`.
- Command exit codes bubble up: backend conflict → `1`, cancelled via `CTRL+C` → `130`, scanner exit codes propagate as-is.
**Artifact validation**
- Downloads are verified against the `X-StellaOps-Digest` header (SHA-256). When `StellaOps:ScannerSignaturePublicKeyPath` points to a PEM-encoded RSA key, the optional `X-StellaOps-Signature` header is validated as well.
- Metadata for each bundle is written alongside the artefact (`*.metadata.json`) with digest, signature, source URL, and timestamps.
- Retry behaviour is controlled via `StellaOps:ScannerDownloadAttempts` (default **3** with exponential backoff).
- Successful `scan run` executions create timestamped JSON artefacts inside `ResultsDirectory`; these are posted back to Feedser automatically.
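A simplified sketch of the digest check and retry loop described above (the real client also handles signature validation, caching, and the `*.metadata.json` sidecar; the helper name here is illustrative):

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;

static async Task<byte[]> DownloadVerifiedAsync(HttpClient http, string url, int maxAttempts, CancellationToken ct)
{
    for (var attempt = 1; ; attempt++)
    {
        try
        {
            using var response = await http.GetAsync(url, ct);
            response.EnsureSuccessStatusCode();
            var bytes = await response.Content.ReadAsByteArrayAsync(ct);

            // Compare the payload's SHA-256 against the X-StellaOps-Digest header.
            var expected = response.Headers.GetValues("X-StellaOps-Digest").First();
            var actual = "sha256:" + Convert.ToHexString(SHA256.HashData(bytes)).ToLowerInvariant();
            if (!string.Equals(expected, actual, StringComparison.OrdinalIgnoreCase))
                throw new InvalidOperationException($"Digest mismatch: expected {expected}, got {actual}.");

            return bytes;
        }
        catch (Exception) when (attempt < maxAttempts)
        {
            // Exponential backoff between attempts: 1s, 2s, 4s, ...
            await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)), ct);
        }
    }
}
```

Throwing on a digest mismatch (rather than retrying) matches the test expectation that a corrupted download leaves no artefact on disk.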
**Authentication**
- API key is sent as `Authorization: Bearer <token>` automatically when configured.
- Anonymous operation (empty key) is permitted for offline use cases but backend calls will fail with 401 unless the Feedser instance allows guest access.
**Configuration file template**
```jsonc
{
"StellaOps": {
"ApiKey": "your-api-token",
"BackendUrl": "https://feedser.example.org",
"ScannerCacheDirectory": "scanners",
"ResultsDirectory": "results",
"DefaultRunner": "docker",
"ScannerSignaturePublicKeyPath": "",
"ScannerDownloadAttempts": 3
}
}
```
Drop `appsettings.local.json` or `.yaml` beside the binary to override per environment.
---
### 2.5 Misc Endpoints
| Path | Method | Description |


@@ -47,14 +47,14 @@ StellaOps.Feedser.Source.Ru.Nkcki/ # PDF/HTML bulletins → structured
StellaOps.Feedser.Source.Vndr.Msrc/
StellaOps.Feedser.Source.Vndr.Cisco/
StellaOps.Feedser.Source.Vndr.Oracle/
StellaOps.Feedser.Source.Vndr.Adobe/ # APSB ingest; emits vendor RangePrimitives with adobe.track/platform/priority telemetry + fixed-status provenance.
StellaOps.Feedser.Source.Vndr.Apple/
StellaOps.Feedser.Source.Vndr.Chromium/
StellaOps.Feedser.Source.Vndr.Vmware/
StellaOps.Feedser.Source.Distro.RedHat/
StellaOps.Feedser.Source.Distro.Debian/ # Fetches DSA list + detail HTML, emits EVR RangePrimitives with per-release provenance and telemetry.
StellaOps.Feedser.Source.Distro.Ubuntu/ # Ubuntu Security Notices connector (JSON index → EVR ranges with ubuntu.pocket telemetry).
StellaOps.Feedser.Source.Distro.Suse/ # CSAF fetch pipeline emitting NEVRA RangePrimitives with suse.status vendor telemetry.
StellaOps.Feedser.Source.Ics.Cisa/
StellaOps.Feedser.Source.Ics.Kaspersky/
StellaOps.Feedser.Normalization/ # Canonical mappers, validators, version-range normalization
@@ -169,7 +169,7 @@ public interface IFeedConnector {
## 8) Observability
* Serilog structured logging with enrichment fields (`source`, `uri`, `stage`, `durationMs`).
* OpenTelemetry traces around fetch/parse/map/export; metrics for rate limit hits, schema failures, dedupe ratios, package size. Connector HTTP metrics are emitted via the shared `feedser.source.http.*` instruments tagged with `feedser.source=<connector>` so per-source dashboards slice on that label instead of bespoke metric names.
* Prometheus scraping endpoint served by WebService.
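With .NET's `System.Diagnostics.Metrics` API, the shared-instrument convention looks roughly like this (the meter and instrument names below merely illustrate the `feedser.source.http.*` pattern and are not confirmed identifiers):

```csharp
using System.Collections.Generic;
using System.Diagnostics.Metrics;

// One shared meter/instrument for all connectors; each connector differs only in
// the tag value, so dashboards slice on feedser.source instead of per-connector
// metric names.
var meter = new Meter("StellaOps.Feedser.Source.Common");
var requests = meter.CreateCounter<long>("feedser.source.http.requests");

requests.Add(1,
    new KeyValuePair<string, object?>("feedser.source", "redhat"),
    new KeyValuePair<string, object?>("stage", "fetch"));
```

The OpenTelemetry exporter then surfaces the tag as a label on the Prometheus scrape endpoint.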
---


@@ -0,0 +1,9 @@
Param(
[Parameter(ValueFromRemainingArguments = $true)]
[string[]] $RestArgs
)
$Root = Split-Path -Parent $PSScriptRoot
$env:UPDATE_GOLDENS = "1"
dotnet test (Join-Path $Root "src/StellaOps.Feedser.Models.Tests/StellaOps.Feedser.Models.Tests.csproj") @RestArgs


@@ -0,0 +1,8 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
export UPDATE_GOLDENS=1
dotnet test "$ROOT_DIR/src/StellaOps.Feedser.Models.Tests/StellaOps.Feedser.Models.Tests.csproj" "$@"


@@ -0,0 +1,170 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Commands;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Telemetry;
using StellaOps.Cli.Tests.Testing;
namespace StellaOps.Cli.Tests.Commands;
public sealed class CommandHandlersTests
{
[Fact]
public async Task HandleExportJobAsync_SetsExitCodeZeroOnSuccess()
{
var original = Environment.ExitCode;
try
{
var backend = new StubBackendClient(new JobTriggerResult(true, "Accepted", "/jobs/export:json/1", null));
var provider = BuildServiceProvider(backend);
await CommandHandlers.HandleExportJobAsync(provider, "json", delta: false, verbose: false, CancellationToken.None);
Assert.Equal(0, Environment.ExitCode);
Assert.Equal("export:json", backend.LastJobKind);
}
finally
{
Environment.ExitCode = original;
}
}
[Fact]
public async Task HandleMergeJobAsync_SetsExitCodeOnFailure()
{
var original = Environment.ExitCode;
try
{
var backend = new StubBackendClient(new JobTriggerResult(false, "Job already running", null, null));
var provider = BuildServiceProvider(backend);
await CommandHandlers.HandleMergeJobAsync(provider, verbose: false, CancellationToken.None);
Assert.Equal(1, Environment.ExitCode);
Assert.Equal("merge:reconcile", backend.LastJobKind);
}
finally
{
Environment.ExitCode = original;
}
}
[Fact]
public async Task HandleScannerRunAsync_AutomaticallyUploadsResults()
{
using var tempDir = new TempDirectory();
var resultsFile = Path.Combine(tempDir.Path, "results", "scan.json");
var backend = new StubBackendClient(new JobTriggerResult(true, "Accepted", null, null));
var executor = new StubExecutor(new ScannerExecutionResult(0, resultsFile));
var options = new StellaOpsCliOptions
{
ResultsDirectory = Path.Combine(tempDir.Path, "results")
};
var provider = BuildServiceProvider(backend, executor, new StubInstaller(), options);
Directory.CreateDirectory(Path.Combine(tempDir.Path, "target"));
var original = Environment.ExitCode;
try
{
await CommandHandlers.HandleScannerRunAsync(
provider,
runner: "docker",
entry: "scanner-image",
targetDirectory: Path.Combine(tempDir.Path, "target"),
arguments: Array.Empty<string>(),
verbose: false,
cancellationToken: CancellationToken.None);
Assert.Equal(0, Environment.ExitCode);
Assert.Equal(resultsFile, backend.LastUploadPath);
}
finally
{
Environment.ExitCode = original;
}
}
private static IServiceProvider BuildServiceProvider(
IBackendOperationsClient backend,
IScannerExecutor? executor = null,
IScannerInstaller? installer = null,
StellaOpsCliOptions? options = null)
{
var services = new ServiceCollection();
services.AddSingleton(backend);
services.AddSingleton<ILoggerFactory>(_ => LoggerFactory.Create(builder => builder.SetMinimumLevel(LogLevel.Debug)));
services.AddSingleton(new VerbosityState());
services.AddSingleton(options ?? new StellaOpsCliOptions
{
ResultsDirectory = Path.Combine(Path.GetTempPath(), $"stellaops-cli-results-{Guid.NewGuid():N}")
});
services.AddSingleton<IScannerExecutor>(executor ?? new StubExecutor(new ScannerExecutionResult(0, Path.GetTempFileName())));
services.AddSingleton<IScannerInstaller>(installer ?? new StubInstaller());
return services.BuildServiceProvider();
}
private sealed class StubBackendClient : IBackendOperationsClient
{
private readonly JobTriggerResult _result;
public StubBackendClient(JobTriggerResult result)
{
_result = result;
}
public string? LastJobKind { get; private set; }
public string? LastUploadPath { get; private set; }
public Task<ScannerArtifactResult> DownloadScannerAsync(string channel, string outputPath, bool overwrite, bool verbose, CancellationToken cancellationToken)
=> throw new NotImplementedException();
public Task UploadScanResultsAsync(string filePath, CancellationToken cancellationToken)
{
LastUploadPath = filePath;
return Task.CompletedTask;
}
public Task<JobTriggerResult> TriggerJobAsync(string jobKind, IDictionary<string, object?> parameters, CancellationToken cancellationToken)
{
LastJobKind = jobKind;
return Task.FromResult(_result);
}
}
private sealed class StubExecutor : IScannerExecutor
{
private readonly ScannerExecutionResult _result;
public StubExecutor(ScannerExecutionResult result)
{
_result = result;
}
public Task<ScannerExecutionResult> RunAsync(string runner, string entry, string targetDirectory, string resultsDirectory, IReadOnlyList<string> arguments, bool verbose, CancellationToken cancellationToken)
{
Directory.CreateDirectory(Path.GetDirectoryName(_result.ResultsPath)!);
if (!File.Exists(_result.ResultsPath))
{
File.WriteAllText(_result.ResultsPath, "{}");
}
return Task.FromResult(_result);
}
}
private sealed class StubInstaller : IScannerInstaller
{
public Task InstallAsync(string artifactPath, bool verbose, CancellationToken cancellationToken)
=> Task.CompletedTask;
}
}


@@ -0,0 +1,79 @@
using System;
using System.IO;
using System.Text.Json;
using StellaOps.Cli.Configuration;
using Xunit;
namespace StellaOps.Cli.Tests.Configuration;
public sealed class CliBootstrapperTests : IDisposable
{
private readonly string _originalDirectory = Directory.GetCurrentDirectory();
private readonly string _tempDirectory = Path.Combine(Path.GetTempPath(), $"stellaops-cli-tests-{Guid.NewGuid():N}");
public CliBootstrapperTests()
{
Directory.CreateDirectory(_tempDirectory);
Directory.SetCurrentDirectory(_tempDirectory);
}
[Fact]
public void Build_UsesEnvironmentVariablesWhenPresent()
{
Environment.SetEnvironmentVariable("API_KEY", "env-key");
Environment.SetEnvironmentVariable("STELLAOPS_BACKEND_URL", "https://env-backend.example");
try
{
var (options, _) = CliBootstrapper.Build(Array.Empty<string>());
Assert.Equal("env-key", options.ApiKey);
Assert.Equal("https://env-backend.example", options.BackendUrl);
}
finally
{
Environment.SetEnvironmentVariable("API_KEY", null);
Environment.SetEnvironmentVariable("STELLAOPS_BACKEND_URL", null);
}
}
[Fact]
public void Build_FallsBackToAppSettings()
{
WriteAppSettings(new
{
StellaOps = new
{
ApiKey = "file-key",
BackendUrl = "https://file-backend.example"
}
});
var (options, _) = CliBootstrapper.Build(Array.Empty<string>());
Assert.Equal("file-key", options.ApiKey);
Assert.Equal("https://file-backend.example", options.BackendUrl);
}
public void Dispose()
{
Directory.SetCurrentDirectory(_originalDirectory);
if (Directory.Exists(_tempDirectory))
{
try
{
Directory.Delete(_tempDirectory, recursive: true);
}
catch
{
// Ignored.
}
}
}
private static void WriteAppSettings<T>(T payload)
{
var json = JsonSerializer.Serialize(payload, new JsonSerializerOptions { WriteIndented = true });
File.WriteAllText("appsettings.json", json);
}
}


@@ -0,0 +1,235 @@
using System;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Net.Http.Json;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.Transport;
using StellaOps.Cli.Tests.Testing;
namespace StellaOps.Cli.Tests.Services;
public sealed class BackendOperationsClientTests
{
[Fact]
public async Task DownloadScannerAsync_VerifiesDigestAndWritesMetadata()
{
using var temp = new TempDirectory();
var contentBytes = Encoding.UTF8.GetBytes("scanner-blob");
var digestHex = Convert.ToHexString(SHA256.HashData(contentBytes)).ToLowerInvariant();
var handler = new StubHttpMessageHandler((request, _) =>
{
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new ByteArrayContent(contentBytes),
RequestMessage = request
};
response.Headers.Add("X-StellaOps-Digest", $"sha256:{digestHex}");
response.Content.Headers.LastModified = DateTimeOffset.UtcNow;
response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");
return response;
});
var httpClient = new HttpClient(handler)
{
BaseAddress = new Uri("https://feedser.example")
};
var options = new StellaOpsCliOptions
{
BackendUrl = "https://feedser.example",
ScannerCacheDirectory = temp.Path,
ScannerDownloadAttempts = 1
};
var loggerFactory = LoggerFactory.Create(builder => builder.SetMinimumLevel(LogLevel.Debug));
var client = new BackendOperationsClient(httpClient, options, loggerFactory.CreateLogger<BackendOperationsClient>());
var targetPath = Path.Combine(temp.Path, "scanner.tar.gz");
var result = await client.DownloadScannerAsync("stable", targetPath, overwrite: false, verbose: true, CancellationToken.None);
Assert.False(result.FromCache);
Assert.True(File.Exists(targetPath));
var metadataPath = targetPath + ".metadata.json";
Assert.True(File.Exists(metadataPath));
using var document = JsonDocument.Parse(File.ReadAllText(metadataPath));
Assert.Equal($"sha256:{digestHex}", document.RootElement.GetProperty("digest").GetString());
Assert.Equal("stable", document.RootElement.GetProperty("channel").GetString());
}
[Fact]
public async Task DownloadScannerAsync_ThrowsOnDigestMismatch()
{
using var temp = new TempDirectory();
var contentBytes = Encoding.UTF8.GetBytes("scanner-data");
var handler = new StubHttpMessageHandler((request, _) =>
{
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new ByteArrayContent(contentBytes),
RequestMessage = request
};
response.Headers.Add("X-StellaOps-Digest", "sha256:deadbeef");
return response;
});
var httpClient = new HttpClient(handler)
{
BaseAddress = new Uri("https://feedser.example")
};
var options = new StellaOpsCliOptions
{
BackendUrl = "https://feedser.example",
ScannerCacheDirectory = temp.Path,
ScannerDownloadAttempts = 1
};
var loggerFactory = LoggerFactory.Create(builder => builder.SetMinimumLevel(LogLevel.Debug));
var client = new BackendOperationsClient(httpClient, options, loggerFactory.CreateLogger<BackendOperationsClient>());
var targetPath = Path.Combine(temp.Path, "scanner.tar.gz");
await Assert.ThrowsAsync<InvalidOperationException>(() => client.DownloadScannerAsync("stable", targetPath, overwrite: true, verbose: false, CancellationToken.None));
Assert.False(File.Exists(targetPath));
}
[Fact]
public async Task DownloadScannerAsync_RetriesOnFailure()
{
using var temp = new TempDirectory();
var successBytes = Encoding.UTF8.GetBytes("success");
var digestHex = Convert.ToHexString(SHA256.HashData(successBytes)).ToLowerInvariant();
var attempts = 0;
var handler = new StubHttpMessageHandler(
(request, _) =>
{
attempts++;
return new HttpResponseMessage(HttpStatusCode.InternalServerError)
{
RequestMessage = request,
Content = new StringContent("error")
};
},
(request, _) =>
{
attempts++;
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
RequestMessage = request,
Content = new ByteArrayContent(successBytes)
};
response.Headers.Add("X-StellaOps-Digest", $"sha256:{digestHex}");
return response;
});
var httpClient = new HttpClient(handler)
{
BaseAddress = new Uri("https://feedser.example")
};
var options = new StellaOpsCliOptions
{
BackendUrl = "https://feedser.example",
ScannerCacheDirectory = temp.Path,
ScannerDownloadAttempts = 3
};
var loggerFactory = LoggerFactory.Create(builder => builder.SetMinimumLevel(LogLevel.Debug));
var client = new BackendOperationsClient(httpClient, options, loggerFactory.CreateLogger<BackendOperationsClient>());
var targetPath = Path.Combine(temp.Path, "scanner.tar.gz");
var result = await client.DownloadScannerAsync("stable", targetPath, overwrite: false, verbose: false, CancellationToken.None);
Assert.Equal(2, attempts);
Assert.False(result.FromCache);
Assert.True(File.Exists(targetPath));
}
[Fact]
public async Task TriggerJobAsync_ReturnsAcceptedResult()
{
var handler = new StubHttpMessageHandler((request, _) =>
{
var response = new HttpResponseMessage(HttpStatusCode.Accepted)
{
RequestMessage = request,
Content = JsonContent.Create(new JobRunResponse
{
RunId = Guid.NewGuid(),
Status = "queued",
Kind = "export:json",
Trigger = "cli",
CreatedAt = DateTimeOffset.UtcNow
})
};
response.Headers.Location = new Uri("/jobs/export:json/runs/123", UriKind.Relative);
return response;
});
var httpClient = new HttpClient(handler)
{
BaseAddress = new Uri("https://feedser.example")
};
var options = new StellaOpsCliOptions { BackendUrl = "https://feedser.example" };
var loggerFactory = LoggerFactory.Create(builder => builder.SetMinimumLevel(LogLevel.Debug));
var client = new BackendOperationsClient(httpClient, options, loggerFactory.CreateLogger<BackendOperationsClient>());
var result = await client.TriggerJobAsync("export:json", new Dictionary<string, object?>(), CancellationToken.None);
Assert.True(result.Success);
Assert.Equal("Accepted", result.Message);
Assert.Equal("/jobs/export:json/runs/123", result.Location);
}
[Fact]
public async Task TriggerJobAsync_ReturnsFailureMessage()
{
var handler = new StubHttpMessageHandler((request, _) =>
{
var problem = new
{
title = "Job already running",
detail = "export job active"
};
var response = new HttpResponseMessage(HttpStatusCode.Conflict)
{
RequestMessage = request,
Content = JsonContent.Create(problem)
};
return response;
});
var httpClient = new HttpClient(handler)
{
BaseAddress = new Uri("https://feedser.example")
};
var options = new StellaOpsCliOptions { BackendUrl = "https://feedser.example" };
var loggerFactory = LoggerFactory.Create(builder => builder.SetMinimumLevel(LogLevel.Debug));
var client = new BackendOperationsClient(httpClient, options, loggerFactory.CreateLogger<BackendOperationsClient>());
var result = await client.TriggerJobAsync("export:json", new Dictionary<string, object?>(), CancellationToken.None);
Assert.False(result.Success);
Assert.Contains("Job already running", result.Message);
}
}


@@ -0,0 +1,28 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<IsPackable>false</IsPackable>
<!-- To enable Microsoft.Testing.Platform, uncomment the following line. -->
<!-- <UseMicrosoftTestingPlatformRunner>true</UseMicrosoftTestingPlatformRunner> -->
<!-- Note: to use Microsoft.Testing.Platform correctly with dotnet test: -->
<!-- 1. You must add dotnet.config specifying the test runner to be Microsoft.Testing.Platform -->
<!-- 2. You must use .NET 10 SDK or later -->
<!-- For more information, see https://aka.ms/dotnet-test/mtp and https://xunit.net/docs/getting-started/v3/microsoft-testing-platform -->
<!-- To enable code coverage with Microsoft.Testing.Platform, add a package reference to Microsoft.Testing.Extensions.CodeCoverage -->
<!-- https://learn.microsoft.com/dotnet/core/testing/microsoft-testing-platform-extensions-code-coverage -->
</PropertyGroup>
<ItemGroup>
<Using Include="Xunit" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\StellaOps.Cli\StellaOps.Cli.csproj" />
<ProjectReference Include="..\StellaOps.Configuration\StellaOps.Configuration.csproj" />
</ItemGroup>
</Project>


@@ -0,0 +1,55 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Cli.Tests.Testing;
internal sealed class TempDirectory : IDisposable
{
public TempDirectory()
{
Path = System.IO.Path.Combine(System.IO.Path.GetTempPath(), $"stellaops-cli-tests-{Guid.NewGuid():N}");
Directory.CreateDirectory(Path);
}
public string Path { get; }
public void Dispose()
{
try
{
if (Directory.Exists(Path))
{
Directory.Delete(Path, recursive: true);
}
}
catch
{
// ignored
}
}
}
internal sealed class StubHttpMessageHandler : HttpMessageHandler
{
private readonly Queue<Func<HttpRequestMessage, CancellationToken, HttpResponseMessage>> _responses;
public StubHttpMessageHandler(params Func<HttpRequestMessage, CancellationToken, HttpResponseMessage>[] handlers)
{
if (handlers is null || handlers.Length == 0)
{
throw new ArgumentException("At least one handler must be provided.", nameof(handlers));
}
_responses = new Queue<Func<HttpRequestMessage, CancellationToken, HttpResponseMessage>>(handlers);
}
protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
{
var factory = _responses.Count > 1 ? _responses.Dequeue() : _responses.Peek();
return Task.FromResult(factory(request, cancellationToken));
}
}


@@ -0,0 +1,10 @@
namespace StellaOps.Cli.Tests;
public class UnitTest1
{
[Fact]
public void Test1()
{
}
}


@@ -0,0 +1,3 @@
{
"$schema": "https://xunit.net/schema/current/xunit.runner.schema.json"
}


@@ -0,0 +1,27 @@
# StellaOps.Cli — Agent Brief
## Mission
- Deliver an offline-capable command-line interface that drives StellaOps back-end operations: scanner distribution, scan execution, result uploads, and Feedser database lifecycle calls (init/resume/export).
- Honour StellaOps principles of determinism, observability, and offline-first behaviour while providing a polished operator experience.
## Role Charter
| Role | Mandate | Collaboration |
| --- | --- | --- |
| **DevEx/CLI** | Own CLI UX, command routing, and configuration model. Ensure commands work with empty/default config and document overrides. | Coordinate with Backend/WebService for API contracts and with Docs for operator workflows. |
| **Ops Integrator** | Maintain integration paths for shell/dotnet/docker tooling. Validate that air-gapped runners can bootstrap required binaries. | Work with Feedser/Agent teams to mirror packaging and signing requirements. |
| **QA** | Provide command-level fixtures, golden outputs, and regression coverage (unit & smoke). Ensure commands respect cancellation and deterministic logging. | Partner with QA guild for shared harnesses and test data. |
## Working Agreements
- Configuration is centralised in `StellaOps.Configuration`; always consume the bootstrapper instead of hand rolling builders. Env vars (`API_KEY`, `STELLAOPS_BACKEND_URL`, `StellaOps:*`) override JSON/YAML and default to empty values.
- Command verbs (`scanner`, `scan`, `db`, `config`) are wired through System.CommandLine 2.0; keep handlers composable, cancellation-aware, and unit-testable.
- `scanner download` must verify digests/signatures, install the container image locally (via `docker load`), and log artefact metadata.
- `scan run` must execute the container against a directory, materialise artefacts in `ResultsDirectory`, and auto-upload them on success; `scan upload` is the manual retry path.
- Emit structured console logs (single line, UTC timestamps) and honour offline-first expectations—no hidden network calls.
- Mirror repository guidance: stay within `src/StellaOps.Cli` unless collaborating via documented handshakes.
- Update `TASKS.md` as states change (TODO → DOING → DONE/BLOCKED) and record added tests/fixtures alongside implementation notes.
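As a concrete illustration of the configuration agreement, a minimal settings file bound under the `StellaOps` section might look like this (the backend hostname is a placeholder; the key names mirror `StellaOpsCliOptions`):

```json
{
  "StellaOps": {
    "BackendUrl": "https://backend.example.internal",
    "ApiKey": "",
    "ScannerCacheDirectory": "scanners",
    "ResultsDirectory": "results",
    "DefaultRunner": "docker",
    "ScannerDownloadAttempts": 3
  }
}
```

Setting `STELLAOPS_BACKEND_URL` or `API_KEY` in the environment overrides the corresponding file values at startup, per the fallback order in `CliBootstrapper`.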
## Reference Materials
- `docs/ARCHITECTURE_FEEDSER.md` for database operations surface area.
- Backend OpenAPI/contract docs (once available) for job triggers and scanner endpoints.
- Existing module AGENTS/TASKS files for style and coordination cues.
- `docs/09_API_CLI_REFERENCE.md` (section 3) for the user-facing synopsis of the CLI verbs and flags.

View File

@@ -0,0 +1,246 @@
using System;
using System.CommandLine;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Cli.Configuration;
namespace StellaOps.Cli.Commands;
internal static class CommandFactory
{
public static RootCommand Create(IServiceProvider services, StellaOpsCliOptions options, CancellationToken cancellationToken)
{
var verboseOption = new Option<bool>("--verbose", new[] { "-v" })
{
Description = "Enable verbose logging output."
};
var root = new RootCommand("StellaOps command-line interface")
{
TreatUnmatchedTokensAsErrors = true
};
root.Add(verboseOption);
root.Add(BuildScannerCommand(services, verboseOption, cancellationToken));
root.Add(BuildScanCommand(services, options, verboseOption, cancellationToken));
root.Add(BuildDatabaseCommand(services, verboseOption, cancellationToken));
root.Add(BuildConfigCommand(options));
return root;
}
private static Command BuildScannerCommand(IServiceProvider services, Option<bool> verboseOption, CancellationToken cancellationToken)
{
var scanner = new Command("scanner", "Manage scanner artifacts and lifecycle.");
var download = new Command("download", "Download the latest scanner bundle.");
var channelOption = new Option<string>("--channel", new[] { "-c" })
{
Description = "Scanner channel (stable, beta, nightly)."
};
var outputOption = new Option<string?>("--output")
{
Description = "Optional output path for the downloaded bundle."
};
var overwriteOption = new Option<bool>("--overwrite")
{
Description = "Overwrite existing bundle if present."
};
var noInstallOption = new Option<bool>("--no-install")
{
Description = "Skip installing the scanner container after download."
};
download.Add(channelOption);
download.Add(outputOption);
download.Add(overwriteOption);
download.Add(noInstallOption);
download.SetAction((parseResult, _) =>
{
var channel = parseResult.GetValue(channelOption) ?? "stable";
var output = parseResult.GetValue(outputOption);
var overwrite = parseResult.GetValue(overwriteOption);
var install = !parseResult.GetValue(noInstallOption);
var verbose = parseResult.GetValue(verboseOption);
return CommandHandlers.HandleScannerDownloadAsync(services, channel, output, overwrite, install, verbose, cancellationToken);
});
scanner.Add(download);
return scanner;
}
private static Command BuildScanCommand(IServiceProvider services, StellaOpsCliOptions options, Option<bool> verboseOption, CancellationToken cancellationToken)
{
var scan = new Command("scan", "Execute scanners and manage scan outputs.");
var run = new Command("run", "Execute a scanner bundle with the configured runner.");
var runnerOption = new Option<string>("--runner")
{
Description = "Execution runtime (dotnet, self, docker)."
};
var entryOption = new Option<string>("--entry")
{
Description = "Path to the scanner entrypoint or Docker image.",
Required = true
};
var targetOption = new Option<string>("--target")
{
Description = "Directory to scan.",
Required = true
};
var argsArgument = new Argument<string[]>("scanner-args")
{
Arity = ArgumentArity.ZeroOrMore
};
run.Add(runnerOption);
run.Add(entryOption);
run.Add(targetOption);
run.Add(argsArgument);
run.SetAction((parseResult, _) =>
{
var runner = parseResult.GetValue(runnerOption) ?? options.DefaultRunner;
var entry = parseResult.GetValue(entryOption) ?? string.Empty;
var target = parseResult.GetValue(targetOption) ?? string.Empty;
var forwardedArgs = parseResult.GetValue(argsArgument) ?? Array.Empty<string>();
var verbose = parseResult.GetValue(verboseOption);
return CommandHandlers.HandleScannerRunAsync(services, runner, entry, target, forwardedArgs, verbose, cancellationToken);
});
var upload = new Command("upload", "Upload completed scan results to the backend.");
var fileOption = new Option<string>("--file")
{
Description = "Path to the scan result artifact.",
Required = true
};
upload.Add(fileOption);
upload.SetAction((parseResult, _) =>
{
var file = parseResult.GetValue(fileOption) ?? string.Empty;
var verbose = parseResult.GetValue(verboseOption);
return CommandHandlers.HandleScanUploadAsync(services, file, verbose, cancellationToken);
});
scan.Add(run);
scan.Add(upload);
return scan;
}
private static Command BuildDatabaseCommand(IServiceProvider services, Option<bool> verboseOption, CancellationToken cancellationToken)
{
var db = new Command("db", "Trigger Feedser database operations via backend jobs.");
var fetch = new Command("fetch", "Trigger connector fetch/parse/map stages.");
var sourceOption = new Option<string>("--source")
{
Description = "Connector source identifier (e.g. redhat, osv, vmware).",
Required = true
};
var stageOption = new Option<string>("--stage")
{
Description = "Stage to trigger: fetch, parse, or map."
};
var modeOption = new Option<string?>("--mode")
{
Description = "Optional connector-specific mode (init, resume, cursor)."
};
fetch.Add(sourceOption);
fetch.Add(stageOption);
fetch.Add(modeOption);
fetch.SetAction((parseResult, _) =>
{
var source = parseResult.GetValue(sourceOption) ?? string.Empty;
var stage = parseResult.GetValue(stageOption) ?? "fetch";
var mode = parseResult.GetValue(modeOption);
var verbose = parseResult.GetValue(verboseOption);
return CommandHandlers.HandleConnectorJobAsync(services, source, stage, mode, verbose, cancellationToken);
});
var merge = new Command("merge", "Run canonical merge reconciliation.");
merge.SetAction((parseResult, _) =>
{
var verbose = parseResult.GetValue(verboseOption);
return CommandHandlers.HandleMergeJobAsync(services, verbose, cancellationToken);
});
var export = new Command("export", "Run Feedser export jobs.");
var formatOption = new Option<string>("--format")
{
Description = "Export format: json or trivy-db."
};
var deltaOption = new Option<bool>("--delta")
{
Description = "Request a delta export when supported."
};
export.Add(formatOption);
export.Add(deltaOption);
export.SetAction((parseResult, _) =>
{
var format = parseResult.GetValue(formatOption) ?? "json";
var delta = parseResult.GetValue(deltaOption);
var verbose = parseResult.GetValue(verboseOption);
return CommandHandlers.HandleExportJobAsync(services, format, delta, verbose, cancellationToken);
});
db.Add(fetch);
db.Add(merge);
db.Add(export);
return db;
}
private static Command BuildConfigCommand(StellaOpsCliOptions options)
{
var config = new Command("config", "Inspect CLI configuration state.");
var show = new Command("show", "Display resolved configuration values.");
show.SetAction((_, _) =>
{
var lines = new[]
{
$"Backend URL: {MaskIfEmpty(options.BackendUrl)}",
$"API Key: {DescribeSecret(options.ApiKey)}",
$"Scanner Cache: {options.ScannerCacheDirectory}",
$"Results Directory: {options.ResultsDirectory}",
$"Default Runner: {options.DefaultRunner}"
};
foreach (var line in lines)
{
Console.WriteLine(line);
}
return Task.CompletedTask;
});
config.Add(show);
return config;
}
private static string MaskIfEmpty(string value)
=> string.IsNullOrWhiteSpace(value) ? "<not configured>" : value;
private static string DescribeSecret(string value)
{
if (string.IsNullOrWhiteSpace(value))
{
return "<not configured>";
}
return value.Length switch
{
<= 4 => "****",
_ => $"{value[..2]}***{value[^2..]}"
};
}
}

View File

@@ -0,0 +1,323 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Telemetry;
namespace StellaOps.Cli.Commands;
internal static class CommandHandlers
{
public static async Task HandleScannerDownloadAsync(
IServiceProvider services,
string channel,
string? output,
bool overwrite,
bool install,
bool verbose,
CancellationToken cancellationToken)
{
await using var scope = services.CreateAsyncScope();
var client = scope.ServiceProvider.GetRequiredService<IBackendOperationsClient>();
var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("scanner-download");
var verbosity = scope.ServiceProvider.GetRequiredService<VerbosityState>();
var previousLevel = verbosity.MinimumLevel;
verbosity.MinimumLevel = verbose ? LogLevel.Debug : LogLevel.Information;
using var activity = CliActivitySource.Instance.StartActivity("cli.scanner.download", ActivityKind.Client);
activity?.SetTag("stellaops.cli.command", "scanner download");
activity?.SetTag("stellaops.cli.channel", channel);
using var duration = CliMetrics.MeasureCommandDuration("scanner download");
try
{
var result = await client.DownloadScannerAsync(channel, output ?? string.Empty, overwrite, verbose, cancellationToken).ConfigureAwait(false);
if (result.FromCache)
{
logger.LogInformation("Using cached scanner at {Path}.", result.Path);
}
else
{
logger.LogInformation("Scanner downloaded to {Path} ({Size} bytes).", result.Path, result.SizeBytes);
}
CliMetrics.RecordScannerDownload(channel, result.FromCache);
if (install)
{
var installer = scope.ServiceProvider.GetRequiredService<IScannerInstaller>();
await installer.InstallAsync(result.Path, verbose, cancellationToken).ConfigureAwait(false);
CliMetrics.RecordScannerInstall(channel);
}
Environment.ExitCode = 0;
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to download scanner bundle.");
Environment.ExitCode = 1;
}
finally
{
verbosity.MinimumLevel = previousLevel;
}
}
public static async Task HandleScannerRunAsync(
IServiceProvider services,
string runner,
string entry,
string targetDirectory,
IReadOnlyList<string> arguments,
bool verbose,
CancellationToken cancellationToken)
{
await using var scope = services.CreateAsyncScope();
var executor = scope.ServiceProvider.GetRequiredService<IScannerExecutor>();
var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("scanner-run");
var verbosity = scope.ServiceProvider.GetRequiredService<VerbosityState>();
var previousLevel = verbosity.MinimumLevel;
verbosity.MinimumLevel = verbose ? LogLevel.Debug : LogLevel.Information;
using var activity = CliActivitySource.Instance.StartActivity("cli.scan.run", ActivityKind.Internal);
activity?.SetTag("stellaops.cli.command", "scan run");
activity?.SetTag("stellaops.cli.runner", runner);
activity?.SetTag("stellaops.cli.entry", entry);
activity?.SetTag("stellaops.cli.target", targetDirectory);
using var duration = CliMetrics.MeasureCommandDuration("scan run");
try
{
var options = scope.ServiceProvider.GetRequiredService<StellaOpsCliOptions>();
var resultsDirectory = options.ResultsDirectory;
var executionResult = await executor.RunAsync(
runner,
entry,
targetDirectory,
resultsDirectory,
arguments,
verbose,
cancellationToken).ConfigureAwait(false);
Environment.ExitCode = executionResult.ExitCode;
CliMetrics.RecordScanRun(runner, executionResult.ExitCode);
if (executionResult.ExitCode == 0)
{
var backend = scope.ServiceProvider.GetRequiredService<IBackendOperationsClient>();
logger.LogInformation("Uploading scan artefact {Path}...", executionResult.ResultsPath);
await backend.UploadScanResultsAsync(executionResult.ResultsPath, cancellationToken).ConfigureAwait(false);
logger.LogInformation("Scan artefact uploaded.");
activity?.SetTag("stellaops.cli.results", executionResult.ResultsPath);
}
else
{
logger.LogWarning("Skipping automatic upload because scan exited with code {Code}.", executionResult.ExitCode);
}
}
catch (Exception ex)
{
logger.LogError(ex, "Scanner execution failed.");
Environment.ExitCode = 1;
}
finally
{
verbosity.MinimumLevel = previousLevel;
}
}
public static async Task HandleScanUploadAsync(
IServiceProvider services,
string file,
bool verbose,
CancellationToken cancellationToken)
{
await using var scope = services.CreateAsyncScope();
var client = scope.ServiceProvider.GetRequiredService<IBackendOperationsClient>();
var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("scanner-upload");
var verbosity = scope.ServiceProvider.GetRequiredService<VerbosityState>();
var previousLevel = verbosity.MinimumLevel;
verbosity.MinimumLevel = verbose ? LogLevel.Debug : LogLevel.Information;
using var activity = CliActivitySource.Instance.StartActivity("cli.scan.upload", ActivityKind.Client);
activity?.SetTag("stellaops.cli.command", "scan upload");
activity?.SetTag("stellaops.cli.file", file);
using var duration = CliMetrics.MeasureCommandDuration("scan upload");
try
{
var path = Path.GetFullPath(file);
await client.UploadScanResultsAsync(path, cancellationToken).ConfigureAwait(false);
logger.LogInformation("Scan results uploaded successfully.");
Environment.ExitCode = 0;
}
catch (Exception ex)
{
logger.LogError(ex, "Failed to upload scan results.");
Environment.ExitCode = 1;
}
finally
{
verbosity.MinimumLevel = previousLevel;
}
}
public static async Task HandleConnectorJobAsync(
IServiceProvider services,
string source,
string stage,
string? mode,
bool verbose,
CancellationToken cancellationToken)
{
await using var scope = services.CreateAsyncScope();
var client = scope.ServiceProvider.GetRequiredService<IBackendOperationsClient>();
var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("db-connector");
var verbosity = scope.ServiceProvider.GetRequiredService<VerbosityState>();
var previousLevel = verbosity.MinimumLevel;
verbosity.MinimumLevel = verbose ? LogLevel.Debug : LogLevel.Information;
using var activity = CliActivitySource.Instance.StartActivity("cli.db.fetch", ActivityKind.Client);
activity?.SetTag("stellaops.cli.command", "db fetch");
activity?.SetTag("stellaops.cli.source", source);
activity?.SetTag("stellaops.cli.stage", stage);
if (!string.IsNullOrWhiteSpace(mode))
{
activity?.SetTag("stellaops.cli.mode", mode);
}
using var duration = CliMetrics.MeasureCommandDuration("db fetch");
try
{
var jobKind = $"source:{source}:{stage}";
var parameters = new Dictionary<string, object?>(StringComparer.Ordinal);
if (!string.IsNullOrWhiteSpace(mode))
{
parameters["mode"] = mode;
}
await TriggerJobAsync(client, logger, jobKind, parameters, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
logger.LogError(ex, "Connector job failed.");
Environment.ExitCode = 1;
}
finally
{
verbosity.MinimumLevel = previousLevel;
}
}
public static async Task HandleMergeJobAsync(
IServiceProvider services,
bool verbose,
CancellationToken cancellationToken)
{
await using var scope = services.CreateAsyncScope();
var client = scope.ServiceProvider.GetRequiredService<IBackendOperationsClient>();
var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("db-merge");
var verbosity = scope.ServiceProvider.GetRequiredService<VerbosityState>();
var previousLevel = verbosity.MinimumLevel;
verbosity.MinimumLevel = verbose ? LogLevel.Debug : LogLevel.Information;
using var activity = CliActivitySource.Instance.StartActivity("cli.db.merge", ActivityKind.Client);
activity?.SetTag("stellaops.cli.command", "db merge");
using var duration = CliMetrics.MeasureCommandDuration("db merge");
try
{
await TriggerJobAsync(client, logger, "merge:reconcile", new Dictionary<string, object?>(StringComparer.Ordinal), cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
logger.LogError(ex, "Merge job failed.");
Environment.ExitCode = 1;
}
finally
{
verbosity.MinimumLevel = previousLevel;
}
}
public static async Task HandleExportJobAsync(
IServiceProvider services,
string format,
bool delta,
bool verbose,
CancellationToken cancellationToken)
{
await using var scope = services.CreateAsyncScope();
var client = scope.ServiceProvider.GetRequiredService<IBackendOperationsClient>();
var logger = scope.ServiceProvider.GetRequiredService<ILoggerFactory>().CreateLogger("db-export");
var verbosity = scope.ServiceProvider.GetRequiredService<VerbosityState>();
var previousLevel = verbosity.MinimumLevel;
verbosity.MinimumLevel = verbose ? LogLevel.Debug : LogLevel.Information;
using var activity = CliActivitySource.Instance.StartActivity("cli.db.export", ActivityKind.Client);
activity?.SetTag("stellaops.cli.command", "db export");
activity?.SetTag("stellaops.cli.format", format);
activity?.SetTag("stellaops.cli.delta", delta);
using var duration = CliMetrics.MeasureCommandDuration("db export");
try
{
var jobKind = format switch
{
"trivy-db" or "trivy" => "export:trivy-db",
_ => "export:json"
};
var parameters = new Dictionary<string, object?>(StringComparer.Ordinal)
{
["delta"] = delta
};
await TriggerJobAsync(client, logger, jobKind, parameters, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
logger.LogError(ex, "Export job failed.");
Environment.ExitCode = 1;
}
finally
{
verbosity.MinimumLevel = previousLevel;
}
}
private static async Task TriggerJobAsync(
IBackendOperationsClient client,
ILogger logger,
string jobKind,
IDictionary<string, object?> parameters,
CancellationToken cancellationToken)
{
JobTriggerResult result = await client.TriggerJobAsync(jobKind, parameters, cancellationToken).ConfigureAwait(false);
if (result.Success)
{
if (!string.IsNullOrWhiteSpace(result.Location))
{
logger.LogInformation("Job accepted. Track status at {Location}.", result.Location);
}
else if (result.Run is not null)
{
logger.LogInformation("Job accepted. RunId: {RunId} Status: {Status}", result.Run.RunId, result.Run.Status);
}
else
{
logger.LogInformation("Job accepted.");
}
Environment.ExitCode = 0;
}
else
{
logger.LogError("Job '{JobKind}' failed: {Message}", jobKind, result.Message);
Environment.ExitCode = 1;
}
}
}

View File

@@ -0,0 +1,77 @@
using System.Globalization;
using Microsoft.Extensions.Configuration;
using StellaOps.Configuration;
namespace StellaOps.Cli.Configuration;
public static class CliBootstrapper
{
public static (StellaOpsCliOptions Options, IConfigurationRoot Configuration) Build(string[] args)
{
var bootstrap = StellaOpsConfigurationBootstrapper.Build<StellaOpsCliOptions>(options =>
{
options.BindingSection = "StellaOps";
options.ConfigureBuilder = builder =>
{
if (args.Length > 0)
{
builder.AddCommandLine(args);
}
};
options.PostBind = (cliOptions, configuration) =>
{
cliOptions.ApiKey = ResolveWithFallback(cliOptions.ApiKey, configuration, "API_KEY", "StellaOps:ApiKey", "ApiKey");
cliOptions.BackendUrl = ResolveWithFallback(cliOptions.BackendUrl, configuration, "STELLAOPS_BACKEND_URL", "StellaOps:BackendUrl", "BackendUrl");
cliOptions.ScannerSignaturePublicKeyPath = ResolveWithFallback(cliOptions.ScannerSignaturePublicKeyPath, configuration, "SCANNER_PUBLIC_KEY", "STELLAOPS_SCANNER_PUBLIC_KEY", "StellaOps:ScannerSignaturePublicKeyPath", "ScannerSignaturePublicKeyPath");
cliOptions.ApiKey = cliOptions.ApiKey?.Trim() ?? string.Empty;
cliOptions.BackendUrl = cliOptions.BackendUrl?.Trim() ?? string.Empty;
cliOptions.ScannerSignaturePublicKeyPath = cliOptions.ScannerSignaturePublicKeyPath?.Trim() ?? string.Empty;
var attemptsRaw = ResolveWithFallback(
string.Empty,
configuration,
"SCANNER_DOWNLOAD_ATTEMPTS",
"STELLAOPS_SCANNER_DOWNLOAD_ATTEMPTS",
"StellaOps:ScannerDownloadAttempts",
"ScannerDownloadAttempts");
if (string.IsNullOrWhiteSpace(attemptsRaw))
{
attemptsRaw = cliOptions.ScannerDownloadAttempts.ToString(CultureInfo.InvariantCulture);
}
if (int.TryParse(attemptsRaw, NumberStyles.Integer, CultureInfo.InvariantCulture, out var parsedAttempts) && parsedAttempts > 0)
{
cliOptions.ScannerDownloadAttempts = parsedAttempts;
}
if (cliOptions.ScannerDownloadAttempts <= 0)
{
cliOptions.ScannerDownloadAttempts = 3;
}
};
});
return (bootstrap.Options, bootstrap.Configuration);
}
private static string ResolveWithFallback(string currentValue, IConfiguration configuration, params string[] keys)
{
if (!string.IsNullOrWhiteSpace(currentValue))
{
return currentValue;
}
foreach (var key in keys)
{
var value = configuration[key];
if (!string.IsNullOrWhiteSpace(value))
{
return value;
}
}
return string.Empty;
}
}

View File

@@ -0,0 +1,18 @@
namespace StellaOps.Cli.Configuration;
public sealed class StellaOpsCliOptions
{
public string ApiKey { get; set; } = string.Empty;
public string BackendUrl { get; set; } = string.Empty;
public string ScannerCacheDirectory { get; set; } = "scanners";
public string ResultsDirectory { get; set; } = "results";
public string DefaultRunner { get; set; } = "docker";
public string ScannerSignaturePublicKeyPath { get; set; } = string.Empty;
public int ScannerDownloadAttempts { get; set; } = 3;
}

View File

@@ -0,0 +1,71 @@
using System;
using System.CommandLine;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Commands;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services;
using StellaOps.Cli.Telemetry;
namespace StellaOps.Cli;
internal static class Program
{
internal static async Task<int> Main(string[] args)
{
var (options, configuration) = CliBootstrapper.Build(args);
var services = new ServiceCollection();
services.AddSingleton(configuration);
services.AddSingleton(options);
var verbosityState = new VerbosityState();
services.AddSingleton(verbosityState);
services.AddLogging(builder =>
{
builder.ClearProviders();
builder.AddSimpleConsole(logOptions =>
{
logOptions.TimestampFormat = "HH:mm:ss ";
logOptions.UseUtcTimestamp = true; // working agreements require UTC timestamps
logOptions.SingleLine = true;
});
builder.AddFilter((category, level) => level >= verbosityState.MinimumLevel);
});
services.AddHttpClient<IBackendOperationsClient, BackendOperationsClient>(client =>
{
client.Timeout = TimeSpan.FromMinutes(5);
if (!string.IsNullOrWhiteSpace(options.BackendUrl) &&
Uri.TryCreate(options.BackendUrl, UriKind.Absolute, out var backendUri))
{
client.BaseAddress = backendUri;
}
});
services.AddSingleton<IScannerExecutor, ScannerExecutor>();
services.AddSingleton<IScannerInstaller, ScannerInstaller>();
await using var serviceProvider = services.BuildServiceProvider();
using var cts = new CancellationTokenSource();
Console.CancelKeyPress += (_, eventArgs) =>
{
eventArgs.Cancel = true;
cts.Cancel();
};
var rootCommand = CommandFactory.Create(serviceProvider, options, cts.Token);
var commandConfiguration = new CommandLineConfiguration(rootCommand);
var commandExit = await commandConfiguration.InvokeAsync(args, cts.Token).ConfigureAwait(false);
var finalExit = Environment.ExitCode != 0 ? Environment.ExitCode : commandExit;
if (cts.IsCancellationRequested && finalExit == 0)
{
finalExit = 130; // 128 + SIGINT, the conventional POSIX exit code for Ctrl+C cancellation
}
return finalExit;
}
}

View File

@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Cli.Tests")]

View File

@@ -0,0 +1,394 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Net.Http;
using System.Linq;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.Transport;
namespace StellaOps.Cli.Services;
internal sealed class BackendOperationsClient : IBackendOperationsClient
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web);
private readonly HttpClient _httpClient;
private readonly StellaOpsCliOptions _options;
private readonly ILogger<BackendOperationsClient> _logger;
public BackendOperationsClient(HttpClient httpClient, StellaOpsCliOptions options, ILogger<BackendOperationsClient> logger)
{
_httpClient = httpClient ?? throw new ArgumentNullException(nameof(httpClient));
_options = options ?? throw new ArgumentNullException(nameof(options));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
if (!string.IsNullOrWhiteSpace(_options.BackendUrl) && httpClient.BaseAddress is null)
{
if (Uri.TryCreate(_options.BackendUrl, UriKind.Absolute, out var baseUri))
{
httpClient.BaseAddress = baseUri;
}
}
}
public async Task<ScannerArtifactResult> DownloadScannerAsync(string channel, string outputPath, bool overwrite, bool verbose, CancellationToken cancellationToken)
{
EnsureBackendConfigured();
channel = string.IsNullOrWhiteSpace(channel) ? "stable" : channel.Trim();
outputPath = ResolveArtifactPath(outputPath, channel);
Directory.CreateDirectory(Path.GetDirectoryName(outputPath)!);
if (!overwrite && File.Exists(outputPath))
{
var existing = new FileInfo(outputPath);
_logger.LogInformation("Scanner artifact already cached at {Path} ({Size} bytes).", outputPath, existing.Length);
return new ScannerArtifactResult(outputPath, existing.Length, true);
}
var attempt = 0;
var maxAttempts = Math.Max(1, _options.ScannerDownloadAttempts);
while (true)
{
attempt++;
try
{
using var request = CreateRequest(HttpMethod.Get, $"api/scanner/artifacts/{channel}");
using var response = await _httpClient.SendAsync(request, HttpCompletionOption.ResponseHeadersRead, cancellationToken).ConfigureAwait(false);
if (!response.IsSuccessStatusCode)
{
var failure = await CreateFailureMessageAsync(response, cancellationToken).ConfigureAwait(false);
throw new InvalidOperationException(failure);
}
return await ProcessScannerResponseAsync(response, outputPath, channel, verbose, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex) when (attempt < maxAttempts)
{
var backoffSeconds = Math.Pow(2, attempt);
_logger.LogWarning(ex, "Scanner download attempt {Attempt}/{MaxAttempts} failed. Retrying in {Delay:F0}s...", attempt, maxAttempts, backoffSeconds);
await Task.Delay(TimeSpan.FromSeconds(backoffSeconds), cancellationToken).ConfigureAwait(false);
}
}
}
private async Task<ScannerArtifactResult> ProcessScannerResponseAsync(HttpResponseMessage response, string outputPath, string channel, bool verbose, CancellationToken cancellationToken)
{
var tempFile = outputPath + ".tmp";
await using (var payloadStream = await response.Content.ReadAsStreamAsync(cancellationToken).ConfigureAwait(false))
await using (var fileStream = File.Create(tempFile))
{
await payloadStream.CopyToAsync(fileStream, cancellationToken).ConfigureAwait(false);
}
var expectedDigest = ExtractHeaderValue(response.Headers, "X-StellaOps-Digest");
var signatureHeader = ExtractHeaderValue(response.Headers, "X-StellaOps-Signature");
var digestHex = await ValidateDigestAsync(tempFile, expectedDigest, cancellationToken).ConfigureAwait(false);
await ValidateSignatureAsync(signatureHeader, digestHex, verbose, cancellationToken).ConfigureAwait(false);
if (verbose)
{
var signatureNote = string.IsNullOrWhiteSpace(signatureHeader) ? "no signature" : "signature validated";
_logger.LogDebug("Scanner digest sha256:{Digest} ({SignatureNote}).", digestHex, signatureNote);
}
if (File.Exists(outputPath))
{
File.Delete(outputPath);
}
File.Move(tempFile, outputPath);
PersistMetadata(outputPath, channel, digestHex, signatureHeader, response);
var downloaded = new FileInfo(outputPath);
_logger.LogInformation("Scanner downloaded to {Path} ({Size} bytes).", outputPath, downloaded.Length);
return new ScannerArtifactResult(outputPath, downloaded.Length, false);
}
public async Task UploadScanResultsAsync(string filePath, CancellationToken cancellationToken)
{
EnsureBackendConfigured();
if (!File.Exists(filePath))
{
throw new FileNotFoundException("Scan result file not found.", filePath);
}
using var content = new MultipartFormDataContent();
await using var fileStream = File.OpenRead(filePath);
var streamContent = new StreamContent(fileStream);
streamContent.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
content.Add(streamContent, "file", Path.GetFileName(filePath));
using var request = CreateRequest(HttpMethod.Post, "api/scanner/results");
request.Content = content;
using var response = await _httpClient.SendAsync(request, cancellationToken).ConfigureAwait(false);
if (!response.IsSuccessStatusCode)
{
var failure = await CreateFailureMessageAsync(response, cancellationToken).ConfigureAwait(false);
throw new InvalidOperationException(failure);
}
_logger.LogInformation("Scan results uploaded from {Path}.", filePath);
}
public async Task<JobTriggerResult> TriggerJobAsync(string jobKind, IDictionary<string, object?> parameters, CancellationToken cancellationToken)
{
EnsureBackendConfigured();
if (string.IsNullOrWhiteSpace(jobKind))
{
throw new ArgumentException("Job kind must be provided.", nameof(jobKind));
}
var requestBody = new JobTriggerRequest
{
Trigger = "cli",
Parameters = parameters is null ? new Dictionary<string, object?>(StringComparer.Ordinal) : new Dictionary<string, object?>(parameters, StringComparer.Ordinal)
};
using var request = CreateRequest(HttpMethod.Post, $"jobs/{jobKind}");
request.Content = JsonContent.Create(requestBody, options: SerializerOptions);
using var response = await _httpClient.SendAsync(request, cancellationToken).ConfigureAwait(false);
if (response.StatusCode == HttpStatusCode.Accepted)
{
JobRunResponse? run = null;
if (response.Content.Headers.ContentLength is > 0)
{
try
{
run = await response.Content.ReadFromJsonAsync<JobRunResponse>(SerializerOptions, cancellationToken).ConfigureAwait(false);
}
catch (JsonException ex)
{
_logger.LogWarning(ex, "Failed to deserialize job run response for job kind {Kind}.", jobKind);
}
}
var location = response.Headers.Location?.ToString();
return new JobTriggerResult(true, "Accepted", location, run);
}
var failureMessage = await CreateFailureMessageAsync(response, cancellationToken).ConfigureAwait(false);
return new JobTriggerResult(false, failureMessage, null, null);
}
private HttpRequestMessage CreateRequest(HttpMethod method, string relativeUri)
{
if (!Uri.TryCreate(relativeUri, UriKind.RelativeOrAbsolute, out var requestUri))
{
throw new InvalidOperationException($"Invalid request URI '{relativeUri}'.");
}
if (!requestUri.IsAbsoluteUri)
{
requestUri = new Uri(relativeUri.TrimStart('/'), UriKind.Relative);
}
var request = new HttpRequestMessage(method, requestUri);
if (!string.IsNullOrWhiteSpace(_options.ApiKey))
{
request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", _options.ApiKey);
}
return request;
}
private void EnsureBackendConfigured()
{
if (_httpClient.BaseAddress is null)
{
throw new InvalidOperationException("Backend URL is not configured. Provide STELLAOPS_BACKEND_URL or configure appsettings.");
}
}
private string ResolveArtifactPath(string outputPath, string channel)
{
if (!string.IsNullOrWhiteSpace(outputPath))
{
return Path.GetFullPath(outputPath);
}
var directory = string.IsNullOrWhiteSpace(_options.ScannerCacheDirectory)
? Directory.GetCurrentDirectory()
: Path.GetFullPath(_options.ScannerCacheDirectory);
Directory.CreateDirectory(directory);
var fileName = $"stellaops-scanner-{channel}.tar.gz";
return Path.Combine(directory, fileName);
}
private async Task<string> CreateFailureMessageAsync(HttpResponseMessage response, CancellationToken cancellationToken)
{
var statusCode = (int)response.StatusCode;
var builder = new StringBuilder();
builder.Append("Backend request failed with status ");
builder.Append(statusCode);
builder.Append(' ');
builder.Append(response.ReasonPhrase ?? "Unknown");
if (response.Content.Headers.ContentLength is > 0)
{
try
{
var problem = await response.Content.ReadFromJsonAsync<ProblemDocument>(SerializerOptions, cancellationToken).ConfigureAwait(false);
if (problem is not null)
{
if (!string.IsNullOrWhiteSpace(problem.Title))
{
builder.AppendLine().Append(problem.Title);
}
if (!string.IsNullOrWhiteSpace(problem.Detail))
{
builder.AppendLine().Append(problem.Detail);
}
}
}
catch (JsonException)
{
var raw = await response.Content.ReadAsStringAsync(cancellationToken).ConfigureAwait(false);
if (!string.IsNullOrWhiteSpace(raw))
{
builder.AppendLine().Append(raw);
}
}
}
return builder.ToString();
}
private static string? ExtractHeaderValue(HttpResponseHeaders headers, string name)
{
if (headers.TryGetValues(name, out var values))
{
return values.FirstOrDefault();
}
return null;
}
private async Task<string> ValidateDigestAsync(string filePath, string? expectedDigest, CancellationToken cancellationToken)
{
string digestHex;
await using (var stream = File.OpenRead(filePath))
{
var hash = await SHA256.HashDataAsync(stream, cancellationToken).ConfigureAwait(false);
digestHex = Convert.ToHexString(hash).ToLowerInvariant();
}
if (!string.IsNullOrWhiteSpace(expectedDigest))
{
var normalized = NormalizeDigest(expectedDigest);
if (!normalized.Equals(digestHex, StringComparison.OrdinalIgnoreCase))
{
File.Delete(filePath);
throw new InvalidOperationException($"Scanner digest mismatch. Expected sha256:{normalized}, calculated sha256:{digestHex}.");
}
}
else
{
_logger.LogWarning("Scanner download missing X-StellaOps-Digest header; relying on computed digest only.");
}
return digestHex;
}
private static string NormalizeDigest(string digest)
{
if (digest.StartsWith("sha256:", StringComparison.OrdinalIgnoreCase))
{
return digest[7..];
}
return digest;
}
private async Task ValidateSignatureAsync(string? signatureHeader, string digestHex, bool verbose, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(_options.ScannerSignaturePublicKeyPath))
{
if (!string.IsNullOrWhiteSpace(signatureHeader))
{
_logger.LogDebug("Signature header present but no public key configured; skipping validation.");
}
return;
}
if (string.IsNullOrWhiteSpace(signatureHeader))
{
throw new InvalidOperationException("Scanner signature missing while a public key is configured.");
}
var publicKeyPath = Path.GetFullPath(_options.ScannerSignaturePublicKeyPath);
if (!File.Exists(publicKeyPath))
{
throw new FileNotFoundException("Scanner signature public key not found.", publicKeyPath);
}
var signatureBytes = Convert.FromBase64String(signatureHeader);
var digestBytes = Convert.FromHexString(digestHex);
var pem = await File.ReadAllTextAsync(publicKeyPath, cancellationToken).ConfigureAwait(false);
using var rsa = RSA.Create();
rsa.ImportFromPem(pem);
var valid = rsa.VerifyHash(digestBytes, signatureBytes, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
if (!valid)
{
throw new InvalidOperationException("Scanner signature validation failed.");
}
if (verbose)
{
_logger.LogDebug("Scanner signature validated using key {KeyPath}.", publicKeyPath);
}
}
private void PersistMetadata(string outputPath, string channel, string digestHex, string? signatureHeader, HttpResponseMessage response)
{
var metadata = new
{
channel,
digest = $"sha256:{digestHex}",
signature = signatureHeader,
downloadedAt = DateTimeOffset.UtcNow,
source = response.RequestMessage?.RequestUri?.ToString(),
sizeBytes = new FileInfo(outputPath).Length,
headers = new
{
etag = response.Headers.ETag?.Tag,
lastModified = response.Content.Headers.LastModified,
contentType = response.Content.Headers.ContentType?.ToString()
}
};
var metadataPath = outputPath + ".metadata.json";
var json = JsonSerializer.Serialize(metadata, new JsonSerializerOptions
{
WriteIndented = true
});
File.WriteAllText(metadataPath, json);
}
}

View File

@@ -0,0 +1,16 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services.Models;
namespace StellaOps.Cli.Services;
internal interface IBackendOperationsClient
{
Task<ScannerArtifactResult> DownloadScannerAsync(string channel, string outputPath, bool overwrite, bool verbose, CancellationToken cancellationToken);
Task UploadScanResultsAsync(string filePath, CancellationToken cancellationToken);
Task<JobTriggerResult> TriggerJobAsync(string jobKind, IDictionary<string, object?> parameters, CancellationToken cancellationToken);
}

View File

@@ -0,0 +1,17 @@
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Cli.Services;
internal interface IScannerExecutor
{
Task<ScannerExecutionResult> RunAsync(
string runner,
string entry,
string targetDirectory,
string resultsDirectory,
IReadOnlyList<string> arguments,
bool verbose,
CancellationToken cancellationToken);
}

View File

@@ -0,0 +1,9 @@
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.Cli.Services;
internal interface IScannerInstaller
{
Task InstallAsync(string artifactPath, bool verbose, CancellationToken cancellationToken);
}

View File

@@ -0,0 +1,9 @@
using StellaOps.Cli.Services.Models.Transport;
namespace StellaOps.Cli.Services.Models;
internal sealed record JobTriggerResult(
bool Success,
string Message,
string? Location,
JobRunResponse? Run);

View File

@@ -0,0 +1,3 @@
namespace StellaOps.Cli.Services.Models;
internal sealed record ScannerArtifactResult(string Path, long SizeBytes, bool FromCache);

View File

@@ -0,0 +1,27 @@
using System;
using System.Collections.Generic;
namespace StellaOps.Cli.Services.Models.Transport;
internal sealed class JobRunResponse
{
public Guid RunId { get; set; }
public string Kind { get; set; } = string.Empty;
public string Status { get; set; } = string.Empty;
public string Trigger { get; set; } = string.Empty;
public DateTimeOffset CreatedAt { get; set; }
public DateTimeOffset? StartedAt { get; set; }
public DateTimeOffset? CompletedAt { get; set; }
public string? Error { get; set; }
public TimeSpan? Duration { get; set; }
public IReadOnlyDictionary<string, object?> Parameters { get; set; } = new Dictionary<string, object?>(StringComparer.Ordinal);
}

View File

@@ -0,0 +1,10 @@
using System.Collections.Generic;
namespace StellaOps.Cli.Services.Models.Transport;
internal sealed class JobTriggerRequest
{
public string Trigger { get; set; } = "cli";
public Dictionary<string, object?> Parameters { get; set; } = new(StringComparer.Ordinal);
}

View File

@@ -0,0 +1,18 @@
using System.Collections.Generic;
namespace StellaOps.Cli.Services.Models.Transport;
internal sealed class ProblemDocument
{
public string? Type { get; set; }
public string? Title { get; set; }
public string? Detail { get; set; }
public int? Status { get; set; }
public string? Instance { get; set; }
public Dictionary<string, object?>? Extensions { get; set; }
}

View File

@@ -0,0 +1,3 @@
namespace StellaOps.Cli.Services;
internal sealed record ScannerExecutionResult(int ExitCode, string ResultsPath);

View File

@@ -0,0 +1,274 @@
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
namespace StellaOps.Cli.Services;
internal sealed class ScannerExecutor : IScannerExecutor
{
private readonly ILogger<ScannerExecutor> _logger;
public ScannerExecutor(ILogger<ScannerExecutor> logger)
{
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<ScannerExecutionResult> RunAsync(
string runner,
string entry,
string targetDirectory,
string resultsDirectory,
IReadOnlyList<string> arguments,
bool verbose,
CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(targetDirectory))
{
throw new ArgumentException("Target directory must be provided.", nameof(targetDirectory));
}
runner = string.IsNullOrWhiteSpace(runner) ? "docker" : runner.Trim().ToLowerInvariant();
entry = entry?.Trim() ?? string.Empty;
var normalizedTarget = Path.GetFullPath(targetDirectory);
if (!Directory.Exists(normalizedTarget))
{
throw new DirectoryNotFoundException($"Scan target directory '{normalizedTarget}' does not exist.");
}
resultsDirectory = string.IsNullOrWhiteSpace(resultsDirectory)
? Path.Combine(Directory.GetCurrentDirectory(), "scan-results")
: Path.GetFullPath(resultsDirectory);
Directory.CreateDirectory(resultsDirectory);
var executionTimestamp = DateTimeOffset.UtcNow;
var baselineFiles = Directory.GetFiles(resultsDirectory, "*", SearchOption.AllDirectories);
var baseline = new HashSet<string>(baselineFiles, StringComparer.OrdinalIgnoreCase);
var startInfo = BuildProcessStartInfo(runner, entry, normalizedTarget, resultsDirectory, arguments);
using var process = new Process { StartInfo = startInfo, EnableRaisingEvents = true };
var stdout = new List<string>();
var stderr = new List<string>();
process.OutputDataReceived += (_, args) =>
{
if (args.Data is null)
{
return;
}
stdout.Add(args.Data);
if (verbose)
{
_logger.LogInformation("[scan] {Line}", args.Data);
}
};
process.ErrorDataReceived += (_, args) =>
{
if (args.Data is null)
{
return;
}
stderr.Add(args.Data);
_logger.LogError("[scan] {Line}", args.Data);
};
_logger.LogInformation("Launching scanner via {Runner} (entry: {Entry})...", runner, entry);
if (!process.Start())
{
throw new InvalidOperationException("Failed to start scanner process.");
}
process.BeginOutputReadLine();
process.BeginErrorReadLine();
await process.WaitForExitAsync(cancellationToken).ConfigureAwait(false);
if (process.ExitCode == 0)
{
_logger.LogInformation("Scanner completed successfully.");
}
else
{
_logger.LogWarning("Scanner exited with code {Code}.", process.ExitCode);
}
var resultsPath = ResolveResultsPath(resultsDirectory, executionTimestamp, baseline);
if (string.IsNullOrWhiteSpace(resultsPath))
{
resultsPath = CreatePlaceholderResult(resultsDirectory);
}
return new ScannerExecutionResult(process.ExitCode, resultsPath);
}
private ProcessStartInfo BuildProcessStartInfo(
string runner,
string entry,
string targetDirectory,
string resultsDirectory,
IReadOnlyList<string> args)
{
return runner switch
{
"self" or "native" => BuildNativeStartInfo(entry, args),
"dotnet" => BuildDotNetStartInfo(entry, args),
"docker" => BuildDockerStartInfo(entry, targetDirectory, resultsDirectory, args),
_ => BuildCustomRunnerStartInfo(runner, entry, args)
};
}
private static ProcessStartInfo BuildNativeStartInfo(string binaryPath, IReadOnlyList<string> args)
{
if (string.IsNullOrWhiteSpace(binaryPath) || !File.Exists(binaryPath))
{
throw new FileNotFoundException("Scanner entrypoint not found.", binaryPath);
}
var startInfo = new ProcessStartInfo
{
FileName = binaryPath,
WorkingDirectory = Directory.GetCurrentDirectory()
};
foreach (var argument in args)
{
startInfo.ArgumentList.Add(argument);
}
startInfo.RedirectStandardError = true;
startInfo.RedirectStandardOutput = true;
startInfo.UseShellExecute = false;
return startInfo;
}
private static ProcessStartInfo BuildDotNetStartInfo(string binaryPath, IReadOnlyList<string> args)
{
var startInfo = new ProcessStartInfo
{
FileName = "dotnet",
WorkingDirectory = Directory.GetCurrentDirectory()
};
startInfo.ArgumentList.Add(binaryPath);
foreach (var argument in args)
{
startInfo.ArgumentList.Add(argument);
}
startInfo.RedirectStandardError = true;
startInfo.RedirectStandardOutput = true;
startInfo.UseShellExecute = false;
return startInfo;
}
private static ProcessStartInfo BuildDockerStartInfo(string image, string targetDirectory, string resultsDirectory, IReadOnlyList<string> args)
{
if (string.IsNullOrWhiteSpace(image))
{
throw new ArgumentException("Docker image must be provided when runner is 'docker'.", nameof(image));
}
var cwd = Directory.GetCurrentDirectory();
var startInfo = new ProcessStartInfo
{
FileName = "docker",
WorkingDirectory = cwd
};
startInfo.ArgumentList.Add("run");
startInfo.ArgumentList.Add("--rm");
startInfo.ArgumentList.Add("-v");
startInfo.ArgumentList.Add($"{cwd}:{cwd}");
startInfo.ArgumentList.Add("-v");
startInfo.ArgumentList.Add($"{targetDirectory}:/scan-target:ro");
startInfo.ArgumentList.Add("-v");
startInfo.ArgumentList.Add($"{resultsDirectory}:/scan-results");
startInfo.ArgumentList.Add("-w");
startInfo.ArgumentList.Add(cwd);
startInfo.ArgumentList.Add(image);
startInfo.ArgumentList.Add("--target");
startInfo.ArgumentList.Add("/scan-target");
startInfo.ArgumentList.Add("--output");
startInfo.ArgumentList.Add("/scan-results/scan.json");
foreach (var argument in args)
{
startInfo.ArgumentList.Add(argument);
}
startInfo.RedirectStandardError = true;
startInfo.RedirectStandardOutput = true;
startInfo.UseShellExecute = false;
return startInfo;
}
private static ProcessStartInfo BuildCustomRunnerStartInfo(string runner, string entry, IReadOnlyList<string> args)
{
var startInfo = new ProcessStartInfo
{
FileName = runner,
WorkingDirectory = Directory.GetCurrentDirectory()
};
if (!string.IsNullOrWhiteSpace(entry))
{
startInfo.ArgumentList.Add(entry);
}
foreach (var argument in args)
{
startInfo.ArgumentList.Add(argument);
}
startInfo.RedirectStandardError = true;
startInfo.RedirectStandardOutput = true;
startInfo.UseShellExecute = false;
return startInfo;
}
private static string ResolveResultsPath(string resultsDirectory, DateTimeOffset startTimestamp, HashSet<string> baseline)
{
var candidates = Directory.GetFiles(resultsDirectory, "*", SearchOption.AllDirectories);
string? newest = null;
DateTimeOffset newestTimestamp = startTimestamp;
foreach (var candidate in candidates)
{
if (baseline.Contains(candidate))
{
continue;
}
var info = new FileInfo(candidate);
if (info.LastWriteTimeUtc >= newestTimestamp)
{
newestTimestamp = info.LastWriteTimeUtc;
newest = candidate;
}
}
return newest ?? string.Empty;
}
private static string CreatePlaceholderResult(string resultsDirectory)
{
var fileName = $"scan-{DateTimeOffset.UtcNow:yyyyMMddHHmmss}.json";
var path = Path.Combine(resultsDirectory, fileName);
File.WriteAllText(path, "{\"status\":\"placeholder\"}");
return path;
}
}
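`ResolveResultsPath` selects the newest file written into the results directory since the scan started, ignoring anything captured in the pre-scan baseline. The selection rule can be sketched in a few lines of Python (paths and timestamps here are illustrative):

```python
def resolve_results_path(files, baseline, start_ts):
    """files: iterable of (path, mtime) pairs.

    Return the newest path that is not in the baseline set and whose
    mtime is at or after start_ts; empty string when nothing qualifies.
    """
    newest, newest_ts = None, start_ts
    for path, mtime in files:
        if path in baseline:
            continue  # existed before the scan started
        if mtime >= newest_ts:
            newest, newest_ts = path, mtime
    return newest or ""
```

The empty-string fallback corresponds to the placeholder-result branch in `RunAsync`.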

View File

@@ -0,0 +1,79 @@
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
namespace StellaOps.Cli.Services;
internal sealed class ScannerInstaller : IScannerInstaller
{
private readonly ILogger<ScannerInstaller> _logger;
public ScannerInstaller(ILogger<ScannerInstaller> logger)
{
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task InstallAsync(string artifactPath, bool verbose, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(artifactPath) || !File.Exists(artifactPath))
{
throw new FileNotFoundException("Scanner artifact not found for installation.", artifactPath);
}
// Current implementation assumes docker-based scanner bundle.
var processInfo = new ProcessStartInfo
{
FileName = "docker",
ArgumentList = { "load", "-i", artifactPath },
RedirectStandardOutput = true,
RedirectStandardError = true,
UseShellExecute = false
};
using var process = new Process { StartInfo = processInfo, EnableRaisingEvents = true };
process.OutputDataReceived += (_, args) =>
{
if (args.Data is null)
{
return;
}
if (verbose)
{
_logger.LogInformation("[install] {Line}", args.Data);
}
};
process.ErrorDataReceived += (_, args) =>
{
if (args.Data is null)
{
return;
}
_logger.LogError("[install] {Line}", args.Data);
};
_logger.LogInformation("Installing scanner container from {Path}...", artifactPath);
if (!process.Start())
{
throw new InvalidOperationException("Failed to start container installation process.");
}
process.BeginOutputReadLine();
process.BeginErrorReadLine();
await process.WaitForExitAsync(cancellationToken).ConfigureAwait(false);
if (process.ExitCode != 0)
{
throw new InvalidOperationException($"Container installation failed with exit code {process.ExitCode}.");
}
_logger.LogInformation("Scanner container installed successfully.");
}
}

View File

@@ -0,0 +1,41 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Configuration" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration.Binder" Version="8.0.1" />
<PackageReference Include="Microsoft.Extensions.Configuration.CommandLine" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration.EnvironmentVariables" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Http" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Logging.Console" Version="8.0.0" />
<PackageReference Include="NetEscapades.Configuration.Yaml" Version="2.1.0" />
<PackageReference Include="System.CommandLine" Version="2.0.0-beta5.25306.1" />
</ItemGroup>
<ItemGroup>
<Content Include="appsettings.json">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
<Content Include="appsettings.local.json" Condition="Exists('appsettings.local.json')">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
<Content Include="appsettings.yaml" Condition="Exists('appsettings.yaml')">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
<Content Include="appsettings.local.yaml" Condition="Exists('appsettings.local.yaml')">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\StellaOps.Configuration\StellaOps.Configuration.csproj" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,9 @@
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|Bootstrap configuration fallback (env → appsettings.{json,yaml})|DevEx/CLI|Core|**DONE** CLI loads `API_KEY`/`STELLAOPS_BACKEND_URL` from environment or local settings, defaulting to empty strings when unset.|
|Introduce command host & routing skeleton|DevEx/CLI|Configuration|**DONE** System.CommandLine (v2.0.0-beta5) router stitched with `scanner`, `scan`, `db`, and `config` verbs.|
|Scanner artifact download/install commands|Ops Integrator|Backend contracts|**DONE** `scanner download` caches bundles, validates SHA-256 (plus optional RSA signature), installs via `docker load`, persists metadata, and retries with exponential backoff.|
|Scan execution & result upload workflow|Ops Integrator, QA|Scanner cmd|**DONE** `scan run` drives container scans against directories, emits artefacts in `ResultsDirectory`, auto-uploads on success, and `scan upload` covers manual retries.|
|Feedser DB operations passthrough|DevEx/CLI|Backend, Feedser APIs|**DONE** `db fetch\|merge\|export` trigger `/jobs/*` endpoints with parameter binding and consistent exit codes.|
|CLI observability & tests|QA|Command host|**DONE** Added console logging defaults & configuration bootstrap tests; future metrics hooks tracked separately.|
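The "retries with exponential backoff" behaviour called out in the download task can be illustrated with a small helper; the base delay, growth factor, and cap below are illustrative values, not the CLI's actual configuration:

```python
def backoff_delays(attempts: int, base: float = 1.0, factor: float = 2.0, cap: float = 30.0):
    """Return a capped exponential delay (in seconds) for each retry attempt."""
    return [min(cap, base * factor ** i) for i in range(attempts)]
```

Capping the delay keeps a long retry sequence from stalling indefinitely; in practice a jitter term is often added so concurrent clients do not retry in lockstep.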

View File

@@ -0,0 +1,8 @@
using System.Diagnostics;
namespace StellaOps.Cli.Telemetry;
internal static class CliActivitySource
{
public static readonly ActivitySource Instance = new("StellaOps.Cli");
}

View File

@@ -0,0 +1,62 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
namespace StellaOps.Cli.Telemetry;
internal static class CliMetrics
{
private static readonly Meter Meter = new("StellaOps.Cli", "1.0.0");
private static readonly Counter<long> ScannerDownloadCounter = Meter.CreateCounter<long>("stellaops.cli.scanner.download.count");
private static readonly Counter<long> ScannerInstallCounter = Meter.CreateCounter<long>("stellaops.cli.scanner.install.count");
private static readonly Counter<long> ScanRunCounter = Meter.CreateCounter<long>("stellaops.cli.scan.run.count");
private static readonly Histogram<double> CommandDurationHistogram = Meter.CreateHistogram<double>("stellaops.cli.command.duration.ms");
public static void RecordScannerDownload(string channel, bool fromCache)
=> ScannerDownloadCounter.Add(1, new KeyValuePair<string, object?>[]
{
new("channel", channel),
new("cache", fromCache ? "hit" : "miss")
});
public static void RecordScannerInstall(string channel)
=> ScannerInstallCounter.Add(1, new KeyValuePair<string, object?>[] { new("channel", channel) });
public static void RecordScanRun(string runner, int exitCode)
=> ScanRunCounter.Add(1, new KeyValuePair<string, object?>[]
{
new("runner", runner),
new("exit_code", exitCode)
});
public static IDisposable MeasureCommandDuration(string command)
{
var start = DateTime.UtcNow;
return new DurationScope(command, start);
}
private sealed class DurationScope : IDisposable
{
private readonly string _command;
private readonly DateTime _start;
private bool _disposed;
public DurationScope(string command, DateTime start)
{
_command = command;
_start = start;
}
public void Dispose()
{
if (_disposed)
{
return;
}
_disposed = true;
var elapsed = (DateTime.UtcNow - _start).TotalMilliseconds;
CommandDurationHistogram.Record(elapsed, new KeyValuePair<string, object?>[] { new("command", _command) });
}
}
}
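`MeasureCommandDuration` follows the disposable-scope pattern: timing starts when the scope is created and the histogram is recorded exactly once when it is disposed. The equivalent shape in Python is a context manager; the `record` callback here is a stand-in for the real histogram:

```python
import time
from contextlib import contextmanager


@contextmanager
def measure_duration(command: str, record):
    """Invoke record(command, elapsed_ms) exactly once when the scope exits."""
    start = time.monotonic()
    try:
        yield
    finally:
        record(command, (time.monotonic() - start) * 1000.0)
```

Recording in `finally` (like `Dispose`) guarantees the measurement lands even when the wrapped command throws.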

View File

@@ -0,0 +1,8 @@
using Microsoft.Extensions.Logging;
namespace StellaOps.Cli.Telemetry;
internal sealed class VerbosityState
{
public LogLevel MinimumLevel { get; set; } = LogLevel.Information;
}

View File

@@ -0,0 +1,11 @@
{
"StellaOps": {
"ApiKey": "",
"BackendUrl": "",
"ScannerCacheDirectory": "scanners",
"ResultsDirectory": "results",
"DefaultRunner": "dotnet",
"ScannerSignaturePublicKeyPath": "",
"ScannerDownloadAttempts": 3
}
}

View File

@@ -0,0 +1,18 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Configuration" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration.Binder" Version="8.0.1" />
<PackageReference Include="Microsoft.Extensions.Configuration.FileExtensions" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration.EnvironmentVariables" Version="8.0.0" />
<PackageReference Include="Microsoft.Extensions.Configuration.Json" Version="8.0.0" />
<PackageReference Include="NetEscapades.Configuration.Yaml" Version="2.1.0" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,64 @@
using System;
using System.Collections.Generic;
using Microsoft.Extensions.Configuration;
namespace StellaOps.Configuration;
public sealed class StellaOpsBootstrapOptions<TOptions>
where TOptions : class, new()
{
public StellaOpsBootstrapOptions()
{
ConfigurationOptions = new StellaOpsConfigurationOptions();
}
internal StellaOpsConfigurationOptions ConfigurationOptions { get; }
public string? BasePath
{
get => ConfigurationOptions.BasePath;
set => ConfigurationOptions.BasePath = value;
}
public bool IncludeJsonFiles
{
get => ConfigurationOptions.IncludeJsonFiles;
set => ConfigurationOptions.IncludeJsonFiles = value;
}
public bool IncludeYamlFiles
{
get => ConfigurationOptions.IncludeYamlFiles;
set => ConfigurationOptions.IncludeYamlFiles = value;
}
public bool IncludeEnvironmentVariables
{
get => ConfigurationOptions.IncludeEnvironmentVariables;
set => ConfigurationOptions.IncludeEnvironmentVariables = value;
}
public string? EnvironmentPrefix
{
get => ConfigurationOptions.EnvironmentPrefix;
set => ConfigurationOptions.EnvironmentPrefix = value;
}
public IList<JsonConfigurationFile> JsonFiles => ConfigurationOptions.JsonFiles;
public IList<YamlConfigurationFile> YamlFiles => ConfigurationOptions.YamlFiles;
public string? BindingSection
{
get => ConfigurationOptions.BindingSection;
set => ConfigurationOptions.BindingSection = value;
}
public Action<IConfigurationBuilder>? ConfigureBuilder
{
get => ConfigurationOptions.ConfigureBuilder;
set => ConfigurationOptions.ConfigureBuilder = value;
}
public Action<TOptions, IConfiguration>? PostBind { get; set; }
}

View File

@@ -0,0 +1,106 @@
using System;
using Microsoft.Extensions.Configuration;
using NetEscapades.Configuration.Yaml;
namespace StellaOps.Configuration;
public static class StellaOpsConfigurationBootstrapper
{
public static StellaOpsConfigurationContext<TOptions> Build<TOptions>(
Action<StellaOpsBootstrapOptions<TOptions>>? configure = null)
where TOptions : class, new()
{
var bootstrapOptions = new StellaOpsBootstrapOptions<TOptions>();
configure?.Invoke(bootstrapOptions);
var configurationOptions = bootstrapOptions.ConfigurationOptions;
var builder = new ConfigurationBuilder();
if (!string.IsNullOrWhiteSpace(configurationOptions.BasePath))
{
builder.SetBasePath(configurationOptions.BasePath!);
}
if (configurationOptions.IncludeJsonFiles)
{
foreach (var file in configurationOptions.JsonFiles)
{
builder.AddJsonFile(file.Path, optional: file.Optional, reloadOnChange: file.ReloadOnChange);
}
}
if (configurationOptions.IncludeYamlFiles)
{
foreach (var file in configurationOptions.YamlFiles)
{
builder.AddYamlFile(file.Path, optional: file.Optional);
}
}
configurationOptions.ConfigureBuilder?.Invoke(builder);
if (configurationOptions.IncludeEnvironmentVariables)
{
builder.AddEnvironmentVariables(configurationOptions.EnvironmentPrefix);
}
var configuration = builder.Build();
IConfiguration bindingSource;
if (string.IsNullOrWhiteSpace(configurationOptions.BindingSection))
{
bindingSource = configuration;
}
else
{
bindingSource = configuration.GetSection(configurationOptions.BindingSection!);
}
var options = new TOptions();
bindingSource.Bind(options);
bootstrapOptions.PostBind?.Invoke(options, configuration);
return new StellaOpsConfigurationContext<TOptions>(configuration, options);
}
public static IConfigurationBuilder AddStellaOpsDefaults(
this IConfigurationBuilder builder,
Action<StellaOpsConfigurationOptions>? configure = null)
{
ArgumentNullException.ThrowIfNull(builder);
var options = new StellaOpsConfigurationOptions();
configure?.Invoke(options);
if (!string.IsNullOrWhiteSpace(options.BasePath))
{
builder.SetBasePath(options.BasePath!);
}
if (options.IncludeJsonFiles)
{
foreach (var file in options.JsonFiles)
{
builder.AddJsonFile(file.Path, optional: file.Optional, reloadOnChange: file.ReloadOnChange);
}
}
if (options.IncludeYamlFiles)
{
foreach (var file in options.YamlFiles)
{
builder.AddYamlFile(file.Path, optional: file.Optional);
}
}
options.ConfigureBuilder?.Invoke(builder);
if (options.IncludeEnvironmentVariables)
{
builder.AddEnvironmentVariables(options.EnvironmentPrefix);
}
return builder;
}
}
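The bootstrapper layers sources in a fixed order — JSON files, YAML files, custom sources, then environment variables — and in `Microsoft.Extensions.Configuration` the later source wins on key collisions, which is why environment variables are deliberately added last. The precedence rule reduces to a layered dictionary merge (keys below are illustrative):

```python
def merge_layers(*layers: dict) -> dict:
    """Later layers override earlier ones, mimicking IConfigurationBuilder source order."""
    merged: dict = {}
    for layer in layers:
        merged.update(layer)
    return merged
```

So a `StellaOps:BackendUrl` set in the environment overrides the same key from `appsettings.json`, matching the fallback behaviour the CLI documents.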

View File

@@ -0,0 +1,18 @@
using System;
using Microsoft.Extensions.Configuration;
namespace StellaOps.Configuration;
public sealed class StellaOpsConfigurationContext<TOptions>
where TOptions : class, new()
{
public StellaOpsConfigurationContext(IConfigurationRoot configuration, TOptions options)
{
Configuration = configuration ?? throw new ArgumentNullException(nameof(configuration));
Options = options ?? throw new ArgumentNullException(nameof(options));
}
public IConfigurationRoot Configuration { get; }
public TOptions Options { get; }
}

View File

@@ -0,0 +1,49 @@
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.Extensions.Configuration;
namespace StellaOps.Configuration;
/// <summary>
/// Defines how default StellaOps configuration sources are composed.
/// </summary>
public sealed class StellaOpsConfigurationOptions
{
public string? BasePath { get; set; } = Directory.GetCurrentDirectory();
public bool IncludeJsonFiles { get; set; } = true;
public bool IncludeYamlFiles { get; set; } = true;
public bool IncludeEnvironmentVariables { get; set; } = true;
public string? EnvironmentPrefix { get; set; }
public IList<JsonConfigurationFile> JsonFiles { get; } = new List<JsonConfigurationFile>
{
new("appsettings.json", true, false),
new("appsettings.local.json", true, false)
};
public IList<YamlConfigurationFile> YamlFiles { get; } = new List<YamlConfigurationFile>
{
new("appsettings.yaml", true),
new("appsettings.local.yaml", true)
};
/// <summary>
/// Optional hook to register additional configuration sources (e.g. module-specific YAML files).
/// </summary>
public Action<IConfigurationBuilder>? ConfigureBuilder { get; set; }
/// <summary>
/// Optional configuration section name used when binding strongly typed options.
/// Null or empty indicates the root.
/// </summary>
public string? BindingSection { get; set; }
}
public sealed record JsonConfigurationFile(string Path, bool Optional = true, bool ReloadOnChange = false);
public sealed record YamlConfigurationFile(string Path, bool Optional = true);

View File

@@ -0,0 +1,26 @@
using System;
using Microsoft.Extensions.Configuration;
namespace StellaOps.Configuration;
public static class StellaOpsOptionsBinder
{
public static TOptions BindOptions<TOptions>(
this IConfiguration configuration,
string? section = null,
Action<TOptions, IConfiguration>? postConfigure = null)
where TOptions : class, new()
{
ArgumentNullException.ThrowIfNull(configuration);
var options = new TOptions();
var bindingSource = string.IsNullOrWhiteSpace(section)
? configuration
: configuration.GetSection(section);
bindingSource.Bind(options);
postConfigure?.Invoke(options, configuration);
return options;
}
}

View File

@@ -1,6 +1,7 @@
using System;
using System.Globalization;
using System.IO;
using System.Linq;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Feedser.Storage.Mongo.Advisories;
@@ -64,7 +65,13 @@ public sealed class JsonFeedExporter : IFeedExporter
result.AdvisoryCount,
digest);
- if (existingState is not null && string.Equals(existingState.LastFullDigest, digest, StringComparison.Ordinal))
+ var manifest = result.Files
+ .Select(static file => new ExportFileRecord(file.RelativePath, file.Length, file.Digest))
+ .ToArray();
+ if (existingState is not null
+ && existingState.Files.Count > 0
+ && string.Equals(existingState.LastFullDigest, digest, StringComparison.Ordinal))
{
_logger.LogInformation("JSON export {ExportId} produced unchanged digest; skipping state update.", exportId);
TryDeleteDirectory(result.ExportDirectory);
@@ -90,6 +97,7 @@ public sealed class JsonFeedExporter : IFeedExporter
targetRepository: _options.TargetRepository,
exporterVersion: _exporterVersion,
resetBaseline: resetBaseline,
manifest: manifest,
cancellationToken: cancellationToken).ConfigureAwait(false);
await JsonExportManifestWriter.WriteAsync(result, digest, _exporterVersion, cancellationToken).ConfigureAwait(false);

View File

@@ -10,19 +10,22 @@ public sealed class TrivyDbExportPlannerTests
public void CreatePlan_ReturnsFullWhenStateMissing()
{
var planner = new TrivyDbExportPlanner();
- var plan = planner.CreatePlan(existingState: null, treeDigest: "sha256:abcd");
+ var manifest = new[] { new ExportFileRecord("path.json", 10, "sha256:a") };
+ var plan = planner.CreatePlan(existingState: null, treeDigest: "sha256:abcd", manifest);
Assert.Equal(TrivyDbExportMode.Full, plan.Mode);
Assert.Equal("sha256:abcd", plan.TreeDigest);
Assert.Null(plan.BaseExportId);
Assert.Null(plan.BaseManifestDigest);
Assert.True(plan.ResetBaseline);
Assert.Equal(manifest, plan.Manifest);
}
[Fact]
public void CreatePlan_ReturnsSkipWhenCursorMatches()
{
var planner = new TrivyDbExportPlanner();
var existingManifest = new[] { new ExportFileRecord("path.json", 10, "sha256:a") };
var state = new ExportStateRecord(
Id: TrivyDbFeedExporter.ExporterId,
BaseExportId: "20240810T000000Z",
@@ -32,21 +35,25 @@ public sealed class TrivyDbExportPlannerTests
ExportCursor: "sha256:unchanged",
TargetRepository: "feedser/trivy",
ExporterVersion: "1.0",
- UpdatedAt: DateTimeOffset.UtcNow);
+ UpdatedAt: DateTimeOffset.UtcNow,
+ Files: existingManifest);
- var plan = planner.CreatePlan(state, "sha256:unchanged");
+ var plan = planner.CreatePlan(state, "sha256:unchanged", existingManifest);
Assert.Equal(TrivyDbExportMode.Skip, plan.Mode);
Assert.Equal("sha256:unchanged", plan.TreeDigest);
Assert.Equal("20240810T000000Z", plan.BaseExportId);
Assert.Equal("sha256:base", plan.BaseManifestDigest);
Assert.False(plan.ResetBaseline);
Assert.Empty(plan.ChangedFiles);
Assert.Empty(plan.RemovedPaths);
}
[Fact]
public void CreatePlan_ReturnsFullWhenCursorDiffers()
{
var planner = new TrivyDbExportPlanner();
var manifest = new[] { new ExportFileRecord("path.json", 10, "sha256:a") };
var state = new ExportStateRecord(
Id: TrivyDbFeedExporter.ExporterId,
BaseExportId: "20240810T000000Z",
@@ -56,20 +63,23 @@ public sealed class TrivyDbExportPlannerTests
ExportCursor: "sha256:old",
TargetRepository: "feedser/trivy",
ExporterVersion: "1.0",
UpdatedAt: DateTimeOffset.UtcNow);
UpdatedAt: DateTimeOffset.UtcNow,
Files: manifest);
var plan = planner.CreatePlan(state, "sha256:new");
var newManifest = new[] { new ExportFileRecord("path.json", 10, "sha256:b") };
var plan = planner.CreatePlan(state, "sha256:new", newManifest);
Assert.Equal(TrivyDbExportMode.Full, plan.Mode);
Assert.Equal(TrivyDbExportMode.Delta, plan.Mode);
Assert.Equal("sha256:new", plan.TreeDigest);
Assert.Equal("20240810T000000Z", plan.BaseExportId);
Assert.Equal("sha256:base", plan.BaseManifestDigest);
Assert.False(plan.ResetBaseline);
Assert.Single(plan.ChangedFiles);
var deltaState = state with { LastDeltaDigest = "sha256:delta" };
var deltaPlan = planner.CreatePlan(deltaState, "sha256:newer");
var deltaPlan = planner.CreatePlan(deltaState, "sha256:newer", newManifest);
Assert.Equal(TrivyDbExportMode.Full, deltaPlan.Mode);
Assert.Equal(TrivyDbExportMode.Delta, deltaPlan.Mode);
Assert.True(deltaPlan.ResetBaseline);
}
}

View File

@@ -1,3 +1,4 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
@@ -315,7 +316,8 @@ public sealed class TrivyDbFeedExporterTests : IDisposable
ExportCursor: "sha256:old",
TargetRepository: "registry.example/trivy",
ExporterVersion: "0.9.0",
UpdatedAt: timeProvider.GetUtcNow().AddMinutes(-30));
UpdatedAt: timeProvider.GetUtcNow().AddMinutes(-30),
Files: Array.Empty<ExportFileRecord>());
await stateStore.UpsertAsync(existingRecord, CancellationToken.None);
var stateManager = new ExportStateManager(stateStore, timeProvider);
@@ -350,6 +352,7 @@ public sealed class TrivyDbFeedExporterTests : IDisposable
Assert.Null(updated.LastDeltaDigest);
Assert.NotEqual("sha256:old", updated.ExportCursor);
Assert.Equal("registry.example/trivy", updated.TargetRepository);
Assert.NotEmpty(updated.Files);
}
private static Advisory CreateSampleAdvisory(

View File

@@ -10,4 +10,4 @@
|End-to-end tests with small dataset|QA|Exporters|DONE added deterministic round-trip test covering OCI layout, media types, and digest stability w/ repeated inputs.|
|ExportState persistence & idempotence|BE-Export|Storage.Mongo|DONE baseline resets wired into `ExportStateManager`, planner signals resets after delta runs, and exporters update state w/ repository-aware baseline rotation + tests.|
|Streamed package building to avoid large copies|BE-Export|Exporters|DONE metadata/config now reuse backing arrays and OCI writer streams directly without double buffering.|
|Plan incremental/delta exports|BE-Export|Exporters|TODO design reuse of existing blobs/layers when inputs unchanged instead of rewriting full trees each run.|
|Plan incremental/delta exports|BE-Export|Exporters|DOING export state now persists per-file manifests; planner detects changed/removed files and schedules delta vs full runs; groundwork laid for layer reuse.|

View File

@@ -1,8 +1,14 @@
namespace StellaOps.Feedser.Exporter.TrivyDb;
using System.Collections.Generic;
using StellaOps.Feedser.Storage.Mongo.Exporting;
public sealed record TrivyDbExportPlan(
TrivyDbExportMode Mode,
string TreeDigest,
string? BaseExportId,
string? BaseManifestDigest,
bool ResetBaseline);
bool ResetBaseline,
IReadOnlyList<ExportFileRecord> Manifest,
IReadOnlyList<ExportFileRecord> ChangedFiles,
IReadOnlyList<string> RemovedPaths);

View File

@@ -3,40 +3,100 @@ using StellaOps.Feedser.Storage.Mongo.Exporting;
namespace StellaOps.Feedser.Exporter.TrivyDb;
using System;
using System.Collections.Generic;
using System.Linq;
using StellaOps.Feedser.Storage.Mongo.Exporting;
public sealed class TrivyDbExportPlanner
{
public TrivyDbExportPlan CreatePlan(ExportStateRecord? existingState, string treeDigest)
public TrivyDbExportPlan CreatePlan(
ExportStateRecord? existingState,
string treeDigest,
IReadOnlyList<ExportFileRecord> manifest)
{
ArgumentException.ThrowIfNullOrEmpty(treeDigest);
manifest ??= Array.Empty<ExportFileRecord>();
if (existingState is null)
if (existingState is null || (existingState.Files?.Count ?? 0) == 0)
{
return new TrivyDbExportPlan(
TrivyDbExportMode.Full,
treeDigest,
BaseExportId: null,
BaseManifestDigest: null,
ResetBaseline: true);
BaseExportId: existingState?.BaseExportId,
BaseManifestDigest: existingState?.LastFullDigest,
ResetBaseline: true,
Manifest: manifest,
ChangedFiles: manifest,
RemovedPaths: Array.Empty<string>());
}
if (string.Equals(existingState.ExportCursor, treeDigest, StringComparison.Ordinal))
var existingFiles = existingState.Files ?? Array.Empty<ExportFileRecord>();
var cursorMatches = string.Equals(existingState.ExportCursor, treeDigest, StringComparison.Ordinal);
if (cursorMatches)
{
return new TrivyDbExportPlan(
TrivyDbExportMode.Skip,
treeDigest,
existingState.BaseExportId,
existingState.LastFullDigest,
ResetBaseline: false);
ResetBaseline: false,
Manifest: existingFiles,
ChangedFiles: Array.Empty<ExportFileRecord>(),
RemovedPaths: Array.Empty<string>());
}
var existingMap = existingFiles.ToDictionary(static file => file.Path, StringComparer.OrdinalIgnoreCase);
var newMap = manifest.ToDictionary(static file => file.Path, StringComparer.OrdinalIgnoreCase);
var removed = existingMap.Keys
.Where(path => !newMap.ContainsKey(path))
.ToArray();
if (removed.Length > 0)
{
return new TrivyDbExportPlan(
TrivyDbExportMode.Full,
treeDigest,
existingState.BaseExportId,
existingState.LastFullDigest,
ResetBaseline: true,
Manifest: manifest,
ChangedFiles: manifest,
RemovedPaths: removed);
}
var changed = new List<ExportFileRecord>();
foreach (var file in manifest)
{
if (!existingMap.TryGetValue(file.Path, out var previous) || !string.Equals(previous.Digest, file.Digest, StringComparison.Ordinal))
{
changed.Add(file);
}
}
if (changed.Count == 0)
{
return new TrivyDbExportPlan(
TrivyDbExportMode.Skip,
treeDigest,
existingState.BaseExportId,
existingState.LastFullDigest,
ResetBaseline: false,
Manifest: existingFiles,
ChangedFiles: Array.Empty<ExportFileRecord>(),
RemovedPaths: Array.Empty<string>());
}
var resetBaseline = existingState.LastDeltaDigest is not null;
// Placeholder for future delta support; current behavior always rebuilds when the tree changes.
return new TrivyDbExportPlan(
TrivyDbExportMode.Full,
TrivyDbExportMode.Delta,
treeDigest,
existingState.BaseExportId,
existingState.LastFullDigest,
resetBaseline);
resetBaseline,
Manifest: manifest,
ChangedFiles: changed,
RemovedPaths: Array.Empty<string>());
}
}
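The planner's decision logic above can be summarized as: no prior manifest forces a full export, a matching cursor or identical manifest skips, a removed file forces a full rebuild (a deletion cannot be expressed as an additive layer), and digest changes alone yield a delta. A minimal language-neutral Python sketch of that decision table (function name and tuple shape are illustrative, not the real API; the C# version also compares paths case-insensitively and tracks `ResetBaseline`):

```python
def create_plan(existing_files, new_files, cursor_matches):
    """Sketch of TrivyDbExportPlanner.CreatePlan.

    existing_files / new_files: dicts of path -> digest (or None for no state).
    Returns (mode, changed_paths, removed_paths).
    """
    if not existing_files:
        # No baseline manifest: everything counts as changed.
        return ("full", sorted(new_files), [])
    if cursor_matches:
        return ("skip", [], [])
    removed = [p for p in existing_files if p not in new_files]
    if removed:
        # A deleted file cannot be expressed as an additive delta layer.
        return ("full", sorted(new_files), removed)
    changed = [p for p, digest in new_files.items()
               if existing_files.get(p) != digest]
    if not changed:
        return ("skip", [], [])
    return ("delta", changed, [])
```

This mirrors the three test cases in the diff: missing state is full, matching cursor is skip, and a digest change produces a delta carrying only the changed files.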

View File

@@ -85,9 +85,13 @@ public sealed class TrivyDbFeedExporter : IFeedExporter
jsonResult.AdvisoryCount,
jsonResult.TotalBytes);
var manifest = jsonResult.Files
.Select(static file => new ExportFileRecord(file.RelativePath, file.Length, file.Digest))
.ToArray();
var treeDigest = ExportDigestCalculator.ComputeTreeDigest(jsonResult);
var existingState = await _stateManager.GetAsync(ExporterId, cancellationToken).ConfigureAwait(false);
var plan = _exportPlanner.CreatePlan(existingState, treeDigest);
var plan = _exportPlanner.CreatePlan(existingState, treeDigest, manifest);
if (plan.Mode == TrivyDbExportMode.Skip)
{
@@ -104,6 +108,14 @@ public sealed class TrivyDbFeedExporter : IFeedExporter
return;
}
if (plan.Mode == TrivyDbExportMode.Delta)
{
_logger.LogInformation(
"Trivy DB export {ExportId} identified {ChangedCount} changed JSON files.",
exportId,
plan.ChangedFiles.Count);
}
var builderResult = await _builder.BuildAsync(jsonResult, exportedAt, exportId, cancellationToken).ConfigureAwait(false);
var metadataBytes = CreateMetadataJson(builderResult.BuilderMetadata, treeDigest, jsonResult, exportedAt);
@@ -142,15 +154,29 @@ public sealed class TrivyDbFeedExporter : IFeedExporter
resetBaseline = true;
}
await _stateManager.StoreFullExportAsync(
ExporterId,
exportId,
ociResult.ManifestDigest,
cursor: treeDigest,
targetRepository: _options.TargetRepository,
exporterVersion: _exporterVersion,
resetBaseline: resetBaseline,
cancellationToken: cancellationToken).ConfigureAwait(false);
if (plan.Mode == TrivyDbExportMode.Full || resetBaseline)
{
await _stateManager.StoreFullExportAsync(
ExporterId,
exportId,
ociResult.ManifestDigest,
cursor: treeDigest,
targetRepository: _options.TargetRepository,
exporterVersion: _exporterVersion,
resetBaseline: resetBaseline,
manifest: plan.Manifest,
cancellationToken: cancellationToken).ConfigureAwait(false);
}
else
{
await _stateManager.StoreDeltaExportAsync(
ExporterId,
deltaDigest: treeDigest,
cursor: treeDigest,
exporterVersion: _exporterVersion,
manifest: plan.Manifest,
cancellationToken: cancellationToken).ConfigureAwait(false);
}
await CreateOfflineBundleAsync(destination, exportId, exportedAt, cancellationToken).ConfigureAwait(false);
}

View File

@@ -16,6 +16,7 @@ public sealed class AdvisoryPrecedenceMergerTests
{
var timeProvider = new FakeTimeProvider(new DateTimeOffset(2025, 1, 1, 0, 0, 0, TimeSpan.Zero));
var merger = new AdvisoryPrecedenceMerger(new AffectedPackagePrecedenceResolver(), timeProvider);
using var metrics = new MetricCollector("StellaOps.Feedser.Merge");
var (redHat, nvd) = CreateVendorAndRegistryAdvisories();
var expectedMergeTimestamp = timeProvider.GetUtcNow();
@@ -46,6 +47,14 @@ public sealed class AdvisoryPrecedenceMergerTests
Assert.Equal(expectedMergeTimestamp, mergeProvenance.RecordedAt);
Assert.Contains("redhat", mergeProvenance.Value, StringComparison.OrdinalIgnoreCase);
Assert.Contains("nvd", mergeProvenance.Value, StringComparison.OrdinalIgnoreCase);
var rangeMeasurement = Assert.Single(metrics.Measurements, measurement => measurement.Name == "feedser.merge.range_overrides");
Assert.Equal(1, rangeMeasurement.Value);
Assert.Contains(rangeMeasurement.Tags, tag => string.Equals(tag.Key, "suppressed_source", StringComparison.Ordinal) && tag.Value?.ToString()?.Contains("nvd", StringComparison.OrdinalIgnoreCase) == true);
var severityConflict = Assert.Single(metrics.Measurements, measurement => measurement.Name == "feedser.merge.conflicts");
Assert.Equal(1, severityConflict.Value);
Assert.Contains(severityConflict.Tags, tag => string.Equals(tag.Key, "type", StringComparison.Ordinal) && string.Equals(tag.Value?.ToString(), "severity", StringComparison.OrdinalIgnoreCase));
}
[Fact]
@@ -155,6 +164,13 @@ public sealed class AdvisoryPrecedenceMergerTests
Assert.Contains(overrideMeasurement.Tags, tag => tag.Key == "primary_source" && string.Equals(tag.Value?.ToString(), "nvd", StringComparison.OrdinalIgnoreCase));
Assert.Contains(overrideMeasurement.Tags, tag => tag.Key == "suppressed_source" && tag.Value?.ToString()?.Contains("redhat", StringComparison.OrdinalIgnoreCase) == true);
Assert.DoesNotContain(metrics.Measurements, measurement => measurement.Name == "feedser.merge.range_overrides");
var conflictMeasurement = Assert.Single(metrics.Measurements, measurement => measurement.Name == "feedser.merge.conflicts");
Assert.Equal(1, conflictMeasurement.Value);
Assert.Contains(conflictMeasurement.Tags, tag => tag.Key == "type" && string.Equals(tag.Value?.ToString(), "severity", StringComparison.OrdinalIgnoreCase));
Assert.Contains(conflictMeasurement.Tags, tag => tag.Key == "reason" && string.Equals(tag.Value?.ToString(), "mismatch", StringComparison.OrdinalIgnoreCase));
var logEntry = Assert.Single(logger.Entries, entry => entry.EventId.Name == "AdvisoryOverride");
Assert.Equal(LogLevel.Information, logEntry.Level);
Assert.NotNull(logEntry.StructuredState);

View File

@@ -45,14 +45,21 @@ public sealed class AffectedPackagePrecedenceResolverTests
});
var resolver = new AffectedPackagePrecedenceResolver();
var merged = resolver.Merge(new[] { nvd, redHat });
var result = resolver.Merge(new[] { nvd, redHat });
var package = Assert.Single(merged);
var package = Assert.Single(result.Packages);
Assert.Equal("cpe:2.3:o:redhat:enterprise_linux:9:*:*:*:*:*:*:*", package.Identifier);
Assert.Empty(package.VersionRanges); // NVD range overridden
Assert.Contains(package.Statuses, status => status.Status == "known_affected");
Assert.Contains(package.Provenance, provenance => provenance.Source == "redhat");
Assert.Contains(package.Provenance, provenance => provenance.Source == "nvd");
var rangeOverride = Assert.Single(result.Overrides);
Assert.Equal("cpe:2.3:o:redhat:enterprise_linux:9:*:*:*:*:*:*:*", rangeOverride.Identifier);
Assert.Equal(0, rangeOverride.PrimaryRank);
Assert.True(rangeOverride.SuppressedRank >= rangeOverride.PrimaryRank);
Assert.Equal(0, rangeOverride.PrimaryRangeCount);
Assert.Equal(1, rangeOverride.SuppressedRangeCount);
}
[Fact]
@@ -78,11 +85,12 @@ public sealed class AffectedPackagePrecedenceResolverTests
});
var resolver = new AffectedPackagePrecedenceResolver();
var merged = resolver.Merge(new[] { nvd });
var result = resolver.Merge(new[] { nvd });
var package = Assert.Single(merged);
var package = Assert.Single(result.Packages);
Assert.Equal(nvd.Identifier, package.Identifier);
Assert.Equal(nvd.VersionRanges.Single().RangeExpression, package.VersionRanges.Single().RangeExpression);
Assert.Equal("nvd", package.Provenance.Single().Source);
Assert.Empty(result.Overrides);
}
}
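The resolver tests above exercise precedence between a vendor package (Red Hat) and a registry package (NVD) for the same identifier: the higher-precedence source's ranges win, and each suppressed range set is recorded as an override with rank and range counts. A hedged Python sketch of that shape (field names and the dict-based records are assumptions; the real resolver works on `AffectedPackage` models and merges provenance too):

```python
def merge_packages(packages, rank):
    """Sketch of AffectedPackagePrecedenceResolver.Merge.

    packages: list of dicts with keys: identifier, source, ranges.
    rank: source name -> precedence (0 = most authoritative).
    Returns (merged_packages, overrides).
    """
    merged, overrides = {}, []
    # Stable sort: the lowest-rank source for each identifier wins.
    for pkg in sorted(packages, key=lambda p: rank[p["source"]]):
        ident = pkg["identifier"]
        if ident not in merged:
            merged[ident] = pkg
        elif pkg["ranges"]:
            # A lower-precedence source contributed ranges we are dropping.
            winner = merged[ident]
            overrides.append({
                "identifier": ident,
                "primary_source": winner["source"],
                "suppressed_source": pkg["source"],
                "primary_range_count": len(winner["ranges"]),
                "suppressed_range_count": len(pkg["ranges"]),
            })
    return list(merged.values()), overrides
```

With `rank = {"redhat": 0, "nvd": 5}` this reproduces the first test: one merged package sourced from Red Hat and a single override showing zero primary ranges against one suppressed NVD range.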

View File

@@ -0,0 +1,135 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging.Abstractions;
using MongoDB.Driver;
using StellaOps.Feedser.Merge.Services;
using StellaOps.Feedser.Storage.Mongo;
using StellaOps.Feedser.Storage.Mongo.Aliases;
using StellaOps.Feedser.Testing;
namespace StellaOps.Feedser.Merge.Tests;
[Collection("mongo-fixture")]
public sealed class AliasGraphResolverTests : IClassFixture<MongoIntegrationFixture>
{
private readonly MongoIntegrationFixture _fixture;
public AliasGraphResolverTests(MongoIntegrationFixture fixture)
{
_fixture = fixture;
}
[Fact]
public async Task ResolveAsync_ReturnsCollisions_WhenAliasesOverlap()
{
await DropAliasCollectionAsync();
var aliasStore = new AliasStore(_fixture.Database, NullLogger<AliasStore>.Instance);
var resolver = new AliasGraphResolver(aliasStore);
var timestamp = DateTimeOffset.UtcNow;
await aliasStore.ReplaceAsync(
"ADV-1",
new[] { new AliasEntry("CVE", "CVE-2025-2000"), new AliasEntry(AliasStoreConstants.PrimaryScheme, "ADV-1") },
timestamp,
CancellationToken.None);
await aliasStore.ReplaceAsync(
"ADV-2",
new[] { new AliasEntry("CVE", "CVE-2025-2000"), new AliasEntry(AliasStoreConstants.PrimaryScheme, "ADV-2") },
timestamp.AddMinutes(1),
CancellationToken.None);
var result = await resolver.ResolveAsync("ADV-1", CancellationToken.None);
Assert.NotNull(result);
Assert.Equal("ADV-1", result.AdvisoryKey);
Assert.NotEmpty(result.Collisions);
var collision = Assert.Single(result.Collisions);
Assert.Equal("CVE", collision.Scheme);
Assert.Contains("ADV-1", collision.AdvisoryKeys);
Assert.Contains("ADV-2", collision.AdvisoryKeys);
}
[Fact]
public async Task BuildComponentAsync_TracesConnectedAdvisories()
{
await DropAliasCollectionAsync();
var aliasStore = new AliasStore(_fixture.Database, NullLogger<AliasStore>.Instance);
var resolver = new AliasGraphResolver(aliasStore);
var timestamp = DateTimeOffset.UtcNow;
await aliasStore.ReplaceAsync(
"ADV-A",
new[] { new AliasEntry("CVE", "CVE-2025-4000"), new AliasEntry(AliasStoreConstants.PrimaryScheme, "ADV-A") },
timestamp,
CancellationToken.None);
await aliasStore.ReplaceAsync(
"ADV-B",
new[] { new AliasEntry("CVE", "CVE-2025-4000"), new AliasEntry(AliasStoreConstants.PrimaryScheme, "ADV-B"), new AliasEntry("OSV", "OSV-2025-1") },
timestamp.AddMinutes(1),
CancellationToken.None);
await aliasStore.ReplaceAsync(
"ADV-C",
new[] { new AliasEntry("OSV", "OSV-2025-1"), new AliasEntry(AliasStoreConstants.PrimaryScheme, "ADV-C") },
timestamp.AddMinutes(2),
CancellationToken.None);
var component = await resolver.BuildComponentAsync("ADV-A", CancellationToken.None);
Assert.Contains("ADV-A", component.AdvisoryKeys, StringComparer.OrdinalIgnoreCase);
Assert.Contains("ADV-B", component.AdvisoryKeys, StringComparer.OrdinalIgnoreCase);
Assert.Contains("ADV-C", component.AdvisoryKeys, StringComparer.OrdinalIgnoreCase);
Assert.NotEmpty(component.Collisions);
Assert.True(component.AliasMap.ContainsKey("ADV-A"));
Assert.Contains(component.AliasMap["ADV-B"], record => record.Scheme == "OSV" && record.Value == "OSV-2025-1");
}
private async Task DropAliasCollectionAsync()
{
try
{
await _fixture.Database.DropCollectionAsync(MongoStorageDefaults.Collections.Alias);
}
catch (MongoDB.Driver.MongoCommandException ex) when (ex.CodeName == "NamespaceNotFound" || ex.Message.Contains("ns not found", StringComparison.OrdinalIgnoreCase))
{
}
}
[Fact]
public async Task BuildComponentAsync_LinksOsvAndGhsaAliases()
{
await DropAliasCollectionAsync();
var aliasStore = new AliasStore(_fixture.Database, NullLogger<AliasStore>.Instance);
var resolver = new AliasGraphResolver(aliasStore);
var timestamp = DateTimeOffset.UtcNow;
await aliasStore.ReplaceAsync(
"ADV-OSV",
new[]
{
new AliasEntry("OSV", "OSV-2025-2001"),
new AliasEntry("GHSA", "GHSA-zzzz-zzzz-zzzz"),
new AliasEntry(AliasStoreConstants.PrimaryScheme, "ADV-OSV"),
},
timestamp,
CancellationToken.None);
await aliasStore.ReplaceAsync(
"ADV-GHSA",
new[]
{
new AliasEntry("GHSA", "GHSA-zzzz-zzzz-zzzz"),
new AliasEntry(AliasStoreConstants.PrimaryScheme, "ADV-GHSA"),
},
timestamp.AddMinutes(1),
CancellationToken.None);
var component = await resolver.BuildComponentAsync("ADV-OSV", CancellationToken.None);
Assert.Contains("ADV-GHSA", component.AdvisoryKeys, StringComparer.OrdinalIgnoreCase);
Assert.Contains(component.Collisions, collision => collision.Scheme == "GHSA" && collision.Value == "GHSA-zzzz-zzzz-zzzz");
}
}
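The component-building behavior these tests pin down is a connected-component walk: two advisories belong to the same component whenever they share any alias value, and an alias claimed by more than one advisory is reported as a collision. A minimal Python sketch of that traversal (a plain BFS over an inverted alias index; the real `AliasGraphResolver` resolves aliases asynchronously from the Mongo alias store):

```python
from collections import deque

def build_component(seed, alias_map):
    """Sketch of AliasGraphResolver.BuildComponentAsync.

    alias_map: advisory_key -> set of (scheme, value) alias tuples.
    Returns (connected advisory keys, collisions keyed by alias tuple).
    """
    # Invert the map: alias tuple -> advisories that carry it.
    by_alias = {}
    for key, aliases in alias_map.items():
        for alias in aliases:
            by_alias.setdefault(alias, set()).add(key)

    seen = {seed}
    queue = deque([seed])
    while queue:
        current = queue.popleft()
        for alias in alias_map.get(current, ()):
            for neighbour in by_alias.get(alias, ()):
                if neighbour not in seen:
                    seen.add(neighbour)
                    queue.append(neighbour)

    # An alias shared by two or more component members is a collision.
    collisions = {alias: keys for alias, keys in by_alias.items()
                  if len(keys & seen) > 1}
    return seen, collisions
```

Using the ADV-A/ADV-B/ADV-C fixture from the tests, seeding at ADV-A reaches ADV-C transitively through the shared OSV alias on ADV-B.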

View File

@@ -76,6 +76,26 @@ public sealed class MergePrecedenceIntegrationTests : IAsyncLifetime
Assert.True(persisted.BeforeHash.Length > 0);
}
[Fact]
public async Task MergePipeline_IsDeterministicAcrossRuns()
{
await EnsureInitializedAsync();
var merger = _merger!;
var calculator = new CanonicalHashCalculator();
var first = merger.Merge(new[] { CreateNvdBaseline(), CreateVendorOverride() });
var second = merger.Merge(new[] { CreateNvdBaseline(), CreateVendorOverride() });
var firstHash = calculator.ComputeHash(first);
var secondHash = calculator.ComputeHash(second);
Assert.Equal(firstHash, secondHash);
Assert.Equal(first.AdvisoryKey, second.AdvisoryKey);
Assert.Equal(first.Aliases.Length, second.Aliases.Length);
Assert.True(first.Aliases.SequenceEqual(second.Aliases));
}
public async Task InitializeAsync()
{
_timeProvider = new FakeTimeProvider(new DateTimeOffset(2025, 3, 1, 0, 0, 0, TimeSpan.Zero))

View File

@@ -0,0 +1,6 @@
namespace StellaOps.Feedser.Merge.Jobs;
internal static class MergeJobKinds
{
public const string Reconcile = "merge:reconcile";
}

View File

@@ -0,0 +1,43 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using StellaOps.Feedser.Core.Jobs;
using StellaOps.Feedser.Merge.Services;
namespace StellaOps.Feedser.Merge.Jobs;
public sealed class MergeReconcileJob : IJob
{
private readonly AdvisoryMergeService _mergeService;
private readonly ILogger<MergeReconcileJob> _logger;
public MergeReconcileJob(AdvisoryMergeService mergeService, ILogger<MergeReconcileJob> logger)
{
_mergeService = mergeService ?? throw new ArgumentNullException(nameof(mergeService));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task ExecuteAsync(JobExecutionContext context, CancellationToken cancellationToken)
{
if (!context.Parameters.TryGetValue("seed", out var seedValue) || seedValue is not string seed || string.IsNullOrWhiteSpace(seed))
{
context.Logger.LogWarning("merge:reconcile job requires a non-empty 'seed' parameter.");
return;
}
var result = await _mergeService.MergeAsync(seed, cancellationToken).ConfigureAwait(false);
if (result.Merged is null)
{
_logger.LogInformation("No advisories available to merge for alias component seeded by {Seed}", seed);
return;
}
_logger.LogInformation(
"Merged alias component seeded by {Seed} into canonical {Canonical} using {Count} advisories; collisions={Collisions}",
seed,
result.CanonicalAdvisoryKey,
result.Inputs.Count,
result.Component.Collisions.Count);
}
}

View File

@@ -0,0 +1,41 @@
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Logging;
using StellaOps.Feedser.Merge.Jobs;
using StellaOps.Feedser.Merge.Options;
using StellaOps.Feedser.Merge.Services;
namespace StellaOps.Feedser.Merge;
public static class MergeServiceCollectionExtensions
{
public static IServiceCollection AddMergeModule(this IServiceCollection services, IConfiguration configuration)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configuration);
services.TryAddSingleton<CanonicalHashCalculator>();
services.TryAddSingleton<AliasGraphResolver>();
services.TryAddSingleton<AffectedPackagePrecedenceResolver>(sp =>
{
var options = configuration.GetSection("feedser:merge:precedence").Get<AdvisoryPrecedenceOptions>();
return options is null ? new AffectedPackagePrecedenceResolver() : new AffectedPackagePrecedenceResolver(options);
});
services.TryAddSingleton<AdvisoryPrecedenceMerger>(sp =>
{
var resolver = sp.GetRequiredService<AffectedPackagePrecedenceResolver>();
var options = configuration.GetSection("feedser:merge:precedence").Get<AdvisoryPrecedenceOptions>();
var timeProvider = sp.GetRequiredService<TimeProvider>();
var logger = sp.GetRequiredService<ILogger<AdvisoryPrecedenceMerger>>();
return new AdvisoryPrecedenceMerger(resolver, options, timeProvider, logger);
});
services.TryAddSingleton<MergeEventWriter>();
services.TryAddSingleton<AdvisoryMergeService>();
services.AddTransient<MergeReconcileJob>();
return services;
}
}

View File

@@ -0,0 +1,96 @@
using System;
using System.Collections.Generic;
namespace StellaOps.Feedser.Merge.Options;
/// <summary>
/// Provides the built-in precedence table used by the merge engine when no overrides are supplied.
/// </summary>
internal static class AdvisoryPrecedenceDefaults
{
public static IReadOnlyDictionary<string, int> Rankings { get; } = CreateDefaultTable();
private static IReadOnlyDictionary<string, int> CreateDefaultTable()
{
var table = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
// 0: distro PSIRTs/OVAL feeds (authoritative for OS packages).
Add(table, 0,
"redhat",
"ubuntu",
"distro-ubuntu",
"debian",
"distro-debian",
"suse",
"distro-suse");
// 1: vendor PSIRTs (authoritative product advisories).
Add(table, 1,
"msrc",
"vndr-msrc",
"vndr-oracle",
"vndr_oracle",
"oracle",
"vndr-adobe",
"adobe",
"vndr-apple",
"apple",
"vndr-cisco",
"cisco",
"vmware",
"vndr-vmware",
"vndr_vmware",
"vndr-chromium",
"chromium",
"vendor");
// 2: ecosystem registries (OSS package maintainers).
Add(table, 2,
"ghsa",
"osv",
"cve");
// 3: regional CERT / ICS enrichment feeds.
Add(table, 3,
"jvn",
"acsc",
"cccs",
"cert-fr",
"certfr",
"cert-in",
"certin",
"cert-cc",
"certcc",
"certbund",
"cert-bund",
"ru-bdu",
"ru-nkcki",
"kisa",
"ics-cisa",
"ics-kaspersky");
// 4: KEV / exploit catalogue annotations (flag only).
Add(table, 4,
"kev",
"cisa-kev");
// 5: public registries (baseline data).
Add(table, 5,
"nvd");
return table;
}
private static void Add(IDictionary<string, int> table, int rank, params string[] sources)
{
foreach (var source in sources)
{
if (string.IsNullOrWhiteSpace(source))
{
continue;
}
table[source] = rank;
}
}
}
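The ranking table groups many source-name spellings under one rank, and the merger treats lower ranks as more authoritative while falling back to a rank past every known feed for unrecognized sources. A hedged Python sketch of the lookup and ordering (an excerpt of the table; the fallback value of 6 is an assumption about `_fallbackRank`):

```python
PRECEDENCE = {
    # Excerpt of AdvisoryPrecedenceDefaults.Rankings.
    "redhat": 0, "ubuntu": 0, "debian": 0, "suse": 0,  # distro PSIRT/OVAL
    "msrc": 1, "oracle": 1, "adobe": 1, "vendor": 1,   # vendor PSIRTs
    "ghsa": 2, "osv": 2, "cve": 2,                     # ecosystem registries
    "jvn": 3, "certfr": 3, "kisa": 3,                  # CERT enrichment
    "kev": 4, "cisa-kev": 4,                           # exploit catalogues
    "nvd": 5,                                          # public baseline
}

def rank(source, fallback=6):
    # Lowest rank wins; unknown sources sort after every known feed.
    return PRECEDENCE.get(source.lower(), fallback)

def order_by_precedence(advisories):
    # advisories: iterable of (source, payload). The sort is stable, so
    # equally ranked sources keep input order, keeping merges deterministic.
    return sorted(advisories, key=lambda item: rank(item[0]))
```

The case-insensitive lookup mirrors the `StringComparer.OrdinalIgnoreCase` dictionary in the C# defaults.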

View File

@@ -0,0 +1,190 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using StellaOps.Feedser.Models;
using StellaOps.Feedser.Storage.Mongo.Advisories;
using StellaOps.Feedser.Storage.Mongo.Aliases;
using StellaOps.Feedser.Storage.Mongo.MergeEvents;
namespace StellaOps.Feedser.Merge.Services;
public sealed class AdvisoryMergeService
{
private static readonly Meter MergeMeter = new("StellaOps.Feedser.Merge");
private static readonly Counter<long> AliasCollisionCounter = MergeMeter.CreateCounter<long>(
"feedser.merge.identity_conflicts",
unit: "count",
description: "Number of alias collisions detected during merge.");
private static readonly string[] PreferredAliasSchemes =
{
AliasSchemes.Cve,
AliasSchemes.Ghsa,
AliasSchemes.OsV,
AliasSchemes.Msrc,
};
private readonly AliasGraphResolver _aliasResolver;
private readonly IAdvisoryStore _advisoryStore;
private readonly AdvisoryPrecedenceMerger _precedenceMerger;
private readonly MergeEventWriter _mergeEventWriter;
private readonly ILogger<AdvisoryMergeService> _logger;
public AdvisoryMergeService(
AliasGraphResolver aliasResolver,
IAdvisoryStore advisoryStore,
AdvisoryPrecedenceMerger precedenceMerger,
MergeEventWriter mergeEventWriter,
ILogger<AdvisoryMergeService> logger)
{
_aliasResolver = aliasResolver ?? throw new ArgumentNullException(nameof(aliasResolver));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_precedenceMerger = precedenceMerger ?? throw new ArgumentNullException(nameof(precedenceMerger));
_mergeEventWriter = mergeEventWriter ?? throw new ArgumentNullException(nameof(mergeEventWriter));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<AdvisoryMergeResult> MergeAsync(string seedAdvisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrWhiteSpace(seedAdvisoryKey);
var component = await _aliasResolver.BuildComponentAsync(seedAdvisoryKey, cancellationToken).ConfigureAwait(false);
var inputs = new List<Advisory>();
foreach (var advisoryKey in component.AdvisoryKeys)
{
cancellationToken.ThrowIfCancellationRequested();
var advisory = await _advisoryStore.FindAsync(advisoryKey, cancellationToken).ConfigureAwait(false);
if (advisory is not null)
{
inputs.Add(advisory);
}
}
if (inputs.Count == 0)
{
_logger.LogWarning("Alias component seeded by {Seed} contains no persisted advisories", seedAdvisoryKey);
return AdvisoryMergeResult.Empty(seedAdvisoryKey, component);
}
var canonicalKey = SelectCanonicalKey(component) ?? seedAdvisoryKey;
var before = await _advisoryStore.FindAsync(canonicalKey, cancellationToken).ConfigureAwait(false);
var normalizedInputs = NormalizeInputs(inputs, canonicalKey);
Advisory? merged;
try
{
merged = _precedenceMerger.Merge(normalizedInputs);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to merge alias component seeded by {Seed}", seedAdvisoryKey);
throw;
}
if (component.Collisions.Count > 0)
{
foreach (var collision in component.Collisions)
{
var tags = new KeyValuePair<string, object?>[]
{
new("scheme", collision.Scheme ?? string.Empty),
new("alias_value", collision.Value ?? string.Empty),
new("advisory_count", collision.AdvisoryKeys.Count),
};
AliasCollisionCounter.Add(1, tags);
_logger.LogInformation(
"Alias collision {Scheme}:{Value} involves advisories {Advisories}",
collision.Scheme,
collision.Value,
string.Join(", ", collision.AdvisoryKeys));
}
}
if (merged is not null)
{
await _advisoryStore.UpsertAsync(merged, cancellationToken).ConfigureAwait(false);
await _mergeEventWriter.AppendAsync(
canonicalKey,
before,
merged,
Array.Empty<Guid>(),
cancellationToken).ConfigureAwait(false);
}
return new AdvisoryMergeResult(seedAdvisoryKey, canonicalKey, component, inputs, before, merged);
}
private static IEnumerable<Advisory> NormalizeInputs(IEnumerable<Advisory> advisories, string canonicalKey)
{
foreach (var advisory in advisories)
{
yield return CloneWithKey(advisory, canonicalKey);
}
}
private static Advisory CloneWithKey(Advisory source, string advisoryKey)
=> new(
advisoryKey,
source.Title,
source.Summary,
source.Language,
source.Published,
source.Modified,
source.Severity,
source.ExploitKnown,
source.Aliases,
source.References,
source.AffectedPackages,
source.CvssMetrics,
source.Provenance);
private static string? SelectCanonicalKey(AliasComponent component)
{
foreach (var scheme in PreferredAliasSchemes)
{
var alias = component.AliasMap.Values
.SelectMany(static aliases => aliases)
.FirstOrDefault(record => string.Equals(record.Scheme, scheme, StringComparison.OrdinalIgnoreCase));
if (!string.IsNullOrWhiteSpace(alias?.Value))
{
return alias.Value;
}
}
if (component.AliasMap.TryGetValue(component.SeedAdvisoryKey, out var seedAliases))
{
var primary = seedAliases.FirstOrDefault(record => string.Equals(record.Scheme, AliasStoreConstants.PrimaryScheme, StringComparison.OrdinalIgnoreCase));
if (!string.IsNullOrWhiteSpace(primary?.Value))
{
return primary.Value;
}
}
var firstAlias = component.AliasMap.Values.SelectMany(static aliases => aliases).FirstOrDefault();
if (!string.IsNullOrWhiteSpace(firstAlias?.Value))
{
return firstAlias.Value;
}
return component.SeedAdvisoryKey;
}
}
public sealed record AdvisoryMergeResult(
string SeedAdvisoryKey,
string CanonicalAdvisoryKey,
AliasComponent Component,
IReadOnlyList<Advisory> Inputs,
Advisory? Previous,
Advisory? Merged)
{
public static AdvisoryMergeResult Empty(string seed, AliasComponent component)
=> new(seed, seed, component, Array.Empty<Advisory>(), null, null);
}
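`SelectCanonicalKey` above walks a fixed scheme preference order (CVE, GHSA, OSV, MSRC), then falls back to the seed advisory's primary alias, then to any alias at all, and finally to the seed key itself. A hedged Python sketch of that cascade (the `PRIMARY` scheme name stands in for `AliasStoreConstants.PrimaryScheme`, whose actual value is not shown in the diff):

```python
PREFERRED_SCHEMES = ("CVE", "GHSA", "OSV", "MSRC")

def select_canonical_key(seed, alias_map, primary_scheme="PRIMARY"):
    """Sketch of AdvisoryMergeService.SelectCanonicalKey.

    alias_map: advisory_key -> list of (scheme, value) pairs.
    """
    all_aliases = [a for aliases in alias_map.values() for a in aliases]
    # 1. First alias carrying a preferred, globally recognized scheme.
    for scheme in PREFERRED_SCHEMES:
        for alias_scheme, value in all_aliases:
            if alias_scheme.upper() == scheme and value:
                return value
    # 2. The seed advisory's own primary alias.
    for alias_scheme, value in alias_map.get(seed, []):
        if alias_scheme.upper() == primary_scheme and value:
            return value
    # 3. Any alias at all, else the seed key itself.
    return next((value for _, value in all_aliases if value), seed)
```

So a component containing both an OSV record and a CVE record canonicalizes on the CVE identifier, which is what keeps merged advisory keys stable across connectors.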

View File

@@ -1,6 +1,7 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using System.Globalization;
using System.Linq;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
@@ -14,36 +15,42 @@ namespace StellaOps.Feedser.Merge.Services;
/// </summary>
public sealed class AdvisoryPrecedenceMerger
{
private static readonly IReadOnlyDictionary<string, int> DefaultPrecedence = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase)
{
["redhat"] = 0,
["ubuntu"] = 0,
["debian"] = 0,
["suse"] = 0,
["msrc"] = 1,
["oracle"] = 1,
["adobe"] = 1,
["chromium"] = 1,
["vendor"] = 1,
["jvn"] = 2,
["certfr"] = 2,
["certin"] = 2,
["ics-kaspersky"] = 2,
["kev"] = 6,
["nvd"] = 5,
};
private static readonly Meter MergeMeter = new("StellaOps.Feedser.Merge");
private static readonly Counter<long> MergeCounter = MergeMeter.CreateCounter<long>(
"feedser.merge.operations",
unit: "count",
description: "Number of merge invocations executed by the precedence engine.");
private static readonly Counter<long> OverridesCounter = MergeMeter.CreateCounter<long>(
"feedser.merge.overrides",
unit: "count",
description: "Number of times lower-precedence advisories were overridden by higher-precedence sources.");
private static readonly Counter<long> RangeOverrideCounter = MergeMeter.CreateCounter<long>(
"feedser.merge.range_overrides",
unit: "count",
description: "Number of affected-package range overrides performed during precedence merge.");
private static readonly Counter<long> ConflictCounter = MergeMeter.CreateCounter<long>(
"feedser.merge.conflicts",
unit: "count",
description: "Number of precedence conflicts detected (severity, rank ties, etc.).");
private static readonly Action<ILogger, MergeOverrideAudit, Exception?> OverrideLogged = LoggerMessage.Define<MergeOverrideAudit>(
LogLevel.Information,
new EventId(1000, "AdvisoryOverride"),
"Advisory precedence override {@Override}");
private static readonly Action<ILogger, PackageOverrideAudit, Exception?> RangeOverrideLogged = LoggerMessage.Define<PackageOverrideAudit>(
LogLevel.Information,
new EventId(1001, "PackageRangeOverride"),
"Affected package precedence override {@Override}");
private static readonly Action<ILogger, MergeFieldConflictAudit, Exception?> ConflictLogged = LoggerMessage.Define<MergeFieldConflictAudit>(
LogLevel.Information,
new EventId(1002, "PrecedenceConflict"),
"Precedence conflict {@Conflict}");
private readonly AffectedPackagePrecedenceResolver _packageResolver;
private readonly IReadOnlyDictionary<string, int> _precedence;
private readonly int _fallbackRank;
@@ -51,12 +58,12 @@ public sealed class AdvisoryPrecedenceMerger
private readonly ILogger<AdvisoryPrecedenceMerger> _logger;
public AdvisoryPrecedenceMerger()
: this(new AffectedPackagePrecedenceResolver(), TimeProvider.System)
{
}
public AdvisoryPrecedenceMerger(AffectedPackagePrecedenceResolver packageResolver, System.TimeProvider? timeProvider = null)
: this(packageResolver, packageResolver?.Precedence ?? AdvisoryPrecedenceDefaults.Rankings, timeProvider ?? TimeProvider.System, NullLogger<AdvisoryPrecedenceMerger>.Instance)
{
}
@@ -119,6 +126,8 @@ public sealed class AdvisoryPrecedenceMerger
.ThenByDescending(entry => entry.Advisory.Provenance.Length)
.ToArray();
MergeCounter.Add(1, new KeyValuePair<string, object?>("inputs", list.Count));
var primary = ordered[0].Advisory;
var title = PickString(ordered, advisory => advisory.Title) ?? advisoryKey;
@@ -137,7 +146,8 @@ public sealed class AdvisoryPrecedenceMerger
.Distinct()
.ToArray();
var packageResult = _packageResolver.Merge(ordered.SelectMany(entry => entry.Advisory.AffectedPackages));
var affectedPackages = packageResult.Packages;
var cvssMetrics = ordered
.SelectMany(entry => entry.Advisory.CvssMetrics)
.Distinct()
@@ -168,6 +178,8 @@ public sealed class AdvisoryPrecedenceMerger
var exploitKnown = ordered.Any(entry => entry.Advisory.ExploitKnown);
LogOverrides(advisoryKey, ordered);
LogPackageOverrides(advisoryKey, packageResult.Overrides);
RecordFieldConflicts(advisoryKey, ordered);
return new Advisory(
advisoryKey,
@@ -275,6 +287,125 @@ public sealed class AdvisoryPrecedenceMerger
}
}
private void LogPackageOverrides(string advisoryKey, IReadOnlyList<AffectedPackageOverride> overrides)
{
if (overrides.Count == 0)
{
return;
}
foreach (var record in overrides)
{
var tags = new KeyValuePair<string, object?>[]
{
new("advisory_key", advisoryKey),
new("package_type", record.Type),
new("primary_source", FormatSourceLabel(record.PrimarySources)),
new("suppressed_source", FormatSourceLabel(record.SuppressedSources)),
new("primary_rank", record.PrimaryRank),
new("suppressed_rank", record.SuppressedRank),
new("primary_range_count", record.PrimaryRangeCount),
new("suppressed_range_count", record.SuppressedRangeCount),
};
RangeOverrideCounter.Add(1, tags);
var audit = new PackageOverrideAudit(
advisoryKey,
record.Type,
record.Identifier,
record.Platform,
record.PrimaryRank,
record.SuppressedRank,
record.PrimarySources,
record.SuppressedSources,
record.PrimaryRangeCount,
record.SuppressedRangeCount);
RangeOverrideLogged(_logger, audit, null);
}
}
private void RecordFieldConflicts(string advisoryKey, IReadOnlyList<AdvisoryEntry> ordered)
{
if (ordered.Count <= 1)
{
return;
}
var primary = ordered[0];
var primarySeverity = NormalizeSeverity(primary.Advisory.Severity);
for (var i = 1; i < ordered.Count; i++)
{
var candidate = ordered[i];
var candidateSeverity = NormalizeSeverity(candidate.Advisory.Severity);
if (!string.IsNullOrEmpty(candidateSeverity))
{
var reason = string.IsNullOrEmpty(primarySeverity) ? "primary_missing" : "mismatch";
if (string.IsNullOrEmpty(primarySeverity) || !string.Equals(primarySeverity, candidateSeverity, StringComparison.OrdinalIgnoreCase))
{
RecordConflict(
advisoryKey,
"severity",
reason,
primary,
candidate,
primarySeverity ?? "(none)",
candidateSeverity);
}
}
if (candidate.Rank == primary.Rank)
{
RecordConflict(
advisoryKey,
"precedence_tie",
"equal_rank",
primary,
candidate,
primary.Rank.ToString(CultureInfo.InvariantCulture),
candidate.Rank.ToString(CultureInfo.InvariantCulture));
}
}
}
private void RecordConflict(
string advisoryKey,
string conflictType,
string reason,
AdvisoryEntry primary,
AdvisoryEntry suppressed,
string? primaryValue,
string? suppressedValue)
{
var tags = new KeyValuePair<string, object?>[]
{
new("type", conflictType),
new("reason", reason),
new("primary_source", FormatSourceLabel(primary.Sources)),
new("suppressed_source", FormatSourceLabel(suppressed.Sources)),
new("primary_rank", primary.Rank),
new("suppressed_rank", suppressed.Rank),
};
ConflictCounter.Add(1, tags);
var audit = new MergeFieldConflictAudit(
advisoryKey,
conflictType,
reason,
primary.Sources,
primary.Rank,
suppressed.Sources,
suppressed.Rank,
primaryValue,
suppressedValue);
ConflictLogged(_logger, audit, null);
}
private readonly record struct AdvisoryEntry(Advisory Advisory, int Rank)
{
public IReadOnlyCollection<string> Sources { get; } = Advisory.Provenance
@@ -284,12 +415,15 @@ public sealed class AdvisoryPrecedenceMerger
.ToArray();
}
private static string? NormalizeSeverity(string? severity)
=> SeverityNormalization.Normalize(severity);
private static AffectedPackagePrecedenceResolver EnsureResolver(
AffectedPackagePrecedenceResolver? resolver,
AdvisoryPrecedenceOptions? options,
out IReadOnlyDictionary<string, int> precedence)
{
precedence = AdvisoryPrecedenceTable.Merge(AdvisoryPrecedenceDefaults.Rankings, options);
if (resolver is null)
{
@@ -354,4 +488,27 @@ public sealed class AdvisoryPrecedenceMerger
int SuppressedAliasCount,
int PrimaryProvenanceCount,
int SuppressedProvenanceCount);
private readonly record struct PackageOverrideAudit(
string AdvisoryKey,
string PackageType,
string Identifier,
string? Platform,
int PrimaryRank,
int SuppressedRank,
IReadOnlyCollection<string> PrimarySources,
IReadOnlyCollection<string> SuppressedSources,
int PrimaryRangeCount,
int SuppressedRangeCount);
private readonly record struct MergeFieldConflictAudit(
string AdvisoryKey,
string ConflictType,
string Reason,
IReadOnlyCollection<string> PrimarySources,
int PrimaryRank,
IReadOnlyCollection<string> SuppressedSources,
int SuppressedRank,
string? PrimaryValue,
string? SuppressedValue);
}


@@ -12,30 +12,16 @@ namespace StellaOps.Feedser.Merge.Services;
/// </summary>
public sealed class AffectedPackagePrecedenceResolver
{
private readonly IReadOnlyDictionary<string, int> _precedence;
private readonly int _fallbackRank;
public AffectedPackagePrecedenceResolver()
: this(AdvisoryPrecedenceDefaults.Rankings)
{
}
public AffectedPackagePrecedenceResolver(AdvisoryPrecedenceOptions? options)
: this(AdvisoryPrecedenceTable.Merge(AdvisoryPrecedenceDefaults.Rankings, options))
{
}
@@ -47,7 +33,7 @@ public sealed class AffectedPackagePrecedenceResolver
public IReadOnlyDictionary<string, int> Precedence => _precedence;
public AffectedPackagePrecedenceResult Merge(IEnumerable<AffectedPackage> packages)
{
ArgumentNullException.ThrowIfNull(packages);
@@ -56,41 +42,66 @@ public sealed class AffectedPackagePrecedenceResolver
.GroupBy(pkg => (pkg.Type, pkg.Identifier, pkg.Platform ?? string.Empty));
var resolved = new List<AffectedPackage>();
var overrides = new List<AffectedPackageOverride>();
foreach (var group in grouped)
{
var ordered = group
.Select(pkg => new PackageEntry(pkg, GetPrecedence(pkg)))
.OrderBy(static entry => entry.Rank)
.ThenByDescending(static entry => entry.Package.Provenance.Length)
.ThenByDescending(static entry => entry.Package.VersionRanges.Length)
.ToList();
var primary = ordered[0];
var provenance = ordered
.SelectMany(static entry => entry.Package.Provenance)
.Where(static p => p is not null)
.Distinct()
.ToImmutableArray();
var statuses = ordered
.SelectMany(static entry => entry.Package.Statuses)
.Distinct(AffectedPackageStatusEqualityComparer.Instance)
.ToImmutableArray();
foreach (var candidate in ordered.Skip(1))
{
if (candidate.Package.VersionRanges.Length == 0)
{
continue;
}
overrides.Add(new AffectedPackageOverride(
primary.Package.Type,
primary.Package.Identifier,
string.IsNullOrWhiteSpace(primary.Package.Platform) ? null : primary.Package.Platform,
primary.Rank,
candidate.Rank,
ExtractSources(primary.Package),
ExtractSources(candidate.Package),
primary.Package.VersionRanges.Length,
candidate.Package.VersionRanges.Length));
}
var merged = new AffectedPackage(
primary.Type,
primary.Identifier,
string.IsNullOrWhiteSpace(primary.Platform) ? null : primary.Platform,
primary.Package.VersionRanges,
statuses,
provenance);
resolved.Add(merged);
}
var packagesResult = resolved
.OrderBy(static pkg => pkg.Type, StringComparer.Ordinal)
.ThenBy(static pkg => pkg.Identifier, StringComparer.Ordinal)
.ThenBy(static pkg => pkg.Platform, StringComparer.Ordinal)
.ToImmutableArray();
return new AffectedPackagePrecedenceResult(packagesResult, overrides.ToImmutableArray());
}
private int GetPrecedence(AffectedPackage package)
@@ -111,4 +122,42 @@ public sealed class AffectedPackagePrecedenceResolver
return bestRank;
}
private static IReadOnlyList<string> ExtractSources(AffectedPackage package)
{
if (package.Provenance.Length == 0)
{
return Array.Empty<string>();
}
return package.Provenance
.Select(static p => p.Source)
.Where(static source => !string.IsNullOrWhiteSpace(source))
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToImmutableArray();
}
private readonly record struct PackageEntry(AffectedPackage Package, int Rank)
{
public string Type => Package.Type;
public string Identifier => Package.Identifier;
public string? Platform => string.IsNullOrWhiteSpace(Package.Platform) ? null : Package.Platform;
}
}
public sealed record AffectedPackagePrecedenceResult(
IReadOnlyList<AffectedPackage> Packages,
IReadOnlyList<AffectedPackageOverride> Overrides);
public sealed record AffectedPackageOverride(
string Type,
string Identifier,
string? Platform,
int PrimaryRank,
int SuppressedRank,
IReadOnlyList<string> PrimarySources,
IReadOnlyList<string> SuppressedSources,
int PrimaryRangeCount,
int SuppressedRangeCount);


@@ -0,0 +1,139 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Feedser.Storage.Mongo.Aliases;
namespace StellaOps.Feedser.Merge.Services;
public sealed class AliasGraphResolver
{
private readonly IAliasStore _aliasStore;
public AliasGraphResolver(IAliasStore aliasStore)
{
_aliasStore = aliasStore ?? throw new ArgumentNullException(nameof(aliasStore));
}
public async Task<AliasIdentityResult> ResolveAsync(string advisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
var aliases = await _aliasStore.GetByAdvisoryAsync(advisoryKey, cancellationToken).ConfigureAwait(false);
var collisions = new List<AliasCollision>();
foreach (var alias in aliases)
{
var candidates = await _aliasStore.GetByAliasAsync(alias.Scheme, alias.Value, cancellationToken).ConfigureAwait(false);
var advisoryKeys = candidates
.Select(static candidate => candidate.AdvisoryKey)
.Where(static key => !string.IsNullOrWhiteSpace(key))
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
if (advisoryKeys.Length <= 1)
{
continue;
}
collisions.Add(new AliasCollision(alias.Scheme, alias.Value, advisoryKeys));
}
var unique = new Dictionary<string, AliasCollision>(StringComparer.Ordinal);
foreach (var collision in collisions)
{
var key = $"{collision.Scheme}\u0001{collision.Value}";
if (!unique.ContainsKey(key))
{
unique[key] = collision;
}
}
var distinctCollisions = unique.Values.ToArray();
return new AliasIdentityResult(advisoryKey, aliases, distinctCollisions);
}
public async Task<AliasComponent> BuildComponentAsync(string advisoryKey, CancellationToken cancellationToken)
{
ArgumentException.ThrowIfNullOrEmpty(advisoryKey);
var visited = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var queue = new Queue<string>();
var collisionMap = new Dictionary<string, AliasCollision>(StringComparer.Ordinal);
var aliasCache = new Dictionary<string, IReadOnlyList<AliasRecord>>(StringComparer.OrdinalIgnoreCase);
queue.Enqueue(advisoryKey);
while (queue.Count > 0)
{
cancellationToken.ThrowIfCancellationRequested();
var current = queue.Dequeue();
if (!visited.Add(current))
{
continue;
}
var aliases = await GetAliasesAsync(current, cancellationToken, aliasCache).ConfigureAwait(false);
aliasCache[current] = aliases;
foreach (var alias in aliases)
{
var aliasRecords = await GetAdvisoriesForAliasAsync(alias.Scheme, alias.Value, cancellationToken).ConfigureAwait(false);
var advisoryKeys = aliasRecords
.Select(static record => record.AdvisoryKey)
.Where(static key => !string.IsNullOrWhiteSpace(key))
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
if (advisoryKeys.Length <= 1)
{
continue;
}
foreach (var candidate in advisoryKeys)
{
if (!visited.Contains(candidate))
{
queue.Enqueue(candidate);
}
}
var collision = new AliasCollision(alias.Scheme, alias.Value, advisoryKeys);
var key = $"{collision.Scheme}\u0001{collision.Value}";
collisionMap.TryAdd(key, collision);
}
}
var aliasMap = new Dictionary<string, IReadOnlyList<AliasRecord>>(aliasCache, StringComparer.OrdinalIgnoreCase);
return new AliasComponent(advisoryKey, visited.ToArray(), collisionMap.Values.ToArray(), aliasMap);
}
private async Task<IReadOnlyList<AliasRecord>> GetAliasesAsync(
string advisoryKey,
CancellationToken cancellationToken,
IDictionary<string, IReadOnlyList<AliasRecord>> cache)
{
if (cache.TryGetValue(advisoryKey, out var cached))
{
return cached;
}
var aliases = await _aliasStore.GetByAdvisoryAsync(advisoryKey, cancellationToken).ConfigureAwait(false);
cache[advisoryKey] = aliases;
return aliases;
}
private Task<IReadOnlyList<AliasRecord>> GetAdvisoriesForAliasAsync(
string scheme,
string value,
CancellationToken cancellationToken)
=> _aliasStore.GetByAliasAsync(scheme, value, cancellationToken);
}
public sealed record AliasIdentityResult(string AdvisoryKey, IReadOnlyList<AliasRecord> Aliases, IReadOnlyList<AliasCollision> Collisions);
public sealed record AliasComponent(
string SeedAdvisoryKey,
IReadOnlyList<string> AdvisoryKeys,
IReadOnlyList<AliasCollision> Collisions,
IReadOnlyDictionary<string, IReadOnlyList<AliasRecord>> AliasMap);


@@ -8,6 +8,7 @@
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Configuration.Binder" Version="8.0.0" />
<PackageReference Include="Semver" Version="2.3.0" />
<ProjectReference Include="../StellaOps.Feedser.Models/StellaOps.Feedser.Models.csproj" />
<ProjectReference Include="../StellaOps.Feedser.Normalization/StellaOps.Feedser.Normalization.csproj" />


@@ -2,12 +2,12 @@
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|Identity graph and alias resolver|BE-Merge|Models, Storage.Mongo|DONE `AdvisoryIdentityResolver` builds alias-driven clusters with canonical key selection + unit coverage.|
|Precedence policy engine|BE-Merge|Architecture|**DONE** precedence defaults enforced by `AdvisoryPrecedenceMerger`/`AdvisoryPrecedenceDefaults` with distro/PSIRT overriding registry feeds and CERT/KEV enrichers.|
|NEVRA comparer plus tests|BE-Merge (Distro WG)|Source.Distro fixtures|DONE Added Nevra parser/comparer with tilde-aware rpm ordering and unit coverage.|
|Debian EVR comparer plus tests|BE-Merge (Distro WG)|Debian fixtures|DONE DebianEvr comparer mirrors dpkg ordering with tilde/epoch handling and unit coverage.|
|SemVer range resolver plus tests|BE-Merge (OSS WG)|OSV/GHSA fixtures|DONE SemanticVersionRangeResolver covers introduced/fixed/lastAffected semantics with SemVer ordering tests.|
|Canonical hash and merge_event writer|BE-Merge|Models, Storage.Mongo|DONE Hash calculator + MergeEventWriter compute canonical SHA-256 digests and persist merge events.|
|Conflict detection and metrics|BE-Merge|Core|**DONE** merge meters emit override/conflict counters and structured audits (`AdvisoryPrecedenceMerger`).|
|End-to-end determinism test|QA|Merge, key connectors|**DONE** `MergePrecedenceIntegrationTests.MergePipeline_IsDeterministicAcrossRuns` guards determinism.|
|Override audit logging|BE-Merge|Observability|DONE override audits now emit structured logs plus bounded-tag metrics suitable for prod telemetry.|
|Configurable precedence table|BE-Merge|Architecture|DONE precedence options bind via feedser:merge:precedence:ranks with docs/tests covering operator workflow.|
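The configurable precedence row above binds operator overrides from the `feedser:merge:precedence:ranks` configuration path. A hedged appsettings-style sketch (the binding path is from the task note; the specific source names and rank values are illustrative, taken from the default table):

```json
{
  "feedser": {
    "merge": {
      "precedence": {
        "ranks": {
          "vendor": 0,
          "nvd": 7
        }
      }
    }
  }
}
```

Lower ranks win during merge, so the sketch above would promote generic vendor feeds to distro-level precedence while demoting NVD further.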


@@ -13,7 +13,8 @@ public sealed class CanonicalExamplesTests
public void CanonicalExamplesMatchGoldenSnapshots()
{
Directory.CreateDirectory(FixtureRoot);
var envValue = Environment.GetEnvironmentVariable(UpdateEnvVar);
var updateGoldens = string.Equals(envValue, "1", StringComparison.OrdinalIgnoreCase);
var failures = new List<string>();
foreach (var (name, advisory) in CanonicalExampleFactory.GetExamples())
@@ -36,6 +37,8 @@ public sealed class CanonicalExamplesTests
var expected = File.ReadAllText(fixturePath).Replace("\r\n", "\n");
if (!string.Equals(expected, snapshot, StringComparison.Ordinal))
{
var actualPath = Path.Combine(FixtureRoot, $"{name}.actual.json");
File.WriteAllText(actualPath, snapshot);
failures.Add($"Fixture mismatch for {name}. Set {UpdateEnvVar}=1 to regenerate.");
}
}


@@ -1,4 +1,5 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using StellaOps.Feedser.Models;
@@ -62,4 +63,90 @@ public sealed class CanonicalJsonSerializerTests
var normalized2 = snap2.Replace("\r\n", "\n");
Assert.Equal(normalized1, normalized2);
}
[Fact]
public void SerializesRangePrimitivesPayload()
{
var recordedAt = new DateTimeOffset(2025, 2, 1, 0, 0, 0, TimeSpan.Zero);
var provenance = new AdvisoryProvenance("connector-x", "map", "segment-1", recordedAt);
var primitives = new RangePrimitives(
new SemVerPrimitive(
Introduced: "2.0.0",
IntroducedInclusive: true,
Fixed: "2.3.4",
FixedInclusive: false,
LastAffected: "2.3.3",
LastAffectedInclusive: true,
ConstraintExpression: ">=2.0.0 <2.3.4"),
new NevraPrimitive(
Introduced: new NevraComponent("pkg", 0, "2.0.0", "1", "x86_64"),
Fixed: null,
LastAffected: new NevraComponent("pkg", 0, "2.3.3", "3", "x86_64")),
new EvrPrimitive(
Introduced: new EvrComponent(1, "2.0.0", "1"),
Fixed: new EvrComponent(1, "2.3.4", null),
LastAffected: null),
new Dictionary<string, string>(StringComparer.Ordinal)
{
["channel"] = "stable",
});
var range = new AffectedVersionRange(
rangeKind: "semver",
introducedVersion: "2.0.0",
fixedVersion: "2.3.4",
lastAffectedVersion: "2.3.3",
rangeExpression: ">=2.0.0 <2.3.4",
provenance,
primitives);
var package = new AffectedPackage(
type: "semver",
identifier: "pkg@2.x",
platform: "linux",
versionRanges: new[] { range },
statuses: Array.Empty<AffectedPackageStatus>(),
provenance: new[] { provenance });
var advisory = new Advisory(
advisoryKey: "TEST-PRIM",
title: "Range primitive serialization",
summary: null,
language: null,
published: recordedAt,
modified: recordedAt,
severity: null,
exploitKnown: false,
aliases: Array.Empty<string>(),
references: Array.Empty<AdvisoryReference>(),
affectedPackages: new[] { package },
cvssMetrics: Array.Empty<CvssMetric>(),
provenance: new[] { provenance });
var json = CanonicalJsonSerializer.Serialize(advisory);
using var document = JsonDocument.Parse(json);
var rangeElement = document.RootElement
.GetProperty("affectedPackages")[0]
.GetProperty("versionRanges")[0];
Assert.True(rangeElement.TryGetProperty("primitives", out var primitivesElement));
var semver = primitivesElement.GetProperty("semVer");
Assert.Equal("2.0.0", semver.GetProperty("introduced").GetString());
Assert.True(semver.GetProperty("introducedInclusive").GetBoolean());
Assert.Equal("2.3.4", semver.GetProperty("fixed").GetString());
Assert.False(semver.GetProperty("fixedInclusive").GetBoolean());
Assert.Equal("2.3.3", semver.GetProperty("lastAffected").GetString());
var nevra = primitivesElement.GetProperty("nevra");
Assert.Equal("pkg", nevra.GetProperty("introduced").GetProperty("name").GetString());
Assert.Equal(0, nevra.GetProperty("introduced").GetProperty("epoch").GetInt32());
var evr = primitivesElement.GetProperty("evr");
Assert.Equal(1, evr.GetProperty("introduced").GetProperty("epoch").GetInt32());
Assert.Equal("2.3.4", evr.GetProperty("fixed").GetProperty("upstreamVersion").GetString());
var extensions = primitivesElement.GetProperty("vendorExtensions");
Assert.Equal("stable", extensions.GetProperty("channel").GetString());
}
}


@@ -19,6 +19,7 @@
"fixedVersion": "2.5.1",
"introducedVersion": null,
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"kind": "map",
"recordedAt": "2024-03-05T10:00:00+00:00",
@@ -32,6 +33,7 @@
"fixedVersion": "3.2.4",
"introducedVersion": "3.0.0",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"kind": "map",
"recordedAt": "2024-03-05T10:00:00+00:00",


@@ -29,6 +29,7 @@
"fixedVersion": "1.0.5",
"introducedVersion": "1.0",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"kind": "map",
"recordedAt": "2024-08-01T12:00:00+00:00",


@@ -35,6 +35,7 @@
"fixedVersion": null,
"introducedVersion": "0:4.18.0-553.el8",
"lastAffectedVersion": null,
"primitives": null,
"provenance": {
"kind": "map",
"recordedAt": "2024-05-11T09:00:00+00:00",


@@ -0,0 +1,178 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using System.Linq;
using System.Reflection;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Feedser.Models;
using Xunit;
namespace StellaOps.Feedser.Models.Tests;
public sealed class ProvenanceDiagnosticsTests
{
[Fact]
public void RecordMissing_AddsExpectedTagsAndDeduplicates()
{
ResetState();
var measurements = new List<(string Instrument, long Value, IReadOnlyDictionary<string, object?> Tags)>();
using var listener = CreateListener(measurements);
var baseline = DateTimeOffset.UtcNow;
ProvenanceDiagnostics.RecordMissing("source-A", "range:pkg", baseline);
ProvenanceDiagnostics.RecordMissing("source-A", "range:pkg", baseline.AddMinutes(5));
ProvenanceDiagnostics.RecordMissing("source-A", "reference:https://example", baseline.AddMinutes(10));
listener.Dispose();
Assert.Equal(2, measurements.Count);
var first = measurements[0];
Assert.Equal(1, first.Value);
Assert.Equal("feedser.provenance.missing", first.Instrument);
Assert.Equal("source-A", first.Tags["source"]);
Assert.Equal("range:pkg", first.Tags["component"]);
Assert.Equal("range", first.Tags["category"]);
Assert.Equal("high", first.Tags["severity"]);
var second = measurements[1];
Assert.Equal("feedser.provenance.missing", second.Instrument);
Assert.Equal("reference", second.Tags["category"]);
Assert.Equal("low", second.Tags["severity"]);
}
[Fact]
public void ReportResumeWindow_ClearsTrackedEntries_WhenWindowBackfills()
{
ResetState();
var timestamp = DateTimeOffset.UtcNow;
ProvenanceDiagnostics.RecordMissing("source-B", "package:lib", timestamp);
var (recorded, earliest, syncRoot) = GetInternalState();
lock (syncRoot)
{
Assert.True(earliest.ContainsKey("source-B"));
Assert.Contains(recorded, entry => entry.StartsWith("source-B|", StringComparison.OrdinalIgnoreCase));
}
ProvenanceDiagnostics.ReportResumeWindow("source-B", timestamp.AddMinutes(-5), NullLogger.Instance);
lock (syncRoot)
{
Assert.False(earliest.ContainsKey("source-B"));
Assert.DoesNotContain(recorded, entry => entry.StartsWith("source-B|", StringComparison.OrdinalIgnoreCase));
}
}
[Fact]
public void ReportResumeWindow_RetainsEntries_WhenWindowTooRecent()
{
ResetState();
var timestamp = DateTimeOffset.UtcNow;
ProvenanceDiagnostics.RecordMissing("source-C", "range:pkg", timestamp);
ProvenanceDiagnostics.ReportResumeWindow("source-C", timestamp.AddMinutes(1), NullLogger.Instance);
var (recorded, earliest, syncRoot) = GetInternalState();
lock (syncRoot)
{
Assert.True(earliest.ContainsKey("source-C"));
Assert.Contains(recorded, entry => entry.StartsWith("source-C|", StringComparison.OrdinalIgnoreCase));
}
}
[Fact]
public void RecordRangePrimitive_EmitsCoverageMetric()
{
var range = new AffectedVersionRange(
rangeKind: "evr",
introducedVersion: "1:1.1.1n-0+deb11u2",
fixedVersion: null,
lastAffectedVersion: null,
rangeExpression: null,
provenance: new AdvisoryProvenance("source-D", "range", "pkg", DateTimeOffset.UtcNow),
primitives: new RangePrimitives(
SemVer: null,
Nevra: null,
Evr: new EvrPrimitive(
new EvrComponent(1, "1.1.1n", "0+deb11u2"),
null,
null),
VendorExtensions: new Dictionary<string, string> { ["debian.release"] = "bullseye" }));
var measurements = new List<(string Instrument, long Value, IReadOnlyDictionary<string, object?> Tags)>();
using var listener = CreateListener(measurements, "feedser.range.primitives");
ProvenanceDiagnostics.RecordRangePrimitive("source-D", range);
listener.Dispose();
var record = Assert.Single(measurements);
Assert.Equal("feedser.range.primitives", record.Instrument);
Assert.Equal(1, record.Value);
Assert.Equal("source-D", record.Tags["source"]);
Assert.Equal("evr", record.Tags["rangeKind"]);
Assert.Equal("evr", record.Tags["primitiveKinds"]);
Assert.Equal("true", record.Tags["hasVendorExtensions"]);
}
private static MeterListener CreateListener(
List<(string Instrument, long Value, IReadOnlyDictionary<string, object?> Tags)> measurements,
params string[] instrumentNames)
{
var allowed = instrumentNames is { Length: > 0 } ? instrumentNames : new[] { "feedser.provenance.missing" };
var allowedSet = new HashSet<string>(allowed, StringComparer.OrdinalIgnoreCase);
var listener = new MeterListener
{
InstrumentPublished = (instrument, l) =>
{
if (instrument.Meter.Name == "StellaOps.Feedser.Models.Provenance" && allowedSet.Contains(instrument.Name))
{
l.EnableMeasurementEvents(instrument);
}
}
};
listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
{
var dict = new Dictionary<string, object?>(StringComparer.OrdinalIgnoreCase);
foreach (var tag in tags)
{
dict[tag.Key] = tag.Value;
}
measurements.Add((instrument.Name, measurement, dict));
});
listener.Start();
return listener;
}
private static void ResetState()
{
var (_, _, syncRoot) = GetInternalState();
lock (syncRoot)
{
var (recorded, earliest, _) = GetInternalState();
recorded.Clear();
earliest.Clear();
}
}
private static (HashSet<string> Recorded, Dictionary<string, DateTimeOffset> Earliest, object SyncRoot) GetInternalState()
{
var type = typeof(ProvenanceDiagnostics);
var recordedField = type.GetField("RecordedComponents", BindingFlags.NonPublic | BindingFlags.Static) ?? throw new InvalidOperationException("RecordedComponents not found");
var earliestField = type.GetField("EarliestMissing", BindingFlags.NonPublic | BindingFlags.Static) ?? throw new InvalidOperationException("EarliestMissing not found");
var syncField = type.GetField("SyncRoot", BindingFlags.NonPublic | BindingFlags.Static) ?? throw new InvalidOperationException("SyncRoot not found");
var recorded = (HashSet<string>)recordedField.GetValue(null)!;
var earliest = (Dictionary<string, DateTimeOffset>)earliestField.GetValue(null)!;
var sync = syncField.GetValue(null)!;
return (recorded, earliest, sync);
}
}


@@ -14,7 +14,8 @@ public sealed record AffectedVersionRange
string? fixedVersion,
string? lastAffectedVersion,
string? rangeExpression,
AdvisoryProvenance provenance,
RangePrimitives? primitives = null)
{
RangeKind = Validation.EnsureNotNullOrWhiteSpace(rangeKind, nameof(rangeKind)).ToLowerInvariant();
IntroducedVersion = Validation.TrimToNull(introducedVersion);
@@ -22,6 +23,7 @@ public sealed record AffectedVersionRange
LastAffectedVersion = Validation.TrimToNull(lastAffectedVersion);
RangeExpression = Validation.TrimToNull(rangeExpression);
Provenance = provenance ?? AdvisoryProvenance.Empty;
Primitives = primitives;
}
/// <summary>
@@ -51,6 +53,8 @@ public sealed record AffectedVersionRange
public AdvisoryProvenance Provenance { get; }
public RangePrimitives? Primitives { get; }
public string CreateDeterministicKey()
=> string.Join('|', RangeKind, IntroducedVersion ?? string.Empty, FixedVersion ?? string.Empty, LastAffectedVersion ?? string.Empty, RangeExpression ?? string.Empty);
}


@@ -63,6 +63,7 @@ Deterministic ordering: packages sorted by `type`, then `identifier`, then `plat
| `lastAffectedVersion` | string? | optional | Inclusive upper bound when no fix exists. |
| `rangeExpression` | string? | optional | Normalized textual expression for non-simple ranges. |
| `provenance` | AdvisoryProvenance | yes | Provenance entry for the range. |
| `primitives` | RangePrimitives? | optional | Structured metadata (SemVer/Nevra/Evr/vendor extensions) when available. |
Comparers/equality ignore provenance differences.
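The deterministic key joins the range fields with `|`, substituting empty strings for absent bounds. A minimal standalone sketch of that behavior (a local mirror of `AffectedVersionRange.CreateDeterministicKey` for illustration; the real member lives on the model record):

```csharp
using System;

// Hypothetical standalone mirror of CreateDeterministicKey: kind plus the four
// optional fields, '|'-joined, with null bounds collapsing to empty segments.
static string CreateDeterministicKey(string rangeKind, string? introduced, string? fixedVersion, string? lastAffected, string? expression)
    => string.Join('|', rangeKind, introduced ?? string.Empty, fixedVersion ?? string.Empty, lastAffected ?? string.Empty, expression ?? string.Empty);

Console.WriteLine(CreateDeterministicKey("semver", "2.0.0", "2.3.4", null, ">=2.0.0 <2.3.4"));
// semver|2.0.0|2.3.4||>=2.0.0 <2.3.4
```

Because absent fields still emit their separator, keys remain positional and stable regardless of which bounds a connector populates.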


@@ -10,3 +10,5 @@
- **Merge policy**: never discard provenance when merging; instead append a new `AdvisoryProvenance` entry with the merge routine (`source=merge.determine-precedence`).
- **Determinism**: provenance collections are sorted by source → kind → recordedAt before serialization; avoid generating random identifiers inside provenance.
- **Redaction**: keep provenance values free of secrets; prefer tokens or normalized descriptors when referencing authenticated fetches.
- **Range telemetry**: each `AffectedVersionRange` is observed by the `feedser.range.primitives` metric. Emit the richest `RangePrimitives` possible (SemVer/NEVRA/EVR plus vendor extensions); the telemetry tags make it easy to spot connectors missing structured range data.
- **Vendor extensions**: when vendor feeds surface bespoke status flags, capture them in `RangePrimitives.VendorExtensions`. SUSE advisories publish `suse.status` (open/resolved/investigating) and Ubuntu notices expose `ubuntu.pocket`/`ubuntu.release` to distinguish security vs ESM pockets; Adobe APSB bulletins emit `adobe.track`, `adobe.platform`, `adobe.priority`, `adobe.availability`, plus `adobe.affected.raw`/`adobe.updated.raw` to preserve PSIRT metadata while keeping the status catalog canonical. These values are exported for dashboards and alerting.
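The `feedser.range.primitives` counter described above tags each measurement with the primitives a range carried. A standalone sketch of that tag derivation (a hypothetical helper; the real logic lives inside `ProvenanceDiagnostics.RecordRangePrimitive`, and the `"none"` fallback is an assumption, since the test fixtures only cover populated cases):

```csharp
using System;
using System.Collections.Generic;

// Build the primitiveKinds tag value from primitive availability
// (illustration only; mirrors the "evr" tag seen in the unit tests).
static string PrimitiveKinds(bool hasSemVer, bool hasNevra, bool hasEvr)
{
    var kinds = new List<string>();
    if (hasSemVer) kinds.Add("semver");
    if (hasNevra) kinds.Add("nevra");
    if (hasEvr) kinds.Add("evr");
    return kinds.Count == 0 ? "none" : string.Join(",", kinds);
}

Console.WriteLine(PrimitiveKinds(hasSemVer: false, hasNevra: false, hasEvr: true)); // evr
```

Dashboards can then group on this tag (together with `hasVendorExtensions`) to spot connectors that still emit ranges without structured primitives.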


@@ -0,0 +1,253 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.Metrics;
using System.Linq;
using Microsoft.Extensions.Logging;
namespace StellaOps.Feedser.Models;
public static class ProvenanceInspector
{
public static IReadOnlyList<MissingProvenance> FindMissingProvenance(Advisory advisory)
{
var results = new List<MissingProvenance>();
var source = advisory.Provenance.FirstOrDefault()?.Source ?? "unknown";
if (advisory.Provenance.Length == 0)
{
results.Add(new MissingProvenance(source, "advisory", null));
}
foreach (var reference in advisory.References)
{
if (IsMissing(reference.Provenance))
{
results.Add(new MissingProvenance(reference.Provenance.Source ?? source, $"reference:{reference.Url}", reference.Provenance.RecordedAt));
}
}
foreach (var package in advisory.AffectedPackages)
{
if (package.Provenance.Length == 0)
{
results.Add(new MissingProvenance(source, $"package:{package.Identifier}", null));
}
foreach (var range in package.VersionRanges)
{
ProvenanceDiagnostics.RecordRangePrimitive(range.Provenance.Source ?? source, range);
if (IsMissing(range.Provenance))
{
results.Add(new MissingProvenance(range.Provenance.Source ?? source, $"range:{package.Identifier}", range.Provenance.RecordedAt));
}
}
foreach (var status in package.Statuses)
{
if (IsMissing(status.Provenance))
{
results.Add(new MissingProvenance(status.Provenance.Source ?? source, $"status:{package.Identifier}:{status.Status}", status.Provenance.RecordedAt));
}
}
}
foreach (var metric in advisory.CvssMetrics)
{
if (IsMissing(metric.Provenance))
{
results.Add(new MissingProvenance(metric.Provenance.Source ?? source, $"cvss:{metric.Version}", metric.Provenance.RecordedAt));
}
}
return results;
}
private static bool IsMissing(AdvisoryProvenance provenance)
{
return provenance == AdvisoryProvenance.Empty
|| string.IsNullOrWhiteSpace(provenance.Source)
|| string.IsNullOrWhiteSpace(provenance.Kind);
}
}
public sealed record MissingProvenance(string Source, string Component, DateTimeOffset? RecordedAt);
public static class ProvenanceDiagnostics
{
private static readonly Meter Meter = new("StellaOps.Feedser.Models.Provenance");
private static readonly Counter<long> MissingCounter = Meter.CreateCounter<long>(
"feedser.provenance.missing",
unit: "count",
description: "Number of canonical objects missing provenance metadata.");
private static readonly Counter<long> RangePrimitiveCounter = Meter.CreateCounter<long>(
"feedser.range.primitives",
unit: "count",
description: "Range coverage by kind, primitive availability, and vendor extensions.");
private static readonly object SyncRoot = new();
private static readonly Dictionary<string, DateTimeOffset> EarliestMissing = new(StringComparer.OrdinalIgnoreCase);
private static readonly HashSet<string> RecordedComponents = new(StringComparer.OrdinalIgnoreCase);
public static void RecordMissing(string source, string component, DateTimeOffset? recordedAt)
{
if (string.IsNullOrWhiteSpace(source))
{
source = "unknown";
}
component = string.IsNullOrWhiteSpace(component) ? "unknown" : component.Trim();
bool shouldRecord;
lock (SyncRoot)
{
var key = $"{source}|{component}";
shouldRecord = RecordedComponents.Add(key);
if (recordedAt.HasValue)
{
if (!EarliestMissing.TryGetValue(source, out var existing) || recordedAt.Value < existing)
{
EarliestMissing[source] = recordedAt.Value;
}
}
}
if (!shouldRecord)
{
return;
}
var category = DetermineCategory(component);
var severity = DetermineSeverity(category);
var tags = new[]
{
new KeyValuePair<string, object?>("source", source),
new KeyValuePair<string, object?>("component", component),
new KeyValuePair<string, object?>("category", category),
new KeyValuePair<string, object?>("severity", severity),
};
MissingCounter.Add(1, tags);
}
public static void ReportResumeWindow(string source, DateTimeOffset windowStart, ILogger logger)
{
if (string.IsNullOrWhiteSpace(source) || logger is null)
{
return;
}
DateTimeOffset earliest;
var hasEntry = false;
lock (SyncRoot)
{
if (EarliestMissing.TryGetValue(source, out earliest))
{
hasEntry = true;
if (windowStart <= earliest)
{
EarliestMissing.Remove(source);
var prefix = source + "|";
RecordedComponents.RemoveWhere(entry => entry.StartsWith(prefix, StringComparison.OrdinalIgnoreCase));
}
}
}
if (!hasEntry)
{
return;
}
if (windowStart <= earliest)
{
logger.LogInformation(
"Resume window starting {WindowStart:o} for {Source} may backfill missing provenance recorded at {Earliest:o}.",
windowStart,
source,
earliest);
}
else
{
logger.LogInformation(
"Earliest missing provenance for {Source} remains at {Earliest:o}; current resume window begins at {WindowStart:o}. Consider widening overlap to backfill.",
source,
earliest,
windowStart);
}
}
public static void RecordRangePrimitive(string source, AffectedVersionRange range)
{
if (range is null)
{
return;
}
source = string.IsNullOrWhiteSpace(source) ? "unknown" : source.Trim();
var primitives = range.Primitives;
var primitiveKinds = DeterminePrimitiveKinds(primitives);
var vendorExtensions = primitives?.VendorExtensions?.Count ?? 0;
var tags = new[]
{
new KeyValuePair<string, object?>("source", source),
new KeyValuePair<string, object?>("rangeKind", string.IsNullOrWhiteSpace(range.RangeKind) ? "unknown" : range.RangeKind),
new KeyValuePair<string, object?>("primitiveKinds", primitiveKinds),
new KeyValuePair<string, object?>("hasVendorExtensions", vendorExtensions > 0 ? "true" : "false"),
};
RangePrimitiveCounter.Add(1, tags);
}
private static string DetermineCategory(string component)
{
if (string.IsNullOrWhiteSpace(component))
{
return "unknown";
}
var index = component.IndexOf(':');
var category = index > 0 ? component[..index] : component;
return category.Trim().ToLowerInvariant();
}
private static string DetermineSeverity(string category)
=> category switch
{
"advisory" => "critical",
"package" => "high",
"range" => "high",
"status" => "medium",
"cvss" => "medium",
"reference" => "low",
_ => "info",
};
private static string DeterminePrimitiveKinds(RangePrimitives? primitives)
{
if (primitives is null)
{
return "none";
}
var kinds = new List<string>(3);
if (primitives.SemVer is not null)
{
kinds.Add("semver");
}
if (primitives.Nevra is not null)
{
kinds.Add("nevra");
}
if (primitives.Evr is not null)
{
kinds.Add("evr");
}
return kinds.Count == 0 ? "vendor" : string.Join('+', kinds);
}
}

View File

@@ -0,0 +1,58 @@
using System.Collections.Generic;
namespace StellaOps.Feedser.Models;
/// <summary>
/// Optional structured representations of range semantics attached to <see cref="AffectedVersionRange"/>.
/// </summary>
public sealed record RangePrimitives(
SemVerPrimitive? SemVer,
NevraPrimitive? Nevra,
EvrPrimitive? Evr,
IReadOnlyDictionary<string, string>? VendorExtensions);
/// <summary>
/// Structured SemVer metadata for a version range.
/// </summary>
public sealed record SemVerPrimitive(
string? Introduced,
bool IntroducedInclusive,
string? Fixed,
bool FixedInclusive,
string? LastAffected,
bool LastAffectedInclusive,
string? ConstraintExpression);
/// <summary>
/// Structured NEVRA metadata for a version range.
/// </summary>
public sealed record NevraPrimitive(
NevraComponent? Introduced,
NevraComponent? Fixed,
NevraComponent? LastAffected);
/// <summary>
/// Structured Debian EVR metadata for a version range.
/// </summary>
public sealed record EvrPrimitive(
EvrComponent? Introduced,
EvrComponent? Fixed,
EvrComponent? LastAffected);
/// <summary>
/// Normalized NEVRA component.
/// </summary>
public sealed record NevraComponent(
string Name,
int Epoch,
string Version,
string Release,
string? Architecture);
/// <summary>
/// Normalized EVR component (epoch:upstream revision).
/// </summary>
public sealed record EvrComponent(
int Epoch,
string UpstreamVersion,
string? Revision);
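
As a worked illustration of the `EvrComponent` shape, the split below follows Debian version syntax (`epoch:upstream_version-debian_revision`): the epoch defaults to 0 when absent, and the revision is everything after the last hyphen because upstream versions may themselves contain hyphens. This helper is a hypothetical sketch, not part of the commit:

```csharp
// Hypothetical helper: split "epoch:upstream-revision" into an EvrComponent.
static EvrComponent ParseEvr(string value)
{
    var epoch = 0;
    var colon = value.IndexOf(':');
    if (colon > 0 && int.TryParse(value[..colon], out var parsed))
    {
        epoch = parsed;
        value = value[(colon + 1)..];
    }

    // Debian revisions never contain '-', so split on the LAST hyphen.
    var hyphen = value.LastIndexOf('-');
    return hyphen < 0
        ? new EvrComponent(epoch, value, null)
        : new EvrComponent(epoch, value[..hyphen], value[(hyphen + 1)..]);
}

// ParseEvr("1:1.1.1n-0+deb11u5") → Epoch 1, UpstreamVersion "1.1.1n", Revision "0+deb11u5"
```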

View File

@@ -6,4 +6,7 @@
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="8.0.0" />
</ItemGroup>
</Project>

View File

@@ -9,8 +9,8 @@
|Docs: field provenance guidelines|BE-Merge|Models|DONE see `PROVENANCE_GUIDELINES.md`.|
|Canonical record definitions kept in sync|BE-Merge|Models|DONE documented in `CANONICAL_RECORDS.md`; update alongside model changes.|
|Alias scheme registry and validation helpers|BE-Merge|Models|DONE see `AliasSchemes` & `AliasSchemeRegistry` plus validation integration/tests.|
|Range primitives for SemVer/EVR/NEVRA metadata|BE-Merge|Models|TODO keep structured values without parsing logic; ensure merge/export parity.|
|Provenance envelope field masks|BE-Merge|Models|TODO guarantee traceability for each mapped field.|
|Range primitives for SemVer/EVR/NEVRA metadata|BE-Merge|Models|DOING envelope + AdvisoryStore deserialisation landed; VMware/Oracle/Chromium/NVD emit primitives, and Debian/SUSE/Ubuntu/Adobe now emit structured EVR/NEVRA/vendor telemetry. Remaining connectors (Apple, etc.) still need structured coverage + EVR population.|
|Provenance envelope field masks|BE-Merge|Models|DOING add richer metric tags (component category/severity), dedupe missing counts, propagate resume logging across connectors.|
|Backward-compatibility playbook|BE-Merge, QA|Models|DONE see `BACKWARD_COMPATIBILITY.md` for evolution policy/test checklist.|
|Golden canonical examples|QA|Models|DONE added `/p:UpdateGoldens=true` test hook wiring `UPDATE_GOLDENS=1` so canonical fixtures regenerate via `dotnet test`; docs/tests unchanged.|
|Serialization determinism regression tests|QA|Models|DONE locale-stability tests hash canonical serializer output across multiple cultures and runs.|

View File

@@ -289,8 +289,15 @@ public sealed class CertFrConnectorTests : IAsyncLifetime
private static string ReadFixture(string filename)
{
var path = Path.Combine(AppContext.BaseDirectory, "Source", "CertFr", "Fixtures", filename);
return File.ReadAllText(path);
var baseDirectory = AppContext.BaseDirectory;
var primary = Path.Combine(baseDirectory, "Source", "CertFr", "Fixtures", filename);
if (File.Exists(primary))
{
return File.ReadAllText(primary);
}
var fallback = Path.Combine(baseDirectory, "CertFr", "Fixtures", filename);
return File.ReadAllText(fallback);
}
private static string Normalize(string value)

View File

@@ -19,10 +19,9 @@ ANSSI CERT-FR advisories connector (avis/alertes) providing national enrichment:
In: advisory metadata extraction, references, severity text, watermarking.
Out: OVAL or package-level authority.
## Observability & security expectations
- Metrics: certfr.fetch.items, certfr.parse.fail, certfr.map.count.
- Metrics: SourceDiagnostics emits shared `feedser.source.http.*` counters/histograms tagged `feedser.source=certfr`, covering fetch counts, parse failures, and map activity.
- Logs: feed URL(s), item ids/urls, extraction durations; no PII; allowlist hostnames.
## Tests
- Author and review coverage in `../StellaOps.Feedser.Source.CertFr.Tests`.
- Shared fixtures (e.g., `MongoIntegrationFixture`, `ConnectorTestHarness`) live in `../StellaOps.Feedser.Testing`.
- Keep fixtures deterministic; match new cases to real-world advisories or regression scenarios.

View File

@@ -84,7 +84,8 @@ public sealed class CertInConnectorTests : IAsyncLifetime
var normalizedActual = NormalizeLineEndings(canonical);
if (!string.Equals(normalizedExpected, normalizedActual, StringComparison.Ordinal))
{
var actualPath = Path.Combine(AppContext.BaseDirectory, "Source", "CertIn", "Fixtures", "expected-advisory.actual.json");
var actualPath = ResolveFixturePath("expected-advisory.actual.json");
Directory.CreateDirectory(Path.GetDirectoryName(actualPath)!);
File.WriteAllText(actualPath, canonical);
}
@@ -316,9 +317,18 @@ public sealed class CertInConnectorTests : IAsyncLifetime
=> _fixture.Client.DropDatabaseAsync(_fixture.Database.DatabaseNamespace.DatabaseName);
private static string ReadFixture(string filename)
=> File.ReadAllText(ResolveFixturePath(filename));
private static string ResolveFixturePath(string filename)
{
var path = Path.Combine(AppContext.BaseDirectory, "Source", "CertIn", "Fixtures", filename);
return File.ReadAllText(path);
var baseDirectory = AppContext.BaseDirectory;
var primary = Path.Combine(baseDirectory, "Source", "CertIn", "Fixtures", filename);
if (File.Exists(primary) || filename.EndsWith(".actual.json", StringComparison.OrdinalIgnoreCase))
{
return primary;
}
return Path.Combine(baseDirectory, "CertIn", "Fixtures", filename);
}
private static string NormalizeLineEndings(string value)

View File

@@ -94,4 +94,4 @@
"severity": "high",
"summary": "Example Gateway devices vulnerable to remote code execution (CVE-2024-9990).",
"title": "Multiple vulnerabilities in Example Gateway"
}
}

View File

@@ -20,10 +20,9 @@ CERT-In national CERT connector; enrichment advisories for India; maps CVE lists
In: enrichment, aliasing where stable, references, mitigation text.
Out: package range authority; scraping behind auth walls.
## Observability & security expectations
- Metrics: certin.fetch.items, certin.parse.fail, certin.map.enriched_count.
- Metrics: shared `feedser.source.http.*` counters/histograms from SourceDiagnostics tagged `feedser.source=certin` capture fetch volume, parse failures, and map enrich counts.
- Logs: advisory codes, CVE counts per advisory, timing; allowlist host; redact personal data if present.
## Tests
- Author and review coverage in `../StellaOps.Feedser.Source.CertIn.Tests`.
- Shared fixtures (e.g., `MongoIntegrationFixture`, `ConnectorTestHarness`) live in `../StellaOps.Feedser.Testing`.
- Keep fixtures deterministic; match new cases to real-world advisories or regression scenarios.

View File

@@ -22,11 +22,10 @@ Shared connector toolkit. Provides HTTP clients, retry/backoff, conditional GET
In: HTTP plumbing, validators, cursor/backoff utilities, hashing.
Out: connector-specific schemas/mapping rules, merge precedence.
## Observability & security expectations
- Metrics: http.req.count, http.retry.count, rate_limit.remaining, validator.fail.count.
- Metrics: SourceDiagnostics publishes `feedser.source.http.*` counters/histograms tagged with `feedser.source=<connector>` plus retries/failures; connector dashboards slice on that tag instead of bespoke metric names.
- Logs include uri, status, retries, etag; redact tokens and auth headers.
- Distributed tracing hooks and per-connector counters should be wired centrally for consistent observability.
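
A minimal sketch of the shared-tag pattern described above: one counter in the common meter, sliced per connector by the `feedser.source` tag instead of bespoke metric names. The metric and tag names follow the convention stated here; the `outcome` tag and the exact SourceDiagnostics API surface are illustrative assumptions:

```csharp
using System.Collections.Generic;
using System.Diagnostics.Metrics;

// Shared meter/counter owned by Source.Common; connectors only supply tags.
var meter = new Meter("StellaOps.Feedser.Source.Common");
var fetchCounter = meter.CreateCounter<long>("feedser.source.http.fetch", unit: "count");

// A connector records a fetch; dashboards slice on the feedser.source tag.
fetchCounter.Add(1,
    new KeyValuePair<string, object?>("feedser.source", "certfr"),
    new KeyValuePair<string, object?>("outcome", "success"));
```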
## Tests
- Author and review coverage in `../StellaOps.Feedser.Source.Common.Tests`.
- Shared fixtures (e.g., `MongoIntegrationFixture`, `ConnectorTestHarness`) live in `../StellaOps.Feedser.Testing`.
- Keep fixtures deterministic; match new cases to real-world advisories or regression scenarios.

View File

@@ -0,0 +1,250 @@
using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Http;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using Microsoft.Extensions.Time.Testing;
using MongoDB.Driver;
using StellaOps.Feedser.Models;
using StellaOps.Feedser.Source.Common;
using StellaOps.Feedser.Source.Common.Testing;
using StellaOps.Feedser.Source.Distro.Debian.Configuration;
using StellaOps.Feedser.Storage.Mongo;
using StellaOps.Feedser.Storage.Mongo.Advisories;
using StellaOps.Feedser.Storage.Mongo.Documents;
using StellaOps.Feedser.Storage.Mongo.Dtos;
using StellaOps.Feedser.Testing;
using Xunit;
using Xunit.Abstractions;
namespace StellaOps.Feedser.Source.Distro.Debian.Tests;
[Collection("mongo-fixture")]
public sealed class DebianConnectorTests : IAsyncLifetime
{
private static readonly Uri ListUri = new("https://salsa.debian.org/security-tracker-team/security-tracker/-/raw/master/data/DSA/list");
private static readonly Uri DetailResolved = new("https://security-tracker.debian.org/tracker/DSA-2024-123");
private static readonly Uri DetailOpen = new("https://security-tracker.debian.org/tracker/DSA-2024-124");
private readonly MongoIntegrationFixture _fixture;
private readonly FakeTimeProvider _timeProvider;
private readonly CannedHttpMessageHandler _handler;
private readonly ITestOutputHelper _output;
public DebianConnectorTests(MongoIntegrationFixture fixture, ITestOutputHelper output)
{
_fixture = fixture;
_handler = new CannedHttpMessageHandler();
_timeProvider = new FakeTimeProvider(new DateTimeOffset(2024, 9, 12, 0, 0, 0, TimeSpan.Zero));
_output = output;
}
[Fact]
public async Task FetchParseMap_PopulatesRangePrimitivesAndResumesWithNotModified()
{
await using var provider = await BuildServiceProviderAsync();
SeedInitialResponses();
var connector = provider.GetRequiredService<DebianConnector>();
await connector.FetchAsync(provider, CancellationToken.None);
_timeProvider.Advance(TimeSpan.FromMinutes(1));
await connector.ParseAsync(provider, CancellationToken.None);
await connector.MapAsync(provider, CancellationToken.None);
var advisoryStore = provider.GetRequiredService<IAdvisoryStore>();
var advisories = await advisoryStore.GetRecentAsync(10, CancellationToken.None);
Assert.Equal(2, advisories.Count);
var resolved = advisories.Single(a => a.AdvisoryKey == "DSA-2024-123");
Assert.Contains("CVE-2024-1000", resolved.Aliases);
Assert.Contains("CVE-2024-1001", resolved.Aliases);
var resolvedBookworm = Assert.Single(resolved.AffectedPackages, p => p.Platform == "bookworm");
var resolvedRange = Assert.Single(resolvedBookworm.VersionRanges);
Assert.Equal("evr", resolvedRange.RangeKind);
Assert.Equal("1:1.1.1n-0+deb11u2", resolvedRange.IntroducedVersion);
Assert.Equal("1:1.1.1n-0+deb11u5", resolvedRange.FixedVersion);
Assert.NotNull(resolvedRange.Primitives);
Assert.NotNull(resolvedRange.Primitives!.Evr);
Assert.Equal(1, resolvedRange.Primitives.Evr!.Introduced!.Epoch);
Assert.Equal("1.1.1n", resolvedRange.Primitives.Evr.Introduced.UpstreamVersion);
var open = advisories.Single(a => a.AdvisoryKey == "DSA-2024-124");
Assert.Contains("CVE-2024-2000", open.Aliases);
var openBookworm = Assert.Single(open.AffectedPackages, p => p.Platform == "bookworm");
var openRange = Assert.Single(openBookworm.VersionRanges);
Assert.Equal("evr", openRange.RangeKind);
Assert.Equal("1:1.3.1-1", openRange.IntroducedVersion);
Assert.Null(openRange.FixedVersion);
Assert.NotNull(openRange.Primitives);
Assert.NotNull(openRange.Primitives!.Evr);
// Ensure data persisted through Mongo round-trip.
var found = await advisoryStore.FindAsync("DSA-2024-123", CancellationToken.None);
Assert.NotNull(found);
var persistedRange = Assert.Single(found!.AffectedPackages, pkg => pkg.Platform == "bookworm").VersionRanges.Single();
Assert.NotNull(persistedRange.Primitives);
Assert.NotNull(persistedRange.Primitives!.Evr);
// Second run should issue conditional requests and perform no additional parsing or mapping.
SeedNotModifiedResponses();
await connector.FetchAsync(provider, CancellationToken.None);
_timeProvider.Advance(TimeSpan.FromMinutes(1));
await connector.ParseAsync(provider, CancellationToken.None);
await connector.MapAsync(provider, CancellationToken.None);
var documents = provider.GetRequiredService<IDocumentStore>();
var detailDoc = await documents.FindBySourceAndUriAsync(DebianConnectorPlugin.SourceName, DetailResolved.ToString(), CancellationToken.None);
Assert.NotNull(detailDoc);
var refreshed = await advisoryStore.GetRecentAsync(10, CancellationToken.None);
Assert.Equal(2, refreshed.Count);
_handler.AssertNoPendingResponses();
}
private async Task<ServiceProvider> BuildServiceProviderAsync()
{
await _fixture.Client.DropDatabaseAsync(_fixture.Database.DatabaseNamespace.DatabaseName, CancellationToken.None);
_handler.Clear();
var services = new ServiceCollection();
services.AddLogging(builder => builder.AddProvider(new TestOutputLoggerProvider(_output)));
services.AddSingleton<TimeProvider>(_timeProvider);
services.AddSingleton(_handler);
services.AddMongoStorage(options =>
{
options.ConnectionString = _fixture.Runner.ConnectionString;
options.DatabaseName = _fixture.Database.DatabaseNamespace.DatabaseName;
options.CommandTimeout = TimeSpan.FromSeconds(5);
});
services.AddSourceCommon();
services.AddDebianConnector(options =>
{
options.ListEndpoint = ListUri;
options.DetailBaseUri = new Uri("https://security-tracker.debian.org/tracker/");
options.MaxAdvisoriesPerFetch = 10;
options.RequestDelay = TimeSpan.Zero;
});
services.Configure<HttpClientFactoryOptions>(DebianOptions.HttpClientName, builderOptions =>
{
builderOptions.HttpMessageHandlerBuilderActions.Add(builder =>
{
builder.PrimaryHandler = _handler;
});
});
var provider = services.BuildServiceProvider();
var bootstrapper = provider.GetRequiredService<MongoBootstrapper>();
await bootstrapper.InitializeAsync(CancellationToken.None);
return provider;
}
private void SeedInitialResponses()
{
AddListResponse("debian-list.txt", "\"list-v1\"");
AddDetailResponse(DetailResolved, "debian-detail-dsa-2024-123.html", "\"detail-123\"");
AddDetailResponse(DetailOpen, "debian-detail-dsa-2024-124.html", "\"detail-124\"");
}
private void SeedNotModifiedResponses()
{
AddNotModifiedResponse(ListUri, "\"list-v1\"");
AddNotModifiedResponse(DetailResolved, "\"detail-123\"");
AddNotModifiedResponse(DetailOpen, "\"detail-124\"");
}
private void AddListResponse(string fixture, string etag)
{
_handler.AddResponse(ListUri, () =>
{
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new StringContent(ReadFixture(fixture), Encoding.UTF8, "text/plain"),
};
response.Headers.ETag = new EntityTagHeaderValue(etag);
return response;
});
}
private void AddDetailResponse(Uri uri, string fixture, string etag)
{
_handler.AddResponse(uri, () =>
{
var response = new HttpResponseMessage(HttpStatusCode.OK)
{
Content = new StringContent(ReadFixture(fixture), Encoding.UTF8, "text/html"),
};
response.Headers.ETag = new EntityTagHeaderValue(etag);
return response;
});
}
private void AddNotModifiedResponse(Uri uri, string etag)
{
_handler.AddResponse(uri, request =>
{
var response = new HttpResponseMessage(HttpStatusCode.NotModified);
response.Headers.ETag = new EntityTagHeaderValue(etag);
return response;
});
}
private static string ReadFixture(string filename)
{
var primary = Path.Combine(AppContext.BaseDirectory, "Source", "Distro", "Debian", "Fixtures", filename);
if (File.Exists(primary))
{
return File.ReadAllText(primary);
}
throw new FileNotFoundException($"Fixture '{filename}' not found.", filename);
}
public Task InitializeAsync() => Task.CompletedTask;
public Task DisposeAsync() => Task.CompletedTask;
private sealed class TestOutputLoggerProvider : ILoggerProvider
{
private readonly ITestOutputHelper _output;
public TestOutputLoggerProvider(ITestOutputHelper output) => _output = output;
public ILogger CreateLogger(string categoryName) => new TestOutputLogger(_output);
public void Dispose()
{
}
private sealed class TestOutputLogger : ILogger
{
private readonly ITestOutputHelper _output;
public TestOutputLogger(ITestOutputHelper output) => _output = output;
public IDisposable BeginScope<TState>(TState state) where TState : notnull => NullLogger.Instance.BeginScope(state);
public bool IsEnabled(LogLevel logLevel) => false;
public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func<TState, Exception?, string> formatter)
{
if (IsEnabled(logLevel))
{
_output.WriteLine(formatter(state, exception));
}
}
}
}
}

View File

@@ -0,0 +1,88 @@
using System;
using Xunit;
using StellaOps.Feedser.Models;
using StellaOps.Feedser.Source.Distro.Debian;
using StellaOps.Feedser.Source.Distro.Debian.Internal;
using StellaOps.Feedser.Storage.Mongo.Documents;
namespace StellaOps.Feedser.Source.Distro.Debian.Tests;
public sealed class DebianMapperTests
{
[Fact]
public void Map_BuildsRangePrimitives_ForResolvedPackage()
{
var dto = new DebianAdvisoryDto(
AdvisoryId: "DSA-2024-123",
SourcePackage: "openssl",
Title: "Openssl security update",
Description: "Fixes multiple issues.",
CveIds: new[] { "CVE-2024-1000", "CVE-2024-1001" },
Packages: new[]
{
new DebianPackageStateDto(
Package: "openssl",
Release: "bullseye",
Status: "resolved",
IntroducedVersion: "1:1.1.1n-0+deb11u2",
FixedVersion: "1:1.1.1n-0+deb11u5",
LastAffectedVersion: null,
Published: new DateTimeOffset(2024, 9, 1, 0, 0, 0, TimeSpan.Zero)),
new DebianPackageStateDto(
Package: "openssl",
Release: "bookworm",
Status: "open",
IntroducedVersion: null,
FixedVersion: null,
LastAffectedVersion: null,
Published: null)
},
References: new[]
{
new DebianReferenceDto(
Url: "https://security-tracker.debian.org/tracker/DSA-2024-123",
Kind: "advisory",
Title: "Debian Security Advisory 2024-123"),
});
var document = new DocumentRecord(
Id: Guid.NewGuid(),
SourceName: DebianConnectorPlugin.SourceName,
Uri: "https://security-tracker.debian.org/tracker/DSA-2024-123",
FetchedAt: new DateTimeOffset(2024, 9, 1, 1, 0, 0, TimeSpan.Zero),
Sha256: "sha",
Status: "Fetched",
ContentType: "application/json",
Headers: null,
Metadata: null,
Etag: null,
LastModified: null,
GridFsId: null);
Advisory advisory = DebianMapper.Map(dto, document, new DateTimeOffset(2024, 9, 1, 2, 0, 0, TimeSpan.Zero));
Assert.Equal("DSA-2024-123", advisory.AdvisoryKey);
Assert.Contains("CVE-2024-1000", advisory.Aliases);
Assert.Contains("CVE-2024-1001", advisory.Aliases);
var resolvedPackage = Assert.Single(advisory.AffectedPackages, p => p.Platform == "bullseye");
var range = Assert.Single(resolvedPackage.VersionRanges);
Assert.Equal("evr", range.RangeKind);
Assert.Equal("1:1.1.1n-0+deb11u2", range.IntroducedVersion);
Assert.Equal("1:1.1.1n-0+deb11u5", range.FixedVersion);
Assert.NotNull(range.Primitives);
var evr = range.Primitives!.Evr;
Assert.NotNull(evr);
Assert.NotNull(evr!.Introduced);
Assert.Equal(1, evr.Introduced!.Epoch);
Assert.Equal("1.1.1n", evr.Introduced.UpstreamVersion);
Assert.Equal("0+deb11u2", evr.Introduced.Revision);
Assert.NotNull(evr.Fixed);
Assert.Equal(1, evr.Fixed!.Epoch);
Assert.Equal("1.1.1n", evr.Fixed.UpstreamVersion);
Assert.Equal("0+deb11u5", evr.Fixed.Revision);
var openPackage = Assert.Single(advisory.AffectedPackages, p => p.Platform == "bookworm");
Assert.Empty(openPackage.VersionRanges);
}
}

View File

@@ -0,0 +1,23 @@
<!DOCTYPE html>
<html>
<head>
<title>DSA-2024-123</title>
</head>
<body>
<header><h1>DSA-2024-123</h1></header>
<table>
<tr><td><b>Name</b></td><td>DSA-2024-123</td></tr>
<tr><td><b>Description</b></td><td>openssl - security update</td></tr>
<tr><td><b>Source</b></td><td><a href="https://www.debian.org/security/dsa-2024-123">Debian</a></td></tr>
<tr><td><b>References</b></td><td><a href="/tracker/CVE-2024-1000">CVE-2024-1000</a>, <a href="/tracker/CVE-2024-1001">CVE-2024-1001</a></td></tr>
</table>
<h2>Vulnerable and fixed packages</h2>
<table>
<tr><th>Source Package</th><th>Release</th><th>Version</th><th>Status</th></tr>
<tr><td><a href="/tracker/source-package/openssl">openssl</a></td><td>bookworm</td><td><span class="red">1:1.1.1n-0+deb11u2</span></td><td><span class="red">vulnerable</span></td></tr>
<tr><td></td><td>bookworm (security)</td><td>1:1.1.1n-0+deb11u5</td><td>fixed</td></tr>
<tr><td></td><td>trixie</td><td><span class="red">3.0.8-2</span></td><td><span class="red">vulnerable</span></td></tr>
<tr><td></td><td>trixie (security)</td><td>3.0.12-1</td><td>fixed</td></tr>
</table>
</body>
</html>

View File

@@ -0,0 +1,21 @@
<!DOCTYPE html>
<html>
<head>
<title>DSA-2024-124</title>
</head>
<body>
<header><h1>DSA-2024-124</h1></header>
<table>
<tr><td><b>Name</b></td><td>DSA-2024-124</td></tr>
<tr><td><b>Description</b></td><td>zlib - security update</td></tr>
<tr><td><b>Source</b></td><td><a href="https://www.debian.org/security/dsa-2024-124">Debian</a></td></tr>
<tr><td><b>References</b></td><td><a href="/tracker/CVE-2024-2000">CVE-2024-2000</a></td></tr>
</table>
<h2>Vulnerable and fixed packages</h2>
<table>
<tr><th>Source Package</th><th>Release</th><th>Version</th><th>Status</th></tr>
<tr><td><a href="/tracker/source-package/zlib">zlib</a></td><td>bookworm</td><td><span class="red">1:1.3.1-1</span></td><td><span class="red">vulnerable</span></td></tr>
<tr><td></td><td>trixie</td><td><span class="red">1:1.3.1-2</span></td><td><span class="red">vulnerable</span></td></tr>
</table>
</body>
</html>

View File

@@ -0,0 +1,7 @@
[12 Sep 2024] DSA-2024-123 openssl - security update
{CVE-2024-1000 CVE-2024-1001}
[bookworm] - openssl 1:1.1.1n-0+deb11u5
[trixie] - openssl 3.0.12-1
[10 Sep 2024] DSA-2024-124 zlib - security update
{CVE-2024-2000}
[bookworm] - zlib 1:1.3.2-1

View File

@@ -0,0 +1,13 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="../StellaOps.Feedser.Models/StellaOps.Feedser.Models.csproj" />
<ProjectReference Include="../StellaOps.Feedser.Source.Common/StellaOps.Feedser.Source.Common.csproj" />
<ProjectReference Include="../StellaOps.Feedser.Source.Distro.Debian/StellaOps.Feedser.Source.Distro.Debian.csproj" />
<ProjectReference Include="../StellaOps.Feedser.Storage.Mongo/StellaOps.Feedser.Storage.Mongo.csproj" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Feedser.Source.Distro.Debian.Tests")]

View File

@@ -1,29 +0,0 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Plugin;
namespace StellaOps.Feedser.Source.Distro.Debian;
public sealed class DistroDebianConnectorPlugin : IConnectorPlugin
{
public string Name => "distro-debian";
public bool IsAvailable(IServiceProvider services) => true;
public IFeedConnector Create(IServiceProvider services) => new StubConnector(Name);
private sealed class StubConnector : IFeedConnector
{
public StubConnector(string sourceName) => SourceName = sourceName;
public string SourceName { get; }
public Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken) => Task.CompletedTask;
public Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken) => Task.CompletedTask;
public Task MapAsync(IServiceProvider services, CancellationToken cancellationToken) => Task.CompletedTask;
}
}

View File

@@ -0,0 +1,87 @@
using System;
namespace StellaOps.Feedser.Source.Distro.Debian.Configuration;
public sealed class DebianOptions
{
public const string HttpClientName = "feedser.debian";
/// <summary>
/// Raw advisory list published by the Debian security tracker team.
/// Defaults to the Salsa Git raw endpoint to avoid HTML scraping.
/// </summary>
public Uri ListEndpoint { get; set; } = new("https://salsa.debian.org/security-tracker-team/security-tracker/-/raw/master/data/DSA/list");
/// <summary>
/// Base URI for advisory detail pages. Connector appends {AdvisoryId}.
/// </summary>
public Uri DetailBaseUri { get; set; } = new("https://security-tracker.debian.org/tracker/");
/// <summary>
/// Maximum advisories fetched per run to cap backfill effort.
/// </summary>
public int MaxAdvisoriesPerFetch { get; set; } = 40;
/// <summary>
/// Initial history window pulled on first run.
/// </summary>
public TimeSpan InitialBackfill { get; set; } = TimeSpan.FromDays(30);
/// <summary>
/// Resume overlap to accommodate late edits of existing advisories.
/// </summary>
public TimeSpan ResumeOverlap { get; set; } = TimeSpan.FromDays(2);
/// <summary>
/// Request timeout used for list/detail fetches unless overridden via HTTP client.
/// </summary>
public TimeSpan FetchTimeout { get; set; } = TimeSpan.FromSeconds(45);
/// <summary>
/// Optional pacing delay between detail fetches.
/// </summary>
public TimeSpan RequestDelay { get; set; } = TimeSpan.Zero;
/// <summary>
/// Custom user-agent for Debian tracker courtesy.
/// </summary>
public string UserAgent { get; set; } = "StellaOps.Feedser.Debian/0.1 (+https://stella-ops.org)";
public void Validate()
{
if (ListEndpoint is null || !ListEndpoint.IsAbsoluteUri)
{
throw new InvalidOperationException("Debian list endpoint must be an absolute URI.");
}
if (DetailBaseUri is null || !DetailBaseUri.IsAbsoluteUri)
{
throw new InvalidOperationException("Debian detail base URI must be an absolute URI.");
}
if (MaxAdvisoriesPerFetch <= 0 || MaxAdvisoriesPerFetch > 200)
{
throw new InvalidOperationException("MaxAdvisoriesPerFetch must be between 1 and 200.");
}
if (InitialBackfill < TimeSpan.Zero || InitialBackfill > TimeSpan.FromDays(365))
{
throw new InvalidOperationException("InitialBackfill must be between 0 and 365 days.");
}
if (ResumeOverlap < TimeSpan.Zero || ResumeOverlap > TimeSpan.FromDays(14))
{
throw new InvalidOperationException("ResumeOverlap must be between 0 and 14 days.");
}
if (FetchTimeout <= TimeSpan.Zero || FetchTimeout > TimeSpan.FromMinutes(5))
{
throw new InvalidOperationException("FetchTimeout must be positive and no more than five minutes.");
}
if (RequestDelay < TimeSpan.Zero || RequestDelay > TimeSpan.FromSeconds(10))
{
throw new InvalidOperationException("RequestDelay must be between 0 and 10 seconds.");
}
}
}
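
For reference, these options bind from the `feedser:sources:debian` configuration section (see the dependency-injection routine later in this diff). A minimal appsettings sketch — values are illustrative, and `TimeSpan` properties use the standard `d.hh:mm:ss` binder format:

```json
{
  "feedser": {
    "sources": {
      "debian": {
        "maxAdvisoriesPerFetch": 40,
        "initialBackfill": "30.00:00:00",
        "resumeOverlap": "2.00:00:00",
        "fetchTimeout": "00:00:45",
        "requestDelay": "00:00:00.250"
      }
    }
  }
}
```

Unset properties keep the defaults shown above in `DebianOptions`, and `Validate()` rejects out-of-range values at startup.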


@@ -0,0 +1,637 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Net;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using MongoDB.Bson;
using MongoDB.Bson.IO;
using StellaOps.Feedser.Models;
using StellaOps.Feedser.Source.Common;
using StellaOps.Feedser.Source.Common.Fetch;
using StellaOps.Feedser.Source.Distro.Debian.Configuration;
using StellaOps.Feedser.Source.Distro.Debian.Internal;
using StellaOps.Feedser.Storage.Mongo;
using StellaOps.Feedser.Storage.Mongo.Advisories;
using StellaOps.Feedser.Storage.Mongo.Documents;
using StellaOps.Feedser.Storage.Mongo.Dtos;
using StellaOps.Plugin;
namespace StellaOps.Feedser.Source.Distro.Debian;
public sealed class DebianConnector : IFeedConnector
{
private const string SchemaVersion = "debian.v1";
private readonly SourceFetchService _fetchService;
private readonly RawDocumentStorage _rawDocumentStorage;
private readonly IDocumentStore _documentStore;
private readonly IDtoStore _dtoStore;
private readonly IAdvisoryStore _advisoryStore;
private readonly ISourceStateRepository _stateRepository;
private readonly DebianOptions _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<DebianConnector> _logger;
private static readonly Action<ILogger, string, int, Exception?> LogMapped =
LoggerMessage.Define<string, int>(
LogLevel.Information,
new EventId(1, "DebianMapped"),
"Debian advisory {AdvisoryId} mapped with {AffectedCount} packages");
public DebianConnector(
SourceFetchService fetchService,
RawDocumentStorage rawDocumentStorage,
IDocumentStore documentStore,
IDtoStore dtoStore,
IAdvisoryStore advisoryStore,
ISourceStateRepository stateRepository,
IOptions<DebianOptions> options,
TimeProvider? timeProvider,
ILogger<DebianConnector> logger)
{
_fetchService = fetchService ?? throw new ArgumentNullException(nameof(fetchService));
_rawDocumentStorage = rawDocumentStorage ?? throw new ArgumentNullException(nameof(rawDocumentStorage));
_documentStore = documentStore ?? throw new ArgumentNullException(nameof(documentStore));
_dtoStore = dtoStore ?? throw new ArgumentNullException(nameof(dtoStore));
_advisoryStore = advisoryStore ?? throw new ArgumentNullException(nameof(advisoryStore));
_stateRepository = stateRepository ?? throw new ArgumentNullException(nameof(stateRepository));
_options = options?.Value ?? throw new ArgumentNullException(nameof(options));
_options.Validate();
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public string SourceName => DebianConnectorPlugin.SourceName;
public async Task FetchAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
var now = _timeProvider.GetUtcNow();
var pendingDocuments = new HashSet<Guid>(cursor.PendingDocuments);
var pendingMappings = new HashSet<Guid>(cursor.PendingMappings);
var fetchCache = new Dictionary<string, DebianFetchCacheEntry>(cursor.FetchCache, StringComparer.OrdinalIgnoreCase);
var touchedResources = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var listUri = _options.ListEndpoint;
var listKey = listUri.ToString();
touchedResources.Add(listKey);
var existingList = await _documentStore.FindBySourceAndUriAsync(SourceName, listKey, cancellationToken).ConfigureAwait(false);
cursor.TryGetCache(listKey, out var cachedListEntry);
var listRequest = new SourceFetchRequest(DebianOptions.HttpClientName, SourceName, listUri)
{
Metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["type"] = "index"
},
AcceptHeaders = new[] { "text/plain", "text/plain; charset=utf-8" },
TimeoutOverride = _options.FetchTimeout,
ETag = existingList?.Etag ?? cachedListEntry?.ETag,
LastModified = existingList?.LastModified ?? cachedListEntry?.LastModified,
};
SourceFetchResult listResult;
try
{
listResult = await _fetchService.FetchAsync(listRequest, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Debian list fetch failed");
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
var processedIds = new HashSet<string>(cursor.ProcessedAdvisoryIds, StringComparer.OrdinalIgnoreCase);
var newProcessedIds = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var maxPublished = cursor.LastPublished ?? DateTimeOffset.MinValue;
var processedUpdated = false;
if (listResult.IsNotModified)
{
if (existingList is not null)
{
fetchCache[listKey] = DebianFetchCacheEntry.FromDocument(existingList);
}
}
else if (listResult.IsSuccess && listResult.Document is not null)
{
fetchCache[listKey] = DebianFetchCacheEntry.FromDocument(listResult.Document);
if (!listResult.Document.GridFsId.HasValue)
{
_logger.LogWarning("Debian list document {DocumentId} missing GridFS payload", listResult.Document.Id);
}
else
{
byte[] bytes;
try
{
bytes = await _rawDocumentStorage.DownloadAsync(listResult.Document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to download Debian list document {DocumentId}", listResult.Document.Id);
throw;
}
var text = System.Text.Encoding.UTF8.GetString(bytes);
var entries = DebianListParser.Parse(text);
if (entries.Count > 0)
{
var windowStart = (cursor.LastPublished ?? (now - _options.InitialBackfill)) - _options.ResumeOverlap;
if (windowStart < DateTimeOffset.UnixEpoch)
{
windowStart = DateTimeOffset.UnixEpoch;
}
ProvenanceDiagnostics.ReportResumeWindow(SourceName, windowStart, _logger);
var candidates = entries
.Where(entry => entry.Published >= windowStart)
.OrderBy(entry => entry.Published)
.ThenBy(entry => entry.AdvisoryId, StringComparer.OrdinalIgnoreCase)
.ToList();
if (candidates.Count == 0)
{
candidates = entries
.OrderByDescending(entry => entry.Published)
.ThenBy(entry => entry.AdvisoryId, StringComparer.OrdinalIgnoreCase)
.Take(_options.MaxAdvisoriesPerFetch)
.OrderBy(entry => entry.Published)
.ThenBy(entry => entry.AdvisoryId, StringComparer.OrdinalIgnoreCase)
.ToList();
}
else if (candidates.Count > _options.MaxAdvisoriesPerFetch)
{
candidates = candidates
.OrderByDescending(entry => entry.Published)
.ThenBy(entry => entry.AdvisoryId, StringComparer.OrdinalIgnoreCase)
.Take(_options.MaxAdvisoriesPerFetch)
.OrderBy(entry => entry.Published)
.ThenBy(entry => entry.AdvisoryId, StringComparer.OrdinalIgnoreCase)
.ToList();
}
foreach (var entry in candidates)
{
cancellationToken.ThrowIfCancellationRequested();
var detailUri = new Uri(_options.DetailBaseUri, entry.AdvisoryId);
var cacheKey = detailUri.ToString();
touchedResources.Add(cacheKey);
cursor.TryGetCache(cacheKey, out var cachedDetail);
if (!fetchCache.TryGetValue(cacheKey, out var cachedInRun))
{
cachedInRun = cachedDetail;
}
var metadata = BuildDetailMetadata(entry);
var existingDetail = await _documentStore.FindBySourceAndUriAsync(SourceName, cacheKey, cancellationToken).ConfigureAwait(false);
var request = new SourceFetchRequest(DebianOptions.HttpClientName, SourceName, detailUri)
{
Metadata = metadata,
AcceptHeaders = new[] { "text/html", "application/xhtml+xml" },
TimeoutOverride = _options.FetchTimeout,
ETag = existingDetail?.Etag ?? cachedInRun?.ETag,
LastModified = existingDetail?.LastModified ?? cachedInRun?.LastModified,
};
SourceFetchResult result;
try
{
result = await _fetchService.FetchAsync(request, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to fetch Debian advisory {AdvisoryId}", entry.AdvisoryId);
await _stateRepository.MarkFailureAsync(SourceName, now, TimeSpan.FromMinutes(5), ex.Message, cancellationToken).ConfigureAwait(false);
throw;
}
if (result.IsNotModified)
{
if (existingDetail is not null)
{
fetchCache[cacheKey] = DebianFetchCacheEntry.FromDocument(existingDetail);
if (string.Equals(existingDetail.Status, DocumentStatuses.Mapped, StringComparison.Ordinal))
{
pendingDocuments.Remove(existingDetail.Id);
pendingMappings.Remove(existingDetail.Id);
}
}
continue;
}
if (!result.IsSuccess || result.Document is null)
{
continue;
}
fetchCache[cacheKey] = DebianFetchCacheEntry.FromDocument(result.Document);
pendingDocuments.Add(result.Document.Id);
pendingMappings.Remove(result.Document.Id);
if (_options.RequestDelay > TimeSpan.Zero)
{
try
{
await Task.Delay(_options.RequestDelay, cancellationToken).ConfigureAwait(false);
}
catch (TaskCanceledException)
{
break;
}
}
if (entry.Published > maxPublished)
{
maxPublished = entry.Published;
newProcessedIds.Clear();
processedUpdated = true;
}
if (entry.Published == maxPublished)
{
newProcessedIds.Add(entry.AdvisoryId);
processedUpdated = true;
}
}
}
}
}
if (fetchCache.Count > 0 && touchedResources.Count > 0)
{
var stale = fetchCache.Keys.Where(key => !touchedResources.Contains(key)).ToArray();
foreach (var key in stale)
{
fetchCache.Remove(key);
}
}
if (!processedUpdated && cursor.LastPublished.HasValue)
{
maxPublished = cursor.LastPublished.Value;
newProcessedIds = new HashSet<string>(cursor.ProcessedAdvisoryIds, StringComparer.OrdinalIgnoreCase);
}
var updatedCursor = cursor
.WithPendingDocuments(pendingDocuments)
.WithPendingMappings(pendingMappings)
.WithFetchCache(fetchCache);
if (processedUpdated && maxPublished > DateTimeOffset.MinValue)
{
updatedCursor = updatedCursor.WithProcessed(maxPublished, newProcessedIds);
}
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task ParseAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingDocuments.Count == 0)
{
return;
}
var remaining = cursor.PendingDocuments.ToList();
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingDocuments)
{
cancellationToken.ThrowIfCancellationRequested();
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (document is null)
{
remaining.Remove(documentId);
continue;
}
if (!document.GridFsId.HasValue)
{
_logger.LogWarning("Debian document {DocumentId} missing GridFS payload", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
continue;
}
var metadata = ExtractMetadata(document);
if (metadata is null)
{
_logger.LogWarning("Debian document {DocumentId} missing required metadata", document.Id);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(documentId);
continue;
}
byte[] bytes;
try
{
bytes = await _rawDocumentStorage.DownloadAsync(document.GridFsId.Value, cancellationToken).ConfigureAwait(false);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to download Debian document {DocumentId}", document.Id);
throw;
}
var html = System.Text.Encoding.UTF8.GetString(bytes);
DebianAdvisoryDto dto;
try
{
dto = DebianHtmlParser.Parse(html, metadata);
}
catch (Exception ex)
{
_logger.LogWarning(ex, "Failed to parse Debian advisory {AdvisoryId}", metadata.AdvisoryId);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
remaining.Remove(document.Id);
continue;
}
var payload = ToBson(dto);
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, SourceName, SchemaVersion, payload, _timeProvider.GetUtcNow());
await _dtoStore.UpsertAsync(dtoRecord, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(document.Id, DocumentStatuses.PendingMap, cancellationToken).ConfigureAwait(false);
remaining.Remove(document.Id);
if (!pendingMappings.Contains(document.Id))
{
pendingMappings.Add(document.Id);
}
}
var updatedCursor = cursor
.WithPendingDocuments(remaining)
.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
public async Task MapAsync(IServiceProvider services, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(services);
var cursor = await GetCursorAsync(cancellationToken).ConfigureAwait(false);
if (cursor.PendingMappings.Count == 0)
{
return;
}
var pendingMappings = cursor.PendingMappings.ToList();
foreach (var documentId in cursor.PendingMappings)
{
cancellationToken.ThrowIfCancellationRequested();
var dtoRecord = await _dtoStore.FindByDocumentIdAsync(documentId, cancellationToken).ConfigureAwait(false);
var document = await _documentStore.FindAsync(documentId, cancellationToken).ConfigureAwait(false);
if (dtoRecord is null || document is null)
{
pendingMappings.Remove(documentId);
continue;
}
DebianAdvisoryDto dto;
try
{
dto = FromBson(dtoRecord.Payload);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to deserialize Debian DTO for document {DocumentId}", documentId);
await _documentStore.UpdateStatusAsync(documentId, DocumentStatuses.Failed, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
continue;
}
var advisory = DebianMapper.Map(dto, document, _timeProvider.GetUtcNow());
await _advisoryStore.UpsertAsync(advisory, cancellationToken).ConfigureAwait(false);
await _documentStore.UpdateStatusAsync(documentId, DocumentStatuses.Mapped, cancellationToken).ConfigureAwait(false);
pendingMappings.Remove(documentId);
LogMapped(_logger, dto.AdvisoryId, advisory.AffectedPackages.Length, null);
}
var updatedCursor = cursor.WithPendingMappings(pendingMappings);
await UpdateCursorAsync(updatedCursor, cancellationToken).ConfigureAwait(false);
}
private async Task<DebianCursor> GetCursorAsync(CancellationToken cancellationToken)
{
var state = await _stateRepository.TryGetAsync(SourceName, cancellationToken).ConfigureAwait(false);
return state is null ? DebianCursor.Empty : DebianCursor.FromBson(state.Cursor);
}
private async Task UpdateCursorAsync(DebianCursor cursor, CancellationToken cancellationToken)
{
var document = cursor.ToBsonDocument();
await _stateRepository.UpdateCursorAsync(SourceName, document, _timeProvider.GetUtcNow(), cancellationToken).ConfigureAwait(false);
}
private static Dictionary<string, string> BuildDetailMetadata(DebianListEntry entry)
{
var metadata = new Dictionary<string, string>(StringComparer.Ordinal)
{
["debian.id"] = entry.AdvisoryId,
["debian.published"] = entry.Published.ToString("O", CultureInfo.InvariantCulture),
["debian.title"] = entry.Title,
["debian.package"] = entry.SourcePackage
};
if (entry.CveIds.Count > 0)
{
metadata["debian.cves"] = string.Join(' ', entry.CveIds);
}
return metadata;
}
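
BuildDetailMetadata produces a flat string dictionary that is persisted on the raw document and later round-tripped through ExtractMetadata. A hypothetical entry (the advisory id, title, package, and CVE list below are made up for illustration; the timestamp uses the invariant "O" round-trip format the code emits) would serialize as:

```json
{
  "debian.id": "DSA-5678-1",
  "debian.published": "2025-01-15T00:00:00.0000000+00:00",
  "debian.title": "chromium security update",
  "debian.package": "chromium",
  "debian.cves": "CVE-2025-0001 CVE-2025-0002"
}
```

Note that `debian.cves` is a single space-joined string, which ExtractMetadata splits and de-duplicates case-insensitively.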
private static DebianDetailMetadata? ExtractMetadata(DocumentRecord document)
{
if (document.Metadata is null)
{
return null;
}
if (!document.Metadata.TryGetValue("debian.id", out var id) || string.IsNullOrWhiteSpace(id))
{
return null;
}
if (!document.Metadata.TryGetValue("debian.published", out var publishedRaw)
|| !DateTimeOffset.TryParse(publishedRaw, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal, out var published))
{
published = document.FetchedAt;
}
var title = document.Metadata.TryGetValue("debian.title", out var t) ? t : id;
var package = document.Metadata.TryGetValue("debian.package", out var pkg) && !string.IsNullOrWhiteSpace(pkg)
? pkg
: id;
IReadOnlyList<string> cveList = Array.Empty<string>();
if (document.Metadata.TryGetValue("debian.cves", out var cvesRaw) && !string.IsNullOrWhiteSpace(cvesRaw))
{
cveList = cvesRaw
.Split(' ', StringSplitOptions.TrimEntries | StringSplitOptions.RemoveEmptyEntries)
.Where(static s => !string.IsNullOrWhiteSpace(s))
.Select(static s => s!)
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray();
}
return new DebianDetailMetadata(
id.Trim(),
new Uri(document.Uri, UriKind.Absolute),
published.ToUniversalTime(),
title,
package,
cveList);
}
private static BsonDocument ToBson(DebianAdvisoryDto dto)
{
var packages = new BsonArray();
foreach (var package in dto.Packages)
{
var packageDoc = new BsonDocument
{
["package"] = package.Package,
["release"] = package.Release,
["status"] = package.Status,
};
if (!string.IsNullOrWhiteSpace(package.IntroducedVersion))
{
packageDoc["introduced"] = package.IntroducedVersion;
}
if (!string.IsNullOrWhiteSpace(package.FixedVersion))
{
packageDoc["fixed"] = package.FixedVersion;
}
if (!string.IsNullOrWhiteSpace(package.LastAffectedVersion))
{
packageDoc["last"] = package.LastAffectedVersion;
}
if (package.Published.HasValue)
{
packageDoc["published"] = package.Published.Value.UtcDateTime;
}
packages.Add(packageDoc);
}
var references = new BsonArray(dto.References.Select(reference =>
{
var doc = new BsonDocument
{
["url"] = reference.Url
};
if (!string.IsNullOrWhiteSpace(reference.Kind))
{
doc["kind"] = reference.Kind;
}
if (!string.IsNullOrWhiteSpace(reference.Title))
{
doc["title"] = reference.Title;
}
return doc;
}));
return new BsonDocument
{
["advisoryId"] = dto.AdvisoryId,
["sourcePackage"] = dto.SourcePackage,
["title"] = dto.Title,
["description"] = dto.Description ?? string.Empty,
["cves"] = new BsonArray(dto.CveIds),
["packages"] = packages,
["references"] = references,
};
}
private static DebianAdvisoryDto FromBson(BsonDocument document)
{
var advisoryId = document.GetValue("advisoryId", "").AsString;
var sourcePackage = document.GetValue("sourcePackage", advisoryId).AsString;
var title = document.GetValue("title", advisoryId).AsString;
var description = document.TryGetValue("description", out var desc) ? desc.AsString : null;
var cves = document.TryGetValue("cves", out var cveArray) && cveArray is BsonArray cvesBson
? cvesBson.OfType<BsonValue>()
.Select(static value => value.ToString())
.Where(static s => !string.IsNullOrWhiteSpace(s))
.Select(static s => s!)
.ToArray()
: Array.Empty<string>();
var packages = new List<DebianPackageStateDto>();
if (document.TryGetValue("packages", out var packageArray) && packageArray is BsonArray packagesBson)
{
foreach (var element in packagesBson.OfType<BsonDocument>())
{
packages.Add(new DebianPackageStateDto(
element.GetValue("package", sourcePackage).AsString,
element.GetValue("release", string.Empty).AsString,
element.GetValue("status", "unknown").AsString,
element.TryGetValue("introduced", out var introducedValue) ? introducedValue.AsString : null,
element.TryGetValue("fixed", out var fixedValue) ? fixedValue.AsString : null,
element.TryGetValue("last", out var lastValue) ? lastValue.AsString : null,
element.TryGetValue("published", out var publishedValue)
? publishedValue.BsonType switch
{
BsonType.DateTime => DateTime.SpecifyKind(publishedValue.ToUniversalTime(), DateTimeKind.Utc),
BsonType.String when DateTimeOffset.TryParse(publishedValue.AsString, out var parsed) => parsed.ToUniversalTime(),
_ => (DateTimeOffset?)null,
}
: null));
}
}
var references = new List<DebianReferenceDto>();
if (document.TryGetValue("references", out var referenceArray) && referenceArray is BsonArray refBson)
{
foreach (var element in refBson.OfType<BsonDocument>())
{
references.Add(new DebianReferenceDto(
element.GetValue("url", "").AsString,
element.TryGetValue("kind", out var kind) ? kind.AsString : null,
element.TryGetValue("title", out var titleValue) ? titleValue.AsString : null));
}
}
return new DebianAdvisoryDto(
advisoryId,
sourcePackage,
title,
description,
cves,
packages,
references);
}
}


@@ -0,0 +1,22 @@
using System;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Plugin;
namespace StellaOps.Feedser.Source.Distro.Debian;
public sealed class DebianConnectorPlugin : IConnectorPlugin
{
public const string SourceName = "distro-debian";
public string Name => SourceName;
public bool IsAvailable(IServiceProvider services) => services is not null;
public IFeedConnector Create(IServiceProvider services)
{
ArgumentNullException.ThrowIfNull(services);
return ActivatorUtilities.CreateInstance<DebianConnector>(services);
}
}


@@ -0,0 +1,53 @@
using System;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.DependencyInjection;
using StellaOps.Feedser.Core.Jobs;
using StellaOps.Feedser.Source.Distro.Debian.Configuration;
namespace StellaOps.Feedser.Source.Distro.Debian;
public sealed class DebianDependencyInjectionRoutine : IDependencyInjectionRoutine
{
private const string ConfigurationSection = "feedser:sources:debian";
private const string FetchSchedule = "*/30 * * * *";
private const string ParseSchedule = "7,37 * * * *";
private const string MapSchedule = "12,42 * * * *";
private static readonly TimeSpan FetchTimeout = TimeSpan.FromMinutes(6);
private static readonly TimeSpan ParseTimeout = TimeSpan.FromMinutes(10);
private static readonly TimeSpan MapTimeout = TimeSpan.FromMinutes(10);
private static readonly TimeSpan LeaseDuration = TimeSpan.FromMinutes(5);
public IServiceCollection Register(IServiceCollection services, IConfiguration configuration)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configuration);
services.AddDebianConnector(options =>
{
configuration.GetSection(ConfigurationSection).Bind(options);
options.Validate();
});
var scheduler = new JobSchedulerBuilder(services);
scheduler
.AddJob<DebianFetchJob>(
DebianJobKinds.Fetch,
cronExpression: FetchSchedule,
timeout: FetchTimeout,
leaseDuration: LeaseDuration)
.AddJob<DebianParseJob>(
DebianJobKinds.Parse,
cronExpression: ParseSchedule,
timeout: ParseTimeout,
leaseDuration: LeaseDuration)
.AddJob<DebianMapJob>(
DebianJobKinds.Map,
cronExpression: MapSchedule,
timeout: MapTimeout,
leaseDuration: LeaseDuration);
return services;
}
}


@@ -0,0 +1,37 @@
using System;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Options;
using StellaOps.Feedser.Source.Common.Http;
using StellaOps.Feedser.Source.Distro.Debian.Configuration;
namespace StellaOps.Feedser.Source.Distro.Debian;
public static class DebianServiceCollectionExtensions
{
public static IServiceCollection AddDebianConnector(this IServiceCollection services, Action<DebianOptions> configure)
{
ArgumentNullException.ThrowIfNull(services);
ArgumentNullException.ThrowIfNull(configure);
services.AddOptions<DebianOptions>()
.Configure(configure)
.PostConfigure(static options => options.Validate());
services.AddSourceHttpClient(DebianOptions.HttpClientName, (sp, httpOptions) =>
{
var options = sp.GetRequiredService<IOptions<DebianOptions>>().Value;
httpOptions.BaseAddress = options.DetailBaseUri.GetLeftPart(UriPartial.Authority) is { Length: > 0 } authority
? new Uri(authority, UriKind.Absolute)
: new Uri("https://security-tracker.debian.org/", UriKind.Absolute);
httpOptions.Timeout = options.FetchTimeout;
httpOptions.UserAgent = options.UserAgent;
httpOptions.AllowedHosts.Clear();
httpOptions.AllowedHosts.Add(options.DetailBaseUri.Host);
httpOptions.AllowedHosts.Add(options.ListEndpoint.Host);
httpOptions.DefaultRequestHeaders["Accept"] = "text/html,application/xhtml+xml,text/plain;q=0.9,application/json;q=0.8";
});
services.AddTransient<DebianConnector>();
return services;
}
}


@@ -0,0 +1,27 @@
using System;
using System.Collections.Generic;
namespace StellaOps.Feedser.Source.Distro.Debian.Internal;
internal sealed record DebianAdvisoryDto(
string AdvisoryId,
string SourcePackage,
string? Title,
string? Description,
IReadOnlyList<string> CveIds,
IReadOnlyList<DebianPackageStateDto> Packages,
IReadOnlyList<DebianReferenceDto> References);
internal sealed record DebianPackageStateDto(
string Package,
string Release,
string Status,
string? IntroducedVersion,
string? FixedVersion,
string? LastAffectedVersion,
DateTimeOffset? Published);
internal sealed record DebianReferenceDto(
string Url,
string? Kind,
string? Title);


@@ -0,0 +1,177 @@
using System;
using System.Collections.Generic;
using System.Linq;
using MongoDB.Bson;
namespace StellaOps.Feedser.Source.Distro.Debian.Internal;
internal sealed record DebianCursor(
DateTimeOffset? LastPublished,
IReadOnlyCollection<string> ProcessedAdvisoryIds,
IReadOnlyCollection<Guid> PendingDocuments,
IReadOnlyCollection<Guid> PendingMappings,
IReadOnlyDictionary<string, DebianFetchCacheEntry> FetchCache)
{
private static readonly IReadOnlyCollection<string> EmptyIds = Array.Empty<string>();
private static readonly IReadOnlyCollection<Guid> EmptyGuidList = Array.Empty<Guid>();
private static readonly IReadOnlyDictionary<string, DebianFetchCacheEntry> EmptyCache =
new Dictionary<string, DebianFetchCacheEntry>(StringComparer.OrdinalIgnoreCase);
public static DebianCursor Empty { get; } = new(null, EmptyIds, EmptyGuidList, EmptyGuidList, EmptyCache);
public static DebianCursor FromBson(BsonDocument? document)
{
if (document is null || document.ElementCount == 0)
{
return Empty;
}
DateTimeOffset? lastPublished = null;
if (document.TryGetValue("lastPublished", out var lastValue))
{
lastPublished = lastValue.BsonType switch
{
BsonType.String when DateTimeOffset.TryParse(lastValue.AsString, out var parsed) => parsed.ToUniversalTime(),
BsonType.DateTime => DateTime.SpecifyKind(lastValue.ToUniversalTime(), DateTimeKind.Utc),
_ => null,
};
}
var processed = ReadStringArray(document, "processedIds");
var pendingDocuments = ReadGuidArray(document, "pendingDocuments");
var pendingMappings = ReadGuidArray(document, "pendingMappings");
var cache = ReadCache(document);
return new DebianCursor(lastPublished, processed, pendingDocuments, pendingMappings, cache);
}
public BsonDocument ToBsonDocument()
{
var document = new BsonDocument
{
["pendingDocuments"] = new BsonArray(PendingDocuments.Select(static id => id.ToString())),
["pendingMappings"] = new BsonArray(PendingMappings.Select(static id => id.ToString())),
};
if (LastPublished.HasValue)
{
document["lastPublished"] = LastPublished.Value.UtcDateTime;
}
if (ProcessedAdvisoryIds.Count > 0)
{
document["processedIds"] = new BsonArray(ProcessedAdvisoryIds);
}
if (FetchCache.Count > 0)
{
var cacheDoc = new BsonDocument();
foreach (var (key, entry) in FetchCache)
{
cacheDoc[key] = entry.ToBsonDocument();
}
document["fetchCache"] = cacheDoc;
}
return document;
}
public DebianCursor WithPendingDocuments(IEnumerable<Guid> ids)
=> this with { PendingDocuments = ids?.Distinct().ToArray() ?? EmptyGuidList };
public DebianCursor WithPendingMappings(IEnumerable<Guid> ids)
=> this with { PendingMappings = ids?.Distinct().ToArray() ?? EmptyGuidList };
public DebianCursor WithProcessed(DateTimeOffset published, IEnumerable<string> ids)
=> this with
{
LastPublished = published.ToUniversalTime(),
ProcessedAdvisoryIds = ids?.Where(static id => !string.IsNullOrWhiteSpace(id))
.Select(static id => id.Trim())
.Distinct(StringComparer.OrdinalIgnoreCase)
.ToArray() ?? EmptyIds
};
public DebianCursor WithFetchCache(IDictionary<string, DebianFetchCacheEntry>? cache)
{
if (cache is null || cache.Count == 0)
{
return this with { FetchCache = EmptyCache };
}
return this with { FetchCache = new Dictionary<string, DebianFetchCacheEntry>(cache, StringComparer.OrdinalIgnoreCase) };
}
public bool TryGetCache(string key, out DebianFetchCacheEntry entry)
{
if (FetchCache.Count == 0)
{
entry = DebianFetchCacheEntry.Empty;
return false;
}
return FetchCache.TryGetValue(key, out entry!);
}
private static IReadOnlyCollection<string> ReadStringArray(BsonDocument document, string field)
{
if (!document.TryGetValue(field, out var value) || value is not BsonArray array)
{
return EmptyIds;
}
var list = new List<string>(array.Count);
foreach (var element in array)
{
if (element.BsonType == BsonType.String)
{
var str = element.AsString.Trim();
if (!string.IsNullOrEmpty(str))
{
list.Add(str);
}
}
}
return list;
}
private static IReadOnlyCollection<Guid> ReadGuidArray(BsonDocument document, string field)
{
if (!document.TryGetValue(field, out var value) || value is not BsonArray array)
{
return EmptyGuidList;
}
var list = new List<Guid>(array.Count);
foreach (var element in array)
{
if (Guid.TryParse(element.ToString(), out var guid))
{
list.Add(guid);
}
}
return list;
}
private static IReadOnlyDictionary<string, DebianFetchCacheEntry> ReadCache(BsonDocument document)
{
if (!document.TryGetValue("fetchCache", out var value) || value is not BsonDocument cacheDocument || cacheDocument.ElementCount == 0)
{
return EmptyCache;
}
var cache = new Dictionary<string, DebianFetchCacheEntry>(StringComparer.OrdinalIgnoreCase);
foreach (var element in cacheDocument.Elements)
{
if (element.Value is BsonDocument entry)
{
cache[element.Name] = DebianFetchCacheEntry.FromBson(entry);
}
}
return cache;
}
}
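
Putting `ToBsonDocument` together, a persisted cursor might look like the sketch below. The GUIDs and advisory ids are hypothetical, and the shape of each `fetchCache` entry is an assumption (`DebianFetchCacheEntry` is defined elsewhere in this commit); the ETag/Last-Modified fields are inferred from how cached entries feed conditional fetch requests in the connector.

```json
{
  "pendingDocuments": ["0b5d2c44-9c1a-4f7e-8d3b-2a6f1e9c0d42"],
  "pendingMappings": [],
  "lastPublished": "2025-01-15T00:00:00Z",
  "processedIds": ["DSA-5678-1"],
  "fetchCache": {
    "https://security-tracker.debian.org/tracker/DSA-5678-1": {
      "etag": "\"abc123\"",
      "lastModified": "2025-01-15T00:00:00Z"
    }
  }
}
```

Optional fields (`lastPublished`, `processedIds`, `fetchCache`) are omitted entirely when empty, so an initial cursor contains only the two pending arrays.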

Some files were not shown because too many files have changed in this diff.