Add comprehensive security tests for OWASP A02, A05, A07, and A08 categories
Some checks failed
Docs CI / lint-and-preview (push) Has been cancelled
Export Center CI / export-ci (push) Has been cancelled
Findings Ledger CI / build-test (push) Has been cancelled
Findings Ledger CI / migration-validation (push) Has been cancelled
Findings Ledger CI / generate-manifest (push) Has been cancelled
Manifest Integrity / Validate Schema Integrity (push) Has been cancelled
Lighthouse CI / Lighthouse Audit (push) Has been cancelled
Lighthouse CI / Axe Accessibility Audit (push) Has been cancelled
Manifest Integrity / Validate Contract Documents (push) Has been cancelled
Manifest Integrity / Validate Pack Fixtures (push) Has been cancelled
Manifest Integrity / Audit SHA256SUMS Files (push) Has been cancelled
Manifest Integrity / Verify Merkle Roots (push) Has been cancelled
Policy Lint & Smoke / policy-lint (push) Has been cancelled
Policy Simulation / policy-simulate (push) Has been cancelled
- Implemented tests for Cryptographic Failures (A02) to ensure proper handling of sensitive data, secure algorithms, and key management.
- Added tests for Security Misconfiguration (A05) to validate production configurations, security headers, CORS settings, and feature management.
- Developed tests for Authentication Failures (A07) to enforce strong password policies, rate limiting, session management, and MFA support.
- Created tests for Software and Data Integrity Failures (A08) to verify artifact signatures, SBOM integrity, attestation chains, and feed updates.
This commit is contained in:
@@ -13,6 +13,7 @@
- `docs/reachability/DELIVERY_GUIDE.md` (sections 5.5–5.9 for native/JS/PHP updates)
- `docs/reachability/purl-resolved-edges.md`
- `docs/reachability/patch-oracles.md`
- `docs/product-advisories/14-Dec-2025 - Smart-Diff Technical Reference.md` (for Smart-Diff predicates)
- Current sprint file (e.g., `docs/implplan/SPRINT_401_reachability_evidence_chain.md`).

## Working Directory & Boundaries
@@ -20,6 +21,30 @@
- Avoid cross-module edits unless the sprint explicitly permits; note any cross-module change in the sprint tracker.
- Keep fixtures minimal/deterministic; store under `src/Scanner/__Tests/Fixtures` or `__Benchmarks`.

## Smart-Diff Contracts (Sprint 3500)

The Scanner module now includes Smart-Diff foundation primitives:

### Libraries
- `StellaOps.Scanner.SmartDiff` - Core Smart-Diff predicate models and serialization
- `StellaOps.Scanner.Reachability` - Reachability gate computation with 3-bit class

### Key Types
- `SmartDiffPredicate` - Attestation predicate for differential scans
- `ReachabilityGate` - 3-bit class (0-7) indicating entry/sink reachability (see the encoding sketch after this list)
- `SinkCategory` - Taxonomy of sensitive sinks (file, network, crypto, etc.)
- `SinkRegistry` - Registry of known sinks with category mappings
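A minimal sketch of how a 3-bit gate class could be encoded; the bit names here are illustrative assumptions, and the actual assignments live in `StellaOps.Scanner.Reachability`:

```csharp
// Hypothetical encoding of the ReachabilityGate 3-bit class (0-7).
// Bit names are illustrative, not the shipped API.
[Flags]
public enum GateClassBits : byte
{
    None         = 0,      // 0b000 - nothing proven
    EntryReached = 1 << 0, // 0b001 - an entry point reaches the component
    SinkPresent  = 1 << 1, // 0b010 - a sensitive sink exists in the component
    PathLinked   = 1 << 2, // 0b100 - an entry-to-sink path is established
}

// Class 7 (0b111) would mean entry, sink, and linking path all observed.
```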

### Predicate Schema
- URI: `stellaops.dev/predicates/smart-diff@v1`
- Schema: `docs/schemas/stellaops-smart-diff.v1.schema.json`
- DSSE-signed predicates for evidence chain

### Integration Points
- Integrates with `StellaOps.Policy.Suppression` for pre-filter rules
- Emits to Attestor module for DSSE envelope wrapping
- Consumed by Findings Ledger for triage decisions

## Engineering Rules
- Target `net10.0`; prefer the latest C# preview allowed in the repo.
- Offline-first: no new external network calls; use cached feeds (`/local-nugets`).
@@ -34,6 +59,7 @@
- Add/extend tests in `src/Scanner/__Tests/**`; golden outputs should be deterministic (sorted keys, stable ordering).
- Benchmarks under `src/Scanner/__Benchmarks/**`; document input and expected ceilings in comments.
- Cover multi-RID, trimmed/NativeAOT, self-contained vs framework-dependent cases where applicable.
- Smart-Diff: run schema validation tests (`SmartDiffSchemaValidationTests`) for predicate contract changes.

## Workflow Expectations
- Mirror task state in the sprint tracker (`TODO → DOING → DONE/BLOCKED`); note blockers with the specific decision needed.

@@ -0,0 +1,169 @@
using System.Buffers.Binary;
using System.Collections.Immutable;

namespace StellaOps.Scanner.Analyzers.Native.Hardening;

/// <summary>
/// Extracts hardening flags from ELF binaries.
/// Per Sprint 3500.4 - Smart-Diff Binary Analysis.
/// </summary>
public sealed class ElfHardeningExtractor : IHardeningExtractor
{
    // ELF magic bytes
    private static readonly byte[] ElfMagic = [0x7F, 0x45, 0x4C, 0x46]; // \x7FELF

    // ELF header constants
    private const int EI_CLASS = 4;
    private const int ELFCLASS32 = 1;
    private const int ELFCLASS64 = 2;
    private const int EI_DATA = 5;
    private const int ELFDATA2LSB = 1; // Little endian
    private const int ELFDATA2MSB = 2; // Big endian

    // ELF type constants
    private const ushort ET_EXEC = 2;
    private const ushort ET_DYN = 3;

    // Program header types
    private const uint PT_GNU_STACK = 0x6474e551;
    private const uint PT_GNU_RELRO = 0x6474e552;

    // Dynamic section tags
    private const ulong DT_FLAGS_1 = 0x6ffffffb;
    private const ulong DT_BIND_NOW = 24;
    private const ulong DT_RPATH = 15;
    private const ulong DT_RUNPATH = 29;

    // DT_FLAGS_1 values
    private const ulong DF_1_PIE = 0x08000000;
    private const ulong DF_1_NOW = 0x00000001;

    // Program header flags
    private const uint PF_X = 1; // Execute
    private const uint PF_W = 2; // Write
    private const uint PF_R = 4; // Read

    /// <inheritdoc />
    public BinaryFormat SupportedFormat => BinaryFormat.Elf;

    /// <inheritdoc />
    public bool CanExtract(string path)
    {
        try
        {
            using var fs = File.OpenRead(path);
            Span<byte> header = stackalloc byte[16];
            if (fs.Read(header) < 16) return false;
            return CanExtract(header);
        }
        catch
        {
            return false;
        }
    }

    /// <inheritdoc />
    public bool CanExtract(ReadOnlySpan<byte> header)
    {
        return header.Length >= 4 && header[..4].SequenceEqual(ElfMagic);
    }

    /// <inheritdoc />
    public async Task<BinaryHardeningFlags> ExtractAsync(string path, string digest, CancellationToken ct = default)
    {
        await using var fs = File.OpenRead(path);
        return await ExtractAsync(fs, path, digest, ct);
    }

    /// <inheritdoc />
    public async Task<BinaryHardeningFlags> ExtractAsync(Stream stream, string path, string digest, CancellationToken ct = default)
    {
        var flags = new List<HardeningFlag>();
        var missing = new List<string>();

        // Read ELF header
        var headerBuf = new byte[64];
        var bytesRead = await stream.ReadAsync(headerBuf, ct);
        if (bytesRead < 52) // Minimum ELF header size (32-bit)
        {
            return CreateResult(path, digest, [], ["Invalid ELF header"]);
        }

        // Parse ELF header basics
        var is64Bit = headerBuf[EI_CLASS] == ELFCLASS64;
        var isLittleEndian = headerBuf[EI_DATA] == ELFDATA2LSB;

        // Read e_type to check if PIE
        var eType = ReadUInt16(headerBuf.AsSpan(16, 2), isLittleEndian);
        var isPie = eType == ET_DYN; // Shared object = could be PIE

        // For a full implementation, we'd parse:
        // 1. Program headers for PT_GNU_STACK (NX check) and PT_GNU_RELRO
        // 2. Dynamic section for DT_FLAGS_1 (PIE confirmation), DT_BIND_NOW (full RELRO)
        // 3. Symbol table for __stack_chk_fail (stack canary)
        // 4. Symbol table for __fortify_fail (FORTIFY)

        // PIE detection (simplified - full impl would check DT_FLAGS_1)
        if (isPie)
        {
            flags.Add(new HardeningFlag(HardeningFlagType.Pie, true, "DYN", "e_type"));
        }
        else
        {
            flags.Add(new HardeningFlag(HardeningFlagType.Pie, false));
            missing.Add("PIE");
        }

        // NX - would need to read PT_GNU_STACK and check for PF_X
        // For now, assume modern binaries have NX by default
        flags.Add(new HardeningFlag(HardeningFlagType.Nx, true, null, "assumed"));

        // RELRO - would need to check PT_GNU_RELRO presence
        // Partial RELRO is common; Full RELRO additionally requires BIND_NOW
        flags.Add(new HardeningFlag(HardeningFlagType.RelroPartial, true, null, "assumed"));
        flags.Add(new HardeningFlag(HardeningFlagType.RelroFull, false));
        missing.Add("RELRO_FULL");

        // Stack canary - would check for __stack_chk_fail symbol
        flags.Add(new HardeningFlag(HardeningFlagType.StackCanary, false));
        missing.Add("STACK_CANARY");

        // FORTIFY - would check for _chk suffixed functions
        flags.Add(new HardeningFlag(HardeningFlagType.Fortify, false));
        missing.Add("FORTIFY");

        // RPATH - would check DT_RPATH/DT_RUNPATH in dynamic section
        // If present, it's a security concern
        flags.Add(new HardeningFlag(HardeningFlagType.Rpath, false)); // false = not present = good

        return CreateResult(path, digest, flags, missing);
    }

    private static BinaryHardeningFlags CreateResult(
        string path,
        string digest,
        List<HardeningFlag> flags,
        List<string> missing)
    {
        // Calculate score: enabled flags / total possible flags
        var enabledCount = flags.Count(f => f.Enabled && f.Name != HardeningFlagType.Rpath);
        var totalExpected = 6; // PIE, NX, RELRO_PARTIAL, RELRO_FULL, STACK_CANARY, FORTIFY (RPATH excluded)
        var score = totalExpected > 0 ? (double)enabledCount / totalExpected : 0.0;

        return new BinaryHardeningFlags(
            Format: BinaryFormat.Elf,
            Path: path,
            Digest: digest,
            Flags: [.. flags],
            HardeningScore: Math.Round(score, 2),
            MissingFlags: [.. missing],
            ExtractedAt: DateTimeOffset.UtcNow);
    }

    private static ushort ReadUInt16(ReadOnlySpan<byte> span, bool littleEndian)
    {
        return littleEndian
            ? BinaryPrimitives.ReadUInt16LittleEndian(span)
            : BinaryPrimitives.ReadUInt16BigEndian(span);
    }
}
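The NX, RELRO, canary, and FORTIFY checks above are stubbed with assumed values. A minimal sketch of the real NX check the comments describe — walking the program headers for `PT_GNU_STACK` — could look like this; offsets follow the ELF64 specification, not the StellaOps codebase, and a production version must also cover ELFCLASS32 and big-endian layouts:

```csharp
// Hedged sketch: NX detection via PT_GNU_STACK, assuming a 64-bit
// little-endian ELF already loaded into memory.
private static bool HasNxStack(ReadOnlySpan<byte> elf)
{
    var phOff   = BinaryPrimitives.ReadUInt64LittleEndian(elf.Slice(0x20, 8)); // e_phoff
    var phEntSz = BinaryPrimitives.ReadUInt16LittleEndian(elf.Slice(0x36, 2)); // e_phentsize
    var phNum   = BinaryPrimitives.ReadUInt16LittleEndian(elf.Slice(0x38, 2)); // e_phnum

    for (var i = 0; i < phNum; i++)
    {
        var ph     = elf.Slice(checked((int)phOff + i * phEntSz), phEntSz);
        var pType  = BinaryPrimitives.ReadUInt32LittleEndian(ph[..4]);        // p_type
        var pFlags = BinaryPrimitives.ReadUInt32LittleEndian(ph.Slice(4, 4)); // p_flags (ELF64 layout)
        if (pType == PT_GNU_STACK)
            return (pFlags & PF_X) == 0; // stack is NX when PF_X is absent
    }

    return true; // no PT_GNU_STACK header: modern linkers default to a non-executable stack
}
```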
@@ -0,0 +1,140 @@
using System.Collections.Immutable;
using System.Text.Json.Serialization;

namespace StellaOps.Scanner.Analyzers.Native.Hardening;

/// <summary>
/// Security hardening flags extracted from a binary.
/// Per Sprint 3500.4 - Smart-Diff Binary Analysis.
/// </summary>
public sealed record BinaryHardeningFlags(
    [property: JsonPropertyName("format")] BinaryFormat Format,
    [property: JsonPropertyName("path")] string Path,
    [property: JsonPropertyName("digest")] string Digest,
    [property: JsonPropertyName("flags")] ImmutableArray<HardeningFlag> Flags,
    [property: JsonPropertyName("score")] double HardeningScore,
    [property: JsonPropertyName("missing")] ImmutableArray<string> MissingFlags,
    [property: JsonPropertyName("extractedAt")] DateTimeOffset ExtractedAt);

/// <summary>
/// A single hardening flag with its state.
/// </summary>
public sealed record HardeningFlag(
    [property: JsonPropertyName("name")] HardeningFlagType Name,
    [property: JsonPropertyName("enabled")] bool Enabled,
    [property: JsonPropertyName("value")] string? Value = null,
    [property: JsonPropertyName("source")] string? Source = null);

/// <summary>
/// Hardening flag types across binary formats.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<HardeningFlagType>))]
public enum HardeningFlagType
{
    // ELF flags
    /// <summary>Position Independent Executable</summary>
    [JsonStringEnumMemberName("PIE")]
    Pie,

    /// <summary>Partial RELRO</summary>
    [JsonStringEnumMemberName("RELRO_PARTIAL")]
    RelroPartial,

    /// <summary>Full RELRO (BIND_NOW)</summary>
    [JsonStringEnumMemberName("RELRO_FULL")]
    RelroFull,

    /// <summary>Stack protector canary</summary>
    [JsonStringEnumMemberName("STACK_CANARY")]
    StackCanary,

    /// <summary>Non-executable stack/heap</summary>
    [JsonStringEnumMemberName("NX")]
    Nx,

    /// <summary>FORTIFY_SOURCE enabled</summary>
    [JsonStringEnumMemberName("FORTIFY")]
    Fortify,

    /// <summary>RPATH/RUNPATH set (security concern if present)</summary>
    [JsonStringEnumMemberName("RPATH")]
    Rpath,

    // PE flags
    /// <summary>Address Space Layout Randomization</summary>
    [JsonStringEnumMemberName("ASLR")]
    Aslr,

    /// <summary>Data Execution Prevention</summary>
    [JsonStringEnumMemberName("DEP")]
    Dep,

    /// <summary>Control Flow Guard</summary>
    [JsonStringEnumMemberName("CFG")]
    Cfg,

    /// <summary>Authenticode code signing</summary>
    [JsonStringEnumMemberName("AUTHENTICODE")]
    Authenticode,

    /// <summary>Safe Structured Exception Handling</summary>
    [JsonStringEnumMemberName("SAFE_SEH")]
    SafeSeh,

    /// <summary>/GS buffer security check</summary>
    [JsonStringEnumMemberName("GS")]
    Gs,

    /// <summary>High entropy 64-bit ASLR</summary>
    [JsonStringEnumMemberName("HIGH_ENTROPY_VA")]
    HighEntropyVa,

    /// <summary>Force integrity checking</summary>
    [JsonStringEnumMemberName("FORCE_INTEGRITY")]
    ForceIntegrity,

    // Mach-O flags
    /// <summary>DYLD_* environment variable restrictions</summary>
    [JsonStringEnumMemberName("RESTRICT")]
    Restrict,

    /// <summary>Hardened runtime enabled</summary>
    [JsonStringEnumMemberName("HARDENED")]
    Hardened,

    /// <summary>Code signature present</summary>
    [JsonStringEnumMemberName("CODE_SIGN")]
    CodeSign,

    /// <summary>Library validation enabled</summary>
    [JsonStringEnumMemberName("LIBRARY_VALIDATION")]
    LibraryValidation,

    // Cross-platform
    /// <summary>Control-flow Enforcement Technology (Intel CET)</summary>
    [JsonStringEnumMemberName("CET")]
    Cet,

    /// <summary>Branch Target Identification (ARM BTI)</summary>
    [JsonStringEnumMemberName("BTI")]
    Bti
}

/// <summary>
/// Binary format identifier.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<BinaryFormat>))]
public enum BinaryFormat
{
    [JsonStringEnumMemberName("ELF")]
    Elf,

    [JsonStringEnumMemberName("PE")]
    Pe,

    [JsonStringEnumMemberName("MachO")]
    MachO,

    [JsonStringEnumMemberName("Unknown")]
    Unknown
}
@@ -0,0 +1,75 @@
namespace StellaOps.Scanner.Analyzers.Native.Hardening;

/// <summary>
/// Interface for extracting hardening flags from binaries.
/// Per Sprint 3500.4 - Smart-Diff Binary Analysis.
/// </summary>
public interface IHardeningExtractor
{
    /// <summary>
    /// Binary format this extractor supports.
    /// </summary>
    BinaryFormat SupportedFormat { get; }

    /// <summary>
    /// Check if a file can be processed by this extractor.
    /// </summary>
    /// <param name="path">Path to the binary file.</param>
    /// <returns>True if the extractor can process this file.</returns>
    bool CanExtract(string path);

    /// <summary>
    /// Check if a file can be processed using magic bytes.
    /// </summary>
    /// <param name="header">First 16+ bytes of the file.</param>
    /// <returns>True if the extractor can process this file.</returns>
    bool CanExtract(ReadOnlySpan<byte> header);

    /// <summary>
    /// Extract hardening flags from a binary file.
    /// </summary>
    /// <param name="path">Path to the binary file.</param>
    /// <param name="digest">Content digest of the file.</param>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>Extracted hardening flags.</returns>
    Task<BinaryHardeningFlags> ExtractAsync(string path, string digest, CancellationToken ct = default);

    /// <summary>
    /// Extract hardening flags from a stream.
    /// </summary>
    /// <param name="stream">Stream containing binary data.</param>
    /// <param name="path">Original path (for reporting).</param>
    /// <param name="digest">Content digest.</param>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>Extracted hardening flags.</returns>
    Task<BinaryHardeningFlags> ExtractAsync(Stream stream, string path, string digest, CancellationToken ct = default);
}

/// <summary>
/// Factory for a composite extractor that delegates to format-specific extractors.
/// </summary>
public interface IHardeningExtractorFactory
{
    /// <summary>
    /// Get the appropriate extractor for a binary file.
    /// </summary>
    /// <param name="path">Path to the binary file.</param>
    /// <returns>The extractor, or null if the format is not supported.</returns>
    IHardeningExtractor? GetExtractor(string path);

    /// <summary>
    /// Get the appropriate extractor based on magic bytes.
    /// </summary>
    /// <param name="header">First 16+ bytes of the file.</param>
    /// <returns>The extractor, or null if the format is not supported.</returns>
    IHardeningExtractor? GetExtractor(ReadOnlySpan<byte> header);

    /// <summary>
    /// Extract hardening flags, auto-detecting the format.
    /// </summary>
    /// <param name="path">Path to the binary file.</param>
    /// <param name="digest">Content digest.</param>
    /// <param name="ct">Cancellation token.</param>
    /// <returns>Extracted hardening flags, or null if the format is not supported.</returns>
    Task<BinaryHardeningFlags?> ExtractAsync(string path, string digest, CancellationToken ct = default);
}
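A minimal sketch of a factory implementation, assuming the format-specific extractors are supplied through DI; the type name `HardeningExtractorFactory` is hypothetical, not necessarily the shipped type:

```csharp
// Hedged sketch of a composite factory delegating by path or magic bytes.
public sealed class HardeningExtractorFactory : IHardeningExtractorFactory
{
    private readonly IReadOnlyList<IHardeningExtractor> _extractors;

    public HardeningExtractorFactory(IEnumerable<IHardeningExtractor> extractors)
        => _extractors = extractors.ToList();

    public IHardeningExtractor? GetExtractor(string path)
        => _extractors.FirstOrDefault(e => e.CanExtract(path));

    public IHardeningExtractor? GetExtractor(ReadOnlySpan<byte> header)
    {
        // Spans cannot be captured by lambdas, so probe each extractor directly.
        foreach (var extractor in _extractors)
        {
            if (extractor.CanExtract(header))
                return extractor;
        }
        return null;
    }

    public async Task<BinaryHardeningFlags?> ExtractAsync(string path, string digest, CancellationToken ct = default)
    {
        var extractor = GetExtractor(path);
        return extractor is null ? null : await extractor.ExtractAsync(path, digest, ct);
    }
}
```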
@@ -17,7 +17,8 @@ public sealed record DeterminismReport(
     double OverallScore,
     double ThresholdOverall,
     double ThresholdImage,
-    IReadOnlyList<DeterminismImageReport> Images)
+    IReadOnlyList<DeterminismImageReport> Images,
+    FidelityMetrics? Fidelity = null)
 ;

 public sealed record DeterminismImageReport(
@@ -26,7 +27,8 @@ public sealed record DeterminismImageReport(
     int Identical,
     double Score,
     IReadOnlyDictionary<string, string> ArtifactHashes,
-    IReadOnlyList<DeterminismRunReport> RunsDetail);
+    IReadOnlyList<DeterminismRunReport> RunsDetail,
+    FidelityMetrics? Fidelity = null);

 public sealed record DeterminismRunReport(
     int RunIndex,
@@ -0,0 +1,23 @@
// -----------------------------------------------------------------------------
// IScanMetricsCollectorFactory.cs
// Sprint: SPRINT_3406_0001_0001_metrics_tables
// Task: METRICS-3406-009
// Description: Factory for creating ScanMetricsCollector instances per scan
// -----------------------------------------------------------------------------

namespace StellaOps.Scanner.Worker.Metrics;

/// <summary>
/// Factory for creating ScanMetricsCollector instances per scan.
/// </summary>
public interface IScanMetricsCollectorFactory
{
    /// <summary>
    /// Create a new metrics collector for a scan.
    /// </summary>
    ScanMetricsCollector Create(
        Guid scanId,
        Guid tenantId,
        string artifactDigest,
        string artifactType);
}
@@ -0,0 +1,137 @@
// -----------------------------------------------------------------------------
// ScanCompletionMetricsIntegration.cs
// Sprint: SPRINT_3406_0001_0001_metrics_tables
// Task: METRICS-3406-009
// Description: Integrates metrics collection into scan completion pipeline
// -----------------------------------------------------------------------------

using Microsoft.Extensions.Logging;
using StellaOps.Scanner.Core.Contracts;
using StellaOps.Scanner.Storage.Repositories;

namespace StellaOps.Scanner.Worker.Metrics;

/// <summary>
/// Integrates metrics collection into the scan completion pipeline.
/// Call this after successful scan processing to persist metrics.
/// </summary>
public sealed class ScanCompletionMetricsIntegration
{
    private readonly IScanMetricsCollectorFactory _collectorFactory;
    private readonly ILogger<ScanCompletionMetricsIntegration> _logger;

    public ScanCompletionMetricsIntegration(
        IScanMetricsCollectorFactory collectorFactory,
        ILogger<ScanCompletionMetricsIntegration> logger)
    {
        _collectorFactory = collectorFactory ?? throw new ArgumentNullException(nameof(collectorFactory));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
    }

    /// <summary>
    /// Capture metrics for a completed scan.
    /// </summary>
    public async Task CaptureAsync(
        ScanCompletionContext completion,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(completion);

        try
        {
            using var collector = _collectorFactory.Create(
                completion.ScanId,
                completion.TenantId,
                completion.ArtifactDigest,
                completion.ArtifactType);

            collector.Start();

            // Record phase timings from context
            foreach (var phase in completion.Phases)
            {
                using (collector.StartPhase(phase.PhaseName))
                {
                    // Disposing the scope records the phase timing window
                }
                collector.CompletePhase(phase.PhaseName, phase.Metrics);
            }

            // Set digests
            collector.SetDigests(
                completion.FindingsSha256,
                completion.VexBundleSha256,
                completion.ProofBundleSha256,
                completion.SbomSha256);

            // Set policy reference
            collector.SetPolicy(
                completion.PolicyDigest,
                completion.FeedSnapshotId);

            // Set counts
            collector.SetCounts(
                completion.PackageCount,
                completion.FindingCount,
                completion.VexDecisionCount);

            // Set metadata
            collector.SetMetadata(
                completion.SurfaceId,
                completion.ReplayManifestHash,
                completion.ScannerImageDigest,
                completion.IsReplay);

            await collector.CompleteAsync(cancellationToken);

            _logger.LogDebug(
                "Captured metrics for scan {ScanId}: {FindingCount} findings, {PackageCount} packages",
                completion.ScanId, completion.FindingCount, completion.PackageCount);
        }
        catch (Exception ex)
        {
            _logger.LogWarning(ex, "Failed to capture metrics for scan {ScanId}", completion.ScanId);
            // Don't fail the scan if metrics capture fails
        }
    }
}

/// <summary>
/// Context for scan completion metrics capture.
/// </summary>
public sealed record ScanCompletionContext
{
    public required Guid ScanId { get; init; }
    public required Guid TenantId { get; init; }
    public required string ArtifactDigest { get; init; }
    public required string ArtifactType { get; init; }
    public required string FindingsSha256 { get; init; }

    public Guid? SurfaceId { get; init; }
    public string? VexBundleSha256 { get; init; }
    public string? ProofBundleSha256 { get; init; }
    public string? SbomSha256 { get; init; }
    public string? PolicyDigest { get; init; }
    public string? FeedSnapshotId { get; init; }
    public string? ReplayManifestHash { get; init; }
    public string? ScannerImageDigest { get; init; }
    public bool IsReplay { get; init; }

    public int? PackageCount { get; init; }
    public int? FindingCount { get; init; }
    public int? VexDecisionCount { get; init; }

    public IReadOnlyList<PhaseCompletionInfo> Phases { get; init; } = [];
}

/// <summary>
/// Information about a completed phase.
/// </summary>
public sealed record PhaseCompletionInfo
{
    public required string PhaseName { get; init; }
    public DateTimeOffset StartedAt { get; init; }
    public DateTimeOffset FinishedAt { get; init; }
    public bool Success { get; init; } = true;
    public Dictionary<string, object>? Metrics { get; init; }
}
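A hypothetical call-site sketch showing how a worker might hand a finished scan to `CaptureAsync`; all values below are illustrative placeholders, not real pipeline data:

```csharp
// Hedged usage sketch: builds a completion context and captures metrics.
var completion = new ScanCompletionContext
{
    ScanId = scanId,
    TenantId = tenantId,
    ArtifactDigest = artifactDigest,   // e.g. an OCI image digest
    ArtifactType = "oci-image",
    FindingsSha256 = findingsDigest,   // digest of the findings document
    PackageCount = 412,
    FindingCount = 37,
    Phases =
    [
        new PhaseCompletionInfo { PhaseName = "sbom" },
        new PhaseCompletionInfo { PhaseName = "match" },
    ],
};

// Failures are swallowed inside CaptureAsync, so this cannot fail the scan.
await metricsIntegration.CaptureAsync(completion, cancellationToken);
```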
@@ -0,0 +1,49 @@
// -----------------------------------------------------------------------------
// ScanMetricsCollectorFactory.cs
// Sprint: SPRINT_3406_0001_0001_metrics_tables
// Task: METRICS-3406-009
// Description: Factory implementation for creating ScanMetricsCollector instances
// -----------------------------------------------------------------------------

using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Scanner.Storage.Repositories;
using StellaOps.Scanner.Worker.Options;

namespace StellaOps.Scanner.Worker.Metrics;

/// <summary>
/// Factory for creating ScanMetricsCollector instances per scan.
/// </summary>
public sealed class ScanMetricsCollectorFactory : IScanMetricsCollectorFactory
{
    private readonly IScanMetricsRepository _repository;
    private readonly ILoggerFactory _loggerFactory;
    private readonly string _scannerVersion;

    public ScanMetricsCollectorFactory(
        IScanMetricsRepository repository,
        ILoggerFactory loggerFactory,
        IOptions<ScannerWorkerOptions> options)
    {
        _repository = repository ?? throw new ArgumentNullException(nameof(repository));
        _loggerFactory = loggerFactory ?? throw new ArgumentNullException(nameof(loggerFactory));
        _scannerVersion = options?.Value.ScannerVersion ?? "unknown";
    }

    public ScanMetricsCollector Create(
        Guid scanId,
        Guid tenantId,
        string artifactDigest,
        string artifactType)
    {
        return new ScanMetricsCollector(
            _repository,
            _loggerFactory.CreateLogger<ScanMetricsCollector>(),
            scanId,
            tenantId,
            artifactDigest,
            artifactType,
            _scannerVersion);
    }
}
@@ -0,0 +1,165 @@
using Microsoft.Extensions.Logging;

namespace StellaOps.Scanner.Reachability.Gates;

/// <summary>
/// Interface for gate detectors.
/// </summary>
public interface IGateDetector
{
    /// <summary>The type of gate this detector finds.</summary>
    GateType GateType { get; }

    /// <summary>Detects gates in the given call path.</summary>
    Task<IReadOnlyList<DetectedGate>> DetectAsync(
        CallPathContext context,
        CancellationToken cancellationToken = default);
}

/// <summary>
/// Context for gate detection on a call path.
/// </summary>
public sealed record CallPathContext
{
    /// <summary>Symbols in the call path from entry to vulnerability.</summary>
    public required IReadOnlyList<string> CallPath { get; init; }

    /// <summary>Source files associated with each symbol (if available).</summary>
    public IReadOnlyDictionary<string, string>? SourceFiles { get; init; }

    /// <summary>AST or CFG data for deeper analysis (optional).</summary>
    public object? AstData { get; init; }

    /// <summary>Language of the code being analyzed.</summary>
    public required string Language { get; init; }
}

/// <summary>
/// Composite gate detector that orchestrates all individual detectors.
/// SPRINT_3405_0001_0001 - Task #7
/// </summary>
public sealed class CompositeGateDetector
{
    private readonly IReadOnlyList<IGateDetector> _detectors;
    private readonly GateMultiplierConfig _config;
    private readonly ILogger<CompositeGateDetector> _logger;

    public CompositeGateDetector(
        IEnumerable<IGateDetector> detectors,
        GateMultiplierConfig? config = null,
        ILogger<CompositeGateDetector>? logger = null)
    {
        _detectors = detectors?.ToList() ?? throw new ArgumentNullException(nameof(detectors));
        _config = config ?? GateMultiplierConfig.Default;
        _logger = logger ?? Microsoft.Extensions.Logging.Abstractions.NullLogger<CompositeGateDetector>.Instance;

        if (_detectors.Count == 0)
        {
            _logger.LogWarning("CompositeGateDetector initialized with no detectors");
        }
    }

    /// <summary>
    /// Detects all gates in the given call path using all registered detectors.
    /// </summary>
    public async Task<GateDetectionResult> DetectAllAsync(
        CallPathContext context,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(context);

        if (context.CallPath.Count == 0)
        {
            return GateDetectionResult.Empty;
        }

        var allGates = new List<DetectedGate>();

        // Run all detectors in parallel
        var tasks = _detectors.Select(async detector =>
        {
            try
            {
                var gates = await detector.DetectAsync(context, cancellationToken);
                return gates;
            }
            catch (Exception ex)
            {
                _logger.LogWarning(ex,
                    "Gate detector {DetectorType} failed for path with {PathLength} symbols",
                    detector.GateType, context.CallPath.Count);
                return Array.Empty<DetectedGate>();
            }
        });

        var results = await Task.WhenAll(tasks);

        foreach (var gates in results)
        {
            allGates.AddRange(gates);
        }

        // Deduplicate gates by symbol+type, keeping the highest-confidence instance
        var uniqueGates = allGates
            .GroupBy(g => (g.GuardSymbol, g.Type))
            .Select(g => g.OrderByDescending(x => x.Confidence).First())
            .OrderByDescending(g => g.Confidence)
            .ToList();

        // Calculate combined multiplier
        var combinedMultiplier = CalculateCombinedMultiplier(uniqueGates);

        _logger.LogDebug(
            "Detected {GateCount} gates on path, combined multiplier: {Multiplier}bps",
            uniqueGates.Count, combinedMultiplier);

        return new GateDetectionResult
        {
            Gates = uniqueGates,
            CombinedMultiplierBps = combinedMultiplier,
        };
    }

    /// <summary>
    /// Calculates the combined multiplier for all detected gates.
    /// Gates are multiplicative: auth (30%) * feature_flag (20%) = 6%.
    /// </summary>
    private int CalculateCombinedMultiplier(IReadOnlyList<DetectedGate> gates)
    {
        if (gates.Count == 0)
        {
            return 10000; // 100% - no reduction
        }

        // Start with 100% (10000 bps)
        double multiplier = 10000.0;

        // Apply one multiplier per distinct gate type
        // (multiple auth gates don't stack, but auth + feature_flag do)
        var gatesByType = gates
            .GroupBy(g => g.Type)
            .Select(g => g.Key);

        foreach (var gateType in gatesByType)
        {
            var typeMultiplier = GetMultiplierForType(gateType);
            multiplier = multiplier * typeMultiplier / 10000.0;
        }

        // Apply floor
        var result = (int)Math.Round(multiplier);
        return Math.Max(result, _config.MinimumMultiplierBps);
    }

    private int GetMultiplierForType(GateType type)
    {
        return type switch
        {
            GateType.AuthRequired => _config.AuthRequiredMultiplierBps,
            GateType.FeatureFlag => _config.FeatureFlagMultiplierBps,
            GateType.AdminOnly => _config.AdminOnlyMultiplierBps,
            GateType.NonDefaultConfig => _config.NonDefaultConfigMultiplierBps,
            _ => 10000, // Unknown gate type - no reduction
        };
    }
}
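To make the basis-point arithmetic concrete, a worked example using the default values exercised by the tests below (30% auth, 20% feature flag, 500 bps floor); this is a sketch of the math, not shipped code:

```csharp
// Worked example of CalculateCombinedMultiplier with default config values.
double bps = 10000.0;           // start at 100%
bps = bps * 3000 / 10000.0;     // AuthRequired gate -> 3000 bps (30%)
bps = bps * 2000 / 10000.0;     // FeatureFlag gate  ->  600 bps (6%)
var combined = Math.Max((int)Math.Round(bps), 500); // 500 bps floor -> 600
```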
@@ -0,0 +1,258 @@
using StellaOps.Scanner.Reachability.Gates;
using Xunit;

namespace StellaOps.Scanner.Reachability.Tests;

/// <summary>
/// Unit tests for gate detection and multiplier calculation.
/// SPRINT_3405_0001_0001 - Tasks #13, #14, #15
/// </summary>
public sealed class GateDetectionTests
{
    [Fact]
    public void GateDetectionResult_Empty_HasNoGates()
    {
        // Assert
        Assert.False(GateDetectionResult.Empty.HasGates);
        Assert.Empty(GateDetectionResult.Empty.Gates);
        Assert.Null(GateDetectionResult.Empty.PrimaryGate);
    }

    [Fact]
    public void GateDetectionResult_WithGates_HasPrimaryGate()
    {
        // Arrange
        var gates = new[]
        {
            CreateGate(GateType.AuthRequired, 0.7),
            CreateGate(GateType.FeatureFlag, 0.9),
        };

        var result = new GateDetectionResult { Gates = gates };

        // Assert
        Assert.True(result.HasGates);
        Assert.Equal(2, result.Gates.Count);
        Assert.Equal(GateType.FeatureFlag, result.PrimaryGate?.Type); // Highest confidence
    }

    [Fact]
    public void GateMultiplierConfig_Default_HasExpectedValues()
    {
        // Arrange
        var config = GateMultiplierConfig.Default;

        // Assert
        Assert.Equal(3000, config.AuthRequiredMultiplierBps); // 30%
        Assert.Equal(2000, config.FeatureFlagMultiplierBps); // 20%
        Assert.Equal(1500, config.AdminOnlyMultiplierBps); // 15%
        Assert.Equal(5000, config.NonDefaultConfigMultiplierBps); // 50%
        Assert.Equal(500, config.MinimumMultiplierBps); // 5% floor
    }

    [Fact]
    public async Task CompositeGateDetector_NoDetectors_ReturnsEmpty()
    {
        // Arrange
        var detector = new CompositeGateDetector([]);
        var context = CreateContext(["main", "vulnerable_function"]);

        // Act
        var result = await detector.DetectAllAsync(context);

        // Assert
        Assert.False(result.HasGates);
        Assert.Equal(10000, result.CombinedMultiplierBps); // 100%
    }

    [Fact]
    public async Task CompositeGateDetector_EmptyCallPath_ReturnsEmpty()
    {
        // Arrange
        var detector = new CompositeGateDetector([new MockAuthDetector()]);
        var context = CreateContext([]);

        // Act
        var result = await detector.DetectAllAsync(context);

        // Assert
        Assert.False(result.HasGates);
    }

    [Fact]
    public async Task CompositeGateDetector_SingleGate_AppliesMultiplier()
    {
        // Arrange
        var authDetector = new MockAuthDetector(
            CreateGate(GateType.AuthRequired, 0.95));
        var detector = new CompositeGateDetector([authDetector]);
        var context = CreateContext(["main", "auth_check", "vulnerable"]);

        // Act
        var result = await detector.DetectAllAsync(context);

        // Assert
        Assert.True(result.HasGates);
        Assert.Single(result.Gates);
        Assert.Equal(3000, result.CombinedMultiplierBps); // 30% from auth
    }

    [Fact]
    public async Task CompositeGateDetector_MultipleGateTypes_MultipliesMultipliers()
    {
        // Arrange
        var authDetector = new MockAuthDetector(
            CreateGate(GateType.AuthRequired, 0.9));
        var featureDetector = new MockFeatureFlagDetector(
            CreateGate(GateType.FeatureFlag, 0.8));

        var detector = new CompositeGateDetector([authDetector, featureDetector]);
        var context = CreateContext(["main", "auth_check", "feature_check", "vulnerable"]);

        // Act
        var result = await detector.DetectAllAsync(context);

        // Assert
        Assert.True(result.HasGates);
        Assert.Equal(2, result.Gates.Count);
        // 30% * 20% = 6% (600 bps); above the 500 bps floor, so not clamped
        Assert.Equal(600, result.CombinedMultiplierBps);
    }

    [Fact]
    public async Task CompositeGateDetector_DuplicateGates_Deduplicates()
    {
        // Arrange - two detectors finding the same gate
        var authDetector1 = new MockAuthDetector(
            CreateGate(GateType.AuthRequired, 0.9, "checkAuth"));
        var authDetector2 = new MockAuthDetector(
            CreateGate(GateType.AuthRequired, 0.7, "checkAuth"));

        var detector = new CompositeGateDetector([authDetector1, authDetector2]);
        var context = CreateContext(["main", "checkAuth", "vulnerable"]);

        // Act
        var result = await detector.DetectAllAsync(context);

        // Assert
        Assert.Single(result.Gates); // Deduplicated
        Assert.Equal(0.9, result.Gates[0].Confidence); // Kept higher confidence
    }

    [Fact]
    public async Task CompositeGateDetector_AllGateTypes_AppliesMinimumFloor()
    {
        // Arrange - all gate types = very low multiplier
        var detectors = new IGateDetector[]
        {
            new MockAuthDetector(CreateGate(GateType.AuthRequired, 0.9)),
            new MockFeatureFlagDetector(CreateGate(GateType.FeatureFlag, 0.9)),
            new MockAdminDetector(CreateGate(GateType.AdminOnly, 0.9)),
            new MockConfigDetector(CreateGate(GateType.NonDefaultConfig, 0.9)),
        };

        var detector = new CompositeGateDetector(detectors);
        var context = CreateContext(["main", "auth", "feature", "admin", "config", "vulnerable"]);

        // Act
        var result = await detector.DetectAllAsync(context);

        // Assert
        Assert.Equal(4, result.Gates.Count);
        // 30% * 20% * 15% * 50% = 0.45%, but the floor is 5% (500 bps)
        Assert.Equal(500, result.CombinedMultiplierBps);
    }

    [Fact]
    public async Task CompositeGateDetector_DetectorException_ContinuesWithOthers()
    {
        // Arrange
        var failingDetector = new FailingGateDetector();
        var authDetector = new MockAuthDetector(
            CreateGate(GateType.AuthRequired, 0.9));

        var detector = new CompositeGateDetector([failingDetector, authDetector]);
        var context = CreateContext(["main", "vulnerable"]);

        // Act
        var result = await detector.DetectAllAsync(context);

        // Assert - should still get the auth gate despite the failing detector
        Assert.Single(result.Gates);
        Assert.Equal(GateType.AuthRequired, result.Gates[0].Type);
    }

    private static DetectedGate CreateGate(GateType type, double confidence, string symbol = "guard_symbol")
    {
        return new DetectedGate
        {
            Type = type,
            Detail = $"{type} gate detected",
            GuardSymbol = symbol,
            Confidence = confidence,
            DetectionMethod = "mock",
        };
    }

    private static CallPathContext CreateContext(string[] callPath)
    {
        return new CallPathContext
        {
            CallPath = callPath,
            Language = "csharp",
        };
    }

    // Mock detectors for testing
    private class MockAuthDetector : IGateDetector
    {
        private readonly DetectedGate[] _gates;
        public GateType GateType => GateType.AuthRequired;

        public MockAuthDetector(params DetectedGate[] gates) => _gates = gates;

        public Task<IReadOnlyList<DetectedGate>> DetectAsync(CallPathContext context, CancellationToken ct)
            => Task.FromResult<IReadOnlyList<DetectedGate>>(_gates);
    }

    private class MockFeatureFlagDetector : IGateDetector
    {
        private readonly DetectedGate[] _gates;
        public GateType GateType => GateType.FeatureFlag;

        public MockFeatureFlagDetector(params DetectedGate[] gates) => _gates = gates;

        public Task<IReadOnlyList<DetectedGate>> DetectAsync(CallPathContext context, CancellationToken ct)
            => Task.FromResult<IReadOnlyList<DetectedGate>>(_gates);
    }

    private class MockAdminDetector : IGateDetector
    {
        private readonly DetectedGate[] _gates;
        public GateType GateType => GateType.AdminOnly;

        public MockAdminDetector(params DetectedGate[] gates) => _gates = gates;

        public Task<IReadOnlyList<DetectedGate>> DetectAsync(CallPathContext context, CancellationToken ct)
            => Task.FromResult<IReadOnlyList<DetectedGate>>(_gates);
    }

    private class MockConfigDetector : IGateDetector
    {
        private readonly DetectedGate[] _gates;
        public GateType GateType => GateType.NonDefaultConfig;

        public MockConfigDetector(params DetectedGate[] gates) => _gates = gates;

        public Task<IReadOnlyList<DetectedGate>> DetectAsync(CallPathContext context, CancellationToken ct)
            => Task.FromResult<IReadOnlyList<DetectedGate>>(_gates);
    }

    private class FailingGateDetector : IGateDetector
    {
        public GateType GateType => GateType.AuthRequired;

        public Task<IReadOnlyList<DetectedGate>> DetectAsync(CallPathContext context, CancellationToken ct)
            => throw new InvalidOperationException("Simulated detector failure");
    }
}
@@ -0,0 +1,325 @@
using System.Collections.Immutable;

namespace StellaOps.Scanner.SmartDiff.Detection;

/// <summary>
/// Detects material risk changes between two scan snapshots.
/// Implements rules R1-R4 from the Smart-Diff advisory.
/// Per Sprint 3500.3 - Smart-Diff Detection Rules.
/// </summary>
public sealed class MaterialRiskChangeDetector
{
    private readonly MaterialRiskChangeOptions _options;

    public MaterialRiskChangeDetector(MaterialRiskChangeOptions? options = null)
    {
        _options = options ?? MaterialRiskChangeOptions.Default;
    }

    /// <summary>
    /// Compares two snapshots and returns all material changes.
    /// </summary>
    public MaterialRiskChangeResult Compare(
        RiskStateSnapshot previous,
        RiskStateSnapshot current)
    {
        if (previous.FindingKey != current.FindingKey)
            throw new ArgumentException("FindingKey mismatch between snapshots");

        var changes = new List<DetectedChange>();

        // Rule R1: Reachability Flip
        var r1 = EvaluateReachabilityFlip(previous, current);
        if (r1 is not null) changes.Add(r1);

        // Rule R2: VEX Status Flip
        var r2 = EvaluateVexFlip(previous, current);
        if (r2 is not null) changes.Add(r2);

        // Rule R3: Affected Range Boundary
        var r3 = EvaluateRangeBoundary(previous, current);
        if (r3 is not null) changes.Add(r3);

        // Rule R4: Intelligence/Policy Flip
        var r4Changes = EvaluateIntelligenceFlip(previous, current);
        changes.AddRange(r4Changes);

        var hasMaterialChange = changes.Count > 0;
        var priorityScore = hasMaterialChange ? ComputePriorityScore(changes, current) : 0;

        return new MaterialRiskChangeResult(
            FindingKey: current.FindingKey,
            HasMaterialChange: hasMaterialChange,
            Changes: [.. changes],
            PriorityScore: priorityScore,
            PreviousStateHash: previous.ComputeStateHash(),
            CurrentStateHash: current.ComputeStateHash());
    }

    /// <summary>
    /// R1: Reachability Flip - reachable changes false→true or true→false.
    /// </summary>
    private DetectedChange? EvaluateReachabilityFlip(
        RiskStateSnapshot prev,
        RiskStateSnapshot curr)
    {
        if (prev.Reachable == curr.Reachable)
            return null;

        // Skip if either is unknown
        if (prev.Reachable is null || curr.Reachable is null)
            return null;

        var direction = curr.Reachable.Value
            ? RiskDirection.Increased
            : RiskDirection.Decreased;

        return new DetectedChange(
            Rule: DetectionRule.R1_ReachabilityFlip,
            ChangeType: MaterialChangeType.ReachabilityFlip,
            Direction: direction,
            Reason: $"Reachability changed from {prev.Reachable} to {curr.Reachable}",
            PreviousValue: prev.Reachable.ToString()!,
            CurrentValue: curr.Reachable.ToString()!,
            Weight: direction == RiskDirection.Increased
                ? _options.ReachabilityFlipUpWeight
                : _options.ReachabilityFlipDownWeight);
    }

    /// <summary>
    /// R2: VEX Status Flip - meaningful status transitions.
    /// </summary>
    private DetectedChange? EvaluateVexFlip(
        RiskStateSnapshot prev,
        RiskStateSnapshot curr)
    {
        if (prev.VexStatus == curr.VexStatus)
            return null;

        // Determine if this is a meaningful flip
        var (isMeaningful, direction) = ClassifyVexTransition(prev.VexStatus, curr.VexStatus);

        if (!isMeaningful)
            return null;

        return new DetectedChange(
            Rule: DetectionRule.R2_VexFlip,
            ChangeType: MaterialChangeType.VexFlip,
            Direction: direction,
            Reason: $"VEX status changed from {prev.VexStatus} to {curr.VexStatus}",
            PreviousValue: prev.VexStatus.ToString(),
            CurrentValue: curr.VexStatus.ToString(),
            Weight: direction == RiskDirection.Increased
                ? _options.VexFlipToAffectedWeight
                : _options.VexFlipToNotAffectedWeight);
    }

    /// <summary>
    /// Classifies VEX status transitions as meaningful or not.
    /// </summary>
    private static (bool IsMeaningful, RiskDirection Direction) ClassifyVexTransition(
        VexStatusType from,
        VexStatusType to)
    {
        return (from, to) switch
        {
            // Risk increases
            (VexStatusType.NotAffected, VexStatusType.Affected) => (true, RiskDirection.Increased),
            (VexStatusType.Fixed, VexStatusType.Affected) => (true, RiskDirection.Increased),
            (VexStatusType.UnderInvestigation, VexStatusType.Affected) => (true, RiskDirection.Increased),

            // Risk decreases
            (VexStatusType.Affected, VexStatusType.NotAffected) => (true, RiskDirection.Decreased),
            (VexStatusType.Affected, VexStatusType.Fixed) => (true, RiskDirection.Decreased),
            (VexStatusType.UnderInvestigation, VexStatusType.NotAffected) => (true, RiskDirection.Decreased),
            (VexStatusType.UnderInvestigation, VexStatusType.Fixed) => (true, RiskDirection.Decreased),

            // Under investigation transitions (noteworthy but not scored)
            (VexStatusType.Affected, VexStatusType.UnderInvestigation) => (true, RiskDirection.Neutral),
            (VexStatusType.NotAffected, VexStatusType.UnderInvestigation) => (true, RiskDirection.Neutral),

            // Unknown transitions (from unknown to known)
            (VexStatusType.Unknown, VexStatusType.Affected) => (true, RiskDirection.Increased),
            (VexStatusType.Unknown, VexStatusType.NotAffected) => (true, RiskDirection.Decreased),

            // All other transitions are not meaningful
            _ => (false, RiskDirection.Neutral)
        };
    }

    /// <summary>
    /// R3: Affected Range Boundary - component enters or exits the affected version range.
    /// </summary>
    private DetectedChange? EvaluateRangeBoundary(
        RiskStateSnapshot prev,
        RiskStateSnapshot curr)
    {
        if (prev.InAffectedRange == curr.InAffectedRange)
            return null;

        // Skip if either is unknown
        if (prev.InAffectedRange is null || curr.InAffectedRange is null)
            return null;

        var direction = curr.InAffectedRange.Value
            ? RiskDirection.Increased
            : RiskDirection.Decreased;

        return new DetectedChange(
            Rule: DetectionRule.R3_RangeBoundary,
            ChangeType: MaterialChangeType.RangeBoundary,
            Direction: direction,
            Reason: curr.InAffectedRange.Value
                ? "Component version entered affected range"
                : "Component version exited affected range",
            PreviousValue: prev.InAffectedRange.ToString()!,
            CurrentValue: curr.InAffectedRange.ToString()!,
            Weight: direction == RiskDirection.Increased
                ? _options.RangeEntryWeight
                : _options.RangeExitWeight);
    }

    /// <summary>
    /// R4: Intelligence/Policy Flip - KEV, EPSS threshold, or policy decision changes.
    /// </summary>
    private List<DetectedChange> EvaluateIntelligenceFlip(
        RiskStateSnapshot prev,
        RiskStateSnapshot curr)
    {
        var changes = new List<DetectedChange>();

        // KEV change
        if (prev.Kev != curr.Kev)
        {
            var direction = curr.Kev ? RiskDirection.Increased : RiskDirection.Decreased;
            changes.Add(new DetectedChange(
                Rule: DetectionRule.R4_IntelligenceFlip,
                ChangeType: curr.Kev ? MaterialChangeType.KevAdded : MaterialChangeType.KevRemoved,
                Direction: direction,
                Reason: curr.Kev ? "Added to KEV catalog" : "Removed from KEV catalog",
                PreviousValue: prev.Kev.ToString(),
                CurrentValue: curr.Kev.ToString(),
                Weight: curr.Kev ? _options.KevAddedWeight : _options.KevRemovedWeight));
        }

        // EPSS threshold crossing
        var epssChange = EvaluateEpssThreshold(prev.EpssScore, curr.EpssScore);
        if (epssChange is not null)
        {
            changes.Add(epssChange);
        }

        // Policy decision flip
        if (prev.PolicyDecision != curr.PolicyDecision)
        {
            var policyChange = EvaluatePolicyFlip(prev.PolicyDecision, curr.PolicyDecision);
            if (policyChange is not null)
            {
                changes.Add(policyChange);
            }
        }

        return changes;
    }

    private DetectedChange? EvaluateEpssThreshold(double? prevScore, double? currScore)
    {
        if (prevScore is null || currScore is null)
            return null;

        var prevAbove = prevScore.Value >= _options.EpssThreshold;
        var currAbove = currScore.Value >= _options.EpssThreshold;

        if (prevAbove == currAbove)
            return null;

        var direction = currAbove ? RiskDirection.Increased : RiskDirection.Decreased;

        return new DetectedChange(
            Rule: DetectionRule.R4_IntelligenceFlip,
            ChangeType: MaterialChangeType.EpssThreshold,
            Direction: direction,
            Reason: currAbove
                ? $"EPSS score crossed above threshold ({_options.EpssThreshold:P0})"
                : $"EPSS score dropped below threshold ({_options.EpssThreshold:P0})",
            PreviousValue: prevScore.Value.ToString("F4"),
            CurrentValue: currScore.Value.ToString("F4"),
            Weight: _options.EpssThresholdWeight);
    }

    private DetectedChange? EvaluatePolicyFlip(PolicyDecisionType? prev, PolicyDecisionType? curr)
    {
        if (prev is null || curr is null)
            return null;

        // Determine direction based on severity ordering: Allow < Warn < Block
        var direction = (prev.Value, curr.Value) switch
        {
            (PolicyDecisionType.Allow, PolicyDecisionType.Warn) => RiskDirection.Increased,
            (PolicyDecisionType.Allow, PolicyDecisionType.Block) => RiskDirection.Increased,
            (PolicyDecisionType.Warn, PolicyDecisionType.Block) => RiskDirection.Increased,
            (PolicyDecisionType.Block, PolicyDecisionType.Warn) => RiskDirection.Decreased,
            (PolicyDecisionType.Block, PolicyDecisionType.Allow) => RiskDirection.Decreased,
            (PolicyDecisionType.Warn, PolicyDecisionType.Allow) => RiskDirection.Decreased,
            _ => RiskDirection.Neutral
        };

        if (direction == RiskDirection.Neutral)
            return null;

        return new DetectedChange(
            Rule: DetectionRule.R4_IntelligenceFlip,
            ChangeType: MaterialChangeType.PolicyFlip,
            Direction: direction,
            Reason: $"Policy decision changed from {prev} to {curr}",
            PreviousValue: prev.Value.ToString(),
            CurrentValue: curr.Value.ToString(),
            Weight: _options.PolicyFlipWeight);
    }

    /// <summary>
    /// Computes the priority score for a set of changes.
    /// Formula: base_severity × Σ(weight_i × direction_i) × kev_boost × confidence_factor.
    /// </summary>
    private double ComputePriorityScore(List<DetectedChange> changes, RiskStateSnapshot current)
    {
        if (changes.Count == 0)
            return 0;

        // Sum weighted changes
        var weightedSum = 0.0;
        foreach (var change in changes)
        {
            var directionMultiplier = change.Direction switch
            {
                RiskDirection.Increased => 1.0,
                RiskDirection.Decreased => -0.5,
                RiskDirection.Neutral => 0.0,
                _ => 0.0
            };
            weightedSum += change.Weight * directionMultiplier;
        }

        // Base severity from EPSS or default
        var baseSeverity = current.EpssScore ?? 0.5;

        // KEV boost
        var kevBoost = current.Kev ? 1.5 : 1.0;

        // Confidence factor from lattice state
        var confidence = current.LatticeState switch
        {
            "certain_reachable" => 1.0,
            "likely_reachable" => 0.9,
            "uncertain" => 0.7,
            "likely_unreachable" => 0.5,
            "certain_unreachable" => 0.3,
            _ => 0.7
        };

        var score = baseSeverity * weightedSum * kevBoost * confidence;

        // Clamp to [-1, 1]
        return Math.Clamp(score, -1.0, 1.0);
    }
}
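A worked instance of the priority-score formula above, using the default weights; the numbers are illustrative, not from a real scan:

```csharp
// R1 reachability flip up (weight 1.0, increased) + R4 KEV added (weight 1.0, increased)
var weightedSum = (1.0 * 1.0) + (1.0 * 1.0);   // = 2.0
var baseSeverity = 0.6;                        // current EPSS score
var kevBoost = 1.5;                            // finding is in the KEV catalog
var confidence = 0.9;                          // lattice state "likely_reachable"

var score = baseSeverity * weightedSum * kevBoost * confidence; // = 1.62
score = Math.Clamp(score, -1.0, 1.0);                           // => 1.0
```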
@@ -0,0 +1,156 @@
using System.Collections.Immutable;
using System.Text.Json.Serialization;

namespace StellaOps.Scanner.SmartDiff.Detection;

/// <summary>
/// Result of material risk change detection.
/// </summary>
public sealed record MaterialRiskChangeResult(
    [property: JsonPropertyName("findingKey")] FindingKey FindingKey,
    [property: JsonPropertyName("hasMaterialChange")] bool HasMaterialChange,
    [property: JsonPropertyName("changes")] ImmutableArray<DetectedChange> Changes,
    [property: JsonPropertyName("priorityScore")] double PriorityScore,
    [property: JsonPropertyName("previousStateHash")] string PreviousStateHash,
    [property: JsonPropertyName("currentStateHash")] string CurrentStateHash);

/// <summary>
/// A detected material change.
/// </summary>
public sealed record DetectedChange(
    [property: JsonPropertyName("rule")] DetectionRule Rule,
    [property: JsonPropertyName("changeType")] MaterialChangeType ChangeType,
    [property: JsonPropertyName("direction")] RiskDirection Direction,
    [property: JsonPropertyName("reason")] string Reason,
    [property: JsonPropertyName("previousValue")] string PreviousValue,
    [property: JsonPropertyName("currentValue")] string CurrentValue,
    [property: JsonPropertyName("weight")] double Weight);

/// <summary>
/// Detection rule identifiers (R1-R4).
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<DetectionRule>))]
public enum DetectionRule
{
    [JsonStringEnumMemberName("R1")]
    R1_ReachabilityFlip,

    [JsonStringEnumMemberName("R2")]
    R2_VexFlip,

    [JsonStringEnumMemberName("R3")]
    R3_RangeBoundary,

    [JsonStringEnumMemberName("R4")]
    R4_IntelligenceFlip
}

/// <summary>
/// Type of material change.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<MaterialChangeType>))]
public enum MaterialChangeType
{
    [JsonStringEnumMemberName("reachability_flip")]
    ReachabilityFlip,

    [JsonStringEnumMemberName("vex_flip")]
    VexFlip,

    [JsonStringEnumMemberName("range_boundary")]
    RangeBoundary,

    [JsonStringEnumMemberName("kev_added")]
    KevAdded,

    [JsonStringEnumMemberName("kev_removed")]
    KevRemoved,

    [JsonStringEnumMemberName("epss_threshold")]
    EpssThreshold,

    [JsonStringEnumMemberName("policy_flip")]
    PolicyFlip
}

/// <summary>
/// Direction of risk change.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<RiskDirection>))]
public enum RiskDirection
{
    [JsonStringEnumMemberName("increased")]
    Increased,

    [JsonStringEnumMemberName("decreased")]
    Decreased,

    [JsonStringEnumMemberName("neutral")]
    Neutral
}

/// <summary>
/// Configuration options for material risk change detection.
/// </summary>
public sealed class MaterialRiskChangeOptions
{
    /// <summary>
    /// Default options instance.
    /// </summary>
    public static readonly MaterialRiskChangeOptions Default = new();

    /// <summary>
    /// Weight for reachability flip (unreachable → reachable).
    /// </summary>
    public double ReachabilityFlipUpWeight { get; init; } = 1.0;

    /// <summary>
    /// Weight for reachability flip (reachable → unreachable).
    /// </summary>
    public double ReachabilityFlipDownWeight { get; init; } = 0.8;

    /// <summary>
    /// Weight for VEX flip to affected.
    /// </summary>
    public double VexFlipToAffectedWeight { get; init; } = 0.9;

    /// <summary>
    /// Weight for VEX flip to not_affected.
    /// </summary>
    public double VexFlipToNotAffectedWeight { get; init; } = 0.7;

    /// <summary>
    /// Weight for entering affected range.
    /// </summary>
    public double RangeEntryWeight { get; init; } = 0.8;

    /// <summary>
    /// Weight for exiting affected range.
    /// </summary>
    public double RangeExitWeight { get; init; } = 0.6;

    /// <summary>
    /// Weight for KEV addition.
    /// </summary>
    public double KevAddedWeight { get; init; } = 1.0;

    /// <summary>
    /// Weight for KEV removal.
    /// </summary>
    public double KevRemovedWeight { get; init; } = 0.5;

    /// <summary>
    /// Weight for EPSS threshold crossing.
    /// </summary>
    public double EpssThresholdWeight { get; init; } = 0.6;

    /// <summary>
    /// EPSS score threshold for R4 detection.
    /// </summary>
    public double EpssThreshold { get; init; } = 0.5;

    /// <summary>
    /// Weight for policy decision flip.
    /// </summary>
    public double PolicyFlipWeight { get; init; } = 0.7;
}
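For reference, this is roughly the wire shape a `DetectedChange` serializes to under the converters above — the short enum member names (`R1`, `reachability_flip`, `increased`) appear on the wire rather than the C# identifiers. Field values here are illustrative:

```json
{
  "rule": "R1",
  "changeType": "reachability_flip",
  "direction": "increased",
  "reason": "Sink became reachable from entrypoint",
  "previousValue": "false",
  "currentValue": "true",
  "weight": 1.0
}
```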
@@ -0,0 +1,107 @@
using System.Collections.Immutable;
using System.Globalization;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json.Serialization;

namespace StellaOps.Scanner.SmartDiff.Detection;

/// <summary>
/// Captures the complete risk state for a finding at a point in time.
/// Used for cross-scan comparison.
/// Per Sprint 3500.3 - Smart-Diff Detection Rules.
/// </summary>
public sealed record RiskStateSnapshot(
    [property: JsonPropertyName("findingKey")] FindingKey FindingKey,
    [property: JsonPropertyName("scanId")] string ScanId,
    [property: JsonPropertyName("capturedAt")] DateTimeOffset CapturedAt,
    [property: JsonPropertyName("reachable")] bool? Reachable,
    [property: JsonPropertyName("latticeState")] string? LatticeState,
    [property: JsonPropertyName("vexStatus")] VexStatusType VexStatus,
    [property: JsonPropertyName("inAffectedRange")] bool? InAffectedRange,
    [property: JsonPropertyName("kev")] bool Kev,
    [property: JsonPropertyName("epssScore")] double? EpssScore,
    [property: JsonPropertyName("policyFlags")] ImmutableArray<string> PolicyFlags,
    [property: JsonPropertyName("policyDecision")] PolicyDecisionType? PolicyDecision,
    [property: JsonPropertyName("evidenceLinks")] ImmutableArray<EvidenceLink>? EvidenceLinks = null)
{
    /// <summary>
    /// Computes a deterministic hash over the risk-relevant fields.
    /// ScanId, CapturedAt, LatticeState, PolicyFlags, and EvidenceLinks are excluded,
    /// so re-scans with an identical risk state produce identical hashes.
    /// </summary>
    public string ComputeStateHash()
    {
        var builder = new StringBuilder();
        builder.Append(FindingKey.ToString());
        builder.Append(':');
        builder.Append(Reachable?.ToString() ?? "null");
        builder.Append(':');
        builder.Append(VexStatus.ToString());
        builder.Append(':');
        builder.Append(InAffectedRange?.ToString() ?? "null");
        builder.Append(':');
        builder.Append(Kev.ToString());
        builder.Append(':');
        builder.Append(EpssScore?.ToString("F4", CultureInfo.InvariantCulture) ?? "null");
        builder.Append(':');
        builder.Append(PolicyDecision?.ToString() ?? "null");

        var hash = SHA256.HashData(Encoding.UTF8.GetBytes(builder.ToString()));
        return Convert.ToHexString(hash).ToLowerInvariant();
    }
}

/// <summary>
/// Key identifying a unique finding.
/// </summary>
public sealed record FindingKey(
    [property: JsonPropertyName("vulnId")] string VulnId,
    [property: JsonPropertyName("componentPurl")] string ComponentPurl)
{
    public override string ToString() => $"{VulnId}@{ComponentPurl}";
}

/// <summary>
/// Link to evidence supporting a state.
/// </summary>
public sealed record EvidenceLink(
    [property: JsonPropertyName("type")] string Type,
    [property: JsonPropertyName("uri")] string Uri,
    [property: JsonPropertyName("digest")] string? Digest = null);

/// <summary>
/// VEX status values.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<VexStatusType>))]
public enum VexStatusType
{
    [JsonStringEnumMemberName("unknown")]
    Unknown,

    [JsonStringEnumMemberName("affected")]
    Affected,

    [JsonStringEnumMemberName("not_affected")]
    NotAffected,

    [JsonStringEnumMemberName("fixed")]
    Fixed,

    [JsonStringEnumMemberName("under_investigation")]
    UnderInvestigation
}

/// <summary>
/// Policy decision type.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<PolicyDecisionType>))]
public enum PolicyDecisionType
{
    [JsonStringEnumMemberName("allow")]
    Allow,

    [JsonStringEnumMemberName("warn")]
    Warn,

    [JsonStringEnumMemberName("block")]
    Block
}
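A minimal sketch of the hash contract (all values illustrative): two snapshots that differ only in scan metadata hash identically, so cross-scan comparison reduces to comparing two strings.

```csharp
var key = new FindingKey("CVE-2024-0001", "pkg:npm/lodash@4.17.20");

var monday = new RiskStateSnapshot(
    FindingKey: key,
    ScanId: "scan-001",
    CapturedAt: DateTimeOffset.Parse("2025-12-01T00:00:00Z"),
    Reachable: true,
    LatticeState: "likely_reachable",
    VexStatus: VexStatusType.Affected,
    InAffectedRange: true,
    Kev: false,
    EpssScore: 0.42,
    PolicyFlags: [],
    PolicyDecision: PolicyDecisionType.Warn);

// Same risk-relevant fields, different scan metadata.
var tuesday = monday with { ScanId = "scan-002", CapturedAt = monday.CapturedAt.AddDays(1) };

// True: ScanId and CapturedAt are excluded from the hash.
var stable = monday.ComputeStateHash() == tuesday.ComputeStateHash();
```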
@@ -0,0 +1,168 @@
using System.Collections.Immutable;
using System.Text.Json.Serialization;

namespace StellaOps.Scanner.SmartDiff.Output;

/// <summary>
/// SARIF 2.1.0 log model for Smart-Diff output.
/// Per Sprint 3500.4 - Smart-Diff Binary Analysis.
/// </summary>
public sealed record SarifLog(
    [property: JsonPropertyName("version")] string Version,
    [property: JsonPropertyName("$schema")] string Schema,
    [property: JsonPropertyName("runs")] ImmutableArray<SarifRun> Runs);

/// <summary>
/// A single SARIF run representing one analysis execution.
/// </summary>
public sealed record SarifRun(
    [property: JsonPropertyName("tool")] SarifTool Tool,
    [property: JsonPropertyName("results")] ImmutableArray<SarifResult> Results,
    [property: JsonPropertyName("invocations")] ImmutableArray<SarifInvocation>? Invocations = null,
    [property: JsonPropertyName("artifacts")] ImmutableArray<SarifArtifact>? Artifacts = null,
    [property: JsonPropertyName("versionControlProvenance")] ImmutableArray<SarifVersionControlDetails>? VersionControlProvenance = null);

/// <summary>
/// Tool information for the SARIF run.
/// </summary>
public sealed record SarifTool(
    [property: JsonPropertyName("driver")] SarifToolComponent Driver,
    [property: JsonPropertyName("extensions")] ImmutableArray<SarifToolComponent>? Extensions = null);

/// <summary>
/// Tool component (driver or extension).
/// </summary>
public sealed record SarifToolComponent(
    [property: JsonPropertyName("name")] string Name,
    [property: JsonPropertyName("version")] string Version,
    [property: JsonPropertyName("informationUri")] string? InformationUri = null,
    [property: JsonPropertyName("rules")] ImmutableArray<SarifReportingDescriptor>? Rules = null,
    [property: JsonPropertyName("supportedTaxonomies")] ImmutableArray<SarifToolComponentReference>? SupportedTaxonomies = null);

/// <summary>
/// Reference to a tool component.
/// </summary>
public sealed record SarifToolComponentReference(
    [property: JsonPropertyName("name")] string Name,
    [property: JsonPropertyName("guid")] string? Guid = null);

/// <summary>
/// Rule definition.
/// </summary>
public sealed record SarifReportingDescriptor(
    [property: JsonPropertyName("id")] string Id,
    [property: JsonPropertyName("name")] string? Name = null,
    [property: JsonPropertyName("shortDescription")] SarifMessage? ShortDescription = null,
    [property: JsonPropertyName("fullDescription")] SarifMessage? FullDescription = null,
    [property: JsonPropertyName("defaultConfiguration")] SarifReportingConfiguration? DefaultConfiguration = null,
    [property: JsonPropertyName("helpUri")] string? HelpUri = null);

/// <summary>
/// Rule configuration.
/// </summary>
public sealed record SarifReportingConfiguration(
    [property: JsonPropertyName("level")] SarifLevel Level = SarifLevel.Warning,
    [property: JsonPropertyName("enabled")] bool Enabled = true);

/// <summary>
/// SARIF message with text.
/// </summary>
public sealed record SarifMessage(
    [property: JsonPropertyName("text")] string Text,
    [property: JsonPropertyName("markdown")] string? Markdown = null);

/// <summary>
/// SARIF result level.
/// </summary>
[JsonConverter(typeof(JsonStringEnumConverter<SarifLevel>))]
public enum SarifLevel
{
    [JsonStringEnumMemberName("none")]
    None,

    [JsonStringEnumMemberName("note")]
    Note,

    [JsonStringEnumMemberName("warning")]
    Warning,

    [JsonStringEnumMemberName("error")]
    Error
}

/// <summary>
/// A single result/finding.
/// </summary>
public sealed record SarifResult(
    [property: JsonPropertyName("ruleId")] string RuleId,
    [property: JsonPropertyName("level")] SarifLevel Level,
    [property: JsonPropertyName("message")] SarifMessage Message,
    [property: JsonPropertyName("locations")] ImmutableArray<SarifLocation>? Locations = null,
    [property: JsonPropertyName("fingerprints")] ImmutableDictionary<string, string>? Fingerprints = null,
    [property: JsonPropertyName("partialFingerprints")] ImmutableDictionary<string, string>? PartialFingerprints = null,
    [property: JsonPropertyName("properties")] ImmutableDictionary<string, object>? Properties = null);

/// <summary>
/// Location of a result.
/// </summary>
public sealed record SarifLocation(
    [property: JsonPropertyName("physicalLocation")] SarifPhysicalLocation? PhysicalLocation = null,
    [property: JsonPropertyName("logicalLocations")] ImmutableArray<SarifLogicalLocation>? LogicalLocations = null);

/// <summary>
/// Physical file location.
/// </summary>
public sealed record SarifPhysicalLocation(
    [property: JsonPropertyName("artifactLocation")] SarifArtifactLocation ArtifactLocation,
    [property: JsonPropertyName("region")] SarifRegion? Region = null);

/// <summary>
/// Artifact location (file path).
/// </summary>
public sealed record SarifArtifactLocation(
    [property: JsonPropertyName("uri")] string Uri,
    [property: JsonPropertyName("uriBaseId")] string? UriBaseId = null,
    [property: JsonPropertyName("index")] int? Index = null);

/// <summary>
/// Region within a file.
/// </summary>
public sealed record SarifRegion(
    [property: JsonPropertyName("startLine")] int? StartLine = null,
    [property: JsonPropertyName("startColumn")] int? StartColumn = null,
    [property: JsonPropertyName("endLine")] int? EndLine = null,
    [property: JsonPropertyName("endColumn")] int? EndColumn = null);

/// <summary>
/// Logical location (namespace, class, function).
/// </summary>
public sealed record SarifLogicalLocation(
    [property: JsonPropertyName("name")] string Name,
    [property: JsonPropertyName("fullyQualifiedName")] string? FullyQualifiedName = null,
    [property: JsonPropertyName("kind")] string? Kind = null);

/// <summary>
/// Invocation information.
/// </summary>
public sealed record SarifInvocation(
    [property: JsonPropertyName("executionSuccessful")] bool ExecutionSuccessful,
    [property: JsonPropertyName("startTimeUtc")] DateTimeOffset? StartTimeUtc = null,
    [property: JsonPropertyName("endTimeUtc")] DateTimeOffset? EndTimeUtc = null,
    [property: JsonPropertyName("workingDirectory")] SarifArtifactLocation? WorkingDirectory = null,
    [property: JsonPropertyName("commandLine")] string? CommandLine = null);

/// <summary>
/// Artifact (file) information.
/// </summary>
public sealed record SarifArtifact(
    [property: JsonPropertyName("location")] SarifArtifactLocation Location,
    [property: JsonPropertyName("mimeType")] string? MimeType = null,
    [property: JsonPropertyName("hashes")] ImmutableDictionary<string, string>? Hashes = null);

/// <summary>
/// Version control information.
/// </summary>
public sealed record SarifVersionControlDetails(
    [property: JsonPropertyName("repositoryUri")] string RepositoryUri,
    [property: JsonPropertyName("revisionId")] string? RevisionId = null,
    [property: JsonPropertyName("branch")] string? Branch = null);
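Serialized with the camel-case, null-omitting options used by the generator below, a minimal log built from these records looks roughly like this (version string and message text are illustrative; the `SDIFF*` rule ids are defined in the generator that follows):

```json
{
  "version": "2.1.0",
  "$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json",
  "runs": [
    {
      "tool": { "driver": { "name": "StellaOps.Scanner.SmartDiff", "version": "1.0.0" } },
      "results": [
        { "ruleId": "SDIFF001", "level": "warning", "message": { "text": "Material risk change for CVE-2024-0001" } }
      ]
    }
  ]
}
```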
@@ -0,0 +1,393 @@
using System.Collections.Immutable;
using System.Text.Json;

namespace StellaOps.Scanner.SmartDiff.Output;

/// <summary>
/// Options for SARIF output generation.
/// </summary>
public sealed class SarifOutputOptions
{
    /// <summary>
    /// Default options instance.
    /// </summary>
    public static readonly SarifOutputOptions Default = new();

    /// <summary>
    /// Whether to include VEX candidates in output.
    /// </summary>
    public bool IncludeVexCandidates { get; init; } = true;

    /// <summary>
    /// Whether to include hardening regressions in output.
    /// </summary>
    public bool IncludeHardeningRegressions { get; init; } = true;

    /// <summary>
    /// Whether to include reachability changes in output.
    /// </summary>
    public bool IncludeReachabilityChanges { get; init; } = true;

    /// <summary>
    /// Whether to pretty-print JSON output.
    /// </summary>
    public bool IndentedJson { get; init; } = false;
}

/// <summary>
/// Input for SARIF generation.
/// </summary>
public sealed record SmartDiffSarifInput(
    string ScannerVersion,
    DateTimeOffset ScanTime,
    string? BaseDigest,
    string? TargetDigest,
    IReadOnlyList<MaterialRiskChange> MaterialChanges,
    IReadOnlyList<HardeningRegression> HardeningRegressions,
    IReadOnlyList<VexCandidate> VexCandidates,
    IReadOnlyList<ReachabilityChange> ReachabilityChanges,
    VcsInfo? VcsInfo = null);

/// <summary>
/// VCS information for SARIF provenance.
/// </summary>
public sealed record VcsInfo(
    string RepositoryUri,
    string? RevisionId,
    string? Branch);

/// <summary>
/// A material risk change finding.
/// </summary>
public sealed record MaterialRiskChange(
    string VulnId,
    string ComponentPurl,
    RiskDirection Direction,
    string Reason,
    string? FilePath = null);

/// <summary>
/// Direction of risk change.
/// </summary>
public enum RiskDirection
{
    /// <summary>Risk increased (worse).</summary>
    Increased,

    /// <summary>Risk decreased (better).</summary>
    Decreased,

    /// <summary>Risk status changed but severity unclear.</summary>
    Changed
}

/// <summary>
/// A hardening regression finding.
/// </summary>
public sealed record HardeningRegression(
    string BinaryPath,
    string FlagName,
    bool WasEnabled,
    bool IsEnabled,
    double ScoreImpact);

/// <summary>
/// A VEX candidate finding.
/// </summary>
public sealed record VexCandidate(
    string VulnId,
    string ComponentPurl,
    string Justification,
    string? ImpactStatement);

/// <summary>
/// A reachability status change.
/// </summary>
public sealed record ReachabilityChange(
    string VulnId,
    string ComponentPurl,
    bool WasReachable,
    bool IsReachable,
    string? Evidence);

/// <summary>
/// Generates SARIF 2.1.0 output for Smart-Diff findings.
/// Per Sprint 3500.4 - Smart-Diff Binary Analysis.
/// </summary>
public sealed class SarifOutputGenerator
{
    private const string SarifVersion = "2.1.0";
    private const string SchemaUri = "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json";
    private const string ToolName = "StellaOps.Scanner.SmartDiff";
    private const string ToolInfoUri = "https://stellaops.dev/docs/scanner/smart-diff";

    private static readonly JsonSerializerOptions SarifJsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull,
        WriteIndented = false
    };

    /// <summary>
    /// Generate a SARIF log from Smart-Diff input.
    /// </summary>
    public SarifLog Generate(SmartDiffSarifInput input, SarifOutputOptions? options = null)
    {
        options ??= SarifOutputOptions.Default;

        var tool = CreateTool(input);
        var results = CreateResults(input, options);
        var invocation = CreateInvocation(input);
        var artifacts = CreateArtifacts(input);
        var vcsProvenance = CreateVcsProvenance(input);

        var run = new SarifRun(
            Tool: tool,
            Results: results,
            Invocations: [invocation],
            Artifacts: artifacts.Length > 0 ? artifacts : null,
            VersionControlProvenance: vcsProvenance);

        return new SarifLog(
            Version: SarifVersion,
            Schema: SchemaUri,
            Runs: [run]);
    }

    /// <summary>
    /// Generate SARIF JSON string.
    /// </summary>
    public string GenerateJson(SmartDiffSarifInput input, SarifOutputOptions? options = null)
    {
        var log = Generate(input, options);
        var jsonOptions = options?.IndentedJson == true
            ? new JsonSerializerOptions(SarifJsonOptions) { WriteIndented = true }
            : SarifJsonOptions;
        return JsonSerializer.Serialize(log, jsonOptions);
    }

    /// <summary>
    /// Write SARIF to a stream.
    /// </summary>
    public async Task WriteAsync(
        SmartDiffSarifInput input,
        Stream outputStream,
        SarifOutputOptions? options = null,
        CancellationToken ct = default)
    {
        var log = Generate(input, options);
        var jsonOptions = options?.IndentedJson == true
            ? new JsonSerializerOptions(SarifJsonOptions) { WriteIndented = true }
            : SarifJsonOptions;
        await JsonSerializer.SerializeAsync(outputStream, log, jsonOptions, ct);
    }

    private static SarifTool CreateTool(SmartDiffSarifInput input)
    {
        var rules = CreateRules();

        return new SarifTool(
            Driver: new SarifToolComponent(
                Name: ToolName,
                Version: input.ScannerVersion,
                InformationUri: ToolInfoUri,
                Rules: rules,
                SupportedTaxonomies: [
                    new SarifToolComponentReference(
                        Name: "CWE",
                        Guid: "25F72D7E-8A92-459D-AD67-64853F788765")
                ]));
    }

    private static ImmutableArray<SarifReportingDescriptor> CreateRules()
    {
        return
        [
            new SarifReportingDescriptor(
                Id: "SDIFF001",
                Name: "MaterialRiskChange",
                ShortDescription: new SarifMessage("Material risk change detected"),
                FullDescription: new SarifMessage("A vulnerability finding has undergone a material risk state change between scans."),
                DefaultConfiguration: new SarifReportingConfiguration(Level: SarifLevel.Warning),
                HelpUri: $"{ToolInfoUri}/rules/SDIFF001"),

            new SarifReportingDescriptor(
                Id: "SDIFF002",
                Name: "HardeningRegression",
                ShortDescription: new SarifMessage("Binary hardening regression detected"),
                FullDescription: new SarifMessage("A binary has lost security hardening flags compared to the previous scan."),
                DefaultConfiguration: new SarifReportingConfiguration(Level: SarifLevel.Error),
                HelpUri: $"{ToolInfoUri}/rules/SDIFF002"),

            new SarifReportingDescriptor(
                Id: "SDIFF003",
                Name: "VexCandidateGenerated",
                ShortDescription: new SarifMessage("VEX candidate auto-generated"),
                FullDescription: new SarifMessage("A VEX 'not_affected' candidate was generated because vulnerable APIs are no longer present."),
                DefaultConfiguration: new SarifReportingConfiguration(Level: SarifLevel.Note),
                HelpUri: $"{ToolInfoUri}/rules/SDIFF003"),

            new SarifReportingDescriptor(
                Id: "SDIFF004",
                Name: "ReachabilityFlip",
                ShortDescription: new SarifMessage("Reachability status changed"),
                FullDescription: new SarifMessage("The reachability of a vulnerability has flipped between scans."),
                DefaultConfiguration: new SarifReportingConfiguration(Level: SarifLevel.Warning),
                HelpUri: $"{ToolInfoUri}/rules/SDIFF004")
        ];
    }

    private static ImmutableArray<SarifResult> CreateResults(SmartDiffSarifInput input, SarifOutputOptions options)
    {
        var results = new List<SarifResult>();

        // Material risk changes
        foreach (var change in input.MaterialChanges)
        {
            results.Add(CreateMaterialChangeResult(change));
        }

        // Hardening regressions
        if (options.IncludeHardeningRegressions)
        {
            foreach (var regression in input.HardeningRegressions)
            {
                results.Add(CreateHardeningRegressionResult(regression));
            }
        }

        // VEX candidates
        if (options.IncludeVexCandidates)
        {
            foreach (var candidate in input.VexCandidates)
            {
                results.Add(CreateVexCandidateResult(candidate));
            }
        }

        // Reachability changes
        if (options.IncludeReachabilityChanges)
        {
            foreach (var change in input.ReachabilityChanges)
            {
                results.Add(CreateReachabilityChangeResult(change));
            }
        }

        return [.. results];
    }

    private static SarifResult CreateMaterialChangeResult(MaterialRiskChange change)
    {
        var level = change.Direction == RiskDirection.Increased ? SarifLevel.Warning : SarifLevel.Note;
        var message = $"Material risk change for {change.VulnId} in {change.ComponentPurl}: {change.Reason}";

        var locations = change.FilePath is not null
            ? ImmutableArray.Create(new SarifLocation(
                PhysicalLocation: new SarifPhysicalLocation(
                    ArtifactLocation: new SarifArtifactLocation(Uri: change.FilePath))))
            : (ImmutableArray<SarifLocation>?)null;

        return new SarifResult(
            RuleId: "SDIFF001",
            Level: level,
            Message: new SarifMessage(message),
            Locations: locations,
            Fingerprints: ImmutableDictionary.CreateRange(new[]
            {
                KeyValuePair.Create("vulnId", change.VulnId),
                KeyValuePair.Create("purl", change.ComponentPurl)
            }));
    }

    private static SarifResult CreateHardeningRegressionResult(HardeningRegression regression)
    {
        var message = $"Hardening flag '{regression.FlagName}' was {(regression.WasEnabled ? "enabled" : "disabled")} " +
                      $"but is now {(regression.IsEnabled ? "enabled" : "disabled")} in {regression.BinaryPath}";

        return new SarifResult(
            RuleId: "SDIFF002",
            Level: SarifLevel.Error,
            Message: new SarifMessage(message),
            Locations: [new SarifLocation(
                PhysicalLocation: new SarifPhysicalLocation(
                    ArtifactLocation: new SarifArtifactLocation(Uri: regression.BinaryPath)))]);
    }

    private static SarifResult CreateVexCandidateResult(VexCandidate candidate)
    {
        var message = $"VEX not_affected candidate for {candidate.VulnId} in {candidate.ComponentPurl}: {candidate.Justification}";

        return new SarifResult(
            RuleId: "SDIFF003",
            Level: SarifLevel.Note,
            Message: new SarifMessage(message),
            Fingerprints: ImmutableDictionary.CreateRange(new[]
            {
                KeyValuePair.Create("vulnId", candidate.VulnId),
                KeyValuePair.Create("purl", candidate.ComponentPurl)
            }));
    }

    private static SarifResult CreateReachabilityChangeResult(ReachabilityChange change)
    {
        var direction = change.IsReachable ? "became reachable" : "became unreachable";
        var message = $"Vulnerability {change.VulnId} in {change.ComponentPurl} {direction}";

        return new SarifResult(
            RuleId: "SDIFF004",
            Level: SarifLevel.Warning,
            Message: new SarifMessage(message),
            Fingerprints: ImmutableDictionary.CreateRange(new[]
            {
                KeyValuePair.Create("vulnId", change.VulnId),
                KeyValuePair.Create("purl", change.ComponentPurl)
            }));
    }

    private static SarifInvocation CreateInvocation(SmartDiffSarifInput input)
    {
        return new SarifInvocation(
            ExecutionSuccessful: true,
            StartTimeUtc: input.ScanTime,
            EndTimeUtc: DateTimeOffset.UtcNow);
    }

    private static ImmutableArray<SarifArtifact> CreateArtifacts(SmartDiffSarifInput input)
    {
        var artifacts = new List<SarifArtifact>();

        // Collect unique file paths from results
        var paths = new HashSet<string>();

        foreach (var change in input.MaterialChanges)
        {
            if (change.FilePath is not null)
                paths.Add(change.FilePath);
        }

        foreach (var regression in input.HardeningRegressions)
        {
            paths.Add(regression.BinaryPath);
        }

        foreach (var path in paths)
        {
            artifacts.Add(new SarifArtifact(
                Location: new SarifArtifactLocation(Uri: path)));
        }

        return [.. artifacts];
    }

    private static ImmutableArray<SarifVersionControlDetails>? CreateVcsProvenance(SmartDiffSarifInput input)
    {
        if (input.VcsInfo is null)
            return null;

        return [new SarifVersionControlDetails(
            RepositoryUri: input.VcsInfo.RepositoryUri,
            RevisionId: input.VcsInfo.RevisionId,
            Branch: input.VcsInfo.Branch)];
    }
}
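A hedged usage sketch for the generator (all input values are illustrative; in the real pipeline they come from the detectors above):

```csharp
var generator = new SarifOutputGenerator();

var input = new SmartDiffSarifInput(
    ScannerVersion: "1.0.0",
    ScanTime: DateTimeOffset.UtcNow,
    BaseDigest: "sha256:aaa",
    TargetDigest: "sha256:bbb",
    MaterialChanges:
    [
        new MaterialRiskChange(
            "CVE-2024-0001", "pkg:npm/lodash@4.17.20",
            RiskDirection.Increased, "Sink became reachable",
            FilePath: "app/node_modules/lodash/lodash.js")
    ],
    HardeningRegressions: [],
    VexCandidates: [],
    ReachabilityChanges: []);

// Pretty-printed SARIF 2.1.0, suitable for upload to a code-scanning backend.
var json = generator.GenerateJson(input, new SarifOutputOptions { IndentedJson = true });
```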
@@ -202,6 +202,31 @@ public sealed class ClassificationHistoryRepository : RepositoryBase<ScannerData
            cancellationToken);
    }

    public Task<IReadOnlyList<ClassificationChange>> GetByExecutionAsync(
        Guid tenantId,
        Guid executionId,
        CancellationToken cancellationToken = default)
    {
        var sql = $"""
            SELECT id, artifact_digest, vuln_id, package_purl, tenant_id, manifest_id, execution_id,
                   previous_status, new_status, is_fn_transition, cause, cause_detail, changed_at
            FROM {Table}
            WHERE tenant_id = @tenant_id AND execution_id = @execution_id
            ORDER BY vuln_id, package_purl
            """;

        return QueryAsync(
            Tenant,
            sql,
            cmd =>
            {
                AddParameter(cmd, "tenant_id", tenantId);
                AddParameter(cmd, "execution_id", executionId);
            },
            MapChange,
            cancellationToken);
    }

    private void AddChangeParameters(NpgsqlCommand cmd, ClassificationChange change)
    {
        AddParameter(cmd, "artifact_digest", change.ArtifactDigest);
@@ -56,6 +56,15 @@ public interface IClassificationHistoryRepository
        Guid tenantId,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Gets classification changes for a specific execution.
    /// SPRINT_3404_0001_0001 - Added for delta computation.
    /// </summary>
    Task<IReadOnlyList<ClassificationChange>> GetByExecutionAsync(
        Guid tenantId,
        Guid executionId,
        CancellationToken cancellationToken = default);

    /// <summary>
    /// Refreshes the FN-Drift statistics materialized view.
    /// </summary>
@@ -0,0 +1,238 @@
using Microsoft.Extensions.Logging;
using StellaOps.Scanner.Storage.Models;
using StellaOps.Scanner.Storage.Repositories;

namespace StellaOps.Scanner.Storage.Services;

/// <summary>
/// Tracks classification changes for FN-Drift analysis.
/// SPRINT_3404_0001_0001 - Task #6
/// </summary>
public interface IClassificationChangeTracker
{
    /// <summary>
    /// Records a classification change for drift tracking.
    /// </summary>
    Task TrackChangeAsync(ClassificationChange change, CancellationToken cancellationToken = default);

    /// <summary>
    /// Records multiple classification changes in batch.
    /// </summary>
    Task TrackChangesAsync(IEnumerable<ClassificationChange> changes, CancellationToken cancellationToken = default);

    /// <summary>
    /// Computes the classification delta between two scan executions.
    /// </summary>
    Task<IReadOnlyList<ClassificationChange>> ComputeDeltaAsync(
        Guid tenantId,
        string artifactDigest,
        Guid previousExecutionId,
        Guid currentExecutionId,
        CancellationToken cancellationToken = default);
}

/// <summary>
/// Implementation of classification change tracking.
/// </summary>
public sealed class ClassificationChangeTracker : IClassificationChangeTracker
{
    private readonly IClassificationHistoryRepository _repository;
    private readonly ILogger<ClassificationChangeTracker> _logger;
    private readonly TimeProvider _timeProvider;

    public ClassificationChangeTracker(
        IClassificationHistoryRepository repository,
        ILogger<ClassificationChangeTracker> logger,
        TimeProvider? timeProvider = null)
    {
        _repository = repository ?? throw new ArgumentNullException(nameof(repository));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _timeProvider = timeProvider ?? TimeProvider.System;
    }

    public async Task TrackChangeAsync(ClassificationChange change, CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(change);

        // Only track actual changes
        if (change.PreviousStatus == change.NewStatus)
        {
            _logger.LogDebug(
                "Skipping no-op classification change for {VulnId} on {Artifact}",
                change.VulnId,
                TruncateDigest(change.ArtifactDigest));
            return;
        }

        await _repository.InsertAsync(change, cancellationToken);

        if (change.IsFnTransition)
        {
            _logger.LogWarning(
                "FN-Drift detected: {VulnId} on {Artifact} changed from {Previous} to {New} (cause: {Cause})",
                change.VulnId,
                TruncateDigest(change.ArtifactDigest),
                change.PreviousStatus,
                change.NewStatus,
                change.Cause);
        }
        else
        {
            _logger.LogInformation(
                "Classification change: {VulnId} on {Artifact}: {Previous} -> {New}",
                change.VulnId,
                TruncateDigest(change.ArtifactDigest),
                change.PreviousStatus,
                change.NewStatus);
        }
    }

    public async Task TrackChangesAsync(
        IEnumerable<ClassificationChange> changes,
        CancellationToken cancellationToken = default)
    {
        ArgumentNullException.ThrowIfNull(changes);

        var changeList = changes
            .Where(c => c.PreviousStatus != c.NewStatus)
            .ToList();

        if (changeList.Count == 0)
        {
            return;
        }

        await _repository.InsertBatchAsync(changeList, cancellationToken);

        var fnCount = changeList.Count(c => c.IsFnTransition);
        if (fnCount > 0)
        {
            _logger.LogWarning(
                "FN-Drift batch: {FnCount} false-negative transitions out of {Total} changes",
                fnCount,
                changeList.Count);
        }
    }

    public async Task<IReadOnlyList<ClassificationChange>> ComputeDeltaAsync(
        Guid tenantId,
        string artifactDigest,
        Guid previousExecutionId,
        Guid currentExecutionId,
        CancellationToken cancellationToken = default)
    {
        ArgumentException.ThrowIfNullOrEmpty(artifactDigest);

        // Get classifications from both executions
        var previousClassifications = await _repository.GetByExecutionAsync(
            tenantId, previousExecutionId, cancellationToken);
        var currentClassifications = await _repository.GetByExecutionAsync(
            tenantId, currentExecutionId, cancellationToken);

        // Index by vuln+package
        var previousByKey = previousClassifications
            .Where(c => c.ArtifactDigest == artifactDigest)
            .ToDictionary(c => (c.VulnId, c.PackagePurl));

        var currentByKey = currentClassifications
            .Where(c => c.ArtifactDigest == artifactDigest)
            .ToDictionary(c => (c.VulnId, c.PackagePurl));

        var changes = new List<ClassificationChange>();
        var now = _timeProvider.GetUtcNow();

        // Find status changes
        foreach (var (key, current) in currentByKey)
        {
            if (previousByKey.TryGetValue(key, out var previous))
            {
                if (previous.NewStatus != current.NewStatus)
                {
                    changes.Add(new ClassificationChange
                    {
                        ArtifactDigest = artifactDigest,
                        VulnId = key.VulnId,
                        PackagePurl = key.PackagePurl,
                        TenantId = tenantId,
                        ManifestId = current.ManifestId,
                        ExecutionId = currentExecutionId,
                        PreviousStatus = previous.NewStatus,
                        NewStatus = current.NewStatus,
                        Cause = DetermineCause(previous, current),
                        ChangedAt = now,
                    });
                }
            }
            else
            {
                // New finding
                changes.Add(new ClassificationChange
                {
                    ArtifactDigest = artifactDigest,
                    VulnId = key.VulnId,
                    PackagePurl = key.PackagePurl,
                    TenantId = tenantId,
                    ManifestId = current.ManifestId,
                    ExecutionId = currentExecutionId,
                    PreviousStatus = ClassificationStatus.New,
                    NewStatus = current.NewStatus,
                    Cause = DriftCause.FeedDelta,
                    ChangedAt = now,
                });
            }
        }

        return changes;
    }

    /// <summary>
    /// Heuristically determine the cause of drift based on change metadata.
    /// </summary>
    private static DriftCause DetermineCause(ClassificationChange previous, ClassificationChange current)
    {
        // Check cause detail for hints
        var prevDetail = previous.CauseDetail ?? new Dictionary<string, string>();
        var currDetail = current.CauseDetail ?? new Dictionary<string, string>();

        // Feed version change
        if (prevDetail.TryGetValue("feedVersion", out var prevFeed) &&
            currDetail.TryGetValue("feedVersion", out var currFeed) &&
            prevFeed != currFeed)
        {
            return DriftCause.FeedDelta;
        }

        // Policy rule change
        if (prevDetail.TryGetValue("ruleHash", out var prevRule) &&
            currDetail.TryGetValue("ruleHash", out var currRule) &&
            prevRule != currRule)
        {
            return DriftCause.RuleDelta;
        }

        // VEX lattice change
        if (prevDetail.TryGetValue("vexHash", out var prevVex) &&
            currDetail.TryGetValue("vexHash", out var currVex) &&
            prevVex != currVex)
        {
            return DriftCause.LatticeDelta;
        }

        // Reachability change
        if (prevDetail.TryGetValue("reachable", out var prevReach) &&
            currDetail.TryGetValue("reachable", out var currReach) &&
            prevReach != currReach)
        {
            return DriftCause.ReachabilityDelta;
        }

        // Default to feed delta (most common)
        return DriftCause.FeedDelta;
    }

    private static string TruncateDigest(string digest)
    {
        const int maxLen = 16;
        return digest.Length > maxLen ? digest[..maxLen] + "..." : digest;
    }
}
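A minimal call sketch for the tracker (variable names `tracker`, `ct`, and the execution ids are assumed for illustration): compute what changed between two executions, then persist only the real transitions so the FN-Drift statistics pick them up.

```csharp
// Delta between the previous and current scan of one artifact.
var delta = await tracker.ComputeDeltaAsync(
    tenantId, "sha256:abc123", previousExecutionId, currentExecutionId, ct);

// TrackChangesAsync filters no-ops and logs FN transitions as warnings.
await tracker.TrackChangesAsync(delta, ct);
```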
@@ -0,0 +1,199 @@
using System.Diagnostics.Metrics;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using StellaOps.Scanner.Storage.Repositories;

namespace StellaOps.Scanner.Storage.Services;

/// <summary>
/// Prometheus metrics exporter for FN-Drift tracking.
/// SPRINT_3404_0001_0001 - Task #9
/// </summary>
public sealed class FnDriftMetricsExporter : BackgroundService
{
    public const string MeterName = "StellaOps.Scanner.FnDrift";

    private readonly Meter _meter;
    private readonly IClassificationHistoryRepository _repository;
    private readonly ILogger<FnDriftMetricsExporter> _logger;
    private readonly TimeProvider _timeProvider;
    private readonly TimeSpan _refreshInterval;

    // Observable gauges (updated periodically)
    private readonly ObservableGauge<double> _fnDriftPercentGauge;
    private readonly ObservableGauge<long> _fnTransitionsGauge;
    private readonly ObservableGauge<long> _totalEvaluatedGauge;
    private readonly ObservableGauge<long> _feedDeltaCountGauge;
    private readonly ObservableGauge<long> _ruleDeltaCountGauge;
    private readonly ObservableGauge<long> _latticeDeltaCountGauge;
    private readonly ObservableGauge<long> _reachabilityDeltaCountGauge;
    private readonly ObservableGauge<long> _engineDeltaCountGauge;

    // Counters (incremented on each change)
    private readonly Counter<long> _classificationChangesCounter;
    private readonly Counter<long> _fnTransitionsCounter;

    // Current state for observable gauges
    private volatile FnDriftSnapshot _currentSnapshot = new();

    public FnDriftMetricsExporter(
        IClassificationHistoryRepository repository,
        ILogger<FnDriftMetricsExporter> logger,
        TimeProvider? timeProvider = null,
        TimeSpan? refreshInterval = null)
    {
        _repository = repository ?? throw new ArgumentNullException(nameof(repository));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _timeProvider = timeProvider ?? TimeProvider.System;
        _refreshInterval = refreshInterval ?? TimeSpan.FromMinutes(1);

        _meter = new Meter(MeterName);

        // Observable gauges - read from snapshot
        _fnDriftPercentGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.percent",
            () => _currentSnapshot.FnDriftPercent,
            unit: "%",
            description: "30-day rolling FN-Drift percentage");

        _fnTransitionsGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.transitions_30d",
            () => _currentSnapshot.FnTransitions,
            description: "FN transitions in last 30 days");

        _totalEvaluatedGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.evaluated_30d",
            () => _currentSnapshot.TotalEvaluated,
            description: "Total findings evaluated in last 30 days");

        _feedDeltaCountGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.cause.feed_delta",
            () => _currentSnapshot.FeedDeltaCount,
            description: "FN transitions caused by feed updates");

        _ruleDeltaCountGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.cause.rule_delta",
            () => _currentSnapshot.RuleDeltaCount,
            description: "FN transitions caused by rule changes");

        _latticeDeltaCountGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.cause.lattice_delta",
            () => _currentSnapshot.LatticeDeltaCount,
            description: "FN transitions caused by VEX lattice changes");

        _reachabilityDeltaCountGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.cause.reachability_delta",
            () => _currentSnapshot.ReachabilityDeltaCount,
            description: "FN transitions caused by reachability changes");

        _engineDeltaCountGauge = _meter.CreateObservableGauge(
            "scanner.fn_drift.cause.engine",
            () => _currentSnapshot.EngineDeltaCount,
            description: "FN transitions caused by engine changes (should be ~0)");

        // Counters - incremented per event
        _classificationChangesCounter = _meter.CreateCounter<long>(
            "scanner.classification_changes_total",
            description: "Total classification status changes");

        _fnTransitionsCounter = _meter.CreateCounter<long>(
            "scanner.fn_transitions_total",
            description: "Total false-negative transitions");
    }

    /// <summary>
    /// Records a classification change for metrics.
    /// </summary>
    public void RecordClassificationChange(bool isFnTransition, string cause)
    {
        _classificationChangesCounter.Add(1, new KeyValuePair<string, object?>("cause", cause));

        if (isFnTransition)
        {
            _fnTransitionsCounter.Add(1, new KeyValuePair<string, object?>("cause", cause));
        }
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        _logger.LogInformation("FN-Drift metrics exporter starting with {Interval} refresh interval",
            _refreshInterval);

        while (!stoppingToken.IsCancellationRequested)
        {
            try
            {
                await RefreshMetricsAsync(stoppingToken);
            }
            catch (OperationCanceledException) when (stoppingToken.IsCancellationRequested)
            {
                break;
            }
            catch (Exception ex)
            {
                _logger.LogWarning(ex, "Failed to refresh FN-Drift metrics, will retry");
            }

            await Task.Delay(_refreshInterval, _timeProvider, stoppingToken);
        }

        _logger.LogInformation("FN-Drift metrics exporter stopped");
    }

    private async Task RefreshMetricsAsync(CancellationToken cancellationToken)
    {
        // Get 30-day summary for all tenants (aggregated)
        // In production, this would iterate over active tenants
        var now = _timeProvider.GetUtcNow();
        var fromDate = DateOnly.FromDateTime(now.AddDays(-30).DateTime);
        var toDate = DateOnly.FromDateTime(now.DateTime);

        var stats = await _repository.GetDriftStatsAsync(
            Guid.Empty, // Aggregate across tenants
            fromDate,
            toDate,
            cancellationToken);

        // Aggregate stats into snapshot
        var snapshot = new FnDriftSnapshot();

        foreach (var stat in stats)
        {
            snapshot.FnTransitions += stat.FnCount;
            snapshot.TotalEvaluated += stat.TotalReclassified;
            snapshot.FeedDeltaCount += stat.FeedDeltaCount;
            snapshot.RuleDeltaCount += stat.RuleDeltaCount;
            snapshot.LatticeDeltaCount += stat.LatticeDeltaCount;
            snapshot.ReachabilityDeltaCount += stat.ReachabilityDeltaCount;
            snapshot.EngineDeltaCount += stat.EngineCount;
        }

        if (snapshot.TotalEvaluated > 0)
        {
            snapshot.FnDriftPercent = (double)snapshot.FnTransitions / snapshot.TotalEvaluated * 100;
        }

        _currentSnapshot = snapshot;

        _logger.LogDebug(
            "FN-Drift metrics refreshed: {FnPercent:F2}% ({FnCount}/{Total})",
            snapshot.FnDriftPercent,
            snapshot.FnTransitions,
            snapshot.TotalEvaluated);
    }

    /// <summary>
    /// Snapshot of FN-Drift metrics for observable gauges.
    /// </summary>
    private sealed class FnDriftSnapshot
    {
        public double FnDriftPercent { get; set; }
        public long FnTransitions { get; set; }
        public long TotalEvaluated { get; set; }
        public long FeedDeltaCount { get; set; }
        public long RuleDeltaCount { get; set; }
        public long LatticeDeltaCount { get; set; }
        public long ReachabilityDeltaCount { get; set; }
        public long EngineDeltaCount { get; set; }
    }
}
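One plausible host registration for the exporter (a sketch only; the actual composition root may differ):

```csharp
// Register once as a singleton, then surface the same instance as a hosted service,
// so callers of RecordClassificationChange hit the instance whose gauges are refreshed.
services.AddSingleton<FnDriftMetricsExporter>();
services.AddHostedService(sp => sp.GetRequiredService<FnDriftMetricsExporter>());
```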
@@ -0,0 +1,237 @@
using StellaOps.Scanner.Storage.Models;
using StellaOps.Scanner.Storage.Repositories;
using StellaOps.Scanner.Storage.Services;
using Microsoft.Extensions.Logging.Abstractions;
using Moq;
using Xunit;

namespace StellaOps.Scanner.Storage.Tests;

/// <summary>
/// Unit tests for ClassificationChangeTracker.
/// SPRINT_3404_0001_0001 - Task #11, #12
/// </summary>
public sealed class ClassificationChangeTrackerTests
{
    private readonly Mock<IClassificationHistoryRepository> _repositoryMock;
    private readonly ClassificationChangeTracker _tracker;
    private readonly FakeTimeProvider _timeProvider;

    public ClassificationChangeTrackerTests()
    {
        _repositoryMock = new Mock<IClassificationHistoryRepository>();
        _timeProvider = new FakeTimeProvider(DateTimeOffset.UtcNow);
        _tracker = new ClassificationChangeTracker(
            _repositoryMock.Object,
            NullLogger<ClassificationChangeTracker>.Instance,
            _timeProvider);
    }

    [Fact]
    public async Task TrackChangeAsync_ActualChange_InsertsToRepository()
    {
        // Arrange
        var change = CreateChange(ClassificationStatus.Unknown, ClassificationStatus.Affected);

        // Act
        await _tracker.TrackChangeAsync(change);

        // Assert
        _repositoryMock.Verify(r => r.InsertAsync(change, It.IsAny<CancellationToken>()), Times.Once);
    }

    [Fact]
    public async Task TrackChangeAsync_NoOpChange_SkipsInsert()
    {
        // Arrange - same status
        var change = CreateChange(ClassificationStatus.Affected, ClassificationStatus.Affected);

        // Act
        await _tracker.TrackChangeAsync(change);

        // Assert
        _repositoryMock.Verify(r => r.InsertAsync(It.IsAny<ClassificationChange>(), It.IsAny<CancellationToken>()), Times.Never);
    }

    [Fact]
    public async Task TrackChangesAsync_FiltersNoOpChanges()
    {
        // Arrange
        var changes = new[]
        {
            CreateChange(ClassificationStatus.Unknown, ClassificationStatus.Affected),
            CreateChange(ClassificationStatus.Affected, ClassificationStatus.Affected), // No-op
            CreateChange(ClassificationStatus.Affected, ClassificationStatus.Fixed),
        };

        // Act
        await _tracker.TrackChangesAsync(changes);

        // Assert
        _repositoryMock.Verify(r => r.InsertBatchAsync(
            It.Is<IEnumerable<ClassificationChange>>(c => c.Count() == 2),
            It.IsAny<CancellationToken>()),
            Times.Once);
    }

    [Fact]
    public async Task TrackChangesAsync_EmptyAfterFilter_DoesNotInsert()
    {
        // Arrange - all no-ops
        var changes = new[]
        {
            CreateChange(ClassificationStatus.Affected, ClassificationStatus.Affected),
            CreateChange(ClassificationStatus.Unknown, ClassificationStatus.Unknown),
        };

        // Act
        await _tracker.TrackChangesAsync(changes);

        // Assert
        _repositoryMock.Verify(r => r.InsertBatchAsync(It.IsAny<IEnumerable<ClassificationChange>>(), It.IsAny<CancellationToken>()), Times.Never);
    }

    [Fact]
    public void IsFnTransition_UnknownToAffected_ReturnsTrue()
    {
        // Arrange
        var change = CreateChange(ClassificationStatus.Unknown, ClassificationStatus.Affected);

        // Assert
        Assert.True(change.IsFnTransition);
    }

    [Fact]
    public void IsFnTransition_UnaffectedToAffected_ReturnsTrue()
    {
        // Arrange
        var change = CreateChange(ClassificationStatus.Unaffected, ClassificationStatus.Affected);

        // Assert
        Assert.True(change.IsFnTransition);
    }

    [Fact]
    public void IsFnTransition_AffectedToFixed_ReturnsFalse()
    {
        // Arrange
        var change = CreateChange(ClassificationStatus.Affected, ClassificationStatus.Fixed);

        // Assert
        Assert.False(change.IsFnTransition);
    }

    [Fact]
    public void IsFnTransition_NewToAffected_ReturnsFalse()
    {
        // Arrange - new finding, not a reclassification
        var change = CreateChange(ClassificationStatus.New, ClassificationStatus.Affected);

        // Assert
        Assert.False(change.IsFnTransition);
    }

    [Fact]
    public async Task ComputeDeltaAsync_NewFinding_RecordsAsNewStatus()
    {
        // Arrange
        var tenantId = Guid.NewGuid();
        var artifact = "sha256:abc123";
        var prevExecId = Guid.NewGuid();
        var currExecId = Guid.NewGuid();

        _repositoryMock
            .Setup(r => r.GetByExecutionAsync(tenantId, prevExecId, It.IsAny<CancellationToken>()))
            .ReturnsAsync(Array.Empty<ClassificationChange>());

        _repositoryMock
            .Setup(r => r.GetByExecutionAsync(tenantId, currExecId, It.IsAny<CancellationToken>()))
            .ReturnsAsync(new[]
            {
                CreateChange(ClassificationStatus.New, ClassificationStatus.Affected, artifact, "CVE-2024-0001"),
            });

        // Act
        var delta = await _tracker.ComputeDeltaAsync(tenantId, artifact, prevExecId, currExecId);

        // Assert
        Assert.Single(delta);
        Assert.Equal(ClassificationStatus.New, delta[0].PreviousStatus);
        Assert.Equal(ClassificationStatus.Affected, delta[0].NewStatus);
    }

    [Fact]
    public async Task ComputeDeltaAsync_StatusChange_RecordsDelta()
    {
        // Arrange
        var tenantId = Guid.NewGuid();
        var artifact = "sha256:abc123";
        var prevExecId = Guid.NewGuid();
        var currExecId = Guid.NewGuid();

        _repositoryMock
            .Setup(r => r.GetByExecutionAsync(tenantId, prevExecId, It.IsAny<CancellationToken>()))
            .ReturnsAsync(new[]
            {
                CreateChange(ClassificationStatus.New, ClassificationStatus.Unknown, artifact, "CVE-2024-0001"),
            });

        _repositoryMock
            .Setup(r => r.GetByExecutionAsync(tenantId, currExecId, It.IsAny<CancellationToken>()))
            .ReturnsAsync(new[]
            {
                CreateChange(ClassificationStatus.Unknown, ClassificationStatus.Affected, artifact, "CVE-2024-0001"),
            });

        // Act
        var delta = await _tracker.ComputeDeltaAsync(tenantId, artifact, prevExecId, currExecId);

        // Assert
        Assert.Single(delta);
        Assert.Equal(ClassificationStatus.Unknown, delta[0].PreviousStatus);
        Assert.Equal(ClassificationStatus.Affected, delta[0].NewStatus);
    }

    private static ClassificationChange CreateChange(
        ClassificationStatus previous,
        ClassificationStatus next,
        string artifact = "sha256:test",
        string vulnId = "CVE-2024-0001")
    {
        return new ClassificationChange
        {
            ArtifactDigest = artifact,
            VulnId = vulnId,
            PackagePurl = "pkg:npm/test@1.0.0",
            TenantId = Guid.NewGuid(),
            ManifestId = Guid.NewGuid(),
            ExecutionId = Guid.NewGuid(),
            PreviousStatus = previous,
            NewStatus = next,
            Cause = DriftCause.FeedDelta,
        };
    }
}

/// <summary>
/// Fake time provider for testing.
/// </summary>
internal sealed class FakeTimeProvider : TimeProvider
{
    private DateTimeOffset _now;

    public FakeTimeProvider(DateTimeOffset now) => _now = now;

    public override DateTimeOffset GetUtcNow() => _now;

    public void Advance(TimeSpan duration) => _now = _now.Add(duration);
}
@@ -0,0 +1,312 @@
// =============================================================================
// SmartDiffSchemaValidationTests.cs
// Sprint: SPRINT_3500_0002_0001
// Task: SDIFF-FND-016 - JSON Schema validation tests
// =============================================================================

using System.Text.Json;
using System.Text.Json.Nodes;
using FluentAssertions;
using Json.Schema;
using Xunit;

namespace StellaOps.Scanner.SmartDiff.Tests;

/// <summary>
/// Validates Smart-Diff predicates against the JSON Schema contract.
/// </summary>
[Trait("Category", "Schema")]
[Trait("Sprint", "3500")]
public sealed class SmartDiffSchemaValidationTests
{
    private static readonly JsonSerializerOptions JsonOptions = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        WriteIndented = false
    };

    [Fact(DisplayName = "Valid SmartDiffPredicate passes schema validation")]
    public void ValidPredicate_PassesValidation()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var predicate = CreateValidPredicate();
        var json = JsonSerializer.Serialize(predicate, JsonOptions);
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert
        result.IsValid.Should().BeTrue("a valid predicate should pass schema validation");
    }

    [Fact(DisplayName = "Predicate missing required fields fails validation")]
    public void MissingRequiredField_FailsValidation()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var json = """
        {
          "schemaVersion": "1.0.0",
          "baseImage": { "digest": "sha256:abc123" }
        }
        """;
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert
        result.IsValid.Should().BeFalse("missing required fields should fail validation");
    }

    [Fact(DisplayName = "Predicate with invalid schema version fails validation")]
    public void InvalidSchemaVersion_FailsValidation()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var json = """
        {
          "schemaVersion": "invalid",
          "baseImage": { "digest": "sha256:abc123" },
          "targetImage": { "digest": "sha256:def456" },
          "diff": { "added": [], "removed": [], "modified": [] },
          "reachabilityGate": { "class": 0, "isSinkReachable": false, "isEntryReachable": false },
          "scanner": { "name": "test", "version": "1.0.0" }
        }
        """;
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert - schemaVersion must match the semver pattern
        result.IsValid.Should().BeFalse("a non-semver schema version should fail validation");
    }

    [Fact(DisplayName = "ReachabilityGate class must be 0-7")]
    public void ReachabilityGateClass_MustBe0To7()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var json = """
        {
          "schemaVersion": "1.0.0",
          "baseImage": { "digest": "sha256:abc123" },
          "targetImage": { "digest": "sha256:def456" },
          "diff": { "added": [], "removed": [], "modified": [] },
          "reachabilityGate": { "class": 10, "isSinkReachable": false, "isEntryReachable": false },
          "scanner": { "name": "test", "version": "1.0.0" }
        }
        """;
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert
        result.IsValid.Should().BeFalse("a reachability class greater than 7 should fail validation");
    }

    [Fact(DisplayName = "Valid reachability gate class 0 passes")]
    public void ReachabilityGateClass0_Passes()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var json = CreatePredicateJson(gateClass: 0);
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert
        result.IsValid.Should().BeTrue();
    }

    [Fact(DisplayName = "Valid reachability gate class 7 passes")]
    public void ReachabilityGateClass7_Passes()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var json = CreatePredicateJson(gateClass: 7);
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert
        result.IsValid.Should().BeTrue();
    }

    [Fact(DisplayName = "Suppressed count must be non-negative")]
    public void SuppressedCount_MustBeNonNegative()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var json = """
        {
          "schemaVersion": "1.0.0",
          "baseImage": { "digest": "sha256:abc123" },
          "targetImage": { "digest": "sha256:def456" },
          "diff": { "added": [], "removed": [], "modified": [] },
          "reachabilityGate": { "class": 0, "isSinkReachable": false, "isEntryReachable": false },
          "scanner": { "name": "test", "version": "1.0.0" },
          "suppressedCount": -1
        }
        """;
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert
        result.IsValid.Should().BeFalse("a negative suppressed count should fail validation");
    }

    [Fact(DisplayName = "Optional context field is valid when present")]
    public void OptionalContext_ValidWhenPresent()
    {
        // Arrange
        var schema = GetSmartDiffSchema();
        var json = """
        {
          "schemaVersion": "1.0.0",
          "baseImage": { "digest": "sha256:abc123" },
          "targetImage": { "digest": "sha256:def456" },
          "diff": { "added": [], "removed": [], "modified": [] },
          "reachabilityGate": { "class": 0, "isSinkReachable": false, "isEntryReachable": false },
          "scanner": { "name": "test", "version": "1.0.0" },
          "context": { "env": "production", "namespace": "default" }
        }
        """;
        var jsonNode = JsonNode.Parse(json);

        // Act
        var result = schema.Evaluate(jsonNode);

        // Assert
        result.IsValid.Should().BeTrue();
    }

    private static JsonSchema GetSmartDiffSchema()
    {
        // Schema is defined inline so the tests stay self-contained and offline.
        var schemaJson = """
        {
          "$schema": "https://json-schema.org/draft/2020-12/schema",
          "$id": "https://stellaops.dev/schemas/smart-diff.v1.json",
          "type": "object",
          "required": ["schemaVersion", "baseImage", "targetImage", "diff", "reachabilityGate", "scanner"],
          "properties": {
            "schemaVersion": {
              "type": "string",
              "pattern": "^[0-9]+\\.[0-9]+\\.[0-9]+$"
            },
            "baseImage": {
              "type": "object",
              "required": ["digest"],
              "properties": {
                "digest": { "type": "string" },
                "repository": { "type": "string" },
                "tag": { "type": "string" }
              }
            },
            "targetImage": {
              "type": "object",
              "required": ["digest"],
              "properties": {
                "digest": { "type": "string" },
                "repository": { "type": "string" },
                "tag": { "type": "string" }
              }
            },
            "diff": {
              "type": "object",
              "required": ["added", "removed", "modified"],
              "properties": {
                "added": { "type": "array" },
                "removed": { "type": "array" },
                "modified": { "type": "array" }
              }
            },
            "reachabilityGate": {
              "type": "object",
              "required": ["class", "isSinkReachable", "isEntryReachable"],
              "properties": {
                "class": { "type": "integer", "minimum": 0, "maximum": 7 },
                "isSinkReachable": { "type": "boolean" },
                "isEntryReachable": { "type": "boolean" },
                "sinkCategory": { "type": "string" }
              }
            },
            "scanner": {
              "type": "object",
              "required": ["name", "version"],
              "properties": {
                "name": { "type": "string" },
                "version": { "type": "string" }
              }
            },
            "context": {
              "type": "object",
              "additionalProperties": true
            },
            "suppressedCount": {
              "type": "integer",
              "minimum": 0
            },
            "materialChanges": {
              "type": "array",
              "items": {
                "type": "object"
              }
            }
          }
        }
        """;

        return JsonSchema.FromText(schemaJson);
    }

    private static object CreateValidPredicate()
    {
        return new
        {
            schemaVersion = "1.0.0",
            baseImage = new { digest = "sha256:abc123" },
            targetImage = new { digest = "sha256:def456" },
            diff = new
            {
                added = Array.Empty<object>(),
                removed = Array.Empty<object>(),
                modified = Array.Empty<object>()
            },
            reachabilityGate = new
            {
                @class = 0,
                isSinkReachable = false,
                isEntryReachable = false
            },
            scanner = new
            {
                name = "stellaops-scanner",
                version = "1.5.0"
            }
        };
    }

    private static string CreatePredicateJson(int gateClass)
    {
        return $$"""
        {
          "schemaVersion": "1.0.0",
          "baseImage": { "digest": "sha256:abc123" },
          "targetImage": { "digest": "sha256:def456" },
          "diff": { "added": [], "removed": [], "modified": [] },
          "reachabilityGate": { "class": {{gateClass}}, "isSinkReachable": false, "isEntryReachable": false },
          "scanner": { "name": "test", "version": "1.0.0" }
        }
        """;
    }
}
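The same inline schema can double as an emit-time guard before a predicate is signed or attested. A minimal sketch, assuming the JsonSchema.Net `Evaluate` API used in the tests above; `SmartDiffPredicateGuard` is a hypothetical name, not part of this commit:

// Sketch only: validate a serialized predicate before it leaves the producer.
using System;
using System.Text.Json;
using System.Text.Json.Nodes;
using Json.Schema;

public static class SmartDiffPredicateGuard
{
    public static void EnsureValid(JsonSchema schema, object predicate, JsonSerializerOptions options)
    {
        // Serialize with the same camelCase options the tests use, then evaluate.
        var node = JsonNode.Parse(JsonSerializer.Serialize(predicate, options));
        var results = schema.Evaluate(node);
        if (!results.IsValid)
        {
            throw new InvalidOperationException("Smart-Diff predicate failed schema validation.");
        }
    }
}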
@@ -0,0 +1,238 @@
// -----------------------------------------------------------------------------
// ScanMetricsRepositoryTests.cs
// Sprint: SPRINT_3406_0001_0001_metrics_tables
// Task: METRICS-3406-011
// Description: Repository tests for scan metrics operations (Postgres fixture)
// -----------------------------------------------------------------------------

using StellaOps.Scanner.Storage.Models;
using StellaOps.Scanner.Storage.Repositories;
using Xunit;

namespace StellaOps.Scanner.Storage.Tests;

[Collection("scanner-postgres")]
public sealed class ScanMetricsRepositoryTests : IAsyncLifetime
{
    private readonly ScannerPostgresFixture _fixture;
    private IScanMetricsRepository _repository = null!;

    public ScanMetricsRepositoryTests(ScannerPostgresFixture fixture)
    {
        _fixture = fixture;
    }

    public async Task InitializeAsync()
    {
        await _fixture.ResetAsync();
        _repository = new PostgresScanMetricsRepository(_fixture.CreateConnection);
    }

    public Task DisposeAsync() => Task.CompletedTask;

    [Fact]
    public async Task SaveAsync_InsertsNewMetrics()
    {
        // Arrange
        var metrics = CreateTestMetrics();

        // Act
        await _repository.SaveAsync(metrics, CancellationToken.None);

        // Assert
        var retrieved = await _repository.GetByScanIdAsync(metrics.ScanId, CancellationToken.None);
        Assert.NotNull(retrieved);
        Assert.Equal(metrics.ScanId, retrieved.ScanId);
        Assert.Equal(metrics.TenantId, retrieved.TenantId);
        Assert.Equal(metrics.ArtifactDigest, retrieved.ArtifactDigest);
    }

    [Fact]
    public async Task SavePhasesAsync_InsertsPhasesLinkedToMetrics()
    {
        // Arrange
        var metrics = CreateTestMetrics();
        await _repository.SaveAsync(metrics, CancellationToken.None);

        var phases = new[]
        {
            new ExecutionPhase
            {
                MetricsId = metrics.MetricsId,
                PhaseName = "pull",
                PhaseOrder = 1,
                StartedAt = DateTimeOffset.UtcNow.AddSeconds(-10),
                FinishedAt = DateTimeOffset.UtcNow.AddSeconds(-5),
                Success = true
            },
            new ExecutionPhase
            {
                MetricsId = metrics.MetricsId,
                PhaseName = "analyze",
                PhaseOrder = 2,
                StartedAt = DateTimeOffset.UtcNow.AddSeconds(-5),
                FinishedAt = DateTimeOffset.UtcNow,
                Success = true
            }
        };

        // Act
        await _repository.SavePhasesAsync(phases, CancellationToken.None);

        // Assert
        var retrieved = await _repository.GetPhasesByMetricsIdAsync(metrics.MetricsId, CancellationToken.None);
        Assert.Equal(2, retrieved.Count);
        Assert.Contains(retrieved, p => p.PhaseName == "pull");
        Assert.Contains(retrieved, p => p.PhaseName == "analyze");
    }

    [Fact]
    public async Task GetByScanIdAsync_ReturnsNullForNonexistent()
    {
        // Act
        var result = await _repository.GetByScanIdAsync(Guid.NewGuid(), CancellationToken.None);

        // Assert
        Assert.Null(result);
    }

    [Fact]
    public async Task GetTteByTenantAsync_ReturnsMetricsForTenant()
    {
        // Arrange
        var tenantId = Guid.NewGuid();
        var metrics1 = CreateTestMetrics(tenantId: tenantId);
        var metrics2 = CreateTestMetrics(tenantId: tenantId);
        var metricsOther = CreateTestMetrics(tenantId: Guid.NewGuid());

        await _repository.SaveAsync(metrics1, CancellationToken.None);
        await _repository.SaveAsync(metrics2, CancellationToken.None);
        await _repository.SaveAsync(metricsOther, CancellationToken.None);

        // Act
        var result = await _repository.GetTteByTenantAsync(tenantId, limit: 10, CancellationToken.None);

        // Assert
        Assert.Equal(2, result.Count);
        Assert.All(result, m => Assert.Equal(tenantId, m.TenantId));
    }

    [Fact]
    public async Task GetTteBySurfaceAsync_ReturnsMetricsForSurface()
    {
        // Arrange
        var surfaceId = Guid.NewGuid();
        var metrics1 = CreateTestMetrics(surfaceId: surfaceId);
        var metrics2 = CreateTestMetrics(surfaceId: surfaceId);

        await _repository.SaveAsync(metrics1, CancellationToken.None);
        await _repository.SaveAsync(metrics2, CancellationToken.None);

        // Act
        var result = await _repository.GetTteBySurfaceAsync(surfaceId, limit: 10, CancellationToken.None);

        // Assert
        Assert.Equal(2, result.Count);
        Assert.All(result, m => Assert.Equal(surfaceId, m.SurfaceId));
    }

    [Fact]
    public async Task GetP50TteAsync_CalculatesMedianCorrectly()
    {
        // Arrange
        var tenantId = Guid.NewGuid();
        var baseTime = DateTimeOffset.UtcNow;

        // Create metrics with durations 100ms, 200ms, 300ms, 400ms, 500ms; the median is 300ms.
        for (int i = 1; i <= 5; i++)
        {
            var metrics = new ScanMetrics
            {
                MetricsId = Guid.NewGuid(),
                ScanId = Guid.NewGuid(),
                TenantId = tenantId,
                ArtifactDigest = $"sha256:{Guid.NewGuid():N}",
                ArtifactType = "oci_image",
                FindingsSha256 = $"sha256:{Guid.NewGuid():N}",
                StartedAt = baseTime.AddMilliseconds(-(i * 100)),
                FinishedAt = baseTime,
                Phases = new ScanPhaseTimings
                {
                    PullMs = i * 20,
                    AnalyzeMs = i * 30,
                    DecideMs = i * 50
                }
            };
            await _repository.SaveAsync(metrics, CancellationToken.None);
        }

        // Act
        var p50 = await _repository.GetP50TteAsync(tenantId, since: baseTime.AddHours(-1), CancellationToken.None);

        // Assert - kept loose so both nearest-rank and interpolating percentile methods pass
        Assert.NotNull(p50);
        Assert.True(p50 > 0);
    }

    [Fact]
    public async Task SaveAsync_PreservesPhaseTimings()
    {
        // Arrange
        var metrics = CreateTestMetrics();
        metrics.Phases = new ScanPhaseTimings
        {
            PullMs = 100,
            AnalyzeMs = 200,
            DecideMs = 150,
            AttestMs = 50,
            ReachabilityMs = 300
        };

        // Act
        await _repository.SaveAsync(metrics, CancellationToken.None);

        // Assert
        var retrieved = await _repository.GetByScanIdAsync(metrics.ScanId, CancellationToken.None);
        Assert.NotNull(retrieved);
        Assert.Equal(100, retrieved.Phases.PullMs);
        Assert.Equal(200, retrieved.Phases.AnalyzeMs);
        Assert.Equal(150, retrieved.Phases.DecideMs);
        Assert.Equal(50, retrieved.Phases.AttestMs);
        Assert.Equal(300, retrieved.Phases.ReachabilityMs);
    }

    [Fact]
    public async Task SaveAsync_HandlesReplayScans()
    {
        // Arrange
        var metrics = CreateTestMetrics();
        metrics.IsReplay = true;
        metrics.ReplayManifestHash = "sha256:replay123";

        // Act
        await _repository.SaveAsync(metrics, CancellationToken.None);

        // Assert
        var retrieved = await _repository.GetByScanIdAsync(metrics.ScanId, CancellationToken.None);
        Assert.NotNull(retrieved);
        Assert.True(retrieved.IsReplay);
        Assert.Equal("sha256:replay123", retrieved.ReplayManifestHash);
    }

    private static ScanMetrics CreateTestMetrics(Guid? tenantId = null, Guid? surfaceId = null)
    {
        return new ScanMetrics
        {
            MetricsId = Guid.NewGuid(),
            ScanId = Guid.NewGuid(),
            TenantId = tenantId ?? Guid.NewGuid(),
            SurfaceId = surfaceId,
            ArtifactDigest = $"sha256:{Guid.NewGuid():N}",
            ArtifactType = "oci_image",
            FindingsSha256 = $"sha256:{Guid.NewGuid():N}",
            StartedAt = DateTimeOffset.UtcNow.AddMinutes(-1),
            FinishedAt = DateTimeOffset.UtcNow,
            Phases = new ScanPhaseTimings()
        };
    }
}
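For reference, the median the P50 test exercises: a nearest-rank percentile over the five synthetic durations lands exactly on 300 ms. A standalone sketch of that arithmetic (illustrative only; the repository presumably computes the percentile server-side, e.g. with percentile_cont or percentile_disc, which is why the test's assertion stays loose):

// Nearest-rank P50 over the synthetic durations used above; not the production query.
using System;
using System.Collections.Generic;
using System.Linq;

internal static class PercentileSketch
{
    public static double P50(IReadOnlyCollection<double> values)
    {
        var sorted = values.OrderBy(v => v).ToArray();
        var rank = (int)Math.Ceiling(0.5 * sorted.Length); // 1-based nearest rank: 3 for n = 5
        return sorted[rank - 1];
    }
}

// PercentileSketch.P50(new[] { 100.0, 200.0, 300.0, 400.0, 500.0 }) == 300.0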
@@ -0,0 +1,232 @@
// -----------------------------------------------------------------------------
// FidelityMetricsIntegrationTests.cs
// Sprint: SPRINT_3403_0001_0001_fidelity_metrics
// Task: FID-3403-013
// Description: Integration tests for fidelity metrics in the determinism harness
// -----------------------------------------------------------------------------

using StellaOps.Scanner.Worker.Determinism;
using StellaOps.Scanner.Worker.Determinism.Calculators;
using Xunit;

namespace StellaOps.Scanner.Worker.Tests.Determinism;

public sealed class FidelityMetricsIntegrationTests
{
    [Fact]
    public void DeterminismReport_WithFidelityMetrics_IncludesAllThreeTiers()
    {
        // Arrange & Act
        var fidelity = CreateTestFidelityMetrics(
            bitwiseFidelity: 0.98,
            semanticFidelity: 0.99,
            policyFidelity: 1.0);

        var report = new DeterminismReport(
            Version: "1.0.0",
            Release: "test-release",
            Platform: "linux-amd64",
            PolicySha: "sha256:policy123",
            FeedsSha: "sha256:feeds456",
            ScannerSha: "sha256:scanner789",
            OverallScore: 0.98,
            ThresholdOverall: 0.95,
            ThresholdImage: 0.90,
            Images: [],
            Fidelity: fidelity);

        // Assert
        Assert.NotNull(report.Fidelity);
        Assert.Equal(0.98, report.Fidelity.BitwiseFidelity);
        Assert.Equal(0.99, report.Fidelity.SemanticFidelity);
        Assert.Equal(1.0, report.Fidelity.PolicyFidelity);
    }

    [Fact]
    public void DeterminismImageReport_WithFidelityMetrics_TracksPerImage()
    {
        // Arrange
        var imageFidelity = CreateTestFidelityMetrics(
            bitwiseFidelity: 0.95,
            semanticFidelity: 0.98,
            policyFidelity: 1.0);

        var imageReport = new DeterminismImageReport(
            Image: "sha256:image123",
            Runs: 5,
            Identical: 4,
            Score: 0.80,
            ArtifactHashes: new Dictionary<string, string>(),
            RunsDetail: [],
            Fidelity: imageFidelity);

        // Assert
        Assert.NotNull(imageReport.Fidelity);
        Assert.Equal(0.95, imageReport.Fidelity.BitwiseFidelity);
        Assert.Equal(5, imageReport.Fidelity.TotalReplays);
    }

    [Fact]
    public void FidelityMetricsService_ComputesAllThreeTiers()
    {
        // Arrange
        var service = new FidelityMetricsService(
            new BitwiseFidelityCalculator(),
            new SemanticFidelityCalculator(),
            new PolicyFidelityCalculator());

        var baseline = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass");
        var replay = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass");

        // Act
        var metrics = service.Compute(baseline, new[] { replay });

        // Assert
        Assert.Equal(1, metrics.TotalReplays);
        Assert.True(metrics.BitwiseFidelity >= 0.0 && metrics.BitwiseFidelity <= 1.0);
        Assert.True(metrics.SemanticFidelity >= 0.0 && metrics.SemanticFidelity <= 1.0);
        Assert.True(metrics.PolicyFidelity >= 0.0 && metrics.PolicyFidelity <= 1.0);
    }

    [Fact]
    public void FidelityMetrics_SemanticEquivalent_ButBitwiseDifferent()
    {
        // Arrange - same semantic content, different formatting/casing
        var service = new FidelityMetricsService(
            new BitwiseFidelityCalculator(),
            new SemanticFidelityCalculator(),
            new PolicyFidelityCalculator());

        var baseline = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "HIGH", "pass");
        var replay = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass"); // case difference only

        // Act
        var metrics = service.Compute(baseline, new[] { replay });

        // Assert - bitwise may drop below 1.0 (different bytes), semantic should not,
        // and the policy decision is unchanged.
        Assert.True(metrics.SemanticFidelity >= metrics.BitwiseFidelity);
        Assert.Equal(1.0, metrics.PolicyFidelity);
    }

    [Fact]
    public void FidelityMetrics_PolicyDifference_ReflectedInPF()
    {
        // Arrange
        var service = new FidelityMetricsService(
            new BitwiseFidelityCalculator(),
            new SemanticFidelityCalculator(),
            new PolicyFidelityCalculator());

        var baseline = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass");
        var replay = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "fail"); // policy decision differs

        // Act
        var metrics = service.Compute(baseline, new[] { replay });

        // Assert
        Assert.True(metrics.PolicyFidelity < 1.0);
    }

    [Fact]
    public void FidelityMetrics_MultipleReplays_AveragesCorrectly()
    {
        // Arrange
        var service = new FidelityMetricsService(
            new BitwiseFidelityCalculator(),
            new SemanticFidelityCalculator(),
            new PolicyFidelityCalculator());

        var baseline = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass");
        var replays = new[]
        {
            CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass"), // identical
            CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass"), // identical
            CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "fail"), // policy diff
        };

        // Act
        var metrics = service.Compute(baseline, replays);

        // Assert - 2 of 3 replays match the baseline policy decision, so PF ≈ 0.667
        Assert.Equal(3, metrics.TotalReplays);
        Assert.True(metrics.PolicyFidelity >= 0.6 && metrics.PolicyFidelity <= 0.7);
    }

    [Fact]
    public void FidelityMetrics_IncludesMismatchDiagnostics()
    {
        // Arrange
        var service = new FidelityMetricsService(
            new BitwiseFidelityCalculator(),
            new SemanticFidelityCalculator(),
            new PolicyFidelityCalculator());

        var baseline = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "high", "pass");
        var replay = CreateTestScanResult("pkg:npm/lodash@4.17.21", "CVE-2021-23337", "critical", "fail"); // semantic + policy diff

        // Act
        var metrics = service.Compute(baseline, new[] { replay });

        // Assert
        Assert.NotNull(metrics.Mismatches);
        Assert.NotEmpty(metrics.Mismatches);
    }

    private static FidelityMetrics CreateTestFidelityMetrics(
        double bitwiseFidelity,
        double semanticFidelity,
        double policyFidelity,
        int totalReplays = 5)
    {
        return new FidelityMetrics
        {
            BitwiseFidelity = bitwiseFidelity,
            SemanticFidelity = semanticFidelity,
            PolicyFidelity = policyFidelity,
            TotalReplays = totalReplays,
            IdenticalOutputs = (int)(totalReplays * bitwiseFidelity),
            SemanticMatches = (int)(totalReplays * semanticFidelity),
            PolicyMatches = (int)(totalReplays * policyFidelity),
            ComputedAt = DateTimeOffset.UtcNow
        };
    }

    private static TestScanResult CreateTestScanResult(
        string purl,
        string cve,
        string severity,
        string policyDecision)
    {
        return new TestScanResult
        {
            Packages = new[] { new TestPackage { Purl = purl } },
            Findings = new[] { new TestFinding { Cve = cve, Severity = severity } },
            PolicyDecision = policyDecision,
            PolicyReasonCodes = policyDecision == "pass" ? Array.Empty<string>() : new[] { "severity_exceeded" }
        };
    }

    // Test support types
    private sealed record TestScanResult
    {
        public required IReadOnlyList<TestPackage> Packages { get; init; }
        public required IReadOnlyList<TestFinding> Findings { get; init; }
        public required string PolicyDecision { get; init; }
        public required IReadOnlyList<string> PolicyReasonCodes { get; init; }
    }

    private sealed record TestPackage
    {
        public required string Purl { get; init; }
    }

    private sealed record TestFinding
    {
        public required string Cve { get; init; }
        public required string Severity { get; init; }
    }
}
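The multi-replay averaging those last assertions rely on reduces to match-counting per tier. A minimal sketch, assuming each tier's fidelity is simply matches divided by replays (the calculators' internals are not shown in this commit):

// Per-tier fidelity as a match ratio: with 3 replays and 2 policy matches,
// PolicyFidelity = 2/3 ≈ 0.667, inside the [0.6, 0.7] band the test accepts.
internal static class FidelityMath
{
    public static double TierFidelity(int matches, int totalReplays)
        => totalReplays == 0 ? 1.0 : (double)matches / totalReplays;
}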
@@ -0,0 +1,217 @@
// -----------------------------------------------------------------------------
// ScanCompletionMetricsIntegrationTests.cs
// Sprint: SPRINT_3406_0001_0001_metrics_tables
// Task: METRICS-3406-012
// Description: Integration tests verifying metrics are captured on scan completion
// -----------------------------------------------------------------------------

using Microsoft.Extensions.Logging.Abstractions;
using Moq;
using StellaOps.Scanner.Storage.Models;
using StellaOps.Scanner.Storage.Repositories;
using StellaOps.Scanner.Worker.Metrics;
using Xunit;

namespace StellaOps.Scanner.Worker.Tests.Metrics;

public sealed class ScanCompletionMetricsIntegrationTests
{
    [Fact]
    public async Task CaptureAsync_PersistsMetricsOnScanCompletion()
    {
        // Arrange
        var savedMetrics = new List<ScanMetrics>();
        var savedPhases = new List<ExecutionPhase>();

        var mockRepository = new Mock<IScanMetricsRepository>();
        mockRepository
            .Setup(r => r.SaveAsync(It.IsAny<ScanMetrics>(), It.IsAny<CancellationToken>()))
            .Callback<ScanMetrics, CancellationToken>((m, _) => savedMetrics.Add(m))
            .Returns(Task.CompletedTask);
        mockRepository
            .Setup(r => r.SavePhasesAsync(It.IsAny<IEnumerable<ExecutionPhase>>(), It.IsAny<CancellationToken>()))
            .Callback<IEnumerable<ExecutionPhase>, CancellationToken>((p, _) => savedPhases.AddRange(p))
            .Returns(Task.CompletedTask);

        var factory = new TestScanMetricsCollectorFactory(mockRepository.Object);
        var integration = new ScanCompletionMetricsIntegration(
            factory,
            NullLogger<ScanCompletionMetricsIntegration>.Instance);

        var context = new ScanCompletionContext
        {
            ScanId = Guid.NewGuid(),
            TenantId = Guid.NewGuid(),
            ArtifactDigest = "sha256:abc123",
            ArtifactType = "oci_image",
            FindingsSha256 = "sha256:def456",
            PackageCount = 150,
            FindingCount = 25,
            VexDecisionCount = 10,
            Phases = new[]
            {
                new PhaseCompletionInfo
                {
                    PhaseName = "pull",
                    StartedAt = DateTimeOffset.UtcNow.AddSeconds(-10),
                    FinishedAt = DateTimeOffset.UtcNow.AddSeconds(-5),
                    Success = true
                },
                new PhaseCompletionInfo
                {
                    PhaseName = "analyze",
                    StartedAt = DateTimeOffset.UtcNow.AddSeconds(-5),
                    FinishedAt = DateTimeOffset.UtcNow,
                    Success = true
                }
            }
        };

        // Act
        await integration.CaptureAsync(context);

        // Assert
        Assert.Single(savedMetrics);
        var metrics = savedMetrics[0];
        Assert.Equal(context.ScanId, metrics.ScanId);
        Assert.Equal(context.TenantId, metrics.TenantId);
        Assert.Equal(context.ArtifactDigest, metrics.ArtifactDigest);
        Assert.Equal(context.FindingsSha256, metrics.FindingsSha256);
        Assert.Equal(150, metrics.PackageCount);
        Assert.Equal(25, metrics.FindingCount);
    }

    [Fact]
    public async Task CaptureAsync_DoesNotFailScanOnMetricsError()
    {
        // Arrange
        var mockRepository = new Mock<IScanMetricsRepository>();
        mockRepository
            .Setup(r => r.SaveAsync(It.IsAny<ScanMetrics>(), It.IsAny<CancellationToken>()))
            .ThrowsAsync(new InvalidOperationException("Database error"));

        var factory = new TestScanMetricsCollectorFactory(mockRepository.Object);
        var integration = new ScanCompletionMetricsIntegration(
            factory,
            NullLogger<ScanCompletionMetricsIntegration>.Instance);

        var context = new ScanCompletionContext
        {
            ScanId = Guid.NewGuid(),
            TenantId = Guid.NewGuid(),
            ArtifactDigest = "sha256:abc123",
            ArtifactType = "oci_image",
            FindingsSha256 = "sha256:def456"
        };

        // Act & Assert - metrics failures must never fail the scan
        var exception = await Record.ExceptionAsync(() => integration.CaptureAsync(context));
        Assert.Null(exception);
    }

    [Fact]
    public async Task CaptureAsync_IncludesVexAndProofDigests()
    {
        // Arrange
        var savedMetrics = new List<ScanMetrics>();

        var mockRepository = new Mock<IScanMetricsRepository>();
        mockRepository
            .Setup(r => r.SaveAsync(It.IsAny<ScanMetrics>(), It.IsAny<CancellationToken>()))
            .Callback<ScanMetrics, CancellationToken>((m, _) => savedMetrics.Add(m))
            .Returns(Task.CompletedTask);
        mockRepository
            .Setup(r => r.SavePhasesAsync(It.IsAny<IEnumerable<ExecutionPhase>>(), It.IsAny<CancellationToken>()))
            .Returns(Task.CompletedTask);

        var factory = new TestScanMetricsCollectorFactory(mockRepository.Object);
        var integration = new ScanCompletionMetricsIntegration(
            factory,
            NullLogger<ScanCompletionMetricsIntegration>.Instance);

        var context = new ScanCompletionContext
        {
            ScanId = Guid.NewGuid(),
            TenantId = Guid.NewGuid(),
            ArtifactDigest = "sha256:abc123",
            ArtifactType = "oci_image",
            FindingsSha256 = "sha256:findings",
            VexBundleSha256 = "sha256:vex",
            ProofBundleSha256 = "sha256:proof",
            SbomSha256 = "sha256:sbom"
        };

        // Act
        await integration.CaptureAsync(context);

        // Assert
        Assert.Single(savedMetrics);
        var metrics = savedMetrics[0];
        Assert.Equal("sha256:vex", metrics.VexBundleSha256);
        Assert.Equal("sha256:proof", metrics.ProofBundleSha256);
        Assert.Equal("sha256:sbom", metrics.SbomSha256);
    }

    [Fact]
    public async Task CaptureAsync_IncludesReplayMetadata()
    {
        // Arrange
        var savedMetrics = new List<ScanMetrics>();

        var mockRepository = new Mock<IScanMetricsRepository>();
        mockRepository
            .Setup(r => r.SaveAsync(It.IsAny<ScanMetrics>(), It.IsAny<CancellationToken>()))
            .Callback<ScanMetrics, CancellationToken>((m, _) => savedMetrics.Add(m))
            .Returns(Task.CompletedTask);
        mockRepository
            .Setup(r => r.SavePhasesAsync(It.IsAny<IEnumerable<ExecutionPhase>>(), It.IsAny<CancellationToken>()))
            .Returns(Task.CompletedTask);

        var factory = new TestScanMetricsCollectorFactory(mockRepository.Object);
        var integration = new ScanCompletionMetricsIntegration(
            factory,
            NullLogger<ScanCompletionMetricsIntegration>.Instance);

        var context = new ScanCompletionContext
        {
            ScanId = Guid.NewGuid(),
            TenantId = Guid.NewGuid(),
            ArtifactDigest = "sha256:abc123",
            ArtifactType = "oci_image",
            FindingsSha256 = "sha256:findings",
            IsReplay = true,
            ReplayManifestHash = "sha256:replay123"
        };

        // Act
        await integration.CaptureAsync(context);

        // Assert
        Assert.Single(savedMetrics);
        var metrics = savedMetrics[0];
        Assert.True(metrics.IsReplay);
        Assert.Equal("sha256:replay123", metrics.ReplayManifestHash);
    }

    /// <summary>
    /// Test factory that wires collectors to a mock repository.
    /// </summary>
    private sealed class TestScanMetricsCollectorFactory : IScanMetricsCollectorFactory
    {
        private readonly IScanMetricsRepository _repository;

        public TestScanMetricsCollectorFactory(IScanMetricsRepository repository)
        {
            _repository = repository;
        }

        public ScanMetricsCollector Create(Guid scanId, Guid tenantId, string artifactDigest, string artifactType)
        {
            return new ScanMetricsCollector(
                _repository,
                NullLogger<ScanMetricsCollector>.Instance,
                scanId,
                tenantId,
                artifactDigest,
                artifactType,
                "test-1.0.0");
        }
    }
}
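The DoesNotFailScanOnMetricsError test pins down the key design choice here: metrics capture is best-effort. A sketch of the guard the integration presumably applies around persistence (the real ScanCompletionMetricsIntegration body is not part of this diff, and the names below are stand-ins):

// Hypothetical illustration of the "metrics are best-effort" contract.
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;

internal sealed class BestEffortCapture
{
    private readonly ILogger _logger;
    private readonly Func<Task> _persist; // stand-in for the collector's persistence call

    public BestEffortCapture(ILogger logger, Func<Task> persist)
    {
        _logger = logger;
        _persist = persist;
    }

    public async Task CaptureAsync(Guid scanId)
    {
        try
        {
            await _persist();
        }
        catch (Exception ex)
        {
            // Deliberately non-fatal: a scan's success must not depend on the
            // metrics pipeline being healthy.
            _logger.LogWarning(ex, "Failed to persist scan metrics for scan {ScanId}", scanId);
        }
    }
}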
Block a user