feat: Implement PackRunApprovalDecisionService for handling approval decisions

- Added PackRunApprovalDecisionService to manage approval workflows for pack runs.
- Introduced PackRunApprovalDecisionRequest and PackRunApprovalDecisionResult records.
- Implemented logic to apply approval decisions and schedule run resumes based on approvals.
- Updated related tests to validate approval decision functionality.

test: Enhance tests for PackRunApprovalDecisionService

- Created PackRunApprovalDecisionServiceTests to cover various approval scenarios.
- Added in-memory stores for approvals and states to facilitate testing.
- Validated behavior for applying approvals, including handling missing states.

test: Add FilesystemPackRunArtifactUploaderTests for artifact uploads

- Implemented tests for FilesystemPackRunArtifactUploader to ensure correct file handling.
- Verified that missing files are recorded without exceptions and outputs are written as expected.

fix: Update PackRunState creation to include plan reference

- Modified PackRunState creation logic to include the plan in the state.

chore: Refactor service registration in Program.cs

- Updated service registrations in Program.cs to include new approval store and dispatcher services.
- Ensured proper dependency injection for PackRunApprovalDecisionService.

chore: Enhance TaskRunnerServiceOptions for approval store paths

- Added ApprovalStorePath and other paths to TaskRunnerServiceOptions for better configuration.

chore: Update PackRunWorkerService to handle artifact uploads

- Integrated artifact uploading into PackRunWorkerService upon successful run completion.

docs: Update TASKS.md for sprint progress

- Documented progress on approvals workflow and related tasks in TASKS.md.
This commit is contained in:
master
2025-11-06 11:08:52 +02:00
157 changed files with 6386 additions and 3296 deletions
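
The new service's source is not reproduced here (its file diff is suppressed for size further down), so the following is only a hypothetical sketch of the record shapes the bullets describe — every member is an assumption; only the type names come from the commit message:

```csharp
// Hypothetical sketch — the real definitions live in the suppressed diff.
// Only the type names are from the commit message; all members are assumed.
public sealed record PackRunApprovalDecisionRequest(
    string RunId,        // assumed: identifies the pack run awaiting approval
    string ApprovalId,   // assumed: which pending approval is being decided
    bool Approved,       // assumed: the reviewer's decision
    string? Note);       // assumed: optional reviewer note

public sealed record PackRunApprovalDecisionResult(
    bool Applied,            // assumed: decision was persisted to the approval store
    bool ResumeScheduled);   // assumed: a run resume was scheduled once all approvals landed
```

Per the bullets above, the service presumably consumes such a request, updates the approval store, and schedules a run resume once the last outstanding approval is granted.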


@@ -730,10 +730,11 @@ internal static class CommandFactory
};
activate.Add(activatePolicyIdArgument);
var activateVersionOption = new Option<int>("--version")
{
Description = "Revision version to activate.",
IsRequired = true
};
var activationNoteOption = new Option<string?>("--note")
{


@@ -49,6 +49,7 @@
| CLI-POLICY-23-004 | TODO | DevEx/CLI Guild | WEB-POLICY-23-001 | Add `stella policy lint` command validating SPL files with compiler diagnostics; support JSON output. | Command returns lint diagnostics; exit codes documented; tests cover error scenarios. |
| CLI-POLICY-23-005 | DOING (2025-10-28) | DevEx/CLI Guild | POLICY-GATEWAY-18-002..003, WEB-POLICY-23-002 | Implement `stella policy activate` with scheduling window, approval enforcement, and summary output. | Activation command integrates with API, handles 2-person rule failures; tests cover success/error. |
> 2025-10-28: CLI command implemented with gateway integration (`policy activate`), interactive summary output, retry-aware metrics, and exit codes (0 success, 75 pending second approval). Tests cover success/pending/error paths.
> 2025-11-06: Tightened required `--version` parsing, added scheduled activation handling coverage, and expanded tests to validate timestamp normalization.
| CLI-POLICY-23-006 | TODO | DevEx/CLI Guild | WEB-POLICY-23-004 | Provide `stella policy history` and `stella policy explain` commands to pull run history and explanation trees. | Commands output JSON/table; integration tests with fixtures; docs updated. |
## Graph & Vuln Explorer v1


@@ -1811,11 +1811,11 @@ spec:
}
[Fact]
public async Task HandlePolicyActivateAsync_PendingSecondApprovalSetsExitCode()
{
var originalExit = Environment.ExitCode;
var backend = new StubBackendClient(new JobTriggerResult(true, "ok", null, null));
backend.ActivationResult = new PolicyActivationResult(
"pending_second_approval",
new PolicyActivationRevision(
@@ -1852,15 +1852,65 @@ spec:
finally
{
Environment.ExitCode = originalExit;
}
}
[Fact]
public async Task HandlePolicyActivateAsync_ParsesScheduledTimestamp()
{
var originalExit = Environment.ExitCode;
var backend = new StubBackendClient(new JobTriggerResult(true, "ok", null, null));
backend.ActivationResult = new PolicyActivationResult(
"scheduled",
new PolicyActivationRevision(
"P-8",
5,
"approved",
false,
DateTimeOffset.Parse("2025-12-01T00:30:00Z", CultureInfo.InvariantCulture),
null,
new ReadOnlyCollection<PolicyActivationApproval>(Array.Empty<PolicyActivationApproval>())));
var provider = BuildServiceProvider(backend);
try
{
const string scheduledValue = "2025-12-01T03:00:00+02:00";
await CommandHandlers.HandlePolicyActivateAsync(
provider,
policyId: "P-8",
version: 5,
note: null,
runNow: false,
scheduledAt: scheduledValue,
priority: null,
rollback: false,
incidentId: null,
verbose: false,
cancellationToken: CancellationToken.None);
Assert.Equal(0, Environment.ExitCode);
Assert.NotNull(backend.LastPolicyActivation);
var activation = backend.LastPolicyActivation!.Value;
Assert.False(activation.Request.RunNow);
var expected = DateTimeOffset.Parse(
scheduledValue,
CultureInfo.InvariantCulture,
DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal);
Assert.Equal(expected, activation.Request.ScheduledAt);
}
finally
{
Environment.ExitCode = originalExit;
}
}
[Fact]
public async Task HandlePolicyActivateAsync_MapsErrorCodes()
{
var originalExit = Environment.ExitCode;
var backend = new StubBackendClient(new JobTriggerResult(true, "ok", null, null))
{
ActivationException = new PolicyApiException("Revision not approved", HttpStatusCode.BadRequest, "ERR_POL_002")
};
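
The `ParsesScheduledTimestamp` test above hinges on `DateTimeOffset.Parse` with `DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal`: an offset present in the input is honored and the result is normalized to UTC, while offset-less inputs are treated as already UTC. A standalone illustration (not part of the diff):

```csharp
using System;
using System.Globalization;

class ScheduledAtDemo
{
    static void Main()
    {
        var styles = DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal;

        // Input carries +02:00, so AdjustToUniversal shifts it to UTC.
        var withOffset = DateTimeOffset.Parse(
            "2025-12-01T03:00:00+02:00", CultureInfo.InvariantCulture, styles);
        Console.WriteLine(withOffset.ToString("o")); // 2025-12-01T01:00:00.0000000+00:00

        // No offset in the input: AssumeUniversal treats it as UTC, not local time.
        var withoutOffset = DateTimeOffset.Parse(
            "2025-12-01T00:30:00", CultureInfo.InvariantCulture, styles);
        Console.WriteLine(withoutOffset.ToString("o")); // 2025-12-01T00:30:00.0000000+00:00
    }
}
```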


@@ -52,9 +52,11 @@ internal static class JobRegistrationExtensions
new("source:vndr-oracle:parse", "StellaOps.Concelier.Connector.Vndr.Oracle.OracleParseJob", "StellaOps.Concelier.Connector.Vndr.Oracle", TimeSpan.FromMinutes(15), TimeSpan.FromMinutes(5)),
new("source:vndr-oracle:map", "StellaOps.Concelier.Connector.Vndr.Oracle.OracleMapJob", "StellaOps.Concelier.Connector.Vndr.Oracle", TimeSpan.FromMinutes(15), TimeSpan.FromMinutes(5)),
new("export:json", "StellaOps.Concelier.Exporter.Json.JsonExportJob", "StellaOps.Concelier.Exporter.Json", TimeSpan.FromMinutes(10), TimeSpan.FromMinutes(5)),
new("export:trivy-db", "StellaOps.Concelier.Exporter.TrivyDb.TrivyDbExportJob", "StellaOps.Concelier.Exporter.TrivyDb", TimeSpan.FromMinutes(20), TimeSpan.FromMinutes(10)),
#pragma warning disable CS0618, CONCELIER0001 // Legacy merge job remains available until MERGE-LNM-21-002 completes.
new("merge:reconcile", "StellaOps.Concelier.Merge.Jobs.MergeReconcileJob", "StellaOps.Concelier.Merge", TimeSpan.FromMinutes(15), TimeSpan.FromMinutes(5))
#pragma warning restore CS0618, CONCELIER0001
};
public static IServiceCollection AddBuiltInConcelierJobs(this IServiceCollection services)


@@ -15,6 +15,8 @@ public sealed class ConcelierOptions
public AuthorityOptions Authority { get; set; } = new();
public MirrorOptions Mirror { get; set; } = new();
public FeaturesOptions Features { get; set; } = new();
public sealed class StorageOptions
{
@@ -135,4 +137,13 @@ public sealed class ConcelierOptions
public int MaxDownloadRequestsPerHour { get; set; } = 1200;
}
public sealed class FeaturesOptions
{
public bool NoMergeEnabled { get; set; }
public bool LnmShadowWrites { get; set; } = true;
public IList<string> MergeJobAllowlist { get; } = new List<string>();
}
}
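
Assuming standard ASP.NET Core configuration binding for `ConcelierOptions`, the new `FeaturesOptions` block would be driven by configuration along these lines (the YAML shape is illustrative; the key path matches the `concelier:features:noMergeEnabled` toggle referenced elsewhere in this commit):

```yaml
concelier:
  features:
    noMergeEnabled: true      # disable legacy merge registration (Link-Not-Merge mode)
    lnmShadowWrites: true     # matches the FeaturesOptions default above
    mergeJobAllowlist: []     # e.g. ["merge:reconcile"] to keep the job during migration
```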


@@ -17,7 +17,8 @@ public static class ConcelierOptionsPostConfigure
{
ArgumentNullException.ThrowIfNull(options);
options.Authority ??= new ConcelierOptions.AuthorityOptions();
options.Features ??= new ConcelierOptions.FeaturesOptions();
var authority = options.Authority;
if (string.IsNullOrWhiteSpace(authority.ClientSecret)


@@ -1,3 +1,4 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
@@ -98,9 +99,36 @@ builder.Services.AddConcelierLinksetMappers();
builder.Services.AddAdvisoryRawServices();
builder.Services.AddSingleton<IAdvisoryObservationQueryService, AdvisoryObservationQueryService>();
var features = concelierOptions.Features ?? new ConcelierOptions.FeaturesOptions();
if (!features.NoMergeEnabled)
{
#pragma warning disable CS0618, CONCELIER0001, CONCELIER0002 // Legacy merge service is intentionally supported behind a feature toggle.
builder.Services.AddMergeModule(builder.Configuration);
#pragma warning restore CS0618, CONCELIER0001, CONCELIER0002
}
builder.Services.AddJobScheduler();
builder.Services.AddBuiltInConcelierJobs();
builder.Services.PostConfigure<JobSchedulerOptions>(options =>
{
if (features.NoMergeEnabled)
{
options.Definitions.Remove("merge:reconcile");
return;
}
if (features.MergeJobAllowlist is { Count: > 0 })
{
var allowMergeJob = features.MergeJobAllowlist.Any(value =>
string.Equals(value, "merge:reconcile", StringComparison.OrdinalIgnoreCase));
if (!allowMergeJob)
{
options.Definitions.Remove("merge:reconcile");
}
}
});
builder.Services.AddSingleton<OpenApiDiscoveryDocumentProvider>();
builder.Services.AddSingleton<ServiceStatus>(sp => new ServiceStatus(sp.GetRequiredService<TimeProvider>()));
@@ -183,7 +211,7 @@ if (authorityConfigured)
builder.Services.AddAuthorization(options =>
{
options.AddStellaOpsScopePolicy(JobsPolicyName, concelierOptions.Authority.RequiredScopes.ToArray());
options.AddStellaOpsScopePolicy(ObservationsPolicyName, StellaOpsScopes.VulnView);
options.AddStellaOpsScopePolicy(AdvisoryIngestPolicyName, StellaOpsScopes.AdvisoryIngest);
options.AddStellaOpsScopePolicy(AdvisoryReadPolicyName, StellaOpsScopes.AdvisoryRead);
options.AddStellaOpsScopePolicy(AocVerifyPolicyName, StellaOpsScopes.AdvisoryRead, StellaOpsScopes.AocVerify);
@@ -197,6 +225,11 @@ builder.Services.AddEndpointsApiExplorer();
var app = builder.Build();
if (features.NoMergeEnabled)
{
app.Logger.LogWarning("Legacy merge module disabled via concelier:features:noMergeEnabled; Link-Not-Merge mode active.");
}
var resolvedConcelierOptions = app.Services.GetRequiredService<IOptions<ConcelierOptions>>().Value;
var resolvedAuthority = resolvedConcelierOptions.Authority ?? new ConcelierOptions.AuthorityOptions();
authorityConfigured = resolvedAuthority.Enabled;


@@ -35,5 +35,8 @@
<ProjectReference Include="../../Authority/StellaOps.Authority/StellaOps.Auth.ServerIntegration/StellaOps.Auth.ServerIntegration.csproj" />
<ProjectReference Include="../../Aoc/__Libraries/StellaOps.Aoc/StellaOps.Aoc.csproj" />
<ProjectReference Include="../../Aoc/__Libraries/StellaOps.Aoc.AspNetCore/StellaOps.Aoc.AspNetCore.csproj" />
<ProjectReference Include="../__Analyzers/StellaOps.Concelier.Analyzers/StellaOps.Concelier.Analyzers.csproj"
OutputItemType="Analyzer"
ReferenceOutputAssembly="false" />
</ItemGroup>
</Project>

File diff suppressed because it is too large.


@@ -0,0 +1,2 @@
; Shipped analyzer releases


@@ -0,0 +1,9 @@
## Release History
### Unreleased
#### New Rules
Rule ID | Title | Notes
--------|-------|------
CONCELIER0002 | Legacy merge pipeline is disabled | Flags usage of `AddMergeModule` and `AdvisoryMergeService`.
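
Since `CONCELIER0002` is registered at error severity by default, projects that must keep calling the legacy merge APIs during migration can downgrade the diagnostic through the standard `.editorconfig` analyzer-configuration mechanism (illustrative):

```ini
# .editorconfig — per-project analyzer severity override
[*.cs]
# Downgrade to a suggestion while migrating; use 'none' to suppress entirely.
dotnet_diagnostic.CONCELIER0002.severity = suggestion
```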


@@ -0,0 +1,152 @@
using System;
using System.Collections.Immutable;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.Diagnostics;
using Microsoft.CodeAnalysis.Operations;
namespace StellaOps.Concelier.Analyzers;
/// <summary>
/// Analyzer that flags usages of the legacy merge service APIs.
/// </summary>
[DiagnosticAnalyzer(LanguageNames.CSharp)]
public sealed class NoMergeUsageAnalyzer : DiagnosticAnalyzer
{
/// <summary>
/// Diagnostic identifier for legacy merge usage violations.
/// </summary>
public const string DiagnosticId = "CONCELIER0002";
private const string Category = "Usage";
private const string MergeExtensionType = "StellaOps.Concelier.Merge.MergeServiceCollectionExtensions";
private const string MergeServiceType = "StellaOps.Concelier.Merge.Services.AdvisoryMergeService";
private static readonly LocalizableString Title = "Legacy merge pipeline is disabled";
private static readonly LocalizableString MessageFormat = "Do not reference the legacy Concelier merge pipeline (type '{0}')";
private static readonly LocalizableString Description =
"The legacy Concelier merge service is deprecated under MERGE-LNM-21-002. "
+ "Switch to observation/linkset APIs or guard calls behind the concelier:features:noMergeEnabled toggle.";
private static readonly DiagnosticDescriptor Rule = new(
DiagnosticId,
Title,
MessageFormat,
Category,
DiagnosticSeverity.Error,
isEnabledByDefault: true,
description: Description,
helpLinkUri: "https://stella-ops.org/docs/migration/no-merge");
/// <inheritdoc />
public override ImmutableArray<DiagnosticDescriptor> SupportedDiagnostics => ImmutableArray.Create(Rule);
/// <inheritdoc />
public override void Initialize(AnalysisContext context)
{
context.ConfigureGeneratedCodeAnalysis(GeneratedCodeAnalysisFlags.None);
context.EnableConcurrentExecution();
context.RegisterOperationAction(AnalyzeInvocation, OperationKind.Invocation);
context.RegisterOperationAction(AnalyzeObjectCreation, OperationKind.ObjectCreation);
}
private static void AnalyzeInvocation(OperationAnalysisContext context)
{
if (context.Operation is not IInvocationOperation invocation)
{
return;
}
var targetMethod = invocation.TargetMethod;
if (targetMethod is null)
{
return;
}
if (!SymbolEquals(targetMethod.ContainingType, MergeExtensionType))
{
return;
}
if (!string.Equals(targetMethod.Name, "AddMergeModule", StringComparison.Ordinal))
{
return;
}
if (IsAllowedAssembly(context.ContainingSymbol.ContainingAssembly))
{
return;
}
ReportDiagnostic(context, invocation.Syntax, $"{MergeExtensionType}.{targetMethod.Name}");
}
private static void AnalyzeObjectCreation(OperationAnalysisContext context)
{
if (context.Operation is not IObjectCreationOperation creation)
{
return;
}
var createdType = creation.Type;
if (createdType is null || !SymbolEquals(createdType, MergeServiceType))
{
return;
}
if (IsAllowedAssembly(context.ContainingSymbol.ContainingAssembly))
{
return;
}
ReportDiagnostic(context, creation.Syntax, MergeServiceType);
}
private static bool SymbolEquals(ITypeSymbol? symbol, string fullName)
{
if (symbol is null)
{
return false;
}
var display = symbol.ToDisplayString(SymbolDisplayFormat.FullyQualifiedFormat);
if (display.StartsWith("global::", StringComparison.Ordinal))
{
display = display.Substring("global::".Length);
}
return string.Equals(display, fullName, StringComparison.Ordinal);
}
private static bool IsAllowedAssembly(IAssemblySymbol? assemblySymbol)
{
if (assemblySymbol is null)
{
return false;
}
var assemblyName = assemblySymbol.Name;
if (string.IsNullOrWhiteSpace(assemblyName))
{
return false;
}
if (assemblyName.StartsWith("StellaOps.Concelier.Merge", StringComparison.Ordinal))
{
return true;
}
if (assemblyName.EndsWith(".Analyzers", StringComparison.Ordinal))
{
return true;
}
return false;
}
private static void ReportDiagnostic(OperationAnalysisContext context, SyntaxNode syntax, string targetName)
{
var diagnostic = Diagnostic.Create(Rule, syntax.GetLocation(), targetName);
context.ReportDiagnostic(diagnostic);
}
}


@@ -0,0 +1,19 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netstandard2.0</TargetFramework>
<AssemblyName>StellaOps.Concelier.Analyzers</AssemblyName>
<RootNamespace>StellaOps.Concelier.Analyzers</RootNamespace>
<Nullable>enable</Nullable>
<LangVersion>latest</LangVersion>
<IncludeBuildOutput>false</IncludeBuildOutput>
<GenerateDocumentationFile>true</GenerateDocumentationFile>
<EnforceExtendedAnalyzerRules>true</EnforceExtendedAnalyzerRules>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.CodeAnalysis.CSharp" Version="4.9.2" PrivateAssets="all" />
<PackageReference Include="Microsoft.CodeAnalysis.Analyzers" Version="3.11.0" PrivateAssets="all" />
</ItemGroup>
</Project>


@@ -114,10 +114,10 @@ internal sealed class AdvisoryObservationFactory : IAdvisoryObservationFactory
private static AdvisoryObservationLinkset CreateLinkset(RawIdentifiers identifiers, RawLinkset linkset)
{
var aliases = CollectAliases(identifiers, linkset);
var purls = CollectValues(linkset.PackageUrls);
var cpes = CollectValues(linkset.Cpes);
var references = CollectReferences(linkset.References);
return new AdvisoryObservationLinkset(aliases, purls, cpes, references);
}
@@ -170,124 +170,91 @@ internal sealed class AdvisoryObservationFactory : IAdvisoryObservationFactory
};
}
private static IEnumerable<string> CollectAliases(RawIdentifiers identifiers, RawLinkset linkset)
{
var results = new List<string>();
AddAlias(results, identifiers.PrimaryId);
AddRange(results, identifiers.Aliases);
AddRange(results, linkset.Aliases);
foreach (var note in linkset.Notes)
{
if (string.IsNullOrWhiteSpace(note.Value))
{
continue;
}
results.Add(note.Value.Trim());
}
return results;
static void AddAlias(ICollection<string> target, string? value)
{
if (string.IsNullOrWhiteSpace(value))
{
return;
}
target.Add(value.Trim());
}
static void AddRange(ICollection<string> target, ImmutableArray<string> values)
{
if (values.IsDefaultOrEmpty)
{
return;
}
foreach (var value in values)
{
AddAlias(target, value);
}
}
}
private static IEnumerable<string> CollectValues(ImmutableArray<string> values)
{
if (values.IsDefaultOrEmpty)
{
return ImmutableArray<string>.Empty;
}
var list = new List<string>(values.Length);
foreach (var value in values)
{
if (string.IsNullOrWhiteSpace(value))
{
continue;
}
list.Add(value.Trim());
}
return list;
}
private static IEnumerable<AdvisoryObservationReference> CollectReferences(ImmutableArray<RawReference> references)
{
if (references.IsDefaultOrEmpty)
{
return ImmutableArray<AdvisoryObservationReference>.Empty;
}
var list = new List<AdvisoryObservationReference>(references.Length);
foreach (var reference in references)
{
if (string.IsNullOrWhiteSpace(reference.Type) || string.IsNullOrWhiteSpace(reference.Url))
{
continue;
}
list.Add(new AdvisoryObservationReference(reference.Type.Trim(), reference.Url.Trim()));
}
return list;
}
private static ImmutableDictionary<string, string> CreateAttributes(AdvisoryRawDocument rawDocument)
{


@@ -9,7 +9,7 @@
> Docs alignment (2025-10-26): Linkset expectations detailed in AOC reference §4 and policy-engine architecture §2.1.
> 2025-10-28: Advisory raw ingestion now strips client-supplied supersedes hints, logs ignored pointers, and surfaces repository-supplied supersedes identifiers; service tests cover duplicate handling and append-only semantics.
> Docs alignment (2025-10-26): Deployment guide + observability guide describe supersedes metrics; ensure implementation emits `aoc_violation_total` on failure.
| CONCELIER-CORE-AOC-19-004 `Remove ingestion normalization` | DOING (2025-10-28) | Concelier Core Guild | CONCELIER-CORE-AOC-19-002, POLICY-AOC-19-003 | Strip normalization/dedup/severity logic from ingestion pipelines, delegate derived computations to Policy Engine, and update exporters/tests to consume raw documents only.<br>2025-10-29 19:05Z: Audit completed for `AdvisoryRawService`/Mongo repo to confirm alias order/dedup removal persists; identified remaining normalization in observation/linkset factory that will be revised to surface raw duplicates for Policy ingestion. Change sketch + regression matrix drafted under `docs/dev/aoc-normalization-removal-notes.md` (pending commit).<br>2025-10-31 20:45Z: Added raw linkset projection to observations/storage, exposing canonical+raw views, refreshed fixtures/tests, and documented behaviour in models/doc factory.<br>2025-10-31 21:10Z: Coordinated with Policy Engine (POLICY-ENGINE-20-003) on adoption timeline; backfill + consumer readiness tracked in `docs/dev/raw-linkset-backfill-plan.md`.<br>2025-11-05 14:25Z: Resuming to document merge-dependent normalization paths and prepare implementation notes for `noMergeEnabled` gating before code changes land.<br>2025-11-05 19:20Z: Observation factory/linkset now preserve upstream ordering + duplicates; canonicalisation responsibility shifts to downstream consumers with refreshed unit coverage.<br>2025-11-06 16:10Z: Updated AOC reference/backfill docs with raw vs canonical guidance and cross-linked analyzer guardrails. |
> Docs alignment (2025-10-26): Architecture overview emphasises policy-only derivation; coordinate with Policy Engine guild for rollout.
> 2025-10-29: `AdvisoryRawService` now preserves upstream alias/linkset ordering (trim-only) and updated AOC documentation reflects the behaviour; follow-up to ensure policy consumers handle duplicates remains open.
| CONCELIER-CORE-AOC-19-013 `Authority tenant scope smoke coverage` | TODO | Concelier Core Guild | AUTH-AOC-19-002 | Extend Concelier smoke/e2e fixtures to configure `requiredTenants` and assert cross-tenant rejection with updated Authority tokens. | Coordinate deliverable so Authority docs (`AUTH-AOC-19-003`) can close once tests are in place. |


@@ -5,9 +5,10 @@ using Microsoft.Extensions.Logging;
using StellaOps.Concelier.Core.Jobs;
using StellaOps.Concelier.Merge.Services;
namespace StellaOps.Concelier.Merge.Jobs;
[Obsolete("MergeReconcileJob is deprecated; Link-Not-Merge supersedes merge scheduling. Disable via concelier:features:noMergeEnabled. Tracking MERGE-LNM-21-002.", DiagnosticId = "CONCELIER0001", UrlFormat = "https://stella-ops.org/docs/migration/no-merge")]
public sealed class MergeReconcileJob : IJob
{
private readonly AdvisoryMergeService _mergeService;
private readonly ILogger<MergeReconcileJob> _logger;


@@ -1,15 +1,17 @@
using System;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Logging;
using StellaOps.Concelier.Core;
using StellaOps.Concelier.Merge.Jobs;
using StellaOps.Concelier.Merge.Options;
using StellaOps.Concelier.Merge.Services;
namespace StellaOps.Concelier.Merge;
[Obsolete("Legacy merge module is deprecated; prefer Link-Not-Merge linkset pipelines. Track MERGE-LNM-21-002 and set concelier:features:noMergeEnabled=true to disable registration.", DiagnosticId = "CONCELIER0001", UrlFormat = "https://stella-ops.org/docs/migration/no-merge")]
public static class MergeServiceCollectionExtensions
{
public static IServiceCollection AddMergeModule(this IServiceCollection services, IConfiguration configuration)
{
@@ -34,10 +36,12 @@ public static class MergeServiceCollectionExtensions
return new AdvisoryPrecedenceMerger(resolver, options, timeProvider, logger);
});
#pragma warning disable CS0618 // Legacy merge services are marked obsolete.
services.TryAddSingleton<MergeEventWriter>();
services.TryAddSingleton<AdvisoryMergeService>();
services.AddTransient<MergeReconcileJob>();
#pragma warning restore CS0618
return services;
}
}


@@ -14,9 +14,10 @@ using StellaOps.Concelier.Storage.Mongo.Aliases;
using StellaOps.Concelier.Storage.Mongo.MergeEvents;
using System.Text.Json;
namespace StellaOps.Concelier.Merge.Services;
[Obsolete("AdvisoryMergeService is deprecated. Transition callers to Link-Not-Merge observation/linkset APIs (MERGE-LNM-21-002) and enable concelier:features:noMergeEnabled when ready.", DiagnosticId = "CONCELIER0001", UrlFormat = "https://stella-ops.org/docs/migration/no-merge")]
public sealed class AdvisoryMergeService
{
private static readonly Meter MergeMeter = new("StellaOps.Concelier.Merge");
private static readonly Counter<long> AliasCollisionCounter = MergeMeter.CreateCounter<long>(


@@ -10,6 +10,6 @@
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|MERGE-LNM-21-001 Migration plan authoring|BE-Merge, Architecture Guild|CONCELIER-LNM-21-101|**DONE (2025-11-03)** Authored `docs/migration/no-merge.md` with rollout phases, backfill/validation checklists, rollback guidance, and ownership matrix for the Link-Not-Merge cutover.|
|MERGE-LNM-21-002 Merge service deprecation|BE-Merge|MERGE-LNM-21-001|**DOING (2025-11-03)** Auditing service registrations, DI bindings, and tests consuming `AdvisoryMergeService`; drafting deprecation plan and analyzer scope prior to code removal.<br>2025-11-05 14:42Z: Implementing `concelier:features:noMergeEnabled` gate, merge job allowlist checks, `[Obsolete]` markings, and analyzer scaffolding to steer consumers toward linkset APIs.<br>2025-11-06 16:10Z: Introduced Roslyn analyzer (`CONCELIER0002`) referenced by Concelier WebService + tests, documented suppression guidance, and updated migration playbook.|
> 2025-11-03: Catalogued call sites (WebService Program `AddMergeModule`, built-in job registration `merge:reconcile`, `MergeReconcileJob`) and confirmed unit tests are the only direct `MergeAsync` callers; next step is to define analyzer + replacement observability coverage.
|MERGE-LNM-21-003 Determinism/test updates|QA Guild, BE-Merge|MERGE-LNM-21-002|Replace merge determinism suites with observation/linkset regression tests verifying no data mutation and conflicts remain visible.|


@@ -280,57 +280,60 @@ public sealed record AdvisoryObservationLinkset
IEnumerable<string>? cpes,
IEnumerable<AdvisoryObservationReference>? references)
{
Aliases = ToImmutableArray(aliases);
Purls = ToImmutableArray(purls);
Cpes = ToImmutableArray(cpes);
References = ToImmutableReferences(references);
}
public ImmutableArray<string> Aliases { get; }
public ImmutableArray<string> Purls { get; }
public ImmutableArray<string> Cpes { get; }
public ImmutableArray<AdvisoryObservationReference> References { get; }
private static ImmutableArray<string> ToImmutableArray(IEnumerable<string>? values)
{
if (values is null)
{
return ImmutableArray<string>.Empty;
}
var builder = ImmutableArray.CreateBuilder<string>();
foreach (var value in values)
{
var trimmed = Validation.TrimToNull(value);
if (trimmed is null)
{
continue;
}
builder.Add(trimmed);
}
return builder.Count == 0 ? ImmutableArray<string>.Empty : builder.ToImmutable();
}
private static ImmutableArray<AdvisoryObservationReference> ToImmutableReferences(IEnumerable<AdvisoryObservationReference>? references)
{
if (references is null)
{
return ImmutableArray<AdvisoryObservationReference>.Empty;
}
var builder = ImmutableArray.CreateBuilder<AdvisoryObservationReference>();
foreach (var reference in references)
{
if (reference is null)
{
continue;
}
builder.Add(reference);
}
return builder.Count == 0 ? ImmutableArray<AdvisoryObservationReference>.Empty : builder.ToImmutable();
}
}
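The behavioural change in this file — from normalize/lowercase/sort/de-dupe to order-and-duplicate preservation — can be illustrated with a BCL-only sketch; the helper names below are illustrative stand-ins, not the actual Concelier APIs:

```csharp
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;

// Old behaviour: trim, lowercase, de-duplicate, sort ordinally.
static ImmutableArray<string> Normalize(IEnumerable<string> values) =>
    values.Select(v => v.Trim().ToLowerInvariant())
          .Where(v => v.Length > 0)
          .Distinct(StringComparer.Ordinal)
          .OrderBy(v => v, StringComparer.Ordinal)
          .ToImmutableArray();

// New behaviour: trim only; source order and duplicates survive.
static ImmutableArray<string> Preserve(IEnumerable<string> values) =>
    values.Select(v => v.Trim())
          .Where(v => v.Length > 0)
          .ToImmutableArray();

var input = new[] { " CVE-2025-0001 ", "ghsa-XXXX-YYYY", "CVE-2025-0001" };
Console.WriteLine(string.Join(", ", Normalize(input))); // cve-2025-0001, ghsa-xxxx-yyyy
Console.WriteLine(string.Join(", ", Preserve(input)));  // CVE-2025-0001, ghsa-XXXX-YYYY, CVE-2025-0001
```

This matches the updated tests below, which now assert that casing, input order, and duplicate aliases pass through untouched.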

View File

@@ -13,11 +13,11 @@ public sealed class AdvisoryObservationFactoryTests
private static readonly DateTimeOffset SampleTimestamp = DateTimeOffset.Parse("2025-10-26T12:34:56Z");
[Fact]
public void Create_PreservesLinksetOrderAndDuplicates()
{
var factory = new AdvisoryObservationFactory();
var rawDocument = BuildRawDocument(
identifiers: new RawIdentifiers(
Aliases: ImmutableArray.Create(" CVE-2025-0001 ", "ghsa-XXXX-YYYY"),
PrimaryId: "GHSA-XXXX-YYYY"),
linkset: new RawLinkset
@@ -29,16 +29,27 @@ public sealed class AdvisoryObservationFactoryTests
new RawReference("Advisory", " https://example.test/advisory "),
new RawReference("ADVISORY", "https://example.test/advisory"))
});
var observation = factory.Create(rawDocument, SampleTimestamp);
Assert.Equal(SampleTimestamp, observation.CreatedAt);
Assert.Equal(
new[] { "GHSA-XXXX-YYYY", "CVE-2025-0001", "ghsa-XXXX-YYYY", "CVE-2025-0001" },
observation.Linkset.Aliases);
Assert.Equal(
new[] { "pkg:NPM/left-pad@1.0.0", "pkg:npm/left-pad@1.0.0?foo=bar" },
observation.Linkset.Purls);
Assert.Equal(
new[] { "cpe:/a:Example:Product:1.0", "cpe:/a:example:product:1.0" },
observation.Linkset.Cpes);
Assert.Equal(2, observation.Linkset.References.Length);
Assert.All(
observation.Linkset.References,
reference =>
{
Assert.Equal("advisory", reference.Type);
Assert.Equal("https://example.test/advisory", reference.Url);
});
Assert.Equal(
new[] { "GHSA-XXXX-YYYY", " CVE-2025-0001 ", "ghsa-XXXX-YYYY", " CVE-2025-0001 " },

View File

@@ -1,3 +1,4 @@
using System;
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json.Nodes;
@@ -52,9 +53,9 @@ public sealed class AdvisoryObservationQueryServiceTests
Assert.Equal("tenant-a:osv:beta:1", result.Observations[0].ObservationId);
Assert.Equal("tenant-a:ghsa:alpha:1", result.Observations[1].ObservationId);
Assert.Equal(
new[] { "CVE-2025-0001", "CVE-2025-0002", "GHSA-xyzz" },
result.Linkset.Aliases);
Assert.Equal(
new[] { "pkg:npm/package-a@1.0.0", "pkg:pypi/package-b@2.0.0" },
@@ -103,8 +104,11 @@ public sealed class AdvisoryObservationQueryServiceTests
CancellationToken.None);
Assert.Equal(2, result.Observations.Length);
Assert.All(result.Observations, observation =>
Assert.Contains(
observation.Linkset.Aliases,
alias => alias.Equals("CVE-2025-0001", StringComparison.OrdinalIgnoreCase)
|| alias.Equals("CVE-2025-9999", StringComparison.OrdinalIgnoreCase)));
Assert.False(result.HasMore);
Assert.Null(result.NextCursor);

View File

@@ -10,5 +10,8 @@
<ProjectReference Include="../../__Libraries/StellaOps.Concelier.Storage.Mongo/StellaOps.Concelier.Storage.Mongo.csproj" />
<ProjectReference Include="../../StellaOps.Concelier.WebService/StellaOps.Concelier.WebService.csproj" />
<ProjectReference Include="../../../__Libraries/StellaOps.Plugin/StellaOps.Plugin.csproj" />
<ProjectReference Include="../../__Analyzers/StellaOps.Concelier.Analyzers/StellaOps.Concelier.Analyzers.csproj"
OutputItemType="Analyzer"
ReferenceOutputAssembly="false" />
</ItemGroup>
</Project>

View File

@@ -221,7 +221,7 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
Assert.NotNull(ingestResponse.Headers.Location);
var locationValue = ingestResponse.Headers.Location!.ToString();
Assert.False(string.IsNullOrWhiteSpace(locationValue));
var lastSlashIndex = locationValue.LastIndexOf('/');
var idSegment = lastSlashIndex >= 0
? locationValue[(lastSlashIndex + 1)..]
: locationValue;
@@ -886,15 +886,61 @@ public sealed class WebServiceEndpointsTests : IAsyncLifetime
var limitedResponse = await client.GetAsync("/concelier/exports/index.json");
Assert.Equal((HttpStatusCode)429, limitedResponse.StatusCode);
Assert.NotNull(limitedResponse.Headers.RetryAfter);
Assert.True(limitedResponse.Headers.RetryAfter!.Delta.HasValue);
Assert.True(limitedResponse.Headers.RetryAfter!.Delta!.Value.TotalSeconds > 0);
}
[Fact]
public void MergeModuleDisabledWhenFeatureFlagEnabled()
{
var environment = new Dictionary<string, string?>
{
["CONCELIER_FEATURES__NOMERGEENABLED"] = "true"
};
using var factory = new ConcelierApplicationFactory(
_runner.ConnectionString,
authorityConfigure: null,
environmentOverrides: environment);
using var scope = factory.Services.CreateScope();
var provider = scope.ServiceProvider;
#pragma warning disable CS0618, CONCELIER0001, CONCELIER0002 // Checking deprecated service registration state.
Assert.Null(provider.GetService<AdvisoryMergeService>());
#pragma warning restore CS0618, CONCELIER0001, CONCELIER0002
var schedulerOptions = provider.GetRequiredService<IOptions<JobSchedulerOptions>>().Value;
Assert.DoesNotContain("merge:reconcile", schedulerOptions.Definitions.Keys);
}
[Fact]
public void MergeJobRemainsWhenAllowlisted()
{
var environment = new Dictionary<string, string?>
{
["CONCELIER_FEATURES__MERGEJOBALLOWLIST__0"] = "merge:reconcile"
};
using var factory = new ConcelierApplicationFactory(
_runner.ConnectionString,
authorityConfigure: null,
environmentOverrides: environment);
using var scope = factory.Services.CreateScope();
var provider = scope.ServiceProvider;
#pragma warning disable CS0618, CONCELIER0001, CONCELIER0002 // Checking deprecated service registration state.
Assert.NotNull(provider.GetService<AdvisoryMergeService>());
#pragma warning restore CS0618, CONCELIER0001, CONCELIER0002
var schedulerOptions = provider.GetRequiredService<IOptions<JobSchedulerOptions>>().Value;
Assert.Contains("merge:reconcile", schedulerOptions.Definitions.Keys);
}
[Fact]
public async Task JobsEndpointsAllowBypassWhenAuthorityEnabled()
{
var environment = new Dictionary<string, string?>
{
["CONCELIER_AUTHORITY__ENABLED"] = "true",
["CONCELIER_AUTHORITY__ALLOWANONYMOUSFALLBACK"] = "false",

View File

@@ -18,7 +18,6 @@ using StellaOps.Excititor.Policy;
using StellaOps.Excititor.Storage.Mongo;
using StellaOps.Excititor.WebService.Endpoints;
using StellaOps.Excititor.WebService.Services;
using StellaOps.Excititor.Core.Aoc;
var builder = WebApplication.CreateBuilder(args);

View File

@@ -84,10 +84,13 @@ internal sealed class WorkerSignatureVerifier : IVexSignatureVerifier
throw new ExcititorAocGuardException(AocGuardResult.FromViolations(new[] { violation }));
}
VexSignatureMetadata? signatureMetadata = null;
VexAttestationDiagnostics? attestationDiagnostics = null;
if (document.Format == VexDocumentFormat.OciAttestation && _attestationVerifier is not null)
{
var attestationResult = await VerifyAttestationAsync(document, metadata, cancellationToken).ConfigureAwait(false);
signatureMetadata = attestationResult.Metadata;
attestationDiagnostics = attestationResult.Diagnostics;
}
signatureMetadata ??= ExtractSignatureMetadata(metadata);
@@ -96,31 +99,40 @@ internal sealed class WorkerSignatureVerifier : IVexSignatureVerifier
signatureMetadata = await AttachIssuerTrustAsync(signatureMetadata, metadata, cancellationToken).ConfigureAwait(false);
}
var resultLabel = signatureMetadata is null ? "skipped" : "ok";
if (attestationDiagnostics is not null)
{
resultLabel = attestationDiagnostics.Result ?? resultLabel;
}
if (attestationDiagnostics is null)
{
RecordVerification(document.ProviderId, metadata, resultLabel);
}
if (resultLabel == "skipped")
{
_logger.LogDebug(
"Signature verification skipped for provider {ProviderId} (no signature metadata).",
document.ProviderId);
}
else
{
_logger.LogInformation(
"Signature metadata recorded for provider {ProviderId} (type={SignatureType}, subject={Subject}, issuer={Issuer}, result={Result}).",
document.ProviderId,
signatureMetadata!.Type,
signatureMetadata.Subject ?? "<unknown>",
signatureMetadata.Issuer ?? "<unknown>",
resultLabel);
}
return signatureMetadata;
}
private async ValueTask<(VexSignatureMetadata Metadata, VexAttestationDiagnostics Diagnostics)> VerifyAttestationAsync(
VexRawDocument document,
ImmutableDictionary<string, string> metadata,
CancellationToken cancellationToken)
{
try
{
@@ -146,37 +158,48 @@ internal sealed class WorkerSignatureVerifier : IVexSignatureVerifier
attestationMetadata,
envelopeJson);
var verification = await _attestationVerifier!
.VerifyAsync(verificationRequest, cancellationToken)
.ConfigureAwait(false);
var diagnosticsSnapshot = verification.Diagnostics;
if (!verification.IsValid)
{
var failureReason = diagnosticsSnapshot.FailureReason ?? "verification_failed";
var resultTag = diagnosticsSnapshot.Result ?? "invalid";
RecordVerification(document.ProviderId, metadata, resultTag);
_logger.LogError(
"Attestation verification failed for provider {ProviderId} (uri={SourceUri}) result={Result} failure={FailureReason} diagnostics={@Diagnostics}",
document.ProviderId,
document.SourceUri,
resultTag,
failureReason,
diagnosticsSnapshot);
var violation = AocViolation.Create(
AocViolationCode.SignatureInvalid,
"/upstream/signature",
"Attestation verification failed.");
throw new ExcititorAocGuardException(AocGuardResult.FromViolations(new[] { violation }));
}
var successResult = diagnosticsSnapshot.Result ?? "valid";
RecordVerification(document.ProviderId, metadata, successResult);
_logger.LogInformation(
"Attestation verification succeeded for provider {ProviderId} (predicate={PredicateType}, subject={Subject}, result={Result}).",
document.ProviderId,
attestationMetadata.PredicateType,
statement.Subject[0].Name ?? "<unknown>",
successResult);
var signatureMetadata = BuildSignatureMetadata(statement, metadata, attestationMetadata, diagnosticsSnapshot);
return (signatureMetadata, diagnosticsSnapshot);
}
catch (ExcititorAocGuardException)
{
throw;
}
catch (Exception ex)
@@ -192,10 +215,10 @@ internal sealed class WorkerSignatureVerifier : IVexSignatureVerifier
"/upstream/signature",
$"Attestation verification encountered an error: {ex.Message}");
RecordVerification(document.ProviderId, metadata, "error");
throw new ExcititorAocGuardException(AocGuardResult.FromViolations(new[] { violation }));
}
}
private VexAttestationRequest BuildAttestationRequest(VexInTotoStatement statement, VexAttestationPredicate predicate)
{
@@ -252,11 +275,11 @@ internal sealed class WorkerSignatureVerifier : IVexSignatureVerifier
signedAt);
}
private VexSignatureMetadata BuildSignatureMetadata(
VexInTotoStatement statement,
ImmutableDictionary<string, string> metadata,
VexAttestationMetadata attestationMetadata,
VexAttestationDiagnostics diagnostics)
{
metadata.TryGetValue("vex.signature.type", out var type);
metadata.TryGetValue("vex.provenance.cosign.subject", out var subject);

View File

@@ -13,16 +13,18 @@
3. Emit observability signals (logs, metrics, optional tracing) that can run offline and degrade gracefully when transparency services are unreachable.
4. Add regression tests (unit + integration) covering positive path, negative path, and offline fallback scenarios.
## 2. Deliverables
- `IVexAttestationVerifier` abstraction + `VexAttestationVerifier` implementation inside `StellaOps.Excititor.Attestation`, encapsulating DSSE validation, predicate checks, artifact digest confirmation, Rekor inclusion verification, and deterministic diagnostics.
- DI wiring (extension method) for registering verifier + instrumentation dependencies alongside the existing signer/rekor client.
- Shared `VexAttestationDiagnostics` record describing normalized diagnostic keys consumed by Worker/WebService logging.
- Metrics utility (`AttestationMetrics`) exposing counters/histograms via `System.Diagnostics.Metrics`, exported under `StellaOps.Excititor.Attestation` meter.
- Activity source (`AttestationActivitySource`) for optional tracing spans around sign/verify operations.
- 2025-11-05: Implemented `VexAttestationDiagnostics`, activity tagging via `VexAttestationActivitySource`, and updated verifier/tests to emit structured failure reasons.
- 2025-11-05 (pm): Worker attestation verifier now records structured diagnostics/metrics and logs result/failure reasons using `VexAttestationDiagnostics`; attestation success/failure labels propagate to verification counters.
- Documentation updates (`EXCITITOR-ATTEST-01-003-plan.md`, `TASKS.md` notes) describing instrumentation + test expectations.
- Test coverage in `StellaOps.Excititor.Attestation.Tests` (unit) and scaffolding notes for WebService/Worker integration tests.
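The metrics deliverable above can be sketched with `System.Diagnostics.Metrics` directly; the instrument names and member layout here are assumptions, not the shipped `AttestationMetrics` surface:

```csharp
using System.Diagnostics.Metrics;

// Hypothetical shape of the metrics utility; the real one lives in
// StellaOps.Excititor.Attestation and may use different instrument names.
sealed class AttestationMetricsSketch
{
    private static readonly Meter Meter = new("StellaOps.Excititor.Attestation");

    // Counter incremented once per verification attempt, tagged with result/component.
    public Counter<long> VerifyCount { get; } =
        Meter.CreateCounter<long>("excititor.attestation.verify.count");

    // Histogram recording verification latency in seconds.
    public Histogram<double> VerifyDuration { get; } =
        Meter.CreateHistogram<double>("excititor.attestation.verify.duration", unit: "s");
}
```

The verifier's `finally` block later in this commit records against `VerifyDuration` with `result` and `component` tags, which is the pattern this sketch mirrors.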
## 3. Verification Flow
### 3.1 Inputs

View File

@@ -1,6 +1,9 @@
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-ATTEST-01-003 Verification suite & observability|Team Excititor Attestation|EXCITITOR-ATTEST-01-002|TODO (2025-11-06) Continuing implementation: build `IVexAttestationVerifier`, wire metrics/logging, and add regression tests. Draft plan in `EXCITITOR-ATTEST-01-003-plan.md` (2025-10-19) guides scope; updating with worknotes as progress lands.<br>2025-10-31: Verifier now tolerates duplicate source providers from AOC raw projections, downgrades offline Rekor verification to a degraded result, and enforces trusted signer registry checks with detailed diagnostics/tests.<br>2025-11-05 14:35Z: Picking up diagnostics record/ActivitySource work and aligning metrics dimensions before wiring verifier into WebService/Worker paths.|
> 2025-11-05 19:10Z: Worker signature verifier now emits structured diagnostics/metrics via `VexAttestationDiagnostics`; attestation verification results flow into metric labels and logs.
> 2025-11-06 07:12Z: Export verifier builds unblocked; Excititor worker + web service test suites pass with diagnostics wiring (`dotnet test` invocations succeed with staged libssl1.1).
> 2025-11-06 07:55Z: Paused after documenting OpenSSL shim usage; follow-up automation tracked under `DEVOPS-OPENSSL-11-001/002`.
> Remark (2025-10-22): Added verifier implementation + metrics/tests; next steps include wiring into WebService/Worker flows and expanding negative-path coverage.

View File

@@ -0,0 +1,10 @@
using System.Diagnostics;
namespace StellaOps.Excititor.Attestation.Verification;
public static class VexAttestationActivitySource
{
public const string Name = "StellaOps.Excititor.Attestation";
public static readonly ActivitySource Value = new(Name);
}
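Consumers can open spans from this source as in the following standalone sketch; note that `StartActivity` returns `null` unless a listener samples the source, which is why the verifier guards every tag call with `activity?.`:

```csharp
using System.Diagnostics;

// Register a listener so StartActivity produces a non-null Activity.
using var listener = new ActivityListener
{
    ShouldListenTo = source => source.Name == "StellaOps.Excititor.Attestation",
    Sample = (ref ActivityCreationOptions<ActivityContext> options) =>
        ActivitySamplingResult.AllData,
};
ActivitySource.AddActivityListener(listener);

using var source = new ActivitySource("StellaOps.Excititor.Attestation");
using var activity = source.StartActivity("Verify", ActivityKind.Internal);

// Tag names mirror the ones the verifier sets; values here are illustrative.
activity?.SetTag("attestation.component", "worker");
activity?.SetTag("attestation.result", "valid");
```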

View File

@@ -1,8 +1,8 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
@@ -59,104 +59,120 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
public async ValueTask<VexAttestationVerification> VerifyAsync(
VexAttestationVerificationRequest request,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(request);
var stopwatch = Stopwatch.StartNew();
var diagnostics = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
var resultLabel = "valid";
var rekorState = "skipped";
var component = request.IsReverify ? "worker" : "webservice";
void SetFailure(string reason) => diagnostics["failure_reason"] = reason;
using var activity = VexAttestationActivitySource.Value.StartActivity("Verify", ActivityKind.Internal);
activity?.SetTag("attestation.component", component);
activity?.SetTag("attestation.export_id", request.Attestation.ExportId);
try
{
if (string.IsNullOrWhiteSpace(request.Envelope))
{
diagnostics["envelope.state"] = "missing";
SetFailure("missing_envelope");
_logger.LogWarning("Attestation envelope is missing for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!TryDeserializeEnvelope(request.Envelope, out var envelope, diagnostics))
{
SetFailure("invalid_envelope");
_logger.LogWarning("Failed to deserialize attestation envelope for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!string.Equals(envelope.PayloadType, VexDsseBuilder.PayloadType, StringComparison.OrdinalIgnoreCase))
{
diagnostics["payload.type"] = envelope.PayloadType ?? string.Empty;
SetFailure("unexpected_payload_type");
_logger.LogWarning(
"Unexpected DSSE payload type {PayloadType} for export {ExportId}",
envelope.PayloadType,
request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (envelope.Signatures is null || envelope.Signatures.Count == 0)
{
diagnostics["signature.state"] = "missing";
SetFailure("missing_signature");
_logger.LogWarning("Attestation envelope for export {ExportId} does not contain signatures.", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
var payloadBase64 = envelope.Payload ?? string.Empty;
if (!TryDecodePayload(payloadBase64, out var payloadBytes, diagnostics))
{
SetFailure("payload_decode_failed");
_logger.LogWarning("Failed to decode attestation payload for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!TryDeserializeStatement(payloadBytes, out var statement, diagnostics))
{
SetFailure("invalid_statement");
_logger.LogWarning("Failed to deserialize DSSE statement for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidatePredicateType(statement, request, diagnostics))
{
SetFailure("predicate_type_mismatch");
_logger.LogWarning("Predicate type mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidateSubject(statement, request, diagnostics))
{
SetFailure("subject_mismatch");
_logger.LogWarning("Subject mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidatePredicate(statement, request, diagnostics))
{
SetFailure("predicate_mismatch");
_logger.LogWarning("Predicate payload mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidateMetadataDigest(envelope, request.Metadata, diagnostics))
{
SetFailure("envelope_digest_mismatch");
_logger.LogWarning("Attestation digest mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidateSignedAt(request.Metadata, request.Attestation.CreatedAt, diagnostics))
{
SetFailure("signedat_out_of_range");
_logger.LogWarning("SignedAt validation failed for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
rekorState = await VerifyTransparencyAsync(request.Metadata, diagnostics, cancellationToken).ConfigureAwait(false);
if (rekorState is "missing" or "unverified" or "client_unavailable")
{
SetFailure(rekorState);
resultLabel = "invalid";
return BuildResult(false);
}
@@ -164,6 +180,9 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
var signaturesVerified = await VerifySignaturesAsync(payloadBytes, envelope.Signatures, diagnostics, cancellationToken).ConfigureAwait(false);
if (!signaturesVerified)
{
diagnostics["failure_reason"] = diagnostics.TryGetValue("signature.reason", out var reason)
? reason
: "signature_verification_failed";
if (_options.RequireSignatureVerification)
{
resultLabel = "invalid";
@@ -183,13 +202,16 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
catch (Exception ex)
{
diagnostics["error"] = ex.GetType().Name;
diagnostics["error.message"] = ex.Message;
resultLabel = "error";
_logger.LogError(ex, "Unexpected exception verifying attestation for export {ExportId}", request.Attestation.ExportId);
diagnostics["failure_reason"] = diagnostics.TryGetValue("error", out var errorCode)
? errorCode
: ex.GetType().Name;
return BuildResult(false);
}
finally
{
stopwatch.Stop();
var tags = new KeyValuePair<string, object?>[]
{
new("result", resultLabel),
@@ -200,12 +222,32 @@ internal sealed class VexAttestationVerifier : IVexAttestationVerifier
_metrics.VerifyDuration.Record(stopwatch.Elapsed.TotalSeconds, tags);
}
VexAttestationVerification BuildResult(bool isValid)
{
diagnostics["result"] = resultLabel;
diagnostics["component"] = component;
diagnostics["rekor.state"] = rekorState;
var snapshot = VexAttestationDiagnostics.FromBuilder(diagnostics);
if (activity is { } currentActivity)
{
currentActivity.SetTag("attestation.result", resultLabel);
currentActivity.SetTag("attestation.rekor", rekorState);
if (!isValid)
{
var failure = snapshot.FailureReason ?? "verification_failed";
currentActivity.SetStatus(ActivityStatusCode.Error, failure);
currentActivity.SetTag("attestation.failure_reason", failure);
}
else
{
currentActivity.SetStatus(ActivityStatusCode.Ok);
}
}
return new VexAttestationVerification(isValid, snapshot);
}
}

View File

@@ -14,4 +14,4 @@
<ProjectReference Include="../../../Aoc/__Libraries/StellaOps.Aoc/StellaOps.Aoc.csproj" />
<ProjectReference Include="../../../Concelier/__Libraries/StellaOps.Concelier.RawModels/StellaOps.Concelier.RawModels.csproj" />
</ItemGroup>
</Project>

View File

@@ -1,7 +1,8 @@
using System;
using System.Collections.Immutable;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Excititor.Attestation.Verification;
namespace StellaOps.Excititor.Core;
@@ -33,4 +34,4 @@ public sealed record VexAttestationVerificationRequest(
public sealed record VexAttestationVerification(
bool IsValid,
VexAttestationDiagnostics Diagnostics);

View File

@@ -0,0 +1,57 @@
using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.Immutable;
namespace StellaOps.Excititor.Attestation.Verification;
public sealed class VexAttestationDiagnostics : IReadOnlyDictionary<string, string>
{
private readonly ImmutableDictionary<string, string> _values;
private VexAttestationDiagnostics(ImmutableDictionary<string, string> values)
{
_values = values ?? ImmutableDictionary<string, string>.Empty;
}
public static VexAttestationDiagnostics FromBuilder(ImmutableDictionary<string, string>.Builder builder)
{
ArgumentNullException.ThrowIfNull(builder);
return new(builder.ToImmutable());
}
public static VexAttestationDiagnostics Empty { get; } = new(ImmutableDictionary<string, string>.Empty);
public string? Result => TryGetValue("result", out var value) ? value : null;
public string? Component => TryGetValue("component", out var value) ? value : null;
public string? RekorState => TryGetValue("rekor.state", out var value) ? value : null;
public string? FailureReason => TryGetValue("failure_reason", out var value) ? value : null;
public string this[string key] => _values[key];
public IEnumerable<string> Keys => _values.Keys;
public IEnumerable<string> Values => _values.Values;
public int Count => _values.Count;
public bool ContainsKey(string key) => _values.ContainsKey(key);
public bool TryGetValue(string key, out string value)
{
if (_values.TryGetValue(key, out var stored))
{
value = stored;
return true;
}
value = string.Empty;
return false;
}
public IEnumerator<KeyValuePair<string, string>> GetEnumerator() => _values.GetEnumerator();
IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}
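
A hedged usage sketch of the new diagnostics type (not part of the commit; the values below are illustrative): build the dictionary with an `ImmutableDictionary` builder, wrap it via `FromBuilder`, and read the typed accessors, which return `null` for absent keys.

```csharp
using System;
using System.Collections.Immutable;
using StellaOps.Excititor.Attestation.Verification;

var builder = ImmutableDictionary.CreateBuilder<string, string>();
builder["result"] = "degraded";
builder["rekor.state"] = "offline";

var diagnostics = VexAttestationDiagnostics.FromBuilder(builder);

// Typed accessors fall back to null when the key is missing.
Console.WriteLine(diagnostics.Result);                 // degraded
Console.WriteLine(diagnostics.RekorState);             // offline
Console.WriteLine(diagnostics.FailureReason is null);  // True
```

Because the type still implements `IReadOnlyDictionary<string, string>`, existing indexer-based assertions such as `diagnostics["result"]` continue to compile unchanged.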

View File

@@ -98,6 +98,7 @@ public sealed class VexExportEngine : IExportEngine
cached.PolicyDigest,
cached.ConsensusDigest,
cached.ScoreDigest,
cached.QuietProvenance,
cached.Attestation,
cached.SizeBytes);
}

View File

@@ -130,7 +130,7 @@ internal static class VexExportEnvelopeBuilder
}
}
internal sealed record VexExportEnvelopeContext(
public sealed record VexExportEnvelopeContext(
ImmutableArray<VexConsensus> Consensus,
string ConsensusCanonicalJson,
VexContentAddress ConsensusDigest,

View File

@@ -280,7 +280,7 @@ public sealed class VexMirrorBundlePublisher : IVexMirrorBundlePublisher
ToRelativePath(mirrorRoot, manifestPath),
manifestBytes.LongLength,
ComputeDigest(manifestBytes),
signature: null);
Signature: null);
var bundleDescriptor = manifestDocument.Bundle with
{
@@ -298,7 +298,7 @@ public sealed class VexMirrorBundlePublisher : IVexMirrorBundlePublisher
manifestDocument.DomainId,
manifestDocument.DisplayName,
manifestDocument.GeneratedAt,
manifestDocument.Exports.Length,
manifestDocument.Exports.Count,
manifestDescriptor,
bundleDescriptor,
exportKeys));
@@ -474,6 +474,11 @@ public sealed class VexMirrorBundlePublisher : IVexMirrorBundlePublisher
private JsonMirrorSigningContext PrepareSigningContext(MirrorSigningOptions signingOptions)
{
if (_cryptoRegistry is null)
{
throw new InvalidOperationException("Mirror signing requires a crypto provider registry to be configured.");
}
var algorithm = string.IsNullOrWhiteSpace(signingOptions.Algorithm)
? SignatureAlgorithms.Es256
: signingOptions.Algorithm.Trim();
@@ -496,7 +501,7 @@ public sealed class VexMirrorBundlePublisher : IVexMirrorBundlePublisher
var provider = ResolveProvider(algorithm, providerHint);
var signingKey = LoadSigningKey(signingOptions, provider, algorithm);
provider.UpsertSigningKey(signingKey);
resolved = _cryptoRegistry.ResolveSigner(CryptoCapability.Signing, algorithm, new CryptoKeyReference(keyId, provider.Name), provider.Name);
resolved = _cryptoRegistry!.ResolveSigner(CryptoCapability.Signing, algorithm, new CryptoKeyReference(keyId, provider.Name), provider.Name);
}
return new JsonMirrorSigningContext(resolved.Signer, algorithm, resolved.ProviderName, _timeProvider);

View File

@@ -85,6 +85,6 @@ public sealed class VexAttestationClientTests
private sealed class FakeVerifier : IVexAttestationVerifier
{
public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
=> ValueTask.FromResult(new VexAttestationVerification(true, ImmutableDictionary<string, string>.Empty));
=> ValueTask.FromResult(new VexAttestationVerification(true, VexAttestationDiagnostics.Empty));
}
}

View File

@@ -16,42 +16,44 @@ public sealed class VexAttestationVerifierTests : IDisposable
{
private readonly VexAttestationMetrics _metrics = new();
[Fact]
public async Task VerifyAsync_ReturnsValid_WhenEnvelopeMatches()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
[Fact]
public async Task VerifyAsync_ReturnsValid_WhenEnvelopeMatches()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("valid", verification.Diagnostics.Result);
Assert.Null(verification.Diagnostics.FailureReason);
}
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenDigestMismatch()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var tamperedMetadata = new VexAttestationMetadata(
metadata.PredicateType,
metadata.Rekor,
"sha256:deadbeef",
metadata.SignedAt);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, tamperedMetadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("invalid", verification.Diagnostics.Result);
Assert.Equal("sha256:deadbeef", verification.Diagnostics["metadata.envelopeDigest"]);
Assert.Equal("envelope_digest_mismatch", verification.Diagnostics.FailureReason);
}
Assert.True(verification.IsValid);
Assert.Equal("valid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenDigestMismatch()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var tamperedMetadata = new VexAttestationMetadata(
metadata.PredicateType,
metadata.Rekor,
"sha256:deadbeef",
metadata.SignedAt);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, tamperedMetadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("invalid", verification.Diagnostics["result"]);
Assert.Equal("sha256:deadbeef", verification.Diagnostics["metadata.envelopeDigest"]);
}
[Fact]
[Fact]
public async Task VerifyAsync_AllowsOfflineTransparency_WhenConfigured()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: true);
@@ -67,47 +69,50 @@ public sealed class VexAttestationVerifierTests : IDisposable
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("offline", verification.Diagnostics["rekor.state"]);
Assert.Equal("degraded", verification.Diagnostics["result"]);
Assert.Equal("offline", verification.Diagnostics.RekorState);
Assert.Equal("degraded", verification.Diagnostics.Result);
Assert.Null(verification.Diagnostics.FailureReason);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenTransparencyRequiredAndMissing()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: false);
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = true;
options.AllowOfflineTransparency = false;
});
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("missing", verification.Diagnostics["rekor.state"]);
Assert.Equal("invalid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenTransparencyUnavailableAndOfflineDisallowed()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: true);
var transparency = new ThrowingTransparencyLogClient();
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = true;
options.AllowOfflineTransparency = false;
}, transparency);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("unreachable", verification.Diagnostics["rekor.state"]);
Assert.Equal("invalid", verification.Diagnostics["result"]);
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: false);
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = true;
options.AllowOfflineTransparency = false;
});
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("missing", verification.Diagnostics.RekorState);
Assert.Equal("invalid", verification.Diagnostics.Result);
Assert.Equal("missing", verification.Diagnostics.FailureReason);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenTransparencyUnavailableAndOfflineDisallowed()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: true);
var transparency = new ThrowingTransparencyLogClient();
var verifier = CreateVerifier(options =>
{
options.RequireTransparencyLog = true;
options.AllowOfflineTransparency = false;
}, transparency);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("unreachable", verification.Diagnostics.RekorState);
Assert.Equal("invalid", verification.Diagnostics.Result);
Assert.Equal("unreachable", verification.Diagnostics.FailureReason);
}
[Fact]
@@ -125,7 +130,7 @@ public sealed class VexAttestationVerifierTests : IDisposable
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("valid", verification.Diagnostics["result"]);
Assert.Equal("valid", verification.Diagnostics.Result);
}
[Fact]
@@ -152,6 +157,8 @@ public sealed class VexAttestationVerifierTests : IDisposable
Assert.True(verification.IsValid);
Assert.Equal("verified", verification.Diagnostics["signature.state"]);
Assert.Equal("valid", verification.Diagnostics.Result);
Assert.Null(verification.Diagnostics.FailureReason);
}
[Fact]
@@ -179,6 +186,8 @@ public sealed class VexAttestationVerifierTests : IDisposable
Assert.False(verification.IsValid);
Assert.Equal("error", verification.Diagnostics["signature.state"]);
Assert.Equal("verification_failed", verification.Diagnostics["signature.reason"]);
Assert.Equal("verification_failed", verification.Diagnostics.FailureReason);
Assert.Equal("invalid", verification.Diagnostics.Result);
}
private async Task<(VexAttestationRequest Request, VexAttestationMetadata Metadata, string Envelope)> CreateSignedAttestationAsync(

View File

@@ -6,6 +6,7 @@ using System.Globalization;
using Microsoft.Extensions.Logging.Abstractions;
using MongoDB.Driver;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Export;
using StellaOps.Excititor.Policy;
using StellaOps.Excititor.Storage.Mongo;
@@ -291,7 +292,7 @@ public sealed class ExportEngineTests
}
public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
=> ValueTask.FromResult(new VexAttestationVerification(true, ImmutableDictionary<string, string>.Empty));
=> ValueTask.FromResult(new VexAttestationVerification(true, VexAttestationDiagnostics.Empty));
}
private sealed class RecordingCacheIndex : IVexCacheIndex

View File

@@ -4,13 +4,14 @@ using System.Collections.Immutable;
using System.Text;
using System.Text.Json;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Export;
using StellaOps.Excititor.Storage.Mongo;
using StellaOps.Excititor.WebService.Services;
using Microsoft.Extensions.DependencyInjection.Extensions;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Export;
using StellaOps.Excititor.Storage.Mongo;
using StellaOps.Excititor.WebService.Services;
using MongoDB.Driver;
using StellaOps.Excititor.Attestation.Dsse;
@@ -162,7 +163,7 @@ internal static class TestServiceOverrides
public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
{
var verification = new VexAttestationVerification(true, ImmutableDictionary<string, string>.Empty);
var verification = new VexAttestationVerification(true, VexAttestationDiagnostics.Empty);
return ValueTask.FromResult(verification);
}
}

View File

@@ -504,6 +504,21 @@ public sealed class DefaultVexProviderRunnerTests
bool includeGlobal,
CancellationToken cancellationToken)
=> ValueTask.FromResult(DefaultTrust);
public ValueTask<IssuerTrustResponseModel> SetIssuerTrustAsync(
string tenantId,
string issuerId,
decimal weight,
string? reason,
CancellationToken cancellationToken)
=> ValueTask.FromResult(DefaultTrust);
public ValueTask DeleteIssuerTrustAsync(
string tenantId,
string issuerId,
string? reason,
CancellationToken cancellationToken)
=> ValueTask.CompletedTask;
}
private sealed class NoopSignatureVerifier : IVexSignatureVerifier
@@ -655,25 +670,25 @@ public sealed class DefaultVexProviderRunnerTests
}
}
private sealed class StubAttestationVerifier : IVexAttestationVerifier
{
private readonly bool _isValid;
private readonly ImmutableDictionary<string, string> _diagnostics;
public StubAttestationVerifier(bool isValid, ImmutableDictionary<string, string> diagnostics)
{
_isValid = isValid;
_diagnostics = diagnostics;
}
public int Invocations { get; private set; }
public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
{
Invocations++;
return ValueTask.FromResult(new VexAttestationVerification(_isValid, _diagnostics));
}
}
private sealed class StubAttestationVerifier : IVexAttestationVerifier
{
private readonly bool _isValid;
private readonly VexAttestationDiagnostics _diagnostics;
public StubAttestationVerifier(bool isValid, ImmutableDictionary<string, string> diagnostics)
{
_isValid = isValid;
_diagnostics = VexAttestationDiagnostics.FromBuilder(diagnostics.ToBuilder());
}
public int Invocations { get; private set; }
public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
{
Invocations++;
return ValueTask.FromResult(new VexAttestationVerification(_isValid, _diagnostics));
}
}
private static VexRawDocument CreateAttestationRawDocument(DateTimeOffset observedAt)
{

View File

@@ -249,19 +249,21 @@ public sealed class WorkerSignatureVerifierTests
private sealed class StubAttestationVerifier : IVexAttestationVerifier
{
private readonly bool _isValid;
private readonly ImmutableDictionary<string, string> _diagnostics;
private readonly VexAttestationDiagnostics _diagnostics;
public StubAttestationVerifier(bool isValid, ImmutableDictionary<string, string>? diagnostics = null)
{
_isValid = isValid;
_diagnostics = diagnostics ?? ImmutableDictionary<string, string>.Empty;
}
public int Invocations { get; private set; }
public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
{
Invocations++;
{
_isValid = isValid;
_diagnostics = diagnostics is null
? VexAttestationDiagnostics.Empty
: VexAttestationDiagnostics.FromBuilder(diagnostics.ToBuilder());
}
public int Invocations { get; private set; }
public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
{
Invocations++;
return ValueTask.FromResult(new VexAttestationVerification(_isValid, _diagnostics));
}
}
@@ -269,7 +271,7 @@ public sealed class WorkerSignatureVerifierTests
private sealed class StubIssuerDirectoryClient : IIssuerDirectoryClient
{
private readonly IReadOnlyList<IssuerKeyModel> _keys;
private readonly IssuerTrustResponseModel _trust;
private IssuerTrustResponseModel _trust;
private StubIssuerDirectoryClient(
IReadOnlyList<IssuerKeyModel> keys,
@@ -302,7 +304,7 @@ public sealed class WorkerSignatureVerifierTests
null,
null);
var now = DateTimeOffset.UtcNow;
var now = DateTimeOffset.UnixEpoch;
var overrideModel = new IssuerTrustOverrideModel(weight, "stub", now, "test", now, "test");
return new StubIssuerDirectoryClient(
new[] { key },
@@ -322,6 +324,29 @@ public sealed class WorkerSignatureVerifierTests
bool includeGlobal,
CancellationToken cancellationToken)
=> ValueTask.FromResult(_trust);
public ValueTask<IssuerTrustResponseModel> SetIssuerTrustAsync(
string tenantId,
string issuerId,
decimal weight,
string? reason,
CancellationToken cancellationToken)
{
var now = DateTimeOffset.UnixEpoch;
var overrideModel = new IssuerTrustOverrideModel(weight, "stub-set", now, "test", now, "test");
_trust = new IssuerTrustResponseModel(overrideModel, null, weight);
return ValueTask.FromResult(_trust);
}
public ValueTask DeleteIssuerTrustAsync(
string tenantId,
string issuerId,
string? reason,
CancellationToken cancellationToken)
{
_trust = new IssuerTrustResponseModel(null, null, 0m);
return ValueTask.CompletedTask;
}
}
private sealed class FixedTimeProvider : TimeProvider

View File

@@ -3,5 +3,5 @@
## Sprint 64 Bundle Implementation
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| DVOFF-64-001 | DOING (2025-11-04) | DevPortal Offline Guild, Exporter Guild | DEVPORT-64-001, SDKREL-64-002 | Implement Export Center job `devportal --offline` bundling portal HTML, specs, SDK artifacts, changelogs, and verification manifest. | Job executes in staging; manifest contains checksums + DSSE signatures; docs updated. |
| DVOFF-64-001 | DONE (2025-11-05) | DevPortal Offline Guild, Exporter Guild | DEVPORT-64-001, SDKREL-64-002 | Implement Export Center job `devportal --offline` bundling portal HTML, specs, SDK artifacts, changelogs, and verification manifest. | Job executes in staging; manifest contains checksums + DSSE signatures; docs updated. |
| DVOFF-64-002 | TODO | DevPortal Offline Guild, AirGap Controller Guild | DVOFF-64-001 | Provide verification CLI (`stella devportal verify bundle.tgz`) ensuring integrity before import. | CLI command validates signatures; integration test covers corrupted bundle; runbook updated. |

View File

@@ -0,0 +1,176 @@
using System;
using System.IO;
using System.Text;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
namespace StellaOps.ExportCenter.Core.DevPortalOffline;
/// <summary>
/// Coordinates bundle construction, manifest signing, and artefact persistence for the devportal offline export.
/// </summary>
public sealed class DevPortalOfflineJob
{
private static readonly JsonSerializerOptions SignatureSerializerOptions = new(JsonSerializerDefaults.Web)
{
WriteIndented = true
};
private readonly DevPortalOfflineBundleBuilder _builder;
private readonly IDevPortalOfflineObjectStore _objectStore;
private readonly IDevPortalOfflineManifestSigner _manifestSigner;
private readonly ILogger<DevPortalOfflineJob> _logger;
public DevPortalOfflineJob(
DevPortalOfflineBundleBuilder builder,
IDevPortalOfflineObjectStore objectStore,
IDevPortalOfflineManifestSigner manifestSigner,
ILogger<DevPortalOfflineJob> logger)
{
_builder = builder ?? throw new ArgumentNullException(nameof(builder));
_objectStore = objectStore ?? throw new ArgumentNullException(nameof(objectStore));
_manifestSigner = manifestSigner ?? throw new ArgumentNullException(nameof(manifestSigner));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<DevPortalOfflineJobOutcome> ExecuteAsync(
DevPortalOfflineJobRequest request,
CancellationToken cancellationToken = default)
{
ArgumentNullException.ThrowIfNull(request);
cancellationToken.ThrowIfCancellationRequested();
var bundleResult = _builder.Build(request.BundleRequest, cancellationToken);
_logger.LogInformation("DevPortal offline bundle constructed with {EntryCount} entries.", bundleResult.Manifest.Entries.Count);
var signature = await _manifestSigner.SignAsync(
request.BundleRequest.BundleId,
bundleResult.ManifestJson,
bundleResult.RootHash,
cancellationToken).ConfigureAwait(false);
var storagePrefix = BuildStoragePrefix(request.StoragePrefix, request.BundleRequest.BundleId);
var bundleFileName = SanitizeFileName(request.BundleFileName);
var manifestKey = $"{storagePrefix}/manifest.json";
var signatureKey = $"{storagePrefix}/manifest.dsse.json";
var checksumsKey = $"{storagePrefix}/checksums.txt";
var bundleKey = $"{storagePrefix}/{bundleFileName}";
var manifestMetadata = await StoreTextAsync(
bundleResult.ManifestJson,
manifestKey,
MediaTypes.Json,
cancellationToken).ConfigureAwait(false);
var signatureJson = JsonSerializer.Serialize(signature, SignatureSerializerOptions);
var signatureMetadata = await StoreTextAsync(
signatureJson,
signatureKey,
MediaTypes.Json,
cancellationToken).ConfigureAwait(false);
var checksumsMetadata = await StoreTextAsync(
bundleResult.Checksums,
checksumsKey,
MediaTypes.Text,
cancellationToken).ConfigureAwait(false);
DevPortalOfflineStorageMetadata bundleMetadata;
using (bundleResult.BundleStream)
{
bundleResult.BundleStream.Position = 0;
bundleMetadata = await _objectStore.StoreAsync(
bundleResult.BundleStream,
new DevPortalOfflineObjectStoreOptions(bundleKey, MediaTypes.GZip),
cancellationToken).ConfigureAwait(false);
}
_logger.LogInformation(
"DevPortal offline bundle stored at {BundleKey} ({SizeBytes} bytes).",
bundleMetadata.StorageKey,
bundleMetadata.SizeBytes);
return new DevPortalOfflineJobOutcome(
bundleResult.Manifest,
bundleResult.RootHash,
signature,
manifestMetadata,
signatureMetadata,
checksumsMetadata,
bundleMetadata);
}
private Task<DevPortalOfflineStorageMetadata> StoreTextAsync(
string content,
string storageKey,
string contentType,
CancellationToken cancellationToken)
{
var bytes = Encoding.UTF8.GetBytes(content);
var stream = new MemoryStream(bytes, writable: false);
return StoreAsync(stream, storageKey, contentType, cancellationToken);
}
private async Task<DevPortalOfflineStorageMetadata> StoreAsync(
Stream stream,
string storageKey,
string contentType,
CancellationToken cancellationToken)
{
try
{
return await _objectStore.StoreAsync(
stream,
new DevPortalOfflineObjectStoreOptions(storageKey, contentType),
cancellationToken)
.ConfigureAwait(false);
}
finally
{
await stream.DisposeAsync().ConfigureAwait(false);
}
}
private static string BuildStoragePrefix(string? prefix, Guid bundleId)
{
var trimmed = string.IsNullOrWhiteSpace(prefix)
? string.Empty
: prefix.Trim().Trim('/').Replace('\\', '/');
return string.IsNullOrEmpty(trimmed)
? bundleId.ToString("D")
: $"{trimmed}/{bundleId:D}";
}
private static string SanitizeFileName(string? value)
{
if (string.IsNullOrWhiteSpace(value))
{
return "devportal-offline-bundle.tgz";
}
var fileName = Path.GetFileName(value);
if (string.IsNullOrEmpty(fileName))
{
return "devportal-offline-bundle.tgz";
}
if (fileName.Contains("..", StringComparison.Ordinal))
{
throw new InvalidOperationException("Bundle file name cannot contain path traversal sequences.");
}
return fileName;
}
private static class MediaTypes
{
public const string Json = "application/json";
public const string Text = "text/plain";
public const string GZip = "application/gzip";
}
}
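
For context, a sketch of how the job might be invoked. The DI resolution, `bundleRequest` variable, and prefix value here are assumptions for illustration, not part of the commit:

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.ExportCenter.Core.DevPortalOffline;

// Hypothetical host wiring; builder/store/signer registrations are assumed elsewhere.
var job = serviceProvider.GetRequiredService<DevPortalOfflineJob>();

var outcome = await job.ExecuteAsync(
    new DevPortalOfflineJobRequest(
        bundleRequest,                                  // assumed pre-built DevPortalOfflineBundleRequest
        StoragePrefix: "exports/devportal",
        BundleFileName: "devportal-offline-bundle.tgz"),
    cancellationToken);

// Artefacts land under "exports/devportal/<bundleId>/..." per BuildStoragePrefix.
Console.WriteLine(outcome.RootHash);                    // SHA-256 of the manifest JSON
Console.WriteLine(outcome.BundleStorage.StorageKey);
```

Note that `ExecuteAsync` disposes the bundle stream itself, so callers only own the returned metadata.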

View File

@@ -0,0 +1,34 @@
using System;
using System.Collections.Generic;
namespace StellaOps.ExportCenter.Core.DevPortalOffline;
/// <summary>
/// Represents the inputs required to execute the devportal offline export job.
/// </summary>
/// <param name="BundleRequest">Bundle builder request describing source directories and bundle metadata.</param>
/// <param name="StoragePrefix">Relative storage prefix (without bundle identifier) where artefacts are persisted.</param>
/// <param name="BundleFileName">File name used for the packaged archive in storage.</param>
public sealed record DevPortalOfflineJobRequest(
DevPortalOfflineBundleRequest BundleRequest,
string StoragePrefix,
string BundleFileName);
/// <summary>
/// Captures metadata produced after successfully executing the devportal offline export job.
/// </summary>
/// <param name="Manifest">The manifest describing bundled artefacts.</param>
/// <param name="RootHash">SHA-256 hash of the manifest JSON payload.</param>
/// <param name="Signature">DSSE envelope and metadata for the manifest.</param>
/// <param name="ManifestStorage">Storage metadata for the manifest JSON artefact.</param>
/// <param name="SignatureStorage">Storage metadata for the manifest signature document.</param>
/// <param name="ChecksumsStorage">Storage metadata for the checksum file.</param>
/// <param name="BundleStorage">Storage metadata for the bundled archive.</param>
public sealed record DevPortalOfflineJobOutcome(
DevPortalOfflineBundleManifest Manifest,
string RootHash,
DevPortalOfflineManifestSignatureDocument Signature,
DevPortalOfflineStorageMetadata ManifestStorage,
DevPortalOfflineStorageMetadata SignatureStorage,
DevPortalOfflineStorageMetadata ChecksumsStorage,
DevPortalOfflineStorageMetadata BundleStorage);

View File

@@ -0,0 +1,56 @@
using System;
using System.Collections.Generic;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
namespace StellaOps.ExportCenter.Core.DevPortalOffline;
/// <summary>
/// Provides DSSE signing for devportal offline bundle manifests.
/// </summary>
public interface IDevPortalOfflineManifestSigner
{
Task<DevPortalOfflineManifestSignatureDocument> SignAsync(
Guid bundleId,
string manifestJson,
string rootHash,
CancellationToken cancellationToken = default);
}
/// <summary>
/// DSSE signature document emitted for devportal manifest verification.
/// </summary>
/// <param name="BundleId">Identifier of the bundle being signed.</param>
/// <param name="RootHash">SHA-256 hash of the manifest JSON.</param>
/// <param name="SignedAtUtc">UTC timestamp when the signature was created.</param>
/// <param name="Algorithm">Signing algorithm identifier (for example HMACSHA256).</param>
/// <param name="KeyId">Identifier of the signing key.</param>
/// <param name="Envelope">DSSE envelope containing payload and signatures.</param>
public sealed record DevPortalOfflineManifestSignatureDocument(
[property: JsonPropertyName("bundleId")] Guid BundleId,
[property: JsonPropertyName("rootHash")] string RootHash,
[property: JsonPropertyName("signedAt")] DateTimeOffset SignedAtUtc,
[property: JsonPropertyName("algorithm")] string Algorithm,
[property: JsonPropertyName("keyId")] string KeyId,
[property: JsonPropertyName("envelope")] DevPortalOfflineManifestDsseEnvelope Envelope);
/// <summary>
/// Standard DSSE envelope carrying payload and signatures.
/// </summary>
/// <param name="PayloadType">Type of the payload (for example application/json).</param>
/// <param name="Payload">Base64-encoded manifest payload.</param>
/// <param name="Signatures">Collection of DSSE signatures.</param>
public sealed record DevPortalOfflineManifestDsseEnvelope(
[property: JsonPropertyName("payloadType")] string PayloadType,
[property: JsonPropertyName("payload")] string Payload,
[property: JsonPropertyName("signatures")] IReadOnlyList<DevPortalOfflineManifestDsseSignature> Signatures);
/// <summary>
/// Represents an individual DSSE signature entry.
/// </summary>
/// <param name="Signature">Base64-encoded signature.</param>
/// <param name="KeyId">Identifier for the key used to sign.</param>
public sealed record DevPortalOfflineManifestDsseSignature(
[property: JsonPropertyName("sig")] string Signature,
[property: JsonPropertyName("keyid")] string? KeyId);
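
One way to satisfy `IDevPortalOfflineManifestSigner` is an HMAC-SHA256 signer; the sketch below is an assumption for illustration (the commit's concrete implementation is not shown in this hunk). Per the DSSE specification, the signature covers the PAE encoding of the payload, not the raw base64 string:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// DSSE PAE: "DSSEv1 <len(type)> <type> <len(payload)> <payload>"
static byte[] Pae(string payloadType, byte[] payload)
{
    var header = $"DSSEv1 {Encoding.UTF8.GetByteCount(payloadType)} {payloadType} {payload.Length} ";
    var headerBytes = Encoding.UTF8.GetBytes(header);
    var result = new byte[headerBytes.Length + payload.Length];
    headerBytes.CopyTo(result, 0);
    payload.CopyTo(result, headerBytes.Length);
    return result;
}

var payloadType = "application/vnd.stella.devportal.manifest+json";
var payloadBytes = Encoding.UTF8.GetBytes(manifestJson);          // manifestJson supplied by the job
using var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(secret));  // secret from signing options (assumed)
var sig = Convert.ToBase64String(hmac.ComputeHash(Pae(payloadType, payloadBytes)));

var envelope = new DevPortalOfflineManifestDsseEnvelope(
    payloadType,
    Convert.ToBase64String(payloadBytes),
    new[] { new DevPortalOfflineManifestDsseSignature(sig, "devportal-offline-local") });
```

A verifier would recompute the PAE over the decoded payload and compare HMACs, then check the manifest's SHA-256 against `RootHash`.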

View File

@@ -3,16 +3,15 @@
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<LangVersion>preview</LangVersion>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<LangVersion>preview</LangVersion>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
</ItemGroup>
</Project>

View File

@@ -0,0 +1,18 @@
using System.ComponentModel.DataAnnotations;
namespace StellaOps.ExportCenter.Infrastructure.DevPortalOffline;
public sealed class DevPortalOfflineManifestSigningOptions
{
[Required]
public string KeyId { get; set; } = "devportal-offline-local";
[Required]
public string Secret { get; set; } = null!;
[Required]
public string Algorithm { get; set; } = "HMACSHA256";
[Required]
public string PayloadType { get; set; } = "application/vnd.stella.devportal.manifest+json";
}

View File

@@ -0,0 +1,9 @@
using System.ComponentModel.DataAnnotations;
namespace StellaOps.ExportCenter.Infrastructure.DevPortalOffline;
public sealed class DevPortalOfflineStorageOptions
{
[Required]
public string RootPath { get; set; } = null!;
}

View File

@@ -0,0 +1,149 @@
using System;
using System.Buffers;
using System.IO;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.ExportCenter.Core.DevPortalOffline;
namespace StellaOps.ExportCenter.Infrastructure.DevPortalOffline;
public sealed class FileSystemDevPortalOfflineObjectStore : IDevPortalOfflineObjectStore
{
private readonly IOptionsMonitor<DevPortalOfflineStorageOptions> _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<FileSystemDevPortalOfflineObjectStore> _logger;
public FileSystemDevPortalOfflineObjectStore(
IOptionsMonitor<DevPortalOfflineStorageOptions> options,
TimeProvider timeProvider,
ILogger<FileSystemDevPortalOfflineObjectStore> logger)
{
_options = options ?? throw new ArgumentNullException(nameof(options));
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<DevPortalOfflineStorageMetadata> StoreAsync(
Stream content,
DevPortalOfflineObjectStoreOptions storeOptions,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(content);
ArgumentNullException.ThrowIfNull(storeOptions);
var root = EnsureRootPath();
var storageKey = SanitizeKey(storeOptions.StorageKey);
var fullPath = GetFullPath(root, storageKey);
Directory.CreateDirectory(Path.GetDirectoryName(fullPath)!);
content.Seek(0, SeekOrigin.Begin);
using var fileStream = new FileStream(fullPath, FileMode.Create, FileAccess.Write, FileShare.None);
using var hash = IncrementalHash.CreateHash(HashAlgorithmName.SHA256);
var buffer = ArrayPool<byte>.Shared.Rent(128 * 1024);
long totalBytes = 0;
try
{
int read;
while ((read = await content.ReadAsync(buffer.AsMemory(0, buffer.Length), cancellationToken).ConfigureAwait(false)) > 0)
{
await fileStream.WriteAsync(buffer.AsMemory(0, read), cancellationToken).ConfigureAwait(false);
hash.AppendData(buffer, 0, read);
totalBytes += read;
}
}
finally
{
ArrayPool<byte>.Shared.Return(buffer);
}
await fileStream.FlushAsync(cancellationToken).ConfigureAwait(false);
content.Seek(0, SeekOrigin.Begin);
var sha = Convert.ToHexString(hash.GetHashAndReset()).ToLowerInvariant();
var createdAt = _timeProvider.GetUtcNow();
_logger.LogDebug("Stored devportal artefact at {Path} ({Bytes} bytes).", fullPath, totalBytes);
return new DevPortalOfflineStorageMetadata(
storageKey,
storeOptions.ContentType,
totalBytes,
sha,
createdAt);
}
public Task<bool> ExistsAsync(string storageKey, CancellationToken cancellationToken)
{
var root = EnsureRootPath();
var fullPath = GetFullPath(root, SanitizeKey(storageKey));
var exists = File.Exists(fullPath);
return Task.FromResult(exists);
}
public Task<Stream> OpenReadAsync(string storageKey, CancellationToken cancellationToken)
{
var root = EnsureRootPath();
var fullPath = GetFullPath(root, SanitizeKey(storageKey));
if (!File.Exists(fullPath))
{
throw new FileNotFoundException($"DevPortal offline artefact '{storageKey}' was not found.", fullPath);
}
Stream stream = new FileStream(fullPath, FileMode.Open, FileAccess.Read, FileShare.Read);
return Task.FromResult(stream);
}
private string EnsureRootPath()
{
var root = _options.CurrentValue.RootPath;
if (string.IsNullOrWhiteSpace(root))
{
throw new InvalidOperationException("DevPortal offline storage root path is not configured.");
}
var full = Path.GetFullPath(root);
if (!Directory.Exists(full))
{
Directory.CreateDirectory(full);
}
if (!full.EndsWith(Path.DirectorySeparatorChar))
{
full += Path.DirectorySeparatorChar;
}
return full;
}
private static string SanitizeKey(string storageKey)
{
if (string.IsNullOrWhiteSpace(storageKey))
{
throw new ArgumentException("Storage key cannot be empty.", nameof(storageKey));
}
if (storageKey.Contains("..", StringComparison.Ordinal))
{
throw new ArgumentException("Storage key cannot contain path traversal sequences.", nameof(storageKey));
}
var trimmed = storageKey.Trim().Trim('/').Replace('\\', '/');
return trimmed;
}
private static string GetFullPath(string root, string storageKey)
{
var combined = Path.Combine(root, storageKey.Replace('/', Path.DirectorySeparatorChar));
var fullPath = Path.GetFullPath(combined);
if (!fullPath.StartsWith(root, StringComparison.Ordinal))
{
throw new InvalidOperationException("Storage key resolves outside of configured root path.");
}
return fullPath;
}
}
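For reviewers, a minimal standalone sketch of the traversal guard that `SanitizeKey` and `GetFullPath` implement above: resolve the candidate path and require it to remain under the configured root. The `Resolve` helper name is illustrative, not part of the service's API.

```csharp
using System;
using System.IO;

// Illustrative guard mirroring GetFullPath above: normalize the combined path
// and reject any key that escapes the root directory.
static string Resolve(string root, string key)
{
    var rootFull = Path.GetFullPath(root);
    if (!rootFull.EndsWith(Path.DirectorySeparatorChar))
    {
        rootFull += Path.DirectorySeparatorChar;
    }
    var candidate = Path.GetFullPath(Path.Combine(rootFull, key));
    if (!candidate.StartsWith(rootFull, StringComparison.Ordinal))
    {
        throw new InvalidOperationException("Storage key resolves outside of the root path.");
    }
    return candidate;
}

var root = Path.Combine(Path.GetTempPath(), "devportal-root");
Console.WriteLine(Resolve(root, "exports/manifest.json")); // resolves under root
try
{
    Resolve(root, "../escaped.txt");
}
catch (InvalidOperationException)
{
    Console.WriteLine("rejected");
}
```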

View File

@@ -0,0 +1,121 @@
using System;
using System.Buffers.Binary;
using System.ComponentModel.DataAnnotations;
using System.Security.Cryptography;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.ExportCenter.Core.DevPortalOffline;
namespace StellaOps.ExportCenter.Infrastructure.DevPortalOffline;
public sealed class HmacDevPortalOfflineManifestSigner : IDevPortalOfflineManifestSigner
{
private readonly IOptionsMonitor<DevPortalOfflineManifestSigningOptions> _options;
private readonly TimeProvider _timeProvider;
private readonly ILogger<HmacDevPortalOfflineManifestSigner> _logger;
public HmacDevPortalOfflineManifestSigner(
IOptionsMonitor<DevPortalOfflineManifestSigningOptions> options,
TimeProvider timeProvider,
ILogger<HmacDevPortalOfflineManifestSigner> logger)
{
_options = options ?? throw new ArgumentNullException(nameof(options));
_timeProvider = timeProvider ?? TimeProvider.System;
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public Task<DevPortalOfflineManifestSignatureDocument> SignAsync(
Guid bundleId,
string manifestJson,
string rootHash,
CancellationToken cancellationToken = default)
{
if (string.IsNullOrWhiteSpace(manifestJson))
{
throw new ArgumentException("Manifest JSON is required.", nameof(manifestJson));
}
if (string.IsNullOrWhiteSpace(rootHash))
{
throw new ArgumentException("Root hash is required.", nameof(rootHash));
}
var options = _options.CurrentValue;
ValidateOptions(options);
var signedAt = _timeProvider.GetUtcNow();
var payloadBytes = Encoding.UTF8.GetBytes(manifestJson);
var pae = BuildPreAuthEncoding(options.PayloadType, payloadBytes);
var signature = ComputeSignature(options, pae);
var payloadBase64 = Convert.ToBase64String(payloadBytes);
_logger.LogDebug("Signed devportal manifest for bundle {BundleId}.", bundleId);
var envelope = new DevPortalOfflineManifestDsseEnvelope(
options.PayloadType,
payloadBase64,
new[]
{
new DevPortalOfflineManifestDsseSignature(signature, options.KeyId)
});
var document = new DevPortalOfflineManifestSignatureDocument(
bundleId,
rootHash,
signedAt,
options.Algorithm,
options.KeyId,
envelope);
return Task.FromResult(document);
}
private static void ValidateOptions(DevPortalOfflineManifestSigningOptions options)
{
Validator.ValidateObject(options, new ValidationContext(options), validateAllProperties: true);
if (!string.Equals(options.Algorithm, "HMACSHA256", StringComparison.OrdinalIgnoreCase))
{
throw new NotSupportedException($"Algorithm '{options.Algorithm}' is not supported for devportal manifest signing.");
}
}
private static string ComputeSignature(DevPortalOfflineManifestSigningOptions options, byte[] pae)
{
var secretBytes = Convert.FromBase64String(options.Secret);
using var hmac = new HMACSHA256(secretBytes);
var signatureBytes = hmac.ComputeHash(pae);
return Convert.ToBase64String(signatureBytes);
}
private static byte[] BuildPreAuthEncoding(string payloadType, byte[] payloadBytes)
{
// Binary pre-auth encoding: "DSSEv1" prefix, big-endian field count (2), then
// length-prefixed payload type and payload bytes. Note: the published DSSE spec
// defines an ASCII PAE ("DSSEv1 <len> <type> <len> <body>"); verifiers of this
// envelope must reproduce this binary layout instead.
var typeBytes = Encoding.UTF8.GetBytes(payloadType ?? string.Empty);
const string prefix = "DSSEv1";
var totalLength = prefix.Length +
sizeof(ulong) +
sizeof(ulong) + typeBytes.Length +
sizeof(ulong) + payloadBytes.Length;
var buffer = new byte[totalLength];
var span = buffer.AsSpan();
var offset = Encoding.UTF8.GetBytes(prefix, span);
BinaryPrimitives.WriteUInt64BigEndian(span[offset..], 2);
offset += sizeof(ulong);
BinaryPrimitives.WriteUInt64BigEndian(span[offset..], (ulong)typeBytes.Length);
offset += sizeof(ulong);
typeBytes.CopyTo(span[offset..]);
offset += typeBytes.Length;
BinaryPrimitives.WriteUInt64BigEndian(span[offset..], (ulong)payloadBytes.Length);
offset += sizeof(ulong);
payloadBytes.CopyTo(span[offset..]);
return buffer;
}
}
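A hedged verification sketch for the signer above: recompute the binary pre-auth encoding and compare HMAC-SHA256 values. The length-prefixed layout here mirrors `BuildPreAuthEncoding` in this diff (it is not the ASCII PAE from the published DSSE spec), and the `Pae`/`Sign` helper names are illustrative only.

```csharp
using System;
using System.Buffers.Binary;
using System.Security.Cryptography;
using System.Text;

// Illustrative verifier: "DSSEv1" prefix, big-endian field count,
// then length-prefixed payload type and payload bytes.
static byte[] Pae(string payloadType, byte[] payload)
{
    var type = Encoding.UTF8.GetBytes(payloadType);
    var buffer = new byte[6 + 8 + 8 + type.Length + 8 + payload.Length];
    var span = buffer.AsSpan();
    var offset = Encoding.UTF8.GetBytes("DSSEv1", span); // 6 bytes
    BinaryPrimitives.WriteUInt64BigEndian(span[offset..], 2); // field count
    offset += 8;
    BinaryPrimitives.WriteUInt64BigEndian(span[offset..], (ulong)type.Length);
    offset += 8;
    type.CopyTo(span[offset..]);
    offset += type.Length;
    BinaryPrimitives.WriteUInt64BigEndian(span[offset..], (ulong)payload.Length);
    offset += 8;
    payload.CopyTo(span[offset..]);
    return buffer;
}

static string Sign(byte[] key, byte[] data)
{
    using var hmac = new HMACSHA256(key);
    return Convert.ToBase64String(hmac.ComputeHash(data));
}

var secret = Encoding.UTF8.GetBytes("shared-secret");
var payload = Encoding.UTF8.GetBytes("{\"version\":\"devportal-offline/v1\"}");
var pae = Pae("application/vnd.stella.devportal.manifest+json", payload);

var produced = Sign(secret, pae);
var recomputed = Sign(secret, pae);
Console.WriteLine(produced == recomputed); // True: same key + PAE is deterministic
```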

View File

@@ -1,28 +1,19 @@
<?xml version="1.0" ?>
<Project Sdk="Microsoft.NET.Sdk">
<ItemGroup>
<ProjectReference Include="..\StellaOps.ExportCenter.Core\StellaOps.ExportCenter.Core.csproj"/>
</ItemGroup>
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<LangVersion>preview</LangVersion>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
</Project>
<?xml version="1.0" encoding="utf-8"?>
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<LangVersion>preview</LangVersion>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
<ItemGroup>
<ProjectReference Include="..\StellaOps.ExportCenter.Core\StellaOps.ExportCenter.Core.csproj" />
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
<PackageReference Include="Microsoft.Extensions.Options" Version="10.0.0-rc.2.25502.107" />
</ItemGroup>
</Project>

View File

@@ -52,7 +52,7 @@ public sealed class DevPortalOfflineBundleBuilderTests
var fixedNow = new DateTimeOffset(2025, 11, 4, 12, 30, 0, TimeSpan.Zero);
var builder = new DevPortalOfflineBundleBuilder(new FixedTimeProvider(fixedNow));
var result = builder.Build(request);
var result = builder.Build(request, TestContext.Current.CancellationToken);
Assert.Equal(request.BundleId, result.Manifest.BundleId);
Assert.Equal("devportal-offline/v1", result.Manifest.Version);
@@ -65,12 +65,12 @@ public sealed class DevPortalOfflineBundleBuilderTests
var expectedPaths = new[]
{
"changelog/CHANGELOG.md",
"portal/assets/app.js",
"portal/index.html",
"specs/openapi.yaml",
"sdks/dotnet/stellaops.sdk.nupkg",
"sdks/python/stellaops_sdk.whl",
"changelog/CHANGELOG.md"
"specs/openapi.yaml"
};
Assert.Equal(expectedPaths, result.Manifest.Entries.Select(entry => entry.Path).ToArray());
@@ -119,7 +119,10 @@ public sealed class DevPortalOfflineBundleBuilderTests
}
finally
{
tempRoot.Dispose();
if (Directory.Exists(tempRoot.FullName))
{
Directory.Delete(tempRoot.FullName, recursive: true);
}
}
}
@@ -129,7 +132,7 @@ public sealed class DevPortalOfflineBundleBuilderTests
var builder = new DevPortalOfflineBundleBuilder(new FixedTimeProvider(DateTimeOffset.UtcNow));
var request = new DevPortalOfflineBundleRequest(Guid.NewGuid());
var exception = Assert.Throws<InvalidOperationException>(() => builder.Build(request));
var exception = Assert.Throws<InvalidOperationException>(() => builder.Build(request, TestContext.Current.CancellationToken));
Assert.Contains("does not contain any files", exception.Message, StringComparison.Ordinal);
}
@@ -145,7 +148,7 @@ public sealed class DevPortalOfflineBundleBuilderTests
File.WriteAllText(Path.Combine(portalRoot, "index.html"), "<html/>");
var builder = new DevPortalOfflineBundleBuilder(new FixedTimeProvider(DateTimeOffset.UtcNow));
var result = builder.Build(new DevPortalOfflineBundleRequest(Guid.NewGuid(), portalRoot));
var result = builder.Build(new DevPortalOfflineBundleRequest(Guid.NewGuid(), portalRoot), TestContext.Current.CancellationToken);
Assert.Single(result.Manifest.Entries);
Assert.True(result.Manifest.Sources.PortalIncluded);
@@ -155,7 +158,10 @@ public sealed class DevPortalOfflineBundleBuilderTests
}
finally
{
tempRoot.Dispose();
if (Directory.Exists(tempRoot.FullName))
{
Directory.Delete(tempRoot.FullName, recursive: true);
}
}
}
@@ -166,7 +172,7 @@ public sealed class DevPortalOfflineBundleBuilderTests
var missing = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
var request = new DevPortalOfflineBundleRequest(Guid.NewGuid(), missing);
Assert.Throws<DirectoryNotFoundException>(() => builder.Build(request));
Assert.Throws<DirectoryNotFoundException>(() => builder.Build(request, TestContext.Current.CancellationToken));
}
private static string CalculateFileHash(string path)

View File

@@ -0,0 +1,217 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.ExportCenter.Core.DevPortalOffline;
using Microsoft.Extensions.Logging.Abstractions;
namespace StellaOps.ExportCenter.Tests;
public class DevPortalOfflineJobTests
{
[Fact]
public async Task ExecuteAsync_StoresArtefacts()
{
var tempRoot = Directory.CreateTempSubdirectory();
try
{
var portalRoot = Path.Combine(tempRoot.FullName, "portal");
Directory.CreateDirectory(portalRoot);
File.WriteAllText(Path.Combine(portalRoot, "index.html"), "<html>offline</html>");
var specsRoot = Path.Combine(tempRoot.FullName, "specs");
Directory.CreateDirectory(specsRoot);
File.WriteAllText(Path.Combine(specsRoot, "api.json"), "{\"openapi\":\"3.1.0\"}");
var sdkRoot = Path.Combine(tempRoot.FullName, "sdk-dotnet");
Directory.CreateDirectory(sdkRoot);
File.WriteAllText(Path.Combine(sdkRoot, "stellaops.sdk.nupkg"), "binary");
var changelogRoot = Path.Combine(tempRoot.FullName, "changelog");
Directory.CreateDirectory(changelogRoot);
File.WriteAllText(Path.Combine(changelogRoot, "CHANGELOG.md"), "# 2025.11.0");
var bundleId = Guid.Parse("5e76bb6f-2925-41ea-8e72-1fd8a384dd1a");
var request = new DevPortalOfflineBundleRequest(
bundleId,
portalRoot,
specsRoot,
new[] { new DevPortalSdkSource("dotnet", sdkRoot) },
changelogRoot,
new Dictionary<string, string> { ["releaseVersion"] = "2025.11.0" });
var fixedNow = new DateTimeOffset(2025, 11, 4, 18, 15, 0, TimeSpan.Zero);
var timeProvider = new FixedTimeProvider(fixedNow);
var builder = new DevPortalOfflineBundleBuilder(timeProvider);
var objectStore = new InMemoryObjectStore(timeProvider);
var signer = new TestManifestSigner(timeProvider);
var job = new DevPortalOfflineJob(builder, objectStore, signer, NullLogger<DevPortalOfflineJob>.Instance);
var outcome = await job.ExecuteAsync(
new DevPortalOfflineJobRequest(request, "exports/devportal", "bundle.tgz"),
TestContext.Current.CancellationToken);
var expectedPrefix = $"exports/devportal/{bundleId:D}";
Assert.Equal($"{expectedPrefix}/manifest.json", outcome.ManifestStorage.StorageKey);
Assert.Equal($"{expectedPrefix}/manifest.dsse.json", outcome.SignatureStorage.StorageKey);
Assert.Equal($"{expectedPrefix}/checksums.txt", outcome.ChecksumsStorage.StorageKey);
Assert.Equal($"{expectedPrefix}/bundle.tgz", outcome.BundleStorage.StorageKey);
var manifestText = objectStore.GetText(outcome.ManifestStorage.StorageKey);
using (var manifestDoc = JsonDocument.Parse(manifestText))
{
Assert.Equal("devportal-offline/v1", manifestDoc.RootElement.GetProperty("version").GetString());
Assert.Equal("2025.11.0", manifestDoc.RootElement.GetProperty("metadata").GetProperty("releaseVersion").GetString());
}
var signatureText = objectStore.GetText(outcome.SignatureStorage.StorageKey);
Assert.Contains(bundleId.ToString("D"), signatureText, StringComparison.OrdinalIgnoreCase);
Assert.Contains(outcome.RootHash, signatureText, StringComparison.Ordinal);
}
finally
{
if (Directory.Exists(tempRoot.FullName))
{
Directory.Delete(tempRoot.FullName, recursive: true);
}
}
}
[Fact]
public async Task ExecuteAsync_SanitizesBundleFileName()
{
var builder = new DevPortalOfflineBundleBuilder(new FixedTimeProvider(DateTimeOffset.UtcNow));
var objectStore = new InMemoryObjectStore(new FixedTimeProvider(DateTimeOffset.UtcNow));
var signer = new TestManifestSigner(new FixedTimeProvider(DateTimeOffset.UtcNow));
var job = new DevPortalOfflineJob(builder, objectStore, signer, NullLogger<DevPortalOfflineJob>.Instance);
var tempRoot = Directory.CreateTempSubdirectory();
try
{
var portalRoot = Path.Combine(tempRoot.FullName, "portal");
Directory.CreateDirectory(portalRoot);
File.WriteAllText(Path.Combine(portalRoot, "index.html"), "<html/>");
var request = new DevPortalOfflineBundleRequest(Guid.NewGuid(), portalRoot);
var outcome = await job.ExecuteAsync(
new DevPortalOfflineJobRequest(request, "exports", "../bundle.tgz"),
TestContext.Current.CancellationToken);
var expectedPrefix = $"exports/{request.BundleId:D}";
Assert.Equal($"{expectedPrefix}/bundle.tgz", outcome.BundleStorage.StorageKey);
}
finally
{
if (Directory.Exists(tempRoot.FullName))
{
Directory.Delete(tempRoot.FullName, recursive: true);
}
}
}
private sealed class InMemoryObjectStore : IDevPortalOfflineObjectStore
{
private readonly Dictionary<string, InMemoryEntry> _entries = new(StringComparer.Ordinal);
private readonly TimeProvider _timeProvider;
public InMemoryObjectStore(TimeProvider timeProvider)
{
_timeProvider = timeProvider;
}
public Task<DevPortalOfflineStorageMetadata> StoreAsync(
Stream content,
DevPortalOfflineObjectStoreOptions options,
CancellationToken cancellationToken)
{
using var memory = new MemoryStream();
content.CopyTo(memory);
var bytes = memory.ToArray();
content.Seek(0, SeekOrigin.Begin);
var sha = Convert.ToHexString(SHA256.HashData(bytes)).ToLowerInvariant();
var metadata = new DevPortalOfflineStorageMetadata(
options.StorageKey,
options.ContentType,
bytes.Length,
sha,
_timeProvider.GetUtcNow());
_entries[options.StorageKey] = new InMemoryEntry(options.ContentType, bytes);
return Task.FromResult(metadata);
}
public Task<bool> ExistsAsync(string storageKey, CancellationToken cancellationToken)
=> Task.FromResult(_entries.ContainsKey(storageKey));
public Task<Stream> OpenReadAsync(string storageKey, CancellationToken cancellationToken)
{
if (!_entries.TryGetValue(storageKey, out var entry))
{
throw new FileNotFoundException(storageKey);
}
Stream stream = new MemoryStream(entry.Content, writable: false);
return Task.FromResult(stream);
}
public string GetText(string storageKey)
{
if (!_entries.TryGetValue(storageKey, out var entry))
{
throw new FileNotFoundException(storageKey);
}
return Encoding.UTF8.GetString(entry.Content);
}
private sealed record InMemoryEntry(string ContentType, byte[] Content);
}
private sealed class TestManifestSigner : IDevPortalOfflineManifestSigner
{
private readonly TimeProvider _timeProvider;
public TestManifestSigner(TimeProvider timeProvider)
{
_timeProvider = timeProvider;
}
public Task<DevPortalOfflineManifestSignatureDocument> SignAsync(
Guid bundleId,
string manifestJson,
string rootHash,
CancellationToken cancellationToken = default)
{
var payload = Convert.ToBase64String(Encoding.UTF8.GetBytes(manifestJson));
return Task.FromResult(new DevPortalOfflineManifestSignatureDocument(
bundleId,
rootHash,
_timeProvider.GetUtcNow(),
"TEST-SHA256",
"test-key",
new DevPortalOfflineManifestDsseEnvelope(
"application/json",
payload,
new[] { new DevPortalOfflineManifestDsseSignature("c2lnbmF0dXJl", "test-key") })));
}
}
private sealed class FixedTimeProvider : TimeProvider
{
private readonly DateTimeOffset _utcNow;
public FixedTimeProvider(DateTimeOffset utcNow)
{
_utcNow = utcNow;
}
public override DateTimeOffset GetUtcNow() => _utcNow;
public override long GetTimestamp() => TimeProvider.System.GetTimestamp();
}
}

View File

@@ -0,0 +1,147 @@
using System;
using System.Buffers.Binary;
using System.Security.Cryptography;
using System.Threading;
using System.Text;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using StellaOps.ExportCenter.Core.DevPortalOffline;
using StellaOps.ExportCenter.Infrastructure.DevPortalOffline;
namespace StellaOps.ExportCenter.Tests;
public class HmacDevPortalOfflineManifestSignerTests
{
[Fact]
public async Task SignAsync_ComputesDeterministicSignature()
{
var options = new DevPortalOfflineManifestSigningOptions
{
KeyId = "devportal-test",
Secret = Convert.ToBase64String(Encoding.UTF8.GetBytes("shared-secret")),
Algorithm = "HMACSHA256",
PayloadType = "application/vnd.stella.devportal.manifest+json"
};
var now = new DateTimeOffset(2025, 11, 4, 19, 0, 0, TimeSpan.Zero);
var signer = new HmacDevPortalOfflineManifestSigner(
new StaticOptionsMonitor<DevPortalOfflineManifestSigningOptions>(options),
new FixedTimeProvider(now),
NullLogger<HmacDevPortalOfflineManifestSigner>.Instance);
const string manifest = "{\"version\":\"v1\",\"entries\":[]}";
var rootHash = Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(manifest))).ToLowerInvariant();
var bundleId = Guid.Parse("9ca2aafb-42b7-4df9-85f7-5a1d46c4e0ef");
var document = await signer.SignAsync(bundleId, manifest, rootHash, TestContext.Current.CancellationToken);
Assert.Equal(bundleId, document.BundleId);
Assert.Equal(rootHash, document.RootHash);
Assert.Equal(now, document.SignedAtUtc);
Assert.Equal(options.Algorithm, document.Algorithm);
Assert.Equal(options.KeyId, document.KeyId);
Assert.Equal(Convert.ToBase64String(Encoding.UTF8.GetBytes(manifest)), document.Envelope.Payload);
Assert.Equal(options.PayloadType, document.Envelope.PayloadType);
var signature = Assert.Single(document.Envelope.Signatures);
Assert.Equal(options.KeyId, signature.KeyId);
var expectedSignature = ComputeExpectedSignature(options, manifest);
Assert.Equal(expectedSignature, signature.Signature);
}
[Fact]
public async Task SignAsync_ThrowsForUnsupportedAlgorithm()
{
var options = new DevPortalOfflineManifestSigningOptions
{
KeyId = "devportal-test",
Secret = Convert.ToBase64String(Encoding.UTF8.GetBytes("shared-secret")),
Algorithm = "RSA",
PayloadType = "application/json"
};
var signer = new HmacDevPortalOfflineManifestSigner(
new StaticOptionsMonitor<DevPortalOfflineManifestSigningOptions>(options),
new FixedTimeProvider(DateTimeOffset.UtcNow),
NullLogger<HmacDevPortalOfflineManifestSigner>.Instance);
await Assert.ThrowsAsync<NotSupportedException>(() =>
signer.SignAsync(Guid.NewGuid(), "{}", "root", TestContext.Current.CancellationToken));
}
private static string ComputeExpectedSignature(DevPortalOfflineManifestSigningOptions options, string manifest)
{
var payloadBytes = Encoding.UTF8.GetBytes(manifest);
var pae = BuildPreAuthEncoding(options.PayloadType, payloadBytes);
var secret = Convert.FromBase64String(options.Secret);
using var hmac = new HMACSHA256(secret);
var signature = hmac.ComputeHash(pae);
return Convert.ToBase64String(signature);
}
private static byte[] BuildPreAuthEncoding(string payloadType, byte[] payloadBytes)
{
var typeBytes = Encoding.UTF8.GetBytes(payloadType ?? string.Empty);
const string prefix = "DSSEv1";
var buffer = new byte[prefix.Length + sizeof(ulong) + sizeof(ulong) + typeBytes.Length + sizeof(ulong) + payloadBytes.Length];
var span = buffer.AsSpan();
var written = Encoding.UTF8.GetBytes(prefix, span);
span = span[written..];
BinaryPrimitives.WriteUInt64BigEndian(span, 2);
span = span[sizeof(ulong)..];
BinaryPrimitives.WriteUInt64BigEndian(span, (ulong)typeBytes.Length);
span = span[sizeof(ulong)..];
typeBytes.CopyTo(span);
span = span[typeBytes.Length..];
BinaryPrimitives.WriteUInt64BigEndian(span, (ulong)payloadBytes.Length);
span = span[sizeof(ulong)..];
payloadBytes.CopyTo(span);
return buffer;
}
private sealed class FixedTimeProvider : TimeProvider
{
private readonly DateTimeOffset _utcNow;
public FixedTimeProvider(DateTimeOffset utcNow)
{
_utcNow = utcNow;
}
public override DateTimeOffset GetUtcNow() => _utcNow;
public override long GetTimestamp() => TimeProvider.System.GetTimestamp();
}
private sealed class StaticOptionsMonitor<T> : IOptionsMonitor<T>
{
private readonly T _value;
public StaticOptionsMonitor(T value)
{
_value = value;
}
public T CurrentValue => _value;
public T Get(string? name) => _value;
public IDisposable OnChange(Action<T, string?> listener) => NullDisposable.Instance;
private sealed class NullDisposable : IDisposable
{
public static readonly NullDisposable Instance = new();
public void Dispose()
{
}
}
}
}

View File

@@ -0,0 +1,19 @@
using System;
using System.Collections.Generic;
namespace StellaOps.ExportCenter.Worker;
public sealed class DevPortalOfflineWorkerOptions
{
public bool Enabled { get; set; }
public Guid? BundleId { get; set; }
public string? StoragePrefix { get; set; }
public string? BundleFileName { get; set; }
public string? PortalDirectory { get; set; }
public string? SpecsDirectory { get; set; }
public string? ChangelogDirectory { get; set; }
public List<DevPortalOfflineWorkerSdkSource>? SdkSources { get; set; }
public Dictionary<string, string>? Metadata { get; set; }
}
public sealed record DevPortalOfflineWorkerSdkSource(string Name, string Directory);

View File

@@ -1,7 +1,22 @@
using StellaOps.ExportCenter.Worker;
var builder = Host.CreateApplicationBuilder(args);
builder.Services.AddHostedService<Worker>();
var host = builder.Build();
host.Run();
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using StellaOps.ExportCenter.Core.DevPortalOffline;
using StellaOps.ExportCenter.Infrastructure.DevPortalOffline;
using StellaOps.ExportCenter.Worker;
var builder = Host.CreateApplicationBuilder(args);
builder.Services.AddSingleton(TimeProvider.System);
builder.Services.Configure<DevPortalOfflineWorkerOptions>(builder.Configuration.GetSection("DevPortalOffline"));
builder.Services.Configure<DevPortalOfflineManifestSigningOptions>(builder.Configuration.GetSection("DevPortalOffline:Signing"));
builder.Services.Configure<DevPortalOfflineStorageOptions>(builder.Configuration.GetSection("DevPortalOffline:Storage"));
builder.Services.AddSingleton<DevPortalOfflineBundleBuilder>();
builder.Services.AddSingleton<IDevPortalOfflineManifestSigner, HmacDevPortalOfflineManifestSigner>();
builder.Services.AddSingleton<IDevPortalOfflineObjectStore, FileSystemDevPortalOfflineObjectStore>();
builder.Services.AddSingleton<DevPortalOfflineJob>();
builder.Services.AddHostedService<Worker>();
var host = builder.Build();
host.Run();

View File

@@ -1,16 +1,87 @@
namespace StellaOps.ExportCenter.Worker;
public class Worker(ILogger<Worker> logger) : BackgroundService
{
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
while (!stoppingToken.IsCancellationRequested)
{
if (logger.IsEnabled(LogLevel.Information))
{
logger.LogInformation("Worker running at: {time}", DateTimeOffset.Now);
}
await Task.Delay(1000, stoppingToken);
}
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.ExportCenter.Core.DevPortalOffline;
namespace StellaOps.ExportCenter.Worker;
public sealed class Worker : BackgroundService
{
private readonly ILogger<Worker> _logger;
private readonly DevPortalOfflineJob _job;
private readonly IOptions<DevPortalOfflineWorkerOptions> _options;
public Worker(
ILogger<Worker> logger,
DevPortalOfflineJob job,
IOptions<DevPortalOfflineWorkerOptions> options)
{
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
_job = job ?? throw new ArgumentNullException(nameof(job));
_options = options ?? throw new ArgumentNullException(nameof(options));
}
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
{
var options = _options.Value ?? new DevPortalOfflineWorkerOptions();
if (!options.Enabled)
{
_logger.LogInformation("DevPortal offline job disabled. Worker idling.");
await Task.Delay(Timeout.Infinite, stoppingToken).ConfigureAwait(false);
return;
}
try
{
var request = BuildRequest(options);
var outcome = await _job.ExecuteAsync(request, stoppingToken).ConfigureAwait(false);
_logger.LogInformation(
"DevPortal offline export completed. Bundle stored at {Key}. Manifest signature key {SignatureKey}.",
outcome.BundleStorage.StorageKey,
outcome.SignatureStorage.StorageKey);
}
catch (Exception ex) when (!stoppingToken.IsCancellationRequested)
{
_logger.LogError(ex, "DevPortal offline export job failed.");
throw;
}
await Task.Delay(Timeout.Infinite, stoppingToken).ConfigureAwait(false);
}
private static DevPortalOfflineJobRequest BuildRequest(DevPortalOfflineWorkerOptions options)
{
var bundleId = options.BundleId ?? Guid.NewGuid();
var metadata = options.Metadata is null || options.Metadata.Count == 0
? null
: new Dictionary<string, string>(options.Metadata, StringComparer.Ordinal);
var sdkSources = options.SdkSources is { Count: > 0 }
? options.SdkSources
.Select(source => new DevPortalSdkSource(source.Name, source.Directory))
.ToArray()
: Array.Empty<DevPortalSdkSource>();
var bundleRequest = new DevPortalOfflineBundleRequest(
bundleId,
options.PortalDirectory,
options.SpecsDirectory,
sdkSources,
options.ChangelogDirectory,
metadata);
var storagePrefix = options.StoragePrefix ?? "devportal/offline";
var bundleFileName = string.IsNullOrWhiteSpace(options.BundleFileName)
? "devportal-offline-bundle.tgz"
: options.BundleFileName;
return new DevPortalOfflineJobRequest(
bundleRequest,
storagePrefix,
bundleFileName);
}
}

View File

@@ -1,8 +1,25 @@
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.Hosting.Lifetime": "Information"
}
}
}
{
"Logging": {
"LogLevel": {
"Default": "Information",
"Microsoft.Hosting.Lifetime": "Information"
}
},
"DevPortalOffline": {
"Enabled": false,
"StoragePrefix": "devportal/offline",
"BundleFileName": "devportal-offline-bundle.tgz",
"Metadata": {
"releaseChannel": "stable"
},
"Storage": {
"RootPath": "./out/devportal-offline"
},
"Signing": {
"KeyId": "devportal-offline-local",
"Secret": "ZGV2cG9ydGFsLW9mZmxpbmUtc2lnbmVyLXNlY3JldA==",
"Algorithm": "HMACSHA256",
"PayloadType": "application/vnd.stella.devportal.manifest+json"
}
}
}
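One configuration gotcha worth noting: `Signing:Secret` must be the base64 encoding of the raw HMAC key, because `HmacDevPortalOfflineManifestSigner` passes it through `Convert.FromBase64String`. A quick sketch decoding the sample value shipped in the appsettings above:

```csharp
using System;
using System.Text;

// The sample secret decodes to a readable placeholder key, confirming it is
// base64 of raw bytes rather than the key material itself.
var secret = "ZGV2cG9ydGFsLW9mZmxpbmUtc2lnbmVyLXNlY3JldA==";
var raw = Convert.FromBase64String(secret);
Console.WriteLine(Encoding.UTF8.GetString(raw)); // devportal-offline-signer-secret
```

A non-base64 value here fails at signing time with a `FormatException`, so deployments should validate the secret when it is rotated.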

View File

@@ -1,3 +1,3 @@
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Scanner.WebService.Tests")]
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Scanner.WebService.Tests")]

View File

@@ -97,6 +97,8 @@ builder.Services.AddSurfaceEnvironment(options =>
builder.Services.AddSurfaceValidation();
builder.Services.AddSurfaceFileCache();
builder.Services.AddSurfaceSecrets();
builder.Services.AddSingleton<IConfigureOptions<SurfaceCacheOptions>>(sp =>
new SurfaceCacheOptionsConfigurator(sp.GetRequiredService<ISurfaceEnvironment>()));
builder.Services.AddSingleton<ISurfacePointerService, SurfacePointerService>();
builder.Services.AddSingleton<IRedisConnectionFactory, RedisConnectionFactory>();
if (bootstrapOptions.Events is { Enabled: true } eventsOptions
@@ -370,5 +372,24 @@ if (resolvedOptions.Features.EnablePolicyPreview)
apiGroup.MapReportEndpoints(resolvedOptions.Api.ReportsSegment);
apiGroup.MapRuntimeEndpoints(resolvedOptions.Api.RuntimeSegment);
app.MapOpenApiIfAvailable();
await app.RunAsync().ConfigureAwait(false);
app.MapOpenApiIfAvailable();
await app.RunAsync().ConfigureAwait(false);
public partial class Program;
internal sealed class SurfaceCacheOptionsConfigurator : IConfigureOptions<SurfaceCacheOptions>
{
private readonly ISurfaceEnvironment _surfaceEnvironment;
public SurfaceCacheOptionsConfigurator(ISurfaceEnvironment surfaceEnvironment)
{
_surfaceEnvironment = surfaceEnvironment ?? throw new ArgumentNullException(nameof(surfaceEnvironment));
}
public void Configure(SurfaceCacheOptions options)
{
ArgumentNullException.ThrowIfNull(options);
var settings = _surfaceEnvironment.Settings;
options.RootDirectory = settings.CacheRoot.FullName;
}
}

View File

@@ -4,7 +4,8 @@
|----|--------|----------|------------|-------------|---------------|
| SCAN-REPLAY-186-001 | TODO | Scanner WebService Guild | REPLAY-CORE-185-001 | Implement scan `record` mode producing replay manifests/bundles, capture policy/feed/tool hashes, and update `docs/modules/scanner/architecture.md` referencing `docs/replay/DETERMINISTIC_REPLAY.md` Section 6. | API/worker integration tests cover record mode; docs merged; replay artifacts stored per spec. |
| SCANNER-SURFACE-02 | DONE (2025-11-05) | Scanner WebService Guild | SURFACE-FS-02 | Publish Surface.FS pointers (CAS URIs, manifests) via scan/report APIs and update attestation metadata.<br>2025-11-05: Surface pointers projected through scan/report endpoints, orchestrator samples + DSSE fixtures refreshed with manifest block, readiness tests updated to use validator stub. | OpenAPI updated; clients regenerated; integration tests validate pointer presence and tenancy. |
| SCANNER-ENV-02 | DOING (2025-11-02) | Scanner WebService Guild, Ops Guild | SURFACE-ENV-02 | Wire Surface.Env helpers into WebService hosting (cache roots, feature flags) and document configuration.<br>2025-11-02: Cache root resolution switched to helper; feature flag bindings updated; Helm/Compose updates pending review. | Service uses helper; env table documented; helm/compose templates updated. |
| SCANNER-ENV-02 | TODO (2025-11-06) | Scanner WebService Guild, Ops Guild | SURFACE-ENV-02 | Wire Surface.Env helpers into WebService hosting (cache roots, feature flags) and document configuration.<br>2025-11-02: Cache root resolution switched to helper; feature flag bindings updated; Helm/Compose updates pending review.<br>2025-11-05 14:55Z: Aligning readiness checks, docs, and Helm/Compose templates with Surface.Env outputs and planning test coverage for configuration fallbacks.<br>2025-11-06 17:05Z: Surface.Env documentation/README refreshed; warning catalogue captured for ops handoff.<br>2025-11-06 07:45Z: Helm values (dev/stage/prod/airgap/mirror) and Compose examples updated with `SCANNER_SURFACE_*` defaults plus rollout warning note in `deploy/README.md`.<br>2025-11-06 07:55Z: Paused; follow-up automation captured under `DEVOPS-OPENSSL-11-001/002` and pending Surface.Env readiness tests. | Service uses helper; env table documented; helm/compose templates updated. |
> 2025-11-05 19:18Z: Added configurator to project wiring and unit test ensuring Surface.Env cache root is honoured.
| SCANNER-SECRETS-02 | DOING (2025-11-02) | Scanner WebService Guild, Security Guild | SURFACE-SECRETS-02 | Replace ad-hoc secret wiring with Surface.Secrets for report/export operations (registry and CAS tokens).<br>2025-11-02: Export/report flows now depend on Surface.Secrets stub; integration tests in progress. | Secrets fetched through shared provider; unit/integration tests cover rotation + failure cases. |
| SCANNER-EVENTS-16-301 | BLOCKED (2025-10-26) | Scanner WebService Guild | ORCH-SVC-38-101, NOTIFY-SVC-38-001 | Emit orchestrator-compatible envelopes (`scanner.event.*`) and update integration tests to verify Notifier ingestion (no Redis queue coupling). | Tests assert envelope schema + orchestrator publish; Notifier consumer harness passes; docs updated with new event contract. Blocked by .NET 10 preview OpenAPI/Auth dependency drift preventing `dotnet test` completion. |
| SCANNER-EVENTS-16-302 | DOING (2025-10-26) | Scanner WebService Guild | SCANNER-EVENTS-16-301 | Extend orchestrator event links (report/policy/attestation) once endpoints are finalised across gateway + console. | Links section covers UI/API targets; downstream consumers validated; docs/samples updated. |

View File

@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("StellaOps.Scanner.Worker.Tests")]

View File

@@ -36,6 +36,8 @@ builder.Services.AddSurfaceEnvironment(options =>
builder.Services.AddSurfaceValidation();
builder.Services.AddSurfaceFileCache();
builder.Services.AddSurfaceSecrets();
builder.Services.AddSingleton<IConfigureOptions<SurfaceCacheOptions>>(sp =>
new SurfaceCacheOptionsConfigurator(sp.GetRequiredService<ISurfaceEnvironment>()));
builder.Services.AddSingleton<ScannerWorkerMetrics>();
builder.Services.AddSingleton<ScanProgressReporter>();
builder.Services.AddSingleton<ScanJobProcessor>();
@@ -127,3 +129,20 @@ var host = builder.Build();
await host.RunAsync();
public partial class Program;
internal sealed class SurfaceCacheOptionsConfigurator : IConfigureOptions<SurfaceCacheOptions>
{
private readonly ISurfaceEnvironment _surfaceEnvironment;
public SurfaceCacheOptionsConfigurator(ISurfaceEnvironment surfaceEnvironment)
{
_surfaceEnvironment = surfaceEnvironment ?? throw new ArgumentNullException(nameof(surfaceEnvironment));
}
public void Configure(SurfaceCacheOptions options)
{
ArgumentNullException.ThrowIfNull(options);
var settings = _surfaceEnvironment.Settings;
options.RootDirectory = settings.CacheRoot.FullName;
}
}

View File

@@ -4,5 +4,6 @@
|----|--------|----------|------------|-------------|---------------|
| SCAN-REPLAY-186-002 | TODO | Scanner Worker Guild | REPLAY-CORE-185-001 | Enforce deterministic analyzer execution when consuming replay input bundles, emit layer Merkle metadata, and author `docs/modules/scanner/deterministic-execution.md` summarising invariants from `docs/replay/DETERMINISTIC_REPLAY.md` Section 4. | Replay mode analyzers pass determinism tests; new doc merged; integration fixtures updated. |
| SCANNER-SURFACE-01 | DOING (2025-11-02) | Scanner Worker Guild | SURFACE-FS-02 | Persist Surface.FS manifests after analyzer stages, including layer CAS metadata and EntryTrace fragments.<br>2025-11-02: Draft Surface.FS manifests emitted for sample scans; telemetry counters under review. | Integration tests prove cache entries exist; telemetry counters exported. |
| SCANNER-ENV-01 | DOING (2025-11-02) | Scanner Worker Guild | SURFACE-ENV-02 | Replace ad-hoc environment reads with `StellaOps.Scanner.Surface.Env` helpers for cache roots and CAS endpoints.<br>2025-11-02: Worker bootstrap now resolves cache roots via helper; warning path documented; smoke tests running. | Worker boots with helper; misconfiguration warnings documented; smoke tests updated. |
| SCANNER-ENV-01 | TODO (2025-11-06) | Scanner Worker Guild | SURFACE-ENV-02 | Replace ad-hoc environment reads with `StellaOps.Scanner.Surface.Env` helpers for cache roots and CAS endpoints.<br>2025-11-02: Worker bootstrap now resolves cache roots via helper; warning path documented; smoke tests running.<br>2025-11-05 14:55Z: Extending helper usage into cache/secrets configuration, updating worker validator wiring, and drafting docs/tests for new Surface.Env outputs.<br>2025-11-06 17:05Z: README/design docs updated with warning catalogue; startup logging guidance captured for ops runbooks.<br>2025-11-06 07:45Z: Helm/Compose env profiles (dev/stage/prod/airgap/mirror) now seed `SCANNER_SURFACE_*` defaults to keep worker cache roots aligned with Surface.Env helpers.<br>2025-11-06 07:55Z: Paused; pending automation tracked via `DEVOPS-OPENSSL-11-001/002` and Surface.Env test fixtures. | Worker boots with helper; misconfiguration warnings documented; smoke tests updated. |
> 2025-11-05 19:18Z: Bound `SurfaceCacheOptions` root directory to resolved Surface.Env settings and added unit coverage around the configurator.
| SCANNER-SECRETS-01 | DOING (2025-11-02) | Scanner Worker Guild, Security Guild | SURFACE-SECRETS-02 | Adopt `StellaOps.Scanner.Surface.Secrets` for registry/CAS credentials during scan execution.<br>2025-11-02: Surface.Secrets provider wired for CAS token retrieval; integration tests added. | Secrets fetched via shared provider; legacy secret code removed; integration tests cover rotation. |

View File

@@ -0,0 +1,49 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography.X509Certificates;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.FS;
using Xunit;
namespace StellaOps.Scanner.WebService.Tests;
public sealed class SurfaceCacheOptionsConfiguratorTests
{
[Fact]
public void Configure_UsesSurfaceEnvironmentCacheRoot()
{
var cacheRoot = new DirectoryInfo(Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N")));
var settings = new SurfaceEnvironmentSettings(
new Uri("https://surface.example"),
"surface-cache",
null,
cacheRoot,
cacheQuotaMegabytes: 512,
prefetchEnabled: true,
featureFlags: Array.Empty<string>(),
secrets: new SurfaceSecretsConfiguration("file", "tenant-b", "/etc/secrets", null, null, allowInline: false),
tenant: "tenant-b",
tls: new SurfaceTlsConfiguration(null, null, new X509Certificate2Collection()));
var environment = new StubSurfaceEnvironment(settings);
var configurator = new SurfaceCacheOptionsConfigurator(environment);
var options = new SurfaceCacheOptions();
configurator.Configure(options);
Assert.Equal(cacheRoot.FullName, options.RootDirectory);
}
private sealed class StubSurfaceEnvironment : ISurfaceEnvironment
{
public StubSurfaceEnvironment(SurfaceEnvironmentSettings settings)
{
Settings = settings;
}
public SurfaceEnvironmentSettings Settings { get; }
public IReadOnlyDictionary<string, string> RawVariables { get; } = new Dictionary<string, string>();
}
}

View File

@@ -0,0 +1,49 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography.X509Certificates;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.FS;
using Xunit;
namespace StellaOps.Scanner.Worker.Tests;
public sealed class SurfaceCacheOptionsConfiguratorTests
{
[Fact]
public void Configure_UsesSurfaceEnvironmentCacheRoot()
{
var cacheRoot = new DirectoryInfo(Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N")));
var settings = new SurfaceEnvironmentSettings(
new Uri("https://surface.example"),
"surface-cache",
null,
cacheRoot,
cacheQuotaMegabytes: 1024,
prefetchEnabled: false,
featureFlags: Array.Empty<string>(),
secrets: new SurfaceSecretsConfiguration("file", "tenant-a", "/etc/secrets", null, null, false),
tenant: "tenant-a",
tls: new SurfaceTlsConfiguration(null, null, new X509Certificate2Collection()));
var environment = new StubSurfaceEnvironment(settings);
var configurator = new SurfaceCacheOptionsConfigurator(environment);
var options = new SurfaceCacheOptions();
configurator.Configure(options);
Assert.Equal(cacheRoot.FullName, options.RootDirectory);
}
private sealed class StubSurfaceEnvironment : ISurfaceEnvironment
{
public StubSurfaceEnvironment(SurfaceEnvironmentSettings settings)
{
Settings = settings;
}
public SurfaceEnvironmentSettings Settings { get; }
public IReadOnlyDictionary<string, string> RawVariables { get; } = new Dictionary<string, string>();
}
}

View File

@@ -1,6 +1,6 @@
namespace StellaOps.Scheduler.WebService.GraphJobs;
internal readonly record struct GraphJobUpdateResult<TJob>(bool Updated, TJob Job) where TJob : class
public readonly record struct GraphJobUpdateResult<TJob>(bool Updated, TJob Job) where TJob : class
{
public static GraphJobUpdateResult<TJob> UpdatedResult(TJob job) => new(true, job);

View File

@@ -43,6 +43,7 @@ internal sealed class PolicySimulationMetricsProvider : IPolicySimulationMetrics
private readonly Histogram<double> _latencyHistogram;
private readonly object _snapshotLock = new();
private IReadOnlyDictionary<string, long> _latestQueueSnapshot = new Dictionary<string, long>(StringComparer.Ordinal);
private string _latestTenantId = string.Empty;
private bool _disposed;
public PolicySimulationMetricsProvider(IPolicyRunJobRepository repository, TimeProvider? timeProvider = null)
@@ -83,9 +84,12 @@ internal sealed class PolicySimulationMetricsProvider : IPolicySimulationMetrics
totalQueueDepth += count;
}
var snapshot = new Dictionary<string, long>(queueCounts, StringComparer.Ordinal);
lock (_snapshotLock)
{
_latestQueueSnapshot = queueCounts;
_latestQueueSnapshot = snapshot;
_latestTenantId = tenantId;
}
var sampleSize = 200;
@@ -113,7 +117,7 @@ internal sealed class PolicySimulationMetricsProvider : IPolicySimulationMetrics
Average(durations));
return new PolicySimulationMetricsResponse(
new PolicySimulationQueueDepth(totalQueueDepth, queueCounts),
new PolicySimulationQueueDepth(totalQueueDepth, snapshot),
latencyMetrics);
}
@@ -134,16 +138,21 @@ internal sealed class PolicySimulationMetricsProvider : IPolicySimulationMetrics
private IEnumerable<Measurement<long>> ObserveQueueDepth()
{
IReadOnlyDictionary<string, long> snapshot;
string tenantId;
lock (_snapshotLock)
{
snapshot = _latestQueueSnapshot;
tenantId = _latestTenantId;
}
tenantId = string.IsNullOrWhiteSpace(tenantId) ? "unknown" : tenantId;
foreach (var pair in snapshot)
{
yield return new Measurement<long>(
pair.Value,
new KeyValuePair<string, object?>("status", pair.Key));
new KeyValuePair<string, object?>("status", pair.Key),
new KeyValuePair<string, object?>("tenantId", tenantId));
}
}

View File

@@ -29,12 +29,13 @@
## Policy Studio (Sprint 27)
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| SCHED-CONSOLE-27-001 | DONE (2025-11-03) | Scheduler WebService Guild, Policy Registry Guild | SCHED-WEB-16-103, REGISTRY-API-27-005 | Provide policy batch simulation orchestration endpoints (`/policies/simulations` POST/GET) exposing run creation, shard status, SSE progress, cancellation, and retries with RBAC enforcement. | API handles shard lifecycle with SSE heartbeats + retry headers; unauthorized requests rejected; integration tests cover submit/cancel/resume flows. |
| SCHED-CONSOLE-27-002 | DONE (2025-11-05) | Scheduler WebService Guild, Observability Guild | SCHED-CONSOLE-27-001 | Emit telemetry endpoints/metrics (`policy_simulation_queue_depth`, `policy_simulation_latency_seconds`) and webhook callbacks for completion/failure consumed by Registry. | Metrics exposed via gateway, dashboards seeded, webhook contract documented, integration tests validate metrics emission. |
> 2025-11-05: Resuming to align instrumentation naming with architecture spec, exercise latency recording in SSE flows, and ensure registry webhook contract (samples/docs) reflects terminal result behaviour.
> 2025-11-05: Histogram renamed to `policy_simulation_latency_seconds`, queue gauge kept stable, new unit tests cover metrics capture/latency recording, and docs updated. Local `dotnet test` build currently blocked by existing GraphJobs visibility errors (see `StellaOps.Scheduler.WebService/GraphJobs/IGraphJobStore.cs`).
> 2025-11-06: Added tenant-aware tagging to `policy_simulation_queue_depth` gauge samples and refreshed metrics provider snapshot coverage.
## Vulnerability Explorer (Sprint 29)
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| SCHED-VULN-29-001 | TODO | Scheduler WebService Guild, Findings Ledger Guild | SCHED-WEB-16-103, SBOM-VULN-29-001 | Expose resolver job APIs (`POST /vuln/resolver/jobs`, `GET /vuln/resolver/jobs/{id}`) to trigger candidate recomputation per artifact/policy change with RBAC and rate limits. | Resolver APIs documented; integration tests cover submit/status/cancel; unauthorized requests rejected. |

View File

@@ -0,0 +1,23 @@
{
"schema": "scheduler-impact-index@1",
"generatedAt": "2025-10-01T00:00:00Z",
"image": {
"repository": "registry.stellaops.test/team/sample-service",
"digest": "sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef",
"tag": "1.0.0"
},
"components": [
{
"purl": "pkg:docker/sample-service@1.0.0",
"usage": [
"runtime"
]
},
{
"purl": "pkg:pypi/requests@2.31.0",
"usage": [
"usedByEntrypoint"
]
}
]
}

View File

@@ -8,6 +8,7 @@
<ProjectReference Include="../StellaOps.Scheduler.Models/StellaOps.Scheduler.Models.csproj" />
</ItemGroup>
<ItemGroup>
<EmbeddedResource Include="Fixtures\**\*.json" />
<EmbeddedResource Include="..\..\samples\scanner\images\**\bom-index.json"
Link="Fixtures\%(RecursiveDir)%(Filename)%(Extension)" />
</ItemGroup>

View File

@@ -0,0 +1,52 @@
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Scheduler.ImpactIndex;
using StellaOps.Scheduler.Models;
using Xunit;
namespace StellaOps.Scheduler.WebService.Tests;
public sealed class ImpactIndexFixtureTests
{
[Fact]
public void FixtureDirectoryExists()
{
var fixtureDirectory = GetFixtureDirectory();
Assert.True(Directory.Exists(fixtureDirectory), $"Fixture directory not found: {fixtureDirectory}");
var files = Directory.EnumerateFiles(fixtureDirectory, "bom-index.json", SearchOption.AllDirectories).ToArray();
Assert.NotEmpty(files);
var sampleFile = Path.Combine(fixtureDirectory, "sample", "bom-index.json");
Assert.Contains(sampleFile, files);
}
[Fact]
public async Task FixtureImpactIndexLoadsSampleImage()
{
var fixtureDirectory = GetFixtureDirectory();
var options = new ImpactIndexStubOptions
{
FixtureDirectory = fixtureDirectory,
SnapshotId = "tests/impact-index-stub"
};
var index = new FixtureImpactIndex(options, TimeProvider.System, NullLogger<FixtureImpactIndex>.Instance);
var selector = new Selector(SelectorScope.AllImages);
var impactSet = await index.ResolveAllAsync(selector, usageOnly: false);
Assert.True(impactSet.Total > 0, "Expected the fixture impact index to load at least one image.");
}
private static string GetFixtureDirectory()
{
var assemblyLocation = typeof(SchedulerWebApplicationFactory).Assembly.Location;
var assemblyDirectory = Path.GetDirectoryName(assemblyLocation)
?? AppContext.BaseDirectory;
return Path.GetFullPath(Path.Combine(assemblyDirectory, "seed-data", "impact-index"));
}
}

View File

@@ -1,8 +1,16 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics.Metrics;
using System.Globalization;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MongoDB.Driver;
using StellaOps.Scheduler.Models;
using StellaOps.Scheduler.Storage.Mongo.Repositories;
using StellaOps.Scheduler.WebService.PolicySimulations;
using Xunit;
namespace StellaOps.Scheduler.WebService.Tests;
@@ -12,14 +20,14 @@ public sealed class PolicySimulationMetricsProviderTests
public async Task CaptureAsync_ComputesQueueDepthAndLatency()
{
var now = DateTimeOffset.UtcNow;
var queueCounts = new Dictionary<PolicyRunJobStatus, long>
var counts = new Dictionary<PolicyRunJobStatus, long>
{
[PolicyRunJobStatus.Pending] = 2,
[PolicyRunJobStatus.Dispatching] = 1,
[PolicyRunJobStatus.Submitted] = 1
};
var jobs = new List<PolicyRunJob>
var jobs = new[]
{
CreateJob(
status: PolicyRunJobStatus.Completed,
@@ -34,8 +42,9 @@ public sealed class PolicySimulationMetricsProviderTests
cancelledAt: now.AddSeconds(-20))
};
await using var provider = new PolicySimulationMetricsProvider(
new StubPolicyRunJobRepository(queueCounts, jobs));
var repository = new StubPolicyRunJobRepository(counts, jobs);
using var provider = new PolicySimulationMetricsProvider(repository);
var response = await provider.CaptureAsync("tenant-alpha", CancellationToken.None);
@@ -49,17 +58,88 @@ public sealed class PolicySimulationMetricsProviderTests
Assert.Equal(20.0, response.Latency.P50);
Assert.Equal(28.0, response.Latency.P90);
Assert.Equal(29.0, response.Latency.P95);
Assert.Equal(30.0, response.Latency.P99);
Assert.True(response.Latency.P99.HasValue);
Assert.Equal(29.8, response.Latency.P99.Value, 1);
}
[Fact]
public async Task CaptureAsync_UpdatesSnapshotAndEmitsTenantTaggedGauge()
{
var repository = new StubPolicyRunJobRepository();
repository.QueueCounts[PolicyRunJobStatus.Pending] = 3;
repository.QueueCounts[PolicyRunJobStatus.Dispatching] = 1;
repository.QueueCounts[PolicyRunJobStatus.Submitted] = 2;
var now = DateTimeOffset.Parse("2025-11-06T10:00:00Z", CultureInfo.InvariantCulture, DateTimeStyles.AdjustToUniversal);
repository.Jobs.Add(CreateJob(
status: PolicyRunJobStatus.Completed,
queuedAt: now.AddMinutes(-30),
submittedAt: now.AddMinutes(-28),
completedAt: now.AddMinutes(-5),
id: "job-1",
runId: "run-job-1"));
repository.Jobs.Add(CreateJob(
status: PolicyRunJobStatus.Failed,
queuedAt: now.AddMinutes(-20),
submittedAt: now.AddMinutes(-18),
completedAt: now.AddMinutes(-2),
id: "job-2",
runId: "run-job-2",
lastError: "policy engine timeout"));
using var provider = new PolicySimulationMetricsProvider(repository);
var measurements = new List<(string Status, string Tenant, long Value)>();
using var listener = new MeterListener
{
InstrumentPublished = (instrument, meterListener) =>
{
if (instrument.Meter.Name == "StellaOps.Scheduler.WebService.PolicySimulations" &&
instrument.Name == "policy_simulation_queue_depth")
{
meterListener.EnableMeasurementEvents(instrument);
}
}
};
listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
{
var status = string.Empty;
var tenant = string.Empty;
foreach (var tag in tags)
{
if (string.Equals(tag.Key, "status", StringComparison.Ordinal))
{
status = tag.Value?.ToString() ?? string.Empty;
}
else if (string.Equals(tag.Key, "tenantId", StringComparison.Ordinal))
{
tenant = tag.Value?.ToString() ?? string.Empty;
}
}
measurements.Add((status, tenant, measurement));
});
listener.Start();
var response = await provider.CaptureAsync("tenant-alpha", CancellationToken.None);
Assert.Equal(6, response.QueueDepth.Total);
listener.RecordObservableInstruments();
Assert.Contains(measurements, item =>
item.Status == "pending" &&
item.Tenant == "tenant-alpha" &&
item.Value == 3);
listener.Dispose();
}
[Fact]
public void RecordLatency_EmitsHistogramMeasurement()
{
var repo = new StubPolicyRunJobRepository(
counts: new Dictionary<PolicyRunJobStatus, long>(),
jobs: Array.Empty<PolicyRunJob>());
var repository = new StubPolicyRunJobRepository();
using var provider = new PolicySimulationMetricsProvider(repo);
using var provider = new PolicySimulationMetricsProvider(repository);
var measurements = new List<double>();
using var listener = new MeterListener
@@ -81,64 +161,50 @@ public sealed class PolicySimulationMetricsProviderTests
measurements.Add(measurement);
}
});
listener.Start();
var now = DateTimeOffset.UtcNow;
var status = new PolicyRunStatus(
runId: "run-1",
tenantId: "tenant-alpha",
policyId: "policy-alpha",
policyVersion: 1,
mode: PolicyRunMode.Simulate,
status: PolicyRunExecutionStatus.Succeeded,
priority: PolicyRunPriority.Normal,
var latencyJob = CreateJob(
status: PolicyRunJobStatus.Completed,
queuedAt: now.AddSeconds(-12),
startedAt: now.AddSeconds(-10),
finishedAt: now,
stats: PolicyRunStats.Empty,
inputs: PolicyRunInputs.Empty,
determinismHash: null,
errorCode: null,
error: null,
attempts: 1,
traceId: null,
explainUri: null,
metadata: ImmutableSortedDictionary<string, string>.Empty,
cancellationRequested: false,
cancellationRequestedAt: null,
cancellationReason: null,
schemaVersion: null);
submittedAt: now.AddSeconds(-10),
completedAt: now,
id: "job-latency",
runId: "run-1");
var status = PolicyRunStatusFactory.Create(latencyJob, now);
provider.RecordLatency(status, now);
listener.Dispose();
Assert.Single(measurements);
Assert.Equal(12, measurements[0], precision: 6);
listener.Dispose();
}
private static PolicyRunJob CreateJob(
PolicyRunJobStatus status,
DateTimeOffset queuedAt,
DateTimeOffset submittedAt,
DateTimeOffset? submittedAt,
DateTimeOffset? completedAt,
DateTimeOffset? cancelledAt = null)
DateTimeOffset? cancelledAt = null,
string? id = null,
string? runId = null,
string? lastError = null)
{
var id = Guid.NewGuid().ToString("N");
var runId = $"run:{id}";
var updatedAt = completedAt ?? cancelledAt ?? submittedAt;
var jobId = id ?? Guid.NewGuid().ToString("N");
var resolvedRunId = runId ?? $"run:{jobId}";
var updatedAt = completedAt ?? cancelledAt ?? submittedAt ?? queuedAt;
return new PolicyRunJob(
SchemaVersion: SchedulerSchemaVersions.PolicyRunJob,
Id: id,
Id: jobId,
TenantId: "tenant-alpha",
PolicyId: "policy-alpha",
PolicyVersion: 1,
Mode: PolicyRunMode.Simulate,
Priority: PolicyRunPriority.Normal,
PriorityRank: 0,
RunId: runId,
RunId: resolvedRunId,
RequestedBy: "tester",
CorrelationId: null,
Metadata: ImmutableSortedDictionary<string, string>.Empty,
@@ -146,8 +212,8 @@ public sealed class PolicySimulationMetricsProviderTests
QueuedAt: queuedAt,
Status: status,
AttemptCount: 1,
LastAttemptAt: submittedAt,
LastError: null,
LastAttemptAt: submittedAt ?? completedAt ?? queuedAt,
LastError: lastError,
CreatedAt: queuedAt,
UpdatedAt: updatedAt,
AvailableAt: queuedAt,
@@ -163,15 +229,32 @@ public sealed class PolicySimulationMetricsProviderTests
private sealed class StubPolicyRunJobRepository : IPolicyRunJobRepository
{
private readonly IReadOnlyDictionary<PolicyRunJobStatus, long> _counts;
private readonly IReadOnlyList<PolicyRunJob> _jobs;
public StubPolicyRunJobRepository()
{
}
public StubPolicyRunJobRepository(
IReadOnlyDictionary<PolicyRunJobStatus, long> counts,
IReadOnlyList<PolicyRunJob> jobs)
IDictionary<PolicyRunJobStatus, long> counts,
IEnumerable<PolicyRunJob> jobs)
{
_counts = counts;
_jobs = jobs;
foreach (var pair in counts)
{
QueueCounts[pair.Key] = pair.Value;
}
Jobs.AddRange(jobs);
}
public Dictionary<PolicyRunJobStatus, long> QueueCounts { get; } = new();
public List<PolicyRunJob> Jobs { get; } = new();
public Task InsertAsync(
PolicyRunJob job,
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
{
Jobs.Add(job);
return Task.CompletedTask;
}
public Task<long> CountAsync(
@@ -180,12 +263,17 @@ public sealed class PolicySimulationMetricsProviderTests
IReadOnlyCollection<PolicyRunJobStatus> statuses,
CancellationToken cancellationToken = default)
{
if (statuses is null || statuses.Count == 0)
{
return Task.FromResult(QueueCounts.Values.Sum());
}
long total = 0;
foreach (var status in statuses)
{
if (_counts.TryGetValue(status, out var value))
if (QueueCounts.TryGetValue(status, out var count))
{
total += value;
total += count;
}
}
@@ -199,44 +287,53 @@ public sealed class PolicySimulationMetricsProviderTests
IReadOnlyCollection<PolicyRunJobStatus>? statuses = null,
DateTimeOffset? queuedAfter = null,
int limit = 50,
MongoDB.Driver.IClientSessionHandle? session = null,
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
=> Task.FromResult(_jobs);
{
IEnumerable<PolicyRunJob> query = Jobs;
Task IPolicyRunJobRepository.InsertAsync(
PolicyRunJob job,
MongoDB.Driver.IClientSessionHandle? session,
CancellationToken cancellationToken)
=> throw new NotSupportedException();
if (statuses is { Count: > 0 })
{
query = query.Where(job => statuses.Contains(job.Status));
}
Task<PolicyRunJob?> IPolicyRunJobRepository.GetAsync(
if (queuedAfter is not null)
{
query = query.Where(job => (job.QueuedAt ?? job.CreatedAt) >= queuedAfter.Value);
}
var result = query.Take(limit).ToList().AsReadOnly();
return Task.FromResult<IReadOnlyList<PolicyRunJob>>(result);
}
public Task<PolicyRunJob?> GetAsync(
string tenantId,
string jobId,
MongoDB.Driver.IClientSessionHandle? session,
CancellationToken cancellationToken)
=> throw new NotSupportedException();
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
=> Task.FromResult<PolicyRunJob?>(Jobs.FirstOrDefault(job => job.Id == jobId));
Task<PolicyRunJob?> IPolicyRunJobRepository.GetByRunIdAsync(
public Task<PolicyRunJob?> GetByRunIdAsync(
string tenantId,
string runId,
MongoDB.Driver.IClientSessionHandle? session,
CancellationToken cancellationToken)
=> throw new NotSupportedException();
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
=> Task.FromResult<PolicyRunJob?>(Jobs.FirstOrDefault(job => string.Equals(job.RunId, runId, StringComparison.Ordinal)));
Task<PolicyRunJob?> IPolicyRunJobRepository.LeaseAsync(
public Task<PolicyRunJob?> LeaseAsync(
string leaseOwner,
DateTimeOffset now,
TimeSpan leaseDuration,
int maxAttempts,
MongoDB.Driver.IClientSessionHandle? session,
CancellationToken cancellationToken)
=> throw new NotSupportedException();
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
=> Task.FromResult<PolicyRunJob?>(null);
Task<bool> IPolicyRunJobRepository.ReplaceAsync(
public Task<bool> ReplaceAsync(
PolicyRunJob job,
string? expectedLeaseOwner,
MongoDB.Driver.IClientSessionHandle? session,
CancellationToken cancellationToken)
=> throw new NotSupportedException();
string? expectedLeaseOwner = null,
IClientSessionHandle? session = null,
CancellationToken cancellationToken = default)
=> Task.FromResult(true);
}
}

View File

@@ -11,16 +11,16 @@ using Microsoft.Extensions.DependencyInjection;
using StellaOps.Scheduler.Models;
using StellaOps.Scheduler.Queue;
using StellaOps.Scheduler.Storage.Mongo.Repositories;
namespace StellaOps.Scheduler.WebService.Tests;
public sealed class RunEndpointTests : IClassFixture<WebApplicationFactory<Program>>
{
private readonly WebApplicationFactory<Program> _factory;
public RunEndpointTests(WebApplicationFactory<Program> factory)
{
_factory = factory;
}
[Fact]
@@ -100,13 +100,13 @@ public sealed class RunEndpointTests : IClassFixture<WebApplicationFactory<Progr
var scheduleId = scheduleJson.GetProperty("schedule").GetProperty("id").GetString();
Assert.False(string.IsNullOrEmpty(scheduleId));
var previewResponse = await client.PostAsJsonAsync("/api/v1/scheduler/runs/preview", new
{
scheduleId,
usageOnly = true,
sampleSize = 3
});
previewResponse.EnsureSuccessStatusCode();
var preview = await previewResponse.Content.ReadFromJsonAsync<JsonElement>();
Assert.True(preview.GetProperty("total").GetInt32() >= 0);

View File

@@ -1,11 +1,14 @@
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Mvc.Testing;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using StellaOps.Scheduler.WebService.Options;
using StellaOps.Scheduler.WebService.Runs;
using StellaOps.Scheduler.ImpactIndex;
namespace StellaOps.Scheduler.WebService.Tests;
@@ -15,6 +18,8 @@ public sealed class SchedulerWebApplicationFactory : WebApplicationFactory<Progr
{
builder.ConfigureAppConfiguration((_, configuration) =>
{
var fixtureDirectory = GetFixtureDirectory();
configuration.AddInMemoryCollection(new[]
{
new KeyValuePair<string, string?>("Scheduler:Authority:Enabled", "false"),
@@ -27,12 +32,22 @@ public sealed class SchedulerWebApplicationFactory : WebApplicationFactory<Progr
new KeyValuePair<string, string?>("Scheduler:Events:Webhooks:Excitor:Enabled", "true"),
new KeyValuePair<string, string?>("Scheduler:Events:Webhooks:Excitor:HmacSecret", "excitor-secret"),
new KeyValuePair<string, string?>("Scheduler:Events:Webhooks:Excitor:RateLimitRequests", "20"),
new KeyValuePair<string, string?>("Scheduler:Events:Webhooks:Excitor:RateLimitWindowSeconds", "60")
new KeyValuePair<string, string?>("Scheduler:Events:Webhooks:Excitor:RateLimitWindowSeconds", "60"),
new KeyValuePair<string, string?>("Scheduler:ImpactIndex:FixtureDirectory", fixtureDirectory)
});
});
builder.ConfigureServices(services =>
{
var fixtureDirectory = GetFixtureDirectory();
services.RemoveAll<ImpactIndexStubOptions>();
services.AddSingleton(new ImpactIndexStubOptions
{
FixtureDirectory = fixtureDirectory,
SnapshotId = "tests/impact-index-stub"
});
services.Configure<SchedulerEventsOptions>(options =>
{
options.Webhooks ??= new SchedulerInboundWebhooksOptions();
@@ -52,4 +67,14 @@ public sealed class SchedulerWebApplicationFactory : WebApplicationFactory<Progr
});
});
}
private static string GetFixtureDirectory()
{
var assemblyLocation = typeof(SchedulerWebApplicationFactory).Assembly.Location;
var assemblyDirectory = Path.GetDirectoryName(assemblyLocation)
?? AppContext.BaseDirectory;
var fixtureDirectory = Path.Combine(assemblyDirectory, "seed-data", "impact-index");
return Path.GetFullPath(fixtureDirectory);
}
}

View File

@@ -18,4 +18,9 @@
<ItemGroup>
<ProjectReference Include="../../StellaOps.Scheduler.WebService/StellaOps.Scheduler.WebService.csproj" />
</ItemGroup>
<ItemGroup>
<Content Include="seed-data/impact-index/**">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
</ItemGroup>
</Project>

View File

@@ -0,0 +1,23 @@
{
"schema": "scheduler-impact-index@1",
"generatedAt": "2025-10-01T00:00:00Z",
"image": {
"repository": "registry.stellaops.test/team/sample-service",
"digest": "sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef",
"tag": "1.0.0"
},
"components": [
{
"purl": "pkg:docker/sample-service@1.0.0",
"usage": [
"runtime"
]
},
{
"purl": "pkg:pypi/requests@2.31.0",
"usage": [
"usedByEntrypoint"
]
}
]
}

View File

@@ -0,0 +1,12 @@
using StellaOps.TaskRunner.Core.Planning;
namespace StellaOps.TaskRunner.Core.Execution;
public interface IPackRunArtifactUploader
{
Task UploadAsync(
PackRunExecutionContext context,
PackRunState state,
IReadOnlyList<TaskPackPlanOutput> outputs,
CancellationToken cancellationToken);
}
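The commit message references a `FilesystemPackRunArtifactUploader` implementing this interface, but only its tests appear in the diff. As a rough sketch of the behaviour those tests assert — declared outputs copied into an artifacts directory, missing files recorded without throwing — using plain string paths in place of the real `PackRunExecutionContext`/`TaskPackPlanOutput` types (an assumption for illustration only):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Hypothetical sketch: the actual FilesystemPackRunArtifactUploader is not shown
// in this diff, so the shape below (string paths, returned missing list) is an
// assumption, not the committed API.
public sealed class FilesystemArtifactUploaderSketch
{
    // Copies each declared output into the artifacts directory; files that do
    // not exist are recorded rather than thrown, so a run with partial outputs
    // still completes.
    public IReadOnlyList<string> Upload(
        string workspaceDirectory,
        string artifactsDirectory,
        IEnumerable<string> outputPaths)
    {
        var missing = new List<string>();
        Directory.CreateDirectory(artifactsDirectory);
        foreach (var relativePath in outputPaths)
        {
            var source = Path.Combine(workspaceDirectory, relativePath);
            if (!File.Exists(source))
            {
                missing.Add(relativePath); // record, do not fail the run
                continue;
            }
            var destination = Path.Combine(artifactsDirectory, relativePath);
            Directory.CreateDirectory(Path.GetDirectoryName(destination)!);
            File.Copy(source, destination, overwrite: true);
        }
        return missing;
    }
}
```

Returning the missing set (instead of throwing) lets the caller surface incomplete uploads in run state without failing an otherwise successful run, which matches what the new tests verify.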

View File

@@ -0,0 +1,6 @@
namespace StellaOps.TaskRunner.Core.Execution;
public interface IPackRunJobScheduler
{
Task ScheduleAsync(PackRunExecutionContext context, CancellationToken cancellationToken);
}
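The commit wires dispatcher services for pack runs in Program.cs, but the scheduler implementation itself is outside this diff. One minimal in-process sketch (hypothetical — the real service presumably adds persistence and retry) hands execution contexts to a background worker over a bounded channel:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;

// Hypothetical sketch only; generic over the context type rather than binding
// to PackRunExecutionContext, since that type's members are not shown here.
public sealed class ChannelPackRunJobScheduler<TContext>
{
    private readonly Channel<TContext> _queue =
        Channel.CreateBounded<TContext>(new BoundedChannelOptions(capacity: 64)
        {
            // Apply back-pressure instead of dropping queued runs.
            FullMode = BoundedChannelFullMode.Wait
        });

    // Mirrors IPackRunJobScheduler.ScheduleAsync's Task-returning shape.
    public Task ScheduleAsync(TContext context, CancellationToken cancellationToken)
        => _queue.Writer.WriteAsync(context, cancellationToken).AsTask();

    // Consumed by a hosted worker (e.g. something like PackRunWorkerService).
    public IAsyncEnumerable<TContext> DequeueAllAsync(CancellationToken cancellationToken)
        => _queue.Reader.ReadAllAsync(cancellationToken);
}
```

A bounded channel keeps producers (API requests creating runs) from outpacing the worker; the capacity of 64 is an arbitrary illustration value.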

View File

@@ -3,27 +3,33 @@ using StellaOps.TaskRunner.Core.Planning;
namespace StellaOps.TaskRunner.Core.Execution;
public sealed record PackRunState(
string RunId,
string PlanHash,
TaskPackPlanFailurePolicy FailurePolicy,
DateTimeOffset CreatedAt,
DateTimeOffset UpdatedAt,
IReadOnlyDictionary<string, PackRunStepStateRecord> Steps)
public sealed record PackRunState(
string RunId,
string PlanHash,
TaskPackPlan Plan,
TaskPackPlanFailurePolicy FailurePolicy,
DateTimeOffset RequestedAt,
DateTimeOffset CreatedAt,
DateTimeOffset UpdatedAt,
IReadOnlyDictionary<string, PackRunStepStateRecord> Steps)
{
public static PackRunState Create(
string runId,
string planHash,
TaskPackPlanFailurePolicy failurePolicy,
IReadOnlyDictionary<string, PackRunStepStateRecord> steps,
DateTimeOffset timestamp)
=> new(
runId,
planHash,
failurePolicy,
timestamp,
timestamp,
new ReadOnlyDictionary<string, PackRunStepStateRecord>(new Dictionary<string, PackRunStepStateRecord>(steps, StringComparer.Ordinal)));
string planHash,
TaskPackPlan plan,
TaskPackPlanFailurePolicy failurePolicy,
DateTimeOffset requestedAt,
IReadOnlyDictionary<string, PackRunStepStateRecord> steps,
DateTimeOffset timestamp)
=> new(
runId,
planHash,
plan,
failurePolicy,
requestedAt,
timestamp,
timestamp,
new ReadOnlyDictionary<string, PackRunStepStateRecord>(new Dictionary<string, PackRunStepStateRecord>(steps, StringComparer.Ordinal)));
}
public sealed record PackRunStepStateRecord(

View File

@@ -110,18 +110,20 @@ public sealed class FilePackRunStateStore : IPackRunStateStore
return result;
}
private sealed record StateDocument(
string RunId,
string PlanHash,
TaskPackPlanFailurePolicy FailurePolicy,
DateTimeOffset CreatedAt,
DateTimeOffset UpdatedAt,
IReadOnlyList<StepDocument> Steps)
{
public static StateDocument FromDomain(PackRunState state)
{
var steps = state.Steps.Values
.OrderBy(step => step.StepId, StringComparer.Ordinal)
private sealed record StateDocument(
string RunId,
string PlanHash,
TaskPackPlan Plan,
TaskPackPlanFailurePolicy FailurePolicy,
DateTimeOffset RequestedAt,
DateTimeOffset CreatedAt,
DateTimeOffset UpdatedAt,
IReadOnlyList<StepDocument> Steps)
{
public static StateDocument FromDomain(PackRunState state)
{
var steps = state.Steps.Values
.OrderBy(step => step.StepId, StringComparer.Ordinal)
.Select(step => new StepDocument(
step.StepId,
step.Kind,
@@ -137,15 +139,17 @@ public sealed class FilePackRunStateStore : IPackRunStateStore
step.StatusReason))
.ToList();
return new StateDocument(
state.RunId,
state.PlanHash,
state.FailurePolicy,
state.CreatedAt,
state.UpdatedAt,
steps);
}
return new StateDocument(
state.RunId,
state.PlanHash,
state.Plan,
state.FailurePolicy,
state.RequestedAt,
state.CreatedAt,
state.UpdatedAt,
steps);
}
public PackRunState ToDomain()
{
var steps = Steps.ToDictionary(
@@ -165,14 +169,16 @@ public sealed class FilePackRunStateStore : IPackRunStateStore
step.StatusReason),
StringComparer.Ordinal);
return new PackRunState(
RunId,
PlanHash,
FailurePolicy,
CreatedAt,
UpdatedAt,
steps);
}
return new PackRunState(
RunId,
PlanHash,
Plan,
FailurePolicy,
RequestedAt,
CreatedAt,
UpdatedAt,
steps);
}
}
private sealed record StepDocument(

View File

@@ -0,0 +1,239 @@
using System.Text.Json;
using System.Text.Json.Nodes;
using Microsoft.Extensions.Logging;
using StellaOps.TaskRunner.Core.Execution;
using StellaOps.TaskRunner.Core.Planning;
namespace StellaOps.TaskRunner.Infrastructure.Execution;
/// <summary>
/// Stores pack run artifacts on the local file system so they can later be mirrored to a remote artifact store.
/// </summary>
public sealed class FilesystemPackRunArtifactUploader : IPackRunArtifactUploader
{
private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
{
WriteIndented = true
};
private readonly string rootPath;
private readonly ILogger<FilesystemPackRunArtifactUploader> logger;
private readonly TimeProvider timeProvider;
public FilesystemPackRunArtifactUploader(
string rootPath,
TimeProvider? timeProvider,
ILogger<FilesystemPackRunArtifactUploader> logger)
{
ArgumentException.ThrowIfNullOrWhiteSpace(rootPath);
this.rootPath = Path.GetFullPath(rootPath);
this.logger = logger ?? throw new ArgumentNullException(nameof(logger));
this.timeProvider = timeProvider ?? TimeProvider.System;
Directory.CreateDirectory(this.rootPath);
}
public async Task UploadAsync(
PackRunExecutionContext context,
PackRunState state,
IReadOnlyList<TaskPackPlanOutput> outputs,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
ArgumentNullException.ThrowIfNull(state);
ArgumentNullException.ThrowIfNull(outputs);
if (outputs.Count == 0)
{
return;
}
var destinationRoot = Path.Combine(rootPath, SanitizeFileName(context.RunId));
var filesRoot = Path.Combine(destinationRoot, "files");
var expressionsRoot = Path.Combine(destinationRoot, "expressions");
Directory.CreateDirectory(destinationRoot);
var manifest = new ArtifactManifest(
context.RunId,
timeProvider.GetUtcNow(),
new List<ArtifactRecord>(outputs.Count));
foreach (var output in outputs)
{
cancellationToken.ThrowIfCancellationRequested();
var record = await ProcessOutputAsync(
context,
output,
destinationRoot,
filesRoot,
expressionsRoot,
cancellationToken).ConfigureAwait(false);
manifest.Outputs.Add(record);
}
var manifestPath = Path.Combine(destinationRoot, "artifact-manifest.json");
await using (var stream = File.Open(manifestPath, FileMode.Create, FileAccess.Write, FileShare.None))
{
await JsonSerializer.SerializeAsync(stream, manifest, SerializerOptions, cancellationToken)
.ConfigureAwait(false);
}
logger.LogInformation(
"Pack run {RunId} artifact manifest written to {Path} with {Count} output entries.",
context.RunId,
manifestPath,
manifest.Outputs.Count);
}
private async Task<ArtifactRecord> ProcessOutputAsync(
PackRunExecutionContext context,
TaskPackPlanOutput output,
string destinationRoot,
string filesRoot,
string expressionsRoot,
CancellationToken cancellationToken)
{
var sourcePath = ResolveString(output.Path);
var expressionNode = ResolveExpression(output.Expression);
var status = "skipped";
string? storedPath = null;
string? notes = null;
if (IsFileOutput(output))
{
if (string.IsNullOrWhiteSpace(sourcePath))
{
status = "unresolved";
notes = "Output path requires runtime value.";
}
else if (!File.Exists(sourcePath))
{
status = "missing";
notes = $"Source file '{sourcePath}' not found.";
logger.LogWarning(
"Pack run {RunId} output {Output} referenced missing file {Path}.",
context.RunId,
output.Name,
sourcePath);
}
else
{
Directory.CreateDirectory(filesRoot);
var destinationPath = Path.Combine(filesRoot, DetermineDestinationFileName(output, sourcePath));
Directory.CreateDirectory(Path.GetDirectoryName(destinationPath)!);
await CopyFileAsync(sourcePath, destinationPath, cancellationToken).ConfigureAwait(false);
storedPath = GetRelativePath(destinationPath, destinationRoot);
status = "copied";
logger.LogInformation(
"Pack run {RunId} output {Output} copied to {Destination}.",
context.RunId,
output.Name,
destinationPath);
}
}
if (expressionNode is not null)
{
Directory.CreateDirectory(expressionsRoot);
var expressionPath = Path.Combine(
expressionsRoot,
$"{SanitizeFileName(output.Name)}.json");
var json = expressionNode.ToJsonString(SerializerOptions);
await File.WriteAllTextAsync(expressionPath, json, cancellationToken).ConfigureAwait(false);
storedPath ??= GetRelativePath(expressionPath, destinationRoot);
status = status == "copied" ? "copied" : "materialized";
}
return new ArtifactRecord(
output.Name,
output.Type,
sourcePath,
storedPath,
status,
notes);
}
private static async Task CopyFileAsync(string sourcePath, string destinationPath, CancellationToken cancellationToken)
{
await using var source = File.Open(sourcePath, FileMode.Open, FileAccess.Read, FileShare.Read);
await using var destination = File.Open(destinationPath, FileMode.Create, FileAccess.Write, FileShare.None);
await source.CopyToAsync(destination, cancellationToken).ConfigureAwait(false);
}
private static bool IsFileOutput(TaskPackPlanOutput output)
=> string.Equals(output.Type, "file", StringComparison.OrdinalIgnoreCase);
private static string DetermineDestinationFileName(TaskPackPlanOutput output, string sourcePath)
{
var extension = Path.GetExtension(sourcePath);
var baseName = SanitizeFileName(output.Name);
if (!string.IsNullOrWhiteSpace(extension) &&
!baseName.EndsWith(extension, StringComparison.OrdinalIgnoreCase))
{
return baseName + extension;
}
return baseName;
}
private static string? ResolveString(TaskPackPlanParameterValue? parameter)
{
if (parameter is null || parameter.RequiresRuntimeValue || parameter.Value is null)
{
return null;
}
if (parameter.Value is JsonValue jsonValue && jsonValue.TryGetValue<string>(out var value))
{
return value;
}
return null;
}
private static JsonNode? ResolveExpression(TaskPackPlanParameterValue? parameter)
{
if (parameter is null || parameter.RequiresRuntimeValue)
{
return null;
}
return parameter.Value;
}
private static string SanitizeFileName(string value)
{
var result = value;
foreach (var invalid in Path.GetInvalidFileNameChars())
{
result = result.Replace(invalid, '_');
}
return string.IsNullOrWhiteSpace(result) ? "output" : result;
}
private static string GetRelativePath(string path, string root)
=> Path.GetRelativePath(root, path)
.Replace('\\', '/');
private sealed record ArtifactManifest(string RunId, DateTimeOffset UploadedAt, List<ArtifactRecord> Outputs);
private sealed record ArtifactRecord(
string Name,
string Type,
string? SourcePath,
string? StoredPath,
string Status,
string? Notes);
}

View File

@@ -4,13 +4,13 @@ using StellaOps.AirGap.Policy;
using StellaOps.TaskRunner.Core.Execution;
using StellaOps.TaskRunner.Core.Planning;
using StellaOps.TaskRunner.Core.TaskPacks;
namespace StellaOps.TaskRunner.Infrastructure.Execution;
public sealed class FilesystemPackRunDispatcher : IPackRunJobDispatcher
{
private readonly string queuePath;
private readonly string archivePath;
namespace StellaOps.TaskRunner.Infrastructure.Execution;
public sealed class FilesystemPackRunDispatcher : IPackRunJobDispatcher, IPackRunJobScheduler
{
private readonly string queuePath;
private readonly string archivePath;
private readonly TaskPackManifestLoader manifestLoader = new();
private readonly TaskPackPlanner planner;
private readonly JsonSerializerOptions serializerOptions = new(JsonSerializerDefaults.Web);
@@ -30,11 +30,11 @@ public sealed class FilesystemPackRunDispatcher : IPackRunJobDispatcher
.OrderBy(path => path, StringComparer.Ordinal)
.ToArray();
foreach (var file in files)
{
cancellationToken.ThrowIfCancellationRequested();
try
foreach (var file in files)
{
cancellationToken.ThrowIfCancellationRequested();
try
{
var jobJson = await File.ReadAllTextAsync(file, cancellationToken).ConfigureAwait(false);
var job = JsonSerializer.Deserialize<JobEnvelope>(jobJson, serializerOptions);
@@ -43,38 +43,69 @@ public sealed class FilesystemPackRunDispatcher : IPackRunJobDispatcher
continue;
}
var manifestPath = ResolvePath(queuePath, job.ManifestPath);
var inputsPath = job.InputsPath is null ? null : ResolvePath(queuePath, job.InputsPath);
var manifest = manifestLoader.Load(manifestPath);
var inputs = await LoadInputsAsync(inputsPath, cancellationToken).ConfigureAwait(false);
var planResult = planner.Plan(manifest, inputs);
if (!planResult.Success || planResult.Plan is null)
{
throw new InvalidOperationException($"Failed to plan pack for run {job.RunId}: {string.Join(';', planResult.Errors.Select(e => e.Message))}");
}
var archiveFile = Path.Combine(archivePath, Path.GetFileName(file));
File.Move(file, archiveFile, overwrite: true);
var requestedAt = job.RequestedAt ?? DateTimeOffset.UtcNow;
return new PackRunExecutionContext(job.RunId ?? Guid.NewGuid().ToString("n"), planResult.Plan, requestedAt);
}
catch (Exception ex)
{
TaskPackPlan? plan = job.Plan;
if (plan is null)
{
if (string.IsNullOrWhiteSpace(job.ManifestPath))
{
continue;
}
var manifestPath = ResolvePath(queuePath, job.ManifestPath);
var inputsPath = job.InputsPath is null ? null : ResolvePath(queuePath, job.InputsPath);
var manifest = manifestLoader.Load(manifestPath);
var inputs = await LoadInputsAsync(inputsPath, cancellationToken).ConfigureAwait(false);
var planResult = planner.Plan(manifest, inputs);
if (!planResult.Success || planResult.Plan is null)
{
throw new InvalidOperationException($"Failed to plan pack for run {job.RunId}: {string.Join(';', planResult.Errors.Select(e => e.Message))}");
}
plan = planResult.Plan;
}
var archiveFile = Path.Combine(archivePath, Path.GetFileName(file));
File.Move(file, archiveFile, overwrite: true);
var requestedAt = job.RequestedAt ?? DateTimeOffset.UtcNow;
var runId = string.IsNullOrWhiteSpace(job.RunId) ? Guid.NewGuid().ToString("n") : job.RunId;
return new PackRunExecutionContext(runId, plan, requestedAt);
}
catch (Exception ex)
{
var failedPath = file + ".failed";
File.Move(file, failedPath, overwrite: true);
Console.Error.WriteLine($"Failed to dequeue job '{file}': {ex.Message}");
}
}
return null;
}
private static string ResolvePath(string root, string relative)
=> Path.IsPathRooted(relative) ? relative : Path.Combine(root, relative);
private static async Task<IDictionary<string, JsonNode?>> LoadInputsAsync(string? path, CancellationToken cancellationToken)
return null;
}
public async Task ScheduleAsync(PackRunExecutionContext context, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
var envelope = new JobEnvelope(
context.RunId,
ManifestPath: null,
InputsPath: null,
context.RequestedAt,
context.Plan);
Directory.CreateDirectory(queuePath);
var safeRunId = string.IsNullOrWhiteSpace(context.RunId) ? Guid.NewGuid().ToString("n") : SanitizeFileName(context.RunId);
var fileName = $"{safeRunId}-{DateTimeOffset.UtcNow:yyyyMMddHHmmssfff}.json";
var path = Path.Combine(queuePath, fileName);
var json = JsonSerializer.Serialize(envelope, serializerOptions);
await File.WriteAllTextAsync(path, json, cancellationToken).ConfigureAwait(false);
}
private static string ResolvePath(string root, string relative)
=> Path.IsPathRooted(relative) ? relative : Path.Combine(root, relative);
private static async Task<IDictionary<string, JsonNode?>> LoadInputsAsync(string? path, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(path) || !File.Exists(path))
{
@@ -92,7 +123,23 @@ public sealed class FilesystemPackRunDispatcher : IPackRunJobDispatcher
pair => pair.Key,
pair => pair.Value,
StringComparer.Ordinal);
}
private sealed record JobEnvelope(string? RunId, string ManifestPath, string? InputsPath, DateTimeOffset? RequestedAt);
}
}
private sealed record JobEnvelope(
string? RunId,
string? ManifestPath,
string? InputsPath,
DateTimeOffset? RequestedAt,
TaskPackPlan? Plan);
private static string SanitizeFileName(string value)
{
var safe = value.Trim();
foreach (var invalid in Path.GetInvalidFileNameChars())
{
safe = safe.Replace(invalid, '_');
}
return safe;
}
}

View File

@@ -0,0 +1,40 @@
using Microsoft.Extensions.Logging;
using StellaOps.TaskRunner.Core.Execution;
using StellaOps.TaskRunner.Core.Planning;
namespace StellaOps.TaskRunner.Infrastructure.Execution;
public sealed class LoggingPackRunArtifactUploader : IPackRunArtifactUploader
{
private readonly ILogger<LoggingPackRunArtifactUploader> _logger;
public LoggingPackRunArtifactUploader(ILogger<LoggingPackRunArtifactUploader> logger)
{
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public Task UploadAsync(
PackRunExecutionContext context,
PackRunState state,
IReadOnlyList<TaskPackPlanOutput> outputs,
CancellationToken cancellationToken)
{
if (outputs.Count == 0)
{
return Task.CompletedTask;
}
foreach (var output in outputs)
{
var path = output.Path?.Value?.ToString() ?? "(dynamic)";
_logger.LogInformation(
"Pack run {RunId} scheduled artifact upload for output {Output} (type={Type}, path={Path}).",
context.RunId,
output.Name,
output.Type,
path);
}
return Task.CompletedTask;
}
}

View File

@@ -0,0 +1,117 @@
using Microsoft.Extensions.Logging;
using StellaOps.TaskRunner.Core.Execution;
using StellaOps.TaskRunner.Core.Planning;
namespace StellaOps.TaskRunner.Infrastructure.Execution;
public sealed class PackRunApprovalDecisionService
{
private readonly IPackRunApprovalStore _approvalStore;
private readonly IPackRunStateStore _stateStore;
private readonly IPackRunJobScheduler _scheduler;
private readonly ILogger<PackRunApprovalDecisionService> _logger;
public PackRunApprovalDecisionService(
IPackRunApprovalStore approvalStore,
IPackRunStateStore stateStore,
IPackRunJobScheduler scheduler,
ILogger<PackRunApprovalDecisionService> logger)
{
_approvalStore = approvalStore ?? throw new ArgumentNullException(nameof(approvalStore));
_stateStore = stateStore ?? throw new ArgumentNullException(nameof(stateStore));
_scheduler = scheduler ?? throw new ArgumentNullException(nameof(scheduler));
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
public async Task<PackRunApprovalDecisionResult> ApplyAsync(
PackRunApprovalDecisionRequest request,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(request);
ArgumentException.ThrowIfNullOrWhiteSpace(request.RunId);
ArgumentException.ThrowIfNullOrWhiteSpace(request.ApprovalId);
var runId = request.RunId.Trim();
var approvalId = request.ApprovalId.Trim();
var state = await _stateStore.GetAsync(runId, cancellationToken).ConfigureAwait(false);
if (state is null)
{
_logger.LogWarning("Approval decision for run {RunId} rejected: run state not found.", runId);
return PackRunApprovalDecisionResult.NotFound;
}
var approvals = await _approvalStore.GetAsync(runId, cancellationToken).ConfigureAwait(false);
if (approvals.Count == 0)
{
_logger.LogWarning("Approval decision for run {RunId} rejected: approval state not found.", runId);
return PackRunApprovalDecisionResult.NotFound;
}
var requestedAt = state.RequestedAt != default ? state.RequestedAt : state.CreatedAt;
var coordinator = PackRunApprovalCoordinator.Restore(state.Plan, approvals, requestedAt);
ApprovalActionResult actionResult;
var now = DateTimeOffset.UtcNow;
switch (request.Decision)
{
case PackRunApprovalDecisionType.Approved:
actionResult = coordinator.Approve(approvalId, request.ActorId ?? "system", now, request.Summary);
break;
case PackRunApprovalDecisionType.Rejected:
actionResult = coordinator.Reject(approvalId, request.ActorId ?? "system", now, request.Summary);
break;
case PackRunApprovalDecisionType.Expired:
actionResult = coordinator.Expire(approvalId, now, request.Summary);
break;
default:
throw new ArgumentOutOfRangeException(nameof(request.Decision), request.Decision, "Unsupported approval decision.");
}
await _approvalStore.UpdateAsync(runId, actionResult.State, cancellationToken).ConfigureAwait(false);
_logger.LogInformation(
"Applied approval decision {Decision} for run {RunId} (approval {ApprovalId}, actor={ActorId}).",
request.Decision,
runId,
approvalId,
request.ActorId ?? "system");
if (actionResult.ShouldResumeRun && request.Decision == PackRunApprovalDecisionType.Approved)
{
var context = new PackRunExecutionContext(runId, state.Plan, requestedAt);
await _scheduler.ScheduleAsync(context, cancellationToken).ConfigureAwait(false);
_logger.LogInformation("Scheduled run {RunId} for resume after approvals completed.", runId);
return PackRunApprovalDecisionResult.Resumed;
}
return PackRunApprovalDecisionResult.Applied;
}
}
public sealed record PackRunApprovalDecisionRequest(
string RunId,
string ApprovalId,
PackRunApprovalDecisionType Decision,
string? ActorId,
string? Summary);
public enum PackRunApprovalDecisionType
{
Approved,
Rejected,
Expired
}
public sealed record PackRunApprovalDecisionResult(string Status)
{
public static PackRunApprovalDecisionResult NotFound { get; } = new("not_found");
public static PackRunApprovalDecisionResult Applied { get; } = new("applied");
public static PackRunApprovalDecisionResult Resumed { get; } = new("resumed");
public bool ShouldResume => ReferenceEquals(this, Resumed);
}

View File

@@ -1,4 +1,5 @@
using StellaOps.TaskRunner.Core.Execution;
using System.Text.Json.Nodes;
using StellaOps.TaskRunner.Core.Execution;
using StellaOps.TaskRunner.Core.Planning;
using StellaOps.TaskRunner.Infrastructure.Execution;
@@ -60,12 +61,34 @@ public sealed class FilePackRunStateStoreTests
private static PackRunState CreateState(string runId)
{
var failurePolicy = new TaskPackPlanFailurePolicy(MaxAttempts: 3, BackoffSeconds: 30, ContinueOnError: false);
var steps = new Dictionary<string, PackRunStepStateRecord>(StringComparer.Ordinal)
{
["step-a"] = new PackRunStepStateRecord(
StepId: "step-a",
Kind: PackRunStepKind.Run,
var failurePolicy = new TaskPackPlanFailurePolicy(MaxAttempts: 3, BackoffSeconds: 30, ContinueOnError: false);
var metadata = new TaskPackPlanMetadata("sample", "1.0.0", null, Array.Empty<string>());
var parameters = new Dictionary<string, TaskPackPlanParameterValue>(StringComparer.Ordinal);
var stepPlan = new TaskPackPlanStep(
Id: "step-a",
TemplateId: "run/image",
Name: "Run step",
Type: "run",
Enabled: true,
Uses: "builtin/run",
Parameters: parameters,
ApprovalId: null,
GateMessage: null,
Children: Array.Empty<TaskPackPlanStep>());
var plan = new TaskPackPlan(
metadata,
new Dictionary<string, JsonNode?>(StringComparer.Ordinal),
new[] { stepPlan },
"hash-123",
Array.Empty<TaskPackPlanApproval>(),
Array.Empty<TaskPackPlanSecret>(),
Array.Empty<TaskPackPlanOutput>(),
failurePolicy);
var steps = new Dictionary<string, PackRunStepStateRecord>(StringComparer.Ordinal)
{
["step-a"] = new PackRunStepStateRecord(
StepId: "step-a",
Kind: PackRunStepKind.Run,
Enabled: true,
ContinueOnError: false,
MaxParallel: null,
@@ -75,10 +98,11 @@ public sealed class FilePackRunStateStoreTests
Attempts: 1,
LastTransitionAt: DateTimeOffset.UtcNow,
NextAttemptAt: null,
StatusReason: null)
};
return PackRunState.Create(runId, "hash-123", failurePolicy, steps, DateTimeOffset.UtcNow);
StatusReason: null)
};
var timestamp = DateTimeOffset.UtcNow;
return PackRunState.Create(runId, "hash-123", plan, failurePolicy, timestamp, steps, timestamp);
}
private static string CreateTempDirectory()

View File

@@ -0,0 +1,138 @@
using System.Text.Json;
using System.Text.Json.Nodes;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.TaskRunner.Core.Execution;
using StellaOps.TaskRunner.Core.Planning;
using StellaOps.TaskRunner.Infrastructure.Execution;
using Xunit;
namespace StellaOps.TaskRunner.Tests;
public sealed class FilesystemPackRunArtifactUploaderTests : IDisposable
{
private readonly string artifactsRoot;
public FilesystemPackRunArtifactUploaderTests()
{
artifactsRoot = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("n"));
}
[Fact]
public async Task CopiesFileOutputs()
{
var sourceFile = Path.Combine(Path.GetTempPath(), $"{Guid.NewGuid():n}.txt");
await File.WriteAllTextAsync(sourceFile, "artifact-content", TestContext.Current.CancellationToken);
var uploader = CreateUploader();
var output = CreateFileOutput("bundle", sourceFile);
var context = CreateContext();
var state = CreateState(context);
await uploader.UploadAsync(context, state, new[] { output }, TestContext.Current.CancellationToken);
var runPath = Path.Combine(artifactsRoot, context.RunId);
var filesDirectory = Path.Combine(runPath, "files");
var copiedFiles = Directory.GetFiles(filesDirectory);
Assert.Single(copiedFiles);
Assert.Equal("bundle.txt", Path.GetFileName(copiedFiles[0]));
Assert.Equal("artifact-content", await File.ReadAllTextAsync(copiedFiles[0], TestContext.Current.CancellationToken));
var manifest = await ReadManifestAsync(runPath);
Assert.Single(manifest.Outputs);
Assert.Equal("copied", manifest.Outputs[0].Status);
Assert.Equal("files/bundle.txt", manifest.Outputs[0].StoredPath);
}
[Fact]
public async Task RecordsMissingFilesWithoutThrowing()
{
var uploader = CreateUploader();
var output = CreateFileOutput("missing", Path.Combine(Path.GetTempPath(), "does-not-exist.txt"));
var context = CreateContext();
var state = CreateState(context);
await uploader.UploadAsync(context, state, new[] { output }, TestContext.Current.CancellationToken);
var manifest = await ReadManifestAsync(Path.Combine(artifactsRoot, context.RunId));
Assert.Equal("missing", manifest.Outputs[0].Status);
}
[Fact]
public async Task WritesExpressionOutputsAsJson()
{
var uploader = CreateUploader();
var output = CreateExpressionOutput("metadata", JsonNode.Parse("""{"foo":"bar"}""")!);
var context = CreateContext();
var state = CreateState(context);
await uploader.UploadAsync(context, state, new[] { output }, TestContext.Current.CancellationToken);
var expressionPath = Path.Combine(artifactsRoot, context.RunId, "expressions", "metadata.json");
Assert.True(File.Exists(expressionPath));
var manifest = await ReadManifestAsync(Path.Combine(artifactsRoot, context.RunId));
Assert.Equal("materialized", manifest.Outputs[0].Status);
Assert.Equal("expressions/metadata.json", manifest.Outputs[0].StoredPath);
}
private FilesystemPackRunArtifactUploader CreateUploader()
=> new(artifactsRoot, TimeProvider.System, NullLogger<FilesystemPackRunArtifactUploader>.Instance);
private static TaskPackPlanOutput CreateFileOutput(string name, string path)
=> new(
name,
Type: "file",
Path: new TaskPackPlanParameterValue(JsonValue.Create(path), null, null, false),
Expression: null);
private static TaskPackPlanOutput CreateExpressionOutput(string name, JsonNode expression)
=> new(
name,
Type: "object",
Path: null,
Expression: new TaskPackPlanParameterValue(expression, null, null, false));
private static PackRunExecutionContext CreateContext()
=> new("run-" + Guid.NewGuid().ToString("n"), CreatePlan(), DateTimeOffset.UtcNow);
private static PackRunState CreateState(PackRunExecutionContext context)
=> PackRunState.Create(
runId: context.RunId,
planHash: context.Plan.Hash,
context.Plan,
failurePolicy: new TaskPackPlanFailurePolicy(1, 1, false),
requestedAt: DateTimeOffset.UtcNow,
steps: new Dictionary<string, PackRunStepStateRecord>(StringComparer.Ordinal),
timestamp: DateTimeOffset.UtcNow);
private static TaskPackPlan CreatePlan()
{
return new TaskPackPlan(
new TaskPackPlanMetadata("sample-pack", "1.0.0", null, Array.Empty<string>()),
new Dictionary<string, JsonNode?>(StringComparer.Ordinal),
Array.Empty<TaskPackPlanStep>(),
hash: "hash",
approvals: Array.Empty<TaskPackPlanApproval>(),
secrets: Array.Empty<TaskPackPlanSecret>(),
outputs: Array.Empty<TaskPackPlanOutput>(),
failurePolicy: new TaskPackPlanFailurePolicy(1, 1, false));
}
private static async Task<ArtifactManifestModel> ReadManifestAsync(string runPath)
{
var json = await File.ReadAllTextAsync(Path.Combine(runPath, "artifact-manifest.json"), TestContext.Current.CancellationToken);
return JsonSerializer.Deserialize<ArtifactManifestModel>(json, new JsonSerializerOptions(JsonSerializerDefaults.Web))!;
}
public void Dispose()
{
if (Directory.Exists(artifactsRoot))
{
Directory.Delete(artifactsRoot, recursive: true);
}
}
private sealed record ArtifactManifestModel(string RunId, DateTimeOffset UploadedAt, List<ArtifactRecordModel> Outputs);
private sealed record ArtifactRecordModel(string Name, string Type, string? SourcePath, string? StoredPath, string Status, string? Notes);
}

View File

@@ -0,0 +1,211 @@
using System.Text.Json.Nodes;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.TaskRunner.Core.Execution;
using StellaOps.TaskRunner.Core.Planning;
using StellaOps.TaskRunner.Infrastructure.Execution;
namespace StellaOps.TaskRunner.Tests;
public sealed class PackRunApprovalDecisionServiceTests
{
[Fact]
public async Task ApplyAsync_ApprovingLastGateSchedulesResume()
{
var plan = TestPlanFactory.CreatePlan();
var state = TestPlanFactory.CreateState("run-1", plan);
var approval = new PackRunApprovalState(
"security-review",
new[] { "Packs.Approve" },
new[] { "step-a" },
Array.Empty<string>(),
null,
DateTimeOffset.UtcNow.AddMinutes(-5),
PackRunApprovalStatus.Pending);
var approvalStore = new InMemoryApprovalStore(new Dictionary<string, IReadOnlyList<PackRunApprovalState>>
{
["run-1"] = new List<PackRunApprovalState> { approval }
});
var stateStore = new InMemoryStateStore(new Dictionary<string, PackRunState>
{
["run-1"] = state
});
var scheduler = new RecordingScheduler();
var service = new PackRunApprovalDecisionService(
approvalStore,
stateStore,
scheduler,
NullLogger<PackRunApprovalDecisionService>.Instance);
var result = await service.ApplyAsync(
new PackRunApprovalDecisionRequest("run-1", "security-review", PackRunApprovalDecisionType.Approved, "approver@example.com", "LGTM"),
CancellationToken.None);
Assert.Equal("resumed", result.Status);
Assert.True(scheduler.ScheduledContexts.TryGetValue("run-1", out var context));
Assert.Equal(plan.Hash, context!.Plan.Hash);
Assert.Equal(PackRunApprovalStatus.Approved, approvalStore.LastUpdated?.Status);
}
[Fact]
public async Task ApplyAsync_ReturnsNotFoundWhenStateMissing()
{
var approvalStore = new InMemoryApprovalStore(new Dictionary<string, IReadOnlyList<PackRunApprovalState>>());
var stateStore = new InMemoryStateStore(new Dictionary<string, PackRunState>());
var scheduler = new RecordingScheduler();
var service = new PackRunApprovalDecisionService(
approvalStore,
stateStore,
scheduler,
NullLogger<PackRunApprovalDecisionService>.Instance);
var result = await service.ApplyAsync(
new PackRunApprovalDecisionRequest("missing", "approval", PackRunApprovalDecisionType.Approved, "actor", null),
CancellationToken.None);
Assert.Equal("not_found", result.Status);
Assert.False(scheduler.ScheduledContexts.Any());
}
private sealed class InMemoryApprovalStore : IPackRunApprovalStore
{
private readonly Dictionary<string, List<PackRunApprovalState>> _approvals;
public PackRunApprovalState? LastUpdated { get; private set; }
public InMemoryApprovalStore(IDictionary<string, IReadOnlyList<PackRunApprovalState>> seed)
{
_approvals = seed.ToDictionary(
pair => pair.Key,
pair => pair.Value.ToList(),
StringComparer.Ordinal);
}
public Task SaveAsync(string runId, IReadOnlyList<PackRunApprovalState> approvals, CancellationToken cancellationToken)
{
_approvals[runId] = approvals.ToList();
return Task.CompletedTask;
}
public Task<IReadOnlyList<PackRunApprovalState>> GetAsync(string runId, CancellationToken cancellationToken)
{
if (_approvals.TryGetValue(runId, out var existing))
{
return Task.FromResult<IReadOnlyList<PackRunApprovalState>>(existing);
}
return Task.FromResult<IReadOnlyList<PackRunApprovalState>>(Array.Empty<PackRunApprovalState>());
}
public Task UpdateAsync(string runId, PackRunApprovalState approval, CancellationToken cancellationToken)
{
if (_approvals.TryGetValue(runId, out var list))
{
for (var i = 0; i < list.Count; i++)
{
if (string.Equals(list[i].ApprovalId, approval.ApprovalId, StringComparison.Ordinal))
{
list[i] = approval;
LastUpdated = approval;
break;
}
}
}
return Task.CompletedTask;
}
}
private sealed class InMemoryStateStore : IPackRunStateStore
{
private readonly Dictionary<string, PackRunState> _states;
public InMemoryStateStore(IDictionary<string, PackRunState> states)
{
_states = new Dictionary<string, PackRunState>(states, StringComparer.Ordinal);
}
public Task<PackRunState?> GetAsync(string runId, CancellationToken cancellationToken)
{
_states.TryGetValue(runId, out var state);
return Task.FromResult(state);
}
public Task SaveAsync(PackRunState state, CancellationToken cancellationToken)
{
_states[state.RunId] = state;
return Task.CompletedTask;
}
public Task<IReadOnlyList<PackRunState>> ListAsync(CancellationToken cancellationToken)
=> Task.FromResult<IReadOnlyList<PackRunState>>(_states.Values.ToList());
}
private sealed class RecordingScheduler : IPackRunJobScheduler
{
public Dictionary<string, PackRunExecutionContext> ScheduledContexts { get; } = new(StringComparer.Ordinal);
public Task ScheduleAsync(PackRunExecutionContext context, CancellationToken cancellationToken)
{
ScheduledContexts[context.RunId] = context;
return Task.CompletedTask;
}
}
}
internal static class TestPlanFactory
{
public static TaskPackPlan CreatePlan()
{
var metadata = new TaskPackPlanMetadata("sample", "1.0.0", null, Array.Empty<string>());
var parameters = new Dictionary<string, TaskPackPlanParameterValue>(StringComparer.Ordinal);
var step = new TaskPackPlanStep(
Id: "step-a",
TemplateId: "run/image",
Name: "Run step",
Type: "run",
Enabled: true,
Uses: "builtin/run",
Parameters: parameters,
ApprovalId: "security-review",
GateMessage: null,
Children: Array.Empty<TaskPackPlanStep>());
return new TaskPackPlan(
metadata,
new Dictionary<string, JsonNode?>(StringComparer.Ordinal),
new[] { step },
"hash-123",
new[]
{
new TaskPackPlanApproval("security-review", new[] { "Packs.Approve" }, null, null)
},
Array.Empty<TaskPackPlanSecret>(),
Array.Empty<TaskPackPlanOutput>(),
new TaskPackPlanFailurePolicy(3, 30, false));
}
public static PackRunState CreateState(string runId, TaskPackPlan plan)
{
var timestamp = DateTimeOffset.UtcNow;
var steps = new Dictionary<string, PackRunStepStateRecord>(StringComparer.Ordinal)
{
["step-a"] = new PackRunStepStateRecord(
"step-a",
PackRunStepKind.GateApproval,
true,
false,
null,
"security-review",
null,
PackRunStepExecutionStatus.Pending,
0,
null,
null,
null)
};
return PackRunState.Create(runId, plan.Hash, plan, plan.FailurePolicy ?? PackRunExecutionGraph.DefaultFailurePolicy, timestamp, steps, timestamp);
}
}

View File

@@ -129,7 +129,7 @@ public sealed class PackRunGateStateUpdaterTests
StatusReason: reason);
}
-        return PackRunState.Create("run-1", plan.Hash, graph.FailurePolicy, steps, RequestedAt);
+        return PackRunState.Create("run-1", plan.Hash, plan, graph.FailurePolicy, RequestedAt, steps, RequestedAt);
}
private static IEnumerable<PackRunExecutionStep> EnumerateSteps(IReadOnlyList<PackRunExecutionStep> steps)

View File

@@ -12,15 +12,27 @@ var builder = WebApplication.CreateBuilder(args);
builder.Services.Configure<TaskRunnerServiceOptions>(builder.Configuration.GetSection("TaskRunner"));
builder.Services.AddSingleton<TaskPackManifestLoader>();
builder.Services.AddSingleton<TaskPackPlanner>();
builder.Services.AddSingleton<PackRunSimulationEngine>();
builder.Services.AddSingleton<PackRunExecutionGraphBuilder>();
builder.Services.AddSingleton<IPackRunApprovalStore>(sp =>
{
var options = sp.GetRequiredService<IOptions<TaskRunnerServiceOptions>>().Value;
return new FilePackRunApprovalStore(options.ApprovalStorePath);
});
builder.Services.AddSingleton<IPackRunStateStore>(sp =>
{
var options = sp.GetRequiredService<IOptions<TaskRunnerServiceOptions>>().Value;
return new FilePackRunStateStore(options.RunStatePath);
});
builder.Services.AddSingleton(sp =>
{
var options = sp.GetRequiredService<IOptions<TaskRunnerServiceOptions>>().Value;
return new FilesystemPackRunDispatcher(options.QueuePath, options.ArchivePath);
});
builder.Services.AddSingleton<IPackRunJobScheduler>(sp => sp.GetRequiredService<FilesystemPackRunDispatcher>());
builder.Services.AddSingleton<PackRunApprovalDecisionService>();
builder.Services.AddOpenApi();
var app = builder.Build();
@@ -67,11 +79,11 @@ app.MapPost("/v1/task-runner/simulations", async (
return Results.Ok(response);
}).WithName("SimulateTaskPack");
app.MapGet("/v1/task-runner/runs/{runId}", async (
string runId,
IPackRunStateStore stateStore,
CancellationToken cancellationToken) =>
{
if (string.IsNullOrWhiteSpace(runId))
{
return Results.BadRequest(new { error = "runId is required." });
@@ -83,10 +95,43 @@ app.MapGet("/v1/task-runner/runs/{runId}", async (
return Results.NotFound();
}
return Results.Ok(RunStateMapper.ToResponse(state));
}).WithName("GetRunState");
app.MapPost("/v1/task-runner/runs/{runId}/approvals/{approvalId}", async (
string runId,
string approvalId,
[FromBody] ApprovalDecisionDto request,
PackRunApprovalDecisionService decisionService,
CancellationToken cancellationToken) =>
{
if (request is null)
{
return Results.BadRequest(new { error = "Request body is required." });
}
if (!Enum.TryParse<PackRunApprovalDecisionType>(request.Decision, ignoreCase: true, out var decisionType))
{
return Results.BadRequest(new { error = "Invalid decision. Expected approved, rejected, or expired." });
}
var result = await decisionService.ApplyAsync(
new PackRunApprovalDecisionRequest(runId, approvalId, decisionType, request.ActorId, request.Summary),
cancellationToken).ConfigureAwait(false);
if (ReferenceEquals(result, PackRunApprovalDecisionResult.NotFound))
{
return Results.NotFound();
}
return Results.Ok(new
{
status = result.Status,
resumed = result.ShouldResume
});
}).WithName("ApplyApprovalDecision");
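Based on the endpoint wiring above, a decision request might look like the following sketch. The route and body shape follow the mapped handler and the `ApprovalDecisionDto` record; the host, run id, and approval id are illustrative.

```http
POST /v1/task-runner/runs/run-42/approvals/security-review HTTP/1.1
Host: taskrunner.local
Content-Type: application/json

{
  "decision": "approved",
  "actorId": "ops-lead",
  "summary": "Reviewed plan hash; no policy violations."
}
```

Per the handler, a recognized run and approval returns `200` with `{ "status": ..., "resumed": ... }`, an unknown pair returns `404`, and a decision other than approved/rejected/expired (case-insensitive) returns `400`.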
app.MapGet("/", () => Results.Redirect("/openapi"));
app.Run();
@@ -146,19 +191,21 @@ internal sealed record RunStateResponse(
DateTimeOffset UpdatedAt,
IReadOnlyList<RunStateStepResponse> Steps);
internal sealed record RunStateStepResponse(
string StepId,
string Kind,
bool Enabled,
bool ContinueOnError,
int? MaxParallel,
string? ApprovalId,
string? GateMessage,
string Status,
int Attempts,
DateTimeOffset? LastTransitionAt,
DateTimeOffset? NextAttemptAt,
string? StatusReason);
internal sealed record ApprovalDecisionDto(string Decision, string? ActorId, string? Summary);
internal static class SimulationMapper
{

View File

@@ -1,6 +1,9 @@
namespace StellaOps.TaskRunner.WebService;
public sealed class TaskRunnerServiceOptions
{
public string RunStatePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "state", "runs");
public string ApprovalStorePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "approvals");
public string QueuePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "queue");
public string ArchivePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "queue", "archive");
}
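Since the service binds these options from the `TaskRunner` configuration section (`builder.Configuration.GetSection("TaskRunner")`), the defaults can be overridden in `appsettings.json`. The paths below are illustrative, not part of the change:

```json
{
  "TaskRunner": {
    "RunStatePath": "/var/lib/taskrunner/state/runs",
    "ApprovalStorePath": "/var/lib/taskrunner/approvals",
    "QueuePath": "/var/lib/taskrunner/queue",
    "ArchivePath": "/var/lib/taskrunner/queue/archive"
  }
}
```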

View File

@@ -18,12 +18,14 @@ builder.Services.AddSingleton<IPackRunApprovalStore>(sp =>
return new FilePackRunApprovalStore(options.Value.ApprovalStorePath);
});
-builder.Services.AddSingleton<IPackRunJobDispatcher>(sp =>
+builder.Services.AddSingleton(sp =>
{
var options = sp.GetRequiredService<IOptions<PackRunWorkerOptions>>();
var egressPolicy = sp.GetRequiredService<IEgressPolicy>();
return new FilesystemPackRunDispatcher(options.Value.QueuePath, options.Value.ArchivePath, egressPolicy);
});
builder.Services.AddSingleton<IPackRunJobDispatcher>(sp => sp.GetRequiredService<FilesystemPackRunDispatcher>());
builder.Services.AddSingleton<IPackRunJobScheduler>(sp => sp.GetRequiredService<FilesystemPackRunDispatcher>());
builder.Services.AddSingleton<IPackRunNotificationPublisher>(sp =>
{
@@ -49,6 +51,13 @@ builder.Services.AddSingleton<IPackRunStepExecutor, NoopPackRunStepExecutor>();
builder.Services.AddSingleton<PackRunExecutionGraphBuilder>();
builder.Services.AddSingleton<PackRunSimulationEngine>();
builder.Services.AddSingleton<PackRunProcessor>();
builder.Services.AddSingleton<IPackRunArtifactUploader>(sp =>
{
var options = sp.GetRequiredService<IOptions<PackRunWorkerOptions>>().Value;
var timeProvider = sp.GetService<TimeProvider>();
var logger = sp.GetRequiredService<ILogger<FilesystemPackRunArtifactUploader>>();
return new FilesystemPackRunArtifactUploader(options.ArtifactsPath, timeProvider, logger);
});
builder.Services.AddHostedService<PackRunWorkerService>();
var host = builder.Build();

View File

@@ -4,11 +4,13 @@ public sealed class PackRunWorkerOptions
{
public TimeSpan IdleDelay { get; set; } = TimeSpan.FromSeconds(1);
public string QueuePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "queue");
public string ArchivePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "queue", "archive");
public string ApprovalStorePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "approvals");
public string RunStatePath { get; set; } = Path.Combine(AppContext.BaseDirectory, "state", "runs");
public string ArtifactsPath { get; set; } = Path.Combine(AppContext.BaseDirectory, "artifacts");
}

View File

@@ -15,31 +15,34 @@ public sealed class PackRunWorkerService : BackgroundService
private readonly IPackRunJobDispatcher dispatcher;
private readonly PackRunProcessor processor;
private readonly PackRunWorkerOptions options;
private readonly IPackRunStateStore stateStore;
private readonly PackRunExecutionGraphBuilder graphBuilder;
private readonly PackRunSimulationEngine simulationEngine;
private readonly IPackRunStepExecutor executor;
private readonly IPackRunArtifactUploader artifactUploader;
private readonly ILogger<PackRunWorkerService> logger;
public PackRunWorkerService(
IPackRunJobDispatcher dispatcher,
PackRunProcessor processor,
IPackRunStateStore stateStore,
PackRunExecutionGraphBuilder graphBuilder,
PackRunSimulationEngine simulationEngine,
IPackRunStepExecutor executor,
IPackRunArtifactUploader artifactUploader,
IOptions<PackRunWorkerOptions> options,
ILogger<PackRunWorkerService> logger)
{
this.dispatcher = dispatcher ?? throw new ArgumentNullException(nameof(dispatcher));
this.processor = processor ?? throw new ArgumentNullException(nameof(processor));
this.stateStore = stateStore ?? throw new ArgumentNullException(nameof(stateStore));
this.graphBuilder = graphBuilder ?? throw new ArgumentNullException(nameof(graphBuilder));
this.simulationEngine = simulationEngine ?? throw new ArgumentNullException(nameof(simulationEngine));
this.executor = executor ?? throw new ArgumentNullException(nameof(executor));
this.artifactUploader = artifactUploader ?? throw new ArgumentNullException(nameof(artifactUploader));
this.options = options?.Value ?? throw new ArgumentNullException(nameof(options));
this.logger = logger ?? throw new ArgumentNullException(nameof(logger));
}
protected override async Task ExecuteAsync(CancellationToken stoppingToken)
@@ -100,14 +103,15 @@ public sealed class PackRunWorkerService : BackgroundService
var updatedState = await ExecuteGraphAsync(context, graph, state, cancellationToken).ConfigureAwait(false);
await stateStore.SaveAsync(updatedState, cancellationToken).ConfigureAwait(false);
if (updatedState.Steps.Values.All(step => step.Status is PackRunStepExecutionStatus.Succeeded or PackRunStepExecutionStatus.Skipped))
{
logger.LogInformation("Run {RunId} finished successfully.", context.RunId);
await artifactUploader.UploadAsync(context, updatedState, context.Plan.Outputs, cancellationToken).ConfigureAwait(false);
}
else
{
logger.LogInformation("Run {RunId} paused with pending work.", context.RunId);
}
}
private async Task<PackRunState> CreateInitialStateAsync(
@@ -164,7 +168,14 @@ public sealed class PackRunWorkerService : BackgroundService
}
var failurePolicy = graph.FailurePolicy ?? PackRunExecutionGraph.DefaultFailurePolicy;
-    var state = PackRunState.Create(context.RunId, context.Plan.Hash, failurePolicy, stepRecords, timestamp);
+    var state = PackRunState.Create(
+        context.RunId,
+        context.Plan.Hash,
+        context.Plan,
+        failurePolicy,
+        context.RequestedAt,
+        stepRecords,
+        timestamp);
await stateStore.SaveAsync(state, cancellationToken).ConfigureAwait(false);
return state;
}

View File

@@ -18,10 +18,11 @@
## Sprint 43 Approvals, Notifications, Hardening
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| TASKRUN-43-001 | DOING (2025-10-29) | Task Runner Guild | TASKRUN-42-001, NOTIFY-SVC-40-001 | Implement approvals workflow (resume after approval), notifications integration, remote artifact uploads, chaos resilience, secret injection, and audit logs. | Approvals/resume flow validated; notifications emitted; chaos tests documented; secrets redacted in logs; audit logs complete. |
> 2025-10-29: Starting approvals orchestration — defining persistence/workflow scaffolding, integrating plan insights for notifications, and staging resume hooks.
> 2025-10-29: Added approval coordinator + policy notification bridge with unit tests; ready to wire into worker execution/resume path.
> 2025-11-06: Added approval decision API with resume requeue, persisted plan snapshots, and artifact uploader hook (logging backend pending).
## Authority-Backed Scopes & Tenancy (Epic 14)
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |

View File

@@ -1,5 +1,6 @@
using Google.Cloud.Kms.V1;
using Google.Protobuf;
using Google.Protobuf.WellKnownTypes;
namespace StellaOps.Cryptography.Kms;

View File

@@ -271,7 +271,7 @@ internal sealed class Pkcs11InteropFacade : IPkcs11Facade
}
catch
{
-        # ignore logout failures
+        // ignore logout failures
}
}

View File

@@ -64,9 +64,8 @@ public sealed class Pkcs11Options
/// <summary>
/// Gets or sets an optional factory for advanced facade injection (testing, custom providers).
/// </summary>
-    public Func<IServiceProvider, IPkcs11Facade>? FacadeFactory { get; set; }
+    internal Func<IServiceProvider, IPkcs11Facade>? FacadeFactory { get; set; }
private static TimeSpan EnsurePositive(TimeSpan value, TimeSpan fallback)
=> value <= TimeSpan.Zero ? fallback : value;
}