feat: Enhance Authority Identity Provider Registry with Bootstrap Capability

- Added support for bootstrap providers in AuthorityIdentityProviderRegistry.
- Introduced a new property for bootstrap providers and updated AggregateCapabilities.
- Updated relevant methods to handle bootstrap capabilities during provider registration.

feat: Introduce Sealed Mode Status in OpenIddict Handlers

- Added SealedModeStatusProperty to AuthorityOpenIddictConstants.
- Enhanced ValidateClientCredentialsHandler, ValidatePasswordGrantHandler, and ValidateRefreshTokenGrantHandler to validate sealed mode evidence.
- Implemented logic to handle airgap seal confirmation requirements.
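The gate described above might look like the following sketch. Only the interface name `IAuthoritySealedModeEvidenceValidator` comes from this commit; its member signatures, the result strings, and the `DenyWithoutEvidenceValidator` implementation are illustrative assumptions.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical shape of the sealed-mode gate a grant handler could apply
// before issuing tokens. Member names are assumptions, not the commit's API.
public interface IAuthoritySealedModeEvidenceValidator
{
    Task<bool> ValidateAsync(string? evidence, CancellationToken cancellationToken);
}

public sealed class DenyWithoutEvidenceValidator : IAuthoritySealedModeEvidenceValidator
{
    // In sealed mode, reject any request carrying no seal-confirmation evidence.
    public Task<bool> ValidateAsync(string? evidence, CancellationToken cancellationToken)
        => Task.FromResult(!string.IsNullOrWhiteSpace(evidence));
}

public static class SealedModeDemo
{
    // A real handler would surface the rejection as an OpenIddict error
    // (e.g. invalid_request); a string return keeps the sketch self-contained.
    public static async Task<string> HandleAsync(
        IAuthoritySealedModeEvidenceValidator validator, string? evidence)
        => await validator.ValidateAsync(evidence, CancellationToken.None)
            ? "granted"
            : "rejected: airgap seal confirmation required";
}
```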

feat: Update Program Configuration for Sealed Mode

- Registered IAuthoritySealedModeEvidenceValidator in Program.cs.
- Added logging for bootstrap capabilities in identity provider plugins.
- Implemented checks for bootstrap support in API endpoints.

chore: Update Tasks and Documentation

- Marked AUTH-MTLS-11-002 as DONE in TASKS.md.
- Updated documentation to reflect changes in sealed mode and bootstrap capabilities.

fix: Improve CLI Command Handlers Output

- Enhanced output formatting for command responses and prompts in CommandHandlers.cs.

feat: Extend Advisory AI Models

- Added Response property to AdvisoryPipelineOutputModel for better output handling.

fix: Adjust Concelier Web Service Authentication

- Improved JWT token handling in Concelier Web Service to ensure proper token extraction and logging.

test: Enhance Web Service Endpoints Tests

- Added detailed logging for authentication failures in WebServiceEndpointsTests.
- Enabled PII logging for better debugging of authentication issues.
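In the Microsoft.IdentityModel stack, PII in token-validation diagnostics is typically enabled via the `ShowPII` switch; whether these tests use exactly this mechanism is an assumption.

```csharp
using Microsoft.IdentityModel.Logging;

// Surfaces raw token and claim values inside IdentityModel exception messages.
// Debug/test builds only -- never enable this in production logging.
IdentityModelEventSource.ShowPII = true;
```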

feat: Introduce Air-Gap Configuration Options

- Added AuthorityAirGapOptions and AuthoritySealedModeOptions to StellaOpsAuthorityOptions.
- Implemented validation logic for air-gap configurations to ensure proper setup.
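A minimal sketch of what such validation could enforce. Only the option class names `AuthorityAirGapOptions` and `AuthoritySealedModeOptions` come from this commit; the property names and the specific rule (sealed mode implies air-gap enabled) are assumptions.

```csharp
using System;
using System.Collections.Generic;

// Illustrative option shapes; properties and the validation rule are assumed.
public sealed class AuthoritySealedModeOptions
{
    public bool Enabled { get; set; }
    public bool RequireSealConfirmation { get; set; } = true;
}

public sealed class AuthorityAirGapOptions
{
    public bool Enabled { get; set; }
    public AuthoritySealedModeOptions SealedMode { get; set; } = new();

    public IReadOnlyList<string> Validate()
    {
        var errors = new List<string>();
        // Sealed mode only makes sense inside an air-gapped deployment.
        if (SealedMode.Enabled && !Enabled)
        {
            errors.Add("airGap.sealedMode requires airGap.enabled = true.");
        }
        return errors;
    }
}
```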
Commit ba4c935182 (parent d71c81e45d) on branch master, 2025-11-09 12:18:14 +02:00.
68 changed files with 2142 additions and 291 deletions.

--- changed file ---

@@ -8,6 +8,7 @@ using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Prompting;
using StellaOps.AdvisoryAI.Metrics;
using StellaOps.AdvisoryAI.Queue;
using StellaOps.AdvisoryAI.Inference;
namespace StellaOps.AdvisoryAI.Execution;
@@ -27,6 +28,7 @@ internal sealed class AdvisoryPipelineExecutor : IAdvisoryPipelineExecutor
private readonly IAdvisoryOutputStore _outputStore;
private readonly AdvisoryPipelineMetrics _metrics;
private readonly TimeProvider _timeProvider;
private readonly IAdvisoryInferenceClient _inferenceClient;
private readonly ILogger<AdvisoryPipelineExecutor>? _logger;
public AdvisoryPipelineExecutor(
@@ -35,6 +37,7 @@ internal sealed class AdvisoryPipelineExecutor : IAdvisoryPipelineExecutor
IAdvisoryOutputStore outputStore,
AdvisoryPipelineMetrics metrics,
TimeProvider timeProvider,
IAdvisoryInferenceClient inferenceClient,
ILogger<AdvisoryPipelineExecutor>? logger = null)
{
_promptAssembler = promptAssembler ?? throw new ArgumentNullException(nameof(promptAssembler));
@@ -42,6 +45,7 @@ internal sealed class AdvisoryPipelineExecutor : IAdvisoryPipelineExecutor
_outputStore = outputStore ?? throw new ArgumentNullException(nameof(outputStore));
_metrics = metrics ?? throw new ArgumentNullException(nameof(metrics));
_timeProvider = timeProvider ?? throw new ArgumentNullException(nameof(timeProvider));
_inferenceClient = inferenceClient ?? throw new ArgumentNullException(nameof(inferenceClient));
_logger = logger;
}
@@ -87,8 +91,9 @@ internal sealed class AdvisoryPipelineExecutor : IAdvisoryPipelineExecutor
prompt.Citations.Length,
plan.StructuredChunks.Length);
var inferenceResult = await _inferenceClient.GenerateAsync(plan, prompt, guardrailResult, cancellationToken).ConfigureAwait(false);
var generatedAt = _timeProvider.GetUtcNow();
- var output = AdvisoryPipelineOutput.Create(plan, prompt, guardrailResult, generatedAt, planFromCache);
+ var output = AdvisoryPipelineOutput.Create(plan, prompt, guardrailResult, inferenceResult, generatedAt, planFromCache);
await _outputStore.SaveAsync(output, cancellationToken).ConfigureAwait(false);
_metrics.RecordOutputStored(plan.Request.TaskType, planFromCache, guardrailResult.Blocked);

--- changed file ---

@@ -0,0 +1,215 @@
using System;
using System.Collections.Immutable;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.Http.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.AdvisoryAI.Guardrails;
using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Prompting;
namespace StellaOps.AdvisoryAI.Inference;
public interface IAdvisoryInferenceClient
{
Task<AdvisoryInferenceResult> GenerateAsync(
AdvisoryTaskPlan plan,
AdvisoryPrompt prompt,
AdvisoryGuardrailResult guardrailResult,
CancellationToken cancellationToken);
}
public sealed record AdvisoryInferenceResult(
string Content,
string? ModelId,
int? PromptTokens,
int? CompletionTokens,
ImmutableDictionary<string, string> Metadata)
{
public static AdvisoryInferenceResult FromLocal(string content)
=> new(
content,
"local.prompt-preview",
null,
null,
ImmutableDictionary.Create<string, string>(StringComparer.Ordinal));
public static AdvisoryInferenceResult FromFallback(string content, string reason, string? details = null)
{
var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
builder["inference.fallback_reason"] = reason;
if (!string.IsNullOrWhiteSpace(details))
{
builder["inference.fallback_details"] = details!;
}
return new AdvisoryInferenceResult(
content,
"remote.fallback",
null,
null,
builder.ToImmutable());
}
}
public sealed class LocalAdvisoryInferenceClient : IAdvisoryInferenceClient
{
public Task<AdvisoryInferenceResult> GenerateAsync(
AdvisoryTaskPlan plan,
AdvisoryPrompt prompt,
AdvisoryGuardrailResult guardrailResult,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(prompt);
ArgumentNullException.ThrowIfNull(guardrailResult);
var sanitized = guardrailResult.SanitizedPrompt ?? prompt.Prompt ?? string.Empty;
return Task.FromResult(AdvisoryInferenceResult.FromLocal(sanitized));
}
}
public sealed class RemoteAdvisoryInferenceClient : IAdvisoryInferenceClient
{
private readonly HttpClient _httpClient;
private readonly IOptions<AdvisoryAiInferenceOptions> _options;
private readonly ILogger<RemoteAdvisoryInferenceClient>? _logger;
public RemoteAdvisoryInferenceClient(
HttpClient httpClient,
IOptions<AdvisoryAiInferenceOptions> options,
ILogger<RemoteAdvisoryInferenceClient>? logger = null)
{
_httpClient = httpClient ?? throw new ArgumentNullException(nameof(httpClient));
_options = options ?? throw new ArgumentNullException(nameof(options));
_logger = logger;
}
public async Task<AdvisoryInferenceResult> GenerateAsync(
AdvisoryTaskPlan plan,
AdvisoryPrompt prompt,
AdvisoryGuardrailResult guardrailResult,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(plan);
ArgumentNullException.ThrowIfNull(prompt);
ArgumentNullException.ThrowIfNull(guardrailResult);
var sanitized = guardrailResult.SanitizedPrompt ?? prompt.Prompt ?? string.Empty;
var inferenceOptions = _options.Value ?? new AdvisoryAiInferenceOptions();
var remote = inferenceOptions.Remote ?? new AdvisoryAiRemoteInferenceOptions();
if (remote.BaseAddress is null)
{
_logger?.LogWarning("Remote inference is enabled but no base address was configured. Falling back to local prompt output.");
return AdvisoryInferenceResult.FromLocal(sanitized);
}
var endpoint = string.IsNullOrWhiteSpace(remote.Endpoint)
? "/v1/inference"
: remote.Endpoint;
var request = new RemoteInferenceRequest(
TaskType: plan.Request.TaskType.ToString(),
Profile: plan.Request.Profile,
Prompt: sanitized,
Metadata: prompt.Metadata.ToDictionary(static pair => pair.Key, static pair => pair.Value, StringComparer.Ordinal),
Citations: prompt.Citations
.Select(citation => new RemoteInferenceCitation(citation.Index, citation.DocumentId, citation.ChunkId))
.ToArray());
try
{
using var response = await _httpClient.PostAsJsonAsync(endpoint, request, cancellationToken).ConfigureAwait(false);
if (!response.IsSuccessStatusCode)
{
var body = await response.Content.ReadAsStringAsync(cancellationToken).ConfigureAwait(false);
_logger?.LogWarning(
"Remote inference request failed with status {StatusCode}. Response body: {Body}",
response.StatusCode,
body);
return AdvisoryInferenceResult.FromFallback(sanitized, $"remote_http_{(int)response.StatusCode}", body);
}
var payload = await response.Content.ReadFromJsonAsync<RemoteInferenceResponse>(cancellationToken: cancellationToken).ConfigureAwait(false);
if (payload is null || string.IsNullOrWhiteSpace(payload.Content))
{
_logger?.LogWarning("Remote inference response was empty. Falling back to sanitized prompt.");
return AdvisoryInferenceResult.FromFallback(sanitized, "remote_empty_response");
}
var metadataBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (payload.Metadata is not null)
{
foreach (var pair in payload.Metadata)
{
if (!string.IsNullOrWhiteSpace(pair.Key) && pair.Value is not null)
{
metadataBuilder[pair.Key] = pair.Value;
}
}
}
return new AdvisoryInferenceResult(
payload.Content,
payload.ModelId,
payload.Usage?.PromptTokens,
payload.Usage?.CompletionTokens,
metadataBuilder.ToImmutable());
}
catch (OperationCanceledException) when (!cancellationToken.IsCancellationRequested)
{
_logger?.LogWarning("Remote inference timed out before completion. Returning sanitized prompt.");
return AdvisoryInferenceResult.FromFallback(sanitized, "remote_timeout");
}
catch (HttpRequestException ex)
{
_logger?.LogWarning(ex, "Remote inference HTTP request failed. Returning sanitized prompt.");
return AdvisoryInferenceResult.FromFallback(sanitized, "remote_http_exception", ex.Message);
}
}
private sealed record RemoteInferenceRequest(
string TaskType,
string Profile,
string Prompt,
IReadOnlyDictionary<string, string> Metadata,
IReadOnlyList<RemoteInferenceCitation> Citations);
private sealed record RemoteInferenceCitation(int Index, string DocumentId, string ChunkId);
private sealed record RemoteInferenceResponse(
[property: JsonPropertyName("content")] string Content,
[property: JsonPropertyName("modelId")] string? ModelId,
[property: JsonPropertyName("usage")] RemoteInferenceUsage? Usage,
[property: JsonPropertyName("metadata")] Dictionary<string, string>? Metadata);
private sealed record RemoteInferenceUsage(
[property: JsonPropertyName("promptTokens")] int? PromptTokens,
[property: JsonPropertyName("completionTokens")] int? CompletionTokens);
}
public sealed class AdvisoryAiInferenceOptions
{
public AdvisoryAiInferenceMode Mode { get; set; } = AdvisoryAiInferenceMode.Local;
public AdvisoryAiRemoteInferenceOptions Remote { get; set; } = new();
}
public sealed class AdvisoryAiRemoteInferenceOptions
{
public Uri? BaseAddress { get; set; }
public string Endpoint { get; set; } = "/v1/inference";
public string? ApiKey { get; set; }
public TimeSpan Timeout { get; set; } = TimeSpan.FromSeconds(30);
}
public enum AdvisoryAiInferenceMode
{
Local,
Remote
}

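The fallback contract above can be exercised in isolation. This is a condensed, runnable restatement of `AdvisoryInferenceResult.FromFallback` from the diff, not new behavior:

```csharp
using System;
using System.Collections.Immutable;

// Condensed restatement of AdvisoryInferenceResult from the diff above,
// showing the metadata contract consumers can rely on for degraded output.
public sealed record AdvisoryInferenceResult(
    string Content,
    string? ModelId,
    int? PromptTokens,
    int? CompletionTokens,
    ImmutableDictionary<string, string> Metadata)
{
    public static AdvisoryInferenceResult FromFallback(string content, string reason, string? details = null)
    {
        var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        builder["inference.fallback_reason"] = reason;
        if (!string.IsNullOrWhiteSpace(details))
        {
            builder["inference.fallback_details"] = details!;
        }
        // ModelId "remote.fallback" marks the output as degraded, distinct
        // from "local.prompt-preview" and from a real remote model id.
        return new(content, "remote.fallback", null, null, builder.ToImmutable());
    }
}
```

For example, a 503 from the remote endpoint yields `ModelId == "remote.fallback"` with `inference.fallback_reason == "remote_http_503"`, so downstream consumers can tell degraded output from a genuine completion.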
--- changed file ---

@@ -1,10 +1,12 @@
using System.Collections.Concurrent;
using System.Collections.Immutable;
using System.Globalization;
using System.Security.Cryptography;
using System.Text;
using StellaOps.AdvisoryAI.Guardrails;
using StellaOps.AdvisoryAI.Prompting;
using StellaOps.AdvisoryAI.Orchestration;
using StellaOps.AdvisoryAI.Inference;
namespace StellaOps.AdvisoryAI.Outputs;
@@ -22,6 +24,7 @@ public sealed class AdvisoryPipelineOutput
AdvisoryTaskType taskType,
string profile,
string prompt,
string response,
ImmutableArray<AdvisoryPromptCitation> citations,
ImmutableDictionary<string, string> metadata,
AdvisoryGuardrailResult guardrail,
@@ -33,6 +36,7 @@ public sealed class AdvisoryPipelineOutput
TaskType = taskType;
Profile = string.IsNullOrWhiteSpace(profile) ? throw new ArgumentException(nameof(profile)) : profile;
Prompt = prompt ?? throw new ArgumentNullException(nameof(prompt));
Response = response ?? throw new ArgumentNullException(nameof(response));
Citations = citations;
Metadata = metadata ?? throw new ArgumentNullException(nameof(metadata));
Guardrail = guardrail ?? throw new ArgumentNullException(nameof(guardrail));
@@ -49,6 +53,8 @@ public sealed class AdvisoryPipelineOutput
public string Prompt { get; }
public string Response { get; }
public ImmutableArray<AdvisoryPromptCitation> Citations { get; }
public ImmutableDictionary<string, string> Metadata { get; }
@@ -65,15 +71,21 @@ public sealed class AdvisoryPipelineOutput
AdvisoryTaskPlan plan,
AdvisoryPrompt prompt,
AdvisoryGuardrailResult guardrail,
AdvisoryInferenceResult inference,
DateTimeOffset generatedAtUtc,
bool planFromCache)
{
ArgumentNullException.ThrowIfNull(plan);
ArgumentNullException.ThrowIfNull(prompt);
ArgumentNullException.ThrowIfNull(guardrail);
ArgumentNullException.ThrowIfNull(inference);
var promptContent = guardrail.SanitizedPrompt ?? prompt.Prompt ?? string.Empty;
- var outputHash = ComputeHash(promptContent);
+ var responseContent = string.IsNullOrWhiteSpace(inference.Content)
+     ? promptContent
+     : inference.Content;
+ var metadata = MergeMetadata(prompt.Metadata, inference);
+ var outputHash = ComputeHash(responseContent);
var provenance = new AdvisoryDsseProvenance(plan.CacheKey, outputHash, ImmutableArray<string>.Empty);
return new AdvisoryPipelineOutput(
@@ -81,14 +93,52 @@ public sealed class AdvisoryPipelineOutput
plan.Request.TaskType,
plan.Request.Profile,
promptContent,
responseContent,
prompt.Citations,
- prompt.Metadata,
+ metadata,
guardrail,
provenance,
generatedAtUtc,
planFromCache);
}
private static ImmutableDictionary<string, string> MergeMetadata(
ImmutableDictionary<string, string> metadata,
AdvisoryInferenceResult inference)
{
var builder = metadata is { Count: > 0 }
? metadata.ToBuilder()
: ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
if (!string.IsNullOrWhiteSpace(inference.ModelId))
{
builder["inference.model_id"] = inference.ModelId!;
}
if (inference.PromptTokens.HasValue)
{
builder["inference.prompt_tokens"] = inference.PromptTokens.Value.ToString(CultureInfo.InvariantCulture);
}
if (inference.CompletionTokens.HasValue)
{
builder["inference.completion_tokens"] = inference.CompletionTokens.Value.ToString(CultureInfo.InvariantCulture);
}
if (inference.Metadata is not null && inference.Metadata.Count > 0)
{
foreach (var pair in inference.Metadata)
{
if (!string.IsNullOrWhiteSpace(pair.Key) && pair.Value is not null)
{
builder[pair.Key] = pair.Value;
}
}
}
return builder.ToImmutable();
}
private static string ComputeHash(string content)
{
var bytes = Encoding.UTF8.GetBytes(content);

--- changed file ---

@@ -11,7 +11,7 @@
| AIAI-31-005 | DONE (2025-11-04) | Advisory AI Guild, Security Guild | AIAI-31-004 | Implement guardrails (redaction, injection defense, output validation, citation enforcement) and fail-safe handling. | Guardrails block adversarial inputs; output validator enforces schemas; security tests pass. |
| AIAI-31-006 | DONE (2025-11-04) | Advisory AI Guild | AIAI-31-004..005 | Expose REST API endpoints (`/advisory/ai/*`) with RBAC, rate limits, OpenAPI schemas, and batching support. | Endpoints deployed with schema validation; rate limits enforced; integration tests cover error codes. |
| AIAI-31-007 | DONE (2025-11-06) | Advisory AI Guild, Observability Guild | AIAI-31-004..006 | Instrument metrics (`advisory_ai_latency`, `guardrail_blocks`, `validation_failures`, `citation_coverage`), logs, and traces; publish dashboards/alerts. | Telemetry live; dashboards approved; alerts configured. |
- | AIAI-31-008 | TODO | Advisory AI Guild, DevOps Guild | AIAI-31-006..007 | Package inference on-prem container, remote inference toggle, Helm/Compose manifests, scaling guidance, offline kit instructions. | Deployment docs merged; smoke deploy executed; offline kit updated; feature flags documented. |
+ | AIAI-31-008 | DOING (2025-11-08) | Advisory AI Guild, DevOps Guild | AIAI-31-006..007 | Package inference on-prem container, remote inference toggle, Helm/Compose manifests, scaling guidance, offline kit instructions. | Deployment docs merged; smoke deploy executed; offline kit updated; feature flags documented. |
| AIAI-31-010 | DONE (2025-11-02) | Advisory AI Guild | CONCELIER-VULN-29-001, EXCITITOR-VULN-29-001 | Implement Concelier advisory raw document provider mapping CSAF/OSV payloads into structured chunks for retrieval. | Provider resolves content format, preserves metadata, and passes unit tests covering CSAF/OSV cases. |
| AIAI-31-011 | DONE (2025-11-02) | Advisory AI Guild | EXCITITOR-LNM-21-201, EXCITITOR-CORE-AOC-19-002 | Implement Excititor VEX document provider to surface structured VEX statements for vector retrieval. | Provider returns conflict-aware VEX chunks with deterministic metadata and tests for representative statements. |
| AIAI-31-009 | DONE (2025-11-08) | Advisory AI Guild, QA Guild | AIAI-31-001..006 | Develop unit/golden/property/perf tests, injection harness, and regression suite; ensure determinism with seeded caches. | Test suite green; golden outputs stored; injection tests pass; perf targets documented. |