Surface.FS Design (Epic: SURFACE-SHARING)
Status: Draft v1.1 — aligns with tasks SURFACE-FS-01..06, SCANNER-SURFACE-01..05, ZASTAVA-SURFACE-01..02, SCHED-SURFACE-01, OPS-SECRETS-01..02.
Audience: Scanner Worker/WebService, Zastava, Scheduler, DevOps.
Component map: see Scanner architecture — §1 System landscape for end-to-end placement.
1. Purpose
Surface.FS provides a unified content-addressable cache for Scanner-derived artefacts (layer manifests, entry traces, SBOM fragments, runtime deltas). It enables:
- Sharing scan results between Worker, WebService, Zastava Observer/Webhook, Scheduler planners, Export Center, and future CLI operations.
- Deterministic reproduction of scan evidence (manifests and payloads) in both connected and air-gapped environments.
- Efficient data movement by storing manifests once and referencing them via stable pointers.
2. Core Concepts
2.1 Artefact Key
Each artefact is addressed by the tuple (tenant, surfaceKind, contentDigest), where contentDigest is the SHA-256 of the canonical payload. surfaceKind identifies the artefact type (see the manifest schema below).
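As a concrete illustration, the sketch below derives such a key in C#; the `SurfaceArtifactKey` record and `SurfaceArtifactKeyFactory` helper are hypothetical names used for illustration, not part of the published Surface.FS API.

```csharp
using System;
using System.Security.Cryptography;

// Illustrative only: hypothetical types sketching the (tenant, surfaceKind, contentDigest) addressing scheme.
public sealed record SurfaceArtifactKey(string Tenant, string SurfaceKind, string ContentDigest);

public static class SurfaceArtifactKeyFactory
{
    public static SurfaceArtifactKey Create(string tenant, string surfaceKind, ReadOnlySpan<byte> canonicalPayload)
    {
        // contentDigest is the SHA-256 of the canonical payload, prefixed per the manifest convention.
        var digest = Convert.ToHexString(SHA256.HashData(canonicalPayload)).ToLowerInvariant();
        return new SurfaceArtifactKey(tenant, surfaceKind, $"sha256:{digest}");
    }
}
```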
2.2 Manifest
Manifests describe the artefact metadata and storage pointers. They are stored in the surface-manifests bucket and fetched by consumers before retrieving bulk data.
{
"schema": "stellaops.surface.manifest@1",
"tenant": "acme",
"imageDigest": "sha256:cafe...",
"scanId": "scan-1234",
"generatedAt": "2025-10-29T12:00:00Z",
"source": {
"component": "scanner.worker",
"version": "2025.10.0",
"workerInstance": "scanner-worker-1",
"attempt": 1
},
"artifacts": [
{
"kind": "entrytrace.graph",
"uri": "cas://surface-cache/manifests/acme/ab/cd/abcdef0123456789abcdef0123456789abcdef0123456789abcdef0123456789.json",
"digest": "sha256:abcdef0123456789abcdef0123456789abcdef0123456789abcdef0123456789",
"mediaType": "application/vnd.stellaops.entrytrace+json",
"format": "json",
"sizeBytes": 524288,
"view": "runtime",
"attestations": [
{
"kind": "dsse",
"mediaType": "application/vnd.dsse+json",
"digest": "sha256:e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
"uri": "cas://surface-cache/attestations/entrytrace.graph.dsse/e3b0c442....json"
}
],
"storage": {
"bucket": "surface-cache",
"objectKey": "payloads/acme/entrytrace/sha256/ab/cd/abcdef0123456789abcdef0123456789abcdef0123456789abcdef0123456789.ndjson.zst",
"sizeBytes": 524288,
"contentType": "application/x-ndjson+zstd"
},
"metadata": {
"entrypoint": "/usr/bin/java",
"surfaceVersion": "1"
}
}
]
}
Manifest URIs follow the deterministic pattern:
cas://{bucket}/{prefix}/{tenant}/{digest[0..1]}/{digest[2..3]}/{digest}.json
The hex portion of the manifest digest is split into two directory levels to avoid hot directories. The same layout is mirrored on disk by the default FileSurfaceManifestStore, which keeps offline bundle sync trivial (copy the manifests/ tree verbatim).
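A minimal sketch of that layout as a plain formatting helper; the shipped `SurfaceManifestPathBuilder` may expose a different signature.

```csharp
using System;

// Hypothetical helper mirroring the cas://{bucket}/{prefix}/{tenant}/{d[0..1]}/{d[2..3]}/{d}.json
// layout; the real SurfaceManifestPathBuilder may look different.
public static class ManifestUriSketch
{
    public static Uri Build(string bucket, string prefix, string tenant, string manifestDigestHex)
    {
        if (manifestDigestHex.Length < 4)
            throw new ArgumentException("Expected a full hex digest.", nameof(manifestDigestHex));

        // Two directory levels from the first four hex characters avoid hot directories.
        var level1 = manifestDigestHex[..2];
        var level2 = manifestDigestHex[2..4];
        return new Uri($"cas://{bucket}/{prefix}/{tenant}/{level1}/{level2}/{manifestDigestHex}.json");
    }
}
```

Calling `Build("surface-cache", "manifests", "acme", digest)` reproduces the `uri` shape shown in the manifest example above.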
Deterministic composition adds:
- Artifact kind `composition.recipe` (media type `application/vnd.stellaops.composition.recipe+json`) describing the merge recipe and Merkle root.
- `attestations[]` per artefact (currently DSSE envelopes) so offline kits can verify payloads without re-signing.
- `determinismRoot` and `determinism` metadata on the manifest that capture the Merkle root plus the composition recipe digest/URI.
2.3 Payload Storage
Large payloads (SBOM fragments, entry traces, runtime events) live in the same object store as manifests (RustFS/S3). Manifests record relative paths so offline bundles can copy both manifest and payload without modification.
3. APIs
Surface.FS exposes .NET-first abstractions that hosts consume via DI:
- `ISurfaceManifestWriter.PublishAsync(document)` – normalises artefact lists, computes the canonical SHA-256 digest, persists the manifest via the configured store, and returns a `SurfaceManifestPublishResult` containing the digest, canonical URI, and the normalised document.
- `ISurfaceManifestReader.TryGetByUriAsync(uri)` – resolves a manifest pointer (e.g. `cas://surface-cache/manifests/...`) back into a `SurfaceManifestDocument`.
- `ISurfaceManifestReader.TryGetByDigestAsync(digest)` – looks up a manifest by digest, scanning tenant prefixes when necessary (used by Offline Kit importers).
- `ISurfaceCache` (`GetOrCreateAsync`, `TryGetAsync`, `SetAsync`) – lightweight content-addressable cache for hot artefacts (layer fragments, entry trace outputs) hosted on local disk.
All components honour configuration bound from Surface:Cache and Surface:Manifest (or environment mirrors like SCANNER_SURFACE_CACHE_ROOT). SurfaceManifestStoreOptions controls the URI scheme/bucket/prefix and allows overriding the manifest directory while still defaulting to <cacheRoot>/manifests.
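The snippet below sketches a host consuming these abstractions via constructor injection. The interface and method names come from the list above; the cancellation-token parameters and the `ManifestUri`/`ManifestDigest` property names on `SurfaceManifestPublishResult` are assumptions for illustration.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Sketch only: consumes the Surface.FS abstractions described in §3.
// The cancellation-token overloads and result property names are assumed.
public sealed class SurfacePublishingExample
{
    private readonly ISurfaceManifestWriter _writer;
    private readonly ISurfaceManifestReader _reader;

    public SurfacePublishingExample(ISurfaceManifestWriter writer, ISurfaceManifestReader reader)
    {
        _writer = writer;
        _reader = reader;
    }

    public async Task<string> PublishAndVerifyAsync(SurfaceManifestDocument document, CancellationToken ct)
    {
        // PublishAsync normalises the artefact list, hashes the canonical payload, and persists it.
        var result = await _writer.PublishAsync(document, ct);

        // Round-trip check: the canonical URI returned by the writer resolves back to a document.
        var fetched = await _reader.TryGetByUriAsync(result.ManifestUri, ct);
        if (fetched is null)
            throw new InvalidOperationException($"Manifest {result.ManifestDigest} not found after publish.");

        return result.ManifestDigest;
    }
}
```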
WebService integration (2025-11-05)
- `/api/v1/scans/{id}` and `/api/v1/reports` responses now include a `surface` block containing:
  - `manifestUri` – `cas://` pointer to the Surface manifest JSON.
  - `manifestDigest` – canonical SHA-256 over the manifest payload.
  - `manifest.artifacts[]` – deterministic list with `kind`, `uri`, `digest`, `mediaType`, `format`, and optional `view`. URIs reuse the `ArtifactObjectKeyBuilder` semantics (`cas://{bucket}/{rootPrefix}/images/...`).
- This allows UI/CLI consumers to fetch manifests or artefacts without additional Surface.FS round-trips.
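For consumers that deserialize the `surface` block directly, a set of illustrative DTOs might look like the following; the record types are hypothetical, and only the property names mirror the documented fields.

```csharp
using System.Text.Json.Serialization;

// Hypothetical DTOs for the `surface` block described above; not shipped types.
public sealed record SurfaceBlock(
    [property: JsonPropertyName("manifestUri")] string ManifestUri,
    [property: JsonPropertyName("manifestDigest")] string ManifestDigest,
    [property: JsonPropertyName("manifest")] SurfaceManifestSummary Manifest);

public sealed record SurfaceManifestSummary(
    [property: JsonPropertyName("artifacts")] SurfaceArtifactSummary[] Artifacts);

public sealed record SurfaceArtifactSummary(
    [property: JsonPropertyName("kind")] string Kind,
    [property: JsonPropertyName("uri")] string Uri,
    [property: JsonPropertyName("digest")] string Digest,
    [property: JsonPropertyName("mediaType")] string MediaType,
    [property: JsonPropertyName("format")] string Format,
    [property: JsonPropertyName("view")] string? View);
```

A client can then call `JsonSerializer.Deserialize<SurfaceBlock>(json)` and hand `ManifestUri` to `ISurfaceManifestReader.TryGetByUriAsync` when the full manifest is needed.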
4. Library Responsibilities
Surface.FS library for .NET hosts provides:
- `ISurfaceManifestWriter` / `ISurfaceManifestReader` with the default `FileSurfaceManifestStore` implementation (single-writer semaphore, digest reuse, optional overwrite warning).
- Deterministic pointer builder (`SurfaceManifestPathBuilder`) and options (`SurfaceManifestStoreOptions`, `SurfaceCacheOptions`) that align with `Surface.Env` configuration.
- Local cache abstraction `ISurfaceCache` with the default `FileSurfaceCache` implementation (uses `Surface:Cache:Root` / `SCANNER_SURFACE_CACHE_ROOT`, enforces per-key semaphores, stores bytes verbatim).
- `SurfaceCacheKey` helper that normalises cache entries as `{namespace}/{tenant}/{sha256}`. EntryTrace graphs use the `entrytrace.graph` namespace so Worker/WebService/CLI can share cached results deterministically.
- JSON serialiser (`SurfaceCacheJsonSerializer`) that applies camelCase naming, ignores nulls, and uses a stable encoder for reproducible hashing (see the sketch after this list).
- Metrics: `surface_manifest_published_total`, `surface_manifest_cache_hit_total`, plus host-specific counters wired via Scanner Worker instrumentation.
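The behaviour described for `SurfaceCacheJsonSerializer` can be approximated with standard `System.Text.Json` options; this is a sketch of equivalent settings, not the shipped implementation.

```csharp
using System.Text.Encodings.Web;
using System.Text.Json;
using System.Text.Json.Serialization;

// Sketch of serializer settings matching the documented behaviour
// (camelCase names, nulls ignored, stable escaping for reproducible hashing).
public static class DeterministicJson
{
    public static readonly JsonSerializerOptions Options = new()
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
        // The strict default encoder keeps escaping stable across runs, so digests stay reproducible.
        Encoder = JavaScriptEncoder.Default,
        WriteIndented = false,
    };

    public static byte[] SerializeCanonical<T>(T value) =>
        JsonSerializer.SerializeToUtf8Bytes(value, Options);
}
```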
5. Retention & Eviction
- Manifests capture `generatedAt`; retention windows (30 days for SBOM fragments, 7 days for entry traces) are enforced by job configuration and object-store lifecycle policies. An `expiresAt` field is reserved for future use when automated eviction is introduced.
- Background job `SurfaceCacheMaintenanceService` evicts local cache entries exceeding quota, oldest-first (see the sketch after this list).
- Object storage retention policies are managed by DevOps; the library exposes metrics but does not auto-delete unless instructed.
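A simplified, file-system-only sketch of the oldest-first eviction described above; the real `SurfaceCacheMaintenanceService` also reports metrics and respects the per-namespace layout, and the `quotaBytes` parameter here is an illustrative assumption.

```csharp
using System.IO;
using System.Linq;

// Sketch only: evict oldest files until the cache is back under quota.
public static class CacheEvictionSketch
{
    public static void EvictOldestFirst(string cacheRoot, long quotaBytes)
    {
        var files = new DirectoryInfo(cacheRoot)
            .EnumerateFiles("*", SearchOption.AllDirectories)
            .OrderBy(f => f.LastAccessTimeUtc)   // oldest entries go first
            .ToList();

        var totalBytes = files.Sum(f => f.Length);
        foreach (var file in files)
        {
            if (totalBytes <= quotaBytes) break;
            totalBytes -= file.Length;
            file.Delete();                       // the real service would log and emit metrics here
        }
    }
}
```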
6. Offline Kit Handling
Offline kits include:
offline/surface/
manifests/
<tenant>/<digest[0..1]>/<digest[2..3]>/<digest>.json
payloads/
<tenant>/<kind>/<digest[0..1]>/<digest[2..3]>/<digest>.json.zst
manifest-index.json
Import script uses ISurfaceManifestWriter.PublishAsync for each manifest after verifying the embedded digest, keeping Offline Kit replays identical to online flows. This enables Zastava and Scheduler running offline to consume cached data without re-scanning.
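A sketch of that import loop, assuming the `<digest>.json` file-name convention shown above; `SurfaceManifestDocument.Deserialize` and the writer's cancellation-token parameter are hypothetical details.

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;

// Sketch only: verify each embedded digest before republishing via the writer.
public static class OfflineKitImportSketch
{
    public static async Task ImportAsync(ISurfaceManifestWriter writer, string manifestsRoot, CancellationToken ct)
    {
        foreach (var path in Directory.EnumerateFiles(manifestsRoot, "*.json", SearchOption.AllDirectories))
        {
            var bytes = await File.ReadAllBytesAsync(path, ct);
            var actual = Convert.ToHexString(SHA256.HashData(bytes)).ToLowerInvariant();
            var expected = Path.GetFileNameWithoutExtension(path); // <digest>.json per the layout above

            if (!string.Equals(actual, expected, StringComparison.Ordinal))
                throw new InvalidOperationException($"Digest mismatch for {path}");

            var document = SurfaceManifestDocument.Deserialize(bytes); // hypothetical helper
            await writer.PublishAsync(document, ct);
        }
    }
}
```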
6.1 EntryTrace Cache Usage
Scanner.Worker serialises EntryTrace graphs into Surface.FS using SurfaceCacheKey(namespace: "entrytrace.graph", tenant, sha256(options|env|entrypoint)). At runtime the worker checks the cache before invoking analyzers; cache hits bypass parsing and feed the result store/attestor pipeline directly. The same namespace is consumed by WebService and CLI to retrieve cached graphs for reporting.
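A sketch of that Worker-side flow; the `SurfaceCacheKey` constructor shape, the `GetOrCreateAsync` signature, and the byte-level graph encoding are assumptions for illustration.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Sketch only: cache-first EntryTrace lookup. The real ISurfaceCache/SurfaceCacheKey signatures may differ.
public sealed class EntryTraceCacheExample
{
    private readonly ISurfaceCache _cache;

    public EntryTraceCacheExample(ISurfaceCache cache) => _cache = cache;

    public Task<byte[]> GetGraphAsync(string tenant, string inputsSha256, CancellationToken ct)
    {
        // Key is sha256(options|env|entrypoint) under the shared entrytrace.graph namespace.
        var key = new SurfaceCacheKey("entrytrace.graph", tenant, inputsSha256);

        // A cache hit bypasses the analyzers; a miss runs them and stores the serialized graph.
        return _cache.GetOrCreateAsync(key, _ => RunEntryTraceAnalyzersAsync(ct), ct);
    }

    private static Task<byte[]> RunEntryTraceAnalyzersAsync(CancellationToken ct) =>
        Task.FromResult(Array.Empty<byte>()); // placeholder for the real analyzer pipeline
}
```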
6.2 BuildX generator path
StellaOps.Scanner.Sbomer.BuildXPlugin reuses the same CAS layout via the --surface-* descriptor flags (or STELLAOPS_SURFACE_* env vars). When layer fragment JSON, EntryTrace graph JSON, or NDJSON files are supplied, the plug-in writes them under scanner/surface/** within the configured CAS root and emits a manifest pointer so Scanner.WebService can pick up the artefacts without re-scanning. The Surface manifest JSON can also be copied to an arbitrary path via --surface-manifest-output for CI artefacts/offline kits.
7. Security & Tenancy
- Tenant ID is mandatory; Surface.Validation enforces match with Authority token.
- Manifests/payloads stored in tenant-specific prefixes to prevent leakage.
- Optional manifest signing (future) will use `Surface.Secrets` to load signing keys.
- TLS is enforced between hosts and the Surface.FS endpoint; certificate pins are configured via `Surface.Env`.
8. Observability
- Logs include manifest SHA, tenant, kind, and cache namespace; payload paths are truncated for brevity.
- Prometheus metrics (emitted by Scanner.Worker) now include the following (a wiring sketch follows this list):
  - `scanner_worker_surface_manifests_published_total`, `scanner_worker_surface_manifests_failed_total`, `scanner_worker_surface_manifests_skipped_total` with labels `{queue, job_kind, surface_result, reason?, surface_payload_count}`.
  - `scanner_worker_surface_payload_persisted_total` with `{surface_kind}` to track cache churn (`entrytrace.graph`, `entrytrace.ndjson`, `layer.fragments`, …).
  - `scanner_worker_surface_manifest_publish_duration_ms` histogram for end-to-end persistence latency.
- Grafana dashboard JSON: `docs/modules/scanner/operations/surface-worker-grafana-dashboard.json` (panels for publish outcomes, latency, per-kind cache rate, and failure reasons). Import it alongside the analyzer dashboard and point it to the Scanner Prometheus datasource.
- Tracing spans: `surface.fs.put`, `surface.fs.get`, `surface.fs.cache`.
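The metric names above are the shipped ones; the wiring below is only a sketch of how they could be emitted with `System.Diagnostics.Metrics`, not the actual Scanner.Worker instrumentation class.

```csharp
using System.Collections.Generic;
using System.Diagnostics.Metrics;

// Sketch only: illustrative counter/histogram wiring for the documented metric names.
public sealed class SurfaceMetricsSketch
{
    private static readonly Meter Meter = new("StellaOps.Scanner.Worker.Surface"); // meter name assumed

    private static readonly Counter<long> Published =
        Meter.CreateCounter<long>("scanner_worker_surface_manifests_published_total");

    private static readonly Histogram<double> PublishDuration =
        Meter.CreateHistogram<double>("scanner_worker_surface_manifest_publish_duration_ms");

    public static void RecordPublish(string queue, string jobKind, int payloadCount, double durationMs)
    {
        var tags = new KeyValuePair<string, object?>[]
        {
            new("queue", queue),
            new("job_kind", jobKind),
            new("surface_result", "published"),
            new("surface_payload_count", payloadCount),
        };

        Published.Add(1, tags);
        PublishDuration.Record(durationMs, tags);
    }
}
```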
9. Testing Strategy
- Unit tests for path builder, manifest serializer, and local cache eviction.
- Determinism verifier tests assert that `composition.recipe` + DSSE payloads match the Merkle root and surface artefact digests (a simplified sketch follows this list).
- Integration tests using an embedded RustFS or MinIO container to validate API interactions.
- Offline kit tests verifying that the export/import cycle round-trips manifests and payloads.
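A minimal determinism test sketch: it exercises only the "same input, same digest" property rather than the full `composition.recipe`/Merkle verification, and `CreateWriter`, `BuildSampleDocument`, and the `ManifestDigest` property name are assumptions.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Xunit;

// Sketch only: publishing the same document twice must yield the same manifest digest.
public sealed class ManifestDeterminismTests
{
    [Fact]
    public async Task PublishingSameDocumentTwiceYieldsSameDigest()
    {
        var writer = CreateWriter(); // e.g. FileSurfaceManifestStore over a temp directory

        var first = await writer.PublishAsync(BuildSampleDocument(), CancellationToken.None);
        var second = await writer.PublishAsync(BuildSampleDocument(), CancellationToken.None);

        Assert.Equal(first.ManifestDigest, second.ManifestDigest);
    }

    private static ISurfaceManifestWriter CreateWriter() => throw new NotImplementedException();
    private static SurfaceManifestDocument BuildSampleDocument() => throw new NotImplementedException();
}
```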
10. Future Enhancements
- Manifest signing (DSSE) to support tamper detection in hostile environments.
- Differential manifests to optimise large SBOM updates.
- Cross-region replication for multi-site deployments.
11. References
- Surface.Env Design (`docs/modules/scanner/design/surface-env.md`)
- Surface.Secrets Design (`docs/modules/scanner/design/surface-secrets.md`)
- Surface.Validation Design (`docs/modules/scanner/design/surface-validation.md`)
- Zastava Deployment Runbook (`docs/modules/devops/runbooks/zastava-deployment.md`)