docs consolidation and others

This commit is contained in:
master
2026-01-06 19:02:21 +02:00
parent d7bdca6d97
commit 4789027317
849 changed files with 16551 additions and 66770 deletions

View File

@@ -4,6 +4,8 @@
**Source:** `src/AirGap/`
**Owner:** Platform Team
> **Note:** This is the module dossier with architecture and implementation details. For operational guides and workflows, see [docs/modules/airgap/guides/](./guides/).
## Purpose
AirGap manages sealed knowledge snapshot export and import for offline/air-gapped deployments. Provides time-anchored snapshots with staleness policies, deterministic bundle creation, and secure import validation for complete offline operation.

View File

@@ -0,0 +1,35 @@
# Remediation plan for AG1–AG12 (Airgap deployment playbook gaps)
Source: `31-Nov-2025 FINDINGS.md` (AG1–AG12). Scope: sprint `SPRINT_0510_0001_0001_airgap`.
## Summary of actions
- **AG1 Trust roots & key custody:** Define per-profile root hierarchy (FIPS/eIDAS/GOST/SM + optional PQ). Require M-of-N custody for offline signer keys; dual-sign (ECDSA+PQ) where regionally allowed. Add rotation cadence (quarterly PQ, annual classical) and HSM/offline signer paths. Manifest fields: `trustRoots[] {id, profile, algo, fingerprint, rotationDue}`.
- **AG2 Rekor mirror integrity:** Standardize mirror format as DSSE-signed CAR with `mirror.manifest` (root hash, start/end index, freshness ts, signature). Include staleness window hours and reconciliation steps (prefer upstream Rekor if available, else fail closed when stale > window).
- **AG3 Feed freezing & provenance:** Extend offline kit manifest with `feeds[] {name, source, snapshotId, sha256, validFrom, validTo, dsse}`. Replay must refuse newer/older feeds unless override DSSE is supplied.
- **AG4 Deterministic tooling versions:** Add `tools[] {name, version, sha256, imageDigest}` to manifest; CLI verifies before replay. Require `--offline`/`--disable-telemetry` flags in runner scripts.
- **AG5 Size/resource limits:** Add kit chunking spec (`zstd` chunks, 256MiB max, per-chunk SHA256) and max kit size (10GiB). Provide streaming verifier script path (`scripts/verify-kit.sh`) and fail on missing/invalid chunks.
- **AG6 Malware/content scanning:** Require pre-publish AV/YARA scan with signed report hash in manifest (`scans[] {tool, version, result, reportSha256}`) and post-ingest scan before registry load. Scanner defaults to offline sigs.
- **AG7 Policy/graph alignment:** Manifest must carry policy bundle hash and graph revision hash (DSSE references). Replay fails closed on mismatch. Controller status surfaces hashes and drift seconds.
- **AG8 Tenant/env scoping:** Manifest includes `tenant`, `environment`; importer enforces equality and tenant-scoped storage paths. DSSE annotations must carry tenant/env; reject mismatches.
- **AG9 Ingress/egress audit trail:** Add signed ingress/egress receipts (`ingress_receipt.dsse`, `egress_receipt.dsse`) capturing kit hash, operator ID, decision, timestamp. Store in Proof Graph (or local CAS mirror when offline).
- **AG10 Replay validation depth:** Define levels: `hash-only`, `recompute`, `recompute+policy-freeze`. Manifest states required level; replay script enforces and emits evidence bundle (`replay_evidence.dsse`) with success criteria.
- **AG11 Observability in air-gap:** Provide OTLP-to-file/SQLite exporter in kit; default retention 7d/5GiB cap; redaction allowlist documented. No external sinks. Controller/Importer log to local file + optional JSON lines.
- **AG12 Operational runbooks:** Add `docs/airgap/runbooks/` covering: signature failure, missing gateway headers, stale mirror, policy mismatch, chunk verification failure. Include required approvals and fail-closed guidance.
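The per-chunk verification required by AG5 can be sketched as a small shell loop (a hypothetical core for `scripts/verify-kit.sh`; the `<sha256>  <path>` manifest layout is an assumption, not a committed format):

```shell
# Hypothetical core of scripts/verify-kit.sh: fail closed when a chunk is
# missing or its SHA256 differs from the manifest ("<sha256>  <path>" lines).
verify_chunks() {
  manifest="$1"
  while read -r want path; do
    [ -n "$path" ] || continue                               # skip blank lines
    [ -f "$path" ] || { echo "missing chunk: $path" >&2; return 1; }
    got=$(sha256sum "$path" | cut -d' ' -f1)
    [ "$got" = "$want" ] || { echo "hash mismatch: $path" >&2; return 1; }
  done < "$manifest"
}
```

The real script would additionally stream-decompress the `zstd` chunks and enforce the 256 MiB per-chunk and 10 GiB kit limits before accepting the kit.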
## Files to update (next steps)
- Offline kit manifest schema (`docs/airgap/offline-kit-manifest.schema.json`, new) with fields above.
- Runner scripts: `scripts/verify-kit.sh`, `scripts/replay-kit.sh` (enforce hash/tool checks, replay levels).
- Add AV/YARA guidance to `docs/airgap/offline-kit/README.md` and integrate into CI.
- Update controller/importer status APIs to surface policy/graph hash and scan results.
- Add ingress/egress receipt DSSE templates (`docs/airgap/templates/receipt.ingress.json`).
## Owners & timelines
- Schema & manifest updates: AirGap Importer Guild (due 2025-12-05).
- Key custody/rotation doc + dual-sign flows: Authority Guild (due 2025-12-06).
- Mirror/feeds/tool hashing + scripts: DevOps Guild (due 2025-12-06).
- Runbooks + observability defaults: Ops Guild (due 2025-12-07).
## Acceptance
- All new schema fields documented with examples; DSSE signatures validated in CI.
- Replay and verify scripts fail-closed on mismatch/staleness; tests cover chunking and hash drift.
- Ingress/egress receipts produced during CI dry-run and verified against Proof Graph mirror.

View File

@@ -0,0 +1,384 @@
# VEX Signature Verification: Offline Mode
**Sprint:** SPRINT_1227_0004_0001_BE_signature_verification
**Task:** T11 - Document offline mode with bundled trust anchors
**Date:** 2025-12-28
---
## Overview
This document describes how to configure VEX signature verification for air-gapped (offline) deployments where network access to public trust infrastructure (Sigstore, Fulcio, Rekor) is unavailable.
---
## Offline Mode Architecture
```
┌─────────────────────────────────────────────────────────────┐
│ Air-Gapped Environment │
│ │
│ ┌───────────────┐ ┌────────────────────────────────┐ │
│ │ VEX Documents │────▶│ ProductionVexSignatureVerifier │ │
│ └───────────────┘ └────────────────────────────────┘ │
│ │ │
│ ┌──────────────┴────────────────┐ │
│ ▼ ▼ │
│ ┌─────────────────────────┐ ┌─────────────────────┐ │
│ │ Bundled Trust Anchors │ │ Bundled Issuer Dir │ │
│ │ /var/stellaops/trust/ │ │ /var/stellaops/ │ │
│ │ ├── fulcio-root.pem │ │ bundles/issuers.json│ │
│ │ ├── sigstore-root.pem │ └─────────────────────┘ │
│ │ └── internal-ca.pem │ │
│ └─────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
```
---
## Configuration
### 1. Enable Offline Mode
**File:** `etc/excititor.yaml`
```yaml
VexSignatureVerification:
Enabled: true
DefaultProfile: "world"
OfflineMode: true # Critical: Enable offline verification
# Offline-specific settings
OfflineBundle:
Enabled: true
BundlePath: "/var/stellaops/bundles"
RefreshOnStartup: false
# Trust anchors for signature verification
TrustAnchors:
Fulcio:
- "/var/stellaops/trust/fulcio-root.pem"
- "/var/stellaops/trust/fulcio-intermediate.pem"
Sigstore:
- "/var/stellaops/trust/sigstore-root.pem"
Internal:
- "/var/stellaops/trust/internal-ca.pem"
- "/var/stellaops/trust/internal-intermediate.pem"
# IssuerDirectory in offline mode
IssuerDirectory:
OfflineBundle: "/var/stellaops/bundles/issuers.json"
FallbackToBundle: true
# ServiceUrl not needed in offline mode
```
### 2. Directory Structure
```
/var/stellaops/
├── bundles/
│ ├── issuers.json # Issuer directory bundle
│ ├── revocations.json # Key revocation data
│ └── tuf-metadata/ # TUF metadata for updates
│ ├── root.json
│ ├── targets.json
│ └── snapshot.json
├── trust/
│ ├── fulcio-root.pem # Sigstore Fulcio root CA
│ ├── fulcio-intermediate.pem
│ ├── sigstore-root.pem # Sigstore root
│ ├── rekor-pubkey.pem # Rekor public key
│ ├── internal-ca.pem # Internal enterprise CA
│ └── internal-intermediate.pem
└── cache/
└── verification-cache.db # Local verification cache
```
---
## Bundle Preparation
### 1. Download Trust Anchors
Run this on a connected machine to prepare the bundle:
```bash
#!/bin/bash
# prepare-offline-bundle.sh
BUNDLE_DIR="./offline-bundle"
mkdir -p "$BUNDLE_DIR/trust" "$BUNDLE_DIR/bundles"
# Download Sigstore trust anchors
echo "Downloading Sigstore trust anchors..."
curl -sSL https://fulcio.sigstore.dev/api/v2/trustBundle \
-o "$BUNDLE_DIR/trust/fulcio-root.pem"
curl -sSL https://rekor.sigstore.dev/api/v1/log/publicKey \
-o "$BUNDLE_DIR/trust/rekor-pubkey.pem"
# Download TUF metadata
echo "Downloading TUF metadata..."
cosign initialize --mirror=https://tuf-repo.sigstore.dev \
--root="$BUNDLE_DIR/bundles/tuf-metadata"
# Export issuer directory
echo "Exporting issuer directory..."
stellaops-cli issuer-directory export \
--format json \
--output "$BUNDLE_DIR/bundles/issuers.json"
# Export revocation data
echo "Exporting revocation data..."
stellaops-cli revocations export \
--format json \
--output "$BUNDLE_DIR/bundles/revocations.json"
# Create manifest
echo "Creating bundle manifest..."
cat > "$BUNDLE_DIR/manifest.json" <<EOF
{
"version": "1.0.0",
"createdAt": "$(date -u +%Y-%m-%dT%H:%M:%SZ)",
"expiresAt": "$(date -u -d '+90 days' +%Y-%m-%dT%H:%M:%SZ)",
"contents": {
"trustAnchors": ["fulcio-root.pem", "rekor-pubkey.pem"],
"bundles": ["issuers.json", "revocations.json"],
"tufMetadata": true
},
"checksum": "$(find "$BUNDLE_DIR" -type f ! -name manifest.json -exec sha256sum {} \; | sort -k2 | sha256sum | cut -d' ' -f1)"
}
EOF
# Package bundle
echo "Creating tarball..."
tar -czvf "stellaops-trust-bundle-$(date +%Y%m%d).tar.gz" -C "$BUNDLE_DIR" .
echo "Bundle ready: stellaops-trust-bundle-$(date +%Y%m%d).tar.gz"
```
### 2. Transfer to Air-Gapped Environment
```bash
# On air-gapped machine
sudo mkdir -p /var/stellaops/{trust,bundles,cache}
sudo tar -xzvf stellaops-trust-bundle-20250128.tar.gz -C /var/stellaops/
# Verify bundle integrity
stellaops-cli bundle verify /var/stellaops/manifest.json
```
---
## Issuer Directory Bundle Format
**File:** `/var/stellaops/bundles/issuers.json`
```json
{
"version": "1.0.0",
"exportedAt": "2025-01-28T10:30:00Z",
"issuers": [
{
"id": "redhat-security",
"name": "Red Hat Product Security",
"description": "Official Red Hat security advisories",
"jurisdiction": "us",
"trustLevel": "high",
"keys": [
{
"keyId": "rh-vex-signing-key-2024",
"algorithm": "ECDSA-P256",
"publicKey": "-----BEGIN PUBLIC KEY-----\nMFkwEwYHKoZIzj0...\n-----END PUBLIC KEY-----",
"notBefore": "2024-01-01T00:00:00Z",
"notAfter": "2026-01-01T00:00:00Z",
"revoked": false
}
],
"csafPublisher": {
"providerMetadataUrl": "https://access.redhat.com/.well-known/csaf/provider-metadata.json",
"tlpWhite": true
}
},
{
"id": "internal-security",
"name": "Internal Security Team",
"description": "Internal VEX attestations",
"jurisdiction": "internal",
"trustLevel": "high",
"keys": [
{
"keyId": "internal-vex-key-001",
"algorithm": "Ed25519",
"publicKey": "-----BEGIN PUBLIC KEY-----\nMCowBQYDK2VwAyEA...\n-----END PUBLIC KEY-----",
"notBefore": "2024-06-01T00:00:00Z",
"notAfter": "2025-06-01T00:00:00Z",
"revoked": false
}
]
}
],
"revokedKeys": [
{
"keyId": "old-compromised-key",
"revokedAt": "2024-03-15T00:00:00Z",
"reason": "key_compromise"
}
]
}
```
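A bundle in this shape can be audited for unusable keys with `jq` (a sketch; the field names follow the example document above, and the helper name and output format are illustrative):

```shell
# List keys that are revoked or past their notAfter validity window,
# comparing ISO-8601 UTC timestamps lexically against the current time.
audit_keys() {
  file="${1:-/var/stellaops/bundles/issuers.json}"
  now=$(date -u +%Y-%m-%dT%H:%M:%SZ)
  jq -r --arg now "$now" '
    .issuers[] as $i | $i.keys[]
    | select(.revoked or .notAfter < $now)
    | "\($i.id)/\(.keyId) invalid (revoked=\(.revoked), notAfter=\(.notAfter))"
  ' "$file"
}
```

Running this after each bundle import gives an early warning before a key expiry turns into verification failures.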
---
## Verification Behavior in Offline Mode
### Supported Verification Methods
| Method | Offline Support | Notes |
|--------|-----------------|-------|
| DSSE | Full | Uses bundled keys |
| PGP | Full | Uses bundled keyrings |
| X.509 | Partial | Requires bundled CA chain |
| Cosign (keyed) | Full | Uses bundled public keys |
| Cosign (keyless) | Limited | Requires bundled Fulcio root |
| Rekor Verification | No | Transparency log unavailable |
### Fallback Behavior
```yaml
VexSignatureVerification:
OfflineFallback:
# When Rekor is unavailable
SkipRekorVerification: true
WarnOnMissingTransparency: true
# When issuer key not in bundle
UnknownIssuerAction: "warn" # warn | block | allow
# When certificate chain incomplete
IncompleteChainAction: "warn"
```
### Verification Result Fields
```json
{
"verified": true,
"method": "dsse",
"mode": "offline",
"warnings": [
"transparency_log_skipped"
],
"issuerName": "Red Hat Product Security",
"keyId": "rh-vex-signing-key-2024",
"bundleVersion": "2025.01.28",
"bundleAge": "P3D"
}
```
---
## Bundle Updates
### Manual Update Process
1. **Export new bundle** on connected machine
2. **Transfer** via secure media (USB, CD)
3. **Verify** bundle signature on air-gapped machine
4. **Deploy** with rollback capability
```bash
# On air-gapped machine
cd /var/stellaops
# Backup current bundle
sudo cp -r bundles bundles.backup-$(date +%Y%m%d)
# Deploy new bundle
sudo mkdir -p /tmp/new-bundle && sudo tar -xzvf new-bundle.tar.gz -C /tmp/new-bundle
sudo stellaops-cli bundle verify /tmp/new-bundle/manifest.json
# Apply with verification
sudo stellaops-cli bundle apply /tmp/new-bundle --verify
sudo systemctl restart stellaops-excititor
# Rollback if needed
# sudo stellaops-cli bundle rollback --to bundles.backup-20250115
```
### Recommended Update Frequency
| Component | Recommended Frequency | Criticality |
|-----------|----------------------|-------------|
| Trust anchors | Quarterly | High |
| Issuer directory | Monthly | Medium |
| Revocation data | Weekly | Critical |
| TUF metadata | Monthly | Medium |
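To act on these cadences, the bundle's remaining lifetime can be computed from the manifest's `expiresAt` field (a sketch assuming GNU `date` and `jq`; wiring the result into the `stellaops_trust_bundle_expiry_days` metric is left to the exporter):

```shell
# Days until the trust bundle expires, per the manifest's expiresAt field.
bundle_expiry_days() {
  exp=$(jq -r '.expiresAt' "${1:-/var/stellaops/manifest.json}")
  echo $(( ( $(date -ud "$exp" +%s) - $(date -u +%s) ) / 86400 ))
}
```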
---
## Monitoring and Alerts
### Bundle Expiration Warning
```yaml
# prometheus-alerts.yaml
groups:
- name: stellaops-verification
rules:
- alert: TrustBundleExpiringSoon
expr: stellaops_trust_bundle_expiry_days < 30
for: 1h
labels:
severity: warning
annotations:
summary: "Trust bundle expires in {{ $value }} days"
- alert: TrustBundleExpired
expr: stellaops_trust_bundle_expiry_days <= 0
for: 5m
labels:
severity: critical
annotations:
summary: "Trust bundle has expired - verification may fail"
```
### Metrics
| Metric | Description |
|--------|-------------|
| `stellaops_trust_bundle_expiry_days` | Days until bundle expiration |
| `stellaops_verification_offline_mode` | 1 if running in offline mode |
| `stellaops_verification_bundle_key_count` | Number of issuer keys in bundle |
| `stellaops_verification_revoked_key_count` | Number of revoked keys |
---
## Troubleshooting
### Common Issues
1. **"Unknown issuer" for known vendor**
- Update issuer directory bundle
- Add vendor's keys to bundle
2. **"Expired certificate" for recent VEX**
- Certificate may have rotated after bundle export
- Update trust anchors bundle
3. **"Chain validation failed"**
- Missing intermediate certificate
- Add intermediate to bundle
4. **Stale revocation data**
- Key may be compromised but bundle doesn't know
- Update revocation bundle urgently
---
## See Also
- [VEX Signature Verification Configuration](../operations/vex-verification-config.md)
- [Air-Gap Deployment Guide](../airgap/deployment-guide.md)
- [TUF Repository Management](../operations/tuf-repository.md)

View File

@@ -0,0 +1,339 @@
# Offline and Air-Gap Advisory Implementation Roadmap
**Source Advisory:** 14-Dec-2025 - Offline and Air-Gap Technical Reference
**Document Version:** 1.0
**Last Updated:** 2025-12-15
---
## Executive Summary
This document outlines the implementation roadmap for gaps identified between the 14-Dec-2025 Offline and Air-Gap Technical Reference advisory and the current StellaOps codebase. The implementation is organized into 5 sprints addressing security-critical, high-priority, and enhancement-level improvements.
---
## Implementation Overview
### Sprint Summary
| Sprint | Topic | Priority | Gaps | Effort | Dependencies |
|--------|-------|----------|------|--------|--------------|
| [0338](../implplan/SPRINT_0338_0001_0001_airgap_importer_core.md) | AirGap Importer Core | P0 | G6, G7 | Medium | None |
| [0339](../implplan/SPRINT_0339_0001_0001_cli_offline_commands.md) | CLI Offline Commands | P1 | G4 | Medium | 0338 |
| [0340](../implplan/SPRINT_0340_0001_0001_scanner_offline_config.md) | Scanner Offline Config | P2 | G5 | Medium | 0338 |
| [0341](../implplan/SPRINT_0341_0001_0001_observability_audit.md) | Observability & Audit | P1-P2 | G11-G14 | Medium | 0338 |
| [0342](../implplan/SPRINT_0342_0001_0001_evidence_reconciliation.md) | Evidence Reconciliation | P3 | G10 | High | 0338, 0340 |
### Dependency Graph
```
┌─────────────────────────────────────────────┐
│ │
│ Sprint 0338: AirGap Importer Core (P0) │
│ - Monotonicity enforcement (G6) │
│ - Quarantine handling (G7) │
│ │
└──────────────────┬──────────────────────────┘
┌─────────────────────┼─────────────────────┐
│ │ │
▼ ▼ ▼
┌────────────────┐ ┌────────────────┐ ┌────────────────┐
│ Sprint 0339 │ │ Sprint 0340 │ │ Sprint 0341 │
│ CLI Commands │ │ Scanner Config │ │ Observability │
│ (P1) │ │ (P2) │ │ (P1-P2) │
│ - G4 │ │ - G5 │ │ - G11-G14 │
└────────────────┘ └───────┬────────┘ └────────────────┘
┌────────────────┐
│ Sprint 0342 │
│ Evidence Recon │
│ (P3) │
│ - G10 │
└────────────────┘
```
---
## Gap-to-Sprint Mapping
### P0 - Critical (Must Implement First)
| Gap ID | Description | Sprint | Rationale |
|--------|-------------|--------|-----------|
| **G6** | Monotonicity enforcement | 0338 | Rollback prevention is security-critical; prevents replay attacks |
| **G7** | Quarantine directory handling | 0338 | Essential for forensic analysis of failed imports |
### P1 - High Priority
| Gap ID | Description | Sprint | Rationale |
|--------|-------------|--------|-----------|
| **G4** | CLI `offline` command group | 0339 | Primary operator interface; competitive parity |
| **G11** | Prometheus metrics | 0341 | Operational visibility in air-gap environments |
| **G13** | Error reason codes | 0341 | Automation and troubleshooting |
### P2 - Important
| Gap ID | Description | Sprint | Rationale |
|--------|-------------|--------|-----------|
| **G5** | Scanner offline config surface | 0340 | Enterprise trust anchor management |
| **G12** | Structured logging fields | 0341 | Log aggregation and correlation |
| **G14** | Audit schema enhancement | 0341 | Compliance and chain-of-custody |
### P3 - Lower Priority
| Gap ID | Description | Sprint | Rationale |
|--------|-------------|--------|-----------|
| **G10** | Evidence reconciliation algorithm | 0342 | Complex but valuable; VEX-first decisioning |
### Deferred (Not Implementing)
| Gap ID | Description | Rationale |
|--------|-------------|-----------|
| **G9** | YAML verification policy schema | Over-engineering; existing JSON/code config sufficient |
---
## Technical Architecture
### New Components
```
src/AirGap/
├── StellaOps.AirGap.Importer/
│ ├── Versioning/
│ │ ├── BundleVersion.cs # Sprint 0338
│ │ ├── IVersionMonotonicityChecker.cs # Sprint 0338
│ │ └── IBundleVersionStore.cs # Sprint 0338
│ ├── Quarantine/
│ │ ├── IQuarantineService.cs # Sprint 0338
│ │ ├── FileSystemQuarantineService.cs # Sprint 0338
│ │ └── QuarantineOptions.cs # Sprint 0338
│ ├── Telemetry/
│ │ ├── OfflineKitMetrics.cs # Sprint 0341
│ │ ├── OfflineKitLogFields.cs # Sprint 0341
│ │ └── OfflineKitLogScopes.cs # Sprint 0341
│ ├── Reconciliation/
│ │ ├── ArtifactIndex.cs # Sprint 0342
│ │ ├── EvidenceCollector.cs # Sprint 0342
│ │ ├── DocumentNormalizer.cs # Sprint 0342
│ │ ├── PrecedenceLattice.cs # Sprint 0342
│ │ └── EvidenceGraphEmitter.cs # Sprint 0342
src/Scanner/
├── __Libraries/StellaOps.Scanner.Core/
│   ├── Configuration/
│   │   ├── OfflineKitOptions.cs               # Sprint 0340
│   │   ├── TrustAnchorConfig.cs               # Sprint 0340
│   │   └── OfflineKitOptionsValidator.cs      # Sprint 0340
│   └── TrustAnchors/
│       ├── PurlPatternMatcher.cs              # Sprint 0340
│       ├── ITrustAnchorRegistry.cs            # Sprint 0340
│       └── TrustAnchorRegistry.cs             # Sprint 0340
src/Cli/
├── StellaOps.Cli/
│   ├── Commands/
│   │   ├── Offline/
│   │   │   ├── OfflineCommandGroup.cs         # Sprint 0339
│   │   │   ├── OfflineImportHandler.cs        # Sprint 0339
│   │   │   ├── OfflineStatusHandler.cs        # Sprint 0339
│   │   │   └── OfflineExitCodes.cs            # Sprint 0339
│   │   └── Verify/
│   │       └── VerifyOfflineHandler.cs        # Sprint 0339
│   └── Output/
│       └── OfflineKitReasonCodes.cs           # Sprint 0341
src/Authority/
├── __Libraries/StellaOps.Authority.Storage.Postgres/
│   └── Migrations/
│       └── 004_offline_kit_audit.sql          # Sprint 0341
```
### Database Changes
| Table | Schema | Sprint | Purpose |
|-------|--------|--------|---------|
| `airgap.bundle_versions` | New | 0338 | Track active bundle versions per tenant/type |
| `airgap.bundle_version_history` | New | 0338 | Version history for audit trail |
| `authority.offline_kit_audit` | New | 0341 | Enhanced audit with Rekor/DSSE fields |
### Configuration Changes
| Section | Sprint | Fields |
|---------|--------|--------|
| `AirGap:Quarantine` | 0338 | `QuarantineRoot`, `RetentionPeriod`, `MaxQuarantineSizeBytes` |
| `Scanner:OfflineKit` | 0340 | `RequireDsse`, `RekorOfflineMode`, `TrustAnchors[]` |
### CLI Commands
| Command | Sprint | Description |
|---------|--------|-------------|
| `stellaops offline import` | 0339 | Import offline kit with verification |
| `stellaops offline status` | 0339 | Display current kit status |
| `stellaops verify offline` | 0339 | Offline evidence verification |
### Metrics
| Metric | Type | Sprint | Labels |
|--------|------|--------|--------|
| `offlinekit_import_total` | Counter | 0341 | `status`, `tenant_id` |
| `offlinekit_attestation_verify_latency_seconds` | Histogram | 0341 | `attestation_type`, `success` |
| `attestor_rekor_success_total` | Counter | 0341 | `mode` |
| `attestor_rekor_retry_total` | Counter | 0341 | `reason` |
| `rekor_inclusion_latency` | Histogram | 0341 | `success` |
---
## Implementation Sequence
### Phase 1: Foundation (Sprint 0338)
**Duration:** 1 sprint
**Focus:** Security-critical infrastructure
1. Implement `BundleVersion` model with semver parsing
2. Create `IVersionMonotonicityChecker` and Postgres store
3. Integrate monotonicity check into `ImportValidator`
4. Implement `--force-activate` with audit trail
5. Create `IQuarantineService` and file-system implementation
6. Integrate quarantine into all import failure paths
7. Write comprehensive tests
**Exit Criteria:**
- [ ] Rollback attacks are prevented
- [ ] Failed bundles are preserved for investigation
- [ ] Force activation requires justification
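The monotonicity rule can be illustrated with a plain version comparison (a shell sketch using `sort -V`; the production check lives in the proposed `IVersionMonotonicityChecker` against the Postgres store):

```shell
# Accept a candidate bundle only if its version strictly exceeds the active
# one; equal or older versions are rejected as potential rollback attempts.
is_monotonic() {
  active="$1"; candidate="$2"
  [ "$active" = "$candidate" ] && return 1
  [ "$(printf '%s\n%s\n' "$active" "$candidate" | sort -V | tail -n1)" = "$candidate" ]
}
```

`--force-activate` would bypass this check but, per the sprint scope, only with an audited justification.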
### Phase 2: Operator Experience (Sprints 0339, 0341)
**Duration:** 1-2 sprints (can parallelize)
**Focus:** CLI and observability
**Sprint 0339 (CLI):**
1. Create `offline` command group
2. Implement `offline import` with all flags
3. Implement `offline status` with output formats
4. Implement `verify offline` with policy loading
5. Add exit code standardization
6. Write CLI integration tests
**Sprint 0341 (Observability):**
1. Add Prometheus metrics infrastructure
2. Implement offline kit metrics
3. Standardize structured logging fields
4. Complete error reason codes
5. Create audit schema migration
6. Implement audit repository and emitter
7. Create Grafana dashboard
> Blockers: Prometheus `/metrics` endpoint hosting and audit emitter call-sites await an owning Offline Kit import/activation flow (`POST /api/offline-kit/import`).
**Exit Criteria:**
- [ ] Operators can import/verify kits via CLI
- [ ] Metrics are visible in Prometheus/Grafana
- [ ] All operations are auditable
### Phase 3: Configuration (Sprint 0340)
**Duration:** 1 sprint
**Focus:** Trust anchor management
1. Create `OfflineKitOptions` configuration class
2. Implement PURL pattern matcher
3. Create `TrustAnchorRegistry` with precedence resolution
4. Add options validation
5. Integrate trust anchors with DSSE verification
6. Update Helm chart values
7. Write configuration tests
**Exit Criteria:**
- [ ] Trust anchors configurable per ecosystem
- [ ] DSSE verification uses configured anchors
- [ ] Invalid configuration fails startup
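Trust-anchor scoping by PURL pattern (step 2) can be approximated with shell globs (a sketch only; the concrete pattern grammar of `PurlPatternMatcher` is an assumption):

```shell
# Return success if the purl matches any of the given glob patterns;
# first match wins, mirroring a precedence-ordered anchor list.
matches_purl() {
  purl="$1"; shift
  for pat in "$@"; do
    # $pat left unquoted on purpose so its glob characters stay active
    case "$purl" in $pat) return 0 ;; esac
  done
  return 1
}
```

In the registry this resolution would pick which configured trust anchor's keys are consulted during DSSE verification for a given artifact.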
### Phase 4: Advanced Features (Sprint 0342)
**Duration:** 1-2 sprints
**Focus:** Evidence reconciliation
1. Design artifact indexing
2. Implement evidence collection
3. Create document normalization
4. Implement VEX precedence lattice
5. Create evidence graph emitter
6. Integrate with CLI `verify offline`
7. Write golden-file determinism tests
**Exit Criteria:**
- [ ] Evidence reconciliation is deterministic
- [ ] VEX conflicts resolved by precedence
- [ ] Graph output is signed and verifiable
---
## Testing Strategy
### Unit Tests
- All new classes have corresponding test classes
- Mock dependencies for isolation
- Property-based tests for lattice operations
### Integration Tests
- Testcontainers for PostgreSQL
- Full import → verification → audit flow
- CLI command execution tests
### Determinism Tests
- Golden-file tests for evidence reconciliation
- Cross-platform validation (Windows, Linux, macOS)
- Reproducibility across runs
### Security Tests
- Monotonicity bypass attempts
- Signature verification edge cases
- Trust anchor configuration validation
---
## Documentation Updates
| Document | Sprint | Updates |
|----------|--------|---------|
| `docs/airgap/importer.md` | 0338 | Monotonicity + quarantine reference |
| `docs/airgap/runbooks/quarantine-investigation.md` | 0338 | New runbook |
| `docs/modules/cli/commands/offline.md` | 0339 | New command reference |
| `docs/modules/cli/guides/airgap.md` | 0339 | Update with CLI examples |
| `docs/modules/scanner/configuration.md` | 0340 | Add offline kit config section |
| `docs/airgap/observability.md` | 0341 | Metrics and logging reference |
| `docs/airgap/evidence-reconciliation.md` | 0342 | Algorithm documentation |
---
## Risk Register
| Risk | Impact | Mitigation |
|------|--------|------------|
| Monotonicity breaks existing workflows | High | Provide `--force-activate` escape hatch |
| Quarantine disk exhaustion | Medium | Implement quota and TTL cleanup |
| Trust anchor config complexity | Medium | Provide sensible defaults, validate at startup |
| Evidence reconciliation performance | Medium | Streaming processing, caching |
| Cross-platform determinism failures | High | CI matrix, golden-file tests |
---
## Success Metrics
| Metric | Target | Sprint |
|--------|--------|--------|
| Rollback attack prevention | 100% | 0338 |
| Failed bundle quarantine rate | 100% | 0338 |
| CLI command adoption | 50% operators | 0339 |
| Metric collection uptime | 99.9% | 0341 |
| Audit completeness | 100% events | 0341 |
| Reconciliation determinism | 100% | 0342 |
---
## References
- [14-Dec-2025 Offline and Air-Gap Technical Reference](../product-advisories/14-Dec-2025%20-%20Offline%20and%20Air-Gap%20Technical%20Reference.md)
- [Air-Gap Mode Playbook](./airgap-mode.md)
- [Offline Kit Documentation](../OFFLINE_KIT.md)
- [Importer](./importer.md)

View File

@@ -0,0 +1,77 @@
# Air-Gapped Mode Playbook
> Changes of this kind made to this component must also be applied to every other component where they are applicable.
## Overview
Air-Gapped Mode is the supported operating profile for deployments with **zero external egress**. All inputs arrive via signed mirror bundles, and every surface (CLI, Console, APIs, schedulers, scanners) operates under sealed-network constraints while preserving Aggregation-Only Contract invariants.
- **Primary components:** Web Services API, Console, CLI, Orchestrator, Task Runner, Concelier (formerly Feedser), Excititor (formerly Vexer), Policy Engine, Findings Ledger, Export Center, Authority & Tenancy, Notifications, Observability & Forensics.
- **Surfaces:** offline bootstrap, mirror ingestion, deterministic jobs, offline advisories/VEX/policy packs/notifications, evidence exports.
- **Dependencies:** Export Center, Containerized Distribution, Authority-backed scopes & tenancy, Observability & Forensics, Policy Studio.
## Guiding principles
1. **Zero egress:** all outbound network calls are disabled unless explicitly allowed. Any feature requiring online data must degrade gracefully with clear UX messaging.
2. **Deterministic inputs:** the platform accepts only signed Mirror Bundles (advisories, VEX, policy packs, vendor feeds, images, dashboards). Bundles carry provenance attestations and chain-of-custody manifests.
3. **Auditable exchange:** every import/export records provenance, signatures, and operator identity. Evidence bundles and reports remain verifiable offline.
4. **Aggregation-Only Contract compliance:** Concelier and Excititor continue to aggregate without mutating source records, even when ingesting mirrored feeds.
5. **Operator ergonomics:** offline bootstrap, upgrade, and verification steps are reproducible and scripted.
## Lifecycle & modes
| Mode | Description | Tooling |
| --- | --- | --- |
| Connected | Standard deployment with online feeds. Operators use Export Center to build mirror bundles for offline environments. | `stella export bundle create --profile mirror:full` |
| Staging mirror | Sealed host that fetches upstream feeds, runs validation, and signs mirror bundles. | Export Center, cosign, bundle validation scripts |
| Air-gapped | Production cluster with egress sealed, consuming validated bundles, issuing provenance for inward/outward transfers. | Mirror import CLI, sealed-mode runtime flags |
### Installation & bootstrap
1. Prepare mirror bundles (images, charts, advisories/VEX, policy packs, dashboards, telemetry configs).
2. Transfer bundles via approved media and validate signatures (`cosign verify`, bundle manifest hash).
3. Deploy platform using offline artefacts (`helm install --set airgap.enabled=true`), referencing local registry/object storage.
### Updates
1. Staging host generates incremental bundles (mirror delta) with provenance.
2. Offline site imports bundles via the CLI (`stella airgap import --bundle`) and records chain-of-custody.
3. Scheduler triggers replay jobs with deterministic timelines; results remain reproducible across imports.
## Component responsibilities
| Component | Offline duties |
| --- | --- |
| Export Center | Produce full/delta mirror bundles, signed manifests, provenance attestations. |
| Authority & Tenancy | Provide offline scope enforcement, short-lived tokens, revocation via local CRLs. |
| Concelier / Excititor | Ingest mirrored advisories/VEX, enforce AOC, versioned observations. |
| Policy Engine & Findings Ledger | Replay evaluations using offline feeds, emit explain traces, support sealed-mode hints. |
| Notifications | Deliver locally via approved channels (email relay, webhook proxies) or queue for manual export. |
| Observability | Collect metrics/logs/traces locally, generate forensic bundles for external analysis. |
## Operational guardrails
- **Network policy:** enforce allowlists (`airgap.egressAllowlist=[]`). Any unexpected outbound request raises an alert.
- **Bundle validation:** double-sign manifests (bundle signer + site-specific cosign key); reject on mismatch.
- **Time synchronization:** rely on local NTP or manual clock audits; signature and certificate validity checks depend on accurate, monotonically advancing clocks.
- **Key rotation:** plan for offline key ceremonies; Export Center and Authority document rotation playbooks.
- **Authority scopes:** enforce `airgap:status:read`, `airgap:import`, and `airgap:seal` via tenant-scoped roles; require operator reason/ticket metadata for sealing.
- **AirGap controller API:** requires tenant identity (`x-tenant-id` header or tenant claim) plus the matching scope; requests without tenant context are rejected.
- **Incident response:** maintain scripts for replaying imports, regenerating manifests, and exporting forensic data without egress.
- **EgressPolicy facade:** all services route outbound calls through `StellaOps.AirGap.Policy`. In sealed mode `EgressPolicy` enforces the `airgap.egressAllowlist`, auto-permits loopback targets, and raises `AIRGAP_EGRESS_BLOCKED` exceptions with remediation text (add host to allowlist or coordinate break-glass). Unsealed mode logs intents but does not block, giving operators a single toggle for rehearsals. Task Runner now feeds every `run.egress` declaration and runtime network hint into the shared policy during planning, preventing sealed-mode packs from executing unless destinations are declared and allow-listed.
- **CLI guard:** the CLI now routes outbound HTTP through the shared egress policy. When sealed, commands that would dial external endpoints (for example, `scanner download` or remote `sources ingest` URIs) are refused with `AIRGAP_EGRESS_BLOCKED` messaging and remediation guidance instead of attempting the network call.
- **Observability exporters:** `StellaOps.Telemetry.Core` now binds OTLP exporters to the configured egress policy. When sealed, any collector endpoint that is not loopback or allow-listed is skipped at startup and a structured warning is written so operators see the remediation guidance without leaving sealed mode.
- **Linting/CI:** enable the `StellaOps.AirGap.Policy.Analyzers` package in solution-level analyzers so CI fails on raw `HttpClient` usage. The analyzer emits `AIRGAP001` and the bundled code fix rewrites to `EgressHttpClientFactory.Create(...)`; treat analyzer warnings as errors in sealed-mode pipelines.
## Testing & verification
- Integration tests mimic offline installs by running with `AIRGAP_ENABLED=true` in CI.
- Mirror bundles include validation scripts to compare hash manifests across staging and production.
- Sealed-mode smoke tests ensure services fail closed when attempting egress.
## References
- Export workflows: `docs/modules/export-center/overview.md`
- Policy sealed-mode hints: `docs/modules/policy/guides/overview.md`
- Observability forensic bundles: `docs/modules/telemetry/architecture.md`
- Runtime posture enforcement: `docs/modules/zastava/operations/runtime.md`

# Bootstrap Pack (Airgap 56-004)
Guidance to build and install the bootstrap pack that primes sealed environments.
## Contents
- Core images/charts for platform services (Authority, Excititor, Concelier, Export Center, Scheduler) with digests.
- Offline NuGet/npm caches (if permitted) with checksum manifest.
- Configuration defaults: sealed-mode toggles, trust roots, time-anchor bundle, network policy presets.
- Verification scripts: hash check, DSSE verification (if available), and connectivity probes to local mirrors.
## Build steps
1. Gather image digests and charts from trusted registry/mirror.
2. Create `bootstrap-manifest.json` with:
- `bundleId`, `createdAt` (UTC), `producer`, `mirrorGeneration`
- `files[]` (path, sha256, size, mediaType)
- optional `dsseEnvelopeHash`
3. Package into tarball with deterministic ordering (POSIX tar, sorted paths, numeric owner 0:0).
4. Compute sha256 for tarball; record in manifest.
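Steps 3–4 can be sketched with GNU tar; the `--pax-option` override drops per-run `atime`/`ctime` extended headers so repeated runs produce byte-identical archives (paths and the helper name are placeholders):

```bash
set -euo pipefail

# Build a byte-identical tarball: sorted member order, numeric 0:0 owner,
# fixed mtime, and no volatile pax headers. Requires GNU tar.
make_deterministic_tar() {
  local pack_dir="$1" out="$2"
  tar --sort=name \
      --owner=0 --group=0 --numeric-owner \
      --mtime='UTC 2025-01-01' \
      --format=posix \
      --pax-option='exthdr.name=%d/PaxHeaders/%f,delete=atime,delete=ctime' \
      -cf "${out}" -C "$(dirname "${pack_dir}")" "$(basename "${pack_dir}")"
  sha256sum "${out}" | awk '{print $1}'
}
```

Running the function twice over the same pack directory must print the same digest; that digest is what gets recorded in `bootstrap-manifest.json`.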
## Install steps
1. Transfer pack to sealed site (removable media).
2. Verify tarball hash and DSSE (if present) using offline trust roots.
3. Load images/charts into local registry; preload caches to `local-nugets/` etc.
4. Apply network policies (deny-all) and sealed-mode config.
5. Register bootstrap manifest and mirrorGeneration with Excititor/Export Center.
## Determinism & rollback
- Keep manifests in ISO-8601 UTC; no host-specific metadata in tar headers.
- For rollback, retain previous bootstrap tarball + manifest; restore registry contents and config snapshots.
## Related
- `docs/modules/airgap/guides/mirror-bundles.md` — mirror pack format and validation.
- `docs/modules/airgap/guides/sealing-and-egress.md` — egress enforcement used during install.

# Bundle Catalog & Items Repositories (prep for AIRGAP-IMP-57-001)
## Scope
- Deterministic storage for offline bundle metadata with tenant isolation (RLS) and stable ordering.
- Ready for PostgreSQL-backed implementation while providing in-memory deterministic reference behavior.
## Schema (logical)
- `bundle_catalog`:
- `tenant_id` (string, PK part, RLS partition)
- `bundle_id` (string, PK part)
- `digest` (hex string)
- `imported_at_utc` (datetime)
- `content_paths` (array of strings, sorted ordinal)
- `bundle_items`:
- `tenant_id` (string, PK part, RLS partition)
- `bundle_id` (string, PK part)
- `path` (string, PK part)
- `digest` (hex string)
- `size_bytes` (long)
## Implementation delivered (2025-11-20)
- In-memory repositories enforcing tenant isolation and deterministic ordering:
- `InMemoryBundleCatalogRepository` (upsert + list ordered by `bundle_id`).
- `InMemoryBundleItemRepository` (bulk upsert + list ordered by `path`).
- Models: `BundleCatalogEntry`, `BundleItem`.
- Tests cover upsert overwrite semantics, tenant isolation, and deterministic ordering (`tests/AirGap/StellaOps.AirGap.Importer.Tests/InMemoryBundleRepositoriesTests.cs`).
## Migration notes (for PostgreSQL backends)
- Create compound unique indexes on (`tenant_id`, `bundle_id`) for catalog; (`tenant_id`, `bundle_id`, `path`) for items.
- Enforce RLS by always scoping queries to `tenant_id` and validating it at repository boundary (as done in in-memory reference impl).
- Keep paths lowercased or use ordinal comparisons to avoid locale drift; sort before persistence to preserve determinism.
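A possible PostgreSQL sketch of the notes above (table and column names follow the logical schema; the RLS session variable `app.tenant_id` is an assumption, not a settled convention):

```sql
-- Sketch: compound primary keys double as the required unique indexes.
CREATE TABLE bundle_catalog (
  tenant_id       text        NOT NULL,
  bundle_id       text        NOT NULL,
  digest          text        NOT NULL,
  imported_at_utc timestamptz NOT NULL,
  content_paths   text[]      NOT NULL,
  PRIMARY KEY (tenant_id, bundle_id)
);

CREATE TABLE bundle_items (
  tenant_id  text   NOT NULL,
  bundle_id  text   NOT NULL,
  path       text   NOT NULL,
  digest     text   NOT NULL,
  size_bytes bigint NOT NULL,
  PRIMARY KEY (tenant_id, bundle_id, path)
);

-- Row-level security scoped to the tenant set on the session.
ALTER TABLE bundle_catalog ENABLE ROW LEVEL SECURITY;
ALTER TABLE bundle_items   ENABLE ROW LEVEL SECURITY;
CREATE POLICY tenant_isolation_catalog ON bundle_catalog
  USING (tenant_id = current_setting('app.tenant_id'));
CREATE POLICY tenant_isolation_items ON bundle_items
  USING (tenant_id = current_setting('app.tenant_id'));
```

Repositories should still validate `tenant_id` at their boundary, as the in-memory reference implementation does; RLS is defense in depth, not a replacement.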
## Next steps
- Implement PostgreSQL-backed repositories mirroring the deterministic behavior and indexes above.
- Wire repositories into importer service/CLI once storage provider is selected.
## Owners
- AirGap Importer Guild.

# Console Airgap Implementation Tasks (link to DOCS-AIRGAP-57-002)
- Implement sealed badge + staleness indicators using `staleness-and-time.md` rules.
- Hook import wizard to backend once mirror bundle schema and timeline event API are available.
- Ensure admin-only import; read-only view otherwise.
- Emit telemetry for imports (success/failure) and denied attempts.

# AirGap Controller
The AirGap Controller is the tenant-scoped state keeper for sealed-mode operation. It records whether an installation is sealed, what policy hash is active, which time anchor is in force, and what staleness budgets apply.
For workflow context, start at `docs/modules/airgap/guides/overview.md` and `docs/modules/airgap/guides/airgap-mode.md`.
## Responsibilities
- Maintain the current AirGap state per tenant (sealed/unsealed, policy hash, time anchor, staleness budgets).
- Provide a deterministic, auditable status snapshot for operators and automation.
- Enforce sealed/unsealed transitions via Authority scopes.
- Emit telemetry signals suitable for dashboards and forensics timelines.
Non-goals:
- Bundle signature validation and import staging (owned by the importer; see `docs/modules/airgap/guides/importer.md`).
- Cryptographic signing (Signer/Attestor).
## API
Base route group: `/system/airgap` (requires authorization).
### `GET /system/airgap/status`
Required scope: `airgap:status:read`
Response: `AirGapStatusResponse` (current state + staleness evaluation).
Notes:
- Tenant routing uses `x-tenant-id` (defaults to `default` if absent).
- `driftSeconds` and `secondsRemaining` are derived from the active time anchor and staleness budget evaluation.
- `contentStaleness` contains per-category staleness evaluations (clients should treat keys as case-insensitive).
### `POST /system/airgap/seal`
Required scope: `airgap:seal`
Body: `SealRequest`
- `policyHash` (required): binds the sealed state to a specific policy revision.
- `timeAnchor` (optional): time anchor record (from the AirGap Time service).
- `stalenessBudget` (optional): default staleness budget.
- `contentBudgets` (optional): per-category staleness budgets (e.g., `advisories`, `vex`, `scanner`).
Behavior:
- Rejects requests missing `policyHash` (`400 {"error": "policy_hash_required"}`).
- Records the sealed state and returns an updated status snapshot.
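An illustrative request body (the field names inside the budget objects are assumptions; consult the `SealRequest` contract in `src/AirGap/StellaOps.AirGap.Controller` for the authoritative shape):

```json
{
  "policyHash": "sha256:3f9ac2...",
  "stalenessBudget": { "warnAfterSeconds": 86400, "failAfterSeconds": 259200 },
  "contentBudgets": {
    "advisories": { "warnAfterSeconds": 86400, "failAfterSeconds": 604800 },
    "vex": { "warnAfterSeconds": 86400, "failAfterSeconds": 604800 }
  }
}
```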
### `POST /system/airgap/unseal`
Required scope: `airgap:seal`
Behavior:
- Clears the sealed state and returns an updated status snapshot.
- Staleness is returned as `Unknown` after unseal (clients should treat this as "not applicable").
### `POST /system/airgap/verify`
Required scope: `airgap:verify`
Purpose: verify replay / bundle verification requests against the currently active AirGap state.
## State model (per tenant)
Canonical fields captured by the controller (see `src/AirGap/StellaOps.AirGap.Controller`):
- `tenantId`
- `sealed`
- `policyHash` (nullable)
- `timeAnchor` (`TimeAnchor`, may be `Unknown`)
- `stalenessBudget` (`StalenessBudget`)
- `contentBudgets` (`Dictionary<string, StalenessBudget>`)
- `driftBaselineSeconds` (baseline used to keep drift evaluation stable across transitions)
- `lastTransitionAt` (UTC)
Determinism requirements:
- Use UTC timestamps only.
- Use ordinal comparisons for keys and stable serialization settings for JSON responses.
- Never infer state from wall-clock behavior other than the injected `TimeProvider`.
## Telemetry
The controller emits:
- Structured logs: `airgap.status.read`, `airgap.sealed`, `airgap.unsealed`, `airgap.verify` (include `tenant_id`, `policy_hash`, and drift/staleness).
- Metrics: `airgap_seal_total`, `airgap_unseal_total`, `airgap_status_read_total`, and gauges for drift/budget/remaining seconds.
- Timeline events (optional): `airgap.sealed`, `airgap.unsealed`, `airgap.staleness.warning`, `airgap.staleness.breach`.
## References
- `docs/modules/airgap/guides/overview.md`
- `docs/modules/airgap/guides/sealed-startup-diagnostics.md`
- `docs/modules/airgap/guides/staleness-and-time.md`
- `docs/modules/airgap/guides/time-api.md`
- `docs/modules/airgap/guides/importer.md`

# Airgap Degradation Matrix (DOCS-AIRGAP-58-001)
What works and what degrades across modes (connected → constrained → sealed).
| Capability | Connected | Constrained | Sealed | Notes |
| --- | --- | --- | --- | --- |
| Mirror imports | ✓ | ✓ | ✓ | Sealed requires preloaded media + offline validation. |
| Time anchors (external NTP) | ✓ | ✓ (allowlisted) | ✗ | Sealed relies on signed time anchors. |
| Transparency log lookups | ✓ | ✓ (if allowlisted) | ✗ | Sealed skips; rely on bundled checkpoints. |
| Rekor witness | ✓ | optional | ✗ | Disabled in sealed; log locally. |
| SBOM feed refresh | ✓ | limited mirrors | offline only | Use mirror bundles. |
| CLI plugin downloads | ✓ | allowlisted | ✗ | Must ship in bootstrap pack. |
| Telemetry export | ✓ | optional | optional/log-only | Sealed may use console exporter only. |
| Webhook callbacks | ✓ | allowlisted internal only | ✗ | Use internal queue instead. |
| OTA updates | ✓ | partial | ✗ | Use mirrorGeneration refresh. |
## Remediation guidance
- If a capability is degraded in sealed mode, provide offline substitute (mirror bundles, time anchors, console exporter).
- When moving to constrained/connected, re-enable trust roots and transparency checks gradually; verify hashes first.

# DevPortal Offline (DOCS-AIRGAP-DEVPORT-64-001)
How to use the developer portal in fully offline/sealed environments.
## Serving the portal
- Host static build from local object store or file server; no CDN.
- Set `DEVPORTAL_OFFLINE=true` to disable external analytics/fonts.
## Auth
- Use Authority in offline mode with pre-provisioned tenants; cache JWKS locally.
## Bundles
- Provide mirror/bootstrap bundles via offline download page with hashes and DSSE (if available).
- Offer time anchors download; display staleness and mirrorGeneration in UI header.
## Search/docs
- Bundle docs and search index; disable remote doc fetch.
## Telemetry
- Disable remote telemetry; keep console logs only or send to local OTLP endpoint.
## Verification
- On load, run self-check to confirm no external requests; fail with clear banner if any detected.
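The self-check can be approximated at build time by scanning the static output for external hosts (the helper name and the pass/fail policy are assumptions; the shipped runtime self-check may differ):

```bash
# List every external-looking host referenced by the static build output.
# Anything beyond localhost / the portal's own host should fail the deploy.
scan_external_refs() {
  local build_dir="$1"
  grep -rhoE 'https?://[A-Za-z0-9.-]+' "${build_dir}" 2>/dev/null \
    | sed -E 's#^https?://##' \
    | sort -u
}
```

Wiring this into CI catches stray analytics or font URLs before the bundle ever reaches a sealed site.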

# EPSS Air-Gapped Bundles Guide
## Overview
This guide describes how to create, distribute, and import EPSS (Exploit Prediction Scoring System) data bundles for air-gapped StellaOps deployments. EPSS bundles enable offline vulnerability risk scoring with the same probabilistic threat intelligence available to online deployments.
**Key Concepts**:
- **Risk Bundle**: Aggregated security data (EPSS + KEV + advisories) for offline import
- **EPSS Snapshot**: Single-day EPSS scores for all CVEs (~300k rows)
- **Staleness Threshold**: How old EPSS data can be before fallback to CVSS-only
- **Deterministic Import**: Same bundle imported twice yields identical database state
---
## Bundle Structure
### Standard Risk Bundle Layout
```
risk-bundle-2025-12-17/
├── manifest.json # Bundle metadata and checksums
├── epss/
│ ├── epss_scores-2025-12-17.csv.zst # EPSS data (ZSTD compressed)
│ └── epss_metadata.json # EPSS provenance
├── kev/
│ └── kev-catalog.json # CISA KEV catalog
├── advisories/
│ ├── nvd-updates.ndjson.zst
│ └── ghsa-updates.ndjson.zst
└── signatures/
├── bundle.dsse.json # DSSE signature (optional)
└── bundle.sha256sums # File integrity checksums
```
### manifest.json
```json
{
"bundle_id": "risk-bundle-2025-12-17",
"created_at": "2025-12-17T00:00:00Z",
"created_by": "stellaops-bundler-v1.2.3",
"bundle_type": "risk",
"schema_version": "v1",
"contents": {
"epss": {
"model_date": "2025-12-17",
"file": "epss/epss_scores-2025-12-17.csv.zst",
"sha256": "abc123...",
"size_bytes": 15728640,
"row_count": 231417
},
"kev": {
"catalog_version": "2025-12-17",
"file": "kev/kev-catalog.json",
"sha256": "def456...",
"known_exploited_count": 1247
},
"advisories": {
"nvd": {
"file": "advisories/nvd-updates.ndjson.zst",
"sha256": "ghi789...",
"record_count": 1523
},
"ghsa": {
"file": "advisories/ghsa-updates.ndjson.zst",
"sha256": "jkl012...",
"record_count": 8734
}
}
},
"signature": {
"type": "dsse",
"file": "signatures/bundle.dsse.json",
"key_id": "stellaops-bundler-2025",
"algorithm": "ed25519"
}
}
```
### epss/epss_metadata.json
```json
{
"model_date": "2025-12-17",
"model_version": "v2025.12.17",
"published_date": "2025-12-17",
"row_count": 231417,
"source_uri": "https://epss.empiricalsecurity.com/epss_scores-2025-12-17.csv.gz",
"retrieved_at": "2025-12-17T00:05:32Z",
"file_sha256": "abc123...",
"decompressed_sha256": "xyz789...",
"compression": "zstd",
"compression_level": 19
}
```
---
## Creating EPSS Bundles
### Prerequisites
**Build System Requirements**:
- Internet access (for fetching FIRST.org data)
- StellaOps Bundler CLI: `stellaops-bundler`
- ZSTD compression: `zstd` (v1.5+)
- Python 3.10+ (for verification scripts)
**Permissions**:
- Read access to FIRST.org EPSS API/CSV endpoints
- Write access to bundle staging directory
- (Optional) Signing key for DSSE signatures
### Daily Bundle Creation (Automated)
**Recommended Schedule**: Daily at 01:00 UTC (after FIRST publishes at ~00:00 UTC)
**Script**: `scripts/create-risk-bundle.sh`
```bash
#!/bin/bash
set -euo pipefail
BUNDLE_DATE=$(date -u +%Y-%m-%d)
BUNDLE_DIR="risk-bundle-${BUNDLE_DATE}"
STAGING_DIR="/tmp/stellaops-bundles/${BUNDLE_DIR}"
echo "Creating risk bundle for ${BUNDLE_DATE}..."
# 1. Create staging directory
mkdir -p "${STAGING_DIR}"/{epss,kev,advisories,signatures}
# 2. Fetch EPSS data from FIRST.org
echo "Fetching EPSS data..."
curl -sL "https://epss.empiricalsecurity.com/epss_scores-${BUNDLE_DATE}.csv.gz" \
-o "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.gz"
# 3. Decompress and re-compress with ZSTD (better compression for offline)
gunzip "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.gz"
zstd -19 -q "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv" \
-o "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.zst"
rm "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv"
# 4. Generate EPSS metadata
stellaops-bundler epss metadata \
--file "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.zst" \
--model-date "${BUNDLE_DATE}" \
--output "${STAGING_DIR}/epss/epss_metadata.json"
# 5. Fetch KEV catalog
echo "Fetching KEV catalog..."
curl -sL "https://www.cisa.gov/sites/default/files/feeds/known_exploited_vulnerabilities.json" \
-o "${STAGING_DIR}/kev/kev-catalog.json"
# 6. Fetch advisory updates (optional, for comprehensive bundles)
# stellaops-bundler advisories fetch ...
# 7. Generate manifest
stellaops-bundler manifest create \
  --bundle-dir "${STAGING_DIR}" \
  --bundle-id "${BUNDLE_DIR}" \
  --output "${STAGING_DIR}/manifest.json"
# 8. Generate checksums (after the manifest so bundle.sha256sums covers it too)
echo "Generating checksums..."
(cd "${STAGING_DIR}" && find . -type f ! -name "*.sha256sums" -exec sha256sum {} \;) \
  > "${STAGING_DIR}/signatures/bundle.sha256sums"
# 9. Sign bundle (if signing key available)
if [ -n "${SIGNING_KEY:-}" ]; then
echo "Signing bundle..."
stellaops-bundler sign \
--manifest "${STAGING_DIR}/manifest.json" \
--key "${SIGNING_KEY}" \
--output "${STAGING_DIR}/signatures/bundle.dsse.json"
fi
# 10. Create tarball
echo "Creating tarball..."
tar -C "$(dirname "${STAGING_DIR}")" -czf "/var/stellaops/bundles/${BUNDLE_DIR}.tar.gz" \
"$(basename "${STAGING_DIR}")"
echo "Bundle created: /var/stellaops/bundles/${BUNDLE_DIR}.tar.gz"
echo "Size: $(du -h /var/stellaops/bundles/${BUNDLE_DIR}.tar.gz | cut -f1)"
# 11. Verify bundle
stellaops-bundler verify "/var/stellaops/bundles/${BUNDLE_DIR}.tar.gz"
```
**Cron Schedule**:
```cron
# Daily at 01:00 UTC (after FIRST publishes EPSS at ~00:00 UTC)
0 1 * * * /opt/stellaops/scripts/create-risk-bundle.sh >> /var/log/stellaops/bundler.log 2>&1
```
---
## Distributing Bundles
### Transfer Methods
#### 1. Physical Media (Highest Security)
```bash
# Copy to USB drive
cp /var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz /media/usb/stellaops/
# Verify checksum
sha256sum /media/usb/stellaops/risk-bundle-2025-12-17.tar.gz
```
#### 2. Secure File Transfer (Network Isolation)
```bash
# SCP over dedicated management network
scp /var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz \
admin@airgap-gateway.internal:/incoming/
# Verify after transfer
ssh admin@airgap-gateway.internal \
"sha256sum /incoming/risk-bundle-2025-12-17.tar.gz"
```
#### 3. Offline Bundle Repository (CD/DVD)
```bash
# Burn to CD/DVD (for regulated industries)
growisofs -Z /dev/sr0 \
-R -J -joliet-long \
-V "StellaOps Risk Bundle 2025-12-17" \
/var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz
# Record the disc checksum for later comparison between copies
sha256sum /dev/sr0 > risk-bundle-2025-12-17.sha256
```
### Storage Recommendations
**Bundle Retention**:
- **Online bundler**: Keep last 90 days (rolling cleanup)
- **Air-gapped system**: Keep last 30 days minimum (for rollback)
**Naming Convention**:
- Pattern: `risk-bundle-YYYY-MM-DD.tar.gz`
- Example: `risk-bundle-2025-12-17.tar.gz`
**Directory Structure** (air-gapped system):
```
/opt/stellaops/bundles/
├── incoming/ # Transfer staging area
├── verified/ # Verified, ready to import
├── imported/ # Successfully imported (archive)
└── failed/ # Failed verification/import (quarantine)
```
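A small helper to create the layout above (ownership and permissions are left to site policy; the root path is parameterized so the helper can be exercised outside `/opt`):

```bash
# Create the bundle staging layout used by the import workflow.
init_bundle_dirs() {
  local root="${1:-/opt/stellaops/bundles}"
  mkdir -p "${root}/incoming" "${root}/verified" "${root}/imported" "${root}/failed"
}
```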
---
## Importing Bundles (Air-Gapped System)
### Pre-Import Verification
**Step 1: Transfer to Verified Directory**
```bash
# Transfer from incoming to verified (manual approval gate)
sudo mv /opt/stellaops/bundles/incoming/risk-bundle-2025-12-17.tar.gz \
/opt/stellaops/bundles/verified/
```
**Step 2: Verify Bundle Integrity**
```bash
# Extract bundle
cd /opt/stellaops/bundles/verified
tar -xzf risk-bundle-2025-12-17.tar.gz
# Verify checksums
cd risk-bundle-2025-12-17
sha256sum -c signatures/bundle.sha256sums
# Expected output:
# epss/epss_scores-2025-12-17.csv.zst: OK
# epss/epss_metadata.json: OK
# kev/kev-catalog.json: OK
# manifest.json: OK
```
**Step 3: Verify DSSE Signature (if signed)**
```bash
stellaops-bundler verify-signature \
--manifest manifest.json \
--signature signatures/bundle.dsse.json \
--trusted-keys /etc/stellaops/trusted-keys.json
# Expected output:
# ✓ Signature valid
# ✓ Key ID: stellaops-bundler-2025
# ✓ Signed at: 2025-12-17T01:05:00Z
```
### Import Procedure
**Step 4: Import Bundle**
```bash
# Import using stellaops CLI
stellaops offline import \
--bundle /opt/stellaops/bundles/verified/risk-bundle-2025-12-17.tar.gz \
--verify \
--dry-run
# Review dry-run output, then execute
stellaops offline import \
--bundle /opt/stellaops/bundles/verified/risk-bundle-2025-12-17.tar.gz \
--verify
```
**Import Output**:
```
Importing risk bundle: risk-bundle-2025-12-17
✓ Manifest validated
✓ Checksums verified
✓ Signature verified
Importing EPSS data...
Model Date: 2025-12-17
Row Count: 231,417
✓ epss_import_runs created (import_run_id: 550e8400-...)
✓ epss_scores inserted (231,417 rows, 23.4s)
✓ epss_changes computed (12,345 changes, 8.1s)
✓ epss_current upserted (231,417 rows, 5.2s)
✓ Event emitted: epss.updated
Importing KEV catalog...
Known Exploited Count: 1,247
✓ kev_catalog updated
Import completed successfully in 41.2s
```
**Step 5: Verify Import**
```bash
# Check EPSS status
stellaops epss status
# Expected output:
# EPSS Status:
# Latest Model Date: 2025-12-17
# Source: bundle://risk-bundle-2025-12-17
# CVE Count: 231,417
# Staleness: FRESH (0 days)
# Import Time: 2025-12-17T10:30:00Z
# Query specific CVE to verify
stellaops epss get CVE-2024-12345
# Expected output:
# CVE-2024-12345
# Score: 0.42357
# Percentile: 88.2
# Model Date: 2025-12-17
# Source: bundle://risk-bundle-2025-12-17
```
**Step 6: Archive Imported Bundle**
```bash
# Move to imported archive
sudo mv /opt/stellaops/bundles/verified/risk-bundle-2025-12-17.tar.gz \
/opt/stellaops/bundles/imported/
```
---
## Automation (Air-Gapped System)
### Automated Import on Arrival
**Script**: `/opt/stellaops/scripts/auto-import-bundle.sh`
```bash
#!/bin/bash
set -euo pipefail
INCOMING_DIR="/opt/stellaops/bundles/incoming"
VERIFIED_DIR="/opt/stellaops/bundles/verified"
IMPORTED_DIR="/opt/stellaops/bundles/imported"
FAILED_DIR="/opt/stellaops/bundles/failed"
LOG_FILE="/var/log/stellaops/auto-import.log"
log() {
echo "[$(date -Iseconds)] $*" | tee -a "${LOG_FILE}"
}
# Watch for new bundles in incoming/
for bundle in "${INCOMING_DIR}"/risk-bundle-*.tar.gz; do
[ -f "${bundle}" ] || continue
BUNDLE_NAME=$(basename "${bundle}")
log "Detected new bundle: ${BUNDLE_NAME}"
# Extract
EXTRACT_DIR="${VERIFIED_DIR}/${BUNDLE_NAME%.tar.gz}"
mkdir -p "${EXTRACT_DIR}"
tar -xzf "${bundle}" -C "${VERIFIED_DIR}"
# Verify checksums
if ! (cd "${EXTRACT_DIR}" && sha256sum -c signatures/bundle.sha256sums > /dev/null 2>&1); then
log "ERROR: Checksum verification failed for ${BUNDLE_NAME}"
mv "${bundle}" "${FAILED_DIR}/"
rm -rf "${EXTRACT_DIR}"
continue
fi
log "Checksum verification passed"
# Verify signature (if present)
if [ -f "${EXTRACT_DIR}/signatures/bundle.dsse.json" ]; then
if ! stellaops-bundler verify-signature \
--manifest "${EXTRACT_DIR}/manifest.json" \
--signature "${EXTRACT_DIR}/signatures/bundle.dsse.json" \
--trusted-keys /etc/stellaops/trusted-keys.json > /dev/null 2>&1; then
log "ERROR: Signature verification failed for ${BUNDLE_NAME}"
mv "${bundle}" "${FAILED_DIR}/"
rm -rf "${EXTRACT_DIR}"
continue
fi
log "Signature verification passed"
fi
# Import
if stellaops offline import --bundle "${bundle}" --verify >> "${LOG_FILE}" 2>&1; then
log "Import successful for ${BUNDLE_NAME}"
mv "${bundle}" "${IMPORTED_DIR}/"
rm -rf "${EXTRACT_DIR}"
else
log "ERROR: Import failed for ${BUNDLE_NAME}"
mv "${bundle}" "${FAILED_DIR}/"
fi
done
```
**Systemd Service**: `/etc/systemd/system/stellaops-bundle-watcher.service`
```ini
[Unit]
Description=StellaOps Bundle Auto-Import Watcher
After=network.target
[Service]
Type=simple
# ExecStart takes a single command line with no shell features; wrap the pipeline in a shell.
ExecStart=/bin/sh -c '/usr/bin/inotifywait -m -e close_write --format "%w%f" /opt/stellaops/bundles/incoming | while read -r file; do /opt/stellaops/scripts/auto-import-bundle.sh; done'
Restart=always
RestartSec=10
User=stellaops
Group=stellaops
[Install]
WantedBy=multi-user.target
```
**Enable Service**:
```bash
sudo systemctl enable stellaops-bundle-watcher
sudo systemctl start stellaops-bundle-watcher
```
---
## Staleness Handling
### Staleness Thresholds
| Days Since Model Date | Status | Action |
|-----------------------|--------|--------|
| 0-1 | FRESH | Normal operation |
| 2-7 | ACCEPTABLE | Continue, low-priority alert |
| 8-14 | STALE | Alert, plan bundle import |
| 15+ | VERY_STALE | Fallback to CVSS-only, urgent alert |
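The thresholds translate directly into a check that transfer or monitoring scripts can embed (the helper name is illustrative):

```bash
# Map days-since-model-date to the staleness status from the table above.
epss_staleness_status() {
  local days="$1"
  if   [ "${days}" -le 1 ]; then echo "FRESH"
  elif [ "${days}" -le 7 ]; then echo "ACCEPTABLE"
  elif [ "${days}" -le 14 ]; then echo "STALE"
  else echo "VERY_STALE"
  fi
}
```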
### Monitoring Staleness
**SQL Query**:
```sql
SELECT * FROM concelier.epss_model_staleness;
-- Output:
-- latest_model_date | latest_import_at | days_stale | staleness_status
-- 2025-12-10 | 2025-12-10 10:30:00+00 | 7 | ACCEPTABLE
```
**Prometheus Metric**:
```promql
epss_model_staleness_days{instance="airgap-prod"}
```
**Alert Rule**:
```yaml
- alert: EpssDataStale
  expr: epss_model_staleness_days > 7
  for: 1h
  labels:
    severity: warning
  annotations:
    summary: "EPSS data is stale ({{ $value }} days old)"
```
### Fallback Behavior
When EPSS data is VERY_STALE (>14 days):
**Automatic Fallback**:
- Scanner: Skip EPSS evidence, log warning
- Policy: Use CVSS-only scoring (no EPSS bonus)
- Notifications: Disabled EPSS-based alerts
- UI: Show staleness banner, disable EPSS filters
**Manual Override** (force continue using stale data):
```yaml
# etc/scanner.yaml
scanner:
epss:
staleness_policy: continue # Options: fallback, continue, error
max_staleness_days: 30 # Override 14-day default
```
---
## Troubleshooting
### Bundle Import Failed: Checksum Mismatch
**Symptom**:
```
ERROR: Checksum verification failed
epss/epss_scores-2025-12-17.csv.zst: FAILED
```
**Diagnosis**:
1. Verify bundle was not corrupted during transfer:
```bash
# Compare with original
sha256sum risk-bundle-2025-12-17.tar.gz
```
2. Re-transfer bundle from source
**Resolution**:
- Delete corrupted bundle: `rm risk-bundle-2025-12-17.tar.gz`
- Re-download/re-transfer from bundler system
### Bundle Import Failed: Signature Invalid
**Symptom**:
```
ERROR: Signature verification failed
Invalid signature or untrusted key
```
**Diagnosis**:
1. Check trusted keys configured:
```bash
cat /etc/stellaops/trusted-keys.json
```
2. Verify key ID in bundle signature matches:
```bash
jq '.signature.key_id' manifest.json
```
**Resolution**:
- Update trusted keys file with current bundler public key
- Or: Skip signature verification (if signatures optional):
```bash
stellaops offline import --bundle risk-bundle-2025-12-17.tar.gz --skip-signature-verify
```
### No EPSS Data After Import
**Symptom**:
- Import succeeded, but `stellaops epss status` shows "No EPSS data"
**Diagnosis**:
```sql
-- Check import runs
SELECT * FROM concelier.epss_import_runs ORDER BY created_at DESC LIMIT 1;
-- Check epss_current count
SELECT COUNT(*) FROM concelier.epss_current;
```
**Resolution**:
1. If import_runs shows FAILED status:
- Check error column: `SELECT error FROM concelier.epss_import_runs WHERE status = 'FAILED'`
- Re-run import with verbose logging
2. If epss_current is empty:
- Manually trigger upsert:
```sql
-- Re-run upsert for latest model_date
-- (This SQL is safe to re-run)
INSERT INTO concelier.epss_current (cve_id, epss_score, percentile, model_date, import_run_id, updated_at)
SELECT s.cve_id, s.epss_score, s.percentile, s.model_date, s.import_run_id, NOW()
FROM concelier.epss_scores s
WHERE s.model_date = (SELECT MAX(model_date) FROM concelier.epss_import_runs WHERE status = 'SUCCEEDED')
ON CONFLICT (cve_id) DO UPDATE SET
epss_score = EXCLUDED.epss_score,
percentile = EXCLUDED.percentile,
model_date = EXCLUDED.model_date,
import_run_id = EXCLUDED.import_run_id,
updated_at = NOW();
```
---
## Best Practices
### 1. Weekly Bundle Import Cadence
**Recommended Schedule**:
- **Minimum**: Weekly (every Monday)
- **Preferred**: Bi-weekly (Monday & Thursday)
- **Ideal**: Daily (if transfer logistics allow)
### 2. Bundle Verification Checklist
Before importing:
- [ ] Checksum verification passed
- [ ] Signature verification passed (if signed)
- [ ] Model date within acceptable staleness window
- [ ] Disk space available (estimate: 500MB per bundle)
- [ ] Backup current EPSS data (for rollback)
### 3. Rollback Plan
If new bundle causes issues:
```sql
-- 1. Identify the problematic import_run_id
SELECT import_run_id, model_date, status
FROM concelier.epss_import_runs
ORDER BY created_at DESC LIMIT 5;

-- 2. Delete the problematic import (cascades to epss_scores, epss_changes)
DELETE FROM concelier.epss_import_runs
WHERE import_run_id = '550e8400-...';

-- 3. Restore epss_current from the previous model_date
--    (upsert from the previous model_date as shown in Troubleshooting)
```
```bash
# 4. Verify rollback
stellaops epss status
```
### 4. Audit Trail
Log all bundle imports for compliance:
**Audit Log Format** (`/var/log/stellaops/bundle-audit.log`):
```json
{
"timestamp": "2025-12-17T10:30:00Z",
"action": "import",
"bundle_id": "risk-bundle-2025-12-17",
"bundle_sha256": "abc123...",
"imported_by": "admin@example.com",
"import_run_id": "550e8400-e29b-41d4-a716-446655440000",
"result": "SUCCESS",
"row_count": 231417,
"duration_seconds": 41.2
}
```
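One way to append such a record from import wrappers without extra tooling (the field set is trimmed to what the shell knows at call time; `AUDIT_LOG` as an override is an assumption for testability):

```bash
# Append a single JSON audit line per import attempt.
log_bundle_import() {
  local bundle_id="$1" bundle_sha="$2" result="$3"
  printf '{"timestamp":"%s","action":"import","bundle_id":"%s","bundle_sha256":"%s","result":"%s"}\n' \
    "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "${bundle_id}" "${bundle_sha}" "${result}" \
    >> "${AUDIT_LOG:-/var/log/stellaops/bundle-audit.log}"
}
```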
---
## Appendix: Bundle Creation Tools
### stellaops-bundler CLI Reference
```bash
# Create EPSS metadata
stellaops-bundler epss metadata \
--file epss_scores-2025-12-17.csv.zst \
--model-date 2025-12-17 \
--output epss_metadata.json
# Create manifest
stellaops-bundler manifest create \
--bundle-dir risk-bundle-2025-12-17 \
--bundle-id risk-bundle-2025-12-17 \
--output manifest.json
# Sign bundle
stellaops-bundler sign \
--manifest manifest.json \
--key /path/to/signing-key.pem \
--output bundle.dsse.json
# Verify bundle
stellaops-bundler verify risk-bundle-2025-12-17.tar.gz
```
### Custom Bundle Scripts
Example for creating weekly bundles (7-day snapshots):
```bash
#!/bin/bash
# create-weekly-bundle.sh
set -euo pipefail

WEEK_START=$(date -u -d "last monday" +%Y-%m-%d)
WEEK_END=$(date -u +%Y-%m-%d)
BUNDLE_ID="risk-bundle-weekly-${WEEK_START}"

echo "Creating weekly bundle: ${BUNDLE_ID} (${WEEK_START} .. ${WEEK_END})"
mkdir -p epss kev

for day in $(seq 0 6); do
  CURRENT_DATE=$(date -u -d "${WEEK_START} + ${day} days" +%Y-%m-%d)
  # Fetch EPSS for each day...
  curl -sL "https://epss.empiricalsecurity.com/epss_scores-${CURRENT_DATE}.csv.gz" \
    -o "epss/epss_scores-${CURRENT_DATE}.csv.gz"
done

# Compress and bundle...
tar -czf "${BUNDLE_ID}.tar.gz" epss/ kev/ manifest.json
```
---
**Last Updated**: 2025-12-17
**Version**: 1.0
**Maintainer**: StellaOps Operations Team

# AirGap Importer
The AirGap Importer verifies and ingests offline bundles (mirror, bootstrap, evidence kits) into a sealed or constrained deployment. It fails closed by default: imports are rejected when verification fails, and failures are diagnosable offline.
This document describes importer behavior and its key building blocks. For bundle formats and operational workflow, see `docs/modules/airgap/guides/offline-bundle-format.md`, `docs/modules/airgap/guides/mirror-bundles.md`, and `docs/modules/airgap/guides/operations.md`.
## Responsibilities
- Verify bundle integrity and authenticity (DSSE signatures; optional TUF metadata where applicable).
- Enforce monotonicity (prevent version rollback unless explicitly force-activated with a recorded reason).
- Stage verified content into deterministic layouts (catalog + item repository + object-store paths).
- Quarantine failed bundles for forensic analysis with deterministic logs and metadata.
- Emit an audit trail for every dry-run and import attempt (success or failure).
## Verification pipeline (conceptual)
1. **Plan**: build an ordered list of validation/ingest steps for the bundle (`BundleImportPlanner`).
2. **Validate signatures**: verify DSSE envelopes and trusted key fingerprints.
3. **Validate metadata** (when present): verify TUF root/snapshot/timestamp consistency against trust roots.
4. **Compute deterministic roots**: compute a Merkle root over staged bundle items (stable ordering).
5. **Check monotonicity**: ensure the incoming bundle version is newer than the currently active version.
6. **Quarantine on failure**: preserve the bundle + verification log and emit a stable failure reason code.
7. **Commit**: write catalog/item entries and activation record; emit audit/timeline events.
The step order must remain stable; if steps change, treat it as a contract change and update CLI/UI guidance.
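Step 4 can be approximated from the command line when debugging a staged bundle (this is a sha256-over-sorted-file-hashes sketch; the importer's actual Merkle construction may differ):

```bash
# Deterministic content root: hash every file in stable path order, then
# hash the resulting list. Same tree => same root, regardless of file
# creation order or timestamps.
bundle_content_root() {
  local staged_dir="$1"
  (
    cd "${staged_dir}" || return 1
    find . -type f -print0 | sort -z | xargs -0 sha256sum | sha256sum | awk '{print $1}'
  )
}
```

Comparing this root across two staging attempts is a quick offline way to confirm the layout is reproducible before raising a `merkle_mismatch` investigation.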
## Quarantine
When verification fails, the importer quarantines the bundle with enough information to debug offline.
Typical structure:
- `/updates/quarantine/<tenantId>/<timestamp>-<reason>-<id>/`
- `bundle.tar.zst` (original)
- `manifest.json` (if extracted)
- `verification.log` (deterministic, no machine-specific paths)
- `failure-reason.txt` (human-readable)
- `quarantine.json` (structured metadata: tenant, reason, timestamps, sizes, hashes)
Operational expectations:
- Quarantine is bounded: enforce per-tenant quota + TTL cleanup.
- Listing is deterministic: sort by `quarantined_at` then `quarantine_id` (ordinal).
## Version monotonicity
Rollback resistance is enforced via:
- A per-tenant version store (`IBundleVersionStore`) backed by Postgres in production.
- A monotonicity checker (`IVersionMonotonicityChecker`) that compares incoming bundle versions to the active version.
- Optional force-activate path requiring a human reason, stored alongside the activation record.
## Storage model
The importer writes deterministic metadata that other components can query:
- **Bundle catalog**: (tenant, bundle_id, digest, imported_at_utc, content paths).
- **Bundle items**: (tenant, bundle_id, path, digest, size).
For the logical schema and deterministic ordering rules, see `docs/modules/airgap/guides/bundle-repositories.md`.
## Telemetry and auditing
Minimum signals:
- Counters: imports attempted/succeeded/failed, dry-runs, quarantines created, monotonicity failures, force-activations.
- Structured logs with stable reason codes (e.g., `dsse_signature_invalid`, `tuf_root_invalid`, `merkle_mismatch`, `version_rollback_blocked`).
- Audit emission: include tenant, bundle_id, digest, operator identity, and whether sealed mode was active.
## References
- `docs/modules/airgap/guides/offline-bundle-format.md`
- `docs/modules/airgap/guides/mirror-bundles.md`
- `docs/modules/airgap/guides/bundle-repositories.md`
- `docs/modules/airgap/guides/operations.md`
- `docs/modules/airgap/guides/controller.md`

# macOS Offline Kit Integration
> Owner: Scanner Guild, Offline Kit Guild
> Related tasks: SCANNER-ENG-0020..0023
## Overview
This document describes the offline operation requirements for macOS package scanning, including Homebrew formula metadata, pkgutil receipts, and application bundle analysis.
## Homebrew Offline Mirroring
### Required Tap Mirrors
For comprehensive macOS scanning in offline environments, mirror the following Homebrew taps:
| Tap | Path | Est. Size | Update Frequency |
|-----|------|-----------|------------------|
| `homebrew/core` | `/opt/stellaops/mirror/homebrew-core` | ~400MB | Weekly |
| `homebrew/cask` | `/opt/stellaops/mirror/homebrew-cask` | ~150MB | Weekly |
| Custom taps | As configured | Varies | As needed |
### Mirroring Procedure
```bash
# Clone or update homebrew-core
git clone --depth 1 https://github.com/Homebrew/homebrew-core.git \
  /opt/stellaops/mirror/homebrew-core

# Clone or update homebrew-cask
git clone --depth 1 https://github.com/Homebrew/homebrew-cask.git \
  /opt/stellaops/mirror/homebrew-cask

# Create manifest for offline verification
stellaops-cli offline create-manifest \
  --source /opt/stellaops/mirror/homebrew-core \
  --output /opt/stellaops/mirror/homebrew-core.manifest.json
```
### Formula Metadata Extraction
The scanner extracts metadata from `INSTALL_RECEIPT.json` files in the Cellar. For policy evaluation, ensure the following fields are preserved:
- `tap` - Source tap identifier
- `version` and `revision` - Package version info
- `poured_from_bottle` - Build source indicator
- `source.url`, `source.checksum` - Provenance data
- `runtime_dependencies`, `build_dependencies` - Dependency graph
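A sketch of collecting these fields from a receipt. The exact receipt layout (where `tap` and the version live) varies across Homebrew versions, so treat the lookup paths here as assumptions to validate against real receipts:

```python
import json

# Sketch: pull the policy-relevant fields out of INSTALL_RECEIPT.json.
# Lookup paths are assumptions; verify against real Cellar receipts.
def extract_receipt_fields(receipt_json: str) -> dict:
    r = json.loads(receipt_json)
    source = r.get("source") or {}
    return {
        "tap": r.get("tap") or source.get("tap"),
        "version": (source.get("versions") or {}).get("stable"),
        "poured_from_bottle": r.get("poured_from_bottle"),
        "source_url": source.get("url"),
        "runtime_dependencies": [
            d.get("full_name") for d in r.get("runtime_dependencies") or []
        ],
    }
```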
## pkgutil Receipt Data
### Receipt Location
macOS pkgutil receipts are stored in `/var/db/receipts/`. The scanner reads:
- `*.plist` - Receipt metadata (installer, version, date)
- `*.bom` - Bill of Materials (installed files)
### Offline Considerations
pkgutil receipts are system-local and don't require external mirroring. However, for policy enforcement against known package identifiers, maintain a reference database of:
- Apple system package identifiers (`com.apple.pkg.*`)
- Xcode component identifiers
- Third-party installer identifiers
## Application Bundle Inspection
### Code Signing & Notarization
For offline notarization verification, prefetch:
1. **Apple Root Certificates**
- Apple Root CA
- Apple Root CA - G2
- Apple Root CA - G3
2. **WWDR Certificates**
- Apple Worldwide Developer Relations Certification Authority
- Developer ID Certification Authority
3. **CRL/OCSP Caches**
```bash
# Prefetch Apple CRLs
curl -o /opt/stellaops/cache/apple-crl/root.crl \
  https://www.apple.com/appleca/root.crl
```
### Entitlement Taxonomy
The scanner classifies entitlements into capability categories for policy evaluation:
| Category | Entitlements | Risk Level |
|----------|--------------|------------|
| `network` | `com.apple.security.network.client`, `.server` | Low |
| `camera` | `com.apple.security.device.camera` | High |
| `microphone` | `com.apple.security.device.microphone` | High |
| `filesystem` | `com.apple.security.files.*` | Medium |
| `automation` | `com.apple.security.automation.apple-events` | High |
| `code-execution` | `com.apple.security.cs.allow-*` | Critical |
| `debugging` | `com.apple.security.get-task-allow` | High |
### High-Risk Entitlement Alerting
The following entitlements trigger elevated policy warnings by default:
```
com.apple.security.device.camera
com.apple.security.device.microphone
com.apple.security.cs.allow-unsigned-executable-memory
com.apple.security.cs.disable-library-validation
com.apple.security.get-task-allow
com.apple.security.files.all
com.apple.security.automation.apple-events
```
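The taxonomy and high-risk list above can be sketched as a prefix-based classifier; the prefix map is an illustrative subset, not the scanner's actual rule set:

```python
# Sketch: classify entitlement keys into the capability categories
# from the taxonomy table. Prefix map is an illustrative subset.
CATEGORY_PREFIXES = {
    "network": ["com.apple.security.network."],
    "camera": ["com.apple.security.device.camera"],
    "microphone": ["com.apple.security.device.microphone"],
    "filesystem": ["com.apple.security.files."],
    "automation": ["com.apple.security.automation.apple-events"],
    "code-execution": ["com.apple.security.cs.allow-"],
    "debugging": ["com.apple.security.get-task-allow"],
}

HIGH_RISK = {
    "com.apple.security.device.camera",
    "com.apple.security.device.microphone",
    "com.apple.security.cs.allow-unsigned-executable-memory",
    "com.apple.security.cs.disable-library-validation",
    "com.apple.security.get-task-allow",
    "com.apple.security.files.all",
    "com.apple.security.automation.apple-events",
}

def classify(entitlement: str) -> list:
    return sorted(
        cat for cat, prefixes in CATEGORY_PREFIXES.items()
        if any(entitlement.startswith(p) for p in prefixes)
    )
```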
## Policy Predicates
### Available Predicates
The following SPL predicates are available for macOS components:
```spl
# Bundle signing predicates
macos.signed # Bundle has code signature
macos.signed("TEAMID123") # Signed by specific team
macos.signed("TEAMID123", true) # Signed with hardened runtime
macos.sandboxed # App sandbox enabled
macos.hardened_runtime # Hardened runtime enabled
# Entitlement predicates
macos.entitlement("com.apple.security.network.client")
macos.entitlement_any(["com.apple.security.device.camera", "..."])
macos.category("network") # Has any network entitlement
macos.category_any(["camera", "microphone"])
macos.high_risk_entitlements # Has any high-risk entitlement
# Package receipt predicates
macos.pkg_receipt("com.apple.pkg.Safari")
macos.pkg_receipt("com.apple.pkg.Safari", "17.1")
# Metadata accessors
macos.bundle_id # CFBundleIdentifier
macos.team_id # Code signing team ID
macos.min_os_version # LSMinimumSystemVersion
```
### Example Policy Rules
```spl
# Block unsigned third-party apps
rule block_unsigned_apps priority 3 {
  when sbom.any_component(
    macos.bundle_id != "" and
    not macos.signed and
    not macos.bundle_id.startswith("com.apple.")
  )
  then status := "blocked"
  because "Unsigned third-party macOS applications are not permitted."
}

# Warn on high-risk entitlements
rule warn_high_risk_entitlements priority 4 {
  when sbom.any_component(macos.high_risk_entitlements)
  then status := "warn"
  because "Application requests high-risk entitlements (camera, microphone, etc.)."
}

# Require hardened runtime for non-Apple apps
rule require_hardened_runtime priority 5 {
  when sbom.any_component(
    macos.signed and
    not macos.hardened_runtime and
    not macos.bundle_id.startswith("com.apple.")
  )
  then status := "warn"
  because "Third-party apps should enable hardened runtime."
}
```
## Disk Space Requirements
| Component | Estimated Size | Notes |
|-----------|---------------|-------|
| Homebrew core tap snapshot | ~400MB | Compressed git clone |
| Homebrew cask tap snapshot | ~150MB | Compressed git clone |
| Apple certificate cache | ~5MB | Root + WWDR chains |
| CRL/OCSP cache | ~10MB | Periodic refresh needed |
| **Total** | ~565MB | Per release cycle |
## Validation Scripts
### Verify Offline Readiness
```bash
# Check Homebrew mirror integrity
stellaops-cli offline verify-homebrew \
  --mirror /opt/stellaops/mirror/homebrew-core \
  --manifest /opt/stellaops/mirror/homebrew-core.manifest.json

# Verify Apple certificate chain
stellaops-cli offline verify-apple-certs \
  --cache /opt/stellaops/cache/apple-certs \
  --require wwdr
```
## References
- `docs/modules/scanner/design/macos-analyzer.md` - Analyzer design specification
- `docs/modules/airgap/guides/mirror-bundles.md` - General mirroring patterns
- Apple Developer Documentation: Code Signing Guide

# Mirror Bundles (Airgap 56-003)
Defines the mirror bundle format and validation workflow for sealed deployments.
## Contents
- Images/charts: OCI artifacts exported with digests + SBOMs.
- Manifests: `manifest.json` with entries:
- `bundleId`, `mirrorGeneration`, `createdAt`, `producer` (export center), `hashes` (sha256 list)
- `dsseEnvelopeHash` for signed manifest (if available)
- `files[]`: path, sha256, size, mediaType
- Transparency: optional TUF metadata (`timestamp.json`, `snapshot.json`) for replay protection.
## Validation steps
1. Verify `manifest.json` sha256 matches provided hash.
2. If DSSE present, verify signature against offline trust roots.
3. Validate Merkle root (if included) over `files[]` hashes.
4. For each OCI artifact, confirm digest matches and SBOM present.
5. Record `mirrorGeneration` and manifest hash; store in audit log and timeline event.
## Workflow
- Export Center produces bundle + manifest; Attestor/Excititor importers validate before ingest.
- Bundle consumers must refuse imports if any hash/signature fails.
- Keep format stable; any schema change bumps `manifestVersion` in `manifest.json`.
## Determinism
- Sort `files[]` by path; compute hashes with UTF-8 canonical paths.
- Use ISO-8601 UTC timestamps in manifests.
- Do not include host-specific paths or timestamps in tar layers.
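A sketch of the deterministic `files[]` construction under these rules, with paths sorted ordinally and no host-specific data:

```python
import hashlib

# Sketch: emit a deterministic files[] list for manifest.json.
# Paths are sorted ordinally (locale-independent) and content is
# hashed as raw bytes; nothing host-specific is recorded.
def build_files_entries(blobs: dict) -> list:
    """blobs maps canonical (UTF-8) archive path -> file bytes."""
    return [
        {
            "path": path,
            "sha256": hashlib.sha256(blobs[path]).hexdigest(),
            "size": len(blobs[path]),
        }
        for path in sorted(blobs)  # stable ordinal ordering
    ]
```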

# Offline Bundle Format (.stella.bundle.tgz)
> Sprint: SPRINT_3603_0001_0001
> Module: ExportCenter
This document describes the `.stella.bundle.tgz` format for portable, signed, verifiable evidence packages.
## Overview
The offline bundle is a self-contained archive containing all evidence and artifacts needed for offline triage of security findings. Bundles are:
- **Portable**: Single file that can be transferred to air-gapped environments
- **Signed**: DSSE-signed manifest for authenticity verification
- **Verifiable**: Content-addressable with SHA-256 hashes for integrity
- **Complete**: Contains all data needed for offline decision-making
## File Format
```
{alert-id}.stella.bundle.tgz
├── manifest.json # Bundle manifest (DSSE-signed)
├── metadata/
│ ├── alert.json # Alert metadata snapshot
│ └── generation-info.json # Bundle generation metadata
├── evidence/
│ ├── reachability-proof.json # Call-graph reachability evidence
│ ├── callstack.json # Exploitability call stacks
│ └── provenance.json # Build provenance attestations
├── vex/
│ ├── decisions.ndjson # VEX decision history (NDJSON)
│ └── current-status.json # Current VEX status
├── sbom/
│ ├── current.cdx.json # Current SBOM slice (CycloneDX)
│ └── baseline.cdx.json # Baseline SBOM for diff
├── diff/
│ └── sbom-delta.json # SBOM delta changes
└── attestations/
├── bundle.dsse.json # DSSE envelope for bundle
└── evidence.dsse.json # Evidence attestation chain
```
## Manifest Schema
The `manifest.json` file follows this schema:
```json
{
  "bundle_format_version": "1.0.0",
  "bundle_id": "abc123def456...",
  "alert_id": "alert-789",
  "created_at": "2024-12-15T10:00:00Z",
  "created_by": "user@example.com",
  "stellaops_version": "1.5.0",
  "entries": [
    {
      "path": "metadata/alert.json",
      "hash": "sha256:...",
      "size": 1234,
      "content_type": "application/json"
    }
  ],
  "root_hash": "sha256:...",
  "signature": {
    "algorithm": "ES256",
    "key_id": "signing-key-001",
    "value": "..."
  }
}
```
### Manifest Fields
| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `bundle_format_version` | string | Yes | Format version (semver) |
| `bundle_id` | string | Yes | Unique bundle identifier |
| `alert_id` | string | Yes | Source alert identifier |
| `created_at` | ISO 8601 | Yes | Bundle creation timestamp (UTC) |
| `created_by` | string | Yes | Actor who created the bundle |
| `stellaops_version` | string | Yes | StellaOps version that created bundle |
| `entries` | array | Yes | List of content entries with hashes |
| `root_hash` | string | Yes | Merkle root of all entry hashes |
| `signature` | object | No | DSSE signature (if signed) |
## Entry Schema
Each entry in the manifest:
```json
{
  "path": "evidence/reachability-proof.json",
  "hash": "sha256:abc123...",
  "size": 2048,
  "content_type": "application/json",
  "compression": null
}
```
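Verifying an entry against extracted content can be sketched as follows; the returned reason strings are illustrative, not official codes:

```python
import hashlib

# Sketch: check one manifest entry against the extracted file bytes.
# Hash strings use the "sha256:<hex>" form shown in the schema above.
def verify_entry(entry: dict, content: bytes):
    digest = "sha256:" + hashlib.sha256(content).hexdigest()
    if entry["hash"] != digest:
        return False, "hash_mismatch"
    if entry["size"] != len(content):
        return False, "size_mismatch"
    return True, None
```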
## DSSE Signing
Bundles support DSSE (Dead Simple Signing Envelope) signing:
```json
{
  "payloadType": "application/vnd.stellaops.bundle.manifest+json",
  "payload": "<base64-encoded manifest>",
  "signatures": [
    {
      "keyid": "signing-key-001",
      "sig": "<base64-encoded signature>"
    }
  ]
}
```
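Per the DSSE specification, the signature is computed over the Pre-Authentication Encoding (PAE) of the payload type and decoded payload, not over the raw base64 string. A minimal sketch of the signing input:

```python
import base64

# Sketch: DSSE Pre-Authentication Encoding (PAE) per the DSSE spec:
# "DSSEv1" SP len(type) SP type SP len(payload) SP payload.
def pae(payload_type: str, payload: bytes) -> bytes:
    return b" ".join([
        b"DSSEv1",
        str(len(payload_type)).encode(), payload_type.encode(),
        str(len(payload)).encode(), payload,
    ])

def signing_input(envelope: dict) -> bytes:
    # Decode the base64 payload before encoding; sign/verify over this.
    payload = base64.b64decode(envelope["payload"])
    return pae(envelope["payloadType"], payload)
```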
## Creation
### API Endpoint
```http
GET /v1/alerts/{alertId}/bundle
Authorization: Bearer <token>

Response: application/gzip
Content-Disposition: attachment; filename="alert-123.stella.bundle.tgz"
```
### Programmatic
```csharp
var packager = services.GetRequiredService<IOfflineBundlePackager>();

var result = await packager.CreateBundleAsync(new BundleRequest
{
    AlertId = "alert-123",
    ActorId = "user@example.com",
    IncludeVexHistory = true,
    IncludeSbomSlice = true
});

// result.Content contains the tarball stream
// result.ManifestHash contains the verification hash
```
## Verification
### API Endpoint
```http
POST /v1/alerts/{alertId}/bundle/verify
Content-Type: application/json

{
  "bundle_hash": "sha256:abc123...",
  "signature": "<optional DSSE signature>"
}

Response:
{
  "is_valid": true,
  "hash_valid": true,
  "chain_valid": true,
  "signature_valid": true,
  "verified_at": "2024-12-15T10:00:00Z"
}
```
### Programmatic
```csharp
var verification = await packager.VerifyBundleAsync(
    bundlePath: "/path/to/bundle.stella.bundle.tgz",
    expectedHash: "sha256:abc123...");

if (!verification.IsValid)
{
    Console.WriteLine($"Verification failed: {string.Join(", ", verification.Errors)}");
}
```
## CLI Usage
```bash
# Export bundle
stellaops alert bundle export --alert-id alert-123 --output ./bundles/
# Verify bundle
stellaops alert bundle verify --file ./bundles/alert-123.stella.bundle.tgz
# Import bundle (air-gapped instance)
stellaops alert bundle import --file ./bundles/alert-123.stella.bundle.tgz
```
## Security Considerations
1. **Hash Verification**: Always verify bundle hash before processing
2. **Signature Validation**: Verify DSSE signature if present
3. **Content Validation**: Validate JSON schemas after extraction
4. **Size Limits**: Enforce maximum bundle size limits (default: 100MB)
5. **Path Traversal**: Tarball extraction must prevent path traversal attacks
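Item 5 can be sketched as a pre-extraction check that rejects any member resolving outside the target directory:

```python
import os
import tarfile

# Sketch: guard against path traversal ("zip slip") when extracting
# a bundle tarball. Every member path is resolved and must stay
# inside the destination directory, or extraction is refused.
def safe_extract(tar: tarfile.TarFile, dest: str) -> None:
    dest_real = os.path.realpath(dest)
    for member in tar.getmembers():
        target = os.path.realpath(os.path.join(dest, member.name))
        if target != dest_real and not target.startswith(dest_real + os.sep):
            raise ValueError(f"blocked path traversal: {member.name}")
    tar.extractall(dest)
```

On Python 3.12+, `tar.extractall(dest, filter="data")` provides equivalent built-in protection.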
## Versioning
| Format Version | Changes | Min StellaOps Version |
|----------------|---------|----------------------|
| 1.0.0 | Initial format | 1.0.0 |
## Related Documentation
- [Evidence Bundle Envelope](./evidence-bundle-envelope.md)
- [DSSE Signing Guide](./dsse-signing.md)
- [Offline Kit Guide](../OFFLINE_KIT.md)
- [API Reference](../api/evidence-decision-api.openapi.yaml)

# Offline Parity Verification
**Last Updated:** 2025-12-14
**Next Review:** 2026-03-14
---
## Overview
This document defines the methodology for verifying that StellaOps scanner produces **identical results** in offline/air-gapped environments compared to connected deployments. Parity verification ensures that security decisions made in disconnected environments are equivalent to those made with full network access.
---
## 1. PARITY VERIFICATION OBJECTIVES
### 1.1 Core Guarantees
| Guarantee | Description | Target |
|-----------|-------------|--------|
| **Bitwise Fidelity** | Scan outputs are byte-identical offline vs online | 100% |
| **Semantic Fidelity** | Same vulnerabilities, severities, and verdicts | 100% |
| **Temporal Parity** | Same results given identical feed snapshots | 100% |
| **Policy Parity** | Same pass/fail decisions with identical policies | 100% |
### 1.2 What Parity Does NOT Cover
- **Feed freshness**: Offline feeds may be hours/days behind live feeds (by design)
- **Network-only enrichment**: EPSS lookups, live KEV checks (graceful degradation applies)
- **Transparency log submission**: Rekor entries created only when connected
---
## 2. TEST METHODOLOGY
### 2.1 Environment Configuration
#### Connected Environment
```yaml
environment:
  mode: connected
  network: enabled
  feeds:
    sources: [osv, ghsa, nvd]
    refresh: live
  rekor: enabled
  epss: enabled
  timestamp_source: ntp
```
#### Offline Environment
```yaml
environment:
  mode: offline
  network: disabled
  feeds:
    sources: [local-bundle]
    refresh: none
  rekor: offline-snapshot
  epss: bundled-cache
  timestamp_source: frozen
  timestamp_value: "2025-12-14T00:00:00Z"
```
### 2.2 Test Procedure
```
PARITY VERIFICATION PROCEDURE v1.0
══════════════════════════════════
PHASE 1: BUNDLE CAPTURE (Connected Environment)
─────────────────────────────────────────────────
1. Capture current feed state:
- Record feed version/digest
- Snapshot EPSS scores (top 1000 CVEs)
- Record KEV list state
2. Run connected scan:
stellaops scan --image <test-image> \
--format json \
--output connected-scan.json \
--receipt connected-receipt.json
3. Export offline bundle:
stellaops offline bundle export \
--feeds-snapshot \
--epss-cache \
--output parity-bundle-$(date +%Y%m%d).tar.zst
PHASE 2: OFFLINE SCAN (Air-Gapped Environment)
───────────────────────────────────────────────
1. Import bundle:
stellaops offline bundle import parity-bundle-*.tar.zst
2. Freeze clock to bundle timestamp:
export STELLAOPS_DETERMINISM_TIMESTAMP="2025-12-14T00:00:00Z"
3. Run offline scan:
stellaops scan --image <test-image> \
--format json \
--output offline-scan.json \
--receipt offline-receipt.json \
--offline-mode
PHASE 3: PARITY COMPARISON
──────────────────────────
1. Compare findings digests:
diff <(jq -S '.findings | sort_by(.id)' connected-scan.json) \
<(jq -S '.findings | sort_by(.id)' offline-scan.json)
2. Compare policy decisions:
diff <(jq -S '.policyDecision' connected-scan.json) \
<(jq -S '.policyDecision' offline-scan.json)
3. Compare receipt input hashes:
jq '.inputHash' connected-receipt.json
jq '.inputHash' offline-receipt.json
# MUST be identical if same bundle used
PHASE 4: RECORD RESULTS
───────────────────────
1. Generate parity report:
stellaops parity report \
--connected connected-scan.json \
--offline offline-scan.json \
--output parity-report-$(date +%Y%m%d).json
```
### 2.3 Test Image Matrix
Run parity tests against this representative image set:
| Image | Category | Expected Vulns | Notes |
|-------|----------|----------------|-------|
| `alpine:3.19` | Minimal | ~5 | Fast baseline |
| `debian:12-slim` | Standard | ~40 | OS package focus |
| `node:20-alpine` | Application | ~100 | npm + OS packages |
| `python:3.12` | Application | ~150 | pip + OS packages |
| `dotnet/aspnet:8.0` | Application | ~75 | NuGet + OS packages |
| `postgres:16-alpine` | Database | ~70 | Database + OS |
---
## 3. COMPARISON CRITERIA
### 3.1 Bitwise Comparison
Compare canonical JSON outputs after normalization:
```bash
# Canonical comparison script
canonical_compare() {
  local connected="$1"
  local offline="$2"

  # Normalize both outputs
  jq -S . "$connected" > /tmp/connected-canonical.json
  jq -S . "$offline" > /tmp/offline-canonical.json

  # Compute hashes
  CONNECTED_HASH=$(sha256sum /tmp/connected-canonical.json | cut -d' ' -f1)
  OFFLINE_HASH=$(sha256sum /tmp/offline-canonical.json | cut -d' ' -f1)

  if [[ "$CONNECTED_HASH" == "$OFFLINE_HASH" ]]; then
    echo "PASS: Bitwise identical"
    return 0
  else
    echo "FAIL: Hash mismatch"
    echo "  Connected: $CONNECTED_HASH"
    echo "  Offline:   $OFFLINE_HASH"
    diff --color /tmp/connected-canonical.json /tmp/offline-canonical.json
    return 1
  fi
}
```
### 3.2 Semantic Comparison
When bitwise comparison fails, perform semantic comparison:
| Field | Comparison Rule | Allowed Variance |
|-------|-----------------|------------------|
| `findings[].id` | Exact match | None |
| `findings[].severity` | Exact match | None |
| `findings[].cvss.score` | Exact match | None |
| `findings[].cvss.vector` | Exact match | None |
| `findings[].affected` | Exact match | None |
| `findings[].reachability` | Exact match | None |
| `sbom.components[].purl` | Exact match | None |
| `sbom.components[].version` | Exact match | None |
| `metadata.timestamp` | Ignored | Expected to differ |
| `metadata.scanId` | Ignored | Expected to differ |
| `metadata.environment` | Ignored | Expected to differ |
### 3.3 Fields Excluded from Comparison
These fields are expected to differ and are excluded from parity checks:
```json
{
  "excludedFields": [
    "$.metadata.scanId",
    "$.metadata.timestamp",
    "$.metadata.hostname",
    "$.metadata.environment.network",
    "$.attestations[*].rekorEntry",
    "$.metadata.epssEnrichedAt"
  ]
}
```
### 3.4 Graceful Degradation Fields
Fields that may be absent in offline mode (acceptable):
| Field | Online | Offline | Parity Rule |
|-------|--------|---------|-------------|
| `epssScore` | Present | May be stale/absent | Check if bundled |
| `kevStatus` | Live | Bundled snapshot | Compare against bundle date |
| `rekorEntry` | Present | Absent | Exclude from comparison |
| `fulcioChain` | Present | Absent | Exclude from comparison |
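The exclusions above can be sketched as a normalization pass applied to both outputs before hashing. Key names mirror the JSONPath list; a full JSONPath engine is not used, and the whole `environment` object is dropped as a simplification:

```python
# Sketch: drop fields that legitimately differ between connected and
# offline runs before hashing scan outputs for parity comparison.
VOLATILE_METADATA = {"scanId", "timestamp", "hostname", "environment", "epssEnrichedAt"}

def strip_volatile(scan: dict) -> dict:
    out = dict(scan)
    out["metadata"] = {
        k: v for k, v in (out.get("metadata") or {}).items()
        if k not in VOLATILE_METADATA
    }
    # Rekor entries exist only when connected; exclude before comparing.
    out["attestations"] = [
        {k: v for k, v in att.items() if k != "rekorEntry"}
        for att in out.get("attestations") or []
    ]
    return out
```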
---
## 4. AUTOMATED PARITY CI
### 4.1 CI Workflow
```yaml
# .gitea/workflows/offline-parity.yml
name: Offline Parity Verification

on:
  schedule:
    - cron: '0 3 * * 1' # Weekly Monday 3am
  workflow_dispatch:

jobs:
  parity-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.x'

      - name: Set determinism environment
        run: |
          echo "TZ=UTC" >> $GITHUB_ENV
          echo "LC_ALL=C" >> $GITHUB_ENV
          echo "STELLAOPS_DETERMINISM_SEED=42" >> $GITHUB_ENV

      - name: Capture connected baseline
        run: scripts/parity/capture-connected.sh

      - name: Export offline bundle
        run: scripts/parity/export-bundle.sh

      - name: Run offline scan (sandboxed)
        run: |
          docker run --network none \
            -v $(pwd)/bundle:/bundle:ro \
            -v $(pwd)/results:/results \
            stellaops/scanner:latest \
            scan --offline-mode --bundle /bundle

      - name: Compare parity
        run: scripts/parity/compare-parity.sh

      - name: Upload parity report
        uses: actions/upload-artifact@v4
        with:
          name: parity-report
          path: results/parity-report-*.json
```
### 4.2 Parity Test Script
```bash
#!/bin/bash
# scripts/parity/compare-parity.sh
set -euo pipefail

CONNECTED_DIR="results/connected"
OFFLINE_DIR="results/offline"
REPORT_FILE="results/parity-report-$(date +%Y%m%d).json"

declare -a IMAGES=(
  "alpine:3.19"
  "debian:12-slim"
  "node:20-alpine"
  "python:3.12"
  "mcr.microsoft.com/dotnet/aspnet:8.0"
  "postgres:16-alpine"
)

TOTAL=0
PASSED=0
FAILED=0
RESULTS=()

for image in "${IMAGES[@]}"; do
  TOTAL=$((TOTAL + 1))
  image_hash=$(echo "$image" | sha256sum | cut -c1-12)
  connected_file="${CONNECTED_DIR}/${image_hash}-scan.json"
  offline_file="${OFFLINE_DIR}/${image_hash}-scan.json"

  # Compare findings (volatile timestamp metadata stripped)
  connected_findings=$(jq -S '.findings | sort_by(.id) | map(del(.metadata.timestamp))' "$connected_file")
  offline_findings=$(jq -S '.findings | sort_by(.id) | map(del(.metadata.timestamp))' "$offline_file")

  connected_hash=$(echo "$connected_findings" | sha256sum | cut -d' ' -f1)
  offline_hash=$(echo "$offline_findings" | sha256sum | cut -d' ' -f1)

  if [[ "$connected_hash" == "$offline_hash" ]]; then
    PASSED=$((PASSED + 1))
    status="PASS"
  else
    FAILED=$((FAILED + 1))
    status="FAIL"
  fi

  RESULTS+=("{\"image\":\"$image\",\"status\":\"$status\",\"connectedHash\":\"$connected_hash\",\"offlineHash\":\"$offline_hash\"}")
done

# Generate report (awk keeps parityRate a valid JSON number, e.g. 0.8333;
# bc would emit a leading-dot value like .8333)
cat > "$REPORT_FILE" <<EOF
{
  "reportDate": "$(date -u +%Y-%m-%dT%H:%M:%SZ)",
  "bundleVersion": "$(cat bundle/version.txt)",
  "summary": {
    "total": $TOTAL,
    "passed": $PASSED,
    "failed": $FAILED,
    "parityRate": $(awk "BEGIN { printf \"%.4f\", $PASSED / $TOTAL }")
  },
  "results": [$(IFS=,; echo "${RESULTS[*]}")]
}
EOF

echo "Parity Report: $PASSED/$TOTAL passed ($(awk "BEGIN { printf \"%.2f\", $PASSED * 100 / $TOTAL }")%)"

if [[ $FAILED -gt 0 ]]; then
  echo "PARITY VERIFICATION FAILED"
  exit 1
fi
```
---
## 5. PARITY RESULTS
### 5.1 Latest Verification Results
| Date | Bundle Version | Images Tested | Parity Rate | Notes |
|------|---------------|---------------|-------------|-------|
| 2025-12-14 | 2025.12.0 | 6 | 100% | Baseline established |
| — | — | — | — | — |
### 5.2 Historical Parity Tracking
```sql
-- Query for parity trend analysis
SELECT
    date_trunc('week', report_date) AS week,
    AVG(parity_rate) AS avg_parity,
    MIN(parity_rate) AS min_parity,
    COUNT(*) AS test_runs
FROM parity_reports
WHERE report_date >= NOW() - INTERVAL '90 days'
GROUP BY 1
ORDER BY 1 DESC;
```
### 5.3 Parity Database Schema
```sql
CREATE TABLE scanner.parity_reports (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    report_date TIMESTAMPTZ NOT NULL,
    bundle_version TEXT NOT NULL,
    bundle_digest TEXT NOT NULL,
    total_images INT NOT NULL,
    passed_images INT NOT NULL,
    failed_images INT NOT NULL,
    parity_rate NUMERIC(5,4) NOT NULL,
    results JSONB NOT NULL,
    ci_run_id TEXT,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);

CREATE INDEX idx_parity_reports_date ON scanner.parity_reports(report_date DESC);
CREATE INDEX idx_parity_reports_bundle ON scanner.parity_reports(bundle_version);
```
---
## 6. KNOWN LIMITATIONS
### 6.1 Acceptable Differences
| Scenario | Expected Behavior | Parity Impact |
|----------|-------------------|---------------|
| **EPSS scores** | Use bundled cache (may be stale) | None if cache bundled |
| **KEV status** | Use bundled snapshot | None if snapshot bundled |
| **Rekor entries** | Not created offline | Excluded from comparison |
| **Timestamp fields** | Differ by design | Excluded from comparison |
| **Network-only advisories** | Not available offline | Feed drift (documented) |
### 6.2 Known Edge Cases
1. **Race conditions during bundle capture**: If feeds update during bundle export, connected scan may include newer data than bundle. Mitigation: Capture bundle first, then run connected scan.
2. **Clock drift**: Offline environments with drifted clocks may compute different freshness scores. Mitigation: Always use frozen timestamps from bundle.
3. **Locale differences**: String sorting may differ across locales. Mitigation: Force `LC_ALL=C` in both environments.
4. **Floating point rounding**: CVSS v4 MacroVector interpolation may have micro-differences. Mitigation: Use integer basis points throughout.
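The basis-point mitigation in item 4 can be sketched as follows: scores are carried as integer hundredths so arithmetic is exact and identical across platforms, whereas float arithmetic is not (`0.1 + 0.2 != 0.3`):

```python
# Sketch: represent CVSS scores as integer basis points (hundredths)
# so comparisons and sums are exact on every platform.
def to_basis_points(score: str) -> int:
    whole, _, frac = score.partition(".")
    frac = (frac + "00")[:2]  # pad/truncate to hundredths
    return int(whole) * 100 + int(frac)

def format_score(bp: int) -> str:
    return f"{bp // 100}.{bp % 100:02d}"
```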
### 6.3 Out of Scope
The following are intentionally NOT covered by parity verification:
- Real-time threat intelligence (requires network)
- Live vulnerability disclosure (requires network)
- Transparency log inclusion proofs (requires Rekor)
- OIDC/Fulcio certificate chains (requires network)
---
## 7. TROUBLESHOOTING
### 7.1 Common Parity Failures
| Symptom | Likely Cause | Resolution |
|---------|--------------|------------|
| Different vulnerability counts | Feed version mismatch | Verify bundle digest matches |
| Different CVSS scores | CVSS v4 calculation issue | Check MacroVector lookup parity |
| Different severity labels | Threshold configuration | Compare policy bundles |
| Missing EPSS data | EPSS cache not bundled | Re-export with `--epss-cache` |
| Different component counts | SBOM generation variance | Check analyzer versions |
### 7.2 Debug Commands
```bash
# Compare feed versions
stellaops feeds version --connected
stellaops feeds version --offline --bundle ./bundle

# Compare policy digests
stellaops policy digest --connected
stellaops policy digest --offline --bundle ./bundle

# Detailed diff of findings
stellaops parity diff \
  --connected connected-scan.json \
  --offline offline-scan.json \
  --verbose
```
---
## 8. METRICS AND MONITORING
### 8.1 Prometheus Metrics
```
# Parity verification metrics
parity_test_total{status="pass|fail"}
parity_test_duration_seconds (histogram)
parity_bundle_age_seconds (gauge)
parity_findings_diff_count (gauge)
```
### 8.2 Alerting Rules
```yaml
groups:
  - name: offline-parity
    rules:
      - alert: ParityTestFailed
        expr: parity_test_total{status="fail"} > 0
        for: 0m
        labels:
          severity: critical
        annotations:
          summary: "Offline parity test failed"

      - alert: ParityRateDegraded
        expr: |
          (sum(parity_test_total{status="pass"}) /
           sum(parity_test_total)) < 0.95
        for: 1h
        labels:
          severity: warning
        annotations:
          summary: "Parity rate below 95%"
```
---
## 9. REFERENCES
- [Offline Update Kit (OUK)](../OFFLINE_KIT.md)
- [Offline and Air-Gap Technical Reference](../product-advisories/14-Dec-2025%20-%20Offline%20and%20Air-Gap%20Technical%20Reference.md)
- [Determinism and Reproducibility Technical Reference](../product-advisories/14-Dec-2025%20-%20Determinism%20and%20Reproducibility%20Technical%20Reference.md)
- [Determinism CI Harness](../modules/scanner/design/determinism-ci-harness.md)
- [Performance Baselines](../benchmarks/performance-baselines.md)
---
**Document Version**: 1.0
**Target Platform**: .NET 10, PostgreSQL >=16

# Airgap Operations (DOCS-AIRGAP-57-004)
Runbooks for imports, failure recovery, and auditing in sealed/constrained modes.
## Imports
1) Verify bundle hash/DSSE (see `mirror-bundles.md`).
2) `stella airgap import --bundle ... --generation N --dry-run` (optional).
3) Apply network policy: ensure sealed/constrained mode set correctly.
4) Import with `stella airgap import ...` and watch logs.
5) Confirm timeline event emitted (bundleId, mirrorGeneration, actor).
## Failure recovery
- Hash/signature mismatch: reject bundle; re-request export; log incident.
- Partial import: rerun with `--force` after cleaning registry/cache; keep previous generation for rollback.
- Staleness breach: if imports unavailable, raise amber alert; if >72h, go red and halt new ingest until refreshed.
- Time anchor expired: apply new anchor from trusted media before continuing operations.
## Auditing
- Record every import in audit log: `{tenant, mirrorGeneration, manifestHash, actor, sealed}`.
- Preserve manifests and hashes for at least two generations.
- Periodically (daily) run `stella airgap list --format json` and archive output.
- Ensure logs are immutable (append-only) in sealed environments.
## Observability
- Monitor counters for denied egress, import success/failure, and staleness alerts.
- Expose `/obs/airgap/status` (if available) to scrape bundle freshness.
## Checklist (per import)
- [ ] Hash/DSSE verified
- [ ] Sealed/constrained mode configured
- [ ] Registry/cache reachable
- [ ] Import succeeded
- [ ] Timeline/audit recorded
- [ ] Staleness dashboard updated

# Airgap Overview
This page orients teams before diving into per-component runbooks. It summarises modes, lifecycle, and governance responsibilities for sealed deployments.
## Modes
- **Sealed**: deny-all egress; only preloaded bundles (mirror + bootstrap) allowed. Requires exported time anchors and offline trust roots.
- **Constrained**: limited egress to allowlisted registries and NTP; mirror bundles still preferred.
- **Connected**: full egress for staging; must remain policy-compatible with sealed mode.
## Lifecycle
1. **Prepare bundles**: export mirror + bootstrap packs (images/charts, SBOMs, DSSE metadata) signed and hashed.
2. **Stage & verify**: load bundles into the offline store, verify hashes/DSSE, record mirrorGeneration.
3. **Activate**: flip sealed toggle; enforce deny-all egress and policy banners; register bundles with Excititor/Export Center.
4. **Operate**: run periodic staleness checks, apply time anchors, and audit imports via timeline events.
5. **Refresh/rollback**: import next mirrorGeneration or roll back using previous manifest + hashes.
## Responsibilities
- **AirGap Controller Guild**: owns network posture (deny-all, allowlists), sealed-mode policy banners, and change control.
- **Export Center / Evidence Locker Guilds**: produce and verify bundle manifests, DSSE envelopes, and Merkle roots.
- **Module owners** (Excititor, Concelier, etc.): honor sealed-mode toggles, emit staleness headers, and refuse unsigned/unknown bundles.
- **Ops/Signals Guild**: maintain time anchors and observability sinks compatible with sealed deployments.
## Rule banner (sealed mode)
Display a top-of-console banner when `sealed=true`:
- "Sealed mode: no external egress. Only registered bundles permitted. Imports logged; violations trigger audit."
- Include current `mirrorGeneration`, bundle manifest hash, and time-anchor status.
## Related docs
- `docs/modules/airgap/guides/airgap-mode.md` — deeper policy shapes per mode.
- `docs/modules/airgap/guides/bundle-repositories.md` — mirror/bootstrap bundle structure.
- `docs/modules/airgap/guides/staleness-and-time.md` — time anchors and staleness checks.
- `docs/modules/airgap/guides/controller.md` / `docs/modules/airgap/guides/importer.md` — controller + importer references.

# Portable Evidence Bundle Verification Guide
This document describes how Advisory AI teams can verify the integrity and authenticity of portable evidence bundles produced by StellaOps Excititor for sealed deployments.
## Overview
Portable evidence bundles are self-contained ZIP archives that include:
- Evidence locker manifest with cryptographic Merkle root
- DSSE attestation envelope (when signing is enabled)
- Raw evidence items organized by provider
- Audit timeline events
- Bundle manifest with content index
## Bundle Structure
```
evidence-bundle-{tenant}-{timestamp}.zip
├── manifest.json # VexLockerManifest with Merkle root
├── attestation.json # DSSE envelope (optional)
├── evidence/
│ └── {provider}/
│ └── sha256_{digest}.json
├── timeline.json # Audit timeline events
├── bundle-manifest.json # Index of all contents
└── VERIFY.md # Verification instructions
```
## Verification Steps
### Step 1: Extract and Validate Structure
```bash
# Extract the bundle
unzip evidence-bundle-*.zip -d evidence-bundle/
# Verify expected files exist
ls -la evidence-bundle/
# Should see: manifest.json, bundle-manifest.json, evidence/, timeline.json, VERIFY.md
```
### Step 2: Verify Evidence Item Integrity
Each evidence item's content hash must match its filename:
```bash
cd evidence-bundle/evidence
# For each provider directory
for provider in */; do
for file in "$provider"*.json; do
# Extract expected hash from filename (sha256_xxxx.json -> xxxx)
expected=$(basename "$file" .json | sed 's/sha256_//')
# Compute actual hash
actual=$(sha256sum "$file" | cut -d' ' -f1)
if [ "$expected" != "$actual" ]; then
echo "MISMATCH: $file"
fi
done
done
```
### Step 3: Verify Merkle Root
The Merkle root provides cryptographic proof that all evidence items are included without modification.
#### Python Verification Script
```python
#!/usr/bin/env python3
import json
import hashlib
from pathlib import Path
def compute_merkle_root(hashes):
"""Compute Merkle root from list of hex hashes."""
if len(hashes) == 0:
return hashlib.sha256(b'').hexdigest()
if len(hashes) == 1:
return hashes[0]
# Pad to even number
if len(hashes) % 2 != 0:
hashes = hashes + [hashes[-1]]
# Compute next level
next_level = []
for i in range(0, len(hashes), 2):
combined = bytes.fromhex(hashes[i] + hashes[i+1])
next_level.append(hashlib.sha256(combined).hexdigest())
return compute_merkle_root(next_level)
def verify_bundle(bundle_path):
"""Verify a portable evidence bundle."""
bundle_path = Path(bundle_path)
# Load manifest
with open(bundle_path / 'manifest.json') as f:
manifest = json.load(f)
# Extract hashes, sorted by observationId then providerId
items = sorted(manifest['items'],
key=lambda x: (x['observationId'], x['providerId'].lower()))
hashes = []
for item in items:
content_hash = item['contentHash']
# Strip sha256: prefix if present
if content_hash.startswith('sha256:'):
content_hash = content_hash[7:]
hashes.append(content_hash.lower())
# Compute Merkle root
computed_root = 'sha256:' + compute_merkle_root(hashes)
expected_root = manifest['merkleRoot']
if computed_root == expected_root:
print(f"✓ Merkle root verified: {computed_root}")
return True
else:
print(f"✗ Merkle root mismatch!")
print(f" Expected: {expected_root}")
print(f" Computed: {computed_root}")
return False
if __name__ == '__main__':
import sys
if len(sys.argv) != 2:
print(f"Usage: {sys.argv[0]} <bundle-directory>")
sys.exit(1)
success = verify_bundle(sys.argv[1])
sys.exit(0 if success else 1)
```
### Step 4: Verify Attestation (if present)
When `attestation.json` exists, verify the DSSE envelope:
```bash
# Check if attestation exists
if [ -f "evidence-bundle/attestation.json" ]; then
# Extract attestation metadata
jq '.' evidence-bundle/attestation.json
# Verify signature using appropriate tool
# For Sigstore/cosign attestations:
# cosign verify-attestation --type custom ...
fi
```
#### Attestation Fields
| Field | Description |
|-------|-------------|
| `dsseEnvelope` | Base64-encoded DSSE envelope |
| `envelopeDigest` | SHA-256 hash of the envelope |
| `predicateType` | in-toto predicate type URI |
| `signatureType` | Signature algorithm (e.g., "ES256") |
| `keyId` | Signing key identifier |
| `issuer` | Certificate issuer |
| `subject` | Certificate subject |
| `signedAt` | Signing timestamp (ISO-8601) |
| `transparencyLogRef` | Rekor transparency log entry URL |
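Before attempting signature verification, a quick local check is that `envelopeDigest` really is the SHA-256 of the decoded envelope. A minimal sketch, assuming the field names from the table above and a base64-encoded `dsseEnvelope`:

```python
import base64
import hashlib


def check_envelope_digest(attestation):
    """Recompute SHA-256 over the raw DSSE envelope bytes and compare it
    with the envelopeDigest field (field names per the table above)."""
    envelope_bytes = base64.b64decode(attestation["dsseEnvelope"])
    computed = "sha256:" + hashlib.sha256(envelope_bytes).hexdigest()
    claimed = attestation["envelopeDigest"]
    if not claimed.startswith("sha256:"):
        claimed = "sha256:" + claimed
    return computed == claimed
```

This only confirms the metadata is self-consistent; the DSSE signature itself must still be verified against a trusted key.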
### Step 5: Validate Timeline
The timeline provides audit trail of bundle creation:
```bash
# View timeline events
jq '.' evidence-bundle/timeline.json
# Check for any failed events
jq '.[] | select(.errorCode != null)' evidence-bundle/timeline.json
```
#### Timeline Event Types
| Event Type | Description |
|------------|-------------|
| `airgap.import.started` | Bundle import initiated |
| `airgap.import.completed` | Import succeeded |
| `airgap.import.failed` | Import failed (check errorCode) |
## Error Codes Reference
| Code | Description | Resolution |
|------|-------------|------------|
| `AIRGAP_EGRESS_BLOCKED` | External URL blocked in sealed mode | Use mirror/portable media |
| `AIRGAP_SOURCE_UNTRUSTED` | Publisher not allowlisted | Contact administrator |
| `AIRGAP_SIGNATURE_MISSING` | Required signature absent | Re-export with signing |
| `AIRGAP_SIGNATURE_INVALID` | Signature verification failed | Check key/certificate |
| `AIRGAP_PAYLOAD_STALE` | Timestamp exceeds tolerance | Re-create bundle |
| `AIRGAP_PAYLOAD_MISMATCH` | Hash doesn't match metadata | Verify transfer integrity |
## Advisory AI Integration
### Quick Integrity Check
For automated pipelines, use the bundle manifest:
```python
import json
with open('bundle-manifest.json') as f:
manifest = json.load(f)
# Key fields for Advisory AI
print(f"Bundle ID: {manifest['bundleId']}")
print(f"Merkle Root: {manifest['merkleRoot']}")
print(f"Item Count: {manifest['itemCount']}")
print(f"Has Attestation: {manifest['hasAttestation']}")
```
### Evidence Lookup
Find evidence for specific observations:
```python
# Index evidence by observation ID
evidence_index = {e['observationId']: e for e in manifest['evidence']}
# Lookup specific observation
obs_id = 'obs-123-abc'
if obs_id in evidence_index:
entry = evidence_index[obs_id]
file_path = f"evidence/{entry['providerId']}/sha256_{entry['contentHash'][7:]}.json"
```
### Provenance Chain
Build complete provenance from bundle:
1. `bundle-manifest.json` → Bundle creation metadata
2. `manifest.json` → Evidence locker snapshot
3. `attestation.json` → Cryptographic attestation
4. `timeline.json` → Audit trail
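The four layers above can be stitched together programmatically. A minimal sketch, assuming the file names shown and that both manifests carry a `merkleRoot` field (`attestation.json` is optional, as noted earlier):

```python
import json
from pathlib import Path


def load_provenance(bundle_dir):
    """Assemble the provenance chain from an extracted bundle directory.
    attestation.json may be absent; missing files are recorded as None."""
    root = Path(bundle_dir)
    chain = {}
    for name in ("bundle-manifest.json", "manifest.json",
                 "attestation.json", "timeline.json"):
        path = root / name
        chain[name] = json.loads(path.read_text()) if path.exists() else None
    # Cross-check: both manifests must commit to the same Merkle root
    bm, m = chain["bundle-manifest.json"], chain["manifest.json"]
    if bm and m and bm.get("merkleRoot") != m.get("merkleRoot"):
        raise ValueError("merkleRoot mismatch between manifests")
    return chain
```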
## Offline Verification
For fully air-gapped environments:
1. Transfer bundle via approved media
2. Extract to isolated verification system
3. Run verification scripts without network
4. Document verification results for audit
## Support
For questions or issues:
- Review bundle contents with `jq` and standard Unix tools
- Check timeline for error codes and messages
- Contact StellaOps support with bundle ID and merkle root

# Portable Evidence Bundles (DOCS-AIRGAP-58-004)
Guidance for exporting/importing portable evidence bundles across enclaves.
## Bundle contents
- Evidence payloads (VEX observations/linksets) as NDJSON.
- Timeline events and attestation DSSE envelopes.
- Manifest with `bundleId`, `source`, `tenant`, `createdAt`, `files[]`, `dsseEnvelopeHash` (optional).
## Export
- Produce from Evidence Locker/Excititor with deterministic ordering and SHA-256 hashes.
- Include Merkle root over evidence files; store in manifest.
- Sign manifest (DSSE) when trust roots available.
## Import
- Verify manifest hash, Merkle root, and DSSE signature offline.
- Enforce tenant scoping; refuse cross-tenant bundles.
- Emit timeline event upon successful import.
## Constraints
- No external lookups; verification uses bundled roots.
- Max size per bundle configurable; default 500MB.
- Keep file paths UTF-8 and slash-separated; avoid host-specific metadata.
## Determinism
- Sort files lexicographically; use ISO-8601 UTC timestamps.
- Avoid re-compressing files; if tar is used, set deterministic headers (uid/gid=0, mtime=0).
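The deterministic-tar rules above can be sketched with the standard `tarfile` module; this is an illustrative helper (the member names and in-memory layout are assumptions, not part of the bundle spec):

```python
import io
import tarfile


def deterministic_tar(file_map):
    """Build a tar archive in memory with normalized headers so identical
    inputs always yield byte-identical output: uid/gid=0, mtime=0, empty
    owner names, and lexicographic member order, per the rules above."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w",
                      format=tarfile.USTAR_FORMAT) as tar:
        for name in sorted(file_map):  # lexicographic ordering
            data = file_map[name]
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            info.uid = info.gid = 0
            info.uname = info.gname = ""
            info.mtime = 0
            info.mode = 0o644
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()
```

Note that wrapping the result in gzip would reintroduce a timestamp in the gzip header unless it is explicitly zeroed.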

View File

@@ -0,0 +1,584 @@
# Proof Chain Verification in Air-Gap Mode
> **Version**: 1.0.0
> **Last Updated**: 2025-12-17
> **Related**: [Proof Chain API](../api/proofs.md), [Key Rotation Runbook](../operations/key-rotation-runbook.md)
This document describes how to verify proof chains in air-gapped (offline) environments where Rekor transparency log access is unavailable.
---
## Overview
Proof chains in StellaOps consist of cryptographically-linked attestations:
1. **Evidence statements** - Raw vulnerability findings
2. **Reasoning statements** - Policy evaluation traces
3. **VEX verdict statements** - Final vulnerability status determinations
4. **Graph root statements** - Merkle root commitments to graph analysis results
5. **Proof spine** - Merkle tree aggregating all components
In online mode, proof chains include Rekor inclusion proofs for transparency. In air-gap mode, verification proceeds without Rekor but maintains cryptographic integrity.
---
## Verification Levels
### Level 1: Content-Addressed ID Verification
Verifies that content-addressed IDs match payload hashes.
```bash
# Verify a proof bundle ID
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--level content-id
# Expected output:
# ✓ Content-addressed ID verified
# ✓ Payload hash: sha256:1a2b3c4d...
```
### Level 2: DSSE Signature Verification
Verifies DSSE envelope signatures against trust anchors.
```bash
# Verify signatures with local trust anchors
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--anchor-file /path/to/trust-anchors.json \
--level signature
# Expected output:
# ✓ DSSE signature valid
# ✓ Signer: key-2025-prod
# ✓ Trust anchor: 550e8400-e29b-41d4-a716-446655440000
```
### Level 3: Merkle Path Verification
Verifies the structure of the proof spine Merkle tree.
```bash
# Verify merkle paths
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--level merkle
# Expected output:
# ✓ Merkle root verified
# ✓ Evidence paths: 3/3 valid
# ✓ Reasoning path: valid
# ✓ VEX verdict path: valid
```
### Level 4: Full Verification (Offline)
Performs all verification steps except Rekor.
```bash
# Full offline verification
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--anchor-file /path/to/trust-anchors.json
# Expected output:
# Proof Chain Verification
# ═══════════════════════
# ✓ Content-addressed IDs verified
# ✓ DSSE signatures verified (3 envelopes)
# ✓ Merkle paths verified
# ⊘ Rekor verification skipped (offline mode)
#
# Overall: VERIFIED (offline)
```
---
## Trust Anchor Distribution
In air-gap environments, trust anchors must be distributed out-of-band.
### Export Trust Anchors
```bash
# On the online system, export trust anchors
stellaops anchor export --format json > trust-anchors.json
# Verify export integrity
sha256sum trust-anchors.json > trust-anchors.sha256
```
### Trust Anchor File Format
```json
{
"version": "1.0",
"exportedAt": "2025-12-17T00:00:00Z",
"anchors": [
{
"trustAnchorId": "550e8400-e29b-41d4-a716-446655440000",
"purlPattern": "pkg:*",
"allowedKeyids": ["key-2024-prod", "key-2025-prod"],
"allowedPredicateTypes": [
"evidence.stella/v1",
"reasoning.stella/v1",
"cdx-vex.stella/v1",
"proofspine.stella/v1"
],
"revokedKeys": ["key-2023-prod"],
"keyMaterial": {
"key-2024-prod": {
"algorithm": "ECDSA-P256",
"publicKey": "-----BEGIN PUBLIC KEY-----\n..."
},
"key-2025-prod": {
"algorithm": "ECDSA-P256",
"publicKey": "-----BEGIN PUBLIC KEY-----\n..."
}
}
}
]
}
```
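When consuming this file, only keys that are both allowlisted and not revoked should ever be used for verification. A minimal sketch, assuming the field names shown in the example above (`allowedKeyids`, `revokedKeys`, `keyMaterial`):

```python
import json


def usable_keys(anchor_file):
    """Map keyid -> PEM public key for keys that appear in allowedKeyids,
    are not listed in revokedKeys, and have key material present."""
    with open(anchor_file) as f:
        doc = json.load(f)
    keys = {}
    for anchor in doc["anchors"]:
        revoked = set(anchor.get("revokedKeys", []))
        for keyid in anchor.get("allowedKeyids", []):
            if keyid in revoked:
                continue
            material = anchor.get("keyMaterial", {}).get(keyid)
            if material:
                keys[keyid] = material["publicKey"]
    return keys
```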
### Import Trust Anchors
```bash
# On the air-gapped system
stellaops anchor import --file trust-anchors.json
# Verify import
stellaops anchor list
```
---
## Proof Bundle Distribution
### Export Proof Bundles
```bash
# Export a proof bundle for offline transfer
stellaops proof export \
--entry sha256:abc123:pkg:npm/lodash@4.17.21 \
--output proof-bundle.zip
# Bundle contents:
# proof-bundle.zip
# ├── proof-spine.json # The proof spine
# ├── evidence/ # Evidence statements
# │ ├── sha256_e1.json
# │ └── sha256_e2.json
# ├── reasoning.json # Reasoning statement
# ├── vex-verdict.json # VEX verdict statement
# ├── envelopes/ # DSSE envelopes
# │ ├── evidence-e1.dsse
# │ ├── evidence-e2.dsse
# │ ├── reasoning.dsse
# │ ├── vex-verdict.dsse
# │ └── proof-spine.dsse
# └── VERIFY.md # Verification instructions
```
### Verify Exported Bundle
```bash
# On the air-gapped system
stellaops proof verify --offline \
--bundle-file proof-bundle.zip \
--anchor-file trust-anchors.json
```
---
## Batch Verification
For audits, verify multiple proof bundles efficiently:
```bash
# Create a verification manifest
cat > verify-manifest.json << 'EOF'
{
"bundles": [
"sha256:1a2b3c4d...",
    "sha256:5e6f7a8b...",
    "sha256:9c0d1e2f..."
],
"options": {
"checkRekor": false,
"failFast": false
}
}
EOF
# Run batch verification
stellaops proof verify-batch \
--manifest verify-manifest.json \
--anchor-file trust-anchors.json \
--output verification-report.json
```
### Verification Report Format
```json
{
"verifiedAt": "2025-12-17T10:00:00Z",
"mode": "offline",
"anchorsUsed": ["550e8400..."],
"results": [
{
"proofBundleId": "sha256:1a2b3c4d...",
"verified": true,
"checks": {
"contentId": true,
"signature": true,
"merklePath": true,
"rekorInclusion": null
}
}
],
"summary": {
"total": 3,
"verified": 3,
"failed": 0,
"skipped": 0
}
}
```
---
## Graph Root Attestation Verification (Offline)
Graph root attestations provide tamper-evident commitment to graph analysis results. In air-gap mode, these attestations can be verified without network access.
### Verify Graph Root Attestation
```bash
# Verify a single graph root attestation
stellaops graph-root verify --offline \
--envelope graph-root.dsse \
--anchor-file trust-anchors.json
# Expected output:
# Graph Root Verification
# ═══════════════════════
# ✓ DSSE signature verified
# ✓ Predicate type: graph-root.stella/v1
# ✓ Graph type: ReachabilityGraph
# ✓ Canon version: stella:canon:v1
# ⊘ Rekor verification skipped (offline mode)
#
# Overall: VERIFIED (offline)
```
### Verify with Node/Edge Reconstruction
When you have the original graph data, you can recompute and verify the Merkle root:
```bash
# Verify with reconstruction
stellaops graph-root verify --offline \
--envelope graph-root.dsse \
--nodes nodes.json \
--edges edges.json \
--anchor-file trust-anchors.json
# Expected output:
# Graph Root Verification (with reconstruction)
# ═════════════════════════════════════════════
# ✓ DSSE signature verified
# ✓ Nodes canonicalized: 1234 entries
# ✓ Edges canonicalized: 5678 entries
# ✓ Merkle root recomputed: sha256:abc123...
# ✓ Merkle root matches claimed: sha256:abc123...
#
# Overall: VERIFIED (reconstructed)
```
### Graph Data File Formats
**nodes.json** - Array of node identifiers:
```json
{
"canonVersion": "stella:canon:v1",
"nodes": [
"pkg:npm/lodash@4.17.21",
"pkg:npm/express@4.18.2",
"pkg:npm/body-parser@1.20.0"
]
}
```
**edges.json** - Array of edge identifiers:
```json
{
"canonVersion": "stella:canon:v1",
"edges": [
"pkg:npm/express@4.18.2->pkg:npm/body-parser@1.20.0",
"pkg:npm/express@4.18.2->pkg:npm/lodash@4.17.21"
]
}
```
### Verification Steps (Detailed)
The offline graph root verification algorithm:
1. **Parse DSSE envelope** - Extract payload and signatures
2. **Decode in-toto statement** - Parse subject and predicate
3. **Verify signature** - Check DSSE signature against trust anchor allowed keys
4. **Validate predicate type** - Confirm `graph-root.stella/v1`
5. **Extract Merkle root** - Get claimed root from predicate
6. **If reconstruction requested**:
- Load nodes.json and edges.json
- Verify canon version matches predicate
- Sort nodes lexicographically
- Sort edges lexicographically
- Concatenate sorted lists
- Build SHA-256 Merkle tree
- Compare computed root to claimed root
7. **Emit verification result**
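The reconstruction in step 6 can be sketched as follows. The exact leaf encoding used here (SHA-256 over each UTF-8 identifier) is an assumption for illustration; the authoritative canonicalization rules are defined by `stella:canon:v1`.

```python
import hashlib


def graph_merkle_root(node_ids, edge_ids):
    """Recompute a Merkle root per step 6: sort nodes and edges
    lexicographically, concatenate the sorted lists, hash each identifier
    as a leaf, then fold pairwise with SHA-256 (duplicating an odd
    trailing node at each level)."""
    leaves = [hashlib.sha256(s.encode("utf-8")).digest()
              for s in sorted(node_ids) + sorted(edge_ids)]
    if not leaves:
        return "sha256:" + hashlib.sha256(b"").hexdigest()
    level = leaves
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])  # pad odd level by duplicating last hash
        level = [hashlib.sha256(level[i] + level[i + 1]).digest()
                 for i in range(0, len(level), 2)]
    return "sha256:" + level[0].hex()
```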
### Programmatic Verification (.NET)
```csharp
using StellaOps.Attestor.GraphRoot;
// Load trust anchors
var anchors = await TrustAnchors.LoadFromFileAsync("trust-anchors.json");
// Create verifier
var verifier = new GraphRootAttestor(signer, canonicalJsonSerializer);
// Load envelope
var envelope = await DsseEnvelope.LoadAsync("graph-root.dsse");
// Verify without reconstruction
var result = await verifier.VerifyAsync(
envelope,
trustAnchors: anchors,
verifyRekor: false);
// Verify with reconstruction
var nodeIds = new[] { "pkg:npm/lodash@4.17.21", "pkg:npm/express@4.18.2" };
var edgeIds = new[] { "pkg:npm/express@4.18.2->pkg:npm/lodash@4.17.21" };
var fullResult = await verifier.VerifyAsync(
envelope,
nodeIds: nodeIds,
edgeIds: edgeIds,
trustAnchors: anchors,
verifyRekor: false);
Console.WriteLine($"Verified: {fullResult.IsValid}");
Console.WriteLine($"Merkle root: {fullResult.MerkleRoot}");
```
### Integration with Proof Spine
Graph roots can be included in proof spines for comprehensive verification:
```bash
# Export proof bundle with graph roots
stellaops proof export \
--entry sha256:abc123:pkg:npm/lodash@4.17.21 \
--include-graph-roots \
--output proof-bundle.zip
# Bundle now includes:
# proof-bundle.zip
# ├── proof-spine.json
# ├── evidence/
# ├── reasoning.json
# ├── vex-verdict.json
# ├── graph-roots/ # Graph root attestations
# │ ├── reachability.dsse
# │ └── dependency.dsse
# ├── envelopes/
# └── VERIFY.md
# Verify with graph roots
stellaops proof verify --offline \
--bundle-file proof-bundle.zip \
--verify-graph-roots \
--anchor-file trust-anchors.json
```
### Determinism Requirements
For offline verification to succeed:
1. **Same canonicalization** - Use `stella:canon:v1` consistently
2. **Same ordering** - Lexicographic sort for nodes and edges
3. **Same encoding** - UTF-8 for all string operations
4. **Same hash algorithm** - SHA-256 for Merkle tree
---
## Key Rotation in Air-Gap Mode
When keys are rotated, trust anchor updates must be distributed:
### 1. Export Updated Anchors
```bash
# On online system after key rotation
stellaops anchor export --since 2025-01-01 > anchor-update.json
sha256sum anchor-update.json > anchor-update.sha256
```
### 2. Verify and Import Update
```bash
# On air-gapped system
sha256sum -c anchor-update.sha256
stellaops anchor import --file anchor-update.json --merge
# Verify key history
stellaops anchor show --anchor-id 550e8400... --show-history
```
### 3. Temporal Verification
When verifying old proofs after key rotation:
```bash
# Verify proof signed with now-revoked key
stellaops proof verify --offline \
--proof-bundle sha256:old-proof... \
--anchor-file trust-anchors.json \
--at-time "2024-06-15T12:00:00Z"
# The verification uses key validity at the specified time
```
---
## Manual Verification (No CLI)
For environments without the StellaOps CLI, manual verification is possible:
### 1. Verify Content-Addressed ID
```bash
# Extract payload from DSSE envelope
jq -r '.payload' proof-spine.dsse | base64 -d > payload.json
# Compute hash
sha256sum payload.json
# Compare with proof bundle ID
```
### 2. Verify DSSE Signature
```python
#!/usr/bin/env python3
import json
import base64
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.serialization import load_pem_public_key
def verify_dsse(envelope_path, public_key_pem):
"""Verify a DSSE envelope signature."""
with open(envelope_path) as f:
envelope = json.load(f)
payload_type = envelope['payloadType']
payload = base64.b64decode(envelope['payload'])
# Build PAE (Pre-Authentication Encoding)
pae = f"DSSEv1 {len(payload_type)} {payload_type} {len(payload)} ".encode() + payload
public_key = load_pem_public_key(public_key_pem.encode())
    for sig in envelope['signatures']:
        signature = base64.b64decode(sig['sig'])
        try:
            public_key.verify(signature, pae, ec.ECDSA(hashes.SHA256()))
            print(f"✓ Signature valid for keyid: {sig['keyid']}")
            return True
        except Exception as e:
            print(f"✗ Signature invalid for keyid {sig['keyid']}: {e}")
    # No signature in the envelope verified against this key
    return False
```
### 3. Verify Merkle Path
```python
#!/usr/bin/env python3
import json
import hashlib
def verify_merkle_path(leaf_hash, path, root_hash, leaf_index):
"""Verify a Merkle inclusion path."""
current = bytes.fromhex(leaf_hash)
index = leaf_index
for sibling in path:
sibling_bytes = bytes.fromhex(sibling)
if index % 2 == 0:
# Current is left child
combined = current + sibling_bytes
else:
# Current is right child
combined = sibling_bytes + current
current = hashlib.sha256(combined).digest()
index //= 2
computed_root = current.hex()
if computed_root == root_hash:
print("✓ Merkle path verified")
return True
else:
print(f"✗ Merkle root mismatch: {computed_root} != {root_hash}")
return False
```
---
## Exit Codes
Offline verification uses the same exit codes as online:
| Code | Meaning | CI/CD Action |
|------|---------|--------------|
| 0 | Verification passed | Proceed |
| 1 | Verification failed | Block |
| 2 | System error | Retry/investigate |
---
## Troubleshooting
### Missing Trust Anchor
```
Error: No trust anchor found for keyid "key-2025-prod"
```
**Solution**: Import updated trust anchors from online system.
### Key Not Valid at Time
```
Error: Key "key-2024-prod" was revoked at 2024-12-01, before proof signature at 2025-01-15
```
**Solution**: This indicates the proof was signed after key revocation. Investigate the signature timestamp.
### Merkle Path Invalid
```
Error: Merkle path verification failed for evidence sha256:e1...
```
**Solution**: The proof bundle may be corrupted. Re-export from online system.
---
## Related Documentation
- [Proof Chain API Reference](../api/proofs.md)
- [Key Rotation Runbook](../operations/key-rotation-runbook.md)
- [Portable Evidence Bundle Verification](portable-evidence-bundle-verification.md)
- [Offline Bundle Format](offline-bundle-format.md)

# Reachability Drift Air-Gap Workflows
**Sprint:** SPRINT_3600_0001_0001
**Task:** RDRIFT-MASTER-0006 - Document air-gap workflows for reachability drift
## Overview
Reachability Drift Detection can operate in fully air-gapped environments using offline bundles. This document describes the workflows for running reachability drift analysis without network connectivity, building on the Smart-Diff air-gap patterns.
## Prerequisites
1. **Offline Kit** - Downloaded and verified (`stellaops offline kit download`)
2. **Feed Snapshots** - Pre-staged vulnerability feeds and surfaces
3. **Call Graph Cache** - Pre-extracted call graphs for target artifacts
4. **Vulnerability Surface Bundles** - Pre-computed trigger method mappings
## Key Differences from Online Mode
| Aspect | Online Mode | Air-Gap Mode |
|--------|-------------|--------------|
| Surface Queries | Real-time API | Local bundle lookup |
| Call Graph Extraction | On-demand | Pre-computed + cached |
| Graph Diff | Direct comparison | Bundle-to-bundle |
| Attestation | Online transparency log | Offline DSSE bundle |
| Metrics | Telemetry enabled | Local-only metrics |
---
## Workflow 1: Offline Reachability Drift Analysis
### Step 1: Prepare Offline Bundle with Call Graphs
On a connected machine:
```bash
# Download offline kit with reachability bundles
stellaops offline kit download \
--output /path/to/offline-bundle \
--include-feeds nvd,osv,epss \
--include-surfaces \
--feed-date 2025-01-15
# Pre-extract call graphs for known artifacts
stellaops callgraph extract \
--artifact registry.example.com/app:v1 \
--artifact registry.example.com/app:v2 \
--output /path/to/offline-bundle/callgraphs \
--languages dotnet,nodejs,java,go,python
# Include vulnerability surface bundles
stellaops surfaces export \
--cve-list /path/to/known-cves.txt \
--output /path/to/offline-bundle/surfaces \
--format ndjson
# Package for transfer
stellaops offline kit package \
--input /path/to/offline-bundle \
--output stellaops-reach-offline-2025-01-15.tar.gz \
--sign
```
### Step 2: Transfer to Air-Gapped Environment
Transfer the bundle using approved media:
- USB drive (scanned and approved)
- Optical media (DVD/Blu-ray)
- Data diode
### Step 3: Import Bundle
On the air-gapped machine:
```bash
# Verify bundle signature
stellaops offline kit verify \
--input stellaops-reach-offline-2025-01-15.tar.gz \
--public-key /path/to/signing-key.pub
# Extract and configure
stellaops offline kit import \
--input stellaops-reach-offline-2025-01-15.tar.gz \
--data-dir /opt/stellaops/data
```
### Step 4: Run Reachability Drift Analysis
```bash
# Set offline mode
export STELLAOPS_OFFLINE=true
export STELLAOPS_DATA_DIR=/opt/stellaops/data
export STELLAOPS_SURFACES_DIR=/opt/stellaops/data/surfaces
export STELLAOPS_CALLGRAPH_CACHE=/opt/stellaops/data/callgraphs
# Run reachability drift
stellaops reach-drift \
--base-scan scan-v1.json \
--current-scan scan-v2.json \
--base-callgraph callgraph-v1.json \
--current-callgraph callgraph-v2.json \
--output drift-report.json \
--format json
```
---
## Workflow 2: Pre-Computed Drift Export
For environments that cannot run the full analysis, pre-compute drift results on a connected machine and export them for review.
### Step 1: Pre-Compute Drift Results
```bash
# On connected machine: compute drift
stellaops reach-drift \
--base-scan scan-v1.json \
--current-scan scan-v2.json \
--output drift-results.json \
--include-witnesses \
--include-paths
# Generate offline viewer bundle
stellaops offline viewer export \
--drift-report drift-results.json \
--output drift-viewer-bundle.html \
--self-contained
```
### Step 2: Transfer and Review
The self-contained HTML viewer can be opened in any browser on the air-gapped machine without additional dependencies.
---
## Workflow 3: Incremental Call Graph Updates
Use this workflow to update call graphs incrementally when a full re-extraction is impractical.
### Step 1: Export Graph Delta
On connected machine after code changes:
```bash
# Extract delta since last snapshot
stellaops callgraph delta \
--base-snapshot callgraph-v1.json \
--current-source /path/to/code \
--output graph-delta.json
```
### Step 2: Apply Delta in Air-Gap
```bash
# Merge delta into existing graph
stellaops callgraph merge \
--base /opt/stellaops/data/callgraphs/app-v1.json \
--delta graph-delta.json \
--output /opt/stellaops/data/callgraphs/app-v2.json
```
---
## Bundle Contents
### Call Graph Bundle Structure
```
callgraphs/
├── manifest.json # Bundle metadata
├── checksums.sha256 # Content hashes
├── app-v1/
│ ├── snapshot.json # CallGraphSnapshot
│ ├── entrypoints.json # Entrypoint index
│ └── sinks.json # Sink index
└── app-v2/
├── snapshot.json
├── entrypoints.json
└── sinks.json
```
### Surface Bundle Structure
```
surfaces/
├── manifest.json # Bundle metadata
├── checksums.sha256 # Content hashes
├── by-cve/
│ ├── CVE-2024-1234.json # Surface + triggers
│ └── CVE-2024-5678.json
└── by-package/
├── nuget/
│ └── Newtonsoft.Json/
│ └── surfaces.ndjson
└── npm/
└── lodash/
└── surfaces.ndjson
```
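Both bundle layouts above carry a `checksums.sha256` file that can be verified without any StellaOps tooling. A minimal sketch, assuming the standard `sha256sum` line format (`<hex>  <relative path>`):

```python
import hashlib
from pathlib import Path


def verify_checksums(bundle_dir):
    """Check every entry in checksums.sha256 against the files on disk.
    Returns the list of relative paths whose hashes do not match."""
    root = Path(bundle_dir)
    failures = []
    for line in (root / "checksums.sha256").read_text().splitlines():
        if not line.strip():
            continue
        expected, _, rel = line.partition("  ")
        actual = hashlib.sha256((root / rel).read_bytes()).hexdigest()
        if actual != expected:
            failures.append(rel)
    return failures
```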
---
## Offline Surface Query
When running in air-gap mode, the surface query service automatically uses local bundles:
```csharp
// Configuration for air-gap mode
services.AddSingleton<ISurfaceQueryService>(sp =>
{
var options = sp.GetRequiredService<IOptions<AirGapOptions>>().Value;
if (options.Enabled)
{
return new OfflineSurfaceQueryService(
options.SurfacesBundlePath,
sp.GetRequiredService<ILogger<OfflineSurfaceQueryService>>());
}
return sp.GetRequiredService<OnlineSurfaceQueryService>();
});
```
---
## Attestation in Air-Gap Mode
Reachability drift results can be attested even in offline mode using pre-provisioned signing keys:
```bash
# Sign drift results with offline key
stellaops attest sign \
--input drift-results.json \
--predicate-type https://stellaops.io/attestation/reachability-drift/v1 \
--key /opt/stellaops/keys/signing-key.pem \
--output drift-attestation.dsse.json
# Verify attestation (offline)
stellaops attest verify \
--input drift-attestation.dsse.json \
--trust-root /opt/stellaops/keys/trust-root.json
```
---
## Staleness Considerations
### Call Graph Freshness
Call graphs should be re-extracted when:
- Source code changes significantly
- Dependencies are updated
- Framework versions change
Maximum recommended staleness: **7 days** for active development, **30 days** for stable releases.
### Surface Bundle Freshness
Surface bundles should be updated when:
- New CVEs are published
- Vulnerability details are refined
- Trigger methods are updated
Maximum recommended staleness: **24 hours** for high-security environments, **7 days** for standard environments.
### Staleness Indicators
```bash
# Check bundle freshness
stellaops offline status \
--data-dir /opt/stellaops/data
# Output:
# Bundle Type | Last Updated | Age | Status
# -----------------|---------------------|--------|--------
# NVD Feed | 2025-01-15T00:00:00 | 3 days | OK
# OSV Feed | 2025-01-15T00:00:00 | 3 days | OK
# Surfaces | 2025-01-14T12:00:00 | 4 days | WARNING
# Call Graphs (v1) | 2025-01-10T08:00:00 | 8 days | STALE
```
---
## Determinism Requirements
All offline workflows must produce deterministic results:
1. **Call Graph Extraction** - Same source produces identical graph hash
2. **Drift Detection** - Same inputs produce identical drift report
3. **Path Witnesses** - Same reachability query produces identical paths
4. **Attestation** - Signature over canonical JSON (sorted keys, no whitespace)
Verification:
```bash
# Verify determinism
stellaops reach-drift \
--base-scan scan-v1.json \
--current-scan scan-v2.json \
--output drift-1.json
stellaops reach-drift \
--base-scan scan-v1.json \
--current-scan scan-v2.json \
--output drift-2.json
# Must be identical
diff drift-1.json drift-2.json
# (no output = identical)
```
---
## Troubleshooting
### Missing Surface Data
```
Error: No surface found for CVE-2024-1234 in package pkg:nuget/Newtonsoft.Json@12.0.1
```
**Resolution:** Update surface bundle or fall back to package-API-level reachability:
```bash
stellaops reach-drift \
--fallback-mode package-api \
...
```
### Call Graph Extraction Failure
```
Error: Failed to extract call graph - missing language support for 'rust'
```
**Resolution:** Pre-extract call graphs on a machine with required tooling, or skip unsupported languages:
```bash
stellaops callgraph extract \
--skip-unsupported \
...
```
### Bundle Signature Verification Failure
```
Error: Bundle signature invalid - public key mismatch
```
**Resolution:** Ensure correct public key is used, or re-download bundle:
```bash
# List available trust roots
stellaops offline trust-roots list
# Import new trust root (requires approval)
stellaops offline trust-roots import \
--key new-signing-key.pub \
--fingerprint <expected-fingerprint>
```
---
## Related Documentation
- [Smart-Diff Air-Gap Workflows](smart-diff-airgap-workflows.md)
- [Offline Bundle Format](offline-bundle-format.md)
- [Air-Gap Operations](operations.md)
- [Staleness and Time](staleness-and-time.md)
- [Sealing and Egress](sealing-and-egress.md)

# Risk Bundles (Airgap)
Risk bundles package vulnerability intelligence data for offline/air-gapped environments. They provide deterministic, signed archives containing provider datasets (CISA KEV, FIRST EPSS, OSV) that can be verified and imported without network connectivity.
## Bundle Structure
A risk bundle is a gzip-compressed tar archive (`risk-bundle.tar.gz`) with the following structure:
```
risk-bundle.tar.gz
├── manifests/
│ └── provider-manifest.json # Bundle metadata and provider entries
├── providers/
│ ├── cisa-kev/
│ │ └── snapshot # CISA Known Exploited Vulnerabilities JSON
│ ├── first-epss/
│ │ └── snapshot # FIRST EPSS scores CSV/JSON
│ └── osv/ # (optional) OpenSSF OSV bulk JSON
│ └── snapshot
└── signatures/
└── provider-manifest.dsse # DSSE envelope for manifest
```
## Provider Manifest
The `provider-manifest.json` contains bundle metadata and per-provider entries:
```json
{
"version": "1.0.0",
"bundleId": "risk-bundle-20241211-120000",
"createdAt": "2024-12-11T12:00:00Z",
"inputsHash": "sha256:abc123...",
"providers": [
{
"providerId": "cisa-kev",
"digest": "sha256:def456...",
"snapshotDate": "2024-12-11T00:00:00Z",
"optional": false
},
{
"providerId": "first-epss",
"digest": "sha256:789abc...",
"snapshotDate": "2024-12-11T00:00:00Z",
"optional": true
}
]
}
```
| Field | Description |
|-------|-------------|
| `version` | Manifest schema version (currently `1.0.0`) |
| `bundleId` | Unique identifier for this bundle |
| `createdAt` | ISO-8601 UTC timestamp of bundle creation |
| `inputsHash` | SHA-256 hash of concatenated provider digests (deterministic ordering) |
| `providers[]` | Array of provider entries sorted by `providerId` |
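Because the provider array is sorted by `providerId`, `inputsHash` can be reproduced outside the builder. A minimal sketch, assuming the hash is taken over the newline-joined, `providerId`-ordered digests (the exact join/separator rule is the builder's concern, so treat this as illustrative):

```shell
# Reproduce inputsHash: join the provider digests in sorted (providerId)
# order and hash the result. Digests are the sample values from above.
printf '%s\n' \
  "sha256:def456..." \
  "sha256:789abc..." \
  | sha256sum | awk '{print "sha256:" $1}'
```

Running the same pipeline twice over the same manifest must yield the same hash; that is the determinism property the field exists to capture.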
### Provider Entry Fields
| Field | Description |
|-------|-------------|
| `providerId` | Provider identifier (`cisa-kev`, `first-epss`, `osv`) |
| `digest` | SHA-256 hash of snapshot file (`sha256:<hex>`) |
| `snapshotDate` | ISO-8601 timestamp of provider data snapshot |
| `optional` | Whether provider is required for bundle validity |
## Provider Catalog
| Provider | Source | Coverage | Refresh | Required |
|----------|--------|----------|---------|----------|
| `cisa-kev` | CISA Known Exploited Vulnerabilities | Exploited CVEs with KEV flag | Daily | Yes |
| `first-epss` | FIRST EPSS scores | Exploitation probability per CVE | Daily | No |
| `osv` | OpenSSF OSV | OSS advisories with affected ranges | Weekly | No (opt-in) |
## Building Risk Bundles
### Using the Export Worker
The ExportCenter worker can build risk bundles via the `stella export risk-bundle` job:
```bash
# Build bundle with default providers (CISA KEV + EPSS)
stella export risk-bundle --output /path/to/output
# Include OSV providers (larger bundle)
stella export risk-bundle --output /path/to/output --include-osv
# Build with specific bundle ID
stella export risk-bundle --output /path/to/output --bundle-id "custom-bundle-id"
```
### Using the CI Build Script
For CI pipelines and deterministic testing, use the shell scripts:
```bash
# Build fixture bundle for CI testing (deterministic)
ops/devops/risk-bundle/build-bundle.sh --output /tmp/bundle --fixtures-only
# Build with OSV
ops/devops/risk-bundle/build-bundle.sh --output /tmp/bundle --fixtures-only --include-osv
# Build with custom bundle ID
ops/devops/risk-bundle/build-bundle.sh --output /tmp/bundle --fixtures-only --bundle-id "ci-test-bundle"
```
### Build Script Options
| Option | Description |
|--------|-------------|
| `--output <dir>` | Output directory for bundle artifacts (required) |
| `--fixtures-only` | Use fixture data instead of live provider downloads |
| `--include-osv` | Include OSV providers (increases bundle size) |
| `--bundle-id <id>` | Custom bundle ID (default: auto-generated with timestamp) |
### Build Outputs
After building, the output directory contains:
```
output/
├── risk-bundle.tar.gz # The bundle archive
├── risk-bundle.tar.gz.sha256 # SHA-256 checksum
└── manifest.json # Copy of provider-manifest.json
```
## Verifying Risk Bundles
### Using the CLI
```bash
# Basic verification
stella risk bundle verify --bundle-path ./risk-bundle.tar.gz
# With detached signature
stella risk bundle verify --bundle-path ./risk-bundle.tar.gz --signature-path ./bundle.sig
# Check Sigstore Rekor transparency log
stella risk bundle verify --bundle-path ./risk-bundle.tar.gz --check-rekor
# JSON output for automation
stella risk bundle verify --bundle-path ./risk-bundle.tar.gz --json
# Verbose output with warnings
stella risk bundle verify --bundle-path ./risk-bundle.tar.gz --verbose
```
### CLI Options
| Option | Description |
|--------|-------------|
| `--bundle-path, -b` | Path to risk bundle file (required) |
| `--signature-path, -s` | Path to detached signature file |
| `--check-rekor` | Verify transparency log entry in Sigstore Rekor |
| `--json` | Output results as JSON |
| `--tenant` | Tenant context for verification |
| `--verbose` | Show detailed output including warnings |
### Using the Verification Script
For offline/air-gap verification without the CLI:
```bash
# Basic verification
ops/devops/risk-bundle/verify-bundle.sh /path/to/risk-bundle.tar.gz
# With detached signature
ops/devops/risk-bundle/verify-bundle.sh /path/to/risk-bundle.tar.gz --signature /path/to/bundle.sig
# Strict mode (warnings are errors)
ops/devops/risk-bundle/verify-bundle.sh /path/to/risk-bundle.tar.gz --strict
# JSON output
ops/devops/risk-bundle/verify-bundle.sh /path/to/risk-bundle.tar.gz --json
```
### Verification Steps
The verification process performs these checks:
1. **Archive integrity** - Bundle is a valid tar.gz archive
2. **Structure validation** - Required files present (`manifests/provider-manifest.json`)
3. **Manifest parsing** - Valid JSON with required fields (`bundleId`, `version`, `providers`)
4. **Provider hash verification** - Each provider snapshot matches its declared digest
5. **Mandatory provider check** - `cisa-kev` must be present and valid
6. **DSSE signature validation** - Manifest signature verified (if present)
7. **Detached signature** - Bundle archive signature verified (if provided)
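Step 4 can be reproduced by hand with coreutils alone. A self-contained sketch — here the expected digest is computed inline, whereas in a real check it comes from `provider-manifest.json`:

```shell
# Recreate step 4: check a provider snapshot against its declared digest.
mkdir -p bundle/providers/cisa-kev
printf 'demo snapshot\n' > bundle/providers/cisa-kev/snapshot
# In practice, read this digest from the manifest's provider entry.
expected=$(sha256sum bundle/providers/cisa-kev/snapshot | awk '{print $1}')
echo "${expected}  bundle/providers/cisa-kev/snapshot" | sha256sum -c -
```

`sha256sum -c` reports `OK` per file and exits non-zero on any mismatch, which makes it safe to use directly in scripts.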
### Exit Codes
| Code | Meaning |
|------|---------|
| 0 | Bundle is valid |
| 1 | Bundle is invalid or verification failed |
| 2 | Input error (missing file, bad arguments) |
### JSON Output Format
```json
{
"valid": true,
"bundleId": "risk-bundle-20241211-120000",
"version": "1.0.0",
"providerCount": 2,
"mandatoryProviderFound": true,
"errorCount": 0,
"warningCount": 1,
"errors": [],
"warnings": ["Optional provider not found: osv"]
}
```
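In automation, the JSON verdict can gate a pipeline step. A minimal sketch using `grep` (prefer `jq -e '.valid'` where `jq` is installed; the report content is illustrative):

```shell
# Fail the pipeline step unless the verifier reported a valid bundle.
cat > report.json <<'EOF'
{"valid": true, "errorCount": 0}
EOF
if grep -q '"valid": true' report.json; then
  echo "risk bundle verified"
else
  echo "risk bundle verification failed" >&2
  exit 1
fi
```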
## Importing Risk Bundles
### Prerequisites
1. Verify the bundle before import (see above)
2. Ensure the target system has sufficient storage
3. Back up existing provider data if replacing
### Import Steps
1. **Transfer the bundle** to the air-gapped environment via approved media
2. **Verify the bundle** using the CLI or verification script
3. **Extract to staging**:
```bash
mkdir -p /staging/risk-bundle
tar -xzf risk-bundle.tar.gz -C /staging/risk-bundle
```
4. **Validate provider data**:
```bash
# Verify individual provider hashes
sha256sum /staging/risk-bundle/providers/cisa-kev/snapshot
sha256sum /staging/risk-bundle/providers/first-epss/snapshot
```
5. **Import into Concelier**:
```bash
stella concelier import-risk-bundle --path /staging/risk-bundle
```
### Error Handling
| Error | Cause | Resolution |
|-------|-------|------------|
| "Bundle is not a valid tar.gz archive" | Corrupted download/transfer | Re-download and verify checksum |
| "Missing required file: manifests/provider-manifest.json" | Incomplete bundle | Rebuild bundle |
| "Missing mandatory provider: cisa-kev" | KEV snapshot missing | Rebuild with valid provider data |
| "Hash mismatch: cisa-kev" | Corrupted provider data | Re-transfer the bundle, or rebuild it from a fresh provider snapshot |
| "DSSE signature validation failed" | Tampered manifest | Investigate chain of custody |
## CI/CD Integration
### GitHub Actions / Gitea Workflow
The `.gitea/workflows/risk-bundle-ci.yml` workflow:
1. **Build job**: Compiles RiskBundles library, runs tests, builds fixture bundle
2. **Offline kit job**: Packages bundle for offline kit distribution
3. **Publish checksums job**: Publishes checksums to artifact store (main branch only)
```yaml
# Trigger manually or on push to relevant paths
on:
push:
paths:
- 'src/ExportCenter/StellaOps.ExportCenter.RiskBundles/**'
- 'ops/devops/risk-bundle/**'
workflow_dispatch:
inputs:
include_osv:
type: boolean
default: false
```
### Offline Kit Integration
Risk bundles are included in the Offline Update Kit:
```
offline-kit/
└── risk-bundles/
├── risk-bundle.tar.gz
├── risk-bundle.tar.gz.sha256
├── manifest.json
├── checksums.txt
└── kit-manifest.json
```
The `kit-manifest.json` provides metadata for offline kit consumers:
```json
{
"component": "risk-bundle",
"version": "20241211-120000",
"files": [
{"path": "risk-bundle.tar.gz", "checksum_file": "risk-bundle.tar.gz.sha256"},
{"path": "manifest.json", "checksum_file": "manifest.json.sha256"}
],
"verification": {
"checksums": "checksums.txt",
"signature": "risk-bundle.tar.gz.sig"
}
}
```
## Signing and Trust
### DSSE Manifest Signature
The `signatures/provider-manifest.dsse` file contains a Dead Simple Signing Envelope:
```json
{
"payloadType": "application/vnd.stellaops.risk-bundle.manifest+json",
"payload": "<base64-encoded-manifest>",
"signatures": [
{
"keyid": "risk-bundle-signing-key",
"sig": "<signature>"
}
]
}
```
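The sign/verify flow can be exercised end-to-end with `openssl` and a throwaway key. Note one deliberate simplification: a spec-compliant DSSE signature covers the PAE encoding of `payloadType` + payload, whereas this sketch signs the raw manifest bytes for brevity; the key names and payload are illustrative:

```shell
# Throwaway ECDSA P-256 key pair (assumption: P-256 keys; adjust per profile).
openssl ecparam -genkey -name prime256v1 -noout -out sk.pem
openssl ec -in sk.pem -pubout -out pk.pem
printf '{"bundleId":"demo-bundle"}' > manifest.json
# NOTE: real DSSE signs PAE(payloadType, payload), not the raw payload.
openssl dgst -sha256 -sign sk.pem -out manifest.sig manifest.json
openssl dgst -sha256 -verify pk.pem -signature manifest.sig manifest.json
```

Successful verification prints `Verified OK`; any tampering with `manifest.json` after signing makes the last step exit non-zero.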
### Offline Trust Roots
For air-gapped verification, include public keys in the bundle:
```
signatures/
├── provider-manifest.dsse
└── pubkeys/
└── <tenant>.pem
```
### Sigstore/Rekor Integration
When `--check-rekor` is specified, verification queries the Sigstore Rekor transparency log to confirm the bundle was published to the public ledger.
## Determinism Checklist
Risk bundles are designed for reproducible builds:
- [x] Fixed timestamps for tar entries (`--mtime="@<epoch>"`)
- [x] Sorted file ordering (`--sort=name`)
- [x] Numeric owner/group (`--owner=0 --group=0 --numeric-owner`)
- [x] Deterministic gzip compression (`gzip -n`)
- [x] Providers sorted by `providerId` in manifest
- [x] Files sorted lexicographically in bundle
- [x] UTF-8 canonical paths
- [x] ISO-8601 UTC timestamps
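The checklist items combine into a single archive invocation. A sketch that builds the same tree twice and confirms byte-identical output (GNU tar is assumed for `--sort`/`--mtime`; the tree contents are illustrative):

```shell
# Build twice with the determinism flags and compare digests.
mkdir -p tree/providers
printf 'snapshot\n' > tree/providers/snapshot
build() {
  tar --sort=name --mtime="@1700000000" \
      --owner=0 --group=0 --numeric-owner \
      -cf - -C tree . | gzip -n > "$1"   # gzip -n: no embedded timestamp
}
build a.tar.gz
build b.tar.gz
sha256sum a.tar.gz b.tar.gz
```

The two digests must match; if they differ, one of the flags above is missing or a non-GNU tar is silently ignoring it.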
## Troubleshooting
### Common Issues
**Q: Bundle verification fails with "jq not available"**
A: The verification script uses `jq` for JSON parsing. Install it or use the CLI (`stella risk bundle verify`) which has built-in JSON support.
**Q: Hash mismatch after transfer**
A: Transfers across removable media can silently corrupt files. Always compare checksums:
```bash
# On source system
sha256sum risk-bundle.tar.gz > checksum.txt
# On target system
sha256sum -c checksum.txt
```
**Q: "Optional provider not found" warning**
A: This is informational. Optional providers (EPSS, OSV) enhance risk analysis but aren't required. Use `--strict` if you want to enforce their presence.
**Q: DSSE signature validation fails in air-gap**
A: Ensure the offline trust root is configured:
```bash
stella config set risk-bundle.trust-root /path/to/pubkey.pem
```
## Related Documentation
- [Offline Update Kit](../OFFLINE_KIT.md) - Complete offline kit documentation
- [Mirror Bundles](./mirror-bundles.md) - OCI artifact bundles for air-gap
- [Provider Matrix](../modules/export-center/operations/risk-bundle-provider-matrix.md) - Detailed provider specifications
- [ExportCenter Architecture](../modules/export-center/architecture.md) - Export service design

# Air-Gap Operations Runbook for Score Proofs & Reachability
> **Version**: 1.0.0
> **Sprint**: 3500.0004.0004
> **Last Updated**: 2025-12-20
This runbook covers air-gapped operations for Score Proofs and Reachability features, including offline kit deployment, proof verification, and bundle management.
---
## Table of Contents
1. [Overview](#1-overview)
2. [Offline Kit Deployment](#2-offline-kit-deployment)
3. [Score Proofs in Air-Gap Mode](#3-score-proofs-in-air-gap-mode)
4. [Reachability in Air-Gap Mode](#4-reachability-in-air-gap-mode)
5. [Bundle Import Operations](#5-bundle-import-operations)
6. [Proof Verification Offline](#6-proof-verification-offline)
7. [Troubleshooting](#7-troubleshooting)
8. [Monitoring & Alerting](#8-monitoring--alerting)
---
## 1. Overview
### Air-Gap Modes
| Mode | Network | Use Case |
|------|---------|----------|
| **Sealed** | No external connectivity | Classified environments |
| **Constrained** | Limited egress (allowlist) | Regulated networks |
| **Hybrid** | Selective connectivity | Standard enterprise |
### Score Proofs Air-Gap Capabilities
| Feature | Sealed | Constrained | Hybrid |
|---------|--------|-------------|--------|
| Score computation | ✅ | ✅ | ✅ |
| Score replay | ✅ | ✅ | ✅ |
| Proof generation | ✅ | ✅ | ✅ |
| Proof verification | ✅ | ✅ | ✅ |
| Rekor logging | ❌ | 🔶 (optional) | ✅ |
| Feed updates | Bundle | Bundle | Online |
### Reachability Air-Gap Capabilities
| Feature | Sealed | Constrained | Hybrid |
|---------|--------|-------------|--------|
| Call graph upload | ✅ | ✅ | ✅ |
| Reachability compute | ✅ | ✅ | ✅ |
| Explain queries | ✅ | ✅ | ✅ |
| Symbol resolution | Bundle | Bundle | Online |
---
## 2. Offline Kit Deployment
### 2.1 Offline Kit Contents
The offline kit contains everything needed for air-gapped Score Proofs and Reachability:
```
offline-kit/
├── manifests/
│ ├── kit-manifest.json # Kit metadata and versions
│ ├── feed-manifest.json # Advisory feed snapshot
│ └── vex-manifest.json # VEX data snapshot
├── feeds/
│ ├── concelier/ # Advisory feed data
│ │ ├── advisories.ndjson
│ │ └── snapshot.dsse.json
│ └── excititor/ # VEX data
│ ├── vex-statements.ndjson
│ └── snapshot.dsse.json
├── policies/
│ ├── scoring-policy.yaml
│ └── policy.dsse.json
├── trust/
│ ├── trust-anchors.json # Public keys for verification
│ └── time-anchor.json # Time attestation
├── symbols/
│ └── symbol-index.db # Symbol resolution database
└── tools/
├── stella-cli # CLI binary
└── verify-kit.sh # Kit verification script
```
### 2.2 Verify Kit Integrity
Before deployment, always verify the offline kit:
```bash
# Verify kit signature
stella airgap verify-kit --kit /path/to/offline-kit
# Output:
# Kit manifest: VALID
# Feed snapshot: VALID (sha256:feed123...)
# VEX snapshot: VALID (sha256:vex456...)
# Policy: VALID (sha256:policy789...)
# Trust anchors: VALID (3 anchors)
# Time anchor: VALID (expires: 2025-12-31T00:00:00Z)
#
# Kit verification: PASSED
# Verify individual components
stella airgap verify --file feeds/concelier/snapshot.dsse.json
stella airgap verify --file feeds/excititor/snapshot.dsse.json
stella airgap verify --file policies/policy.dsse.json
```
### 2.3 Deploy Offline Kit
```bash
# Deploy kit (sealed mode)
stella airgap deploy --kit /path/to/offline-kit --mode sealed
# Deploy kit (constrained mode with limited egress)
stella airgap deploy --kit /path/to/offline-kit \
--mode constrained \
--egress-allowlist https://rekor.sigstore.dev
# Verify deployment
stella airgap status
# Output:
# Mode: sealed
# Kit version: 2025.12.20
# Feed snapshot: sha256:feed123... (2025-12-20)
# VEX snapshot: sha256:vex456... (2025-12-20)
# Policy: sha256:policy789... (v1.2.3)
# Trust anchors: 3 active
# Time anchor: Valid until 2025-12-31
# Staleness: OK (0 days)
```
### 2.4 Kit Updates
```bash
# Check for kit updates (requires external access or new media)
stella airgap check-update --current-kit /path/to/current-kit
# Import new kit
stella airgap import-kit --kit /path/to/new-kit --validate
# Rollback to previous kit
stella airgap rollback --generation previous
```
---
## 3. Score Proofs in Air-Gap Mode
### 3.1 Create Scan (Air-Gap)
```bash
# Create scan referencing offline kit snapshots
stella scan create --artifact sha256:abc123... \
--airgap \
--feed-snapshot sha256:feed123... \
--vex-snapshot sha256:vex456... \
--policy-snapshot sha256:policy789...
# Or auto-detect from deployed kit
stella scan create --artifact sha256:abc123... --use-offline-kit
```
### 3.2 Score Replay (Air-Gap)
```bash
# Replay with offline kit snapshots
stella score replay --scan-id $SCAN_ID --offline
# Replay with specific bundle
stella score replay --scan-id $SCAN_ID \
--offline \
--bundle /path/to/proof-bundle.zip
# Compare with different kit versions
stella score replay --scan-id $SCAN_ID \
--offline \
--feed-snapshot sha256:newfeed... \
--diff
```
### 3.3 Generate Proof Bundle (Air-Gap)
```bash
# Generate proof bundle for export
stella proof export --scan-id $SCAN_ID \
--include-manifest \
--include-chain \
--output proof-bundle.zip
# Proof bundle contents (air-gap safe):
# - manifest.json (canonical)
# - manifest.dsse.json
# - score_proof.json
# - proof_root.dsse.json
# - meta.json
# - NO external references
# Generate portable bundle
stella proof export --scan-id $SCAN_ID \
--portable \
--include-trust-anchors \
--output portable-proof.zip
```
---
## 4. Reachability in Air-Gap Mode
### 4.1 Call Graph Operations (Air-Gap)
```bash
# Upload call graph (works identically)
stella scan graph upload --scan-id $SCAN_ID --file callgraph.json
# Call graph processing is fully local
# No external network required
```
### 4.2 Compute Reachability (Air-Gap)
```bash
# Compute reachability (fully offline)
stella reachability compute --scan-id $SCAN_ID --offline
# Symbol resolution uses offline database
stella reachability compute --scan-id $SCAN_ID \
--offline \
--symbol-db /path/to/offline-kit/symbols/symbol-index.db
```
### 4.3 Explain Queries (Air-Gap)
```bash
# Explain queries work offline
stella reachability explain --scan-id $SCAN_ID \
--cve CVE-2024-1234 \
--purl "pkg:npm/lodash@4.17.20" \
--offline
# Export explanations for external review
stella reachability explain-all --scan-id $SCAN_ID \
--output explanations.json \
--offline
```
---
## 5. Bundle Import Operations
### 5.1 Import Feed Updates
```bash
# Verify feed bundle before import
stella airgap verify --bundle feed-update.zip
# Dry-run import
stella airgap import --bundle feed-update.zip \
--type feed \
--dry-run
# Import feed bundle
stella airgap import --bundle feed-update.zip \
--type feed \
--generation 2025.12.21
# Verify import
stella airgap verify-import --generation 2025.12.21
```
### 5.2 Import VEX Updates
```bash
# Import VEX bundle
stella airgap import --bundle vex-update.zip \
--type vex \
--generation 2025.12.21
# Verify VEX statements
stella airgap vex-status
# Output:
# VEX statements: 15,432
# Last update: 2025-12-21
# Generation: 2025.12.21
# Signature: VALID
```
### 5.3 Import Trust Anchors
```bash
# Import new trust anchor (requires approval)
stella airgap import-anchor --file new-anchor.json \
--reason "Key rotation Q4 2025" \
--approver admin@example.com
# Verify anchor chain
stella airgap verify-anchors
# List active anchors
stella airgap anchors list
```
### 5.4 Import Checklist
**Pre-Import**:
- [ ] Verify bundle signature (DSSE)
- [ ] Verify bundle hash matches manifest
- [ ] Confirm sealed/constrained mode is set
- [ ] Backup current generation
**Import**:
- [ ] Run dry-run import
- [ ] Apply import
- [ ] Verify import succeeded
**Post-Import**:
- [ ] Verify timeline event emitted
- [ ] Update staleness dashboard
- [ ] Archive import manifest
- [ ] Update audit log
---
## 6. Proof Verification Offline
### 6.1 Verify Proof Bundle (Full Offline)
```bash
# Verify proof bundle without any network access
stella proof verify --bundle proof-bundle.zip \
--offline \
--trust-anchor /path/to/trust-anchors.json
# Verification checks (offline):
# ✅ Signature valid (DSSE)
# ✅ Content-addressed ID matches
# ✅ Merkle path valid
# ⏭️ Rekor inclusion (SKIPPED - offline mode)
# ✅ Time anchor valid
```
### 6.2 Verify with Portable Bundle
```bash
# Portable bundles include trust anchors
stella proof verify --bundle portable-proof.zip \
--offline \
--self-contained
# Output:
# Using embedded trust anchors
# Signature verification: PASS
# ID recomputation: PASS
# Merkle path: PASS
# Time anchor: VALID
# Overall: VERIFIED
```
### 6.3 Batch Verification
```bash
# Verify multiple bundles
stella proof verify-batch --dir /path/to/bundles/ \
--offline \
--trust-anchor /path/to/trust-anchors.json \
--output verification-report.json
# Generate verification report
cat verification-report.json | jq '.summary'
# Output:
# {
# "total": 100,
# "verified": 98,
# "failed": 2,
# "skipped": 0
# }
```
### 6.4 Verification Without CLI
For environments without the CLI, manual verification is possible:
```bash
# 1. Extract bundle
unzip proof-bundle.zip -d ./verify/
# 2. Verify DSSE signature (using openssl)
# Extract payload from DSSE envelope
cat ./verify/manifest.dsse.json | jq -r '.payload' | base64 -d > payload.json
# Verify signature
# NOTE: per the DSSE spec, signatures cover the PAE encoding of
# payloadType + payload, not the raw payload bytes; the direct check below
# only succeeds for envelopes signed over the raw payload.
cat ./verify/manifest.dsse.json | jq -r '.signatures[0].sig' | base64 -d > signature.bin
openssl dgst -sha256 -verify trust-anchor-pubkey.pem -signature signature.bin payload.json
# 3. Verify content-addressed ID
# Compute canonical hash
cat ./verify/manifest.json | jq -cS . | sha256sum
# Compare with manifestHash in bundle
# 4. Verify merkle path
# (See docs/airgap/proof-chain-verification.md for algorithm)
```
---
## 7. Troubleshooting
### 7.1 Kit Verification Failed
**Symptom**: `stella airgap verify-kit` fails.
**Diagnosis**:
```bash
# Check specific component
stella airgap verify-kit --verbose --component feeds
# Common errors:
# - "Signature verification failed": Key mismatch
# - "Hash mismatch": Bundle corrupted during transfer
# - "Time anchor expired": Anchor needs refresh
```
**Resolution**:
| Error | Cause | Resolution |
|-------|-------|------------|
| Signature failed | Wrong trust anchor | Import correct anchor |
| Hash mismatch | Corruption | Re-transfer bundle |
| Time anchor expired | Clock drift or expired | Import new time anchor |
### 7.2 Staleness Alert
**Symptom**: Staleness warning/alert.
**Diagnosis**:
```bash
# Check staleness status
stella airgap staleness-status
# Output:
# Feed age: 5 days (threshold: 7 days)
# VEX age: 3 days (threshold: 7 days)
# Policy age: 30 days (threshold: 90 days)
# Status: AMBER (approaching threshold)
```
**Resolution**:
```bash
# Import updated bundles
stella airgap import --bundle /path/to/latest-feed.zip --type feed
# If bundles unavailable and breach imminent:
# - Raise amber alert (5-7 days)
# - If >7 days, raise red alert and halt new ingests
# - Request emergency bundle via secure channel
```
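The amber/red decision can be scripted directly from the snapshot timestamp. A sketch with a frozen clock for reproducibility (GNU `date -d` assumed; swap in `$(date -u +%Y-%m-%dT%H:%M:%SZ)` for live use):

```shell
# Map feed age in days onto the staleness thresholds used by the alerts.
snapshot="2025-12-15T00:00:00Z"
now="2025-12-21T00:00:00Z"   # frozen for the example
age_days=$(( ( $(date -ud "$now" +%s) - $(date -ud "$snapshot" +%s) ) / 86400 ))
if   [ "$age_days" -gt 7 ]; then echo "RED ($age_days days): halt new ingests"
elif [ "$age_days" -gt 5 ]; then echo "AMBER ($age_days days): request bundle"
else echo "OK ($age_days days)"
fi
```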
### 7.3 Proof Verification Fails Offline
**Symptom**: Proof verification fails in sealed mode.
**Diagnosis**:
```bash
# Check verification error
stella proof verify --bundle proof.zip --offline --verbose
# Common errors:
# - "Trust anchor not found": Missing anchor in offline kit
# - "Time anchor expired": Time validation failed
# - "Unsupported algorithm": Key algorithm not supported
```
**Resolution**:
```bash
# For missing trust anchor:
# Import the required anchor
stella airgap import-anchor --file required-anchor.json
# For expired time anchor:
# Import new time anchor
stella airgap import-time-anchor --file new-time-anchor.json
# For algorithm issues:
# Regenerate proof with supported algorithm
stella proof regenerate --scan-id $SCAN_ID --algorithm ECDSA-P256
```
### 7.4 Symbol Resolution Fails
**Symptom**: Reachability shows "symbol not found" errors.
**Diagnosis**:
```bash
# Check symbol database status
stella airgap symbols-status
# Output:
# Symbol DB: /path/to/symbols/symbol-index.db
# Version: 2025.12.15
# Entries: 5,234,567
# Coverage: Java, .NET, Python
```
**Resolution**:
```bash
# Import updated symbol database
stella airgap import --bundle symbol-update.zip --type symbols
# Recompute reachability with new symbols
stella reachability compute --scan-id $SCAN_ID --offline --force
```
---
## 8. Monitoring & Alerting
### 8.1 Key Metrics (Air-Gap)
| Metric | Description | Alert Threshold |
|--------|-------------|-----------------|
| `airgap_staleness_days` | Days since last bundle import | > 5 (amber), > 7 (red) |
| `airgap_time_anchor_validity_days` | Days until time anchor expires | < 7 |
| `airgap_verification_failures` | Offline verification failures | > 0 |
| `airgap_import_failures` | Bundle import failures | > 0 |
### 8.2 Alerting Rules
```yaml
groups:
- name: airgap-operations
rules:
- alert: AirgapStalenessAmber
expr: airgap_staleness_days > 5
for: 1h
labels:
severity: warning
annotations:
summary: "Air-gap feed staleness approaching threshold"
- alert: AirgapStalenessRed
expr: airgap_staleness_days > 7
for: 1h
labels:
severity: critical
annotations:
summary: "Air-gap feed staleness breach - halt new ingests"
- alert: AirgapTimeAnchorExpiring
expr: airgap_time_anchor_validity_days < 7
for: 1h
labels:
severity: warning
annotations:
summary: "Time anchor expiring in {{ $value }} days"
- alert: AirgapVerificationFailure
expr: increase(airgap_verification_failures_total[1h]) > 0
for: 5m
labels:
severity: critical
annotations:
summary: "Air-gap verification failures detected"
```
### 8.3 Audit Requirements
For air-gapped environments, maintain strict audit trails:
Each import appends a record like:
```json
{
  "timestamp": "2025-12-20T10:00:00Z",
  "action": "import",
  "bundleType": "feed",
  "bundleHash": "sha256:feed123...",
  "generation": "2025.12.20",
  "actor": "operator@example.com",
  "mode": "sealed",
  "verification": "PASS"
}
```
```bash
# Daily audit log export
stella airgap audit-export --date today --output audit-$(date +%Y%m%d).json
# Verify audit log integrity
stella airgap audit-verify --log audit-20251220.json
```
---
## Related Documentation
- [Air-Gap Overview](./overview.md)
- [Offline Bundle Format](./offline-bundle-format.md)
- [Proof Chain Verification](./proof-chain-verification.md)
- [Time Anchor Schema](./time-anchor-schema.md)
- [Score Proofs Runbook](../operations/score-proofs-runbook.md)
- [Reachability Runbook](../operations/reachability-runbook.md)
---
**Last Updated**: 2025-12-20
**Version**: 1.0.0
**Sprint**: 3500.0004.0004

# AirGap Sealed-Mode Startup Diagnostics (prep for AIRGAP-CTL-57-001/57-002)
## Goal
Prevent services from running when sealed-mode requirements are unmet and emit auditable diagnostics + telemetry.
## Pre-flight checks
1) `airgap_state` indicates `sealed=true`.
2) Egress allowlist configured (non-empty or explicitly `[]`).
3) Trust root bundle + TUF metadata present and unexpired.
4) Time anchor available (see `TimeAnchor` schema) and staleness budget not exceeded.
5) Pending root rotations either applied or flagged with approver IDs.
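The gate can be prototyped as a shell pre-flight before the host implementation lands. A minimal sketch, with illustrative file names and JSON shapes (the controller's real state layout differs):

```shell
# Pre-flight sketch: each failed check emits AIRGAP_STARTUP_MISSING_<ITEM>
# and aborts, mirroring the structured error code above.
mkdir -p state
printf '{"sealed": true}\n' > state/airgap_state.json
printf '[]\n' > state/egress_allowlist.json   # explicitly empty allowlist
printf 'dummy-root\n' > state/trust-roots.pem
fail() { echo "AIRGAP_STARTUP_MISSING_$1" >&2; exit 1; }
grep -q '"sealed": true' state/airgap_state.json || fail "SEALED_STATE"
[ -s state/egress_allowlist.json ] || fail "EGRESS_ALLOWLIST"
[ -s state/trust-roots.pem ]       || fail "TRUST_ROOTS"
echo "startup checks passed"
```

Deleting any of the three state files reproduces the corresponding `AIRGAP_STARTUP_MISSING_<ITEM>` failure path.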
## On failure
- Abort host startup with structured error code: `AIRGAP_STARTUP_MISSING_<ITEM>` (implemented as `sealed-startup-blocked:<reason>` in controller host).
- Emit structured log fields: `airgap.startup.check`, `status=failure`, `reason`, `bundlePath`, `trustRootVersion`, `timeAnchorDigest`.
- Increment counter `airgap_startup_blocked_total{reason}` and gauge `airgap_time_anchor_age_seconds` if anchor missing/stale.
## Telemetry hooks
- Trace event `airgap.startup.validation` with attributes: `sealed`, `allowlist.count`, `trust_roots.count`, `time_anchor.age_seconds`, `rotation.pending`.
- Timeline events (for 57-002): `airgap.sealed` and `airgap.unsealed` include startup validation results and pending rotations.
## Integration points
- Controller: run checks during `IHostApplicationLifetime.ApplicationStarted` before exposing endpoints.
- Importer: reuse `ImportValidator` to ensure bundles + trust rotation are valid before proceeding.
- Time component: provide anchor + staleness calculations to the controller checks.
## Artefacts
- This document (deterministic guardrails for startup diagnostics).
- Code references: `src/AirGap/StellaOps.AirGap.Importer/Validation/*` for trust + bundle validation primitives; `src/AirGap/StellaOps.AirGap.Time/*` for anchors.
## Owners
- AirGap Controller Guild · Observability Guild.

# Sealing and Egress (Airgap 56-002)
Guidance for enforcing deny-all egress and validating sealed-mode posture.
## Network policies
- Kubernetes: apply namespace-scoped `NetworkPolicy` with default deny; allow only:
- DNS to internal resolver
- Object storage/mirror endpoints on allowlist
- OTLP/observability endpoints if permitted for sealed monitoring
- Docker Compose: use host firewall rules to block outbound traffic except mirrors (`extra_hosts` entries can additionally null-route disallowed domains to `127.0.0.1`); ship an `iptables` template in the ops bundle.
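The Kubernetes posture above can be expressed as a namespace-scoped default-deny policy. A sketch — the namespace, DNS labels, and allowlist CIDR are placeholders to adapt per deployment:

```yaml
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: default-deny-egress
  namespace: stellaops          # placeholder namespace
spec:
  podSelector: {}               # all pods in the namespace
  policyTypes: ["Egress"]
  egress:
    - to:                       # DNS to the internal resolver only
        - namespaceSelector: {}
          podSelector:
            matchLabels:
              k8s-app: kube-dns
      ports:
        - { protocol: UDP, port: 53 }
    - to:                       # allowlisted mirror / object storage
        - ipBlock:
            cidr: 10.20.0.0/24  # placeholder allowlist CIDR
```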
## EgressPolicy facade
- Services MUST read `Excititor:Network:EgressPolicy` (or module equivalent) to decide runtime behavior:
- `sealed` → deny outbound HTTP/S except allowlist; fail fast on unexpected hosts.
- `constrained` → allow allowlist + time/NTP if required.
- Log policy decisions and surface `X-Sealed-Mode: true|false` on HTTP responses for diagnostics.
## Verification checklist
1. Confirm policy manifests applied (kubectl/compose diff) and pods restarted.
2. Run connectivity probe from each pod:
- Allowed endpoints respond (200/OK or 403 expected).
- Disallowed domains return immediate failure.
3. Attempt bundle import; verify timeline event emitted with `sealed=true`.
4. Check observability: counters for denied egress should increment (export or console log).
5. Record mirrorGeneration + manifest hash in audit log.
## Determinism & offline posture
- No external CRLs/OCSP in sealed mode; rely on bundled trust roots.
- Keep allowlist minimal and declared in config; no implicit fallbacks.
- All timestamps UTC; avoid calling external time APIs.

# Smart-Diff Air-Gap Workflows
**Sprint:** SPRINT_3500_0001_0001
**Task:** SDIFF-MASTER-0006 - Document air-gap workflows for smart-diff
## Overview
Smart-Diff can operate in fully air-gapped environments using offline bundles. This document describes the workflows for running smart-diff analysis without network connectivity.
## Prerequisites
1. **Offline Kit** - Downloaded and verified (`stellaops offline kit download`)
2. **Feed Snapshots** - Pre-staged vulnerability feeds
3. **SBOM Cache** - Pre-generated SBOMs for target artifacts
## Workflow 1: Offline Smart-Diff Analysis
### Step 1: Prepare Offline Bundle
On a connected machine:
```bash
# Download offline kit with feeds
stellaops offline kit download \
--output /path/to/offline-bundle \
--include-feeds nvd,osv,epss \
--feed-date 2025-01-15
# Include SBOMs for known artifacts
stellaops offline sbom generate \
--artifact registry.example.com/app:v1 \
--artifact registry.example.com/app:v2 \
--output /path/to/offline-bundle/sboms
# Package for transfer
stellaops offline kit package \
--input /path/to/offline-bundle \
--output stellaops-offline-2025-01-15.tar.gz \
--sign
```
### Step 2: Transfer to Air-Gapped Environment
Transfer the bundle using approved media:
- USB drive (scanned and approved)
- Optical media (DVD/Blu-ray)
- Data diode
### Step 3: Import Bundle
On the air-gapped machine:
```bash
# Verify bundle signature
stellaops offline kit verify \
--input stellaops-offline-2025-01-15.tar.gz \
--public-key /path/to/signing-key.pub
# Extract and configure
stellaops offline kit import \
--input stellaops-offline-2025-01-15.tar.gz \
--data-dir /opt/stellaops/data
```
### Step 4: Run Smart-Diff
```bash
# Set offline mode
export STELLAOPS_OFFLINE=true
export STELLAOPS_DATA_DIR=/opt/stellaops/data
# Run smart-diff
stellaops smart-diff \
--base sbom:app-v1.json \
--target sbom:app-v2.json \
--output smart-diff-report.json
```
## Workflow 2: Pre-Computed Smart-Diff Export
For environments where even running analysis tools is restricted.
### Step 1: Prepare Artifacts (Connected Machine)
```bash
# Generate SBOMs
stellaops sbom generate --artifact app:v1 --output app-v1-sbom.json
stellaops sbom generate --artifact app:v2 --output app-v2-sbom.json
# Run smart-diff with full proof bundle
stellaops smart-diff \
--base app-v1-sbom.json \
--target app-v2-sbom.json \
--output-dir ./smart-diff-export \
--include-proofs \
--include-evidence \
--format bundle
```
### Step 2: Verify Export Contents
The export bundle contains:
```
smart-diff-export/
├── manifest.json # Signed manifest
├── base-sbom.json # Base SBOM (hash verified)
├── target-sbom.json # Target SBOM (hash verified)
├── diff-results.json # Smart-diff findings
├── sarif-report.json # SARIF formatted output
├── proofs/
│ ├── ledger.json # Proof ledger
│ └── nodes/ # Individual proof nodes
├── evidence/
│ ├── reachability.json # Reachability evidence
│ ├── vex-statements.json # VEX statements
│ └── hardening.json # Binary hardening data
└── signature.dsse # DSSE envelope
```
### Step 3: Import and Verify (Air-Gapped Machine)
```bash
# Verify bundle integrity
stellaops verify-bundle \
--input smart-diff-export \
--public-key /path/to/trusted-key.pub
# View results
stellaops smart-diff show \
--bundle smart-diff-export \
--format table
```
## Workflow 3: Incremental Feed Updates
### Step 1: Generate Delta Feed
On connected machine:
```bash
# Generate delta since last sync
stellaops offline feed delta \
--since 2025-01-10 \
--output feed-delta-2025-01-15.tar.gz \
--sign
```
### Step 2: Apply Delta (Air-Gapped)
```bash
# Import delta
stellaops offline feed apply \
--input feed-delta-2025-01-15.tar.gz \
--verify
# Trigger score replay for affected scans
stellaops score replay-all \
--trigger feed-update \
--dry-run
```
## Configuration
### Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| `STELLAOPS_OFFLINE` | Enable offline mode | `false` |
| `STELLAOPS_DATA_DIR` | Local data directory | `~/.stellaops` |
| `STELLAOPS_FEED_DIR` | Feed snapshot directory | `$DATA_DIR/feeds` |
| `STELLAOPS_SBOM_CACHE` | SBOM cache directory | `$DATA_DIR/sboms` |
| `STELLAOPS_SKIP_NETWORK` | Block network requests | `false` |
| `STELLAOPS_REQUIRE_SIGNATURES` | Require signed data | `true` |
### Config File
```yaml
# ~/.stellaops/config.yaml
offline:
enabled: true
data_dir: /opt/stellaops/data
require_signatures: true
feeds:
source: local
path: /opt/stellaops/data/feeds
sbom:
cache_dir: /opt/stellaops/data/sboms
network:
allow_list: [] # Empty = no network
```
## Verification
### Verify Feed Freshness
```bash
# Check feed dates
stellaops offline status
# Output:
# Feed Status (Offline Mode)
# ─────────────────────────────
# NVD: 2025-01-15 (2 days old)
# OSV: 2025-01-15 (2 days old)
# EPSS: 2025-01-14 (3 days old)
# KEV: 2025-01-15 (2 days old)
```
### Verify Proof Integrity
```bash
# Verify smart-diff proofs
stellaops smart-diff verify \
--input smart-diff-report.json \
--proof-bundle ./proofs
# Output:
# ✓ Manifest hash verified
# ✓ All proof nodes valid
# ✓ Root hash matches: sha256:abc123...
```
## Determinism Guarantees
Offline smart-diff maintains determinism by:
1. **Content-addressed feeds** - Same feed hash = same results
2. **Frozen timestamps** - All timestamps use manifest creation time
3. **No network randomness** - No external API calls
4. **Stable sorting** - Deterministic output ordering
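Guarantees 1 and 4 both reduce to canonical serialization before hashing; a minimal sketch of the idea (the `canonical_hash` name is illustrative, not the CLI's internal API):

```python
import hashlib
import json

def canonical_hash(obj) -> str:
    """Serialize with sorted keys and no whitespace, then hash.
    Logically equal documents always produce the same digest,
    regardless of insertion order."""
    blob = json.dumps(obj, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

# Same content, different key order -> identical digest.
a = {"target": "app-v2", "base": "app-v1", "findings": ["CVE-2024-1234"]}
b = {"base": "app-v1", "findings": ["CVE-2024-1234"], "target": "app-v2"}
assert canonical_hash(a) == canonical_hash(b)
```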
### Reproducibility Test
```bash
# Run twice and compare
stellaops smart-diff --base a.json --target b.json --output run1.json
stellaops smart-diff --base a.json --target b.json --output run2.json
# Compare hashes
sha256sum run1.json run2.json
# abc123... run1.json
# abc123... run2.json (identical)
```
## Troubleshooting
### Error: Feed not found
```
Error: Feed 'nvd' not found in offline data directory
```
**Solution:** Ensure feed was included in offline kit:
```bash
stellaops offline kit status
ls $STELLAOPS_FEED_DIR/nvd/
```
### Error: Network request blocked
```
Error: Network request blocked in offline mode: api.osv.dev
```
**Solution:** This is expected behavior. Ensure all required data is in offline bundle.
### Error: Signature verification failed
```
Error: Bundle signature verification failed
```
**Solution:** Ensure correct public key is configured:
```bash
stellaops offline kit verify \
--input bundle.tar.gz \
--public-key /path/to/correct-key.pub
```
## Related Documentation
- [Offline Kit Guide](../OFFLINE_KIT.md)
- [Smart-Diff CLI](../cli/smart-diff-cli.md)
- [Smart-Diff types](../api/smart-diff-types.md)
- [Determinism gates](../testing/determinism-gates.md)



@@ -0,0 +1,69 @@
# Air-Gapped Time Anchors & Staleness Budgets
> **Audience:** AirGap Time/Controller/Policy guilds, DevOps
>
> **Purpose:** Document how air-gapped installations maintain trusted time anchors, compute staleness windows, and expose drift telemetry.
## 1. Overview
Air-gapped clusters cannot contact external NTP servers. StellaOps distributes signed time anchor tokens alongside mirror bundles so services can reason about freshness and seal state without external clocks.
Key goals:
- Provide deterministic time anchors signed by the mirror authority.
- Track drift and staleness budgets for scanner reports, advisories, and runtime evidence.
- Surface warnings to operators (UI/CLI/Notifier) before anchors expire.
## 2. Components
| Component | Responsibility |
|-----------|----------------|
| AirGap Controller | Stores the active `time_anchor` token and enforces sealed/unsealed transitions. |
| AirGap Time service | Parses anchor bundles, validates signatures, records monotonic offsets, and exposes drift metrics. |
| Scheduler & Policy Engine | Query the time service to gate scheduled runs and evidence evaluation. |
| UI / Notifier | Display remaining budget and raise alerts when thresholds are crossed. |
## 3. Time Anchor Tokens
- Distributed as part of mirror/offline bundles (`airgap/time-anchor.json`).
- Signed with mirror key; includes issuance time, validity window, and monotonic counter.
- Validation steps:
1. Verify detached signature.
2. Compare bundle counter to previously applied anchors.
3. Persist anchor with checksum for audit.
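Steps 2 and 3 can be sketched as follows; `validate_anchor` is a hypothetical helper, and signature verification (step 1) is assumed to have already passed:

```python
import hashlib
import json

def validate_anchor(anchor: dict, last_counter: int) -> str:
    """Enforce counter monotonicity (step 2), then return the checksum
    to persist for audit (step 3)."""
    if anchor["counter"] <= last_counter:
        raise ValueError(
            f"anchor-counter-rollback: {anchor['counter']} <= {last_counter}")
    # Deterministic serialization so the audit checksum is reproducible.
    blob = json.dumps(anchor, sort_keys=True, separators=(",", ":")).encode("utf-8")
    return hashlib.sha256(blob).hexdigest()

digest = validate_anchor(
    {"issuedAt": "2025-12-02T00:00:00Z", "counter": 42}, last_counter=41)
```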
## 4. Staleness Budgets
Each tenant/configuration defines budgets:
- **Advisory freshness**: maximum age of advisory/VEX data before rescans are required.
- **Scanner evidence**: acceptable drift between last scan and current anchor.
- **Runtime posture**: tolerated drift before Notifier raises incidents.
AirGap Time calculates drift = `now(monotonic) - anchor.issued_at` and exposes:
- `/api/v1/time/status`: current anchor metadata, drift, remaining budget.
- `/api/v1/time/metrics`: Prometheus metrics (`airgap_anchor_drift_seconds`, `airgap_anchor_expiry_seconds`).
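The mapping from drift to the remaining-budget badge (green/amber/red, see §5) can be sketched as below. Assumptions: thresholds are in seconds, both non-negative, warning less than or equal to breach; `anchor_status` is illustrative, not the service's actual API:

```python
def anchor_status(drift_seconds: float,
                  warning_seconds: float,
                  breach_seconds: float) -> str:
    """Map drift against the tenant budget to the UI badge colour."""
    if not (0 <= warning_seconds <= breach_seconds):
        raise ValueError("warning must be non-negative and <= breach")
    if drift_seconds >= breach_seconds:
        return "red"    # budget breached: raise incident
    if drift_seconds >= warning_seconds:
        return "amber"  # warning threshold crossed: alert operators
    return "green"      # within budget
```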
## 5. Operator Workflow
1. Import new mirror bundle (includes time anchor).
2. AirGap Time validates and stores the anchor; Controller records audit entry.
3. Services subscribe to change events and recompute drift.
4. UI displays badge (green/amber/red) based on thresholds.
5. Notifier sends alerts when drift exceeds warning or expiry limits.
## 6. Implementation Notes
- Use `IAirGapTimeStore` for persistence; default implementation relies on PostgreSQL with tenant partitioning.
- Ensure deterministic JSON serialization (UTC ISO-8601 timestamps, sorted keys).
- Test vectors located under `src/AirGap/StellaOps.AirGap.Time/fixtures/`.
- For offline testing, simulate monotonic clock via `ITestClock` to avoid system clock drift in CI.
- Staleness calculations use `StalenessCalculator` + `StalenessBudget`/`StalenessEvaluation` (see `src/AirGap/StellaOps.AirGap.Time/Services` and `.Models`); warning/breach thresholds must be non-negative and warning ≤ breach.
## 7. References
- `docs/modules/airgap/guides/airgap-mode.md`
- `src/AirGap/StellaOps.AirGap.Time`
- `src/AirGap/StellaOps.AirGap.Controller`
- `src/AirGap/StellaOps.AirGap.Policy`


@@ -0,0 +1,316 @@
# Symbol Bundles for Air-Gapped Installations
**Reference:** SYMS-BUNDLE-401-014
This document describes how to create, verify, and deploy deterministic symbol bundles for air-gapped StellaOps installations.
## Overview
Symbol bundles package debug symbols (PDBs, DWARF, etc.) into a single archive with:
- **Deterministic ordering** for reproducible builds
- **BLAKE3 hashes** for content verification
- **DSSE signatures** for authenticity
- **Rekor checkpoints** for transparency log integration
- **Merkle inclusion proofs** for offline verification
## Bundle Structure
```
bundle-name-1.0.0.symbols.zip
├── manifest.json # Bundle manifest with all metadata
├── symbols/
│ ├── {debug-id-1}/
│ │ ├── myapp.exe.symbols # Symbol blob
│ │ └── myapp.exe.symbols.json # Symbol manifest
│ ├── {debug-id-2}/
│ │ ├── libcrypto.so.symbols
│ │ └── libcrypto.so.symbols.json
│ └── ...
```
## Creating a Bundle
### Prerequisites
1. Collect symbol manifests from CI builds or ingest tools
2. Ensure all manifests follow the `*.symbols.json` naming convention
3. Have signing keys available (if signing is required)
### Build Command
```bash
# Basic bundle creation
stella symbols bundle \
--name "product-symbols" \
--version "1.0.0" \
--source ./symbols-dir \
--output ./bundles
# With signing and Rekor submission
stella symbols bundle \
--name "product-symbols" \
--version "1.0.0" \
--source ./symbols-dir \
--output ./bundles \
--sign \
--key ./signing-key.pem \
--key-id "release-key-2025" \
--rekor \
--rekor-url https://rekor.sigstore.dev
# Filter by platform
stella symbols bundle \
--name "linux-symbols" \
--version "1.0.0" \
--source ./symbols-dir \
--output ./bundles \
--platform linux-x64
```
### Bundle Options
| Option | Description |
|--------|-------------|
| `--name` | Bundle name (required) |
| `--version` | Bundle version in SemVer format (required) |
| `--source` | Source directory containing symbol manifests (required) |
| `--output` | Output directory for bundle archive (required) |
| `--platform` | Filter symbols by platform (e.g., linux-x64, win-x64) |
| `--tenant` | Filter symbols by tenant ID |
| `--sign` | Sign bundle with DSSE |
| `--key` | Path to signing key (PEM-encoded private key) |
| `--key-id` | Key ID for DSSE signature |
| `--algorithm` | Signing algorithm (ecdsa-p256, ed25519, rsa-pss-sha256) |
| `--rekor` | Submit to Rekor transparency log |
| `--rekor-url` | Rekor server URL |
| `--format` | Archive format: zip (default) or tar.gz |
| `--compression` | Compression level (0-9, default: 6) |
## Verifying a Bundle
### Online Verification
```bash
stella symbols verify --bundle ./product-symbols-1.0.0.symbols.zip
```
### Offline Verification
For air-gapped environments, include the Rekor public key:
```bash
stella symbols verify \
--bundle ./product-symbols-1.0.0.symbols.zip \
--public-key ./signing-public-key.pem \
--rekor-offline \
--rekor-key ./rekor-public-key.pem
```
### Verification Output
```
Bundle verification successful!
Bundle ID: a1b2c3d4e5f60708
Name: product-symbols-1.0.0.symbols
Version: 1.0.0
Signature: valid (ecdsa-p256)
Hash verification: 42/42 valid
```
## Extracting Symbols
### Full Extraction
```bash
stella symbols extract \
--bundle ./product-symbols-1.0.0.symbols.zip \
--output ./extracted-symbols
```
### Platform-Filtered Extraction
```bash
stella symbols extract \
--bundle ./product-symbols-1.0.0.symbols.zip \
--output ./linux-symbols \
--platform linux-x64
```
### Manifests Only
```bash
stella symbols extract \
--bundle ./product-symbols-1.0.0.symbols.zip \
--output ./manifests-only \
--manifests-only
```
## Inspecting Bundles
```bash
# Basic info
stella symbols inspect --bundle ./product-symbols-1.0.0.symbols.zip
# With entry listing
stella symbols inspect --bundle ./product-symbols-1.0.0.symbols.zip --entries
```
## Bundle Manifest Schema
The bundle manifest (`manifest.json`) follows this schema:
```json
{
"schemaVersion": "stellaops.symbols.bundle/v1",
"bundleId": "blake3-hash-of-content",
"name": "product-symbols",
"version": "1.0.0",
"createdAt": "2025-12-14T10:30:00Z",
"platform": null,
"tenantId": null,
"entries": [
{
"debugId": "abc123def456",
"codeId": "...",
"binaryName": "myapp.exe",
"platform": "win-x64",
"format": "pe",
"manifestHash": "blake3...",
"blobHash": "blake3...",
"blobSizeBytes": 102400,
"archivePath": "symbols/abc123def456/myapp.exe.symbols",
"symbolCount": 5000
}
],
"totalSizeBytes": 10485760,
"signature": {
"signed": true,
"algorithm": "ecdsa-p256",
"keyId": "release-key-2025",
"dsseDigest": "sha256:...",
"signedAt": "2025-12-14T10:30:00Z",
"publicKey": "-----BEGIN PUBLIC KEY-----..."
},
"rekorCheckpoint": {
"rekorUrl": "https://rekor.sigstore.dev",
"logEntryId": "...",
"logIndex": 12345678,
"integratedTime": "2025-12-14T10:30:01Z",
"rootHash": "sha256:...",
"treeSize": 987654321,
"inclusionProof": {
"logIndex": 12345678,
"rootHash": "sha256:...",
"treeSize": 987654321,
"hashes": ["sha256:...", "sha256:..."]
},
"logPublicKey": "-----BEGIN PUBLIC KEY-----..."
},
"hashAlgorithm": "blake3"
}
```
## Air-Gap Deployment Workflow
### 1. Create Bundle (Online Environment)
```bash
# On the online build server
stella symbols bundle \
--name "release-v2.0.0-symbols" \
--version "2.0.0" \
--source /build/symbols \
--output /export \
--sign --key /keys/release.pem \
--rekor
```
### 2. Transfer to Air-Gapped Environment
Copy the following files to the air-gapped environment:
- `release-v2.0.0-symbols-2.0.0.symbols.zip`
- `release-v2.0.0-symbols-2.0.0.manifest.json`
- `signing-public-key.pem` (if not already present)
- `rekor-public-key.pem` (for Rekor offline verification)
### 3. Verify (Air-Gapped Environment)
```bash
# On the air-gapped server
stella symbols verify \
--bundle ./release-v2.0.0-symbols-2.0.0.symbols.zip \
--public-key ./signing-public-key.pem \
--rekor-offline \
--rekor-key ./rekor-public-key.pem
```
### 4. Extract and Deploy
```bash
# Extract to symbols server directory
stella symbols extract \
--bundle ./release-v2.0.0-symbols-2.0.0.symbols.zip \
--output /var/stellaops/symbols \
--verify
```
## Determinism Guarantees
Symbol bundles are deterministic:
1. **Entry ordering**: Entries sorted by debug ID, then binary name (lexicographic)
2. **Hash algorithm**: BLAKE3 for all content hashes
3. **Timestamps**: UTC ISO-8601 format
4. **JSON serialization**: Canonical form (no whitespace, sorted keys)
5. **Archive entries**: Sorted by path within archive
This ensures that given the same input manifests, the same bundle (excluding signatures) is produced.
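Guarantee 1 can be expressed as a simple sort key; `entry_sort_key` is illustrative, not the bundler's internal code:

```python
def entry_sort_key(entry: dict):
    """Lexicographic (debugId, binaryName), per determinism guarantee 1."""
    return (entry["debugId"], entry["binaryName"])

entries = [
    {"debugId": "def456", "binaryName": "libcrypto.so"},
    {"debugId": "abc123", "binaryName": "myapp.exe"},
    {"debugId": "abc123", "binaryName": "libz.so"},
]
# Ties on debugId fall through to the binary name.
ordered = sorted(entries, key=entry_sort_key)
```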
## CI Integration
### GitHub Actions Example
```yaml
- name: Build symbol bundle
run: |
stella symbols bundle \
--name "${{ github.repository }}-symbols" \
--version "${{ github.ref_name }}" \
--source ./build/symbols \
--output ./dist \
--sign --key ${{ secrets.SIGNING_KEY }} \
--rekor
- name: Upload bundle artifact
uses: actions/upload-artifact@v4
with:
name: symbol-bundle
path: ./dist/*.symbols.zip
```
## Troubleshooting
### "No symbol manifests found"
Ensure manifests follow the `*.symbols.json` naming convention and are not DSSE envelopes (`*.dsse.json`).
### "Signature verification failed"
Check that:
1. The public key matches the signing key
2. The bundle has not been modified after signing
3. The key ID matches what was used during signing
### "Rekor inclusion proof invalid"
For offline verification:
1. Ensure the Rekor public key is current
2. The checkpoint was created when the log was online
3. The tree size hasn't changed since the checkpoint
## Related Documentation
- [Offline Kit Guide](../OFFLINE_KIT.md)
- [Symbol Server Architecture](../modules/scanner/architecture.md)
- [DSSE Signing Guide](../modules/signer/architecture.md)
- [Rekor Integration](../modules/attestor/architecture.md)


@@ -0,0 +1,15 @@
# Time Anchor JSON schema (prep for AIRGAP-TIME-57-001)
Artifact: `docs/modules/airgap/schemas/time-anchor-schema.json`
Highlights:
- Required: `anchorTime` (RFC3339), `source` (`roughtime`|`rfc3161`), `format` string, `tokenDigest` (sha256 hex of token bytes).
- Optional: `signatureFingerprint` (hex), `verification.status` (`unknown|passed|failed`) + `reason`.
- No additional properties to keep payload deterministic.
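Computing `tokenDigest` is a plain SHA-256 over the raw token bytes; a sketch (the `format` value in the fragment is a placeholder, not a normative label):

```python
import hashlib

def token_digest(token_bytes: bytes) -> str:
    """Lowercase sha256 hex of the raw token bytes, as required by the schema."""
    return hashlib.sha256(token_bytes).hexdigest()

anchor_fragment = {
    "anchorTime": "2025-12-02T00:00:00Z",
    "source": "rfc3161",
    "format": "rfc3161-token",  # placeholder format string
    "tokenDigest": token_digest(bytes.fromhex("01020304deadbeef")),
}
```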
Intended use:
- AirGap Time Guild can embed this in sealed-mode configs and validation endpoints.
- Mirror/OCI timelines can cite the digest + source without needing full token parsing.
Notes:
- Trust roots and final signature fingerprint rules stay TBD; placeholders remain optional to avoid blocking until roots are issued.


@@ -0,0 +1,48 @@
# Time Anchor Trust Roots (draft) — for AIRGAP-TIME-57-001
Provides a minimal, deterministic format for distributing trust roots used to validate time tokens (Roughtime and RFC3161) in sealed/offline environments.
## Artefacts
- JSON schema: `docs/modules/airgap/schemas/time-anchor-schema.json`
- Trust roots bundle (draft): `docs/modules/airgap/samples/time-anchor-trust-roots.json`
## Bundle format (`time-anchor-trust-roots.json`)
```json
{
"version": 1,
"roughtime": [
{
"name": "stellaops-test-roughtime",
"publicKeyBase64": "BASE64_ED25519_PUBLIC_KEY",
"validFrom": "2025-01-01T00:00:00Z",
"validTo": "2026-01-01T00:00:00Z"
}
],
"rfc3161": [
{
"name": "stellaops-test-tsa",
"certificatePem": "-----BEGIN CERTIFICATE-----...-----END CERTIFICATE-----",
"validFrom": "2025-01-01T00:00:00Z",
"validTo": "2026-01-01T00:00:00Z",
"fingerprintSha256": "HEX_SHA256"
}
]
}
```
- All times are UTC ISO-8601.
- Fields are deterministic; no optional properties other than multiple entries per list.
- Consumers must reject expired roots and enforce matching token format (Roughtime vs RFC3161).
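Expiry enforcement can be sketched as a filter over the bundle's lists; `usable_roots` is a hypothetical helper (token-format matching happens by choosing the `roughtime` or `rfc3161` list):

```python
from datetime import datetime, timezone

def usable_roots(roots, now=None):
    """Return only the roots whose [validFrom, validTo) window covers `now`."""
    now = now or datetime.now(timezone.utc)

    def parse(ts: str) -> datetime:
        # Bundle times are UTC ISO-8601 with a trailing Z.
        return datetime.fromisoformat(ts.replace("Z", "+00:00"))

    return [r for r in roots
            if parse(r["validFrom"]) <= now < parse(r["validTo"])]

roots = [{
    "name": "stellaops-test-roughtime",
    "publicKeyBase64": "BASE64_ED25519_PUBLIC_KEY",
    "validFrom": "2025-01-01T00:00:00Z",
    "validTo": "2026-01-01T00:00:00Z",
}]
```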
## Usage guidance
- Ship the bundle with the air-gapped deployment alongside the time-anchor schema.
- Configure AirGap Time service to load roots from a sealed path; do not fetch over network.
- Rotate by bumping `version`, adding new entries, and setting `validFrom/validTo`; keep prior roots until all deployments roll.
## Next steps
- Replace placeholder values with production Roughtime public keys and TSA certificates once issued by Security.
- Add regression tests in `StellaOps.AirGap.Time.Tests` that load this bundle and validate sample tokens once real roots are present.
- CI/Dev unblock: you can test end-to-end with a throwaway root by:
1. Generate Ed25519 key for Roughtime: `openssl genpkey -algorithm Ed25519 -out rtime-dev.pem && openssl pkey -in rtime-dev.pem -pubout -out rtime-dev.pub`.
2. Extract the raw 32-byte public key and base64-encode it (`openssl pkey -pubin -in rtime-dev.pub -outform DER | tail -c 32 | base64 -w0`; encoding the PEM file directly would include the armor headers), place the result into `publicKeyBase64`, and set validity to a short window.
3. Point `AirGap:TrustRootFile` at your edited bundle and set `AirGap:AllowUntrustedAnchors=true` only in dev.
4. Run `scripts/mirror/verify_thin_bundle.py --time-root docs/modules/airgap/samples/time-anchor-trust-roots.json` to ensure bundle is parsable.


@@ -0,0 +1,21 @@
# Time Anchor Verification Gap (AIRGAP-TIME-57-001 follow-up)
## Status (2025-11-20)
- Parser: Roughtime verifier now checks Ed25519 signature; RFC3161 verifier uses SignedCms signature validation and signing time attribute. Still needs final trust root bundle + fixture alignment.
- Staleness: calculator + budgets landed; loader accepts hex fixtures.
- Verification: pipeline (`TimeVerificationService`) active; awaiting guild-provided trust roots (format + key IDs) for production readiness and to update tests/fixtures.
## What's missing
- Roughtime parser: parse signed responses, extract `timestamp`, `radius`, `verifier` public key; verify signature.
- RFC3161 parser: decode ASN.1 TimeStampToken, verify signer chain against provided trust roots, extract nonce/ts.
- Trust roots: final format (JWK vs PEM) and key IDs to align with `TrustRootConfig`/Time service.
## Proposed plan
1) Receive finalized token format + trust-root bundle from Time Guild.
2) Implement format-specific verifiers with validating tests using provided fixtures.
3) Expose `/api/v1/time/status` returning anchor metadata + staleness; wire telemetry counters/alerts per sealed diagnostics doc.
## Owners
- AirGap Time Guild (format decision + trust roots)
- AirGap Importer Guild (bundle delivery of anchors)
- Observability Guild (telemetry wiring)


@@ -0,0 +1,60 @@
# AirGap Time API (status + anchor ingest)
## Endpoints
- `POST /api/v1/time/anchor`
- Body (JSON):
- `tenantId` (string, required)
- `hexToken` (string, required) — hex-encoded Roughtime or RFC3161 token.
- `format` (string, required) — `Roughtime` or `Rfc3161`.
- `trustRootKeyId` (string, required)
- `trustRootAlgorithm` (string, required)
- `trustRootPublicKeyBase64` (string, required) — pubkey (Ed25519 for Roughtime, RSA for RFC3161).
- `warningSeconds` (number, optional)
- `breachSeconds` (number, optional)
- Response: `TimeStatusDto` (anchor + staleness snapshot) or 400 with reason (`token-hex-invalid`, `roughtime-signature-invalid`, `rfc3161-verify-failed:*`, etc.).
- Example:
```bash
curl -s -X POST http://localhost:5000/api/v1/time/anchor \
-H 'content-type: application/json' \
-d '{
"tenantId":"tenant-default",
"hexToken":"01020304deadbeef",
"format":"Roughtime",
"trustRootKeyId":"root-1",
"trustRootAlgorithm":"ed25519",
"trustRootPublicKeyBase64":"<base64-ed25519-public-key>",
"warningSeconds":3600,
"breachSeconds":7200
}'
```
- `GET /api/v1/time/status?tenantId=<id>`
- Returns `TimeStatusDto` with anchor metadata and staleness flags. 400 if `tenantId` missing.
- `GET /healthz/ready`
- Health check: `Healthy` when anchor present and not stale; `Degraded` when warning threshold crossed; `Unhealthy` when missing/stale. Uses configured tenant/budgets.
## Config
`appsettings.json` (see `docs/modules/airgap/samples/time-config-sample.json`):
```json
{
"AirGap": {
"TenantId": "tenant-default",
"Staleness": {
"WarningSeconds": 3600,
"BreachSeconds": 7200
}
}
}
```
## Startup validation
- The host runs sealed-mode validation at startup using the configured tenant and budgets.
- Fails closed with `sealed-startup-blocked:<reason>` if anchor is missing/stale or budgets mismatch.
## Notes
- Roughtime verifier checks Ed25519 signatures (message||signature framing).
- RFC3161 verifier uses SignedCms signature verification and signing-time attribute for anchor time.
- DTO serialization is stable (ISO-8601 UTC timestamps, fields fixed).


@@ -0,0 +1,367 @@
# Triage Air-Gap Workflows
**Sprint:** SPRINT_3600_0001_0001
**Task:** TRI-MASTER-0006 - Document air-gap triage workflows
## Overview
This document describes how to perform vulnerability triage in fully air-gapped environments. The triage workflow supports offline evidence bundles, decision capture, and replay token generation.
## Workflow 1: Offline Triage with Evidence Bundles
### Step 1: Export Evidence Bundle (Connected Machine)
```bash
# Export triage bundle for specific findings
stellaops triage export \
--scan-id scan-12345678 \
--findings CVE-2024-1234,CVE-2024-5678 \
--include-evidence \
--include-graph \
--output triage-bundle.stella.bundle.tgz
# Export entire scan for offline review
stellaops triage export \
--scan-id scan-12345678 \
--all-findings \
--output full-triage-bundle.stella.bundle.tgz
```
### Step 2: Bundle Contents
The `.stella.bundle.tgz` archive contains:
```
triage-bundle.stella.bundle.tgz/
├── manifest.json # Signed bundle manifest
├── findings/
│ ├── index.json # Finding list with IDs
│ ├── CVE-2024-1234.json # Finding details
│ └── CVE-2024-5678.json
├── evidence/
│ ├── reachability/ # Reachability proofs
│ ├── callstack/ # Call stack snippets
│ ├── vex/ # VEX/CSAF statements
│ └── provenance/ # Provenance data
├── graph/
│ ├── nodes.ndjson # Dependency graph nodes
│ └── edges.ndjson # Graph edges
├── feeds/
│ └── snapshot.json # Feed snapshot metadata
└── signature.dsse # DSSE envelope
```
### Step 3: Transfer to Air-Gapped Environment
Transfer using approved methods:
- USB media (security scanned)
- Optical media
- Data diode
### Step 4: Import and Verify
On the air-gapped machine:
```bash
# Verify bundle integrity
stellaops triage verify-bundle \
--input triage-bundle.stella.bundle.tgz \
--public-key /path/to/signing-key.pub
# Import for offline triage
stellaops triage import \
--input triage-bundle.stella.bundle.tgz \
--workspace /opt/stellaops/triage
```
### Step 5: Perform Offline Triage
```bash
# List findings in bundle
stellaops triage list \
--workspace /opt/stellaops/triage
# View finding with evidence
stellaops triage show CVE-2024-1234 \
--workspace /opt/stellaops/triage \
--show-evidence
# Make triage decision
stellaops triage decide CVE-2024-1234 \
--workspace /opt/stellaops/triage \
--status not_affected \
--justification "Code path is unreachable due to config gating" \
--reviewer "security-team"
```
### Step 6: Export Decisions
```bash
# Export decisions for sync back
stellaops triage export-decisions \
--workspace /opt/stellaops/triage \
--output decisions-2025-01-15.json \
--sign
```
### Step 7: Sync Decisions (Connected Machine)
```bash
# Import and apply decisions
stellaops triage import-decisions \
--input decisions-2025-01-15.json \
--verify \
--apply
```
## Workflow 2: Batch Offline Triage
For high-volume environments.
### Step 1: Export Batch Bundle
```bash
# Export all untriaged findings
stellaops triage export-batch \
--query "status=untriaged AND priority>=0.7" \
--limit 100 \
--output batch-triage-2025-01-15.stella.bundle.tgz
```
### Step 2: Offline Batch Processing
```bash
# Interactive batch triage
stellaops triage batch \
--workspace /opt/stellaops/triage \
--input batch-triage-2025-01-15.stella.bundle.tgz
# Keyboard shortcuts enabled:
# j/k - Next/Previous finding
# a - Accept (affected)
# n - Not affected
# w - Will not fix
# f - False positive
# u - Undo last decision
# q - Quit (saves progress)
```
### Step 3: Export and Sync
```bash
# Export batch decisions
stellaops triage export-decisions \
--workspace /opt/stellaops/triage \
--format json \
--sign \
--output batch-decisions.json
```
## Workflow 3: Evidence-First Offline Review
### Step 1: Pre-compute Evidence
On connected machine:
```bash
# Generate evidence for all high-priority findings
stellaops evidence generate \
--scan-id scan-12345678 \
--priority-min 0.7 \
--output-dir ./evidence-pack
# Include:
# - Reachability analysis
# - Call stack traces
# - VEX lookups
# - Dependency graph snippets
```
### Step 2: Package with Findings
```bash
stellaops triage package \
--scan-id scan-12345678 \
--evidence-dir ./evidence-pack \
--output evidence-triage.stella.bundle.tgz
```
### Step 3: Offline Review with Evidence
```bash
# Evidence-first view
stellaops triage show CVE-2024-1234 \
--workspace /opt/stellaops/triage \
--evidence-first
# Output:
# ═══════════════════════════════════════════
# CVE-2024-1234 · lodash@4.17.20
# ═══════════════════════════════════════════
#
# EVIDENCE SUMMARY
# ────────────────
# Reachability: EXECUTED (tier 2/3)
# └─ main.js:42 → utils.js:15 → lodash/merge
#
# Call Stack:
# 1. main.js:42 handleRequest()
# 2. utils.js:15 mergeConfig()
# 3. lodash:merge <vulnerable>
#
# VEX Status: No statement found
# EPSS: 0.45 (Medium)
# KEV: No
#
# ─────────────────────────────────────────────
# Press [a]ffected, [n]ot affected, [s]kip...
```
## Configuration
### Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| `STELLAOPS_OFFLINE` | Enable offline mode | `false` |
| `STELLAOPS_TRIAGE_WORKSPACE` | Triage workspace path | `~/.stellaops/triage` |
| `STELLAOPS_BUNDLE_VERIFY` | Verify bundle signatures | `true` |
| `STELLAOPS_DECISION_SIGN` | Sign exported decisions | `true` |
### Config File
```yaml
# ~/.stellaops/triage.yaml
offline:
enabled: true
workspace: /opt/stellaops/triage
bundle_verify: true
decisions:
require_justification: true
sign_exports: true
keyboard:
enabled: true
vim_mode: true
```
## Bundle Format Specification
### manifest.json
```json
{
"version": "1.0",
"type": "triage-bundle",
"created_at": "2025-01-15T10:00:00Z",
"scan_id": "scan-12345678",
"finding_count": 25,
"feed_snapshot": "sha256:abc123...",
"graph_revision": "sha256:def456...",
"signatures": {
"manifest": "sha256:ghi789...",
"dsse_envelope": "signature.dsse"
}
}
```
### Decision Format
```json
{
"finding_id": "finding-12345678",
"vuln_key": "CVE-2024-1234:pkg:npm/lodash@4.17.20",
"status": "not_affected",
"justification": "Code path gated by feature flag",
"reviewer": "security-team",
"decided_at": "2025-01-15T14:30:00Z",
"replay_token": "rt_abc123...",
"evidence_refs": [
"evidence/reachability/CVE-2024-1234.json"
]
}
```
## Replay Tokens
Each decision generates a replay token for audit trail:
```bash
# View replay token
stellaops triage show-token rt_abc123...
# Output:
# Replay Token: rt_abc123...
# ─────────────────────────────
# Finding: CVE-2024-1234
# Decision: not_affected
# Evidence Hash: sha256:xyz789...
# Feed Snapshot: sha256:abc123...
# Decided: 2025-01-15T14:30:00Z
# Reviewer: security-team
```
### Verify Token
```bash
stellaops triage verify-token rt_abc123... \
--public-key /path/to/key.pub
# ✓ Token signature valid
# ✓ Evidence hash matches
# ✓ Feed snapshot verified
```
## Troubleshooting
### Error: Bundle signature invalid
```
Error: Bundle signature verification failed
```
**Solution:** Ensure the correct public key is used:
```bash
stellaops triage verify-bundle \
--input bundle.tgz \
--public-key /path/to/correct-key.pub \
--verbose
```
### Error: Evidence not found
```
Error: Evidence for CVE-2024-1234 not included in bundle
```
**Solution:** Re-export with evidence:
```bash
stellaops triage export \
--scan-id scan-12345678 \
--findings CVE-2024-1234 \
--include-evidence \
--output bundle.tgz
```
### Error: Decision sync conflict
```
Error: Finding CVE-2024-1234 has newer decision on server
```
**Solution:** Review and resolve:
```bash
stellaops triage import-decisions \
--input decisions.json \
--conflict-mode review
# Options: keep-local, keep-server, newest, review
```
## Related Documentation
- [Offline Kit Guide](../OFFLINE_KIT.md)
- [Vulnerability Explorer guide](../VULNERABILITY_EXPLORER_GUIDE.md)
- [Triage contract](../api/triage.contract.v1.md)
- [Console accessibility](../accessibility.md)


@@ -0,0 +1,60 @@
# AV/YARA Scan Runbook (AIRGAP-AV-510-011)
Purpose: ensure every offline-kit bundle is scanned pre-publish and post-ingest, with deterministic reports and optional signatures.
## Inputs
- Bundle directory containing `manifest.json` and payload files.
- AV scanner (e.g., ClamAV) and optional YARA rule set available locally (no network).
## Steps (offline)
1. Scan all bundle files:
```bash
clamscan -r --max-filesize=2G --max-scansize=4G --no-summary bundle/ > reports/av-scan.txt
```
2. Convert to structured report:
```bash
python - <<'PY' > reports/av-report.json
import hashlib, json, pathlib, sys
root = pathlib.Path("bundle")
report = {
    "scanner": "clamav",
    "scannerVersion": "1.4.1",
    "startedAt": "2025-12-02T00:02:00Z",
    "completedAt": "2025-12-02T00:04:30Z",
    "status": "clean",
    "artifacts": [],
    "errors": []
}
# sorted() keeps artifact ordering deterministic across filesystems
for path in sorted(root.glob("**/*")):
    if path.is_file():
        h = hashlib.sha256(path.read_bytes()).hexdigest()
        report["artifacts"].append({
            "path": str(path.relative_to(root)),
            "sha256": h,
            "result": "clean",
            "yaraRules": []
        })
json.dump(report, sys.stdout, indent=2)
PY
```
3. Validate report against schema:
```bash
python - <<'PY'
import json
import jsonschema  # must be pre-installed in the offline environment
schema = json.load(open("docs/modules/airgap/schemas/av-report.schema.json"))
jsonschema.validate(json.load(open("reports/av-report.json")), schema)
PY
```
4. Optionally sign report (detached):
```bash
openssl dgst -sha256 -sign airgap-av-key.pem reports/av-report.json > reports/av-report.sig
```
5. Update `manifest.json`:
- Set `avScan.status` to `clean` or `findings`.
- `avScan.reportPath` and `avScan.reportSha256` must match the generated report.
## Acceptance checks
- Report validates against `docs/modules/airgap/schemas/av-report.schema.json`.
- `manifest.json` hashes updated and verified via `src/AirGap/scripts/verify-manifest.sh`.
- If any artifact result is `malicious`/`suspicious`, bundle must be rejected and re-scanned after remediation.
## References
- Manifest schema: `docs/modules/airgap/schemas/manifest.schema.json`
- Sample report: `docs/modules/airgap/samples/av-report.sample.json`
- Manifest verifier: `src/AirGap/scripts/verify-manifest.sh`


@@ -0,0 +1,57 @@
# Offline Kit Import Verification Runbook
This runbook supports AIRGAP-MANIFEST-510-010, AIRGAP-REPLAY-510-013, and AIRGAP-VERIFY-510-014. It validates bundles fully offline and enforces replay depth.
## Replay depth levels (manifest `replayPolicy`)
- `hash-only`: verify manifest/bundle digests, staleness window, optional signature.
- `full-recompute`: hash-only + every chunk hash + AV report hash.
- `policy-freeze`: full-recompute + manifest policies must include the sealed policy hash (prevents imports with drifting policy/graph material).
## Quick steps
```bash
src/AirGap/scripts/verify-kit.sh \
--manifest offline-kit/manifest.json \
--bundle offline-kit/bundle.tar.gz \
--signature offline-kit/manifest.sig --pubkey offline-kit/manifest.pub.pem \
--av-report offline-kit/reports/av-report.json \
--receipt offline-kit/receipts/ingress.json \
--sealed-policy-hash "aa55..." \
--depth policy-freeze
```
## What the script enforces
1) Manifest & bundle digests match (`hashes.*`).
2) Optional manifest signature is valid (OpenSSL).
3) Staleness: `createdAt` must be within `stalenessWindowHours` of `--now` (defaults to UTC now).
4) AV: `avScan.status` must not be `findings`; if `reportSha256` is present, the provided report hash must match.
5) Chunks (full-recompute/policy-freeze): every `chunks[].path` exists relative to the manifest and matches its recorded SHA-256.
6) Policy-freeze: `--sealed-policy-hash` must appear in `policies[].sha256`.
7) Optional: `--expected-graph-sha` checks the graph chunk hash; `--receipt` reuses `verify-receipt.sh` to bind the receipt to the manifest/bundle hashes.
Exit codes: hash mismatch (3/4), staleness (5), AV issues (6-8), chunk drift (9-10), graph mismatch (11), policy drift (12-13), bad depth (14).
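Check 5 (chunk recompute) amounts to re-hashing every listed path; a sketch of that logic, with `verify_chunks` as an illustrative name rather than the script's actual implementation:

```python
import hashlib
from pathlib import Path

def verify_chunks(manifest: dict, root: Path) -> list[str]:
    """Re-hash every chunks[].path relative to the manifest directory and
    compare against the recorded SHA-256 (full-recompute / policy-freeze
    depths). Returns the drifted or missing paths; empty means all match."""
    drifted = []
    for chunk in manifest.get("chunks", []):
        path = root / chunk["path"]
        if not path.is_file():
            drifted.append(chunk["path"])
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest != chunk["sha256"]:
            drifted.append(chunk["path"])
    return drifted
```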
## Controller verify endpoint (server-side guard)
`POST /system/airgap/verify` (scope `airgap:verify`) expects `VerifyRequest`:
```jsonc
{
"depth": "PolicyFreeze",
"manifestSha256": "...",
"bundleSha256": "...",
"computedManifestSha256": "...", // from offline verifier
"computedBundleSha256": "...",
"manifestCreatedAt": "2025-12-02T00:00:00Z",
"stalenessWindowHours": 168,
"bundlePolicyHash": "aa55...",
"sealedPolicyHash": "aa55..." // optional, controller fills from state if omitted
}
```
The controller applies the same replay rules and returns `{ "valid": true|false, "reason": "..." }`.
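Calling the endpoint from a script might look like the sketch below; `$CONTROLLER` and `$TOKEN` are placeholders, and the `airgap:verify` scope is assumed to be carried by the bearer token. The truncated hashes mirror the sample above.

```shell
# Build the VerifyRequest body shown above and POST it to the controller.
req=$(mktemp)
cat > "$req" <<'JSON'
{
  "depth": "PolicyFreeze",
  "manifestSha256": "29d58b9f...",
  "bundleSha256": "d3c3f6c7...",
  "computedManifestSha256": "29d58b9f...",
  "computedBundleSha256": "d3c3f6c7...",
  "manifestCreatedAt": "2025-12-02T00:00:00Z",
  "stalenessWindowHours": 168,
  "bundlePolicyHash": "aa55..."
}
JSON

curl -sS -X POST "$CONTROLLER/system/airgap/verify" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  --data @"$req" || echo "controller unreachable"
```

Parse `valid`/`reason` from the response before proceeding with activation.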
## References
- Schema: `docs/modules/airgap/schemas/manifest.schema.json`
- Samples: `docs/modules/airgap/samples/offline-kit-manifest.sample.json`, `docs/modules/airgap/samples/av-report.sample.json`, `docs/modules/airgap/samples/receipt.sample.json`
- Scripts: `src/AirGap/scripts/verify-kit.sh`, `src/AirGap/scripts/verify-manifest.sh`, `src/AirGap/scripts/verify-receipt.sh`

@@ -0,0 +1,39 @@
# AirGap Quarantine Investigation Runbook
## Purpose
Quarantine preserves failed bundle imports for offline forensic analysis. It keeps the original bundle and the verification context (reason + logs) so operators can diagnose tampering, trust-root drift, or packaging issues without re-running in an online environment.
## Location & Structure
Default root: `/updates/quarantine`
Per-tenant layout:
`/updates/quarantine/<tenantId>/<timestamp>-<reason>-<id>/`
Removal staging:
`/updates/quarantine/<tenantId>/.removed/<quarantineId>/`
## Files in a quarantine entry
- `bundle.tar.zst` - the original bundle as provided
- `manifest.json` - bundle manifest (when available)
- `verification.log` - validation step output (TUF/DSSE/Merkle/rotation/monotonicity, etc.)
- `failure-reason.txt` - human-readable failure summary (reason + timestamp + metadata)
- `quarantine.json` - structured metadata for listing/automation
## Investigation steps (offline)
1. Identify the tenant and locate the quarantine root on the importer host.
2. Pick the newest quarantine entry for the tenant (timestamp prefix).
3. Read `failure-reason.txt` first to capture the top-level reason and metadata.
4. Review `verification.log` for the precise failing step.
5. If needed, extract and inspect `bundle.tar.zst` in an isolated workspace (no network).
6. Decide whether the entry should be retained (for audit) or removed after investigation.
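The first few steps above can be scripted. This sketch builds a throwaway fixture so it runs anywhere; on a real host, point `root` at `/updates/quarantine` instead.

```shell
# Pick the newest quarantine entry for a tenant and surface its failure reason.
newest_quarantine() {   # usage: newest_quarantine <root> <tenantId>
  ls -1d "$1/$2"/*/ 2>/dev/null | sort | tail -n 1
}

root=$(mktemp -d)       # fixture; real root is /updates/quarantine
entry_dir="$root/tenant-default/20251202T000700Z-dsse-invalid-abc123"
mkdir -p "$entry_dir"
printf 'dsse: signature invalid (trust root mismatch)\n' > "$entry_dir/failure-reason.txt"

entry=$(newest_quarantine "$root" tenant-default)
echo "Newest entry: $entry"
cat "$entry/failure-reason.txt"
```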
## Removal & Retention
- Removal requires a human-provided reason (audit trail). Implementations should use the quarantine service's remove operation, which moves entries under `.removed/`.
- Retention and quota controls are configured via `AirGap:Quarantine` settings (root, TTL, max size); TTL cleanup can remove entries older than the retention period.
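The `AirGap:Quarantine` settings might be bound like the sketch below. Only the root, TTL, and max-size knobs are mentioned above, so the exact key names here are assumptions:

```json
{
  "AirGap": {
    "Quarantine": {
      "Root": "/updates/quarantine",
      "RetentionDays": 90,
      "MaxSizeBytes": 10737418240
    }
  }
}
```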
## Common failure categories
- `tuf:*` - invalid/expired metadata or snapshot hash mismatch
- `dsse:*` - signature invalid or trust root mismatch
- `merkle-*` - payload entry set invalid or empty
- `rotation:*` - root rotation policy failure (dual approval, no-op rotation, etc.)
- `version-non-monotonic:*` - rollback prevention triggered (force activation requires a justification)
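For bulk triage, the reason prefixes above can be bucketed mechanically; the bucket labels in this sketch are illustrative:

```shell
# Bucket a failure reason string by its prefix (prefixes from the list above).
categorize() {
  case "$1" in
    tuf:*)                    echo "metadata" ;;
    dsse:*)                   echo "signature" ;;
    merkle-*)                 echo "payload" ;;
    rotation:*)               echo "rotation-policy" ;;
    version-non-monotonic:*)  echo "rollback-prevention" ;;
    *)                        echo "other" ;;
  esac
}

categorize "dsse:invalid-signature"   # prints: signature
```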

@@ -0,0 +1,23 @@
{
"$schema": "../av-report.schema.json",
"scanner": "clamav",
"scannerVersion": "1.4.1",
"startedAt": "2025-12-02T00:02:00Z",
"completedAt": "2025-12-02T00:04:30Z",
"status": "clean",
"artifacts": [
{
"path": "chunks/advisories-0001.tzst",
"sha256": "1234123412341234123412341234123412341234123412341234123412341234",
"result": "clean",
"yaraRules": []
},
{
"path": "chunks/vex-0001.tzst",
"sha256": "4321432143214321432143214321432143214321432143214321432143214321",
"result": "clean",
"yaraRules": []
}
],
"errors": []
}
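The manifest's `avScan.reportSha256` is simply the SHA-256 of the report file as shipped. A sketch of producing it (the report content here is a stand-in):

```shell
# Compute the digest that the manifest records as avScan.reportSha256.
report=$(mktemp)
printf '{"status":"clean"}' > "$report"   # stand-in for the real report file
reportSha256=$(sha256sum "$report" | cut -d' ' -f1)
echo "reportSha256: $reportSha256"
```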

@@ -0,0 +1,3 @@
a:1
b:2
c:3

@@ -0,0 +1,44 @@
{
"$schema": "../manifest.schema.json",
"schemaVersion": "1.0.0",
"bundleId": "offline-kit:concelier:2025-12-02",
"tenant": "default",
"environment": "prod",
"createdAt": "2025-12-02T00:00:00Z",
"stalenessWindowHours": 168,
"replayPolicy": "policy-freeze",
"tools": [
    { "name": "concelier-exporter", "version": "2.5.0", "sha256": "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef" },
{ "name": "trivy-db", "version": "0.48.0", "sha256": "89abcdef0123456789abcdef0123456789abcdef0123456789abcdef01234567" }
],
"feeds": [
{ "name": "redhat-csaf", "snapshot": "2025-12-01", "sha256": "fedcba9876543210fedcba9876543210fedcba9876543210fedcba9876543210", "stalenessHours": 72 },
{ "name": "osv", "snapshot": "2025-12-01T23:00:00Z", "sha256": "0f0e0d0c0b0a09080706050403020100ffeeddccbbaa99887766554433221100", "stalenessHours": 24 }
],
"policies": [
{ "name": "policy-bundle", "version": "1.4.2", "sha256": "aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55aa55" }
],
"chunks": [
{ "path": "chunks/advisories-0001.tzst", "sha256": "1234123412341234123412341234123412341234123412341234123412341234", "size": 1048576, "kind": "advisory" },
{ "path": "chunks/vex-0001.tzst", "sha256": "4321432143214321432143214321432143214321432143214321432143214321", "size": 524288, "kind": "vex" }
],
"avScan": {
"status": "clean",
"scanner": "clamav 1.4.1",
"scanAt": "2025-12-02T00:05:00Z",
"reportPath": "reports/av-scan.txt",
"reportSha256": "bb66bb66bb66bb66bb66bb66bb66bb66bb66bb66bb66bb66bb66bb66bb66bb66"
},
"hashes": {
"manifestSha256": "29d58b9fdc5c4e65b26c03f3bd9f442ff0c7f8514b8a9225f8b6417ffabc0101",
"bundleSha256": "d3c3f6c75c6a3f0906bcee457cc77a2d6d7c0f9d1a1d7da78c0d2ab8e0dba111"
},
"signatures": [
{
"type": "dsse",
"keyId": "airgap-manifest-dev",
"signature": "MEQCIGVyb3JrZXktc2lnbmF0dXJlLXNob3J0",
"envelopeDigest": "sha256:cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77"
}
]
}

@@ -0,0 +1,21 @@
{
"$schema": "../receipt.schema.json",
"schemaVersion": "1.0.0",
"receiptId": "receipt:ingress:2025-12-02T00-00Z",
"direction": "ingress",
"bundleId": "offline-kit:concelier:2025-12-02",
"tenant": "default",
"operator": { "id": "op-123", "role": "airgap-controller" },
"occurredAt": "2025-12-02T00:06:00Z",
"decision": "allow",
"hashes": {
"bundleSha256": "d3c3f6c75c6a3f0906bcee457cc77a2d6d7c0f9d1a1d7da78c0d2ab8e0dba111",
"manifestSha256": "29d58b9fdc5c4e65b26c03f3bd9f442ff0c7f8514b8a9225f8b6417ffabc0101"
},
"dsse": {
"envelopeDigest": "sha256:cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77cc77",
"signer": "airgap-receipts-dev",
"rekorUuid": "11111111-2222-3333-4444-555555555555"
},
"notes": "Ingress verified, AV clean, staleness within window."
}

@@ -0,0 +1,36 @@
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "https://stellaops.local/airgap/av-report.schema.json",
"title": "Offline AV/YARA Scan Report",
"type": "object",
"additionalProperties": false,
"required": ["scanner", "scannerVersion", "startedAt", "completedAt", "status", "artifacts"],
"properties": {
"scanner": { "type": "string" },
"scannerVersion": { "type": "string" },
"startedAt": { "type": "string", "format": "date-time" },
"completedAt": { "type": "string", "format": "date-time" },
"status": { "type": "string", "enum": ["clean", "findings", "error"] },
"signature": { "type": "string", "description": "Optional detached signature over this report (base64)" },
"artifacts": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"required": ["path", "sha256", "result"],
"properties": {
"path": { "type": "string" },
"sha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" },
"result": { "type": "string", "enum": ["clean", "suspicious", "malicious", "error"] },
"yaraRules": { "type": "array", "items": { "type": "string" }, "uniqueItems": true },
"notes": { "type": "string" }
}
},
"uniqueItems": true
},
"errors": {
"type": "array",
"items": { "type": "string" }
}
}
}

@@ -0,0 +1,123 @@
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "https://stellaops.local/airgap/manifest.schema.json",
"title": "Offline Kit Manifest",
"type": "object",
"additionalProperties": false,
"required": [
"schemaVersion",
"bundleId",
"tenant",
"environment",
"createdAt",
"stalenessWindowHours",
"replayPolicy",
"tools",
"feeds",
"policies",
"chunks",
"hashes"
],
"properties": {
"schemaVersion": { "type": "string", "pattern": "^1\\.\\d+\\.\\d+$" },
"bundleId": { "type": "string", "pattern": "^offline-kit:[A-Za-z0-9._:-]+$" },
"tenant": { "type": "string", "minLength": 1 },
"environment": { "type": "string", "enum": ["prod", "stage", "dev", "test"] },
"createdAt": { "type": "string", "format": "date-time" },
"stalenessWindowHours": { "type": "integer", "minimum": 0 },
"replayPolicy": { "type": "string", "enum": ["hash-only", "full-recompute", "policy-freeze"] },
"tools": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"required": ["name", "version", "sha256"],
"properties": {
"name": { "type": "string" },
"version": { "type": "string" },
"sha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" }
}
},
"uniqueItems": true
},
"feeds": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"required": ["name", "snapshot", "sha256"],
"properties": {
"name": { "type": "string" },
"snapshot": { "type": "string" },
"sha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" },
"stalenessHours": { "type": "integer", "minimum": 0 }
}
},
"uniqueItems": true
},
"policies": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"required": ["name", "version", "sha256"],
"properties": {
"name": { "type": "string" },
"version": { "type": "string" },
"sha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" }
}
},
"uniqueItems": true
},
"chunks": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"required": ["path", "sha256", "size"],
"properties": {
"path": { "type": "string" },
"sha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" },
"size": { "type": "integer", "minimum": 0 },
"kind": { "type": "string", "enum": ["advisory", "sbom", "vex", "policy", "graph", "tooling", "other"] }
}
},
"uniqueItems": true
},
"avScan": {
"type": "object",
"additionalProperties": false,
"required": ["status"],
"properties": {
"status": { "type": "string", "enum": ["not_run", "clean", "findings"] },
"scanner": { "type": "string" },
"scanAt": { "type": "string", "format": "date-time" },
"reportPath": { "type": "string" },
"reportSha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" }
}
},
"hashes": {
"type": "object",
"additionalProperties": false,
"required": ["manifestSha256", "bundleSha256"],
"properties": {
"manifestSha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" },
"bundleSha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" }
}
},
"signatures": {
"type": "array",
"items": {
"type": "object",
"additionalProperties": false,
"required": ["type", "keyId", "signature"],
"properties": {
"type": { "type": "string", "enum": ["dsse", "jws-detached"] },
"keyId": { "type": "string" },
"signature": { "type": "string" },
"envelopeDigest": { "type": "string", "pattern": "^sha256:[A-Fa-f0-9]{64}$" }
}
}
}
}
}
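The `chunks[]` entries drive the full-recompute step: each recorded digest is recomputed from the file on disk. A self-contained sketch (manifest parsing is elided; the path/digest pairs would come from `chunks[]`):

```shell
# Recompute each recorded chunk digest; flag drift per chunk.
kit=$(mktemp -d)
mkdir -p "$kit/chunks"
printf 'advisory-data' > "$kit/chunks/advisories-0001.tzst"
recorded=$(sha256sum "$kit/chunks/advisories-0001.tzst" | cut -d' ' -f1)

status=ok
while read -r path sha; do
  actual=$(sha256sum "$kit/$path" | cut -d' ' -f1)
  [ "$actual" = "$sha" ] || status="drift:$path"
done <<EOF
chunks/advisories-0001.tzst $recorded
EOF
echo "$status"
```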

@@ -0,0 +1,55 @@
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"$id": "https://stellaops.local/airgap/receipt.schema.json",
"title": "AirGap Ingress/Egress Receipt",
"type": "object",
"additionalProperties": false,
"required": [
"schemaVersion",
"receiptId",
"direction",
"bundleId",
"tenant",
"operator",
"occurredAt",
"decision",
"hashes"
],
"properties": {
"schemaVersion": { "type": "string", "pattern": "^1\\.\\d+\\.\\d+$" },
"receiptId": { "type": "string", "pattern": "^receipt:[A-Za-z0-9._:-]+$" },
"direction": { "type": "string", "enum": ["ingress", "egress"] },
"bundleId": { "type": "string", "pattern": "^offline-kit:[A-Za-z0-9._:-]+$" },
"tenant": { "type": "string", "minLength": 1 },
"operator": {
"type": "object",
"additionalProperties": false,
"required": ["id", "role"],
"properties": {
"id": { "type": "string" },
"role": { "type": "string" }
}
},
"occurredAt": { "type": "string", "format": "date-time" },
"decision": { "type": "string", "enum": ["allow", "deny", "quarantine"] },
"hashes": {
"type": "object",
"additionalProperties": false,
"required": ["bundleSha256", "manifestSha256"],
"properties": {
"bundleSha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" },
"manifestSha256": { "type": "string", "pattern": "^[A-Fa-f0-9]{64}$" }
}
},
"dsse": {
"type": "object",
"additionalProperties": false,
"properties": {
"envelopeDigest": { "type": "string", "pattern": "^sha256:[A-Fa-f0-9]{64}$" },
"signer": { "type": "string" },
"rekorUuid": { "type": "string" }
}
},
"notes": { "type": "string" }
}
}

@@ -0,0 +1,43 @@
{
"$schema": "https://json-schema.org/draft/2020-12/schema",
"title": "StellaOps Time Anchor",
"type": "object",
"required": ["anchorTime", "source", "format", "tokenDigest"],
"properties": {
"anchorTime": {
"description": "UTC timestamp asserted by the time token (RFC3339/ISO-8601)",
"type": "string",
"format": "date-time"
},
"source": {
      "description": "Logical source of the time token (e.g., roughtime or rfc3161)",
"type": "string",
"enum": ["roughtime", "rfc3161"]
},
"format": {
"description": "Payload format identifier (e.g., draft-roughtime-v1, rfc3161)",
"type": "string"
},
"tokenDigest": {
"description": "SHA-256 of the raw time token bytes, hex-encoded",
"type": "string",
"pattern": "^[0-9a-fA-F]{64}$"
},
"signatureFingerprint": {
      "description": "Fingerprint of the signer key (hex); optional until trust roots are finalized",
"type": "string",
"pattern": "^[0-9a-fA-F]{16,128}$"
},
"verification": {
"description": "Result of local verification (if performed)",
"type": "object",
"properties": {
"status": {"type": "string", "enum": ["unknown", "passed", "failed"]},
"reason": {"type": "string"}
},
"required": ["status"],
"additionalProperties": false
}
},
"additionalProperties": false
}
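`tokenDigest` is the hex-encoded SHA-256 of the raw token bytes; a minimal sketch with stand-in bytes:

```shell
# Derive tokenDigest from raw time-token bytes (stand-in content).
token=$(mktemp)
printf 'raw-roughtime-token-bytes' > "$token"
tokenDigest=$(sha256sum "$token" | cut -d' ' -f1)
echo "$tokenDigest"
```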

@@ -0,0 +1,20 @@
{
"version": 1,
"roughtime": [
{
"name": "stellaops-test-roughtime",
"publicKeyBase64": "dGVzdC1yb3VnaHRpbWUtcHViLWtleQ==",
"validFrom": "2025-01-01T00:00:00Z",
"validTo": "2026-01-01T00:00:00Z"
}
],
"rfc3161": [
{
"name": "stellaops-test-tsa",
"certificatePem": "-----BEGIN CERTIFICATE-----\nMIIBszCCAVmgAwIBAgIUYPXPLACEHOLDERKEYm7ri5bzsYqvSwwDQYJKoZIhvcNAQELBQAwETEPMA0GA1UEAwwGU3RlbGxhMB4XDTI1MDEwMTAwMDAwMFoXDTI2MDEwMTAwMDAwMFowETEPMA0GA1UEAwwGU3RlbGxhMHYwEAYHKoZIzj0CAQYFK4EEACIDYgAEPLACEHOLDERuQjVekA7gQtaQ6UiI4bYbw2bG8xwDthQqLehCDXXWix9TAAEbnII1xF4Zk12Y0wUjiJB82H4x6HTDY0Hes74AUFyi0A39p0Y0ffSZlnzCwzmxrSYzYHbpbb8WZKGa+jUzBRMB0GA1UdDgQWBBSPLACEHOLDERRoKdqaLKv8Bf+FfoUzAfBgNVHSMEGDAWgBSPLACEHOLDERRoKdqaLKv8Bf+FfoUzAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4IBAQCPLACEHOLDER\n-----END CERTIFICATE-----",
"validFrom": "2025-01-01T00:00:00Z",
"validTo": "2026-01-01T00:00:00Z",
"fingerprintSha256": "0000000000000000000000000000000000000000000000000000000000000000"
}
]
}

@@ -0,0 +1,9 @@
{
"AirGap": {
"TenantId": "tenant-default",
"Staleness": {
"WarningSeconds": 3600,
"BreachSeconds": 7200
}
}
}
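Assuming `WarningSeconds`/`BreachSeconds` gate a staleness status as their names suggest, the check reduces to two threshold comparisons:

```shell
# Classify bundle age against the warning/breach thresholds above.
staleness_status() {   # usage: staleness_status <ageSeconds> <warning> <breach>
  if   [ "$1" -ge "$3" ]; then echo "breach"
  elif [ "$1" -ge "$2" ]; then echo "warning"
  else echo "fresh"
  fi
}

staleness_status 5400 3600 7200   # prints: warning
```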