feat(rate-limiting): Implement core rate limiting functionality with configuration, decision-making, metrics, middleware, and service registration

- Add RateLimitConfig for configuration management with YAML binding support.
- Introduce RateLimitDecision to encapsulate the result of rate limit checks.
- Implement RateLimitMetrics for OpenTelemetry metrics tracking.
- Create RateLimitMiddleware for enforcing rate limits on incoming requests.
- Develop RateLimitService to orchestrate instance and environment rate limit checks.
- Add RateLimitServiceCollectionExtensions for dependency injection registration.

docs/airgap/epss-bundles.md (new file)
# EPSS Air-Gapped Bundles Guide
## Overview
This guide describes how to create, distribute, and import EPSS (Exploit Prediction Scoring System) data bundles for air-gapped StellaOps deployments. EPSS bundles enable offline vulnerability risk scoring with the same probabilistic threat intelligence available to online deployments.
**Key Concepts**:
- **Risk Bundle**: Aggregated security data (EPSS + KEV + advisories) for offline import
- **EPSS Snapshot**: Single-day EPSS scores for all CVEs (~300k rows)
- **Staleness Threshold**: How old EPSS data can be before fallback to CVSS-only
- **Deterministic Import**: Same bundle imported twice yields identical database state
---
## Bundle Structure
### Standard Risk Bundle Layout
```
risk-bundle-2025-12-17/
├── manifest.json # Bundle metadata and checksums
├── epss/
│ ├── epss_scores-2025-12-17.csv.zst # EPSS data (ZSTD compressed)
│ └── epss_metadata.json # EPSS provenance
├── kev/
│ └── kev-catalog.json # CISA KEV catalog
├── advisories/
│ ├── nvd-updates.ndjson.zst
│ └── ghsa-updates.ndjson.zst
└── signatures/
├── bundle.dsse.json # DSSE signature (optional)
└── bundle.sha256sums # File integrity checksums
```
### manifest.json
```json
{
"bundle_id": "risk-bundle-2025-12-17",
"created_at": "2025-12-17T00:00:00Z",
"created_by": "stellaops-bundler-v1.2.3",
"bundle_type": "risk",
"schema_version": "v1",
"contents": {
"epss": {
"model_date": "2025-12-17",
"file": "epss/epss_scores-2025-12-17.csv.zst",
"sha256": "abc123...",
"size_bytes": 15728640,
"row_count": 231417
},
"kev": {
"catalog_version": "2025-12-17",
"file": "kev/kev-catalog.json",
"sha256": "def456...",
"known_exploited_count": 1247
},
"advisories": {
"nvd": {
"file": "advisories/nvd-updates.ndjson.zst",
"sha256": "ghi789...",
"record_count": 1523
},
"ghsa": {
"file": "advisories/ghsa-updates.ndjson.zst",
"sha256": "jkl012...",
"record_count": 8734
}
}
},
"signature": {
"type": "dsse",
"file": "signatures/bundle.dsse.json",
"key_id": "stellaops-bundler-2025",
"algorithm": "ed25519"
}
}
```
### epss/epss_metadata.json
```json
{
"model_date": "2025-12-17",
"model_version": "v2025.12.17",
"published_date": "2025-12-17",
"row_count": 231417,
"source_uri": "https://epss.empiricalsecurity.com/epss_scores-2025-12-17.csv.gz",
"retrieved_at": "2025-12-17T00:05:32Z",
"file_sha256": "abc123...",
"decompressed_sha256": "xyz789...",
"compression": "zstd",
"compression_level": 19
}
```
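If the `stellaops-bundler` CLI is not available on the build host, equivalent metadata can be assembled by hand. The sketch below is illustrative only: it assumes the FIRST CSV layout (a leading `#` comment line followed by a `cve,epss,percentile` header) and the Python `zstandard` package, and it mirrors the field names in the example above. The bundler CLI remains the supported path.
```python
#!/usr/bin/env python3
"""Sketch: build epss_metadata.json for a ZSTD-compressed EPSS snapshot.

Assumes FIRST's usual CSV layout (a '#' comment line, then a
'cve,epss,percentile' header) and the 'zstandard' package.
"""
import hashlib
import json
import sys
from datetime import datetime, timezone

import zstandard  # pip install zstandard

def build_metadata(zst_path: str, model_date: str, source_uri: str) -> dict:
    with open(zst_path, "rb") as fh:
        compressed = fh.read()
    decompressed = zstandard.ZstdDecompressor().decompress(
        compressed, max_output_size=1 << 31)
    # Count data rows: skip '#' comment lines and the column header.
    lines = [l for l in decompressed.decode("utf-8", errors="replace").splitlines()
             if l and not l.startswith("#")]
    row_count = max(len(lines) - 1, 0)
    return {
        "model_date": model_date,
        "model_version": f"v{model_date.replace('-', '.')}",
        "published_date": model_date,
        "row_count": row_count,
        "source_uri": source_uri,
        "retrieved_at": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "file_sha256": hashlib.sha256(compressed).hexdigest(),
        "decompressed_sha256": hashlib.sha256(decompressed).hexdigest(),
        "compression": "zstd",
        "compression_level": 19,
    }

if __name__ == "__main__":
    zst_path, model_date, source_uri = sys.argv[1:4]
    print(json.dumps(build_metadata(zst_path, model_date, source_uri), indent=2))
```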
---
## Creating EPSS Bundles
### Prerequisites
**Build System Requirements**:
- Internet access (for fetching FIRST.org data)
- StellaOps Bundler CLI: `stellaops-bundler`
- ZSTD compression: `zstd` (v1.5+)
- Python 3.10+ (for verification scripts)
**Permissions**:
- Read access to FIRST.org EPSS API/CSV endpoints
- Write access to bundle staging directory
- (Optional) Signing key for DSSE signatures
### Daily Bundle Creation (Automated)
**Recommended Schedule**: Daily at 01:00 UTC (after FIRST publishes at ~00:00 UTC)
**Script**: `scripts/create-risk-bundle.sh`
```bash
#!/bin/bash
set -euo pipefail
BUNDLE_DATE=$(date -u +%Y-%m-%d)
BUNDLE_DIR="risk-bundle-${BUNDLE_DATE}"
STAGING_DIR="/tmp/stellaops-bundles/${BUNDLE_DIR}"
echo "Creating risk bundle for ${BUNDLE_DATE}..."
# 1. Create staging directory
mkdir -p "${STAGING_DIR}"/{epss,kev,advisories,signatures}
# 2. Fetch EPSS data from FIRST.org
echo "Fetching EPSS data..."
curl -sL "https://epss.empiricalsecurity.com/epss_scores-${BUNDLE_DATE}.csv.gz" \
-o "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.gz"
# 3. Decompress and re-compress with ZSTD (better compression for offline)
gunzip "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.gz"
zstd -19 -q "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv" \
-o "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.zst"
rm "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv"
# 4. Generate EPSS metadata
stellaops-bundler epss metadata \
--file "${STAGING_DIR}/epss/epss_scores-${BUNDLE_DATE}.csv.zst" \
--model-date "${BUNDLE_DATE}" \
--output "${STAGING_DIR}/epss/epss_metadata.json"
# 5. Fetch KEV catalog
echo "Fetching KEV catalog..."
curl -sL "https://www.cisa.gov/sites/default/files/feeds/known_exploited_vulnerabilities.json" \
-o "${STAGING_DIR}/kev/kev-catalog.json"
# 6. Fetch advisory updates (optional, for comprehensive bundles)
# stellaops-bundler advisories fetch ...
# 7. Generate manifest
echo "Generating manifest..."
stellaops-bundler manifest create \
  --bundle-dir "${STAGING_DIR}" \
  --bundle-id "${BUNDLE_DIR}" \
  --output "${STAGING_DIR}/manifest.json"
# 8. Generate checksums (after the manifest so manifest.json is covered; signatures/ is excluded)
echo "Generating checksums..."
(cd "${STAGING_DIR}" && find . -type f ! -path "./signatures/*" -exec sha256sum {} \;) \
  > "${STAGING_DIR}/signatures/bundle.sha256sums"
# 9. Sign bundle (if signing key available)
if [ -n "${SIGNING_KEY:-}" ]; then
echo "Signing bundle..."
stellaops-bundler sign \
--manifest "${STAGING_DIR}/manifest.json" \
--key "${SIGNING_KEY}" \
--output "${STAGING_DIR}/signatures/bundle.dsse.json"
fi
# 10. Create tarball
echo "Creating tarball..."
tar -C "$(dirname "${STAGING_DIR}")" -czf "/var/stellaops/bundles/${BUNDLE_DIR}.tar.gz" \
"$(basename "${STAGING_DIR}")"
echo "Bundle created: /var/stellaops/bundles/${BUNDLE_DIR}.tar.gz"
echo "Size: $(du -h /var/stellaops/bundles/${BUNDLE_DIR}.tar.gz | cut -f1)"
# 11. Verify bundle
stellaops-bundler verify "/var/stellaops/bundles/${BUNDLE_DIR}.tar.gz"
```
**Cron Schedule**:
```cron
# Daily at 01:00 UTC (after FIRST publishes EPSS at ~00:00 UTC)
0 1 * * * /opt/stellaops/scripts/create-risk-bundle.sh >> /var/log/stellaops/bundler.log 2>&1
```
---
## Distributing Bundles
### Transfer Methods
#### 1. Physical Media (Highest Security)
```bash
# Copy to USB drive
cp /var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz /media/usb/stellaops/
# Verify the copy against the original
sha256sum /var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz \
  /media/usb/stellaops/risk-bundle-2025-12-17.tar.gz
```
#### 2. Secure File Transfer (Network Isolation)
```bash
# SCP over dedicated management network
scp /var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz \
admin@airgap-gateway.internal:/incoming/
# Verify after transfer
ssh admin@airgap-gateway.internal \
"sha256sum /incoming/risk-bundle-2025-12-17.tar.gz"
```
#### 3. Offline Bundle Repository (CD/DVD)
```bash
# Burn to CD/DVD (for regulated industries)
growisofs -Z /dev/sr0 \
-R -J -joliet-long \
-V "StellaOps Risk Bundle 2025-12-17" \
/var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz
# Verify the disc by mounting it and comparing the file checksum with the original
mount /dev/sr0 /mnt
sha256sum /mnt/risk-bundle-2025-12-17.tar.gz \
  /var/stellaops/bundles/risk-bundle-2025-12-17.tar.gz
```
### Storage Recommendations
**Bundle Retention**:
- **Online bundler**: Keep last 90 days (rolling cleanup; see the sketch after the directory layout below)
- **Air-gapped system**: Keep last 30 days minimum (for rollback)
**Naming Convention**:
- Pattern: `risk-bundle-YYYY-MM-DD.tar.gz`
- Example: `risk-bundle-2025-12-17.tar.gz`
**Directory Structure** (air-gapped system):
```
/opt/stellaops/bundles/
├── incoming/ # Transfer staging area
├── verified/ # Verified, ready to import
├── imported/ # Successfully imported (archive)
└── failed/ # Failed verification/import (quarantine)
```
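The retention targets above can be enforced with a small cleanup job. The following is a minimal sketch, not a StellaOps tool: it assumes the `risk-bundle-YYYY-MM-DD.tar.gz` naming convention and prunes anything older than the retention window, defaulting to a dry run.
```python
#!/usr/bin/env python3
"""Sketch: prune risk bundles older than a retention window.

Not a StellaOps tool; assumes the risk-bundle-YYYY-MM-DD.tar.gz
naming convention described above. Dry run unless --delete is passed.
"""
import re
import sys
from datetime import date, timedelta
from pathlib import Path

PATTERN = re.compile(r"^risk-bundle-(\d{4}-\d{2}-\d{2})\.tar\.gz$")

def prune(bundle_dir: str, retention_days: int, dry_run: bool = True) -> None:
    cutoff = date.today() - timedelta(days=retention_days)
    for path in sorted(Path(bundle_dir).glob("risk-bundle-*.tar.gz")):
        match = PATTERN.match(path.name)
        if not match:
            continue  # Skip anything that does not follow the naming convention
        bundle_date = date.fromisoformat(match.group(1))
        if bundle_date < cutoff:
            print(f"{'Would delete' if dry_run else 'Deleting'}: {path}")
            if not dry_run:
                path.unlink()

if __name__ == "__main__":
    # e.g. keep 90 days on the online bundler, 30 on the air-gapped side
    prune(sys.argv[1] if len(sys.argv) > 1 else "/var/stellaops/bundles",
          retention_days=90, dry_run="--delete" not in sys.argv)
```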
---
## Importing Bundles (Air-Gapped System)
### Pre-Import Verification
**Step 1: Transfer to Verified Directory**
```bash
# Transfer from incoming to verified (manual approval gate)
sudo mv /opt/stellaops/bundles/incoming/risk-bundle-2025-12-17.tar.gz \
/opt/stellaops/bundles/verified/
```
**Step 2: Verify Bundle Integrity**
```bash
# Extract bundle
cd /opt/stellaops/bundles/verified
tar -xzf risk-bundle-2025-12-17.tar.gz
# Verify checksums
cd risk-bundle-2025-12-17
sha256sum -c signatures/bundle.sha256sums
# Expected output:
# epss/epss_scores-2025-12-17.csv.zst: OK
# epss/epss_metadata.json: OK
# kev/kev-catalog.json: OK
# manifest.json: OK
```
**Step 3: Verify DSSE Signature (if signed)**
```bash
stellaops-bundler verify-signature \
--manifest manifest.json \
--signature signatures/bundle.dsse.json \
--trusted-keys /etc/stellaops/trusted-keys.json
# Expected output:
# ✓ Signature valid
# ✓ Key ID: stellaops-bundler-2025
# ✓ Signed at: 2025-12-17T01:05:00Z
```
### Import Procedure
**Step 4: Import Bundle**
```bash
# Import using stellaops CLI
stellaops offline import \
--bundle /opt/stellaops/bundles/verified/risk-bundle-2025-12-17.tar.gz \
--verify \
--dry-run
# Review dry-run output, then execute
stellaops offline import \
--bundle /opt/stellaops/bundles/verified/risk-bundle-2025-12-17.tar.gz \
--verify
```
**Import Output**:
```
Importing risk bundle: risk-bundle-2025-12-17
✓ Manifest validated
✓ Checksums verified
✓ Signature verified
Importing EPSS data...
Model Date: 2025-12-17
Row Count: 231,417
✓ epss_import_runs created (import_run_id: 550e8400-...)
✓ epss_scores inserted (231,417 rows, 23.4s)
✓ epss_changes computed (12,345 changes, 8.1s)
✓ epss_current upserted (231,417 rows, 5.2s)
✓ Event emitted: epss.updated
Importing KEV catalog...
Known Exploited Count: 1,247
✓ kev_catalog updated
Import completed successfully in 41.2s
```
**Step 5: Verify Import**
```bash
# Check EPSS status
stellaops epss status
# Expected output:
# EPSS Status:
# Latest Model Date: 2025-12-17
# Source: bundle://risk-bundle-2025-12-17
# CVE Count: 231,417
# Staleness: FRESH (0 days)
# Import Time: 2025-12-17T10:30:00Z
# Query specific CVE to verify
stellaops epss get CVE-2024-12345
# Expected output:
# CVE-2024-12345
# Score: 0.42357
# Percentile: 88.2th
# Model Date: 2025-12-17
# Source: bundle://risk-bundle-2025-12-17
```
**Step 6: Archive Imported Bundle**
```bash
# Move to imported archive
sudo mv /opt/stellaops/bundles/verified/risk-bundle-2025-12-17.tar.gz \
/opt/stellaops/bundles/imported/
```
---
## Automation (Air-Gapped System)
### Automated Import on Arrival
**Script**: `/opt/stellaops/scripts/auto-import-bundle.sh`
```bash
#!/bin/bash
set -euo pipefail
INCOMING_DIR="/opt/stellaops/bundles/incoming"
VERIFIED_DIR="/opt/stellaops/bundles/verified"
IMPORTED_DIR="/opt/stellaops/bundles/imported"
FAILED_DIR="/opt/stellaops/bundles/failed"
LOG_FILE="/var/log/stellaops/auto-import.log"
log() {
echo "[$(date -Iseconds)] $*" | tee -a "${LOG_FILE}"
}
# Watch for new bundles in incoming/
for bundle in "${INCOMING_DIR}"/risk-bundle-*.tar.gz; do
[ -f "${bundle}" ] || continue
BUNDLE_NAME=$(basename "${bundle}")
log "Detected new bundle: ${BUNDLE_NAME}"
# Extract
EXTRACT_DIR="${VERIFIED_DIR}/${BUNDLE_NAME%.tar.gz}"
mkdir -p "${EXTRACT_DIR}"
tar -xzf "${bundle}" -C "${VERIFIED_DIR}"
# Verify checksums
if ! (cd "${EXTRACT_DIR}" && sha256sum -c signatures/bundle.sha256sums > /dev/null 2>&1); then
log "ERROR: Checksum verification failed for ${BUNDLE_NAME}"
mv "${bundle}" "${FAILED_DIR}/"
rm -rf "${EXTRACT_DIR}"
continue
fi
log "Checksum verification passed"
# Verify signature (if present)
if [ -f "${EXTRACT_DIR}/signatures/bundle.dsse.json" ]; then
if ! stellaops-bundler verify-signature \
--manifest "${EXTRACT_DIR}/manifest.json" \
--signature "${EXTRACT_DIR}/signatures/bundle.dsse.json" \
--trusted-keys /etc/stellaops/trusted-keys.json > /dev/null 2>&1; then
log "ERROR: Signature verification failed for ${BUNDLE_NAME}"
mv "${bundle}" "${FAILED_DIR}/"
rm -rf "${EXTRACT_DIR}"
continue
fi
log "Signature verification passed"
fi
# Import
if stellaops offline import --bundle "${bundle}" --verify >> "${LOG_FILE}" 2>&1; then
log "Import successful for ${BUNDLE_NAME}"
mv "${bundle}" "${IMPORTED_DIR}/"
rm -rf "${EXTRACT_DIR}"
else
log "ERROR: Import failed for ${BUNDLE_NAME}"
mv "${bundle}" "${FAILED_DIR}/"
fi
done
```
**Systemd Service**: `/etc/systemd/system/stellaops-bundle-watcher.service`
```ini
[Unit]
Description=StellaOps Bundle Auto-Import Watcher
After=network.target
[Service]
Type=simple
ExecStart=/bin/bash -c "/usr/bin/inotifywait -m -e close_write --format '%w%f' /opt/stellaops/bundles/incoming | while read -r file; do /opt/stellaops/scripts/auto-import-bundle.sh; done"
Restart=always
RestartSec=10
User=stellaops
Group=stellaops
[Install]
WantedBy=multi-user.target
```
**Enable Service**:
```bash
sudo systemctl enable stellaops-bundle-watcher
sudo systemctl start stellaops-bundle-watcher
```
---
## Staleness Handling
### Staleness Thresholds
| Days Since Model Date | Status | Action |
|-----------------------|--------|--------|
| 0-1 | FRESH | Normal operation |
| 2-7 | ACCEPTABLE | Continue, low-priority alert |
| 8-14 | STALE | Alert, plan bundle import |
| 15+ | VERY_STALE | Fallback to CVSS-only, urgent alert |
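These thresholds map directly to a small classification function. The sketch below is a hypothetical helper, not part of the StellaOps CLI; it simply mirrors the table.
```python
#!/usr/bin/env python3
"""Sketch: classify EPSS staleness using the thresholds in the table above.

Hypothetical helper, not part of the StellaOps CLI.
"""
from datetime import date

def staleness_status(model_date: date, today: date | None = None) -> tuple[int, str]:
    days_stale = ((today or date.today()) - model_date).days
    if days_stale <= 1:
        status = "FRESH"
    elif days_stale <= 7:
        status = "ACCEPTABLE"
    elif days_stale <= 14:
        status = "STALE"
    else:
        status = "VERY_STALE"  # Fallback to CVSS-only scoring
    return days_stale, status

if __name__ == "__main__":
    # Matches the SQL view example shown below: 7 days stale -> ACCEPTABLE
    print(staleness_status(date(2025, 12, 10), today=date(2025, 12, 17)))
```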
### Monitoring Staleness
**SQL Query**:
```sql
SELECT * FROM concelier.epss_model_staleness;
-- Output:
-- latest_model_date | latest_import_at | days_stale | staleness_status
-- 2025-12-10 | 2025-12-10 10:30:00+00 | 7 | ACCEPTABLE
```
**Prometheus Metric**:
```promql
epss_model_staleness_days{instance="airgap-prod"}
```
**Alert Rule**:
```yaml
- alert: EpssDataStale
  expr: epss_model_staleness_days > 7
  for: 1h
  labels:
    severity: warning
  annotations:
    summary: "EPSS data is stale ({{ $value }} days old)"
```
### Fallback Behavior
When EPSS data is VERY_STALE (>14 days):
**Automatic Fallback**:
- Scanner: Skip EPSS evidence, log a warning
- Policy: Use CVSS-only scoring (no EPSS bonus)
- Notifications: Disable EPSS-based alerts
- UI: Show a staleness banner, disable EPSS filters
**Manual Override** (to keep using stale data):
```yaml
# etc/scanner.yaml
scanner:
epss:
staleness_policy: continue # Options: fallback, continue, error
max_staleness_days: 30 # Override 14-day default
```
---
## Troubleshooting
### Bundle Import Failed: Checksum Mismatch
**Symptom**:
```
ERROR: Checksum verification failed
epss/epss_scores-2025-12-17.csv.zst: FAILED
```
**Diagnosis**:
1. Verify bundle was not corrupted during transfer:
```bash
# Compare with original
sha256sum risk-bundle-2025-12-17.tar.gz
```
2. Re-transfer bundle from source
**Resolution**:
- Delete corrupted bundle: `rm risk-bundle-2025-12-17.tar.gz`
- Re-download/re-transfer from bundler system
### Bundle Import Failed: Signature Invalid
**Symptom**:
```
ERROR: Signature verification failed
Invalid signature or untrusted key
```
**Diagnosis**:
1. Check trusted keys configured:
```bash
cat /etc/stellaops/trusted-keys.json
```
2. Verify key ID in bundle signature matches:
```bash
jq '.signature.key_id' manifest.json
```
**Resolution**:
- Update trusted keys file with current bundler public key
- Or: Skip signature verification (if signatures optional):
```bash
stellaops offline import --bundle risk-bundle-2025-12-17.tar.gz --skip-signature-verify
```
### No EPSS Data After Import
**Symptom**:
- Import succeeded, but `stellaops epss status` shows "No EPSS data"
**Diagnosis**:
```sql
-- Check import runs
SELECT * FROM concelier.epss_import_runs ORDER BY created_at DESC LIMIT 1;
-- Check epss_current count
SELECT COUNT(*) FROM concelier.epss_current;
```
**Resolution**:
1. If import_runs shows FAILED status:
- Check error column: `SELECT error FROM concelier.epss_import_runs WHERE status = 'FAILED'`
- Re-run import with verbose logging
2. If epss_current is empty:
- Manually trigger upsert:
```sql
-- Re-run upsert for latest model_date
-- (This SQL is safe to re-run)
INSERT INTO concelier.epss_current (cve_id, epss_score, percentile, model_date, import_run_id, updated_at)
SELECT s.cve_id, s.epss_score, s.percentile, s.model_date, s.import_run_id, NOW()
FROM concelier.epss_scores s
WHERE s.model_date = (SELECT MAX(model_date) FROM concelier.epss_import_runs WHERE status = 'SUCCEEDED')
ON CONFLICT (cve_id) DO UPDATE SET
epss_score = EXCLUDED.epss_score,
percentile = EXCLUDED.percentile,
model_date = EXCLUDED.model_date,
import_run_id = EXCLUDED.import_run_id,
updated_at = NOW();
```
---
## Best Practices
### 1. Bundle Import Cadence
**Recommended Schedule**:
- **Minimum**: Weekly (every Monday)
- **Preferred**: Twice weekly (Monday & Thursday)
- **Ideal**: Daily (if transfer logistics allow)
### 2. Bundle Verification Checklist
Before importing:
- [ ] Checksum verification passed
- [ ] Signature verification passed (if signed)
- [ ] Model date within acceptable staleness window
- [ ] Disk space available (estimate: 500MB per bundle)
- [ ] Backup current EPSS data (for rollback)
### 3. Rollback Plan
If a new bundle causes issues:
**1. Identify the problematic import run**:
```sql
SELECT import_run_id, model_date, status
FROM concelier.epss_import_runs
ORDER BY created_at DESC LIMIT 5;
```
**2. Delete the problematic import** (cascades to epss_scores and epss_changes):
```sql
DELETE FROM concelier.epss_import_runs
WHERE import_run_id = '550e8400-...';
```
**3. Restore epss_current from the previous model date** (re-run the upsert shown in Troubleshooting).
**4. Verify the rollback**:
```bash
stellaops epss status
```
### 4. Audit Trail
Log all bundle imports for compliance:
**Audit Log Format** (`/var/log/stellaops/bundle-audit.log`):
```json
{
"timestamp": "2025-12-17T10:30:00Z",
"action": "import",
"bundle_id": "risk-bundle-2025-12-17",
"bundle_sha256": "abc123...",
"imported_by": "admin@example.com",
"import_run_id": "550e8400-e29b-41d4-a716-446655440000",
"result": "SUCCESS",
"row_count": 231417,
"duration_seconds": 41.2
}
```
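A record in this format can be appended as one JSON line per import. The sketch below is a hypothetical helper (not shipped with StellaOps); the fields follow the audit log format above.
```python
#!/usr/bin/env python3
"""Sketch: append a bundle-import audit record as one JSON line.

Hypothetical helper; fields follow the audit log format shown above.
"""
import json
from datetime import datetime, timezone

AUDIT_LOG = "/var/log/stellaops/bundle-audit.log"

def record_import(bundle_id: str, bundle_sha256: str, imported_by: str,
                  import_run_id: str, result: str, row_count: int,
                  duration_seconds: float) -> None:
    entry = {
        "timestamp": datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "action": "import",
        "bundle_id": bundle_id,
        "bundle_sha256": bundle_sha256,
        "imported_by": imported_by,
        "import_run_id": import_run_id,
        "result": result,
        "row_count": row_count,
        "duration_seconds": duration_seconds,
    }
    # One JSON object per line keeps the log grep- and jq-friendly.
    with open(AUDIT_LOG, "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
```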
---
## Appendix: Bundle Creation Tools
### stellaops-bundler CLI Reference
```bash
# Create EPSS metadata
stellaops-bundler epss metadata \
--file epss_scores-2025-12-17.csv.zst \
--model-date 2025-12-17 \
--output epss_metadata.json
# Create manifest
stellaops-bundler manifest create \
--bundle-dir risk-bundle-2025-12-17 \
--bundle-id risk-bundle-2025-12-17 \
--output manifest.json
# Sign bundle
stellaops-bundler sign \
--manifest manifest.json \
--key /path/to/signing-key.pem \
--output bundle.dsse.json
# Verify bundle
stellaops-bundler verify risk-bundle-2025-12-17.tar.gz
```
### Custom Bundle Scripts
Example for creating weekly bundles (7-day snapshots):
```bash
#!/bin/bash
# create-weekly-bundle.sh
WEEK_START=$(date -u -d "last monday" +%Y-%m-%d)
WEEK_END=$(date -u +%Y-%m-%d)
BUNDLE_ID="risk-bundle-weekly-${WEEK_START}"
echo "Creating weekly bundle: ${BUNDLE_ID}"
mkdir -p epss kev
for day in $(seq 0 6); do
  CURRENT_DATE=$(date -u -d "${WEEK_START} + ${day} days" +%Y-%m-%d)
  # Fetch EPSS for each day...
  curl -sL "https://epss.empiricalsecurity.com/epss_scores-${CURRENT_DATE}.csv.gz" \
    -o "epss/epss_scores-${CURRENT_DATE}.csv.gz"
done
# Fetch KEV and generate manifest.json as in the daily script, then compress and bundle...
tar -czf "${BUNDLE_ID}.tar.gz" epss/ kev/ manifest.json
```
---
**Last Updated**: 2025-12-17
**Version**: 1.0
**Maintainer**: StellaOps Operations Team

# Proof Chain Verification in Air-Gap Mode
> **Version**: 1.0.0
> **Last Updated**: 2025-12-17
> **Related**: [Proof Chain API](../api/proofs.md), [Key Rotation Runbook](../operations/key-rotation-runbook.md)
This document describes how to verify proof chains in air-gapped (offline) environments where Rekor transparency log access is unavailable.
---
## Overview
Proof chains in StellaOps consist of cryptographically-linked attestations:
1. **Evidence statements** - Raw vulnerability findings
2. **Reasoning statements** - Policy evaluation traces
3. **VEX verdict statements** - Final vulnerability status determinations
4. **Proof spine** - Merkle tree aggregating all components
In online mode, proof chains include Rekor inclusion proofs for transparency. In air-gap mode, verification proceeds without Rekor but maintains cryptographic integrity.
---
## Verification Levels
### Level 1: Content-Addressed ID Verification
Verifies that content-addressed IDs match payload hashes.
```bash
# Verify a proof bundle ID
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--level content-id
# Expected output:
# ✓ Content-addressed ID verified
# ✓ Payload hash: sha256:1a2b3c4d...
```
### Level 2: DSSE Signature Verification
Verifies DSSE envelope signatures against trust anchors.
```bash
# Verify signatures with local trust anchors
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--anchor-file /path/to/trust-anchors.json \
--level signature
# Expected output:
# ✓ DSSE signature valid
# ✓ Signer: key-2025-prod
# ✓ Trust anchor: 550e8400-e29b-41d4-a716-446655440000
```
### Level 3: Merkle Path Verification
Verifies the proof spine merkle tree structure.
```bash
# Verify merkle paths
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--level merkle
# Expected output:
# ✓ Merkle root verified
# ✓ Evidence paths: 3/3 valid
# ✓ Reasoning path: valid
# ✓ VEX verdict path: valid
```
### Level 4: Full Verification (Offline)
Performs all verification steps except Rekor.
```bash
# Full offline verification
stellaops proof verify --offline \
--proof-bundle sha256:1a2b3c4d... \
--anchor-file /path/to/trust-anchors.json
# Expected output:
# Proof Chain Verification
# ═══════════════════════
# ✓ Content-addressed IDs verified
# ✓ DSSE signatures verified (3 envelopes)
# ✓ Merkle paths verified
# ⊘ Rekor verification skipped (offline mode)
#
# Overall: VERIFIED (offline)
```
---
## Trust Anchor Distribution
In air-gap environments, trust anchors must be distributed out-of-band.
### Export Trust Anchors
```bash
# On the online system, export trust anchors
stellaops anchor export --format json > trust-anchors.json
# Verify export integrity
sha256sum trust-anchors.json > trust-anchors.sha256
```
### Trust Anchor File Format
```json
{
"version": "1.0",
"exportedAt": "2025-12-17T00:00:00Z",
"anchors": [
{
"trustAnchorId": "550e8400-e29b-41d4-a716-446655440000",
"purlPattern": "pkg:*",
"allowedKeyids": ["key-2024-prod", "key-2025-prod"],
"allowedPredicateTypes": [
"evidence.stella/v1",
"reasoning.stella/v1",
"cdx-vex.stella/v1",
"proofspine.stella/v1"
],
"revokedKeys": ["key-2023-prod"],
"keyMaterial": {
"key-2024-prod": {
"algorithm": "ECDSA-P256",
"publicKey": "-----BEGIN PUBLIC KEY-----\n..."
},
"key-2025-prod": {
"algorithm": "ECDSA-P256",
"publicKey": "-----BEGIN PUBLIC KEY-----\n..."
}
}
}
]
}
```
### Import Trust Anchors
```bash
# On the air-gapped system
stellaops anchor import --file trust-anchors.json
# Verify import
stellaops anchor list
```
---
## Proof Bundle Distribution
### Export Proof Bundles
```bash
# Export a proof bundle for offline transfer
stellaops proof export \
--entry sha256:abc123:pkg:npm/lodash@4.17.21 \
--output proof-bundle.zip
# Bundle contents:
# proof-bundle.zip
# ├── proof-spine.json # The proof spine
# ├── evidence/ # Evidence statements
# │ ├── sha256_e1.json
# │ └── sha256_e2.json
# ├── reasoning.json # Reasoning statement
# ├── vex-verdict.json # VEX verdict statement
# ├── envelopes/ # DSSE envelopes
# │ ├── evidence-e1.dsse
# │ ├── evidence-e2.dsse
# │ ├── reasoning.dsse
# │ ├── vex-verdict.dsse
# │ └── proof-spine.dsse
# └── VERIFY.md # Verification instructions
```
### Verify Exported Bundle
```bash
# On the air-gapped system
stellaops proof verify --offline \
--bundle-file proof-bundle.zip \
--anchor-file trust-anchors.json
```
---
## Batch Verification
For audits, verify multiple proof bundles efficiently:
```bash
# Create a verification manifest
cat > verify-manifest.json << 'EOF'
{
"bundles": [
"sha256:1a2b3c4d...",
"sha256:5e6f7g8h...",
"sha256:9i0j1k2l..."
],
"options": {
"checkRekor": false,
"failFast": false
}
}
EOF
# Run batch verification
stellaops proof verify-batch \
--manifest verify-manifest.json \
--anchor-file trust-anchors.json \
--output verification-report.json
```
### Verification Report Format
```json
{
"verifiedAt": "2025-12-17T10:00:00Z",
"mode": "offline",
"anchorsUsed": ["550e8400..."],
"results": [
{
"proofBundleId": "sha256:1a2b3c4d...",
"verified": true,
"checks": {
"contentId": true,
"signature": true,
"merklePath": true,
"rekorInclusion": null
}
}
],
"summary": {
"total": 3,
"verified": 3,
"failed": 0,
"skipped": 0
}
}
```
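For large audits it helps to flag failures from the report programmatically. The snippet below is a sketch: it reads the report format shown above and exits non-zero if anything failed, reusing the exit-code convention described later in this document.
```python
#!/usr/bin/env python3
"""Sketch: summarize a batch verification report and fail on any failure.

Reads the report format shown above; exit codes follow the table in
the 'Exit Codes' section (0 = passed, 1 = failed).
"""
import json
import sys

def summarize(report_path: str) -> int:
    with open(report_path, encoding="utf-8") as f:
        report = json.load(f)
    failed = [r["proofBundleId"] for r in report["results"] if not r["verified"]]
    summary = report["summary"]
    print(f"mode={report['mode']} total={summary['total']} "
          f"verified={summary['verified']} failed={summary['failed']}")
    for bundle_id in failed:
        print(f"  FAILED: {bundle_id}")
    return 1 if failed else 0

if __name__ == "__main__":
    sys.exit(summarize(sys.argv[1] if len(sys.argv) > 1 else "verification-report.json"))
```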
---
## Key Rotation in Air-Gap Mode
When keys are rotated, trust anchor updates must be distributed:
### 1. Export Updated Anchors
```bash
# On online system after key rotation
stellaops anchor export --since 2025-01-01 > anchor-update.json
sha256sum anchor-update.json > anchor-update.sha256
```
### 2. Verify and Import Update
```bash
# On air-gapped system
sha256sum -c anchor-update.sha256
stellaops anchor import --file anchor-update.json --merge
# Verify key history
stellaops anchor show --anchor-id 550e8400... --show-history
```
### 3. Temporal Verification
When verifying old proofs after key rotation:
```bash
# Verify proof signed with now-revoked key
stellaops proof verify --offline \
--proof-bundle sha256:old-proof... \
--anchor-file trust-anchors.json \
--at-time "2024-06-15T12:00:00Z"
# The verification uses key validity at the specified time
```
---
## Manual Verification (No CLI)
For environments without the StellaOps CLI, manual verification is possible:
### 1. Verify Content-Addressed ID
```bash
# Extract payload from DSSE envelope
jq -r '.payload' proof-spine.dsse | base64 -d > payload.json
# Compute hash
sha256sum payload.json
# Compare with proof bundle ID
```
### 2. Verify DSSE Signature
```python
#!/usr/bin/env python3
import json
import base64
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.serialization import load_pem_public_key
def verify_dsse(envelope_path, public_key_pem):
"""Verify a DSSE envelope signature."""
with open(envelope_path) as f:
envelope = json.load(f)
payload_type = envelope['payloadType']
payload = base64.b64decode(envelope['payload'])
# Build PAE (Pre-Authentication Encoding)
pae = f"DSSEv1 {len(payload_type)} {payload_type} {len(payload)} ".encode() + payload
public_key = load_pem_public_key(public_key_pem.encode())
    for sig in envelope['signatures']:
        signature = base64.b64decode(sig['sig'])
        try:
            public_key.verify(signature, pae, ec.ECDSA(hashes.SHA256()))
            print(f"✓ Signature valid for keyid: {sig['keyid']}")
            return True
        except Exception as e:
            print(f"✗ Signature invalid for keyid {sig['keyid']}: {e}")
    # No signature in the envelope verified against the supplied key
    return False
```
### 3. Verify Merkle Path
```python
#!/usr/bin/env python3
import json
import hashlib
def verify_merkle_path(leaf_hash, path, root_hash, leaf_index):
"""Verify a Merkle inclusion path."""
current = bytes.fromhex(leaf_hash)
index = leaf_index
for sibling in path:
sibling_bytes = bytes.fromhex(sibling)
if index % 2 == 0:
# Current is left child
combined = current + sibling_bytes
else:
# Current is right child
combined = sibling_bytes + current
current = hashlib.sha256(combined).digest()
index //= 2
computed_root = current.hex()
if computed_root == root_hash:
print("✓ Merkle path verified")
return True
else:
print(f"✗ Merkle root mismatch: {computed_root} != {root_hash}")
return False
```
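As a quick self-check, the helper above can be exercised against a two-leaf tree built on the spot. The hashes below are computed locally for illustration and are not taken from a real proof spine; paste this after the function definition.
```python
# Usage sketch for verify_merkle_path (run in the same file as the function above).
import hashlib

leaf0 = hashlib.sha256(b"evidence-statement-1").hexdigest()
leaf1 = hashlib.sha256(b"evidence-statement-2").hexdigest()
# Root of a two-leaf tree: sha256(leaf0 || leaf1)
root = hashlib.sha256(bytes.fromhex(leaf0) + bytes.fromhex(leaf1)).hexdigest()

# Leaf 0 sits at index 0; its inclusion path is just its sibling, leaf 1.
assert verify_merkle_path(leaf0, [leaf1], root, leaf_index=0)
```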
---
## Exit Codes
Offline verification uses the same exit codes as online:
| Code | Meaning | CI/CD Action |
|------|---------|--------------|
| 0 | Verification passed | Proceed |
| 1 | Verification failed | Block |
| 2 | System error | Retry/investigate |
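A CI step can gate on these codes directly. The sketch below is one way to do it from Python via subprocess, reusing the offline verification command shown earlier; the retry handling for code 2 is illustrative.
```python
#!/usr/bin/env python3
"""Sketch: gate a CI step on the offline verifier's exit code."""
import subprocess
import sys

def verify(bundle_id: str, anchors: str) -> int:
    cmd = ["stellaops", "proof", "verify", "--offline",
           "--proof-bundle", bundle_id, "--anchor-file", anchors]
    return subprocess.run(cmd, check=False).returncode

if __name__ == "__main__":
    code = verify(sys.argv[1], "/path/to/trust-anchors.json")
    if code == 0:
        print("verification passed — proceeding")
    elif code == 1:
        print("verification failed — blocking")
    else:
        print("system error — retry or investigate")
    sys.exit(code)
```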
---
## Troubleshooting
### Missing Trust Anchor
```
Error: No trust anchor found for keyid "key-2025-prod"
```
**Solution**: Import updated trust anchors from online system.
### Key Not Valid at Time
```
Error: Key "key-2024-prod" was revoked at 2024-12-01, before proof signature at 2025-01-15
```
**Solution**: This indicates the proof was signed after key revocation. Investigate the signature timestamp.
### Merkle Path Invalid
```
Error: Merkle path verification failed for evidence sha256:e1...
```
**Solution**: The proof bundle may be corrupted. Re-export from online system.
---
## Related Documentation
- [Proof Chain API Reference](../api/proofs.md)
- [Key Rotation Runbook](../operations/key-rotation-runbook.md)
- [Portable Evidence Bundle Verification](portable-evidence-bundle-verification.md)
- [Offline Bundle Format](offline-bundle-format.md)

# Smart-Diff Air-Gap Workflows
**Sprint:** SPRINT_3500_0001_0001
**Task:** SDIFF-MASTER-0006 - Document air-gap workflows for smart-diff
## Overview
Smart-Diff can operate in fully air-gapped environments using offline bundles. This document describes the workflows for running smart-diff analysis without network connectivity.
## Prerequisites
1. **Offline Kit** - Downloaded and verified (`stellaops offline kit download`)
2. **Feed Snapshots** - Pre-staged vulnerability feeds
3. **SBOM Cache** - Pre-generated SBOMs for target artifacts
## Workflow 1: Offline Smart-Diff Analysis
### Step 1: Prepare Offline Bundle
On a connected machine:
```bash
# Download offline kit with feeds
stellaops offline kit download \
--output /path/to/offline-bundle \
--include-feeds nvd,osv,epss \
--feed-date 2025-01-15
# Include SBOMs for known artifacts
stellaops offline sbom generate \
--artifact registry.example.com/app:v1 \
--artifact registry.example.com/app:v2 \
--output /path/to/offline-bundle/sboms
# Package for transfer
stellaops offline kit package \
--input /path/to/offline-bundle \
--output stellaops-offline-2025-01-15.tar.gz \
--sign
```
### Step 2: Transfer to Air-Gapped Environment
Transfer the bundle using approved media:
- USB drive (scanned and approved)
- Optical media (DVD/Blu-ray)
- Data diode
### Step 3: Import Bundle
On the air-gapped machine:
```bash
# Verify bundle signature
stellaops offline kit verify \
--input stellaops-offline-2025-01-15.tar.gz \
--public-key /path/to/signing-key.pub
# Extract and configure
stellaops offline kit import \
--input stellaops-offline-2025-01-15.tar.gz \
--data-dir /opt/stellaops/data
```
### Step 4: Run Smart-Diff
```bash
# Set offline mode
export STELLAOPS_OFFLINE=true
export STELLAOPS_DATA_DIR=/opt/stellaops/data
# Run smart-diff
stellaops smart-diff \
--base sbom:app-v1.json \
--target sbom:app-v2.json \
--output smart-diff-report.json
```
## Workflow 2: Pre-Computed Smart-Diff Export
For environments where even running analysis tools is restricted.
### Step 1: Prepare Artifacts (Connected Machine)
```bash
# Generate SBOMs
stellaops sbom generate --artifact app:v1 --output app-v1-sbom.json
stellaops sbom generate --artifact app:v2 --output app-v2-sbom.json
# Run smart-diff with full proof bundle
stellaops smart-diff \
--base app-v1-sbom.json \
--target app-v2-sbom.json \
--output-dir ./smart-diff-export \
--include-proofs \
--include-evidence \
--format bundle
```
### Step 2: Verify Export Contents
The export bundle contains:
```
smart-diff-export/
├── manifest.json # Signed manifest
├── base-sbom.json # Base SBOM (hash verified)
├── target-sbom.json # Target SBOM (hash verified)
├── diff-results.json # Smart-diff findings
├── sarif-report.json # SARIF formatted output
├── proofs/
│ ├── ledger.json # Proof ledger
│ └── nodes/ # Individual proof nodes
├── evidence/
│ ├── reachability.json # Reachability evidence
│ ├── vex-statements.json # VEX statements
│ └── hardening.json # Binary hardening data
└── signature.dsse # DSSE envelope
```
### Step 3: Import and Verify (Air-Gapped Machine)
```bash
# Verify bundle integrity
stellaops verify-bundle \
--input smart-diff-export \
--public-key /path/to/trusted-key.pub
# View results
stellaops smart-diff show \
--bundle smart-diff-export \
--format table
```
## Workflow 3: Incremental Feed Updates
### Step 1: Generate Delta Feed
On connected machine:
```bash
# Generate delta since last sync
stellaops offline feed delta \
--since 2025-01-10 \
--output feed-delta-2025-01-15.tar.gz \
--sign
```
### Step 2: Apply Delta (Air-Gapped)
```bash
# Import delta
stellaops offline feed apply \
--input feed-delta-2025-01-15.tar.gz \
--verify
# Trigger score replay for affected scans
stellaops score replay-all \
--trigger feed-update \
--dry-run
```
## Configuration
### Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| `STELLAOPS_OFFLINE` | Enable offline mode | `false` |
| `STELLAOPS_DATA_DIR` | Local data directory | `~/.stellaops` |
| `STELLAOPS_FEED_DIR` | Feed snapshot directory | `$DATA_DIR/feeds` |
| `STELLAOPS_SBOM_CACHE` | SBOM cache directory | `$DATA_DIR/sboms` |
| `STELLAOPS_SKIP_NETWORK` | Block network requests | `false` |
| `STELLAOPS_REQUIRE_SIGNATURES` | Require signed data | `true` |
### Config File
```yaml
# ~/.stellaops/config.yaml
offline:
enabled: true
data_dir: /opt/stellaops/data
require_signatures: true
feeds:
source: local
path: /opt/stellaops/data/feeds
sbom:
cache_dir: /opt/stellaops/data/sboms
network:
allow_list: [] # Empty = no network
```
## Verification
### Verify Feed Freshness
```bash
# Check feed dates
stellaops offline status
# Output:
# Feed Status (Offline Mode)
# ─────────────────────────────
# NVD: 2025-01-15 (2 days old)
# OSV: 2025-01-15 (2 days old)
# EPSS: 2025-01-14 (3 days old)
# KEV: 2025-01-15 (2 days old)
```
### Verify Proof Integrity
```bash
# Verify smart-diff proofs
stellaops smart-diff verify \
--input smart-diff-report.json \
--proof-bundle ./proofs
# Output:
# ✓ Manifest hash verified
# ✓ All proof nodes valid
# ✓ Root hash matches: sha256:abc123...
```
## Determinism Guarantees
Offline smart-diff maintains determinism by:
1. **Content-addressed feeds** - Same feed hash = same results
2. **Frozen timestamps** - All timestamps use manifest creation time
3. **No network randomness** - No external API calls
4. **Stable sorting** - Deterministic output ordering
### Reproducibility Test
```bash
# Run twice and compare
stellaops smart-diff --base a.json --target b.json --output run1.json
stellaops smart-diff --base a.json --target b.json --output run2.json
# Compare hashes
sha256sum run1.json run2.json
# abc123... run1.json
# abc123... run2.json (identical)
```
## Troubleshooting
### Error: Feed not found
```
Error: Feed 'nvd' not found in offline data directory
```
**Solution:** Ensure feed was included in offline kit:
```bash
stellaops offline kit status
ls $STELLAOPS_FEED_DIR/nvd/
```
### Error: Network request blocked
```
Error: Network request blocked in offline mode: api.osv.dev
```
**Solution:** This is expected behavior. Ensure all required data is in offline bundle.
### Error: Signature verification failed
```
Error: Bundle signature verification failed
```
**Solution:** Ensure correct public key is configured:
```bash
stellaops offline kit verify \
--input bundle.tar.gz \
--public-key /path/to/correct-key.pub
```
## Related Documentation
- [Offline Kit Guide](../10_OFFLINE_KIT.md)
- [Determinism Requirements](../product-advisories/14-Dec-2025%20-%20Determinism%20and%20Reproducibility%20Technical%20Reference.md)
- [Smart-Diff API](../api/scanner-api.md)

# Triage Air-Gap Workflows
**Sprint:** SPRINT_3600_0001_0001
**Task:** TRI-MASTER-0006 - Document air-gap triage workflows
## Overview
This document describes how to perform vulnerability triage in fully air-gapped environments. The triage workflow supports offline evidence bundles, decision capture, and replay token generation.
## Workflow 1: Offline Triage with Evidence Bundles
### Step 1: Export Evidence Bundle (Connected Machine)
```bash
# Export triage bundle for specific findings
stellaops triage export \
--scan-id scan-12345678 \
--findings CVE-2024-1234,CVE-2024-5678 \
--include-evidence \
--include-graph \
--output triage-bundle.stella.bundle.tgz
# Export entire scan for offline review
stellaops triage export \
--scan-id scan-12345678 \
--all-findings \
--output full-triage-bundle.stella.bundle.tgz
```
### Step 2: Bundle Contents
The `.stella.bundle.tgz` archive contains:
```
triage-bundle.stella.bundle.tgz/
├── manifest.json # Signed bundle manifest
├── findings/
│ ├── index.json # Finding list with IDs
│ ├── CVE-2024-1234.json # Finding details
│ └── CVE-2024-5678.json
├── evidence/
│ ├── reachability/ # Reachability proofs
│ ├── callstack/ # Call stack snippets
│ ├── vex/ # VEX/CSAF statements
│ └── provenance/ # Provenance data
├── graph/
│ ├── nodes.ndjson # Dependency graph nodes
│ └── edges.ndjson # Graph edges
├── feeds/
│ └── snapshot.json # Feed snapshot metadata
└── signature.dsse # DSSE envelope
```
### Step 3: Transfer to Air-Gapped Environment
Transfer using approved methods:
- USB media (security scanned)
- Optical media
- Data diode
### Step 4: Import and Verify
On the air-gapped machine:
```bash
# Verify bundle integrity
stellaops triage verify-bundle \
--input triage-bundle.stella.bundle.tgz \
--public-key /path/to/signing-key.pub
# Import for offline triage
stellaops triage import \
--input triage-bundle.stella.bundle.tgz \
--workspace /opt/stellaops/triage
```
### Step 5: Perform Offline Triage
```bash
# List findings in bundle
stellaops triage list \
--workspace /opt/stellaops/triage
# View finding with evidence
stellaops triage show CVE-2024-1234 \
--workspace /opt/stellaops/triage \
--show-evidence
# Make triage decision
stellaops triage decide CVE-2024-1234 \
--workspace /opt/stellaops/triage \
--status not_affected \
--justification "Code path is unreachable due to config gating" \
--reviewer "security-team"
```
### Step 6: Export Decisions
```bash
# Export decisions for sync back
stellaops triage export-decisions \
--workspace /opt/stellaops/triage \
--output decisions-2025-01-15.json \
--sign
```
### Step 7: Sync Decisions (Connected Machine)
```bash
# Import and apply decisions
stellaops triage import-decisions \
--input decisions-2025-01-15.json \
--verify \
--apply
```
## Workflow 2: Batch Offline Triage
For high-volume environments.
### Step 1: Export Batch Bundle
```bash
# Export all untriaged findings
stellaops triage export-batch \
--query "status=untriaged AND priority>=0.7" \
--limit 100 \
--output batch-triage-2025-01-15.stella.bundle.tgz
```
### Step 2: Offline Batch Processing
```bash
# Interactive batch triage
stellaops triage batch \
--workspace /opt/stellaops/triage \
--input batch-triage-2025-01-15.stella.bundle.tgz
# Keyboard shortcuts enabled:
# j/k - Next/Previous finding
# a - Accept (affected)
# n - Not affected
# w - Will not fix
# f - False positive
# u - Undo last decision
# q - Quit (saves progress)
```
### Step 3: Export and Sync
```bash
# Export batch decisions
stellaops triage export-decisions \
--workspace /opt/stellaops/triage \
--format json \
--sign \
--output batch-decisions.json
```
## Workflow 3: Evidence-First Offline Review
### Step 1: Pre-compute Evidence
On connected machine:
```bash
# Generate evidence for all high-priority findings
stellaops evidence generate \
--scan-id scan-12345678 \
--priority-min 0.7 \
--output-dir ./evidence-pack
# Include:
# - Reachability analysis
# - Call stack traces
# - VEX lookups
# - Dependency graph snippets
```
### Step 2: Package with Findings
```bash
stellaops triage package \
--scan-id scan-12345678 \
--evidence-dir ./evidence-pack \
--output evidence-triage.stella.bundle.tgz
```
### Step 3: Offline Review with Evidence
```bash
# Evidence-first view
stellaops triage show CVE-2024-1234 \
--workspace /opt/stellaops/triage \
--evidence-first
# Output:
# ═══════════════════════════════════════════
# CVE-2024-1234 · lodash@4.17.20
# ═══════════════════════════════════════════
#
# EVIDENCE SUMMARY
# ────────────────
# Reachability: EXECUTED (tier 2/3)
# └─ main.js:42 → utils.js:15 → lodash/merge
#
# Call Stack:
# 1. main.js:42 handleRequest()
# 2. utils.js:15 mergeConfig()
# 3. lodash:merge <vulnerable>
#
# VEX Status: No statement found
# EPSS: 0.45 (Medium)
# KEV: No
#
# ─────────────────────────────────────────────
# Press [a]ffected, [n]ot affected, [s]kip...
```
## Configuration
### Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| `STELLAOPS_OFFLINE` | Enable offline mode | `false` |
| `STELLAOPS_TRIAGE_WORKSPACE` | Triage workspace path | `~/.stellaops/triage` |
| `STELLAOPS_BUNDLE_VERIFY` | Verify bundle signatures | `true` |
| `STELLAOPS_DECISION_SIGN` | Sign exported decisions | `true` |
### Config File
```yaml
# ~/.stellaops/triage.yaml
offline:
enabled: true
workspace: /opt/stellaops/triage
bundle_verify: true
decisions:
require_justification: true
sign_exports: true
keyboard:
enabled: true
vim_mode: true
```
## Bundle Format Specification
### manifest.json
```json
{
"version": "1.0",
"type": "triage-bundle",
"created_at": "2025-01-15T10:00:00Z",
"scan_id": "scan-12345678",
"finding_count": 25,
"feed_snapshot": "sha256:abc123...",
"graph_revision": "sha256:def456...",
"signatures": {
"manifest": "sha256:ghi789...",
"dsse_envelope": "signature.dsse"
}
}
```
### Decision Format
```json
{
"finding_id": "finding-12345678",
"vuln_key": "CVE-2024-1234:pkg:npm/lodash@4.17.20",
"status": "not_affected",
"justification": "Code path gated by feature flag",
"reviewer": "security-team",
"decided_at": "2025-01-15T14:30:00Z",
"replay_token": "rt_abc123...",
"evidence_refs": [
"evidence/reachability/CVE-2024-1234.json"
]
}
```
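Before syncing decisions back, a quick structural check can catch incomplete exports. The sketch below is hypothetical (not a StellaOps command): it checks only the fields shown above, and the allowed status set is inferred from the batch-triage keyboard shortcuts. Signature verification still happens via `stellaops triage import-decisions --verify`.
```python
#!/usr/bin/env python3
"""Sketch: sanity-check exported triage decisions before syncing them back.

Hypothetical helper; required fields follow the decision format above,
and the status set is inferred from the batch-triage shortcuts.
"""
import json
import sys

REQUIRED = ("finding_id", "vuln_key", "status", "justification",
            "reviewer", "decided_at", "replay_token")
ALLOWED_STATUSES = {"affected", "not_affected", "will_not_fix", "false_positive"}

def check(decisions_path: str) -> int:
    with open(decisions_path, encoding="utf-8") as f:
        data = json.load(f)
    # Accept either a bare list or an export wrapper with a "decisions" array.
    decisions = data if isinstance(data, list) else data.get("decisions", [data])
    problems = 0
    for i, decision in enumerate(decisions):
        missing = [field for field in REQUIRED if not decision.get(field)]
        if missing:
            problems += 1
            print(f"decision {i}: missing {', '.join(missing)}")
        if decision.get("status") not in ALLOWED_STATUSES:
            problems += 1
            print(f"decision {i}: unexpected status {decision.get('status')!r}")
    print(f"{len(decisions)} decision(s), {problems} problem(s)")
    return 1 if problems else 0

if __name__ == "__main__":
    sys.exit(check(sys.argv[1] if len(sys.argv) > 1 else "decisions-2025-01-15.json"))
```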
## Replay Tokens
Each decision generates a replay token for audit trail:
```bash
# View replay token
stellaops triage show-token rt_abc123...
# Output:
# Replay Token: rt_abc123...
# ─────────────────────────────
# Finding: CVE-2024-1234
# Decision: not_affected
# Evidence Hash: sha256:xyz789...
# Feed Snapshot: sha256:abc123...
# Decided: 2025-01-15T14:30:00Z
# Reviewer: security-team
```
### Verify Token
```bash
stellaops triage verify-token rt_abc123... \
--public-key /path/to/key.pub
# ✓ Token signature valid
# ✓ Evidence hash matches
# ✓ Feed snapshot verified
```
## Troubleshooting
### Error: Bundle signature invalid
```
Error: Bundle signature verification failed
```
**Solution:** Ensure the correct public key is used:
```bash
stellaops triage verify-bundle \
--input bundle.tgz \
--public-key /path/to/correct-key.pub \
--verbose
```
### Error: Evidence not found
```
Error: Evidence for CVE-2024-1234 not included in bundle
```
**Solution:** Re-export with evidence:
```bash
stellaops triage export \
--scan-id scan-12345678 \
--findings CVE-2024-1234 \
--include-evidence \
--output bundle.tgz
```
### Error: Decision sync conflict
```
Error: Finding CVE-2024-1234 has newer decision on server
```
**Solution:** Review and resolve:
```bash
stellaops triage import-decisions \
--input decisions.json \
--conflict-mode review
# Options: keep-local, keep-server, newest, review
```
## Related Documentation
- [Offline Kit Guide](../10_OFFLINE_KIT.md)
- [Triage API Reference](../api/triage-api.md)
- [Keyboard Shortcuts](../ui/keyboard-shortcuts.md)