Add signal contracts for reachability, exploitability, trust, and unknown symbols
- Introduced `ReachabilityState`, `RuntimeHit`, `ExploitabilitySignal`, `ReachabilitySignal`, `SignalEnvelope`, `SignalType`, `TrustSignal`, and `UnknownSymbolSignal` records to define the signal types and their properties.
- Implemented JSON serialization attributes for proper data interchange.
- Created project files for the new signal contracts library and corresponding test projects.
- Added deterministic test fixtures for micro-interaction testing.
- Included cryptographic keys for secure operations with cosign.
---

**New file:** `docs/modules/scanner/design/api-ui-surfacing.md` (297 lines)
# API/UI Surfacing for New Metadata (SC7)

Status: Draft · Date: 2025-12-04
Scope: Define API endpoints and UI components for surfacing CVSS v4, CycloneDX 1.7/CBOM, SLSA 1.2, and evidence metadata with deterministic pagination and sorting.

## Objectives

- Expose new metadata fields via REST API endpoints.
- Define UI components for filters, columns, and downloads.
- Ensure deterministic pagination and sorting across all endpoints.
- Support both online and offline UI rendering.

## API Endpoints

### Vulnerability Ratings (CVSS v4 + v3.1)

```http
GET /api/v1/scans/{scanId}/vulnerabilities
```

Response includes dual CVSS ratings:

```json
{
  "vulnerabilities": [
    {
      "id": "CVE-2025-0001",
      "ratings": [
        {
          "method": "CVSSv4",
          "score": 8.5,
          "severity": "high",
          "vector": "CVSS:4.0/AV:N/AC:L/AT:N/PR:N/UI:N/VC:H/VI:L/VA:N/SC:N/SI:N/SA:N"
        },
        {
          "method": "CVSSv31",
          "score": 7.5,
          "severity": "high",
          "vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:N/A:N"
        }
      ],
      "evidence": {
        "source": "concelier:nvd:2025-12-03",
        "hash": "b3:eeee1111...",
        "proofId": "proof-12345"
      }
    }
  ],
  "meta": {
    "page": {
      "cursor": "eyJpZCI6IkNWRS0yMDI1LTAwMDEiLCJzY29yZSI6OC41fQ==",
      "size": 100,
      "hasMore": true
    },
    "sort": {
      "field": "ratings.CVSSv4.score",
      "direction": "desc"
    }
  }
}
```

### CBOM Services

```http
GET /api/v1/scans/{scanId}/services
```

Response includes CBOM properties:

```json
{
  "services": [
    {
      "name": "api-gateway",
      "version": "1.0.0",
      "cbom": {
        "ingress": "0.0.0.0:8080",
        "egress": "https://external-api.example.invalid:443",
        "dataClassification": "pii",
        "provider": null,
        "region": null
      }
    }
  ],
  "meta": {
    "page": {
      "cursor": "eyJuYW1lIjoiYXBpLWdhdGV3YXkifQ==",
      "size": 100
    }
  }
}
```

### Source Provenance (SLSA)

```http
GET /api/v1/scans/{scanId}/provenance
```

Response includes SLSA Source Track:

```json
{
  "provenance": {
    "source": {
      "repo": "https://example.invalid/demo",
      "ref": "refs/tags/v1.0.0",
      "commit": "aaaa...",
      "treeHash": "b3:1111..."
    },
    "build": {
      "id": "build-12345",
      "invocationHash": "b3:2222...",
      "builderId": "https://builder.stellaops.local/scanner"
    },
    "dsse": {
      "hash": "sha256:4444...",
      "cas": "cas://provenance/demo/v1.0.0.dsse"
    }
  }
}
```

### Evidence Lookup

```http
GET /api/v1/evidence/{evidenceHash}
```

Returns evidence details:

```json
{
  "evidence": {
    "hash": "b3:eeee1111...",
    "source": "scanner:binary-analyzer:v1.0.0",
    "type": "binary",
    "metadata": {
      "buildId": "abc123...",
      "symbolsHash": "b3:...",
      "confidence": 0.95
    },
    "cas": "cas://evidence/openssl/3.0.0/binary-analysis.json"
  }
}
```

## Pagination & Sorting

### Deterministic Cursors

Cursors are base64-encoded tuples of sort keys:

```json
{
  "cursor": "base64({\"id\":\"CVE-2025-0001\",\"score\":8.5})",
  "decode": {
    "primaryKey": "id",
    "secondaryKey": "score",
    "lastValue": {"id": "CVE-2025-0001", "score": 8.5}
  }
}
```
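The cursor round-trip above can be sketched in TypeScript. The `CursorTuple` shape and function names are illustrative, not part of the API; the key point is serializing with a fixed key order so the same tuple always yields the same base64 string.

```typescript
// Sketch: deterministic cursor encode/decode (names are illustrative).
type CursorTuple = { id: string; score: number };

function encodeCursor(t: CursorTuple): string {
  // Serialize with an explicit, fixed key order rather than relying on
  // object insertion order, so equal tuples always produce equal cursors.
  const canonical = JSON.stringify({ id: t.id, score: t.score });
  return Buffer.from(canonical, "utf8").toString("base64");
}

function decodeCursor(cursor: string): CursorTuple {
  const parsed = JSON.parse(Buffer.from(cursor, "base64").toString("utf8"));
  return { id: String(parsed.id), score: Number(parsed.score) };
}
```

Encoding `{ id: "CVE-2025-0001", score: 8.5 }` this way reproduces the cursor value shown in the response example above.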

### Sort Fields

| Endpoint | Default Sort | Allowed Fields |
|----------|--------------|----------------|
| `/vulnerabilities` | `ratings.CVSSv4.score desc, id asc` | `id`, `ratings.CVSSv4.score`, `ratings.CVSSv31.score`, `severity` |
| `/components` | `purl asc, name asc` | `purl`, `name`, `version`, `type` |
| `/services` | `name asc` | `name`, `version`, `cbom.dataClassification` |

### Page Size

| Parameter | Default | Min | Max |
|-----------|---------|-----|-----|
| `pageSize` | 100 | 1 | 500 |

## UI Components

### Vulnerability Table Columns

| Column | Source | Sortable | Filterable |
|--------|--------|----------|------------|
| CVE ID | `id` | Yes | Yes (search) |
| CVSS v4 Score | `ratings[method=CVSSv4].score` | Yes | Yes (range) |
| CVSS v4 Severity | `ratings[method=CVSSv4].severity` | Yes | Yes (multi-select) |
| CVSS v3.1 Score | `ratings[method=CVSSv31].score` | Yes | Yes (range) |
| Affected Component | `affects[].ref` | No | Yes (search) |
| Evidence Source | `evidence.source` | No | Yes (multi-select) |

### CBOM Service View

| Column | Source | Sortable | Filterable |
|--------|--------|----------|------------|
| Service Name | `name` | Yes | Yes |
| Ingress | `cbom.ingress` | No | Yes |
| Egress | `cbom.egress` | No | Yes |
| Data Classification | `cbom.dataClassification` | Yes | Yes (multi-select) |
| Provider | `cbom.provider` | Yes | Yes |
| Region | `cbom.region` | Yes | Yes |

### Filters

```typescript
interface VulnerabilityFilters {
  severity?: ('critical' | 'high' | 'medium' | 'low')[];
  cvssV4ScoreMin?: number;
  cvssV4ScoreMax?: number;
  cvssV31ScoreMin?: number;
  cvssV31ScoreMax?: number;
  hasEvidence?: boolean;
  evidenceSource?: string[];
  affectedComponent?: string;
}

interface ServiceFilters {
  dataClassification?: ('pii' | 'internal' | 'public' | 'confidential')[];
  hasEgress?: boolean;
  provider?: string[];
  region?: string[];
}
```

## Download Formats

### Export Endpoints

```http
GET /api/v1/scans/{scanId}/export?format={format}
```

| Format | Content-Type | Deterministic |
|--------|--------------|---------------|
| `cdx-1.7` | `application/vnd.cyclonedx+json` | Yes |
| `cdx-1.6` | `application/vnd.cyclonedx+json` | Yes |
| `spdx-3.0` | `application/spdx+json` | Yes |
| `csv` | `text/csv` | Yes |
| `pdf` | `application/pdf` | Partial* |

\*PDF includes a timestamp in the footer.

### CSV Export Columns

```csv
cve_id,cvss_v4_score,cvss_v4_vector,cvss_v31_score,cvss_v31_vector,severity,affected_purl,evidence_hash,evidence_source
CVE-2025-0001,8.5,"CVSS:4.0/...",7.5,"CVSS:3.1/...",high,pkg:npm/example-lib@2.0.0,b3:eeee...,concelier:nvd:2025-12-03
```
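The CSV rows above quote fields that contain commas (the CVSS vectors). A minimal RFC 4180-style quoting helper can be sketched as follows; `csvField` and `csvRow` are illustrative names, not the exporter's actual API:

```typescript
// Sketch: RFC 4180-style CSV quoting. A field is quoted only when it
// contains a comma, quote, or line break; embedded quotes are doubled.
function csvField(value: string): string {
  return /[",\n\r]/.test(value) ? `"${value.replace(/"/g, '""')}"` : value;
}

function csvRow(fields: string[]): string {
  return fields.map(csvField).join(",");
}
```

Quoting only when needed keeps the common case (scores, IDs, purls) byte-identical across exports, which matters for the determinism guarantee in the table above.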

## Determinism Requirements

1. **Sort stability**: All sorts use a secondary key (usually `id`) for tie-breaking.
2. **Cursor encoding**: Deterministic JSON serialization before base64.
3. **Timestamps**: UTC ISO-8601; sub-millisecond digits are emitted only when non-zero.
4. **Export ordering**: Same ordering rules as API responses.
5. **Filter normalization**: Filter arrays are sorted before query execution.
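Requirement 1 can be sketched as a comparator; the `VulnRow` type and function name are illustrative. The secondary `id` comparison guarantees that two rows with equal scores always appear in the same order, so page boundaries are stable across requests:

```typescript
// Sketch: primary sort on score (desc), deterministic tie-break on id (asc).
interface VulnRow { id: string; score: number; }

function byScoreDescThenId(a: VulnRow, b: VulnRow): number {
  if (a.score !== b.score) return b.score - a.score; // primary key: score, descending
  return a.id < b.id ? -1 : a.id > b.id ? 1 : 0;     // secondary key: id, ascending
}
```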

## Offline Support

### Prefetch Manifest

```json
{
  "prefetch": {
    "vulnerabilities": "/api/v1/scans/{scanId}/vulnerabilities?pageSize=500",
    "components": "/api/v1/scans/{scanId}/components?pageSize=500",
    "services": "/api/v1/scans/{scanId}/services",
    "provenance": "/api/v1/scans/{scanId}/provenance"
  },
  "cache": {
    "ttl": 86400,
    "storage": "indexeddb"
  }
}
```

### Static Export

For air-gapped environments, export the complete dataset:

```http
GET /api/v1/scans/{scanId}/offline-bundle
```

Returns a zip containing:

- `vulnerabilities.json` (all pages concatenated)
- `components.json`
- `services.json`
- `provenance.json`
- `manifest.json` (with hashes)
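A consumer of the bundle can check each file against `manifest.json`. The manifest shape used below (`{ files: { [name]: sha256hex } }`) is an assumption for illustration only; the actual manifest format is defined elsewhere.

```typescript
import { createHash } from "node:crypto";

// Assumed manifest shape for this sketch; not the documented format.
interface BundleManifest { files: Record<string, string>; }

function sha256Hex(data: Buffer): string {
  // Hex digest of the file contents, matching the manifest entries.
  return createHash("sha256").update(data).digest("hex");
}

function verifyBundleFile(manifest: BundleManifest, name: string, data: Buffer): boolean {
  return manifest.files[name] === sha256Hex(data);
}
```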

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (SC7)
- Roadmap: `docs/modules/scanner/design/standards-convergence-roadmap.md` (SC1)
- Contract: `docs/modules/scanner/design/cdx17-cbom-contract.md` (SC2)
---

**New file:** `docs/modules/scanner/design/binary-evidence-alignment.md` (207 lines)
# Binary Evidence Alignment with SBOM/VEX Outputs (SC6)

Status: Draft · Date: 2025-12-04
Scope: Define how binary-level evidence (build-id, symbols, patch oracle) aligns with SBOM/VEX outputs to feed policy engines and VEX decisioning.

## Objectives

- Link binary-level evidence to SBOM component identities.
- Ensure evidence fields are available for policy/VEX correlation.
- Define required joins between binary analysis and vulnerability data.
- Enable a deterministic evidence chain from binary → SBOM → VEX → policy.

## Evidence Types

### Build Identity Evidence

| Evidence Type | Source | SBOM Field | VEX Field | Policy Input |
|---------------|--------|------------|-----------|--------------|
| Build ID | ELF `.note.gnu.build-id` | `component.properties[evidence:build-id]` | `statement.products[].identifiers.buildId` | `binary.buildId` |
| Go Build Info | `runtime.BuildInfo` | `component.properties[evidence:go-build]` | n/a | `binary.goBuildInfo` |
| PE Version | PE resource section | `component.properties[evidence:pe-version]` | n/a | `binary.peVersion` |
| Mach-O UUID | `LC_UUID` command | `component.properties[evidence:macho-uuid]` | n/a | `binary.machoUuid` |

### Symbol Evidence

| Evidence Type | Source | SBOM Field | VEX Field | Policy Input |
|---------------|--------|------------|-----------|--------------|
| Exported Symbols | ELF `.dynsym` / PE exports | `component.properties[evidence:symbols-hash]` | n/a | `binary.symbolsHash` |
| Debug Symbols | DWARF / PDB | `component.properties[evidence:debug-hash]` | n/a | `binary.debugHash` |
| Function Names | Symbol table | `component.properties[evidence:functions]` | `statement.justification.functions` | `binary.functions[]` |

### Patch Oracle Evidence

| Evidence Type | Source | SBOM Field | VEX Field | Policy Input |
|---------------|--------|------------|-----------|--------------|
| Patch Signature | Function hash diff | `component.properties[evidence:patch-sig]` | `statement.justification.patchSignature` | `patch.signature` |
| CVE Fix Commit | Commit mapping | `component.properties[evidence:fix-commit]` | `statement.justification.fixCommit` | `patch.fixCommit` |
| Binary Diff | objdiff hash | `component.properties[evidence:binary-diff]` | n/a | `patch.binaryDiffHash` |

## Evidence Chain

```
Binary → SBOM Component → VEX Statement → Policy Evaluation
  │           │                │                 │
  │           │                │                 └── policy://input/component/{purl}
  │           │                └── vex://statement/{cve}/{product}
  │           └── sbom://component/{purl}
  └── binary://evidence/{hash}
```

### Join Keys

| Source | Target | Join Field | Required |
|--------|--------|------------|----------|
| Binary | SBOM Component | `evidence:hash` → `component.hashes[]` | Yes |
| SBOM Component | VEX Product | `component.purl` → `statement.products[].purl` | Yes |
| VEX Statement | Policy Input | `statement.cve` → `policy.advisoryId` | Yes |
| Binary Evidence | VEX Justification | `evidence:patch-sig` → `justification.patchSignature` | No |
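The first required join can be sketched as follows. The types are deliberate simplifications of the SBOM model, and matching on the `evidence:hash` property (alongside `component.hashes[]` as the table indicates) is an assumption about how the join is keyed:

```typescript
// Sketch: Binary → SBOM Component join (simplified types, illustrative names).
interface SbomComponent {
  purl: string;
  hashes: { alg: string; content: string }[];
  properties: { name: string; value: string }[];
}
interface BinaryEvidence { hash: string; }

function joinEvidence(ev: BinaryEvidence, components: SbomComponent[]): SbomComponent | undefined {
  // Attach the evidence record to the component carrying the same evidence hash.
  return components.find(c =>
    c.properties.some(p => p.name === "evidence:hash" && p.value === ev.hash));
}
```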

## Required SBOM Evidence Properties

For binary evidence to flow through the pipeline, components MUST include:

```json
{
  "type": "library",
  "name": "openssl",
  "version": "3.0.0",
  "purl": "pkg:generic/openssl@3.0.0",
  "hashes": [
    {"alg": "SHA-256", "content": "..."}
  ],
  "properties": [
    {"name": "evidence:hash", "value": "b3:..."},
    {"name": "evidence:source", "value": "scanner:binary-analyzer:v1.0.0"},
    {"name": "evidence:build-id", "value": "abc123..."},
    {"name": "evidence:symbols-hash", "value": "b3:..."},
    {"name": "evidence:patch-sig", "value": "b3:..."},
    {"name": "evidence:confidence", "value": "0.95"}
  ]
}
```

## VEX Integration

VEX statements can reference binary evidence for justification:

```json
{
  "vulnerability": "CVE-2025-0001",
  "products": [
    {
      "purl": "pkg:generic/openssl@3.0.0",
      "identifiers": {
        "buildId": "abc123...",
        "evidenceHash": "b3:..."
      }
    }
  ],
  "status": "not_affected",
  "justification": {
    "category": "vulnerable_code_not_present",
    "patchSignature": "b3:...",
    "fixCommit": "deadbeef...",
    "functions": ["EVP_EncryptUpdate", "EVP_DecryptUpdate"],
    "evidenceRef": "cas://evidence/openssl/3.0.0/binary-analysis.json"
  }
}
```

## Policy Engine Integration

Policy rules can reference binary evidence fields:

```rego
# policy/scanner/binary-evidence.rego
package scanner.binary

import rego.v1

# Require build-id evidence for critical vulnerabilities
deny contains msg if {
    input.vulnerability.severity == "critical"
    not input.component.properties["evidence:build-id"]
    msg := sprintf("Critical vuln %s requires build-id evidence", [input.vulnerability.id])
}

# Accept patch oracle evidence as mitigation
allow contains decision if {
    input.component.properties["evidence:patch-sig"]
    input.vex.status == "not_affected"
    input.vex.justification.patchSignature == input.component.properties["evidence:patch-sig"]
    decision := {
        "action": "accept",
        "reason": "Patch signature verified",
        "confidence": input.component.properties["evidence:confidence"]
    }
}

# Confidence threshold for binary analysis
warn contains msg if {
    conf := to_number(input.component.properties["evidence:confidence"])
    conf < 0.8
    msg := sprintf("Low confidence (%v) binary evidence for %s", [conf, input.component.purl])
}
```

## Evidence Fields by Binary Format

### ELF (Linux)

| Section | Evidence Extracted | Deterministic |
|---------|--------------------|---------------|
| `.note.gnu.build-id` | Build ID (SHA1/UUID) | Yes |
| `.gnu.hash` | Symbol hash table | Yes |
| `.dynsym` | Dynamic symbols | Yes |
| `.debug_info` | DWARF debug symbols | Yes |
| `.rodata` | String literals | Yes |

### PE (Windows)

| Section | Evidence Extracted | Deterministic |
|---------|--------------------|---------------|
| PE Header | Timestamp, machine type | Partial* |
| Resource | Version info, product name | Yes |
| Export Table | Exported functions | Yes |
| Import Table | Dependencies | Yes |
| Debug Directory | PDB path, GUID | Yes |

\*The PE timestamp may be zeroed for reproducible builds.

### Mach-O (macOS)

| Command | Evidence Extracted | Deterministic |
|---------|--------------------|---------------|
| `LC_UUID` | Binary UUID | Yes |
| `LC_VERSION_MIN` | Min OS version | Yes |
| `LC_BUILD_VERSION` | Build version | Yes |
| `LC_CODE_SIGNATURE` | Code signature | Yes |
| `SYMTAB` | Symbol table | Yes |

## Determinism Requirements

1. **Stable ordering**: Evidence properties sorted by name.
2. **Hash computation**: BLAKE3-256 over canonical JSON.
3. **Confidence scores**: 4 decimal places, `MidpointRounding.ToZero`.
4. **Function lists**: Sorted lexicographically, deduplicated.
5. **Symbol hashes**: Computed over sorted symbol names.
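Requirement 3 (four decimal places, rounded toward zero, mirroring .NET's `MidpointRounding.ToZero` directed rounding) can be sketched as below. Working on the decimal string rather than multiplying by 10000 avoids binary floating-point drift for values such as 0.95; the function name is illustrative:

```typescript
// Sketch: directed rounding toward zero at 4 decimal places for
// confidence values in [0, 1].
function roundConfidenceTowardZero(value: number): string {
  const s = value.toFixed(10);               // ample decimal digits to truncate from
  const [intPart, frac = ""] = s.split("."); // split integer and fractional parts
  return `${intPart}.${(frac + "0000").slice(0, 4)}`; // keep 4 digits, never round up
}
```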

## CAS Storage

Binary evidence artifacts are stored in CAS:

```
cas://evidence/{component}/{version}/
├── binary-analysis.json     # Full analysis result
├── symbols.txt              # Extracted symbols (sorted)
├── functions.txt            # Extracted functions (sorted)
└── patch-signatures.json    # Patch oracle signatures
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (SC6)
- Roadmap: `docs/modules/scanner/design/standards-convergence-roadmap.md` (SC1)
- Contract: `docs/modules/scanner/design/cdx17-cbom-contract.md` (SC2)
- Entropy: `docs/modules/scanner/entropy.md`
---

**New file:** `docs/modules/scanner/design/competitor-anomaly-tests.md` (362 lines)
# Competitor Ingest Anomaly Regression Tests (CM4)

Status: Draft · Date: 2025-12-04
Scope: Define the anomaly regression test suite for the ingest pipeline, covering schema drift, nullables, encoding, and ordering anomalies.

## Objectives

- Detect schema drift in upstream tool outputs.
- Validate handling of nullable/missing fields.
- Ensure proper encoding handling (UTF-8, escaping).
- Verify deterministic ordering is maintained.
- Provide golden fixtures with expected hashes.

## Test Categories

### 1. Schema Drift Tests

Detect when upstream tools change their output schema.

```
tests/anomaly/schema-drift/
├── syft/
│   ├── v1.0.0-baseline.json        # Known good output
│   ├── v1.5.0-new-fields.json      # Added fields
│   ├── v1.5.0-removed-fields.json  # Removed fields
│   ├── v1.5.0-type-change.json     # Field type changed
│   └── expected-results.json
├── trivy/
│   └── ... (same structure)
└── clair/
    └── ... (same structure)
```

#### Test Cases

| Test | Input | Expected Behavior |
|------|-------|-------------------|
| `new_optional_field` | Output with new field | Accept, ignore new field |
| `new_required_field` | Output with new required field | Warn, map if possible |
| `removed_optional_field` | Output missing optional field | Accept |
| `removed_required_field` | Output missing required field | Reject |
| `field_type_change` | Field type differs from schema | Reject or coerce |
| `field_rename` | Field renamed without mapping | Warn, check mapping |

#### Schema Drift Fixture

```json
{
  "test": "new_optional_field",
  "tool": "syft",
  "inputVersion": "1.5.0",
  "baselineVersion": "1.0.0",
  "input": {
    "artifacts": [
      {
        "name": "lib-a",
        "version": "1.0.0",
        "purl": "pkg:npm/lib-a@1.0.0",
        "newField": "unexpected value"
      }
    ]
  },
  "expected": {
    "status": "accepted",
    "warnings": ["unknown_field:newField"],
    "normalizedHash": "b3:..."
  }
}
```

### 2. Nullable/Missing Field Tests

Validate handling of null, empty, and missing values.

```
tests/anomaly/nullables/
├── null-values.json
├── empty-strings.json
├── empty-arrays.json
├── missing-optional.json
├── missing-required.json
└── expected-results.json
```

#### Test Cases

| Test | Input | Expected Behavior |
|------|-------|-------------------|
| `null_optional` | Optional field is null | Accept, omit from output |
| `null_required` | Required field is null | Reject |
| `empty_string` | String field is `""` | Accept, preserve or omit |
| `empty_array` | Array field is `[]` | Accept, preserve |
| `missing_optional` | Optional field absent | Accept |
| `missing_required` | Required field absent | Reject |

#### Nullable Fixture

```json
{
  "test": "null_optional",
  "tool": "syft",
  "input": {
    "artifacts": [
      {
        "name": "lib-a",
        "version": "1.0.0",
        "purl": "pkg:npm/lib-a@1.0.0",
        "licenses": null
      }
    ]
  },
  "expected": {
    "status": "accepted",
    "output": {
      "components": [
        {
          "name": "lib-a",
          "version": "1.0.0",
          "purl": "pkg:npm/lib-a@1.0.0"
        }
      ]
    },
    "normalizedHash": "b3:..."
  }
}
```

### 3. Encoding Tests

Validate proper handling of character encoding and escaping.

```
tests/anomaly/encoding/
├── utf8-valid.json
├── utf8-bom.json
├── latin1-fallback.json
├── unicode-escapes.json
├── special-chars.json
├── json-escaping.json
└── expected-results.json
```

#### Test Cases

| Test | Input | Expected Behavior |
|------|-------|-------------------|
| `utf8_valid` | Standard UTF-8 | Accept |
| `utf8_bom` | UTF-8 with BOM | Accept, strip BOM |
| `unicode_escapes` | `\u0041`-style escapes | Accept, decode |
| `special_chars` | Tabs, newlines in strings | Accept, preserve or escape |
| `control_chars` | Control characters (0x00–0x1F) | Reject or sanitize |
| `surrogate_pairs` | Emoji and supplementary chars | Accept |
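The `utf8_bom` and `control_chars` rows can be sketched as a preprocessing step. The function is illustrative, not an existing adapter API; note that tab, LF, and CR are deliberately allowed so the `special_chars` cases still pass:

```typescript
// Sketch: strip a UTF-8 BOM and reject disallowed control characters
// before handing the text to the JSON parser.
function preprocess(raw: Buffer): string {
  let text = raw.toString("utf8");
  if (text.charCodeAt(0) === 0xfeff) text = text.slice(1); // strip BOM (U+FEFF)
  // C0 controls except tab (0x09), LF (0x0A), and CR (0x0D) are rejected.
  if (/[\u0000-\u0008\u000b\u000c\u000e-\u001f]/.test(text)) {
    throw new Error("control characters not permitted");
  }
  return text;
}
```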

#### Encoding Fixture

```json
{
  "test": "special_chars",
  "tool": "syft",
  "input": {
    "artifacts": [
      {
        "name": "lib-with-tab\ttab",
        "version": "1.0.0",
        "description": "Line1\nLine2"
      }
    ]
  },
  "expected": {
    "status": "accepted",
    "output": {
      "components": [
        {
          "name": "lib-with-tab\ttab",
          "version": "1.0.0"
        }
      ]
    },
    "normalizedHash": "b3:..."
  }
}
```

### 4. Ordering Tests

Verify deterministic ordering is maintained across inputs.

```
tests/anomaly/ordering/
├── unsorted-components.json
├── reversed-components.json
├── random-order.json
├── unicode-sort.json
├── case-sensitivity.json
└── expected-results.json
```

#### Test Cases

| Test | Input | Expected Behavior |
|------|-------|-------------------|
| `unsorted_input` | Components in random order | Sort deterministically |
| `reversed_input` | Components in reverse order | Sort deterministically |
| `same_after_sort` | Pre-sorted input | Same output as unsorted |
| `unicode_sort` | Unicode component names | Locale-invariant sort |
| `case_sensitivity` | Mixed-case names | Case-insensitive sort |
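The `unicode_sort` and `case_sensitivity` rows imply a comparator along these lines. This is a sketch: it uses a simple lowercase fold and code-point order (locale-independent), with a deterministic tie-break on the exact name so "Apple" vs "apple" always orders the same way; a production version might use full Unicode case folding:

```typescript
// Sketch: locale-invariant, case-insensitive name ordering with a
// deterministic tie-break on the original (case-sensitive) name.
function compareNames(a: string, b: string): number {
  const la = a.toLowerCase();
  const lb = b.toLowerCase();
  if (la !== lb) return la < lb ? -1 : 1;   // code-point order, no locale involved
  return a < b ? -1 : a > b ? 1 : 0;        // stable tie-break on exact spelling
}
```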

#### Ordering Fixture

```json
{
  "test": "unsorted_input",
  "tool": "syft",
  "input": {
    "artifacts": [
      {"name": "zebra", "version": "1.0.0", "purl": "pkg:npm/zebra@1.0.0"},
      {"name": "apple", "version": "1.0.0", "purl": "pkg:npm/apple@1.0.0"},
      {"name": "mango", "version": "1.0.0", "purl": "pkg:npm/mango@1.0.0"}
    ]
  },
  "expected": {
    "status": "accepted",
    "output": {
      "components": [
        {"name": "apple", "version": "1.0.0", "purl": "pkg:npm/apple@1.0.0"},
        {"name": "mango", "version": "1.0.0", "purl": "pkg:npm/mango@1.0.0"},
        {"name": "zebra", "version": "1.0.0", "purl": "pkg:npm/zebra@1.0.0"}
      ]
    },
    "normalizedHash": "b3:..."
  }
}
```

## Golden Fixtures

### Hash File Format

```
# tests/anomaly/hashes.txt
schema-drift/syft/v1.0.0-baseline.json: BLAKE3=... SHA256=...
schema-drift/syft/expected-results.json: BLAKE3=... SHA256=...
nullables/null-values.json: BLAKE3=... SHA256=...
nullables/expected-results.json: BLAKE3=... SHA256=...
encoding/utf8-valid.json: BLAKE3=... SHA256=...
encoding/expected-results.json: BLAKE3=... SHA256=...
ordering/unsorted-components.json: BLAKE3=... SHA256=...
ordering/expected-results.json: BLAKE3=... SHA256=...
```
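A fixture verifier has to parse these lines. A sketch of that parsing step (the type and function names are illustrative, not part of an existing tool):

```typescript
// Sketch: parse one hashes.txt line of the form
// "path: BLAKE3=<digest> SHA256=<digest>"; returns null for non-entry lines.
interface FixtureHashEntry { path: string; blake3: string; sha256: string; }

function parseHashLine(line: string): FixtureHashEntry | null {
  const m = line.match(/^([^:]+):\s+BLAKE3=(\S+)\s+SHA256=(\S+)$/);
  return m ? { path: m[1], blake3: m[2], sha256: m[3] } : null;
}
```

Comment lines and blanks simply fail the pattern and are skipped.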

## CI Integration

### Test Workflow

```yaml
# .gitea/workflows/anomaly-tests.yml
name: Anomaly Regression Tests

on:
  push:
    paths:
      - 'src/Scanner/Adapters/**'
      - 'tests/anomaly/**'
  pull_request:

jobs:
  anomaly-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.x'

      - name: Verify fixture hashes
        run: scripts/scanner/verify-anomaly-fixtures.sh

      - name: Run schema drift tests
        run: |
          dotnet test src/Scanner/__Tests/StellaOps.Scanner.Anomaly.Tests \
            --filter "Category=SchemaDrift"

      - name: Run nullable tests
        run: |
          dotnet test src/Scanner/__Tests/StellaOps.Scanner.Anomaly.Tests \
            --filter "Category=Nullable"

      - name: Run encoding tests
        run: |
          dotnet test src/Scanner/__Tests/StellaOps.Scanner.Anomaly.Tests \
            --filter "Category=Encoding"

      - name: Run ordering tests
        run: |
          dotnet test src/Scanner/__Tests/StellaOps.Scanner.Anomaly.Tests \
            --filter "Category=Ordering"
```

### Test Runner

```csharp
// src/Scanner/__Tests/StellaOps.Scanner.Anomaly.Tests/AnomalyTestRunner.cs
// xUnit filters on traits, so [Trait("Category", ...)] backs the
// `--filter "Category=..."` invocations in the workflow above.
[Trait("Category", "SchemaDrift")]
[Theory]
[MemberData(nameof(GetSchemaDriftTestCases))]
public async Task SchemaDrift_HandledCorrectly(AnomalyTestCase testCase)
{
    // Arrange
    var adapter = _adapterFactory.Create(testCase.Tool);

    // Act
    var result = await adapter.NormalizeAsync(testCase.Input);

    // Assert
    Assert.Equal(testCase.Expected.Status, result.Status);
    Assert.Equal(testCase.Expected.Warnings, result.Warnings);

    if (testCase.Expected.NormalizedHash != null)
    {
        var hash = Blake3.HashData(Encoding.UTF8.GetBytes(
            JsonSerializer.Serialize(result.Output)));
        Assert.Equal(testCase.Expected.NormalizedHash,
            $"b3:{Convert.ToHexString(hash).ToLowerInvariant()}");
    }
}
```

## Failure Handling

### On Test Failure

1. **Schema drift**: Create an issue, update the adapter mapping.
2. **Nullable handling**: Fix the normalization logic.
3. **Encoding error**: Fix encoding detection/conversion.
4. **Ordering violation**: Fix the sort comparator.

### Failure Report

```json
{
  "failure": {
    "category": "schema_drift",
    "test": "new_required_field",
    "tool": "syft",
    "input": {...},
    "expected": {...},
    "actual": {...},
    "diff": [
      {"path": "/status", "expected": "accepted", "actual": "rejected"}
    ],
    "timestamp": "2025-12-04T12:00:00Z"
  }
}
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (CM4)
- Normalization: `docs/modules/scanner/design/competitor-ingest-normalization.md` (CM1)
- Fixtures: `docs/modules/scanner/fixtures/competitor-adapters/`
---

**New file:** `docs/modules/scanner/design/competitor-benchmark-parity.md` (324 lines)
# Competitor Benchmark Parity Plan (CM7, CM8, CM9)

Status: Draft · Date: 2025-12-04
Scope: Define source transparency fields (CM7), benchmark parity requirements (CM8), and ecosystem coverage tracking (CM9).

## CM7: Source Transparency Fields

### Required Metadata Fields

| Field | Source | Storage Location | API Exposure |
|-------|--------|------------------|--------------|
| `source.tool` | Ingest input | `ingest_metadata.tool` | Yes |
| `source.version` | Ingest input | `ingest_metadata.tool_version` | Yes |
| `source.hash` | Computed | `ingest_metadata.tool_hash` | Yes |
| `adapter.version` | Adapter manifest | `ingest_metadata.adapter_version` | Yes |
| `normalized_hash` | Computed | `ingest_metadata.normalized_hash` | Yes |
| `import_timestamp` | System | `ingest_metadata.imported_at` | Yes |

### Metadata Schema

```json
{
  "ingest_metadata": {
    "source": {
      "tool": "syft",
      "version": "1.0.0",
      "hash": "sha256:...",
      "uri": "https://github.com/anchore/syft/releases/v1.0.0"
    },
    "adapter": {
      "version": "1.0.0",
      "mappingHash": "b3:..."
    },
    "normalized": {
      "hash": "b3:aa42c167...",
      "recordCount": 42,
      "format": "stellaops-v1"
    },
    "import": {
      "timestamp": "2025-12-04T12:00:00Z",
      "user": "system",
      "snapshotId": "syft-20251204T120000Z-001"
    }
  }
}
```

### API Exposure

```http
GET /api/v1/ingest/metadata/{snapshotId}
```

Response:

```json
{
  "metadata": {
    "snapshotId": "syft-20251204T120000Z-001",
    "source": {
      "tool": "syft",
      "version": "1.0.0",
      "hash": "sha256:..."
    },
    "adapter": {
      "version": "1.0.0"
    },
    "normalized": {
      "hash": "b3:...",
      "recordCount": 42
    }
  }
}
```

## CM8: Benchmark Parity

### Pinned Tool Versions

| Tool | Pinned Version | Test Frequency | Baseline Date |
|------|----------------|----------------|---------------|
| Syft | 1.0.0 | Weekly | 2025-12-01 |
| Trivy | 0.50.0 | Weekly | 2025-12-01 |
| Clair | 6.0.0 | Weekly | 2025-12-01 |

### Benchmark Test Suite

```
tests/benchmark/
├── syft/
│   ├── inputs/
│   │   ├── alpine-3.19.json            # Container image
│   │   ├── node-app.json               # Node.js project
│   │   └── java-app.json               # Java project
│   ├── expected/
│   │   ├── alpine-3.19-expected.json
│   │   ├── node-app-expected.json
│   │   └── java-app-expected.json
│   └── hashes.txt
├── trivy/
│   └── ... (same structure)
└── clair/
    └── ... (same structure)
```

### Benchmark Workflow

```yaml
# .gitea/workflows/benchmark-parity.yml
name: Benchmark Parity Check

on:
  schedule:
    - cron: '0 0 * * 0'  # Weekly
  workflow_dispatch:

jobs:
  benchmark:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        tool: [syft, trivy, clair]
    steps:
      - uses: actions/checkout@v4

      - name: Run ${{ matrix.tool }} benchmark
        run: |
          scripts/benchmark/run-benchmark.sh ${{ matrix.tool }}

      - name: Compare results
        run: |
          scripts/benchmark/compare-results.sh ${{ matrix.tool }}

      - name: Upload logs
        uses: actions/upload-artifact@v4
        with:
          name: benchmark-${{ matrix.tool }}
          path: benchmark-results/
```

### Benchmark Comparison

```bash
#!/bin/bash
# scripts/benchmark/compare-results.sh

TOOL=$1
BENCHMARK_DIR="tests/benchmark/${TOOL}"

for input in "${BENCHMARK_DIR}/inputs/"*.json; do
  name=$(basename "${input}" .json)
  expected="${BENCHMARK_DIR}/expected/${name}-expected.json"
  actual="benchmark-results/${name}-actual.json"

  # Run tool
  stellaops ingest normalize \
    --tool "${TOOL}" \
    --input "${input}" \
    --output "${actual}"

  # Compare canonicalized (sorted-key) JSON
  diff_result=$(diff <(jq -S . "${expected}") <(jq -S . "${actual}"))

  if [[ -n "${diff_result}" ]]; then
    echo "DRIFT: ${name}"
    echo "${diff_result}"
    exit 1
  fi

  echo "PASS: ${name}"
done
```

### Drift Detection

When benchmark drift is detected:

1. Log the drift details with a hash comparison.
2. Create an issue in the tracking system.
3. Notify the Scanner Guild.
4. Block the release if the drift is critical.

```json
{
  "drift": {
    "tool": "syft",
    "version": "1.0.0",
    "testCase": "alpine-3.19",
    "detected": "2025-12-04T00:00:00Z",
    "details": {
      "expectedHash": "b3:expected...",
      "actualHash": "b3:actual...",
      "diffCount": 3,
      "fields": [
        "/components/5/version",
        "/components/12/licenses",
        "/vulnerabilities/2/ratings/0/score"
      ]
    }
  }
}
```

## CM9: Coverage Tracking

### Coverage Matrix

Location: `docs/modules/scanner/fixtures/competitor-adapters/coverage.csv`

```csv
ecosystem,syft,trivy,clair,notes
container,yes,yes,yes,All tools support OCI images
java,yes,yes,no,Clair Java support pending
python,yes,yes,no,Trivy has best pip/poetry coverage
dotnet,no,yes,no,Trivy only; Syft support pending
go,yes,yes,no,Both tools have good go.mod support
rust,yes,yes,no,Cargo.lock parsing
ruby,yes,yes,no,Gemfile.lock parsing
php,yes,yes,no,composer.lock parsing
os-pkgs,yes,yes,yes,APK/DEB/RPM supported
node,yes,yes,no,package-lock.json/yarn.lock
```
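The gap list served by the Coverage API can be derived mechanically from this matrix: any (ecosystem, tool) cell that is not `yes` is a gap. A minimal sketch, assuming the CSV columns shown above (the `find_gaps` helper is hypothetical; priority assignment would come from elsewhere):

```python
import csv
import io

def find_gaps(coverage_csv: str) -> list:
    # Every (ecosystem, tool) cell that is not "yes" is a coverage gap.
    gaps = []
    for row in csv.DictReader(io.StringIO(coverage_csv)):
        for tool in ("syft", "trivy", "clair"):
            if row[tool].strip().lower() != "yes":
                gaps.append({"ecosystem": row["ecosystem"], "tool": tool})
    return gaps

matrix = """ecosystem,syft,trivy,clair,notes
container,yes,yes,yes,All tools support OCI images
dotnet,no,yes,no,Trivy only; Syft support pending
"""
print(find_gaps(matrix))
# → [{'ecosystem': 'dotnet', 'tool': 'syft'}, {'ecosystem': 'dotnet', 'tool': 'clair'}]
```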

### Coverage API

```http
GET /api/v1/ingest/coverage
```

Response:

```json
{
  "coverage": {
    "lastUpdated": "2025-12-04T00:00:00Z",
    "ecosystems": {
      "container": {
        "syft": {"supported": true, "tested": true},
        "trivy": {"supported": true, "tested": true},
        "clair": {"supported": true, "tested": true}
      },
      "java": {
        "syft": {"supported": true, "tested": true},
        "trivy": {"supported": true, "tested": true},
        "clair": {"supported": false, "tested": false, "planned": "2026-Q1"}
      }
    },
    "gaps": [
      {"ecosystem": "dotnet", "tool": "syft", "priority": "high"},
      {"ecosystem": "java", "tool": "clair", "priority": "medium"}
    ]
  }
}
```

### Gap Tracking

```json
{
  "gaps": [
    {
      "id": "gap-001",
      "ecosystem": "dotnet",
      "tool": "syft",
      "priority": "high",
      "reason": "Customer demand for .NET scanning",
      "status": "planned",
      "targetDate": "2025-Q2",
      "blockers": ["Upstream syft issue #1234"]
    }
  ]
}
```

### Coverage CI Check

```yaml
# Check coverage doesn't regress
- name: Verify coverage matrix
  run: |
    # Fail only if a "yes" cell was removed from the matrix
    if git diff HEAD~1 -- docs/modules/scanner/fixtures/competitor-adapters/coverage.csv \
        | grep -qE '^-.*yes'; then
      echo "Coverage regression detected"
      exit 1
    fi
```

## Reporting

### Weekly Coverage Report

```json
{
  "report": {
    "period": "2025-W49",
    "coverage": {
      "total_ecosystems": 10,
      "full_coverage": 3,
      "partial_coverage": 5,
      "no_coverage": 2
    },
    "benchmark": {
      "tests_run": 45,
      "tests_passed": 44,
      "tests_failed": 1,
      "drift_detected": ["trivy/alpine-3.19"]
    },
    "metadata": {
      "snapshots_imported": 156,
      "tools_seen": ["syft", "trivy", "clair"],
      "versions_seen": {
        "syft": ["1.0.0", "1.0.1"],
        "trivy": ["0.50.0"],
        "clair": ["6.0.0"]
      }
    }
  }
}
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (CM7, CM8, CM9)
- Normalization: `docs/modules/scanner/design/competitor-ingest-normalization.md` (CM1)
- Coverage CSV: `docs/modules/scanner/fixtures/competitor-adapters/coverage.csv`
296
docs/modules/scanner/design/competitor-db-governance.md
Normal file
@@ -0,0 +1,296 @@

# Competitor Ingest DB Snapshot Governance (CM3)

Status: Draft · Date: 2025-12-04
Scope: Enforce database snapshot governance, including versioning, a freshness SLA, and rollback procedures for imported external feeds.

## Objectives

- Define a versioning scheme for imported snapshots.
- Establish a freshness SLA for external data.
- Enable deterministic rollback to previous snapshots.
- Support an audit trail for all snapshot operations.

## Snapshot Versioning

### Version Scheme

```
{tool}-{timestamp}-{sequence}

Examples:
- syft-20251204T000000Z-001
- trivy-20251204T120000Z-001
- clair-20251204T060000Z-002
```
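The scheme above can be generated mechanically; a minimal sketch (the `snapshot_id` helper is hypothetical — the sequence number disambiguates multiple imports with the same UTC timestamp):

```python
from datetime import datetime, timezone

def snapshot_id(tool: str, at: datetime, sequence: int = 1) -> str:
    # {tool}-{timestamp}-{sequence}: basic-format UTC timestamp,
    # zero-padded three-digit sequence.
    ts = at.astimezone(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    return f"{tool}-{ts}-{sequence:03d}"

print(snapshot_id("syft", datetime(2025, 12, 4, tzinfo=timezone.utc)))
# → syft-20251204T000000Z-001
```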

### Snapshot Record

```json
{
  "id": "syft-20251204T000000Z-001",
  "tool": "syft",
  "toolVersion": "1.0.0",
  "importedAt": "2025-12-04T00:00:00Z",
  "sourceHash": "b3:...",
  "normalizedHash": "b3:...",
  "recordCount": 1234,
  "state": "active",
  "previousSnapshot": "syft-20251203T000000Z-001",
  "metadata": {
    "sourceUri": "https://example.com/sbom.json",
    "importUser": "system",
    "importReason": "scheduled_sync"
  }
}
```

## Freshness SLA

### Thresholds by Tool

| Tool | Max Age | Stale Threshold | Critical Threshold |
|------|---------|-----------------|--------------------|
| Syft | 7 days | 14 days | 30 days |
| Trivy | 7 days | 14 days | 30 days |
| Clair | 7 days | 14 days | 30 days |
| Custom | Configurable | Configurable | Configurable |

### Freshness States

| State | Condition | Action |
|-------|-----------|--------|
| `fresh` | age < max_age | Normal operation |
| `stale` | max_age <= age < critical | Emit warning |
| `critical` | age >= critical | Block queries without override |
| `expired` | Manual expiry | Data unavailable |
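The state table maps directly onto an age comparison. A sketch under the table's conditions (the `freshness_state` helper is illustrative; `expired` is a manual transition handled outside this function):

```python
from datetime import timedelta

def freshness_state(age: timedelta,
                    max_age: timedelta = timedelta(days=7),
                    critical: timedelta = timedelta(days=30)) -> str:
    # Mirrors the table: fresh below max_age, stale up to the
    # critical threshold, critical at or beyond it.
    if age < max_age:
        return "fresh"
    if age < critical:
        return "stale"
    return "critical"

assert freshness_state(timedelta(days=2)) == "fresh"
assert freshness_state(timedelta(days=14)) == "stale"
assert freshness_state(timedelta(days=30)) == "critical"
```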

### SLA Monitoring

```json
{
  "sla": {
    "tool": "syft",
    "snapshotId": "syft-20251204T000000Z-001",
    "importedAt": "2025-12-04T00:00:00Z",
    "age": "P2D",
    "state": "fresh",
    "nextCheck": "2025-12-05T00:00:00Z",
    "thresholds": {
      "maxAge": "P7D",
      "stale": "P14D",
      "critical": "P30D"
    }
  }
}
```

## Rollback Procedures

### Rollback Triggers

| Trigger | Auto/Manual | Action |
|---------|-------------|--------|
| Import failure | Auto | Rollback to previous |
| Validation failure | Auto | Rollback to previous |
| Data corruption | Manual | Rollback to specified |
| Compliance requirement | Manual | Rollback to specified |
| User request | Manual | Rollback to specified |
### Rollback Workflow

```
┌─────────────┐
│  Initiate   │
│  Rollback   │
└─────────────┘
       │
       ▼
┌─────────────┐
│   Verify    │──Fail──► Abort
│   Target    │
└─────────────┘
       │
      Pass
       │
       ▼
┌─────────────┐
│   Create    │
│  Savepoint  │
└─────────────┘
       │
       ▼
┌─────────────┐
│   Restore   │──Fail──► Restore Savepoint
│  Snapshot   │
└─────────────┘
       │
      Pass
       │
       ▼
┌─────────────┐
│   Verify    │──Fail──► Restore Savepoint
│   Restore   │
└─────────────┘
       │
      Pass
       │
       ▼
┌─────────────┐
│   Commit    │
│   Change    │
└─────────────┘
       │
       ▼
┌─────────────┐
│   Update    │
│   Active    │
└─────────────┘
```

### Rollback Command

```bash
# Rollback to previous snapshot
stellaops ingest rollback --tool syft

# Rollback to specific snapshot
stellaops ingest rollback --tool syft --snapshot-id syft-20251201T000000Z-001

# Dry run
stellaops ingest rollback --tool syft --dry-run

# Force rollback (skip confirmations)
stellaops ingest rollback --tool syft --force
```

### Rollback Response

```json
{
  "rollback": {
    "status": "completed",
    "tool": "syft",
    "from": {
      "snapshotId": "syft-20251204T000000Z-001",
      "recordCount": 1234
    },
    "to": {
      "snapshotId": "syft-20251203T000000Z-001",
      "recordCount": 1200
    },
    "executedAt": "2025-12-04T12:00:00Z",
    "executedBy": "admin@example.com",
    "reason": "Data corruption detected"
  }
}
```

## Retention Policy

### Snapshot Retention

| Category | Retention | Cleanup |
|----------|-----------|---------|
| Active | Indefinite | Never |
| Previous (N-1) | 30 days | Auto |
| Archived | 90 days | Auto |
| Audit | 1 year | Manual |

### Cleanup Schedule

```json
{
  "retention": {
    "schedule": "0 0 * * *",
    "rules": [
      {
        "category": "previous",
        "maxAge": "P30D",
        "action": "archive"
      },
      {
        "category": "archived",
        "maxAge": "P90D",
        "action": "delete"
      }
    ],
    "exceptions": [
      {
        "snapshotId": "syft-20251101T000000Z-001",
        "reason": "Audit hold",
        "expiresAt": "2026-12-01T00:00:00Z"
      }
    ]
  }
}
```

## Audit Trail

### Audit Events

| Event | Fields | Retention |
|-------|--------|-----------|
| `snapshot_imported` | id, tool, hash, user, timestamp | 1 year |
| `snapshot_activated` | id, previous_id, user, timestamp | 1 year |
| `snapshot_rolled_back` | from_id, to_id, reason, user | 1 year |
| `snapshot_expired` | id, reason, user, timestamp | 1 year |
| `snapshot_deleted` | id, reason, user, timestamp | 1 year |

### Audit Record Format

```json
{
  "audit": {
    "id": "audit-12345",
    "event": "snapshot_rolled_back",
    "timestamp": "2025-12-04T12:00:00Z",
    "user": "admin@example.com",
    "details": {
      "fromSnapshot": "syft-20251204T000000Z-001",
      "toSnapshot": "syft-20251203T000000Z-001",
      "reason": "Data corruption detected",
      "recordsAffected": 34
    },
    "hash": "b3:..."
  }
}
```
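The `hash` field makes each audit record tamper-evident: it is computed over the record itself, excluding the hash field. A hedged sketch of that convention (the helper name is illustrative, and SHA-256 stands in for the BLAKE3 digest behind the `b3:` prefix):

```python
import hashlib
import json

def audit_record_hash(record: dict) -> str:
    # Hash the canonical JSON of every field except the hash itself,
    # so the stored hash can be recomputed and verified later.
    body = {k: v for k, v in record.items() if k != "hash"}
    canonical = json.dumps(body, sort_keys=True, separators=(",", ":"))
    return "sha256:" + hashlib.sha256(canonical.encode("utf-8")).hexdigest()

record = {
    "id": "audit-12345",
    "event": "snapshot_rolled_back",
    "timestamp": "2025-12-04T12:00:00Z",
    "user": "admin@example.com",
}
record["hash"] = audit_record_hash(record)
# Verification: recomputing over the record (minus "hash") matches.
assert audit_record_hash(record) == record["hash"]
```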

## API Endpoints

### List Snapshots

```http
GET /api/v1/ingest/snapshots?tool=syft&state=active
```

### Get Snapshot Details

```http
GET /api/v1/ingest/snapshots/{snapshotId}
```

### Initiate Rollback

```http
POST /api/v1/ingest/snapshots/{snapshotId}/rollback
Content-Type: application/json

{
  "reason": "Data corruption detected",
  "dryRun": false
}
```

### Check SLA Status

```http
GET /api/v1/ingest/sla?tool=syft
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (CM3)
- Normalization: `docs/modules/scanner/design/competitor-ingest-normalization.md` (CM1)
- Feed Thresholds: `docs/modules/policy/contracts/feed-snapshot-thresholds.md` (SP6)
322
docs/modules/scanner/design/competitor-error-taxonomy.md
Normal file
@@ -0,0 +1,322 @@

# Competitor Ingest Error Taxonomy (CM10)

Status: Draft · Date: 2025-12-04
Scope: Standardize the retry/backoff/error taxonomy for the competitor ingest pipeline with deterministic diagnostics.

## Objectives

- Define a comprehensive error taxonomy for ingest failures.
- Specify retry behavior for transient errors.
- Enable deterministic error diagnostics.
- Support offline error handling.

## Error Categories

### Retryable Errors

| Code | Category | Description | Max Retries | Backoff |
|------|----------|-------------|-------------|---------|
| `E1001` | `network_error` | Network connectivity failure | 3 | Exponential |
| `E1002` | `network_timeout` | Connection/read timeout | 3 | Exponential |
| `E1003` | `rate_limit` | Rate limit exceeded (429) | 5 | Linear |
| `E1004` | `service_unavailable` | Service temporarily unavailable (503) | 3 | Exponential |
| `E1005` | `transient_io` | Temporary I/O error | 2 | Fixed |
| `E1006` | `lock_contention` | Resource lock conflict | 3 | Exponential |
### Non-Retryable Errors

| Code | Category | Description | Action |
|------|----------|-------------|--------|
| `E2001` | `signature_invalid` | Signature verification failed | Reject |
| `E2002` | `signature_expired` | Signature validity exceeded | Reject |
| `E2003` | `key_unknown` | Signing key not found | Reject |
| `E2004` | `key_expired` | Signing key validity exceeded | Reject |
| `E2005` | `key_revoked` | Signing key revoked | Reject |
| `E2006` | `alg_unsupported` | Unsupported algorithm | Reject |
| `E2007` | `hash_mismatch` | Content hash mismatch | Reject |
| `E2008` | `schema_invalid` | Input doesn't match schema | Reject |
| `E2009` | `version_unsupported` | Tool version not supported | Reject |
| `E2010` | `no_evidence` | No acceptable evidence found | Reject |

### Warning Conditions

| Code | Category | Description | Action |
|------|----------|-------------|--------|
| `W3001` | `provenance_unknown` | Provenance not verifiable | Accept with warning |
| `W3002` | `degraded_confidence` | Low confidence result | Accept with warning |
| `W3003` | `stale_data` | Data exceeds freshness threshold | Accept with warning |
| `W3004` | `partial_mapping` | Some fields couldn't be mapped | Accept with warning |
| `W3005` | `deprecated_format` | Using deprecated format | Accept with warning |
## Error Format

### Standard Error Response

```json
{
  "error": {
    "code": "E2001",
    "category": "signature_invalid",
    "message": "DSSE signature verification failed",
    "details": {
      "reason": "Key ID not found in trusted keyring",
      "keyId": "unknown-key-12345",
      "algorithm": "ecdsa-p256"
    },
    "retryable": false,
    "diagnostics": {
      "timestamp": "2025-12-04T12:00:00Z",
      "traceId": "abc123...",
      "inputHash": "b3:...",
      "stage": "signature_verification"
    }
  }
}
```

### Warning Response

```json
{
  "result": {
    "status": "accepted_with_warnings",
    "warnings": [
      {
        "code": "W3001",
        "category": "provenance_unknown",
        "message": "SBOM accepted without verified provenance",
        "details": {
          "reason": "No signature present",
          "fallbackLevel": 2
        }
      }
    ],
    "output": {...}
  }
}
```

## Retry Configuration

### Backoff Strategies

```json
{
  "backoff": {
    "exponential": {
      "description": "2^attempt * base, capped at max",
      "config": {
        "base": 1000,
        "factor": 2,
        "max": 60000,
        "jitter": 0.1
      },
      "example": [1000, 2000, 4000, 8000, 16000, 32000, 60000]
    },
    "linear": {
      "description": "initial + (attempt * increment), capped at max",
      "config": {
        "initial": 1000,
        "increment": 1000,
        "max": 30000
      },
      "example": [1000, 2000, 3000, 4000, 5000]
    },
    "fixed": {
      "description": "Constant delay between retries",
      "config": {
        "delay": 5000
      },
      "example": [5000, 5000, 5000]
    }
  }
}
```
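All three strategies can be computed from the config shapes above. A minimal sketch (the `backoff_delay` helper is illustrative; jitter is applied only when a random source is passed in, so the unjittered path stays deterministic):

```python
def backoff_delay(strategy: str, config: dict, attempt: int, rng=None) -> int:
    # attempt is 0-based; returns delay in milliseconds.
    if strategy == "exponential":
        delay = min(config["base"] * config["factor"] ** attempt, config["max"])
        if rng is not None and config.get("jitter"):
            # +/- jitter fraction, e.g. 0.1 -> up to 10% either way.
            delay += delay * rng.uniform(-config["jitter"], config["jitter"])
    elif strategy == "linear":
        delay = min(config["initial"] + attempt * config["increment"], config["max"])
    else:  # fixed
        delay = config["delay"]
    return int(delay)

exp = {"base": 1000, "factor": 2, "max": 60000, "jitter": 0.1}
print([backoff_delay("exponential", exp, a) for a in range(7)])
# → [1000, 2000, 4000, 8000, 16000, 32000, 60000]
```

The unjittered output reproduces the `example` arrays in the JSON above, which is what makes the schedule auditable.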

### Retry Decision Logic

```python
def should_retry(error: IngestError, attempt: int) -> RetryDecision:
    # E2xxx codes are non-retryable by definition (signature, schema, version).
    if error.code.startswith('E2'):
        return RetryDecision(retry=False, reason="Non-retryable error")

    config = RETRY_CONFIG.get(error.category)
    if not config:
        return RetryDecision(retry=False, reason="Unknown error category")

    if attempt >= config.max_retries:
        return RetryDecision(retry=False, reason="Max retries exceeded")

    delay = calculate_backoff(config, attempt)
    return RetryDecision(retry=True, delay=delay)
```

## Diagnostic Output

### Deterministic Diagnostics

All error diagnostics must be deterministic and reproducible:

```json
{
  "diagnostics": {
    "timestamp": "2025-12-04T12:00:00Z",
    "traceId": "deterministic-trace-id",
    "inputHash": "b3:...",
    "stage": "normalization",
    "context": {
      "tool": "syft",
      "toolVersion": "1.0.0",
      "adapterVersion": "1.0.0",
      "inputSize": 12345,
      "componentCount": 42
    },
    "errorChain": [
      {
        "stage": "schema_validation",
        "error": "Missing required field: artifacts[0].purl",
        "path": "/artifacts/0"
      }
    ]
  }
}
```

### Offline Diagnostics

For offline mode, diagnostics include:

- No timestamps that depend on the wall clock
- Deterministic trace IDs (derived from the input hash)
- All context from bundled metadata

```json
{
  "diagnostics": {
    "mode": "offline",
    "kitVersion": "1.0.0",
    "traceId": "b3:input-hash-derived-trace-id",
    "kitHash": "b3:...",
    "trustRootHash": "b3:..."
  }
}
```
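Deriving the trace ID from the input hash keeps offline diagnostics reproducible: the same input always yields the same trace, with no wall clock involved. A sketch of one way to do it (SHA-256 stands in for BLAKE3, and the `trace:` domain-separation prefix is an illustrative choice, not a documented convention):

```python
import hashlib

def offline_trace_id(input_hash: str) -> str:
    # Domain-separate so trace IDs can never collide with content hashes
    # computed over the same input.
    digest = hashlib.sha256(f"trace:{input_hash}".encode("utf-8")).hexdigest()
    return f"sha256:{digest}"

t1 = offline_trace_id("b3:aa42c167")
t2 = offline_trace_id("b3:aa42c167")
assert t1 == t2  # same input, same trace ID
```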

## Error Handling Workflow

```
┌─────────────┐
│   Ingest    │
│   Request   │
└─────────────┘
       │
       ▼
┌─────────────┐
│  Validate   │──Error──► E2008 schema_invalid
└─────────────┘
       │
       ▼
┌─────────────┐
│   Verify    │──Error──► E2001-E2007 signature errors
│  Signature  │
└─────────────┘
       │
       ▼
┌─────────────┐
│  Normalize  │──Error──► E2008-E2009 format errors
└─────────────┘
       │
       ▼
┌─────────────┐
│    Store    │──Error──► E1001-E1006 retryable errors
└─────────────┘
       │
       ▼
┌─────────────┐
│   Success   │
│   or Warn   │
└─────────────┘
```
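The workflow above is a short-circuiting pipeline: the first failing stage determines the error, and only the storage stage raises retryable codes. A sketch of that stage ordering (the stage handler callables are hypothetical placeholders):

```python
# Ordered stages with the error family each one can raise.
STAGES = [
    ("schema_validation", "E2008"),
    ("signature_verification", "E2001-E2007"),
    ("normalization", "E2008-E2009"),
    ("store", "E1001-E1006"),  # only this family is retryable
]

def run_pipeline(input_data, handlers: dict) -> dict:
    # handlers maps stage name -> callable returning (ok, detail);
    # the first failure short-circuits the rest of the pipeline.
    for stage, error_family in STAGES:
        ok, detail = handlers[stage](input_data)
        if not ok:
            return {"status": "error", "stage": stage,
                    "errorFamily": error_family, "detail": detail}
    return {"status": "success"}

handlers = {name: (lambda d: (True, None)) for name, _ in STAGES}
handlers["signature_verification"] = lambda d: (False, "key not trusted")
print(run_pipeline({}, handlers))
# → {'status': 'error', 'stage': 'signature_verification', 'errorFamily': 'E2001-E2007', 'detail': 'key not trusted'}
```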

## API Error Responses

### HTTP Status Mapping

| Error Category | HTTP Status | Response Body |
|----------------|-------------|---------------|
| Retryable (E1xxx) | 503 | Error with Retry-After header |
| Rate limit (E1003) | 429 | Error with Retry-After header |
| Signature (E2001-E2007) | 400 | Error with details |
| Schema (E2008) | 400 | Error with validation details |
| Version (E2009) | 400 | Error with supported versions |
| No evidence (E2010) | 400 | Error with fallback options |

### Example Error Response

```http
HTTP/1.1 400 Bad Request
Content-Type: application/json
X-Stellaops-Error-Code: E2001
X-Stellaops-Trace-Id: abc123...

```

### Retry Response

```http
HTTP/1.1 503 Service Unavailable
Content-Type: application/json
Retry-After: 5
X-Stellaops-Error-Code: E1004
X-Stellaops-Retry-Attempt: 1

```

## Logging

### Error Log Format

```json
{
  "level": "error",
  "timestamp": "2025-12-04T12:00:00Z",
  "logger": "stellaops.scanner.ingest",
  "message": "Ingest failed: signature_invalid",
  "error": {
    "code": "E2001",
    "category": "signature_invalid"
  },
  "context": {
    "traceId": "abc123...",
    "tool": "syft",
    "inputHash": "b3:..."
  }
}
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (CM10)
- Normalization: `docs/modules/scanner/design/competitor-ingest-normalization.md` (CM1)
- Verification: `docs/modules/scanner/design/competitor-signature-verification.md` (CM2)
- Offline Kit: `docs/modules/scanner/design/competitor-offline-ingest-kit.md` (CM5)
339
docs/modules/scanner/design/competitor-fallback-hierarchy.md
Normal file
@@ -0,0 +1,339 @@

# Competitor Ingest Fallback Hierarchy (CM6)

Status: Draft · Date: 2025-12-04
Scope: Establish a fallback hierarchy for when external SBOM/scan data is incomplete, with explicit decision traces.

## Objectives

- Define clear fallback levels for incomplete data.
- Ensure transparent decision tracking.
- Enable policy-based confidence scoring.
- Support offline fallback evaluation.

## Fallback Levels

### Hierarchy Definition

```
Level 1: Signed SBOM with valid provenance
  │
  └──► Level 2: Unsigned SBOM with tool metadata
         │
         └──► Level 3: Scan-only results
                │
                └──► Level 4: Reject (no evidence)
```

### Level Details

| Level | Source | Confidence | Requirements | Warnings |
|-------|--------|------------|--------------|----------|
| 1 | Signed SBOM | 1.0 | Valid signature, valid provenance | None |
| 2 | Unsigned SBOM | 0.7 | Tool metadata, component purl, scan timestamp | `provenance_unknown` |
| 3 | Scan-only | 0.5 | Scan timestamp | `degraded_confidence`, `no_sbom` |
| 4 | Reject | 0.0 | None met | - |

### Level 1: Signed SBOM

Requirements:

- DSSE/COSE/JWS signature present
- Signature verification passes
- Signer key in trusted keyring
- Provenance metadata valid

```json
{
  "fallback": {
    "level": 1,
    "source": "signed_sbom",
    "confidence": 1.0,
    "decision": {
      "reason": "Valid signature and provenance",
      "checks": {
        "signaturePresent": true,
        "signatureValid": true,
        "keyTrusted": true,
        "provenanceValid": true
      }
    }
  }
}
```

### Level 2: Unsigned SBOM

Requirements (all must be present):

- Tool name and version
- Component list with PURLs
- At least one SHA-256 hash per component
- Scan timestamp

```json
{
  "fallback": {
    "level": 2,
    "source": "unsigned_sbom",
    "confidence": 0.7,
    "decision": {
      "reason": "Valid SBOM without signature",
      "checks": {
        "signaturePresent": false,
        "toolMetadata": true,
        "componentPurls": true,
        "componentHashes": true,
        "scanTimestamp": true
      },
      "warnings": ["provenance_unknown"]
    }
  }
}
```

### Level 3: Scan-only

Requirements:

- Scan timestamp present
- At least one finding or component

```json
{
  "fallback": {
    "level": 3,
    "source": "scan_only",
    "confidence": 0.5,
    "decision": {
      "reason": "Scan results without SBOM",
      "checks": {
        "signaturePresent": false,
        "toolMetadata": false,
        "scanTimestamp": true,
        "hasFindings": true
      },
      "warnings": ["degraded_confidence", "no_sbom"]
    }
  }
}
```

### Level 4: Reject

When no requirements are met:

```json
{
  "fallback": {
    "level": 4,
    "source": "reject",
    "confidence": 0.0,
    "decision": {
      "reason": "No acceptable evidence found",
      "checks": {
        "signaturePresent": false,
        "toolMetadata": false,
        "scanTimestamp": false,
        "hasFindings": false
      },
      "action": "reject",
      "errorCode": "E2010"
    }
  }
}
```

## Decision Evaluation

### Evaluation Algorithm

```python
def evaluate_fallback(input_data: dict) -> FallbackDecision:
    checks = {
        "signaturePresent": has_signature(input_data),
        "signatureValid": False,
        "keyTrusted": False,
        "provenanceValid": False,
        "toolMetadata": has_tool_metadata(input_data),
        "componentPurls": has_component_purls(input_data),
        "componentHashes": has_component_hashes(input_data),
        "scanTimestamp": has_scan_timestamp(input_data),
        "hasFindings": has_findings(input_data)
    }

    # Level 1: signed SBOM with valid provenance
    if checks["signaturePresent"]:
        sig_result = verify_signature(input_data)
        checks["signatureValid"] = sig_result.valid
        checks["keyTrusted"] = sig_result.key_trusted
        checks["provenanceValid"] = verify_provenance(input_data)

        if all([checks["signatureValid"], checks["keyTrusted"], checks["provenanceValid"]]):
            return FallbackDecision(level=1, confidence=1.0, checks=checks)

    # Level 2: unsigned SBOM with complete tool metadata
    if all([checks["toolMetadata"], checks["componentPurls"],
            checks["componentHashes"], checks["scanTimestamp"]]):
        return FallbackDecision(
            level=2, confidence=0.7, checks=checks,
            warnings=["provenance_unknown"]
        )

    # Level 3: scan-only results
    if checks["scanTimestamp"] and checks["hasFindings"]:
        return FallbackDecision(
            level=3, confidence=0.5, checks=checks,
            warnings=["degraded_confidence", "no_sbom"]
        )

    # Level 4: reject
    return FallbackDecision(
        level=4, confidence=0.0, checks=checks,
        action="reject", error_code="E2010"
    )
```

## Decision Trace

### Trace Format

```json
{
  "trace": {
    "id": "trace-12345",
    "timestamp": "2025-12-04T12:00:00Z",
    "input": {
      "hash": "b3:...",
      "size": 12345,
      "format": "cyclonedx-1.6"
    },
    "evaluation": {
      "steps": [
        {
          "check": "signaturePresent",
          "result": false,
          "details": "No DSSE/COSE/JWS envelope found"
        },
        {
          "check": "toolMetadata",
          "result": true,
          "details": "Found tool: syft v1.0.0"
        },
        {
          "check": "componentPurls",
          "result": true,
          "details": "42 components with valid PURLs"
        },
        {
          "check": "componentHashes",
          "result": true,
          "details": "42 components with SHA-256 hashes"
        },
        {
          "check": "scanTimestamp",
          "result": true,
          "details": "Timestamp: 2025-12-04T00:00:00Z"
        }
      ],
      "decision": {
        "level": 2,
        "confidence": 0.7,
        "warnings": ["provenance_unknown"]
      }
    }
  }
}
```

### Trace Persistence

Decision traces are:

- Stored with the normalized output
- Included in API responses
- Available for audit queries
- Deterministic (same input = same trace)
## Policy Integration

### Confidence Thresholds

```json
{
  "policy": {
    "minConfidence": {
      "production": 0.8,
      "staging": 0.5,
      "development": 0.0
    },
    "allowedLevels": {
      "production": [1],
      "staging": [1, 2],
      "development": [1, 2, 3]
    }
  }
}
```

### Policy Evaluation

```rego
# policy/ingest/fallback.rego
package ingest.fallback

import rego.v1

default allow := false

allow if {
    allowed_levels := data.policy.allowedLevels[input.environment]
    input.fallback.level in allowed_levels
    input.fallback.confidence >= data.policy.minConfidence[input.environment]
}

deny contains msg if {
    allowed_levels := data.policy.allowedLevels[input.environment]
    not input.fallback.level in allowed_levels
    msg := sprintf("Fallback level %d not allowed in %s", [input.fallback.level, input.environment])
}

warn contains msg if {
    some warning in input.fallback.decision.warnings
    msg := sprintf("Fallback warning: %s", [warning])
}
```

## Override Mechanism

### Manual Override

```bash
# Accept unsigned SBOM in production (requires approval)
stellaops ingest import \
  --input external-sbom.json \
  --allow-unsigned \
  --override-reason "Emergency import per ticket INC-12345" \
  --override-approver security-admin@example.com
```

### Override Record

```json
{
  "override": {
    "enabled": true,
    "level": 2,
    "originalDecision": {
      "level": 4,
      "reason": "Would normally reject"
    },
    "overrideReason": "Emergency import per ticket INC-12345",
    "approver": "security-admin@example.com",
    "approvedAt": "2025-12-04T12:00:00Z",
    "expiresAt": "2025-12-05T12:00:00Z"
  }
}
```
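
A gate consuming this record needs to check both the `enabled` flag and the expiry window. A minimal sketch (the function name `override_active` is illustrative, not part of the product API; timestamps are assumed to be RFC 3339 with a trailing `Z`):

```python
from datetime import datetime, timezone

def override_active(record: dict, now: datetime) -> bool:
    """Return True when an override record is enabled and unexpired.
    Field names follow the override record shown above."""
    ov = record["override"]
    if not ov.get("enabled"):
        return False
    expires = datetime.fromisoformat(ov["expiresAt"].replace("Z", "+00:00"))
    return now < expires

record = {"override": {"enabled": True,
                       "approver": "security-admin@example.com",
                       "approvedAt": "2025-12-04T12:00:00Z",
                       "expiresAt": "2025-12-05T12:00:00Z"}}
print(override_active(record, datetime(2025, 12, 4, 18, 0, tzinfo=timezone.utc)))  # True
print(override_active(record, datetime(2025, 12, 6, 0, 0, tzinfo=timezone.utc)))   # False
```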

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (CM6)
- Verification: `docs/modules/scanner/design/competitor-signature-verification.md` (CM2)
- Normalization: `docs/modules/scanner/design/competitor-ingest-normalization.md` (CM1)

353
docs/modules/scanner/design/competitor-offline-ingest-kit.md
Normal file
@@ -0,0 +1,353 @@

# Competitor Offline Ingest Kit (CM5)

Status: Draft · Date: 2025-12-04
Scope: Define offline ingest kit contents including DSSE-signed adapters, mappings, fixtures, and trust roots for air-gapped environments.

## Objectives

- Bundle all competitor ingest artifacts for offline use.
- Sign the kit with DSSE for integrity verification.
- Include trust roots for signature verification.
- Enable the complete ingest workflow without network access.

## Bundle Structure

```
out/offline/competitor-ingest-kit-v1/
├── manifest.json              # Bundle manifest with all hashes
├── manifest.dsse              # DSSE signature over manifest
├── adapters/
│   ├── syft/
│   │   ├── mapping.csv        # Field mappings
│   │   ├── version-map.json   # Version compatibility
│   │   └── schema.json        # Expected input schema
│   ├── trivy/
│   │   ├── mapping.csv
│   │   ├── version-map.json
│   │   └── schema.json
│   └── clair/
│       ├── mapping.csv
│       ├── version-map.json
│       └── schema.json
├── fixtures/
│   ├── normalized-syft.json
│   ├── normalized-trivy.json
│   ├── normalized-clair.json
│   └── hashes.txt
├── trust/
│   ├── root-ca.pem
│   ├── keyring.json
│   ├── revocation.json
│   └── cosign-keys/
│       ├── syft-release.pub
│       ├── trivy-release.pub
│       └── clair-release.pub
├── coverage/
│   └── coverage.csv
├── policies/
│   ├── signature-policy.json
│   ├── fallback-policy.json
│   └── retry-policy.json
└── tools/
    ├── versions.json
    └── checksums.txt
```

## Manifest Format

```json
{
  "version": "1.0.0",
  "created": "2025-12-04T00:00:00Z",
  "creator": "stellaops-scanner",
  "type": "competitor-ingest-kit",
  "artifacts": [
    {
      "path": "adapters/syft/mapping.csv",
      "type": "adapter",
      "tool": "syft",
      "blake3": "...",
      "sha256": "..."
    },
    {
      "path": "fixtures/normalized-syft.json",
      "type": "fixture",
      "tool": "syft",
      "blake3": "aa42c167d19535709a10df73dc39e6a50b8efbbb0ae596d17183ce62676fa85a",
      "sha256": "3f8684ff341808dcb92e97dd2c10acca727baaff05182e81a4364bb3dad0eaa7"
    },
    {
      "path": "trust/keyring.json",
      "type": "trust",
      "blake3": "...",
      "sha256": "..."
    }
  ],
  "supportedTools": {
    "syft": {
      "minVersion": "1.0.0",
      "maxVersion": "1.99.99",
      "tested": ["1.0.0", "1.5.0", "1.10.0"]
    },
    "trivy": {
      "minVersion": "0.50.0",
      "maxVersion": "0.59.99",
      "tested": ["0.50.0", "0.55.0"]
    },
    "clair": {
      "minVersion": "6.0.0",
      "maxVersion": "6.99.99",
      "tested": ["6.0.0", "6.1.0"]
    }
  },
  "manifestHash": {
    "blake3": "...",
    "sha256": "..."
  }
}
```

## Adapter Contents

### Mapping CSV Format

```csv
source_field,target_field,rule,required,notes
artifacts[].name,components[].name,copy,yes,Component name
artifacts[].version,components[].version,copy,yes,Component version
artifacts[].purl,components[].purl,copy,yes,Package URL
artifacts[].type,components[].type,map:package->library,yes,Type mapping
artifacts[].licenses[],components[].licenses[],flatten,no,License list
artifacts[].metadata.digest,components[].hashes[],transform:sha256,no,Hash extraction
```
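
The `rule` column drives how each source value is rewritten. A minimal sketch of a rule interpreter, assuming only the `copy` and `map:<from>-><to>` rules from the table above (`flatten` and `transform:` are tool-specific and omitted; `apply_rule` is an illustrative name, not a shipped API):

```python
def apply_rule(rule: str, value):
    """Apply one mapping rule from the adapter CSV to a field value."""
    if rule == "copy":
        return value  # pass the source value through unchanged
    if rule.startswith("map:"):
        # 'map:package->library' rewrites a matching value, else passes through
        src, dst = rule[len("map:"):].split("->", 1)
        return dst if value == src else value
    raise NotImplementedError(f"rule not sketched: {rule}")

print(apply_rule("copy", "1.2.3"))                    # 1.2.3
print(apply_rule("map:package->library", "package"))  # library
```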

### Version Map Format

```json
{
  "tool": "syft",
  "versionRanges": [
    {
      "range": ">=1.0.0 <1.5.0",
      "schemaVersion": "v1",
      "mappingFile": "mapping-v1.csv"
    },
    {
      "range": ">=1.5.0 <2.0.0",
      "schemaVersion": "v2",
      "mappingFile": "mapping-v2.csv",
      "breaking": ["artifacts.metadata renamed to artifacts.meta"]
    }
  ]
}
```
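
Selecting the mapping file reduces to matching the detected tool version against each range in order. A sketch under the assumption that versions are plain dotted integers and ranges use space-separated `>=`/`<` style comparators as shown above (helper names are illustrative):

```python
def parse_ver(v: str) -> tuple:
    return tuple(int(p) for p in v.split("."))

def in_range(version: str, rng: str) -> bool:
    """Check a version against a range like '>=1.0.0 <2.0.0'."""
    v = parse_ver(version)
    for clause in rng.split():
        for op in (">=", "<=", ">", "<"):
            if clause.startswith(op):
                bound = parse_ver(clause[len(op):])
                ok = {"<": v < bound, "<=": v <= bound,
                      ">": v > bound, ">=": v >= bound}[op]
                if not ok:
                    return False
                break
    return True

def select_mapping(version: str, version_map: dict) -> str:
    """Return the first mapping file whose range covers the version."""
    for entry in version_map["versionRanges"]:
        if in_range(version, entry["range"]):
            return entry["mappingFile"]
    raise LookupError(f"no mapping for {version}")

vmap = {"versionRanges": [
    {"range": ">=1.0.0 <1.5.0", "mappingFile": "mapping-v1.csv"},
    {"range": ">=1.5.0 <2.0.0", "mappingFile": "mapping-v2.csv"}]}
print(select_mapping("1.5.0", vmap))  # mapping-v2.csv
```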

## Policy Files

### Signature Policy (CM2)

```json
{
  "policy": {
    "requireSignature": true,
    "allowUnsigned": false,
    "acceptedFormats": ["dsse", "cose", "jws"],
    "acceptedAlgorithms": ["ed25519", "ecdsa-p256", "rsa-2048"],
    "trustedIssuers": [
      "https://github.com/anchore/syft",
      "https://github.com/aquasecurity/trivy",
      "https://github.com/quay/clair"
    ],
    "rejectReasons": [
      "sig_invalid",
      "sig_expired",
      "key_unknown",
      "key_expired",
      "key_revoked",
      "alg_unsupported"
    ]
  }
}
```

### Fallback Policy (CM6)

```json
{
  "fallback": {
    "hierarchy": [
      {
        "level": 1,
        "source": "signed_sbom",
        "confidence": 1.0,
        "requirements": ["valid_signature", "valid_provenance"]
      },
      {
        "level": 2,
        "source": "unsigned_sbom",
        "confidence": 0.7,
        "requirements": ["tool_metadata", "component_purl", "scan_timestamp"],
        "warnings": ["provenance_unknown"]
      },
      {
        "level": 3,
        "source": "scan_only",
        "confidence": 0.5,
        "requirements": ["scan_timestamp"],
        "warnings": ["degraded_confidence", "no_sbom"]
      },
      {
        "level": 4,
        "source": "reject",
        "confidence": 0.0,
        "requirements": [],
        "action": "reject",
        "reason": "no_evidence"
      }
    ]
  }
}
```
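
The hierarchy is evaluated top-down: the first level whose requirements are all satisfied by the available evidence wins. A minimal sketch of that walk (function and constant names are illustrative):

```python
HIERARCHY = [
    {"level": 1, "source": "signed_sbom", "confidence": 1.0,
     "requirements": {"valid_signature", "valid_provenance"}},
    {"level": 2, "source": "unsigned_sbom", "confidence": 0.7,
     "requirements": {"tool_metadata", "component_purl", "scan_timestamp"}},
    {"level": 3, "source": "scan_only", "confidence": 0.5,
     "requirements": {"scan_timestamp"}},
    {"level": 4, "source": "reject", "confidence": 0.0, "requirements": set()},
]

def decide(evidence: set) -> dict:
    """Return the first hierarchy level whose requirements are a subset
    of the available evidence; level 4 (reject) always matches."""
    for entry in HIERARCHY:
        if entry["requirements"] <= evidence:
            return entry
    return HIERARCHY[-1]

print(decide({"tool_metadata", "component_purl", "scan_timestamp"})["level"])  # 2
print(decide(set())["level"])                                                  # 4
```

Because the list is ordered by decreasing confidence, a fully signed SBOM never degrades to a lower level even though its evidence also satisfies levels 2 and 3.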

### Retry Policy (CM10)

```json
{
  "retry": {
    "retryable": [
      {"code": "network_error", "maxRetries": 3, "backoff": "exponential"},
      {"code": "rate_limit", "maxRetries": 5, "backoff": "linear"},
      {"code": "transient_io", "maxRetries": 2, "backoff": "fixed"}
    ],
    "nonRetryable": [
      "signature_invalid",
      "schema_invalid",
      "unsupported_version",
      "no_evidence"
    ],
    "backoffConfig": {
      "exponential": {"base": 1000, "factor": 2, "max": 60000},
      "linear": {"initial": 1000, "increment": 1000, "max": 30000},
      "fixed": {"delay": 5000}
    }
  }
}
```
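
The three backoff strategies map to simple delay formulas over the `backoffConfig` block above (all values in milliseconds; `backoff_ms` is an illustrative name, and attempts are assumed 1-based):

```python
def backoff_ms(strategy: str, attempt: int, cfg: dict) -> int:
    """Delay before retry number `attempt`, capped at the strategy's max."""
    c = cfg[strategy]
    if strategy == "exponential":
        return min(c["base"] * c["factor"] ** (attempt - 1), c["max"])
    if strategy == "linear":
        return min(c["initial"] + c["increment"] * (attempt - 1), c["max"])
    return c["delay"]  # fixed

CFG = {"exponential": {"base": 1000, "factor": 2, "max": 60000},
       "linear": {"initial": 1000, "increment": 1000, "max": 30000},
       "fixed": {"delay": 5000}}
print([backoff_ms("exponential", n, CFG) for n in (1, 2, 3, 7)])  # [1000, 2000, 4000, 60000]
```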

## Kit Generation

### Build Script

```bash
#!/bin/bash
# scripts/scanner/build-competitor-ingest-kit.sh

set -euo pipefail

KIT_DIR="out/offline/competitor-ingest-kit-v1"
rm -rf "${KIT_DIR}"
mkdir -p "${KIT_DIR}"

# Copy adapters
for tool in syft trivy clair; do
  mkdir -p "${KIT_DIR}/adapters/${tool}"
  cp "src/Scanner/Adapters/${tool}/"*.csv "${KIT_DIR}/adapters/${tool}/"
  cp "src/Scanner/Adapters/${tool}/"*.json "${KIT_DIR}/adapters/${tool}/"
done

# Copy fixtures
mkdir -p "${KIT_DIR}/fixtures"
cp docs/modules/scanner/fixtures/competitor-adapters/fixtures/*.json "${KIT_DIR}/fixtures/"
cp docs/modules/scanner/fixtures/competitor-adapters/fixtures/hashes.txt "${KIT_DIR}/fixtures/"

# Copy trust roots
mkdir -p "${KIT_DIR}/trust/cosign-keys"
cp trust/root-ca.pem "${KIT_DIR}/trust/"
cp trust/keyring.json "${KIT_DIR}/trust/"
cp trust/revocation.json "${KIT_DIR}/trust/"
cp trust/cosign-keys/*.pub "${KIT_DIR}/trust/cosign-keys/"

# Copy coverage
mkdir -p "${KIT_DIR}/coverage"
cp docs/modules/scanner/fixtures/competitor-adapters/coverage.csv "${KIT_DIR}/coverage/"

# Copy policies
mkdir -p "${KIT_DIR}/policies"
cp policies/competitor/*.json "${KIT_DIR}/policies/"

# Copy tool versions
mkdir -p "${KIT_DIR}/tools"
cp tools/versions.json "${KIT_DIR}/tools/"
cp tools/checksums.txt "${KIT_DIR}/tools/"

# Generate manifest
scripts/scanner/generate-competitor-manifest.sh "${KIT_DIR}"

# Sign manifest
scripts/scanner/sign-manifest.sh "${KIT_DIR}/manifest.json" "${KIT_DIR}/manifest.dsse"

echo "Kit built: ${KIT_DIR}"
```

## Verification

### Kit Verification Script

```bash
#!/bin/bash
# scripts/scanner/verify-competitor-ingest-kit.sh

set -euo pipefail

KIT_DIR="${1:-out/offline/competitor-ingest-kit-v1}"

# Verify DSSE signature
stellaops-verify dsse \
  --envelope "${KIT_DIR}/manifest.dsse" \
  --trust-root "${KIT_DIR}/trust/root-ca.pem" \
  --expected-payload-type "application/vnd.stellaops.competitor-ingest.manifest+json"

# Extract and verify artifacts
MANIFEST=$(stellaops-verify dsse --envelope "${KIT_DIR}/manifest.dsse" --extract-payload)

for artifact in $(echo "${MANIFEST}" | jq -r '.artifacts[] | @base64'); do
  path=$(echo "${artifact}" | base64 -d | jq -r '.path')
  expected_blake3=$(echo "${artifact}" | base64 -d | jq -r '.blake3')

  actual_blake3=$(b3sum "${KIT_DIR}/${path}" | cut -d' ' -f1)

  if [[ "${actual_blake3}" != "${expected_blake3}" ]]; then
    echo "FAIL: ${path}"
    exit 1
  fi
  echo "PASS: ${path}"
done

echo "All artifacts verified"
```

## Usage

### Offline Ingest Workflow

```bash
# Initialize from kit
stellaops ingest init --kit out/offline/competitor-ingest-kit-v1

# Import SBOM with offline verification
stellaops ingest import \
  --input external-sbom.json \
  --tool syft \
  --offline \
  --trust-root out/offline/competitor-ingest-kit-v1/trust/keyring.json

# Validate against fixtures
stellaops ingest validate \
  --fixtures out/offline/competitor-ingest-kit-v1/fixtures
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (CM5)
- Normalization: `docs/modules/scanner/design/competitor-ingest-normalization.md` (CM1)
- Verification: `docs/modules/scanner/design/competitor-signature-verification.md` (CM2)

255
docs/modules/scanner/design/competitor-signature-verification.md
Normal file
@@ -0,0 +1,255 @@

# Competitor SBOM/Scan Signature Verification (CM2)

Status: Draft · Date: 2025-12-04
Scope: Specify signature and provenance verification requirements for accepting external SBOM and scan outputs, including rejection/flag policies.

## Objectives

- Define acceptable signature algorithms and formats.
- Establish trust root management for external signers.
- Specify verification workflow and failure modes.
- Enable offline verification with bundled trust roots.

## Acceptable Signatures

### Signature Formats

| Format | Algorithm | Key Type | Status |
|--------|-----------|----------|--------|
| DSSE | Ed25519 | Asymmetric | Preferred |
| DSSE | ECDSA P-256 | Asymmetric | Accepted |
| DSSE | RSA-2048+ | Asymmetric | Accepted |
| COSE | EdDSA | Asymmetric | Accepted |
| JWS | ES256 | Asymmetric | Accepted |
| JWS | RS256 | Asymmetric | Deprecated |

### Hash Algorithms

| Algorithm | Usage | Status |
|-----------|-------|--------|
| SHA-256 | Primary | Required |
| BLAKE3-256 | Secondary | Preferred |
| SHA-384 | Alternative | Accepted |
| SHA-512 | Alternative | Accepted |
| SHA-1 | Legacy | Rejected |
| MD5 | Legacy | Rejected |
## Trust Root Management

### Bundled Trust Roots

```
out/offline/competitor-ingest-kit-v1/trust/
├── root-ca.pem          # CA for signed SBOMs
├── keyring.json         # Known signing keys
├── cosign-keys/         # Cosign public keys
│   ├── syft-release.pub
│   ├── trivy-release.pub
│   └── clair-release.pub
└── fulcio-root.pem      # Sigstore Fulcio CA
```

### Keyring Format

```json
{
  "keys": [
    {
      "id": "syft-release-2025",
      "type": "ecdsa-p256",
      "publicKey": "-----BEGIN PUBLIC KEY-----\n...\n-----END PUBLIC KEY-----",
      "issuer": "https://github.com/anchore/syft",
      "validFrom": "2025-01-01T00:00:00Z",
      "validTo": "2026-01-01T00:00:00Z",
      "purposes": ["sbom-signing", "attestation-signing"]
    }
  ],
  "trustedIssuers": [
    "https://github.com/anchore/syft",
    "https://github.com/aquasecurity/trivy",
    "https://github.com/quay/clair"
  ]
}
```
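
The key-lookup step of the verification workflow checks a keyring entry's validity window and purpose before any cryptography runs. A minimal sketch of that check (the `key_usable` name is illustrative; timestamps are assumed RFC 3339 with a trailing `Z`):

```python
from datetime import datetime, timezone

def key_usable(key: dict, purpose: str, now: datetime) -> bool:
    """True when `now` falls inside the key's validity window and the
    requested purpose is listed for the key."""
    parse = lambda s: datetime.fromisoformat(s.replace("Z", "+00:00"))
    return (parse(key["validFrom"]) <= now < parse(key["validTo"])
            and purpose in key["purposes"])

key = {"id": "syft-release-2025",
       "validFrom": "2025-01-01T00:00:00Z",
       "validTo": "2026-01-01T00:00:00Z",
       "purposes": ["sbom-signing", "attestation-signing"]}
now = datetime(2025, 12, 4, tzinfo=timezone.utc)
print(key_usable(key, "sbom-signing", now))  # True
print(key_usable(key, "kit-signing", now))   # False
```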

## Verification Workflow

```
┌─────────────┐
│   Receive   │
│    SBOM     │
└─────────────┘
       │
       ▼
┌─────────────┐      ┌─────────────┐
│  Has DSSE?  │──No─►│  Has JWS?   │──No──► Unsigned
└─────────────┘      └─────────────┘           │
       │                    │                  │
      Yes                  Yes                 │
       │                    │                  │
       ▼                    ▼                  ▼
┌─────────────┐      ┌─────────────┐    ┌─────────────┐
│ Verify DSSE │      │ Verify JWS  │    │  Apply CM6  │
│  Signature  │      │  Signature  │    │  Fallback   │
└─────────────┘      └─────────────┘    └─────────────┘
       │                    │                  │
       ▼                    ▼                  ▼
┌─────────────┐      ┌─────────────┐    ┌─────────────┐
│   Valid?    │      │   Valid?    │    │ Provenance: │
└─────────────┘      └─────────────┘    │   unknown   │
   │       │            │       │       └─────────────┘
  Yes      No          Yes      No
   │       │            │       │
   ▼       ▼            ▼       ▼
 Accept  Reject       Accept  Reject
```

### Verification Steps

1. **Format Detection**
   - Check for DSSE envelope wrapper
   - Check for detached signature file (`.sig`)
   - Check for inline JWS header

2. **Signature Extraction**
   - Parse envelope/signature structure
   - Extract signer key ID and algorithm

3. **Key Lookup**
   - Search bundled keyring for key ID
   - Verify key is within validity period
   - Check key purpose matches usage

4. **Cryptographic Verification**
   - Verify signature over payload
   - Verify hash matches content
   - Check for signature expiry

5. **Provenance Validation**
   - Extract signer identity
   - Verify issuer is trusted
   - Check build metadata if present
## Failure Modes

### Rejection Reasons

| Code | Reason | Action |
|------|--------|--------|
| `sig_missing` | No signature present | Apply fallback (CM6) |
| `sig_invalid` | Signature verification failed | Reject |
| `sig_expired` | Signature validity period exceeded | Reject |
| `key_unknown` | Signing key not in keyring | Reject |
| `key_expired` | Signing key validity exceeded | Reject |
| `key_revoked` | Signing key has been revoked | Reject |
| `issuer_untrusted` | Issuer not in trusted list | Reject |
| `alg_unsupported` | Algorithm not acceptable | Reject |
| `hash_mismatch` | Content hash doesn't match | Reject |
### Flag Policy

When `--allow-unsigned` is set:

| Condition | Behavior |
|-----------|----------|
| Signature missing | Accept with `provenance=unknown`, emit warning |
| Signature invalid | Reject (flag doesn't override invalid) |
| Key unknown | Accept with `provenance=unverified`, emit warning |
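
The table above can be encoded as a small decision function. A sketch covering only these three conditions (names and return shape are illustrative, not a shipped API; an invalid signature is rejected even with the flag set):

```python
def unsigned_policy(condition: str, allow_unsigned: bool) -> tuple:
    """Return (action, provenance) for the --allow-unsigned flag policy."""
    if condition == "sig_invalid":
        return ("reject", None)          # flag never overrides an invalid signature
    if not allow_unsigned:
        return ("reject", None)
    if condition == "sig_missing":
        return ("accept", "unknown")     # accepted, but flagged
    if condition == "key_unknown":
        return ("accept", "unverified")  # accepted, but flagged
    return ("reject", None)

print(unsigned_policy("sig_missing", True))  # ('accept', 'unknown')
print(unsigned_policy("sig_invalid", True))  # ('reject', None)
```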

## Verification API

### Endpoint

```http
POST /api/v1/ingest/verify
Content-Type: application/json

{
  "sbom": "<base64-encoded-sbom>",
  "signature": "<base64-encoded-signature>",
  "options": {
    "allowUnsigned": false,
    "requireProvenance": true
  }
}
```

### Response

```json
{
  "verification": {
    "status": "valid",
    "signature": {
      "format": "dsse",
      "algorithm": "ecdsa-p256",
      "keyId": "syft-release-2025",
      "signedAt": "2025-12-04T00:00:00Z"
    },
    "provenance": {
      "issuer": "https://github.com/anchore/syft",
      "buildId": "build-12345",
      "sourceRepo": "https://github.com/example/app"
    },
    "hash": {
      "algorithm": "sha256",
      "value": "..."
    }
  }
}
```

## Offline Verification

### Requirements

- All trust roots bundled in offline kit
- No network calls during verification
- Keyring includes all expected signers
- CRL/OCSP checks disabled (use bundled revocation lists)

### Revocation List Format

```json
{
  "revoked": [
    {
      "keyId": "compromised-key-2024",
      "revokedAt": "2024-12-01T00:00:00Z",
      "reason": "Key compromise"
    }
  ],
  "lastUpdated": "2025-12-04T00:00:00Z"
}
```
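
In offline mode the revocation check is a plain lookup against this bundled list, replacing CRL/OCSP calls. A minimal sketch (`is_revoked` is an illustrative name):

```python
def is_revoked(key_id: str, revocation_list: dict) -> bool:
    """Look up a key ID in the bundled revocation list; no network calls."""
    return any(entry["keyId"] == key_id for entry in revocation_list["revoked"])

rl = {"revoked": [{"keyId": "compromised-key-2024",
                   "revokedAt": "2024-12-01T00:00:00Z",
                   "reason": "Key compromise"}],
      "lastUpdated": "2025-12-04T00:00:00Z"}
print(is_revoked("compromised-key-2024", rl))  # True
print(is_revoked("syft-release-2025", rl))     # False
```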

## Integration with Normalization (CM1)

After successful verification:

1. Extract tool metadata from signature/provenance
2. Pass to normalization adapter
3. Include verification result in normalized output

```json
{
  "source": {
    "tool": "syft",
    "version": "1.0.0",
    "hash": "sha256:..."
  },
  "verification": {
    "status": "verified",
    "keyId": "syft-release-2025",
    "signedAt": "2025-12-04T00:00:00Z"
  },
  "components": [...],
  "normalized_hash": "blake3:..."
}
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (CM2)
- Normalization: `docs/modules/scanner/design/competitor-ingest-normalization.md` (CM1)
- Fallback: See CM6 in this document series

263
docs/modules/scanner/design/determinism-ci-harness.md
Normal file
@@ -0,0 +1,263 @@

# Determinism CI Harness for New Formats (SC5)

Status: Draft · Date: 2025-12-04
Scope: Define the determinism CI harness for validating stable ordering, hash checks, golden fixtures, and RNG seeds for CVSS v4, CycloneDX 1.7/CBOM, and SLSA 1.2 outputs.

## Objectives

- Ensure Scanner outputs are reproducible across builds, platforms, and time.
- Validate that serialized SBOM/VEX/attestation outputs have deterministic ordering.
- Anchor CI validation to golden fixtures with pre-computed hashes.
- Enable offline verification without network dependencies.

## CI Pipeline Integration

### Environment Setup

```yaml
# .gitea/workflows/scanner-determinism.yml additions
env:
  DOTNET_DISABLE_BUILTIN_GRAPH: "1"
  TZ: "UTC"
  LC_ALL: "C"
  STELLAOPS_DETERMINISM_SEED: "42"
  STELLAOPS_DETERMINISM_TIMESTAMP: "2025-01-01T00:00:00Z"
```

### Required Environment Variables

| Variable | Purpose | Default |
|----------|---------|---------|
| `TZ` | Force UTC timezone | `UTC` |
| `LC_ALL` | Force locale-invariant sorting | `C` |
| `STELLAOPS_DETERMINISM_SEED` | Fixed RNG seed for reproducibility | `42` |
| `STELLAOPS_DETERMINISM_TIMESTAMP` | Fixed timestamp for output | `2025-01-01T00:00:00Z` |
| `DOTNET_DISABLE_BUILTIN_GRAPH` | Disable non-deterministic graph features | `1` |

## Hash Validation Steps

### 1. Golden Fixture Verification

```bash
#!/bin/bash
# scripts/scanner/verify-determinism.sh

set -euo pipefail

FIXTURE_DIR="docs/modules/scanner/fixtures/cdx17-cbom"
HASH_FILE="${FIXTURE_DIR}/hashes.txt"

verify_fixture() {
  local file="$1"
  local expected_blake3="$2"
  local expected_sha256="$3"

  actual_blake3=$(b3sum "${file}" | cut -d' ' -f1)
  actual_sha256=$(sha256sum "${file}" | cut -d' ' -f1)

  if [[ "${actual_blake3}" != "${expected_blake3}" ]]; then
    echo "FAIL: ${file} BLAKE3 mismatch"
    echo "  expected: ${expected_blake3}"
    echo "  actual:   ${actual_blake3}"
    return 1
  fi

  if [[ "${actual_sha256}" != "${expected_sha256}" ]]; then
    echo "FAIL: ${file} SHA256 mismatch"
    echo "  expected: ${expected_sha256}"
    echo "  actual:   ${actual_sha256}"
    return 1
  fi

  echo "PASS: ${file}"
  return 0
}

# Parse hashes.txt and verify each fixture
while IFS=': ' read -r filename hashes; do
  blake3=$(echo "${hashes}" | grep -oP 'BLAKE3=\K[a-f0-9]+')
  sha256=$(echo "${hashes}" | grep -oP 'SHA256=\K[a-f0-9]+')
  verify_fixture "${FIXTURE_DIR}/${filename}" "${blake3}" "${sha256}"
done < <(grep -v '^#' "${HASH_FILE}")
```

### 2. Deterministic Serialization Test

```csharp
// src/Scanner/__Tests/StellaOps.Scanner.Determinism.Tests/CdxDeterminismTests.cs
[Fact]
public async Task Cdx17_Serialization_Is_Deterministic()
{
    // Arrange
    var options = new DeterminismOptions
    {
        Seed = 42,
        Timestamp = new DateTimeOffset(2025, 1, 1, 0, 0, 0, TimeSpan.Zero),
        CultureInvariant = true
    };

    var sbom = CreateTestSbom();

    // Act - serialize twice
    var json1 = await _serializer.SerializeAsync(sbom, options);
    var json2 = await _serializer.SerializeAsync(sbom, options);

    // Assert - must be identical
    Assert.Equal(json1, json2);

    // Compute and verify hash
    var hash = Blake3.HashData(Encoding.UTF8.GetBytes(json1));
    Assert.Equal(ExpectedHash, Convert.ToHexString(hash).ToLowerInvariant());
}
```

### 3. Downgrade Adapter Verification

```csharp
[Fact]
public async Task Cdx17_To_Cdx16_Downgrade_Is_Deterministic()
{
    // Arrange
    var cdx17 = await LoadFixture("sample-cdx17-cbom.json");

    // Act - downgrade twice
    var cdx16_1 = await _adapter.Downgrade(cdx17);
    var cdx16_2 = await _adapter.Downgrade(cdx17);

    // Assert - serialized outputs must be identical
    var json1 = await _serializer.SerializeAsync(cdx16_1);
    var json2 = await _serializer.SerializeAsync(cdx16_2);
    Assert.Equal(json1, json2);

    // Verify matches golden fixture hash
    var hash = Blake3.HashData(Encoding.UTF8.GetBytes(json1));
    var expectedHash = LoadExpectedHash("sample-cdx16.json");
    Assert.Equal(expectedHash, Convert.ToHexString(hash).ToLowerInvariant());
}
```

## Ordering Rules

### Components (CycloneDX)

1. Sort by `purl` (case-insensitive, locale-invariant)
2. Ties: sort by `name` (case-insensitive)
3. Ties: sort by `version` (semantic version comparison)

### Vulnerabilities

1. Sort by `id` (lexicographic)
2. Ties: sort by `source.name` (lexicographic)
3. Ties: sort by highest severity rating score (descending)

### Properties

1. Sort by `name` (lexicographic, locale-invariant)

### Hashes

1. Sort by `alg` (BLAKE3-256, SHA-256, SHA-512 order)

### Ratings (CVSS)

1. CVSSv4 first
2. CVSSv31 second
3. CVSSv30 third
4. Others alphabetically by method
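
The component rules translate directly into a composite sort key. A minimal sketch, with the caveat that the version field is split into numeric parts here as a stand-in for full semantic-version comparison (non-numeric parts collapse to 0 in this illustration):

```python
def component_sort_key(c: dict):
    """Deterministic ordering key: purl, then name (both lowercased for
    case-insensitive, locale-invariant comparison), then version parts."""
    ver = tuple(int(p) if p.isdigit() else 0
                for p in c.get("version", "").split("."))
    return (c.get("purl", "").lower(), c.get("name", "").lower(), ver)

components = [
    {"purl": "pkg:npm/b", "name": "b", "version": "1.0.0"},
    {"purl": "pkg:npm/A", "name": "A", "version": "2.0.0"},
    {"purl": "pkg:npm/b", "name": "b", "version": "0.9.0"},
]
ordered = sorted(components, key=component_sort_key)
print([c["purl"] + "@" + c["version"] for c in ordered])
# ['pkg:npm/A@2.0.0', 'pkg:npm/b@0.9.0', 'pkg:npm/b@1.0.0']
```

Because the key ignores insertion order entirely, serializing the same component set always yields the same byte stream, which is what the golden-fixture hashes pin down.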

## Fixture Requirements (SC8 Cross-Reference)

Each golden fixture must include:

| Format | Fixture File | Contents |
|--------|--------------|----------|
| CDX 1.7 + CBOM | `sample-cdx17-cbom.json` | Full SBOM with CVSS v4/v3.1, CBOM properties, SLSA Source Track, evidence |
| CDX 1.6 (downgraded) | `sample-cdx16.json` | Downgraded version with CVSS v4 removed, CBOM dropped, audit markers |
| SLSA Source Track | `source-track.sample.json` | Standalone source provenance block |

## CI Workflow Steps

```yaml
# Add to .gitea/workflows/scanner-determinism.yml
jobs:
  determinism-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '10.0.x'

      - name: Set determinism environment
        run: |
          echo "TZ=UTC" >> $GITHUB_ENV
          echo "LC_ALL=C" >> $GITHUB_ENV
          echo "DOTNET_DISABLE_BUILTIN_GRAPH=1" >> $GITHUB_ENV
          echo "STELLAOPS_DETERMINISM_SEED=42" >> $GITHUB_ENV

      - name: Verify golden fixtures
        run: scripts/scanner/verify-determinism.sh

      - name: Run determinism tests
        run: |
          dotnet test src/Scanner/__Tests/StellaOps.Scanner.Determinism.Tests \
            --configuration Release \
            --verbosity normal

      - name: Run adapter determinism tests
        run: |
          dotnet test src/Scanner/__Tests/StellaOps.Scanner.Adapters.Tests \
            --filter "Category=Determinism" \
            --configuration Release
```

## Failure Handling

### Hash Mismatch Protocol

1. **Do not auto-update hashes** - manual review required
2. Log diff between expected and actual output
3. Capture both BLAKE3 and SHA256 for audit trail
4. Block merge until resolved

### Acceptable Reasons for Hash Update

- Schema version bump (documented in change log)
- Intentional ordering rule change (documented in adapter CSV)
- Bug fix that corrects previously non-deterministic output
- Never: cosmetic changes, timestamp updates, random salts

## Offline Verification

The harness must work completely offline:

- No network calls during serialization
- No external schema validation endpoints
- Trust roots and schemas bundled in repository
- All RNG seeded from environment variable

## Integration with SC8 Fixtures

The fixtures defined in SC8 serve as golden sources for this harness:

```
docs/modules/scanner/fixtures/
├── cdx17-cbom/
│   ├── sample-cdx17-cbom.json      # CVSS v4 + v3.1, CBOM, evidence
│   ├── sample-cdx16.json           # Downgraded, CVSS v3.1 only
│   ├── source-track.sample.json    # SLSA Source Track
│   └── hashes.txt                  # BLAKE3 + SHA256 for all fixtures
├── adapters/
│   ├── mapping-cvss4-to-cvss3.csv
│   ├── mapping-cdx17-to-cdx16.csv
│   ├── mapping-slsa12-to-slsa10.csv
│   └── hashes.txt
└── competitor-adapters/
    └── fixtures/
        ├── normalized-syft.json
        ├── normalized-trivy.json
        └── normalized-clair.json
```

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (SC5)
- Roadmap: `docs/modules/scanner/design/standards-convergence-roadmap.md` (SC1)
- Contract: `docs/modules/scanner/design/cdx17-cbom-contract.md` (SC2)

323
docs/modules/scanner/design/offline-kit-parity.md
Normal file
@@ -0,0 +1,323 @@

# Offline Kit Parity for Scanner Standards (SC10)

Status: Draft · Date: 2025-12-04
Scope: Define offline-kit contents, DSSE signing, and parity requirements for schemas, adapters, mappings, and fixtures to enable air-gapped operation.

## Objectives

- Bundle all schema/adapter/fixture artifacts for offline use.
- Sign bundles with DSSE for integrity verification.
- Ensure the offline kit matches online capabilities.
- Document verification procedures for air-gapped environments.

## Bundle Structure

```
out/offline/scanner-standards-kit-v1/
├── manifest.json                    # Bundle manifest with all hashes
├── manifest.dsse                    # DSSE signature over manifest
├── schemas/
│   ├── cyclonedx-1.7.schema.json    # CDX 1.7 JSON schema
│   ├── cyclonedx-1.6.schema.json    # CDX 1.6 JSON schema
│   ├── spdx-3.0.1.schema.json       # SPDX 3.0.1 JSON-LD schema
│   └── slsa-provenance-v1.schema.json
├── adapters/
│   ├── mapping-cvss4-to-cvss3.csv
│   ├── mapping-cdx17-to-cdx16.csv
│   ├── mapping-slsa12-to-slsa10.csv
│   └── hashes.txt
├── fixtures/
│   ├── cdx17-cbom/
│   │   ├── sample-cdx17-cbom.json
│   │   ├── sample-cdx16.json
│   │   ├── source-track.sample.json
│   │   └── hashes.txt
│   └── competitor-adapters/
│       ├── fixtures/
│       │   ├── normalized-syft.json
│       │   ├── normalized-trivy.json
│       │   └── normalized-clair.json
│       └── coverage.csv
├── tools/
│   ├── versions.json                # Pinned tool versions
│   └── checksums.txt                # Tool binary hashes
└── trust/
    ├── root-ca.pem                  # Trust root for signature verification
    └── keyring.json                 # Signing key metadata
```
|
||||
|
||||
## Manifest Format
|
||||
|
||||
```json
{
  "version": "1.0.0",
  "created": "2025-12-04T00:00:00Z",
  "creator": "stellaops-scanner",
  "artifacts": [
    {
      "path": "schemas/cyclonedx-1.7.schema.json",
      "type": "schema",
      "format": "cyclonedx",
      "version": "1.7",
      "blake3": "a1b2c3d4...",
      "sha256": "e5f6a7b8..."
    },
    {
      "path": "adapters/mapping-cvss4-to-cvss3.csv",
      "type": "adapter",
      "source": "cvss4",
      "target": "cvss3.1",
      "blake3": "fa600b26...",
      "sha256": "072b66be..."
    },
    {
      "path": "fixtures/cdx17-cbom/sample-cdx17-cbom.json",
      "type": "fixture",
      "format": "cyclonedx",
      "version": "1.7",
      "blake3": "27c6de0c...",
      "sha256": "22d8f6f8..."
    }
  ],
  "tools": {
    "syft": {
      "version": "1.0.0",
      "blake3": "...",
      "sha256": "..."
    },
    "trivy": {
      "version": "0.50.0",
      "blake3": "...",
      "sha256": "..."
    }
  },
  "manifestHash": {
    "blake3": "...",
    "sha256": "..."
  }
}
```
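
The manifest-generation step (`scripts/scanner/generate-manifest.sh`) is referenced but not shown. As a minimal Python sketch of the core idea: walk the kit, hash every artifact, and sort paths so repeated runs over identical content yield a byte-identical manifest. Function names and the sha256-only hashing are illustrative; the real manifest also records BLAKE3, which is not in the Python standard library.

```python
import hashlib
import os


def build_manifest(kit_dir, version="1.0.0", created="2025-12-04T00:00:00Z"):
    """Walk the kit directory and emit a manifest dict with per-artifact hashes.

    Deterministic by construction: artifact paths are normalized to '/'
    separators and sorted, so identical content always produces an
    identical manifest regardless of filesystem walk order.
    """
    artifacts = []
    for root, _, files in os.walk(kit_dir):
        for name in files:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, kit_dir).replace(os.sep, "/")
            if rel in ("manifest.json", "manifest.dsse"):
                continue  # the manifest never hashes itself or its signature
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            artifacts.append({"path": rel, "sha256": digest})
    artifacts.sort(key=lambda a: a["path"])
    return {
        "version": version,
        "created": created,
        "creator": "stellaops-scanner",
        "artifacts": artifacts,
    }
```

The fixed `created` timestamp is part of the determinism story: embedding wall-clock time would break byte-identical rebuilds.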

## DSSE Signing

### Signature Format

```json
{
  "payloadType": "application/vnd.stellaops.scanner.manifest+json",
  "payload": "<base64-encoded-manifest>",
  "signatures": [
    {
      "keyid": "stellaops-scanner-release-2025",
      "sig": "<base64-signature>"
    }
  ]
}
```
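
A DSSE signature covers the payload through the pre-authentication encoding (PAE), not the raw JSON bytes. The sketch below builds an envelope per the DSSE v1 spec; the "signature" here is a stand-in hash rather than a real key operation, and `make_envelope` is an illustrative name, not StellaOps API.

```python
import base64
import hashlib


def pae(payload_type: str, payload: bytes) -> bytes:
    """DSSEv1 PAE: the byte string that actually gets signed."""
    ptype = payload_type.encode()
    return b"DSSEv1 %d %s %d %s" % (len(ptype), ptype, len(payload), payload)


def make_envelope(payload_type: str, payload: bytes, keyid: str) -> dict:
    # Stand-in for the real signing operation over pae(...).
    sig = hashlib.sha256(pae(payload_type, payload)).digest()
    return {
        "payloadType": payload_type,
        "payload": base64.b64encode(payload).decode(),
        "signatures": [
            {"keyid": keyid, "sig": base64.b64encode(sig).decode()}
        ],
    }
```

Verifiers must recompute the PAE from `payloadType` and the decoded `payload` before checking `sig`; verifying over the raw payload alone would miss type-confusion attacks the PAE exists to prevent.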

### Signing Process

```bash
#!/bin/bash
# scripts/scanner/sign-offline-kit.sh
set -euo pipefail

MANIFEST="out/offline/scanner-standards-kit-v1/manifest.json"
DSSE="out/offline/scanner-standards-kit-v1/manifest.dsse"
KEY_ID="stellaops-scanner-release-2025"

# Compute manifest hash
MANIFEST_HASH=$(b3sum "${MANIFEST}" | cut -d' ' -f1)

# Sign with DSSE
stellaops-sign dsse \
  --payload-type "application/vnd.stellaops.scanner.manifest+json" \
  --payload "${MANIFEST}" \
  --key-id "${KEY_ID}" \
  --output "${DSSE}"

echo "Signed manifest: ${DSSE}"
echo "Manifest BLAKE3: ${MANIFEST_HASH}"
```

## Verification

### Offline Verification Steps

```bash
#!/bin/bash
# scripts/scanner/verify-offline-kit.sh
set -euo pipefail

KIT_DIR="out/offline/scanner-standards-kit-v1"

# 1. Verify DSSE signature
stellaops-verify dsse \
  --envelope "${KIT_DIR}/manifest.dsse" \
  --trust-root "${KIT_DIR}/trust/root-ca.pem" \
  --expected-payload-type "application/vnd.stellaops.scanner.manifest+json"

# 2. Extract manifest and verify artifacts
MANIFEST=$(stellaops-verify dsse --envelope "${KIT_DIR}/manifest.dsse" --extract-payload)

# 3. Verify each artifact hash
for artifact in $(echo "${MANIFEST}" | jq -r '.artifacts[] | @base64'); do
  path=$(echo "${artifact}" | base64 -d | jq -r '.path')
  expected_blake3=$(echo "${artifact}" | base64 -d | jq -r '.blake3')

  actual_blake3=$(b3sum "${KIT_DIR}/${path}" | cut -d' ' -f1)

  if [[ "${actual_blake3}" != "${expected_blake3}" ]]; then
    echo "FAIL: ${path} hash mismatch"
    exit 1
  fi
  echo "PASS: ${path}"
done

echo "All artifacts verified"
```

### Programmatic Verification

```csharp
// src/Scanner/StellaOps.Scanner.Offline/OfflineKitVerifier.cs
public class OfflineKitVerifier
{
    public async Task<VerificationResult> VerifyAsync(
        string kitPath,
        ITrustRootProvider trustRoots)
    {
        var manifestPath = Path.Combine(kitPath, "manifest.json");
        var dssePath = Path.Combine(kitPath, "manifest.dsse");

        // Verify DSSE signature
        var envelope = await DsseEnvelope.LoadAsync(dssePath);
        var signatureValid = await _verifier.VerifyAsync(envelope, trustRoots);

        if (!signatureValid)
            return VerificationResult.SignatureInvalid;

        // Parse manifest
        var manifest = JsonSerializer.Deserialize<OfflineManifest>(
            envelope.Payload);

        // Verify each artifact
        foreach (var artifact in manifest.Artifacts)
        {
            var artifactPath = Path.Combine(kitPath, artifact.Path);
            var actualHash = await HashUtil.Blake3Async(artifactPath);

            if (actualHash != artifact.Blake3)
                return VerificationResult.ArtifactHashMismatch(artifact.Path);
        }

        return VerificationResult.Success(manifest);
    }
}
```

## Parity Requirements

### Required Contents

| Category | Online | Offline Kit | Notes |
|----------|--------|-------------|-------|
| CDX 1.7 Schema | CDN fetch | Bundled | Schema validation |
| CDX 1.6 Schema | CDN fetch | Bundled | Downgrade validation |
| CVSS v4→v3 Adapter | API lookup | Bundled CSV | Pure function |
| CDX 1.7→1.6 Adapter | API lookup | Bundled CSV | Pure function |
| SLSA 1.2→1.0 Adapter | API lookup | Bundled CSV | Pure function |
| Golden Fixtures | Test repo | Bundled | Determinism tests |
| Tool Binaries | Package registry | Bundled/Checksum | Syft, Trivy |
| Trust Roots | Online PKI | Bundled PEM | Signature verification |

### Functional Parity

The offline kit must support:

| Operation | Online | Offline |
|-----------|--------|---------|
| Schema validation | Yes | Yes |
| SBOM serialization | Yes | Yes |
| Downgrade conversion | Yes | Yes |
| Hash verification | Yes | Yes |
| DSSE verification | Yes | Yes |
| Determinism testing | Yes | Yes |
| Rekor transparency | Yes | No* |

*Offline mode uses mirrored checkpoints instead of live Rekor.

## Bundle Generation

### CI Workflow

```yaml
# .gitea/workflows/offline-kit.yml
jobs:
  build-offline-kit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Collect schemas
        run: |
          mkdir -p out/offline/scanner-standards-kit-v1/schemas
          cp schemas/cyclonedx-*.json out/offline/scanner-standards-kit-v1/schemas/
          cp schemas/spdx-*.json out/offline/scanner-standards-kit-v1/schemas/

      - name: Collect adapters
        run: |
          mkdir -p out/offline/scanner-standards-kit-v1/adapters
          cp docs/modules/scanner/fixtures/adapters/*.csv out/offline/scanner-standards-kit-v1/adapters/
          cp docs/modules/scanner/fixtures/adapters/hashes.txt out/offline/scanner-standards-kit-v1/adapters/

      - name: Collect fixtures
        run: |
          mkdir -p out/offline/scanner-standards-kit-v1/fixtures
          cp -r docs/modules/scanner/fixtures/cdx17-cbom out/offline/scanner-standards-kit-v1/fixtures/
          cp -r docs/modules/scanner/fixtures/competitor-adapters out/offline/scanner-standards-kit-v1/fixtures/

      - name: Generate manifest
        run: scripts/scanner/generate-manifest.sh

      - name: Sign bundle
        run: scripts/scanner/sign-offline-kit.sh
        env:
          SIGNING_KEY: ${{ secrets.SCANNER_SIGNING_KEY }}

      - name: Verify bundle
        run: scripts/scanner/verify-offline-kit.sh

      - name: Upload artifact
        uses: actions/upload-artifact@v4
        with:
          name: scanner-standards-kit-v1
          path: out/offline/scanner-standards-kit-v1/
```

## Update Procedure

### Offline Kit Refresh

When schemas/adapters change:

1. Update the source artifacts in the repository.
2. Run hash verification in CI.
3. Generate a new manifest.
4. Sign with the release key.
5. Publish the new kit version.
6. Document the changes in the release notes.

### Version Compatibility

| Kit Version | Scanner Version | CDX Version | CVSS Support |
|-------------|-----------------|-------------|--------------|
| v1.0.0 | 1.x | 1.6, 1.7 | v3.1, v4.0 |
| v1.1.0 | 1.x | 1.6, 1.7, 1.8* | v3.1, v4.0 |

*Future versions.

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (SC10)
- Roadmap: `docs/modules/scanner/design/standards-convergence-roadmap.md` (SC1)
- Governance: `docs/modules/scanner/design/schema-governance.md` (SC9)
- Offline Operation: `docs/24_OFFLINE_KIT.md`

---

`docs/modules/scanner/design/schema-governance.md` (new file, 197 lines)

# Schema Governance for Scanner Outputs (SC9)

Status: Draft · Date: 2025-12-04
Scope: Define governance, approvals, RACI, and review cadence for schema bumps, downgrade adapters, and mapping table changes.

## Objectives

- Establish clear ownership and approval workflows for schema changes.
- Define a RACI matrix for schema-related decisions.
- Set review cadence and change-control procedures.
- Ensure adapter tables are locked, with every change documented.

## RACI Matrix

### Schema Changes

| Activity | Product | Scanner TL | Sbomer TL | Policy TL | Ops | QA |
|----------|---------|------------|-----------|-----------|-----|-----|
| CycloneDX version bump | A | R | C | C | I | C |
| CVSS version support | A | R | I | C | I | C |
| SLSA version bump | A | R | C | C | I | C |
| New evidence fields | A | R | C | C | I | C |
| CBOM property additions | A | R | C | C | I | C |

### Adapter Changes

| Activity | Product | Scanner TL | Sbomer TL | Policy TL | Ops | QA |
|----------|---------|------------|-----------|-----------|-----|-----|
| Downgrade adapter update | A | R | C | I | I | R |
| Mapping table changes | A | R | C | I | I | R |
| Hash update approval | A | R | I | I | I | R |
| Fixture updates | I | R | C | I | I | R |

### Release Artifacts

| Activity | Product | Scanner TL | Sbomer TL | Policy TL | Ops | QA |
|----------|---------|------------|-----------|-----------|-----|-----|
| Schema freeze | A | R | C | C | I | I |
| DSSE signing | I | C | I | I | R | I |
| Offline kit bundling | I | I | I | I | R | C |
| Release notes | R | C | C | C | C | I |

Legend: R=Responsible, A=Accountable, C=Consulted, I=Informed

## Schema Bump Workflow

### 1. Proposal Phase

```mermaid
graph LR
    A[RFC Draft] --> B[Technical Review]
    B --> C{Approved?}
    C -->|Yes| D[Implementation]
    C -->|No| A
    D --> E[Adapter Update]
    E --> F[Fixture Update]
    F --> G[Hash Freeze]
    G --> H[DSSE Sign]
    H --> I[Release]
```

### 2. Required Artifacts

| Artifact | Owner | Location |
|----------|-------|----------|
| RFC Document | Scanner TL | `docs/rfcs/scanner/` |
| Mapping CSV | Scanner TL | `docs/modules/scanner/fixtures/adapters/` |
| Golden Fixtures | QA | `docs/modules/scanner/fixtures/cdx17-cbom/` |
| Hash List | QA | `docs/modules/scanner/fixtures/*/hashes.txt` |
| DSSE Envelope | Ops | `out/offline/scanner-standards-kit-v1/` |

### 3. Approval Gates

| Gate | Approvers | Criteria |
|------|-----------|----------|
| RFC Approval | Product + Scanner TL | Technical feasibility, backwards compat |
| Adapter Approval | Scanner TL + QA | Mapping completeness, determinism tests pass |
| Hash Freeze | Scanner TL + QA | All fixtures pass hash validation |
| DSSE Sign | Ops | All hashes recorded, offline kit complete |
| Release | Product | All gates passed, release notes approved |

## Review Cadence

### Regular Reviews

| Review | Frequency | Attendees | Scope |
|--------|-----------|-----------|-------|
| Schema Sync | Monthly | Scanner, Sbomer, Policy TLs | Upcoming changes, deprecations |
| Adapter Review | Per release | Scanner TL, QA | Mapping accuracy, test coverage |
| Hash Audit | Per release | QA, Ops | All fixture hashes valid |

### Ad-hoc Reviews

Triggered by:

- Upstream schema release (CycloneDX, SPDX, SLSA)
- Security advisory requiring field changes
- Customer request for new evidence types
- Determinism test failure

## Change Control

### Acceptable Changes

| Change Type | Requires | Example |
|-------------|----------|---------|
| Add optional field | Scanner TL approval | New evidence property |
| Add required field | RFC + Product approval | New mandatory hash |
| Remove field | RFC + deprecation notice | Legacy property removal |
| Change ordering | Scanner TL + QA approval | Sort key update |
| Update hash | QA approval + documented reason | Fixture content change |

### Prohibited Changes

| Change | Reason | Alternative |
|--------|--------|-------------|
| Silent hash update | Breaks determinism validation | Document change, get approval |
| Remove required field | Breaks consumers | Deprecate with N-1 support |
| Change field type | Breaks serialization | New field with migration |
| Reorder without docs | Breaks hash validation | Update ordering rules + hashes |

## Deprecation Policy

### Deprecation Timeline

| Phase | Duration | Actions |
|-------|----------|---------|
| Announced | +0 days | Add deprecation notice to docs |
| Warning | +30 days | Emit warning in API responses |
| N-1 Support | +90 days | Old format still accepted |
| Removal | +180 days | Old format rejected |

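
The timeline above can be read as a simple lookup. A hedged sketch, assuming each boundary day belongs to the later phase (the table does not pin this down) and that the phase names are as listed:

```python
def deprecation_phase(days_since_announcement: int) -> str:
    """Map days elapsed since the deprecation notice to the policy phase."""
    if days_since_announcement < 30:
        return "announced"    # notice lives in docs only
    if days_since_announcement < 90:
        return "warning"      # API responses carry a deprecation warning
    if days_since_announcement < 180:
        return "n-1 support"  # old format still accepted
    return "removed"          # old format rejected
```
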
### Deprecation Notice Format

```json
{
  "deprecated": {
    "field": "ratings[method=CVSSv30]",
    "since": "v2.5.0",
    "removal": "v3.0.0",
    "replacement": "ratings[method=CVSSv31]",
    "migrationGuide": "docs/migrations/cvss-v30-removal.md"
  }
}
```

## Adapter Locking

### Lock Conditions

Adapters are locked once:

1. Their hash is recorded in `hashes.txt`.
2. The DSSE envelope is signed.
3. The offline kit is bundled.

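A lock of this kind can be checked mechanically. A sketch, assuming the b3sum-style `<digest>  <path>` line format for `hashes.txt` (an assumption; the repository format is not shown) — any path whose digest changes in place, rather than via a new versioned file, is the prohibited silent update:

```python
def parse_hashes_txt(text: str) -> dict:
    """Parse b3sum/sha256sum-style lines: '<hex-digest>  <path>'."""
    entries = {}
    for line in text.splitlines():
        if not line.strip():
            continue
        digest, path = line.split(None, 1)
        entries[path.strip()] = digest
    return entries


def detect_silent_updates(locked: dict, current: dict) -> list:
    """Paths whose digest changed without a new versioned entry."""
    return sorted(p for p, h in current.items()
                  if p in locked and locked[p] != h)
```

A new versioned file (e.g. a `-v2` CSV) appears only in `current`, so it never trips the check; only in-place mutation of a locked entry does.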
### Unlock Process

To modify a locked adapter:

1. Create a new version (e.g., `mapping-cvss4-to-cvss3-v2.csv`).
2. Update the hash file with a new entry.
3. Keep the old version for N-1 compatibility.
4. Get Scanner TL + QA approval.
5. Sign a new DSSE envelope.

## Audit Trail

### Required Records

| Record | Location | Retention |
|--------|----------|-----------|
| RFC decisions | `docs/rfcs/scanner/` | Permanent |
| Hash changes | Git history + `CHANGELOG.md` | Permanent |
| Approval records | PR comments | Permanent |
| DSSE envelopes | CAS + offline kit | Permanent |

### Git Commit Requirements

Schema-related commits must include:

```
feat(scanner): Add CVSS v4 support

- Add CVSSv4 rating method
- Update adapter mapping CSV
- Update golden fixtures
- New hashes recorded

Approved-By: @scanner-tl
Reviewed-By: @qa-lead
Hash-Update: mapping-cvss4-to-cvss3.csv BLAKE3=fa600b26...

Refs: RFC-2025-012, SCAN-GAP-186-SC9
```
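
Trailer requirements like these are easy to enforce in CI. A hypothetical lint helper; the function name and the exact trailer set are assumptions drawn from the template above, not an existing StellaOps tool:

```python
import re

# Trailers the template above treats as mandatory for schema commits.
REQUIRED_TRAILERS = ("Approved-By:", "Reviewed-By:", "Hash-Update:", "Refs:")


def missing_trailers(commit_message: str) -> list:
    """Return the required trailers absent from a commit message."""
    return [t for t in REQUIRED_TRAILERS
            if not re.search("^" + re.escape(t), commit_message, re.M)]
```

A pre-receive hook could reject any schema-touching commit for which `missing_trailers` returns a non-empty list.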

## Links

- Sprint: `docs/implplan/SPRINT_0186_0001_0001_record_deterministic_execution.md` (SC9)
- Roadmap: `docs/modules/scanner/design/standards-convergence-roadmap.md` (SC1)
- Adapters: `docs/modules/scanner/fixtures/adapters/`