Add tests and implement timeline ingestion options with NATS and Redis subscribers
- Introduced `BinaryReachabilityLifterTests` to validate binary lifting functionality.
- Created `PackRunWorkerOptions` for configuring worker paths and execution persistence.
- Added `TimelineIngestionOptions` for configuring NATS and Redis ingestion transports.
- Implemented `NatsTimelineEventSubscriber` for subscribing to NATS events.
- Developed `RedisTimelineEventSubscriber` for reading from Redis Streams.
- Added `TimelineEnvelopeParser` to normalize incoming event envelopes.
- Created unit tests for `TimelineEnvelopeParser` to ensure correct field mapping.
- Implemented `TimelineAuthorizationAuditSink` for logging authorization outcomes.
# Deterministic SBOM composition fixtures

Reference bundle for DOCS-SCANNER-DET-01. Use it to prove fragment-level DSSE, `_composition.json`, and CycloneDX composition metadata stay deterministic and offline-verifiable.

## Contents

- `_composition.json` — composition recipe with Merkle root, fragment hashes, BOM hash, and determinism pins.
- `fragment-layer{1,2}.json` — canonical fragments (sorted keys, newline-terminated).
- `fragment-layer{1,2}.dsse.json` — DSSE envelopes over the canonical fragments (demo key `demo-ed25519`).
- `bom.cdx.json` — composed CycloneDX BOM with `stellaops:merkle.root` and `stellaops:composition.manifest` properties.
- `hashes.txt` — sha256 for every file in this directory.
- `generate.py` — reproducible generator (standard library only).
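
For orientation, a hypothetical shape for the composition recipe, consistent with the fields listed above (field names here are illustrative only; the actual `_composition.json` in this directory is authoritative):

```json
{
  "merkleRoot": "sha256:…",
  "fragments": [
    {"path": "fragment-layer1.json", "sha256": "…"},
    {"path": "fragment-layer2.json", "sha256": "…"}
  ],
  "bomSha256": "…",
  "determinism": {"sortKeys": true, "newlineTerminated": true}
}
```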

## Verify offline

```bash
cd docs/modules/scanner/fixtures/deterministic-compose
sha256sum -c hashes.txt

# Check DSSE payload matches fragment
jq -r '.payload' fragment-layer1.dsse.json | base64 -d > /tmp/payload.json
diff -u fragment-layer1.json /tmp/payload.json

# Recompute Merkle root from fragment hashes
python - <<'PY'
import hashlib
from pathlib import Path

frag_hashes = [
    line.split()[0]
    for line in Path('hashes.txt').read_text().splitlines()
    if 'fragment-layer' in line and '.json' in line and '.dsse' not in line
]
frag_hashes = [bytes.fromhex(h) for h in frag_hashes]
while len(frag_hashes) > 1:
    nxt = []
    it = iter(frag_hashes)
    for a in it:
        b = next(it, a)  # duplicate the last hash when the level is odd
        nxt.append(hashlib.sha256(a + b).digest())
    frag_hashes = nxt
print(f"merkle={frag_hashes[0].hex()}")
PY
```
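
The DSSE payload check above can also be sketched in pure Python using only the standard library. The envelope below is a hypothetical stand-in for `fragment-layer1.dsse.json`, built in memory so the sketch is self-contained; against the real fixture you would load the envelope from disk instead.

```python
import base64
import json

# Hypothetical canonical fragment (sorted keys, newline-terminated),
# standing in for fragment-layer1.json.
fragment = json.dumps({"components": [], "layer": 1}, sort_keys=True) + "\n"

# Minimal DSSE-style envelope: the payload is the base64 of the fragment bytes.
envelope = {
    "payloadType": "application/json",
    "payload": base64.b64encode(fragment.encode()).decode(),
    "signatures": [{"keyid": "demo-ed25519", "sig": "<omitted>"}],
}

# Offline check: the decoded payload must be byte-identical to the fragment.
decoded = base64.b64decode(envelope["payload"]).decode()
assert decoded == fragment
print("payload matches fragment")
```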

## Regenerate

```bash
python generate.py
sha256sum -c hashes.txt
```