feat: Add DigestUpsertRequest and LockEntity models

- Introduced DigestUpsertRequest to model digest upserts, carrying ChannelId, Recipient, DigestKey, Events, and CollectUntil.
- Created LockEntity, a lightweight distributed-lock entry with Id, TenantId, Resource, Owner, ExpiresAt, and CreatedAt.
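
For orientation, a minimal Python sketch of the two shapes described above (the actual models ship as C# types; the field names come from this commit message, the types are assumptions):

from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class DigestUpsertRequest:
    # Field names mirror the commit description; types are assumed.
    channel_id: str
    recipient: str
    digest_key: str
    events: List[dict] = field(default_factory=list)
    collect_until: Optional[datetime] = None

@dataclass
class LockEntity:
    # Lightweight distributed-lock entry, as described above.
    id: str
    tenant_id: str
    resource: str
    owner: str
    expires_at: datetime
    created_at: datetime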

feat: Implement ILockRepository interface and LockRepository class

- Defined the ILockRepository interface with methods for acquiring and releasing locks.
- Implemented LockRepository, which tries to acquire a lock via a SQL upsert and releases it when done; a sketch of the pattern follows.
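
The try-acquire path usually reduces to a single conditional upsert: insert the lock row, or take over an existing row only if its lease has expired. A sketch of that pattern (SQLite here purely for illustration; the table layout is an assumption based on LockEntity's fields):

import sqlite3
from datetime import datetime, timedelta, timezone

# Assumed schema, mirroring LockEntity:
#   CREATE TABLE locks(resource TEXT PRIMARY KEY, owner TEXT,
#                      expires_at TEXT, created_at TEXT);
def try_acquire(conn: sqlite3.Connection, resource: str, owner: str, ttl_seconds: int) -> bool:
    now = datetime.now(timezone.utc).isoformat()
    expires = (datetime.now(timezone.utc) + timedelta(seconds=ttl_seconds)).isoformat()
    cur = conn.execute(
        """
        INSERT INTO locks (resource, owner, expires_at, created_at)
        VALUES (?, ?, ?, ?)
        ON CONFLICT(resource) DO UPDATE
            SET owner = excluded.owner, expires_at = excluded.expires_at
            WHERE locks.expires_at < ?
        """,
        (resource, owner, expires, now, now),
    )
    conn.commit()
    return cur.rowcount == 1  # 0 means another owner still holds a live lease

def release(conn: sqlite3.Connection, resource: str, owner: str) -> None:
    # Only the current owner may release the lock.
    conn.execute("DELETE FROM locks WHERE resource = ? AND owner = ?", (resource, owner))
    conn.commit()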

feat: Add SurfaceManifestPointer record for manifest pointers

- Introduced SurfaceManifestPointer to represent a minimal pointer to a Surface.FS manifest associated with an image digest.

feat: Create PolicySimulationInputLock and related validation logic

- Added PolicySimulationInputLock record to describe policy simulation inputs and expected digests.
- Implemented validation logic for policy simulation inputs, including checks for digest drift and shadow mode requirements.
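
In essence, the validator recomputes each input's digest and compares it against the lock, then enforces the shadow-mode flags. A hedged sketch (the JSON field names match the verify-policy-sim-lock.sh script later in this commit; the helper names are hypothetical):

import hashlib
import json
import pathlib
from typing import Dict, List

def sha256_file(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def validate_lock(lock_path: pathlib.Path, inputs: Dict[str, pathlib.Path]) -> List[str]:
    """Return drift/shadow-mode violations; an empty list means the lock is valid."""
    lock = json.loads(lock_path.read_text())
    errors = []
    # inputs maps lock fields to files, e.g. {"policyBundleSha256": Path("policy.tgz")}
    for lock_field, path in inputs.items():
        if lock.get(lock_field) != sha256_file(path):
            errors.append(f"digest drift: {lock_field}")
    if lock.get("shadowIsolation") is not True:
        errors.append("shadowIsolation must be true")
    if "policy:simulate:shadow" not in lock.get("requiredScopes", []):
        errors.append("requiredScopes missing policy:simulate:shadow")
    return errors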

test: Add unit tests for ReplayVerificationService and ReplayVerifier

- Created ReplayVerificationServiceTests to validate the behavior of the ReplayVerificationService under various scenarios.
- Developed ReplayVerifierTests to ensure the correctness of the ReplayVerifier logic.

test: Implement PolicySimulationInputLockValidatorTests

- Added tests for PolicySimulationInputLockValidator to verify the validation logic against expected inputs and conditions.

chore: Add cosign key example and signing scripts

- Included a placeholder cosign key example for development purposes.
- Added a script for signing Signals artifacts using cosign with support for both v2 and v3.

chore: Create script for uploading evidence to the evidence locker

- Developed a script to upload evidence to the evidence locker, ensuring required environment variables are set.
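
The required-variable guard such a script needs is small; a sketch in Python (the variable names here are hypothetical, not the ones the script actually checks):

import os
import sys

REQUIRED = ["EVIDENCE_LOCKER_URL", "EVIDENCE_LOCKER_TOKEN"]  # hypothetical names

missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    sys.exit(f"missing required environment variables: {', '.join(missing)}")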
StellaOps Bot
2025-12-03 07:51:50 +02:00
parent 37cba83708
commit e923880694
171 changed files with 6567 additions and 2952 deletions


@@ -30,12 +30,16 @@ CHECKPOINT_FRESHNESS=${CHECKPOINT_FRESHNESS:-86400}
OCI=${OCI:-1}
SIGN_KEY="$KEYFILE" STAGE="$STAGE" CREATED="$CREATED" TENANT_SCOPE="$TENANT_SCOPE" ENV_SCOPE="$ENV_SCOPE" CHUNK_SIZE="$CHUNK_SIZE" CHECKPOINT_FRESHNESS="$CHECKPOINT_FRESHNESS" OCI="$OCI" "$ROOT/src/Mirror/StellaOps.Mirror.Creator/make-thin-v1.sh"
# Default to staged time-anchor unless caller overrides
TIME_ANCHOR_FILE=${TIME_ANCHOR_FILE:-$ROOT/out/mirror/thin/stage-v1/layers/time-anchor.json}
# Emit milestone summary with hashes for downstream consumers
MANIFEST_PATH="$ROOT/out/mirror/thin/mirror-thin-v1.manifest.json"
TAR_PATH="$ROOT/out/mirror/thin/mirror-thin-v1.tar.gz"
DSSE_PATH="$ROOT/out/mirror/thin/mirror-thin-v1.manifest.dsse.json"
BUNDLE_PATH="$ROOT/out/mirror/thin/mirror-thin-v1.bundle.json"
BUNDLE_DSSE_PATH="$ROOT/out/mirror/thin/mirror-thin-v1.bundle.dsse.json"
TIME_ANCHOR_DSSE_PATH="$TIME_ANCHOR_FILE.dsse.json"
TRANSPORT_PATH="$ROOT/out/mirror/thin/stage-v1/layers/transport-plan.json"
REKOR_POLICY_PATH="$ROOT/out/mirror/thin/stage-v1/layers/rekor-policy.json"
MIRROR_POLICY_PATH="$ROOT/out/mirror/thin/stage-v1/layers/mirror-policy.json"
@@ -46,6 +50,50 @@ sha256() {
sha256sum "$1" | awk '{print $1}'
}
# Sign manifest, bundle meta, and time-anchor (if present)
python "$ROOT/scripts/mirror/sign_thin_bundle.py" \
  --key "$KEYFILE" \
  --manifest "$MANIFEST_PATH" \
  --tar "$TAR_PATH" \
  --tuf-dir "$ROOT/out/mirror/thin/tuf" \
  --bundle "$BUNDLE_PATH" \
  --time-anchor "$TIME_ANCHOR_FILE"

# Normalize the time-anchor DSSE location for bundle meta/summary. The signer
# derives its output with pathlib's with_suffix, which replaces the anchor's
# .json suffix (producing time-anchor.dsse.json), so copy that sidecar to the
# $TIME_ANCHOR_FILE.dsse.json path the rest of this script expects.
signer_anchor_dsse="${TIME_ANCHOR_FILE%.json}.dsse.json"
if [[ -f "$signer_anchor_dsse" && "$signer_anchor_dsse" != "$TIME_ANCHOR_DSSE_PATH" ]]; then
  cp "$signer_anchor_dsse" "$TIME_ANCHOR_DSSE_PATH"
fi
# Refresh bundle meta hashes now that DSSE files exist. NOTE: the heredoc is
# intentionally unquoted so the shell expands $BUNDLE_PATH and friends into
# the inline Python before it runs.
python - <<PY
import hashlib
import json
import pathlib

bundle_path = pathlib.Path("$BUNDLE_PATH")
manifest_dsse = pathlib.Path("$DSSE_PATH")
bundle_dsse = pathlib.Path("$BUNDLE_DSSE_PATH")
time_anchor_dsse = pathlib.Path("$TIME_ANCHOR_DSSE_PATH")

def sha(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open('rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return h.hexdigest()

data = json.loads(bundle_path.read_text())
art = data.setdefault('artifacts', {})
if manifest_dsse.exists():
    art.setdefault('manifest_dsse', {})['sha256'] = sha(manifest_dsse)
if bundle_dsse.exists():
    art.setdefault('bundle_dsse', {})['sha256'] = sha(bundle_dsse)
if time_anchor_dsse.exists():
    art.setdefault('time_anchor_dsse', {})['sha256'] = sha(time_anchor_dsse)
bundle_path.write_text(json.dumps(data, indent=2, sort_keys=True) + "\n")
sha_path = bundle_path.with_suffix(bundle_path.suffix + '.sha256')
sha_path.write_text(f"{sha(bundle_path)} {bundle_path.name}\n")
PY
cat > "$SUMMARY_PATH" <<JSON
{
"created": "$CREATED",
@@ -54,7 +102,8 @@ cat > "$SUMMARY_PATH" <<JSON
"dsse": $( [[ -f "$DSSE_PATH" ]] && echo "{\"path\": \"$(basename "$DSSE_PATH")\", \"sha256\": \"$(sha256 "$DSSE_PATH")\"}" || echo "null" ),
"bundle": $( [[ -f "$BUNDLE_PATH" ]] && echo "{\"path\": \"$(basename "$BUNDLE_PATH")\", \"sha256\": \"$(sha256 "$BUNDLE_PATH")\"}" || echo "null" ),
"bundle_dsse": $( [[ -f "$BUNDLE_DSSE_PATH" ]] && echo "{\"path\": \"$(basename "$BUNDLE_DSSE_PATH")\", \"sha256\": \"$(sha256 "$BUNDLE_DSSE_PATH")\"}" || echo "null" ),
"time_anchor": $( [[ -n "${TIME_ANCHOR_FILE:-}" && -f "$TIME_ANCHOR_FILE" ]] && echo "{\"path\": \"$(basename "$TIME_ANCHOR_FILE")\", \"sha256\": \"$(sha256 "$TIME_ANCHOR_FILE")\"}" || echo "null" )
"time_anchor": $( [[ -n "${TIME_ANCHOR_FILE:-}" && -f "$TIME_ANCHOR_FILE" ]] && echo "{\"path\": \"$(basename "$TIME_ANCHOR_FILE")\", \"sha256\": \"$(sha256 "$TIME_ANCHOR_FILE")\"}" || echo "null" ),
"time_anchor_dsse": $( [[ -f "$TIME_ANCHOR_DSSE_PATH" ]] && echo "{\"path\": \"$(basename "$TIME_ANCHOR_DSSE_PATH")\", \"sha256\": \"$(sha256 "$TIME_ANCHOR_DSSE_PATH")\"}" || echo "null" )
,"policies": {
"transport": {"path": "$(basename "$TRANSPORT_PATH")", "sha256": "$(sha256 "$TRANSPORT_PATH")"},
"rekor": {"path": "$(basename "$REKOR_POLICY_PATH")", "sha256": "$(sha256 "$REKOR_POLICY_PATH")"},


@@ -7,7 +7,8 @@ Usage:
--key out/mirror/thin/tuf/keys/mirror-ed25519-test-1.pem \
--manifest out/mirror/thin/mirror-thin-v1.manifest.json \
--tar out/mirror/thin/mirror-thin-v1.tar.gz \
--tuf-dir out/mirror/thin/tuf \
--time-anchor out/mirror/thin/stage-v1/layers/time-anchor.json
Writes:
- mirror-thin-v1.manifest.dsse.json
@@ -48,6 +49,7 @@ def main():
ap.add_argument("--tar", required=True, type=pathlib.Path)
ap.add_argument("--tuf-dir", required=True, type=pathlib.Path)
ap.add_argument("--bundle", required=False, type=pathlib.Path)
ap.add_argument("--time-anchor", required=False, type=pathlib.Path)
args = ap.parse_args()
key = load_key(args.key)
@@ -75,12 +77,29 @@ def main():
        bundle_dsse_path = args.bundle.with_suffix(".dsse.json")
        write_json(bundle_dsse_path, bundle_dsse)

    anchor_dsse_path = None
    if args.time_anchor:
        anchor_bytes = args.time_anchor.read_bytes()
        anchor_sig = sign_bytes(key, anchor_bytes)
        anchor_dsse = {
            "payloadType": "application/vnd.stellaops.time-anchor+json",
            "payload": b64url(anchor_bytes),
            "signatures": [{"keyid": keyid, "sig": b64url(anchor_sig)}],
        }
        anchor_dsse_path = args.time_anchor.with_suffix(".dsse.json")
        write_json(anchor_dsse_path, anchor_dsse)

    # update TUF metadata
    for name in ["root.json", "targets.json", "snapshot.json", "timestamp.json"]:
        sign_tuf(args.tuf_dir / name, keyid, key)
extra = f", bundle DSSE -> {bundle_dsse_path}" if args.bundle else ""
print(f"Signed DSSE + TUF using keyid {keyid}; DSSE -> {dsse_path}{extra}")
parts = [f"manifest DSSE -> {dsse_path}"]
if args.bundle:
parts.append(f"bundle DSSE -> {bundle_dsse_path}")
if anchor_dsse_path:
parts.append(f"time anchor DSSE -> {anchor_dsse_path}")
parts.append("TUF metadata updated")
print(f"Signed DSSE + TUF using keyid {keyid}; " + ", ".join(parts))
if __name__ == "__main__":
main()


@@ -125,6 +125,16 @@ def check_content_hashes(manifest: dict, tar_path: pathlib.Path):
        raise SystemExit(f"index digest mismatch {name}: {digest}")


def read_tar_entry(tar_path: pathlib.Path, name: str) -> bytes:
    with tarfile.open(tar_path, "r:gz") as tf:
        try:
            info = tf.getmember(name)
        except KeyError:
            # Some tar writers prefix members with "./"; fall back to that form.
            info = tf.getmember(f"./{name}")
        data = tf.extractfile(info).read()
    return data


def load_pubkey(path: pathlib.Path) -> Ed25519PublicKey:
    if not CRYPTO_AVAILABLE:
        raise SystemExit("cryptography is required for DSSE verification; install before using --pubkey")
@@ -170,6 +180,16 @@ def check_bundle_meta(meta_path: pathlib.Path, manifest_path: pathlib.Path, tar_
expect("manifest", manifest_path)
expect("tarball", tar_path)
# DSSE sidecars are optional but if present, validate hashes
dsse_manifest = artifacts.get("manifest_dsse")
if dsse_manifest and dsse_manifest.get("path"):
expect("manifest_dsse", meta_path.parent / dsse_manifest["path"])
dsse_bundle = artifacts.get("bundle_dsse")
if dsse_bundle and dsse_bundle.get("path"):
expect("bundle_dsse", meta_path.parent / dsse_bundle["path"])
dsse_anchor = artifacts.get("time_anchor_dsse")
if dsse_anchor and dsse_anchor.get("path"):
expect("time_anchor_dsse", meta_path.parent / dsse_anchor["path"])
for extra in ["time_anchor", "transport_plan", "rekor_policy", "mirror_policy", "offline_policy", "artifact_hashes"]:
rec = artifacts.get(extra)
if not rec:
@@ -177,6 +197,13 @@ def check_bundle_meta(meta_path: pathlib.Path, manifest_path: pathlib.Path, tar_
if not rec.get("path"):
raise SystemExit(f"bundle meta missing path for {extra}")
time_anchor_dsse = artifacts.get("time_anchor_dsse")
if time_anchor_dsse:
if not time_anchor_dsse.get("path"):
raise SystemExit("bundle meta missing path for time_anchor_dsse")
if not (meta_path.parent / time_anchor_dsse["path"]).exists():
raise SystemExit("time_anchor_dsse referenced but file missing")
for group, expected_count in [("ok", 10), ("rk", 10), ("ms", 10)]:
if len(meta.get("gaps", {}).get(group, [])) != expected_count:
raise SystemExit(f"bundle meta gaps.{group} expected {expected_count} entries")
@@ -215,6 +242,8 @@ def main():
    bundle_meta = args.bundle_meta
    bundle_dsse = bundle_meta.with_suffix(".dsse.json") if bundle_meta else None
    manifest_dsse = manifest_path.with_suffix(".dsse.json")
    time_anchor_dsse = None
    time_anchor_path = tar_path.parent / "stage-v1" / "layers" / "time-anchor.json"

    man_expected = load_sha256_sidecar(manifest_path)
    tar_expected = load_sha256_sidecar(tar_path)
@@ -236,6 +265,13 @@ def main():
        if sha256_file(bundle_meta) != meta_expected:
            raise SystemExit("bundle meta sha256 mismatch")
        check_bundle_meta(bundle_meta, manifest_path, tar_path, args.tenant, args.environment)

        meta = json.loads(bundle_meta.read_text())
        ta_entry = meta.get("artifacts", {}).get("time_anchor_dsse")
        if ta_entry and ta_entry.get("path"):
            ta_path = bundle_meta.parent / ta_entry["path"]
            if sha256_file(ta_path) != ta_entry.get("sha256"):
                raise SystemExit("time_anchor_dsse sha256 mismatch")
            time_anchor_dsse = ta_path

    if args.pubkey:
        pubkey = args.pubkey
@@ -243,6 +279,12 @@ def main():
        verify_dsse(manifest_dsse, pubkey, manifest_path, "application/vnd.stellaops.mirror.manifest+json")
        if bundle_dsse and bundle_dsse.exists():
            verify_dsse(bundle_dsse, pubkey, bundle_meta, "application/vnd.stellaops.mirror.bundle+json")
        if time_anchor_dsse and time_anchor_dsse.exists() and time_anchor_path.exists():
            anchor_bytes = read_tar_entry(tar_path, "layers/time-anchor.json")
            tmp_anchor = tar_path.parent / "time-anchor.verify.json"
            tmp_anchor.write_bytes(anchor_bytes)
            verify_dsse(time_anchor_dsse, pubkey, tmp_anchor, "application/vnd.stellaops.time-anchor+json")
            tmp_anchor.unlink(missing_ok=True)

    print("OK: mirror-thin bundle verified")


@@ -0,0 +1,88 @@
#!/usr/bin/env bash
set -euo pipefail
# Offline verifier for policy-sim inputs lock (PS1-PS10 remediation).
# Usage: verify-policy-sim-lock.sh lock.json --policy path --graph path --sbom path --time-anchor path --dataset path [--max-age-hours 24]
usage() {
  echo "Usage: $0 lock.json --policy <file> --graph <file> --sbom <file> --time-anchor <file> --dataset <file> [--max-age-hours <n>]" >&2
  exit 2
}
[[ $# -lt 11 ]] && usage
lock=""
policy=""
graph=""
sbom=""
time_anchor=""
dataset=""
max_age_hours=0
while [[ $# -gt 0 ]]; do
  case "$1" in
    --policy) policy=${2:-}; shift ;;
    --graph) graph=${2:-}; shift ;;
    --sbom) sbom=${2:-}; shift ;;
    --time-anchor) time_anchor=${2:-}; shift ;;
    --dataset) dataset=${2:-}; shift ;;
    --max-age-hours) max_age_hours=${2:-0}; shift ;;
    *) if [[ -z "$lock" ]]; then lock=$1; else usage; fi ;;
  esac
  shift
done
[[ -z "$lock" || -z "$policy" || -z "$graph" || -z "$sbom" || -z "$time_anchor" || -z "$dataset" ]] && usage
require() { command -v "$1" >/dev/null || { echo "$1 is required" >&2; exit 2; }; }
require jq
require sha256sum
calc_sha() { sha256sum "$1" | awk '{print $1}'; }
lock_policy=$(jq -r '.policyBundleSha256' "$lock")
lock_graph=$(jq -r '.graphSha256' "$lock")
lock_sbom=$(jq -r '.sbomSha256' "$lock")
lock_anchor=$(jq -r '.timeAnchorSha256' "$lock")
lock_dataset=$(jq -r '.datasetSha256' "$lock")
lock_shadow=$(jq -r '.shadowIsolation' "$lock")
lock_scopes=$(jq -r '.requiredScopes[]?' "$lock" | tr '\n' ' ')
lock_generated=$(jq -r '.generatedAt' "$lock")
sha_ok() {
  [[ $1 =~ ^[A-Fa-f0-9]{64}$ ]]
}
for h in "$lock_policy" "$lock_graph" "$lock_sbom" "$lock_anchor" "$lock_dataset"; do
  sha_ok "$h" || { echo "invalid digest format: $h" >&2; exit 3; }
done
[[ "$lock_shadow" == "true" ]] || { echo "shadowIsolation must be true" >&2; exit 5; }
if ! grep -qi "policy:simulate:shadow" <<< "$lock_scopes"; then
  echo "requiredScopes missing policy:simulate:shadow" >&2
  exit 5
fi
[[ "$lock_policy" == "$(calc_sha "$policy")" ]] || { echo "policy digest mismatch" >&2; exit 3; }
[[ "$lock_graph" == "$(calc_sha "$graph")" ]] || { echo "graph digest mismatch" >&2; exit 3; }
[[ "$lock_sbom" == "$(calc_sha "$sbom")" ]] || { echo "sbom digest mismatch" >&2; exit 3; }
[[ "$lock_anchor" == "$(calc_sha "$time_anchor")" ]] || { echo "time anchor digest mismatch" >&2; exit 3; }
[[ "$lock_dataset" == "$(calc_sha "$dataset")" ]] || { echo "dataset digest mismatch" >&2; exit 3; }
if [[ $max_age_hours -gt 0 ]]; then
  now=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
  # Pass the timestamps as argv after "-" so the heredoc script can read them
  # from sys.argv (arguments cannot follow the heredoc terminator).
  age_hours=$(python3 - "$lock_generated" "$now" <<'PY'
import datetime
import sys

lock = sys.argv[1].replace('Z', '+00:00')
now = sys.argv[2].replace('Z', '+00:00')
l = datetime.datetime.fromisoformat(lock)
n = datetime.datetime.fromisoformat(now)
print((n - l).total_seconds() / 3600)
PY
)
  if (( $(printf '%.0f' "$age_hours") > max_age_hours )); then
    echo "lock stale: ${age_hours}h > ${max_age_hours}h" >&2
    exit 4
  fi
fi
echo "policy-sim lock verified (shadow mode enforced)."