feat: Add DigestUpsertRequest and LockEntity models

- Introduced DigestUpsertRequest to represent a digest upsert request, with properties ChannelId, Recipient, DigestKey, Events, and CollectUntil.
- Created LockEntity to represent a lightweight distributed-lock entry, with properties Id, TenantId, Resource, Owner, ExpiresAt, and CreatedAt; both shapes are sketched after this list.
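
A minimal sketch of the two shapes, assuming C# records; only the type and property names come from the commit message, while property types, namespaces, and modifiers are illustrative assumptions:

    using System;
    using System.Collections.Generic;

    // Hypothetical sketch - property names per the commit message; types are assumed.
    public sealed record DigestUpsertRequest(
        string ChannelId,
        string Recipient,
        string DigestKey,
        IReadOnlyList<string> Events,
        DateTimeOffset CollectUntil);

    // Hypothetical sketch of the lightweight distributed-lock entry.
    public sealed record LockEntity(
        string Id,
        string TenantId,
        string Resource,
        string Owner,
        DateTimeOffset ExpiresAt,
        DateTimeOffset CreatedAt);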

feat: Implement ILockRepository interface and LockRepository class

- Defined the ILockRepository interface with methods for acquiring and releasing locks.
- Implemented LockRepository with a try-acquire/release pair backed by SQL upsert operations (see the sketch below).
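
A hedged sketch of what the acquire/release surface and the SQL upsert could look like; the method names, signatures, table and column names, and the PostgreSQL syntax are assumptions, not the actual implementation:

    using System;
    using System.Threading;
    using System.Threading.Tasks;

    // Hypothetical interface shape; the real ILockRepository may differ.
    public interface ILockRepository
    {
        // Returns true when the lock on `resource` was acquired for `owner`.
        Task<bool> TryAcquireAsync(string resource, string owner, TimeSpan ttl, CancellationToken cancellationToken);

        // Releases the lock only if it is still held by `owner`.
        Task ReleaseAsync(string resource, string owner, CancellationToken cancellationToken);
    }

    // One common upsert shape for the acquire step (PostgreSQL assumed):
    //   INSERT INTO locks (resource, owner, expires_at)
    //   VALUES (@resource, @owner, now() + @ttl)
    //   ON CONFLICT (resource) DO UPDATE
    //     SET owner = EXCLUDED.owner, expires_at = EXCLUDED.expires_at
    //     WHERE locks.expires_at < now();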

feat: Add SurfaceManifestPointer record for manifest pointers

- Introduced SurfaceManifestPointer, a minimal pointer from an image digest to its associated Surface.FS manifest (an illustrative shape follows below).
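
An illustrative shape, assuming a C# record; the field names below are guesses beyond what the commit states (an image digest pointing at a Surface.FS manifest):

    // Hypothetical fields; only the record name and purpose come from the commit.
    public sealed record SurfaceManifestPointer(
        string ImageDigest,
        string ManifestUri,
        string ManifestDigest);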

feat: Create PolicySimulationInputLock and related validation logic

- Added the PolicySimulationInputLock record describing policy simulation inputs and their expected digests.
- Implemented validation for policy simulation inputs, including checks for digest drift and the shadow-mode requirement (see the sketch below).
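
A sketch of the two checks the commit names, digest drift and the shadow-mode requirement; the method shape, parameter names, and messages are assumptions:

    using System;
    using System.Collections.Generic;

    public static class PolicySimulationInputLockChecks
    {
        // Hypothetical validation sketch: compares expected digests from the lock
        // against observed digests and enforces shadow mode.
        public static IEnumerable<string> Validate(
            IReadOnlyDictionary<string, string> expectedDigests,
            IReadOnlyDictionary<string, string> observedDigests,
            bool shadowModeEnabled)
        {
            foreach (var (input, expected) in expectedDigests)
            {
                if (!observedDigests.TryGetValue(input, out var observed))
                    yield return $"missing simulation input '{input}'";
                else if (!string.Equals(expected, observed, StringComparison.OrdinalIgnoreCase))
                    yield return $"digest drift for '{input}': expected {expected}, observed {observed}";
            }

            if (!shadowModeEnabled)
                yield return "policy simulation must run in shadow mode";
        }
    }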

test: Add unit tests for ReplayVerificationService and ReplayVerifier

- Created ReplayVerificationServiceTests to validate the behavior of the ReplayVerificationService under various scenarios.
- Developed ReplayVerifierTests to ensure the correctness of the ReplayVerifier logic.

test: Implement PolicySimulationInputLockValidatorTests

- Added tests for PolicySimulationInputLockValidator verifying the validation logic against expected inputs and conditions (an illustrative test shape follows below).
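
For illustration only, an xUnit-style test against the hypothetical Validate sketch above; the actual tests in PolicySimulationInputLockValidatorTests will differ in naming and setup:

    using System.Collections.Generic;
    using System.Linq;
    using Xunit;

    public class PolicySimulationInputLockChecksTests
    {
        [Fact]
        public void Validate_FlagsDigestDrift()
        {
            var expected = new Dictionary<string, string> { ["sbom"] = "sha256:aaa" };
            var observed = new Dictionary<string, string> { ["sbom"] = "sha256:bbb" };

            var failures = PolicySimulationInputLockChecks
                .Validate(expected, observed, shadowModeEnabled: true)
                .ToList();

            Assert.Contains(failures, f => f.Contains("digest drift"));
        }
    }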

chore: Add cosign key example and signing scripts

- Included a placeholder cosign key example for development purposes.
- Added a script for signing Signals artifacts using cosign with support for both v2 and v3.

chore: Create script for uploading evidence to the evidence locker

- Developed a script to upload evidence to the evidence locker, ensuring required environment variables are set.
StellaOps Bot
2025-12-03 07:51:50 +02:00
parent 37cba83708
commit e923880694
171 changed files with 6567 additions and 2952 deletions


@@ -5,3 +5,4 @@
| OFFKIT-GAPS-125-011 | DONE | Offline kit gap remediation (OK1-OK10) via bundle meta + policy layers. |
| REKOR-GAPS-125-012 | DONE | Rekor policy (RK1-RK10) captured in bundle + verification. |
| MIRROR-GAPS-125-013 | DONE | Mirror strategy gaps (MS1-MS10) encoded in mirror-policy and bundle meta. |
| MIRROR-CRT-57-002 | DONE | Time-anchor DSSE emitted when SIGN_KEY is set; bundle meta + verifier check anchor integrity. |


@@ -44,6 +44,41 @@ else
}
]
}
# Optional: sign time anchor early so bundle meta can record DSSE hash
if [[ -n "${SIGN_KEY:-}" ]]; then
  python - <<'PY'
import base64, json, pathlib, os
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Staging paths: the anchor payload, plus the DSSE envelope written both next
# to the anchor and at the bundle output root.
stage = pathlib.Path(os.environ['STAGE'])
anchor = stage / 'layers' / 'time-anchor.json'
dsse_path = stage / 'layers' / 'time-anchor.dsse.json'
out_path = stage.parent / 'time-anchor.dsse.json'

# Sign the raw anchor payload with the Ed25519 key referenced by SIGN_KEY.
key_path = pathlib.Path(os.environ['SIGN_KEY'])
key: Ed25519PrivateKey = serialization.load_pem_private_key(key_path.read_bytes(), password=None)
payload = anchor.read_bytes()
sig = key.sign(payload)

# The sibling .pub file is parsed only to confirm it is valid PEM; its raw
# bytes are reused as the keyid below.
pub_path = key_path.with_suffix('.pub')
pub_key = serialization.load_pem_public_key(pub_path.read_bytes())
pub_raw = pub_path.read_bytes()

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

dsse = {
    "payloadType": "application/vnd.stellaops.time-anchor+json",
    "payload": b64url(payload),
    "signatures": [{"keyid": base64.urlsafe_b64encode(pub_raw).decode(), "sig": b64url(sig)}]
}
dsse_json = json.dumps(dsse, indent=2, sort_keys=True) + "\n"
dsse_path.write_text(dsse_json, encoding='utf-8')
out_path.write_text(dsse_json, encoding='utf-8')
print(f"Signed time-anchor DSSE -> {out_path}")
PY
fi
DATA
fi
@@ -287,6 +322,7 @@ def sha(path: pathlib.Path) -> str:
manifest_path = out / 'mirror-thin-v1.manifest.json'
tar_path = out / 'mirror-thin-v1.tar.gz'
time_anchor = stage / 'layers' / 'time-anchor.json'
time_anchor_dsse = out / 'time-anchor.dsse.json'
transport_plan = stage / 'layers' / 'transport-plan.json'
rekor_policy = stage / 'layers' / 'rekor-policy.json'
mirror_policy = stage / 'layers' / 'mirror-policy.json'
@@ -312,14 +348,15 @@ bundle = {
    'checkpoint_freshness_seconds': fresh,
    'artifacts': {
        'manifest': {'path': manifest_path.name, 'sha256': sha(manifest_path)},
        'tarball': {'path': tar_path.name, 'sha256': sha(tar_path)},
        'manifest_dsse': {'path': 'mirror-thin-v1.manifest.dsse.json', 'sha256': None},
        'bundle_meta': {'path': 'mirror-thin-v1.bundle.json', 'sha256': None},
        'bundle_dsse': {'path': 'mirror-thin-v1.bundle.dsse.json', 'sha256': None},
        'time_anchor': {'path': time_anchor.name, 'sha256': sha(time_anchor)},
        'time_anchor_dsse': {'path': time_anchor_dsse.name, 'sha256': sha(time_anchor_dsse)} if time_anchor_dsse.exists() else None,
        'transport_plan': {'path': transport_plan.name, 'sha256': sha(transport_plan)},
        'rekor_policy': {'path': rekor_policy.name, 'sha256': sha(rekor_policy)},
        'mirror_policy': {'path': mirror_policy.name, 'sha256': sha(mirror_policy)},
        'offline_policy': {'path': offline_policy.name, 'sha256': sha(offline_policy)},
        'artifact_hashes': {'path': artifact_hashes.name, 'sha256': sha(artifact_hashes)},
        'oci_index': {'path': 'oci/index.json', 'sha256': sha(oci_index)} if oci_index.exists() else None