# AOC linkset backfill

Purpose
- Safely backfill advisory linksets and observations under Aggregation-Only rules.
- Preserve offline kit integrity and determinism during data migrations.

Inputs
- Deterministic NDJSON dataset (gzip) for linksets and observations.
- Target database and collections for advisory linksets and observations.
- Offline kit bundle mirroring the backfill dataset.

Preparation
- Run a dry-run import to validate schema and guardrails (dry-run sketch below).
- Back up target collections before any import.
- Stage rollback scripts and confirm indexes are reproducible.
- Set ingestion flags for the backfill window (link-not-merge enabled; aggregation-only disabled only if required for rehearsal).

Execution
- Import the NDJSON dataset with deterministic ordering.
- Record import metrics and structured logs.
- Run a determinism probe that compares results against golden hashes (probe sketch below).

Rollback
- Restore from backup and reapply deterministic indexes.
- Re-run the determinism probes and confirm guard flags are reset.

Evidence to capture
- Backup hash or archive checksum.
- Import logs with record counts and zero merge counters.
- Determinism test results and hashes.
- Offline kit bundle hash.

Dataset generation
- Export from staging with a pinned tenant and stable ordering.
- Verify determinism by producing the export twice and hashing each NDJSON output; the hashes must match (hashing sketch below).
- Publish a .sha256 checksum file alongside the dataset.

Offline posture
- Backfill datasets are mirrored into offline kits for air-gap verification.
- Exports and evidence are stored as content-addressed artifacts (storage sketch below).
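
Worked sketches
The sketches below are illustrative Python, not the project's tooling; every file name, field name, and flag they mention is an assumption unless it appears in the steps above.

Dry-run validation. A minimal sketch of the Preparation dry run: stream the gzip NDJSON dataset, check each record against basic schema and guardrail expectations, and report counts without touching the target collections. The `kind` discriminator, the required field names, and the pinned-tenant check are assumptions; substitute the real schema.

```python
"""Dry-run validation sketch: parse and check records without importing.

Assumed (not from the runbook): the `kind` discriminator and the required
field names per record kind; replace them with the real schema.
"""
import gzip
import json
from collections import Counter
from pathlib import Path

REQUIRED_FIELDS = {
    "linkset": {"tenant", "linksetId", "advisoryIds"},      # assumed schema
    "observation": {"tenant", "observationId", "source"},   # assumed schema
}


def dry_run(dataset: Path, pinned_tenant: str) -> Counter:
    """Validate every record and return per-kind counts; raise on the first violation."""
    counts: Counter = Counter()
    with gzip.open(dataset, "rt", encoding="utf-8") as lines:
        for line_no, line in enumerate(lines, start=1):
            if not line.strip():
                continue
            record = json.loads(line)
            kind = record.get("kind")
            required = REQUIRED_FIELDS.get(kind)
            if required is None:
                raise ValueError(f"line {line_no}: unknown record kind {kind!r}")
            missing = required - record.keys()
            if missing:
                raise ValueError(f"line {line_no}: missing fields {sorted(missing)}")
            # Guardrail: a backfill export stays pinned to a single tenant.
            if record["tenant"] != pinned_tenant:
                raise ValueError(f"line {line_no}: unexpected tenant {record['tenant']!r}")
            counts[kind] += 1
    return counts


if __name__ == "__main__":
    print(dict(dry_run(Path("linksets-backfill.ndjson.gz"), pinned_tenant="tenant-a")))
```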
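
Determinism probe. A minimal sketch of the Execution-step probe: recompute a SHA-256 over the canonicalized records in their deterministic order and compare it with a recorded golden hash. The canonical form (sorted-key, compact JSON, one record per line) and the golden-hash file location are assumptions.

```python
"""Determinism probe sketch: compare a recomputed hash against a golden hash.

Assumed (not from the runbook): the canonical record form and the
sha256sum-style golden hash file.
"""
import gzip
import hashlib
import json
from pathlib import Path


def golden_hash(dataset: Path) -> str:
    """Hash the records in file order after canonicalizing each one."""
    digest = hashlib.sha256()
    with gzip.open(dataset, "rt", encoding="utf-8") as lines:
        for line in lines:
            if not line.strip():
                continue
            # Canonicalize so formatting differences do not change the hash.
            record = json.loads(line)
            canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
            digest.update(canonical.encode("utf-8"))
            digest.update(b"\n")
    return digest.hexdigest()


def probe(dataset: Path, expected_hash_file: Path) -> None:
    expected = expected_hash_file.read_text(encoding="utf-8").split()[0]
    actual = golden_hash(dataset)
    if actual != expected:
        raise AssertionError(f"determinism probe failed: {actual} != {expected}")
    print(f"determinism probe OK: {actual}")


if __name__ == "__main__":
    probe(Path("linksets-backfill.ndjson.gz"), Path("linksets-backfill.golden.sha256"))
```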
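
Export hashing and .sha256 sidecar. A minimal sketch of the Dataset generation check: hash two independent export runs, fail if they differ, and publish a sha256sum-style sidecar for the run that ships. Hashing the compressed file assumes the gzip stream itself is reproducible (for example, written with a zeroed mtime); otherwise hash the decompressed NDJSON instead.

```python
"""Export determinism check sketch: compare two export runs, publish a sidecar.

Assumed (not from the runbook): the file names and the sha256sum-style
sidecar format (`<digest>  <filename>`).
"""
import hashlib
from pathlib import Path


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file through SHA-256 so large exports need not fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_and_publish(export_run_1: Path, export_run_2: Path) -> Path:
    first, second = sha256_of(export_run_1), sha256_of(export_run_2)
    if first != second:
        raise RuntimeError(f"export is not deterministic: {first} != {second}")
    # Publish the sidecar next to the dataset that will be shipped.
    sidecar = export_run_1.with_name(export_run_1.name + ".sha256")
    sidecar.write_text(f"{first}  {export_run_1.name}\n", encoding="utf-8")
    return sidecar


if __name__ == "__main__":
    print(verify_and_publish(
        Path("run1/linksets-backfill.ndjson.gz"),
        Path("run2/linksets-backfill.ndjson.gz"),
    ))
```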
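
Content-addressed evidence store. A minimal sketch of the Offline posture statement: each artifact (export, import log, determinism report) is copied into a directory tree keyed by its SHA-256 digest, so an air-gapped kit can verify it by recomputing the hash. The store layout is an assumption; any content-addressed store serves the same purpose.

```python
"""Content-addressed storage sketch for backfill evidence.

Assumed (not from the runbook): the store layout (`<root>/<first two hex
chars>/<digest>`) and that evidence files are small enough to read whole.
"""
import hashlib
import shutil
from pathlib import Path


def store_artifact(artifact: Path, store_root: Path) -> Path:
    """Copy the artifact into the store under its own SHA-256 digest."""
    digest = hashlib.sha256(artifact.read_bytes()).hexdigest()
    # Shard by the first two hex characters to keep directories small.
    destination = store_root / digest[:2] / digest
    destination.parent.mkdir(parents=True, exist_ok=True)
    if not destination.exists():
        shutil.copy2(artifact, destination)
    return destination


if __name__ == "__main__":
    print(store_artifact(Path("import-log.json"), Path("evidence-store")))
```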