# Concelier Backfill & Rollback Plan (STORE-AOC-19-005-DEV, Postgres)

## Objective

Prepare and rehearse the raw Link-Not-Merge backfill/rollback so Concelier Postgres reflects the dataset deterministically across dev/stage. This replaces the prior Mongo workflow.

## Inputs

- Dataset tarball: `out/linksets/linksets-stage-backfill.tar.zst`
- Files expected inside: `linksets.ndjson`, `advisory_chunks.ndjson`, `manifest.json`
- Record the SHA-256 of the tarball here when staged:

  ```
  $ sha256sum out/linksets/linksets-stage-backfill.tar.zst
  2b43ef9b5694f59be8c1d513893c506b8d1b8de152d820937178070bfc00d0c0  out/linksets/linksets-stage-backfill.tar.zst
  ```

- To regenerate the tarball deterministically from repo seeds: `./scripts/concelier/build-store-aoc-19-005-dataset.sh`
- To validate a tarball locally (counts + hashes): `./scripts/concelier/test-store-aoc-19-005-dataset.sh out/linksets/linksets-stage-backfill.tar.zst`

## Preflight

- Env:
  - `PGURI` (or `CONCELIER_PG_URI`) pointing to the target Postgres instance.
  - `PGSCHEMA` (default `lnm_raw`) for staging tables.
- Ensure a maintenance window for the bulk import; no concurrent writers to the staging tables.

## Backfill steps (CI-ready)

### Preferred: CI/manual script

- `scripts/concelier/backfill-store-aoc-19-005.sh /path/to/linksets-stage-backfill.tar.zst`
- Env: `PGURI` (or `CONCELIER_PG_URI`), optional `PGSCHEMA` (default `lnm_raw`), optional `DRY_RUN=1` for extraction-only.
- The script:
  - Extracts and validates the required files.
  - Creates/clears staging tables (`.linksets_raw`, `.advisory_chunks_raw`).
  - Imports via `\copy` from TSV derived with `jq -rc '[._id, .] | @tsv'`.
  - Prints counts and echoes the manifest.

### Manual steps (fallback)

1) Extract the dataset:

   ```
   mkdir -p out/linksets/extracted
   tar -xf out/linksets/linksets-stage-backfill.tar.zst -C out/linksets/extracted
   ```

2) Create/truncate staging tables and import:

   ```
   psql "$PGURI" <
   ```