feat(docs): Add comprehensive documentation for Vexer, Vulnerability Explorer, and Zastava modules
- Introduced AGENTS.md, README.md, TASKS.md, and implementation_plan.md for Vexer, detailing mission, responsibilities, key components, and operational notes.
- Established a similar documentation structure for the Vulnerability Explorer and Zastava modules, including their respective workflows, integrations, and observability notes.
- Created risk scoring profiles documentation outlining the core workflow, factor model, governance, and deliverables.
- Ensured all modules adhere to the Aggregation-Only Contract and maintain determinism and provenance in outputs.

@@ -10,4 +10,4 @@
 | ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
 |----|--------|----------|------------|-------------|---------------|
 | ATTEST-TYPES-73-001 | TODO | Attestation Payloads Guild | ATTEST-TYPES-72-002 | Create golden payload samples for each type; integrate into tests and documentation. | Golden fixtures stored; tests compare outputs; docs embed examples. |
-| ATTEST-TYPES-73-002 | TODO | Attestation Payloads Guild, Docs Guild | ATTEST-TYPES-73-001 | Publish schema reference docs (`/docs/attestor/payloads.md`) with annotated JSON examples. | Doc merged with banner; examples validated by tests. |
+| ATTEST-TYPES-73-002 | TODO | Attestation Payloads Guild, Docs Guild | ATTEST-TYPES-73-001 | Publish schema reference docs (`/docs/modules/attestor/payloads.md`) with annotated JSON examples. | Doc merged with banner; examples validated by tests. |
@@ -10,4 +10,4 @@
 | ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
 |----|--------|----------|------------|-------------|---------------|
 | ATTEST-VERIFY-74-001 | TODO | Verification Guild, Observability Guild | ATTEST-VERIFY-73-001 | Emit telemetry (spans/metrics) tagged by subject, issuer, policy, result; integrate with dashboards. | Metrics visible; spans present; SLO thresholds defined. |
-| ATTEST-VERIFY-74-002 | TODO | Verification Guild, Docs Guild | ATTEST-VERIFY-73-001 | Document verification report schema and explainability in `/docs/attestor/workflows.md`. | Documentation merged; examples verified via tests. |
+| ATTEST-VERIFY-74-002 | TODO | Verification Guild, Docs Guild | ATTEST-VERIFY-73-001 | Document verification report schema and explainability in `/docs/modules/attestor/workflows.md`. | Documentation merged; examples verified via tests. |
@@ -21,7 +21,7 @@
 - Update `TASKS.md` as states change (TODO → DOING → DONE/BLOCKED) and record added tests/fixtures alongside implementation notes.
 
 ## Reference Materials
-- `docs/ARCHITECTURE_CONCELIER.md` for database operations surface area.
+- `docs/modules/concelier/ARCHITECTURE.md` for database operations surface area.
 - Backend OpenAPI/contract docs (once available) for job triggers and scanner endpoints.
 - Existing module AGENTS/TASKS files for style and coordination cues.
 - `docs/09_API_CLI_REFERENCE.md` (section 3) for the user-facing synopsis of the CLI verbs and flags.
@@ -2,7 +2,7 @@
 | ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
 |----|--------|----------|------------|-------------|---------------|
 | CLI-AOC-19-001 | DONE (2025-10-27) | DevEx/CLI Guild | CONCELIER-WEB-AOC-19-001, EXCITITOR-WEB-AOC-19-001 | Implement `stella sources ingest --dry-run` printing would-write payloads with forbidden field scan results and guard status. | Command displays diff-safe JSON, highlights forbidden fields, exits non-zero on guard violation, and has unit tests. |
-> Docs ready (2025-10-26): Reference behaviour/spec in `docs/cli/cli-reference.md` §2 and AOC reference §5.
+> Docs ready (2025-10-26): Reference behaviour/spec in `docs/modules/cli/guides/cli-reference.md` §2 and AOC reference §5.
 > 2025-10-27: CLI command scaffolded with backend client call, JSON/table output, gzip/base64 normalisation, and exit-code mapping. Awaiting Concelier dry-run endpoint + integration tests once backend lands.
 > 2025-10-27: Progress paused before adding CLI unit tests; blocked on extending `StubBackendClient` + fixtures for `ExecuteAocIngestDryRunAsync` coverage.
 > 2025-10-27: Added stubbed ingest responses + unit tests covering success/violation paths, output writing, and exit-code mapping (sketched below).
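
For readers following the exit-code contract above, a minimal sketch of how guard outcomes could map to process exit codes. The constant values and helper are hypothetical, not the CLI's actual implementation:

```csharp
// Hypothetical mapping of dry-run outcomes to exit codes; real values live
// in the CLI implementation and may differ.
internal static class AocExitCodes
{
    public const int Success = 0;
    public const int GuardViolation = 2;   // forbidden fields detected in would-write payloads
    public const int TransportError = 70;  // backend unreachable / non-2xx response

    public static int FromDryRunResult(bool hasGuardViolations, bool transportFailed)
        => transportFailed ? TransportError
         : hasGuardViolations ? GuardViolation
         : Success;
}
```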
@@ -11,7 +11,7 @@
 > 2025-10-27: CLI wiring in progress; backend client/command surface being added with table/JSON output.
 > 2025-10-27: Added JSON/table Spectre output, integration tests for exit-code handling, CLI metrics, and updated quickstart/architecture docs to cover guard workflows.
 | CLI-AOC-19-003 | DONE (2025-10-27) | Docs/CLI Guild | CLI-AOC-19-001, CLI-AOC-19-002 | Update CLI reference and quickstart docs to cover new commands, exit codes, and offline verification workflows. | Docs updated; examples recorded; release notes mention new commands. |
-> Docs note (2025-10-26): `docs/cli/cli-reference.md` now describes both commands, exit codes, and offline usage—sync help text once implementation lands.
+> Docs note (2025-10-26): `docs/modules/cli/guides/cli-reference.md` now describes both commands, exit codes, and offline usage—sync help text once implementation lands.
 > 2025-10-27: CLI reference now reflects final summary fields/JSON schema, quickstart includes verification/dry-run workflows, and API reference tables list both `sources ingest --dry-run` and `aoc verify`.
 
 ## Policy Engine v2
@@ -1,12 +1,12 @@
 # TASKS
 | Task | Owner(s) | Depends on | Notes |
 |---|---|---|---|
 |FEEDCONN-CCCS-02-001 Catalogue official CCCS advisory feeds|BE-Conn-CCCS|Research|**DONE (2025-10-11)** – Resolved RSS→Atom redirects (`/api/cccs/rss/v1/get?...` → `/api/cccs/atom/v1/get?...`), confirmed feed caps at 50 entries with inline HTML bodies, no `Last-Modified`/`ETag`, and `updated` timestamps in UTC. Findings and packet captures parked in `docs/concelier-connector-research-20251011.md`; retention sweep follow-up tracked in 02-007.|
 |FEEDCONN-CCCS-02-002 Implement fetch & source state handling|BE-Conn-CCCS|Source.Common, Storage.Mongo|**DONE (2025-10-14)** – `CccsConnector.FetchAsync` now hydrates feeds via `CccsFeedClient`, persists per-entry JSON payloads with SHA256 dedupe and cursor state, throttles requests, and records taxonomy + language metadata in document state.|
 |FEEDCONN-CCCS-02-003 DTO/parser implementation|BE-Conn-CCCS|Source.Common|**DONE (2025-10-14)** – Added `CccsHtmlParser` to sanitize Atom body HTML, extract serial/date/product bullets, collapse whitespace, and emit normalized reference URLs; `ParseAsync` now persists DTO records under schema `cccs.dto.v1`.|
 |FEEDCONN-CCCS-02-004 Canonical mapping & range primitives|BE-Conn-CCCS|Models|**DONE (2025-10-14)** – `CccsMapper` now materializes canonical advisories (aliases from serial/source/CVEs, references incl. canonical URL, vendor package records) with provenance masks; `MapAsync` stores results in `AdvisoryStore`.|
 |FEEDCONN-CCCS-02-005 Deterministic fixtures & tests|QA|Testing|**DONE (2025-10-14)** – Added English/French fixtures plus parser + connector end-to-end tests (`StellaOps.Concelier.Connector.Cccs.Tests`). Canned HTTP handler + Mongo fixture enables fetch→parse→map regression; fixtures refresh via `UPDATE_CCCS_FIXTURES=1`.|
-|FEEDCONN-CCCS-02-006 Observability & documentation|DevEx|Docs|**DONE (2025-10-15)** – Added `CccsDiagnostics` meter (fetch/parse/map counters), enriched connector logs with document counts, and published `docs/ops/concelier-cccs-operations.md` covering config, telemetry, and sanitiser guidance.|
+|FEEDCONN-CCCS-02-006 Observability & documentation|DevEx|Docs|**DONE (2025-10-15)** – Added `CccsDiagnostics` meter (fetch/parse/map counters), enriched connector logs with document counts, and published `docs/modules/concelier/operations/connectors/cccs.md` covering config, telemetry, and sanitiser guidance.|
 |FEEDCONN-CCCS-02-007 Historical advisory harvesting plan|BE-Conn-CCCS|Research|**DONE (2025-10-15)** – Measured `/api/cccs/threats/v1/get` inventory (~5.1k rows/lang; earliest 2018-06-08), documented backfill workflow + language split strategy, and linked the runbook for Offline Kit execution.|
 |FEEDCONN-CCCS-02-008 Raw DOM parsing refinement|BE-Conn-CCCS|Source.Common|**DONE (2025-10-15)** – Parser now walks unsanitised DOM (heading + nested list coverage), sanitizer keeps `<h#>`/`section` nodes, and regression fixtures/tests assert EN/FR list handling + preserved HTML structure.|
 |FEEDCONN-CCCS-02-009 Normalized versions rollout (Oct 2025)|BE-Conn-CCCS|Merge coordination (`FEEDMERGE-COORD-02-900`)|**TODO (due 2025-10-21)** – Implement trailing-version split helper per Merge guidance (see `../Merge/RANGE_PRIMITIVES_COORDINATION.md` “Helper snippets”) to emit `NormalizedVersions` via `SemVerRangeRuleBuilder`; refresh mapper tests/fixtures to assert provenance notes (`cccs:{serial}:{index}`) and confirm merge counters drop (helper sketched below).|
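
FEEDCONN-CCCS-02-009 above references a trailing-version split helper. A hedged sketch of just the split step, assuming product strings end with a parseable version; `SemVerRangeRuleBuilder` and the `cccs:{serial}:{index}` note format come from the task text, while the heuristic below is an assumption:

```csharp
// Split "Foo Widget 2.4.1" into ("Foo Widget", "2.4.1"); leave non-matching
// input intact. The real helper lives with the Merge guidance snippets.
static (string Product, string? Version) SplitTrailingVersion(string raw)
{
    var parts = raw.TrimEnd().Split(' ');
    var last = parts[^1];
    return Version.TryParse(last, out _)
        ? (string.Join(' ', parts[..^1]), last)
        : (raw, null);
}
```

The extracted version would then feed `SemVerRangeRuleBuilder` to emit `NormalizedVersions` entries carrying the `cccs:{serial}:{index}` provenance note.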
@@ -1,13 +1,13 @@
 # TASKS
 | Task | Owner(s) | Depends on | Notes |
 |---|---|---|---|
 |FEEDCONN-CERTBUND-02-001 Research CERT-Bund advisory endpoints|BE-Conn-CERTBUND|Research|**DONE (2025-10-11)** – Confirmed public RSS at `https://wid.cert-bund.de/content/public/securityAdvisory/rss` (HTTP 200 w/out cookies), 250-item window, German titles/categories, and detail links pointing to Angular SPA. Captured header profile (no cache hints) and logged open item to discover the JSON API used by `portal` frontend.|
 |FEEDCONN-CERTBUND-02-002 Fetch job & state persistence|BE-Conn-CERTBUND|Source.Common, Storage.Mongo|**DONE (2025-10-14)** – `CertBundConnector.FetchAsync` consumes RSS via session-bootstrapped client, stores per-advisory JSON documents with metadata + SHA, throttles detail requests, and maintains cursor state (pending docs/mappings, known advisory IDs, last published).|
 |FEEDCONN-CERTBUND-02-003 Parser/DTO implementation|BE-Conn-CERTBUND|Source.Common|**DONE (2025-10-14)** – Detail JSON piped through `CertBundDetailParser` (raw DOM sanitised to HTML), capturing severity, CVEs, product list, and references into DTO records (`cert-bund.detail.v1`).|
 |FEEDCONN-CERTBUND-02-004 Canonical mapping & range primitives|BE-Conn-CERTBUND|Models|**DONE (2025-10-14)** – `CertBundMapper` emits canonical advisories (aliases, references, vendor package ranges, provenance) with severity normalisation and deterministic ordering.|
 |FEEDCONN-CERTBUND-02-005 Regression fixtures & tests|QA|Testing|**DONE (2025-10-14)** – Added `StellaOps.Concelier.Connector.CertBund.Tests` covering fetch→parse→map against canned RSS/JSON fixtures; integration harness uses Mongo2Go + canned HTTP handler; fixtures regenerate via `UPDATE_CERTBUND_FIXTURES=1`.|
-|FEEDCONN-CERTBUND-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-15)** – Added `CertBundDiagnostics` (meter `StellaOps.Concelier.Connector.CertBund`) with fetch/parse/map counters + histograms, recorded coverage days, wired stage summary logs, and published the ops runbook (`docs/ops/concelier-certbund-operations.md`).|
+|FEEDCONN-CERTBUND-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-15)** – Added `CertBundDiagnostics` (meter `StellaOps.Concelier.Connector.CertBund`) with fetch/parse/map counters + histograms, recorded coverage days, wired stage summary logs, and published the ops runbook (`docs/modules/concelier/operations/connectors/certbund.md`).|
 |FEEDCONN-CERTBUND-02-007 Feed history & locale assessment|BE-Conn-CERTBUND|Research|**DONE (2025-10-15)** – Measured RSS retention (~6 days/≈250 items), captured connector-driven backfill guidance in the runbook, and aligned locale guidance (preserve `language=de`, Docs glossary follow-up). **Next:** coordinate with Tools to land the state-seeding helper so scripted backfills replace manual Mongo tweaks.|
 |FEEDCONN-CERTBUND-02-008 Session bootstrap & cookie strategy|BE-Conn-CERTBUND|Source.Common|**DONE (2025-10-14)** – Feed client primes the portal session (cookie container via `SocketsHttpHandler`), shares cookies across detail requests, and documents bootstrap behaviour in options (`PortalBootstrapUri`).|
-|FEEDCONN-CERTBUND-02-009 Offline Kit export packaging|BE-Conn-CERTBUND, Docs|Offline Kit|**DONE (2025-10-17)** – Added `tools/certbund_offline_snapshot.py` to capture search/export JSON, emit deterministic manifests + SHA files, and refreshed docs (`docs/ops/concelier-certbund-operations.md`, `docs/24_OFFLINE_KIT.md`) with offline-kit instructions and manifest layout guidance. Seed data README/ignore rules cover local snapshot hygiene.|
+|FEEDCONN-CERTBUND-02-009 Offline Kit export packaging|BE-Conn-CERTBUND, Docs|Offline Kit|**DONE (2025-10-17)** – Added `src/Tools/certbund_offline_snapshot.py` to capture search/export JSON, emit deterministic manifests + SHA files, and refreshed docs (`docs/modules/concelier/operations/connectors/certbund.md`, `docs/24_OFFLINE_KIT.md`) with offline-kit instructions and manifest layout guidance. Seed data README/ignore rules cover local snapshot hygiene.|
 |FEEDCONN-CERTBUND-02-010 Normalized range translator|BE-Conn-CERTBUND|Merge coordination (`FEEDMERGE-COORD-02-900`)|**TODO (due 2025-10-22)** – Translate `product.Versions` phrases (e.g., `2023.1 bis 2024.2`, `alle`) into comparator strings for `SemVerRangeRuleBuilder`, emit `NormalizedVersions` with `certbund:{advisoryId}:{vendor}` provenance, and extend tests/README with localisation notes (translation sketched below).|
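
FEEDCONN-CERTBUND-02-010 above asks for translating German version phrases into comparator strings. A sketch under the assumption that the comparator grammar accepts `>=`/`<=` pairs and `*` for "all"; the `ab` case is a guessed phrasing, and unknown inputs fall through for manual review:

```csharp
// Translate German product.Versions phrases into comparator strings for
// SemVerRangeRuleBuilder. Grammar and the "ab" case are assumptions.
static string? TranslateVersionPhrase(string phrase) => phrase.Trim() switch
{
    "alle" => "*",                                  // "all versions"
    var p when p.Contains(" bis ") =>               // "2023.1 bis 2024.2" -> ">=2023.1 <=2024.2"
        $">={p.Split(" bis ")[0].Trim()} <={p.Split(" bis ")[1].Trim()}",
    var p when p.StartsWith("ab ") =>               // "ab 5.0" -> ">=5.0" (assumed phrasing)
        $">={p[3..].Trim()}",
    _ => null,                                      // unknown phrasing: leave for manual review
};
```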
@@ -34,7 +34,7 @@
 
 - Dashboards: `certcc.*` meters (plan, summary fetch, detail fetch) plus `concelier.range.primitives` with tag `scheme=certcc.vendor`.
 - Logs: ensure Parse/Map jobs emit `correlationId` aligned with fetch events for traceability.
-- Data QA: run `tools/dump_advisory` against two VINCE notes (one multi-vendor, one single-vendor) every phase to spot-check normalized versions ordering and provenance.
+- Data QA: run `src/Tools/dump_advisory` against two VINCE notes (one multi-vendor, one single-vendor) every phase to spot-check normalized versions ordering and provenance.
 - Storage: verify Mongo TTL/size for `raw_documents` and `dtos`—detail payload volume increases by ~3× when mapping resumes.
 
 ## 5. Rollback / Contingency Playbook
@@ -7,6 +7,6 @@
 |Canonical mapping & range primitives|BE-Conn-CVE|Models|**DONE (2025-10-10)** – `CveMapper` emits canonical advisories, vendor range primitives, SemVer/range statuses, references, CVSS normalization.<br>2025-10-11 research trail: confirm subsequent MR adds `NormalizedVersions` shaped like `[{"scheme":"semver","type":"range","min":"<min>","minInclusive":true,"max":"<max>","maxInclusive":false,"notes":"nvd:CVE-2025-XXXX"}]` so storage provenance joins continue to work (record shape sketched below).|
 |Deterministic tests & fixtures|QA|Testing|**DONE (2025-10-10)** – Added `StellaOps.Concelier.Connector.Cve.Tests` harness with canned fixtures + snapshot regression covering fetch/parse/map.|
 |Observability & docs|DevEx|Docs|**DONE (2025-10-10)** – Diagnostics meter (`cve.fetch.*`, etc.) wired; options/usage documented via `CveServiceCollectionExtensions`.|
-|Operator rollout playbook|BE-Conn-CVE, Ops|Docs|**DONE (2025-10-12)** – Refreshed `docs/ops/concelier-cve-kev-operations.md` with credential checklist, smoke book, PromQL guardrails, and linked Grafana pack (`docs/ops/concelier-cve-kev-grafana-dashboard.json`).|
-|Live smoke & monitoring|QA, BE-Conn-CVE|WebService, Observability|**DONE (2025-10-15)** – Executed connector harness smoke using CVE Services sample window (CVE-2024-0001), confirmed fetch/parse/map telemetry (`cve.fetch.*`, `cve.map.success`) all incremented once, and archived the summary log + Grafana import guidance in `docs/ops/concelier-cve-kev-operations.md` (“Staging smoke 2025-10-15”).|
+|Operator rollout playbook|BE-Conn-CVE, Ops|Docs|**DONE (2025-10-12)** – Refreshed `docs/modules/concelier/operations/connectors/cve-kev.md` with credential checklist, smoke book, PromQL guardrails, and linked Grafana pack (`docs/modules/concelier/operations/connectors/cve-kev-grafana-dashboard.json`).|
+|Live smoke & monitoring|QA, BE-Conn-CVE|WebService, Observability|**DONE (2025-10-15)** – Executed connector harness smoke using CVE Services sample window (CVE-2024-0001), confirmed fetch/parse/map telemetry (`cve.fetch.*`, `cve.map.success`) all incremented once, and archived the summary log + Grafana import guidance in `docs/modules/concelier/operations/connectors/cve-kev.md` (“Staging smoke 2025-10-15”).|
 |FEEDCONN-CVE-02-003 Normalized versions rollout|BE-Conn-CVE|Models `FEEDMODELS-SCHEMA-01-003`, Normalization playbook|**DONE (2025-10-12)** – Confirmed SemVer primitives map to normalized rules with `cve:{cveId}:{identifier}` notes and refreshed snapshots; `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Cve.Tests` passes on net10 preview.|
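
The research trail in the first row above quotes the normalized-rule JSON shape. A shape-only C# record mirroring that payload; the record itself is hypothetical, but the property names follow the quoted JSON:

```csharp
// Mirrors the NormalizedVersions JSON quoted in the research trail above.
public sealed record NormalizedVersionRule(
    string Scheme,       // "semver"
    string Type,         // "range"
    string? Min,
    bool MinInclusive,
    string? Max,
    bool MaxInclusive,
    string Notes);       // e.g. "nvd:CVE-2025-XXXX", used for storage provenance joins
```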
@@ -8,10 +8,10 @@
 |Deterministic fixtures & tests|QA|Testing|**DONE (2025-10-10)** – New `StellaOps.Concelier.Connector.Ghsa.Tests` regression covers fetch/parse/map via canned GHSA fixtures and snapshot assertions.|
 |Telemetry & documentation|DevEx|Docs|**DONE (2025-10-10)** – Diagnostics meter (`ghsa.fetch.*`) wired; DI extension documents token/headers and job registrations.|
 |GitHub quota monitoring & retries|BE-Conn-GHSA, Observability|Source.Common|**DONE (2025-10-12)** – Rate-limit metrics/logs added, retry/backoff handles 403 secondary limits, and ops runbook documents dashboards + mitigation steps.|
-|Production credential & scheduler rollout|Ops, BE-Conn-GHSA|Docs, WebService|**DONE (2025-10-12)** – Scheduler defaults registered via `JobSchedulerBuilder`, credential provisioning documented (Compose/Helm samples), and staged backfill guidance captured in `docs/ops/concelier-ghsa-operations.md`.|
+|Production credential & scheduler rollout|Ops, BE-Conn-GHSA|Docs, WebService|**DONE (2025-10-12)** – Scheduler defaults registered via `JobSchedulerBuilder`, credential provisioning documented (Compose/Helm samples), and staged backfill guidance captured in `docs/modules/concelier/operations/connectors/ghsa.md`.|
 |FEEDCONN-GHSA-04-002 Conflict regression fixtures|BE-Conn-GHSA, QA|Merge `FEEDMERGE-ENGINE-04-001`|**DONE (2025-10-12)** – Added `conflict-ghsa.canonical.json` + `GhsaConflictFixtureTests`; SemVer ranges and credits align with merge precedence triple and shareable with QA. Validation: `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Ghsa.Tests/StellaOps.Concelier.Connector.Ghsa.Tests.csproj --filter GhsaConflictFixtureTests`.|
 |FEEDCONN-GHSA-02-004 GHSA credits & ecosystem severity mapping|BE-Conn-GHSA|Models `FEEDMODELS-SCHEMA-01-002`|**DONE (2025-10-11)** – Mapper emits advisory credits with provenance masks, fixtures assert role/contact ordering, and severity normalization remains unchanged.|
-|FEEDCONN-GHSA-02-007 Credit parity regression fixtures|BE-Conn-GHSA, QA|Source.Nvd, Source.Osv|**DONE (2025-10-12)** – Parity fixtures regenerated via `tools/FixtureUpdater`, normalized SemVer notes verified against GHSA/NVD/OSV snapshots, and the fixtures guide now documents the headroom checks.|
+|FEEDCONN-GHSA-02-007 Credit parity regression fixtures|BE-Conn-GHSA, QA|Source.Nvd, Source.Osv|**DONE (2025-10-12)** – Parity fixtures regenerated via `src/Tools/FixtureUpdater`, normalized SemVer notes verified against GHSA/NVD/OSV snapshots, and the fixtures guide now documents the headroom checks.|
 |FEEDCONN-GHSA-02-001 Normalized versions rollout|BE-Conn-GHSA|Models `FEEDMODELS-SCHEMA-01-003`, Normalization playbook|**DONE (2025-10-11)** – GHSA mapper now emits SemVer primitives + normalized ranges, fixtures refreshed, connector tests passing; report logged via FEEDMERGE-COORD-02-900.|
 |FEEDCONN-GHSA-02-005 Quota monitoring hardening|BE-Conn-GHSA, Observability|Source.Common metrics|**DONE (2025-10-12)** – Diagnostics expose headroom histograms/gauges, warning logs dedupe below the configured threshold, and the ops runbook gained alerting and mitigation guidance.|
 |FEEDCONN-GHSA-02-006 Scheduler rollout integration|BE-Conn-GHSA, Ops|Job scheduler|**DONE (2025-10-12)** – Dependency routine tests assert cron/timeouts, and the runbook highlights cron overrides plus backoff toggles for staged rollouts.|
@@ -9,7 +9,7 @@
 |FEEDCONN-ICSCISA-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-16)** – Ops guide documents attachment checks, SemVer exact values, and proxy guidance; diagnostics remain unchanged.|
 |FEEDCONN-ICSCISA-02-007 Detail document inventory|BE-Conn-ICS-CISA|Research|**DONE (2025-10-16)** – Validated canned detail pages vs feed output so attachment inventories stay aligned; archived expectations noted in `HANDOVER.md`.|
 |FEEDCONN-ICSCISA-02-008 Distribution fallback strategy|BE-Conn-ICS-CISA|Research|**DONE (2025-10-11)** – Outlined GovDelivery token request, HTML scrape + email digest fallback, and dependency on Ops for credential workflow; awaiting decision before fetch implementation.|
-|FEEDCONN-ICSCISA-02-009 GovDelivery credential onboarding|Ops, BE-Conn-ICS-CISA|Ops|**DONE (2025-10-14)** – GovDelivery onboarding runbook captured in `docs/ops/concelier-icscisa-operations.md`; secret vault path and Offline Kit handling documented.|
+|FEEDCONN-ICSCISA-02-009 GovDelivery credential onboarding|Ops, BE-Conn-ICS-CISA|Ops|**DONE (2025-10-14)** – GovDelivery onboarding runbook captured in `docs/modules/concelier/operations/connectors/ics-cisa.md`; secret vault path and Offline Kit handling documented.|
 |FEEDCONN-ICSCISA-02-010 Mitigation & SemVer polish|BE-Conn-ICS-CISA|02-003, 02-004|**DONE (2025-10-16)** – Attachment + mitigation references now land as expected and SemVer primitives carry exact values; end-to-end suite green (see `HANDOVER.md`).|
 |FEEDCONN-ICSCISA-02-011 Docs & telemetry refresh|DevEx|02-006|**DONE (2025-10-16)** – Ops documentation refreshed (attachments, SemVer validation, proxy knobs) and telemetry notes verified.|
 |FEEDCONN-ICSCISA-02-012 Normalized version decision|BE-Conn-ICS-CISA|Merge coordination (`FEEDMERGE-COORD-02-900`)|**TODO (due 2025-10-23)** – Promote existing `SemVerPrimitive` exact values into `NormalizedVersions` via `.ToNormalizedVersionRule("ics-cisa:{advisoryId}:{product}")`, add regression coverage, and open Models ticket if non-SemVer firmware requires a new scheme (promotion sketched below).|
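
FEEDCONN-ICSCISA-02-012 above promotes exact SemVer values into normalized rules. A sketch reusing the `NormalizedVersionRule` record from the CVE sketch earlier; the real `.ToNormalizedVersionRule` presumably hangs off `SemVerPrimitive`, so the string-based extension and the `exact` type tag here are simplifying assumptions:

```csharp
public static class SemVerPrimitiveExtensions
{
    // An exact value becomes a degenerate range: min == max, both inclusive.
    // "exact" as the type tag is an assumption; the quoted JSON only shows "range".
    public static NormalizedVersionRule ToNormalizedVersionRule(this string exactVersion, string note)
        => new("semver", "exact", exactVersion, true, exactVersion, true, note);
}

// usage, with the note format from the task text:
// var rule = "4.2.1".ToNormalizedVersionRule($"ics-cisa:{advisoryId}:{product}");
```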
@@ -8,5 +8,5 @@
 |Deterministic fixtures/tests|QA|Testing|**DONE** – End-to-end fetch→parse→map test with canned catalog + snapshot (`UPDATE_KEV_FIXTURES=1`) guards determinism.|
 |Telemetry & docs|DevEx|Docs|**DONE** – Connector emits structured logs + meters for catalog entries/advisories and AGENTS docs cover cadence/allowlist guidance.|
 |Schema validation & anomaly surfacing|BE-Conn-KEV, QA|Source.Common|**DONE (2025-10-12)** – Wired `IJsonSchemaValidator` + embedded schema, added failure reasons (`schema`, `download`, `invalidJson`, etc.), anomaly counters (`missingCveId`, `countMismatch`, `nullEntry`), and kept `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Kev.Tests` passing.|
-|Metrics export wiring|DevOps, DevEx|Observability|**DONE (2025-10-12)** – Added `kev.fetch.*` counters, parse failure/anomaly tags, refreshed ops runbook + Grafana dashboard (`docs/ops/concelier-cve-kev-grafana-dashboard.json`) with PromQL guidance.|
+|Metrics export wiring|DevOps, DevEx|Observability|**DONE (2025-10-12)** – Added `kev.fetch.*` counters, parse failure/anomaly tags, refreshed ops runbook + Grafana dashboard (`docs/modules/concelier/operations/connectors/cve-kev-grafana-dashboard.json`) with PromQL guidance (tagged-counter sketch below).|
 |FEEDCONN-KEV-02-003 Normalized versions propagation|BE-Conn-KEV|Models `FEEDMODELS-SCHEMA-01-003`, Normalization playbook|**DONE (2025-10-12)** – Validated catalog/date/due normalized rules emission + ordering; fixtures assert rule set and `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Kev.Tests` remains green.|
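
The KEV rows above mention failure-reason and anomaly counters. A minimal sketch using the standard `System.Diagnostics.Metrics` API; the meter name matches the connector namespace from the task text, while the counter names and tag keys are assumptions:

```csharp
using System.Collections.Generic;
using System.Diagnostics.Metrics;

// Tagged counters let dashboards slice failures by reason and anomalies by kind.
var meter = new Meter("StellaOps.Concelier.Connector.Kev");
var fetchFailures = meter.CreateCounter<long>("kev.fetch.failures");
var anomalies = meter.CreateCounter<long>("kev.parse.anomalies");

fetchFailures.Add(1, new KeyValuePair<string, object?>("reason", "schema"));
anomalies.Add(1, new KeyValuePair<string, object?>("kind", "missingCveId"));
```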
@@ -17,4 +17,4 @@
 |FEEDCONN-OSV-04-003 Parity fixture refresh|QA, BE-Conn-OSV|Normalized versions rollout, GHSA parity tests|**DONE (2025-10-12)** – Parity fixtures include normalizedVersions notes (`osv:<ecosystem>:<id>:<purl>`); regression math rerun via `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Osv.Tests` and docs flagged for workflow sync.|
 |FEEDCONN-OSV-04-002 Conflict regression fixtures|BE-Conn-OSV, QA|Merge `FEEDMERGE-ENGINE-04-001`|**DONE (2025-10-12)** – Added `conflict-osv.canonical.json` + regression asserting SemVer range + CVSS medium severity; dataset matches GHSA/NVD fixtures for merge tests. Validation: `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Osv.Tests/StellaOps.Concelier.Connector.Osv.Tests.csproj --filter OsvConflictFixtureTests`.|
 |FEEDCONN-OSV-04-004 Description/CWE/metric parity rollout|BE-Conn-OSV|Models, Core|**DONE (2025-10-15)** – OSV mapper writes advisory descriptions, `database_specific.cwe_ids` weaknesses, and canonical CVSS metric id. Parity fixtures (`osv-ghsa.*`, `osv-npm.snapshot.json`, `osv-pypi.snapshot.json`) refreshed and status communicated to Merge coordination.|
-|FEEDCONN-OSV-04-005 Canonical metric fallbacks & CWE notes|BE-Conn-OSV|Models, Merge|**DONE (2025-10-16)** – Add fallback logic and metrics for advisories lacking CVSS vectors, enrich CWE provenance notes, and document merge/export expectations; refresh parity fixtures accordingly.<br>2025-10-16: Mapper now emits `osv:severity/<level>` canonical ids for severity-only advisories, weakness provenance carries `database_specific.cwe_ids`, diagnostics expose `osv.map.canonical_metric_fallbacks`, parity fixtures regenerated, and ops notes added in `docs/ops/concelier-osv-operations.md`. Tests: `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Osv.Tests/StellaOps.Concelier.Connector.Osv.Tests.csproj`.|
+|FEEDCONN-OSV-04-005 Canonical metric fallbacks & CWE notes|BE-Conn-OSV|Models, Merge|**DONE (2025-10-16)** – Add fallback logic and metrics for advisories lacking CVSS vectors, enrich CWE provenance notes, and document merge/export expectations; refresh parity fixtures accordingly.<br>2025-10-16: Mapper now emits `osv:severity/<level>` canonical ids for severity-only advisories, weakness provenance carries `database_specific.cwe_ids`, diagnostics expose `osv.map.canonical_metric_fallbacks`, parity fixtures regenerated, and ops notes added in `docs/modules/concelier/operations/connectors/osv.md`. Tests: `dotnet test src/Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Osv.Tests/StellaOps.Concelier.Connector.Osv.Tests.csproj`.|
@@ -1,11 +1,11 @@
 # TASKS
 | Task | Owner(s) | Depends on | Notes |
 |---|---|---|---|
 |FEEDCONN-NKCKI-02-001 Research NKTsKI advisory feeds|BE-Conn-Nkcki|Research|**DONE (2025-10-11)** – Candidate RSS locations (`https://cert.gov.ru/rss/advisories.xml`, `https://www.cert.gov.ru/...`) return 403/404 even with `Accept-Language: ru-RU` and `--insecure`; site is Bitrix-backed and expects Russian Trusted Sub CA plus session cookies. Logged packet captures + needed cert list in `docs/concelier-connector-research-20251011.md`; waiting on Ops for sanctioned trust bundle.|
 |FEEDCONN-NKCKI-02-002 Fetch pipeline & state persistence|BE-Conn-Nkcki|Source.Common, Storage.Mongo|**DONE (2025-10-13)** – Listing fetch now honours `maxListingPagesPerFetch`, persists cache hits when listing access fails, and records telemetry via `RuNkckiDiagnostics`. Cursor tracking covers pending documents/mappings and the known bulletin ring buffer.|
 |FEEDCONN-NKCKI-02-003 DTO & parser implementation|BE-Conn-Nkcki|Source.Common|**DONE (2025-10-13)** – Parser normalises nested arrays (ICS categories, vulnerable software lists, optional tags), flattens multiline `software_text`, and guarantees deterministic ordering for URLs and tags.|
 |FEEDCONN-NKCKI-02-004 Canonical mapping & range primitives|BE-Conn-Nkcki|Models|**DONE (2025-10-13)** – Mapper splits structured software entries, emits SemVer range primitives + normalized rules, deduplicates references, and surfaces CVSS v4 metadata alongside existing metrics.|
-|FEEDCONN-NKCKI-02-005 Deterministic fixtures & tests|QA|Testing|**DONE (2025-10-13)** – Fixtures refreshed with multi-page pagination + multi-entry bulletins. Tests exercise cache replay and rely on bundled OpenSSL 1.1 libs in `tools/openssl/linux-x64` to keep Mongo2Go green on modern distros.|
-|FEEDCONN-NKCKI-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-13)** – Added connector-specific metrics (`nkcki.*`) and documented configuration/operational guidance in `docs/ops/concelier-nkcki-operations.md`.|
-|FEEDCONN-NKCKI-02-007 Archive ingestion strategy|BE-Conn-Nkcki|Research|**DONE (2025-10-13)** – Documented Bitrix pagination/backfill plan (cache-first, offline replay, HTML/PDF capture) in `docs/ops/concelier-nkcki-operations.md`.|
+|FEEDCONN-NKCKI-02-005 Deterministic fixtures & tests|QA|Testing|**DONE (2025-10-13)** – Fixtures refreshed with multi-page pagination + multi-entry bulletins. Tests exercise cache replay and rely on bundled OpenSSL 1.1 libs in `src/Tools/openssl/linux-x64` to keep Mongo2Go green on modern distros.|
+|FEEDCONN-NKCKI-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-13)** – Added connector-specific metrics (`nkcki.*`) and documented configuration/operational guidance in `docs/modules/concelier/operations/connectors/nkcki.md`.|
+|FEEDCONN-NKCKI-02-007 Archive ingestion strategy|BE-Conn-Nkcki|Research|**DONE (2025-10-13)** – Documented Bitrix pagination/backfill plan (cache-first, offline replay, HTML/PDF capture) in `docs/modules/concelier/operations/connectors/nkcki.md`.|
 |FEEDCONN-NKCKI-02-008 Access enablement plan|BE-Conn-Nkcki|Source.Common|**DONE (2025-10-11)** – Documented trust-store requirement, optional SOCKS proxy fallback, and monitoring plan; shared TLS support now available via `SourceHttpClientOptions.TrustedRootCertificates` (`concelier:httpClients:source.nkcki:*`), awaiting Ops-sourced cert bundle before fetch implementation.|
@@ -1,11 +1,11 @@
 # TASKS
 | Task | Owner(s) | Depends on | Notes |
 |---|---|---|---|
 |Catalogue Apple security bulletin sources|BE-Conn-Apple|Research|**DONE** – Feed contract documented in README (Software Lookup Service JSON + HT article hub) with rate-limit notes.|
 |Fetch pipeline & state persistence|BE-Conn-Apple|Source.Common, Storage.Mongo|**DONE** – Index fetch + detail ingestion with SourceState cursoring/allowlists committed; awaiting live smoke run before enabling in scheduler defaults.|
 |Parser & DTO implementation|BE-Conn-Apple|Source.Common|**DONE** – AngleSharp detail parser produces canonical DTO payloads (CVE list, timestamps, affected tables) persisted via DTO store.|
 |Canonical mapping & range primitives|BE-Conn-Apple|Models|**DONE** – Mapper now emits SemVer-derived normalizedVersions with `apple:<platform>:<product>` notes; fixtures updated to assert canonical rules while we continue tracking multi-device coverage in follow-up tasks.<br>2025-10-11 research trail: confirmed payload aligns with `[{"scheme":"semver","type":"range","min":"<build-start>","minInclusive":true,"max":"<build-end>","maxInclusive":false,"notes":"apple:ios:17.1"}]`; continue using `notes` to surface build identifiers for storage provenance.|
 |Deterministic fixtures/tests|QA|Testing|**DONE (2025-10-12)** – Parser now scopes references to article content, sorts affected rows deterministically, and regenerated fixtures (125326/125328/106355/HT214108/HT215500) produce stable JSON + sanitizer HTML in English.|
-|Telemetry & documentation|DevEx|Docs|**DONE (2025-10-12)** – OpenTelemetry pipeline exports `StellaOps.Concelier.Connector.Vndr.Apple`; runbook `docs/ops/concelier-apple-operations.md` added with metrics + monitoring guidance.|
+|Telemetry & documentation|DevEx|Docs|**DONE (2025-10-12)** – OpenTelemetry pipeline exports `StellaOps.Concelier.Connector.Vndr.Apple`; runbook `docs/modules/concelier/operations/connectors/apple.md` added with metrics + monitoring guidance.|
 |Live HTML regression sweep|QA|Source.Common|**DONE (2025-10-12)** – Captured latest support.apple.com articles for 125326/125328/106355/HT214108/HT215500, trimmed nav noise, and committed sanitized HTML + expected DTOs with invariant timestamps.|
 |Fixture regeneration tooling|DevEx|Testing|**DONE (2025-10-12)** – `scripts/update-apple-fixtures.(sh|ps1)` set the env flag + sentinel, forward through WSLENV, and clean up after regeneration; README references updated usage.|
@@ -1,12 +1,12 @@
 # TASKS
 | Task | Owner(s) | Depends on | Notes |
 |---|---|---|---|
 |FEEDCONN-CISCO-02-001 Confirm Cisco PSIRT data source|BE-Conn-Cisco|Research|**DONE (2025-10-11)** – Selected openVuln REST API (`https://apix.cisco.com/security/advisories/v2/…`) as primary (structured JSON, CSAF/CVRF links) with RSS as fallback. Documented OAuth2 client-credentials flow (`cloudsso.cisco.com/as/token.oauth2`), baseline quotas (5 req/s, 30 req/min, 5 000 req/day), and pagination contract (`pageIndex`, `pageSize≤100`) in `docs/concelier-connector-research-20251011.md`.|
 |FEEDCONN-CISCO-02-002 Fetch pipeline & state persistence|BE-Conn-Cisco|Source.Common, Storage.Mongo|**DONE (2025-10-14)** – Fetch job now streams openVuln pages with OAuth bearer handler, honours 429 `Retry-After`, persists per-advisory JSON + metadata into GridFS, and updates cursor (`lastModified`, advisory ID, pending docs).|
 |FEEDCONN-CISCO-02-003 Parser & DTO implementation|BE-Conn-Cisco|Source.Common|**DONE (2025-10-14)** – DTO factory normalizes SIR, folds CSAF product statuses, and persists `cisco.dto.v1` payloads (see `CiscoDtoFactory`).|
 |FEEDCONN-CISCO-02-004 Canonical mapping & range primitives|BE-Conn-Cisco|Models|**DONE (2025-10-14)** – `CiscoMapper` emits canonical advisories with vendor + SemVer primitives, provenance, and status tags.|
 |FEEDCONN-CISCO-02-005 Deterministic fixtures & tests|QA|Testing|**DONE (2025-10-14)** – Added unit tests (`StellaOps.Concelier.Connector.Vndr.Cisco.Tests`) exercising DTO/mapper pipelines; `dotnet test` validated.|
-|FEEDCONN-CISCO-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-14)** – Cisco diagnostics counters exposed and ops runbook updated with telemetry guidance (`docs/ops/concelier-cisco-operations.md`).|
+|FEEDCONN-CISCO-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-14)** – Cisco diagnostics counters exposed and ops runbook updated with telemetry guidance (`docs/modules/concelier/operations/connectors/cisco.md`).|
 |FEEDCONN-CISCO-02-007 API selection decision memo|BE-Conn-Cisco|Research|**DONE (2025-10-11)** – Drafted decision matrix: openVuln (structured/delta filters, OAuth throttle) vs RSS (delayed/minimal metadata). Pending OAuth onboarding (`FEEDCONN-CISCO-02-008`) before final recommendation circulated.|
-|FEEDCONN-CISCO-02-008 OAuth client provisioning|Ops, BE-Conn-Cisco|Ops|**DONE (2025-10-14)** – `docs/ops/concelier-cisco-operations.md` documents OAuth provisioning/rotation, quotas, and Offline Kit distribution guidance.|
+|FEEDCONN-CISCO-02-008 OAuth client provisioning|Ops, BE-Conn-Cisco|Ops|**DONE (2025-10-14)** – `docs/modules/concelier/operations/connectors/cisco.md` documents OAuth provisioning/rotation, quotas, and Offline Kit distribution guidance.|
 |FEEDCONN-CISCO-02-009 Normalized SemVer promotion|BE-Conn-Cisco|Merge coordination (`FEEDMERGE-COORD-02-900`)|**TODO (due 2025-10-21)** – Use helper from `../Merge/RANGE_PRIMITIVES_COORDINATION.md` to convert `SemVerPrimitive` outputs into `NormalizedVersionRule` with provenance (`cisco:{productId}`), update mapper/tests, and confirm merge normalized-rule counters drop.|
@@ -1,11 +1,11 @@
 # TASKS
 | Task | Owner(s) | Depends on | Notes |
 |---|---|---|---|
 |FEEDCONN-MSRC-02-001 Document MSRC Security Update Guide API|BE-Conn-MSRC|Research|**DONE (2025-10-11)** – Confirmed REST endpoint (`https://api.msrc.microsoft.com/sug/v2.0/en-US/vulnerabilities`) + CVRF ZIP download flow, required Azure AD client-credentials scope (`api://api.msrc.microsoft.com/.default`), mandatory `api-version=2024-08-01` header, and delta params (`lastModifiedStartDateTime`, `lastModifiedEndDateTime`). Findings recorded in `docs/concelier-connector-research-20251011.md`.|
 |FEEDCONN-MSRC-02-002 Fetch pipeline & source state|BE-Conn-MSRC|Source.Common, Storage.Mongo|**DONE (2025-10-15)** – Added `MsrcApiClient` + token provider, cursor overlap handling, and detail persistence via GridFS (metadata carries CVRF URL + timestamps). State tracks `lastModifiedCursor` with configurable overlap/backoff. **Next:** coordinate with Tools on shared state-seeding helper once CVRF download flag stabilises.|
 |FEEDCONN-MSRC-02-003 Parser & DTO implementation|BE-Conn-MSRC|Source.Common|**DONE (2025-10-15)** – Implemented `MsrcDetailParser`/DTOs capturing threats, remediations, KB IDs, CVEs, CVSS, and affected products (build/platform metadata preserved).|
 |FEEDCONN-MSRC-02-004 Canonical mapping & range primitives|BE-Conn-MSRC|Models|**DONE (2025-10-15)** – `MsrcMapper` emits aliases (MSRC ID/CVE/KB), references (release notes + CVRF), vendor packages with `msrc.build` normalized rules, and CVSS provenance.|
 |FEEDCONN-MSRC-02-005 Deterministic fixtures/tests|QA|Testing|**DONE (2025-10-15)** – Added `StellaOps.Concelier.Connector.Vndr.Msrc.Tests` with canned token/summary/detail responses and snapshot assertions via Mongo2Go. Fixtures regenerate via `UPDATE_MSRC_FIXTURES`.|
-|FEEDCONN-MSRC-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-15)** – Introduced `MsrcDiagnostics` meter (summary/detail/parse/map metrics), structured fetch logs, README updates, and Ops brief `docs/ops/concelier-msrc-operations.md` covering AAD onboarding + CVRF handling.|
+|FEEDCONN-MSRC-02-006 Telemetry & documentation|DevEx|Docs|**DONE (2025-10-15)** – Introduced `MsrcDiagnostics` meter (summary/detail/parse/map metrics), structured fetch logs, README updates, and Ops brief `docs/modules/concelier/operations/connectors/msrc.md` covering AAD onboarding + CVRF handling.|
 |FEEDCONN-MSRC-02-007 API contract comparison memo|BE-Conn-MSRC|Research|**DONE (2025-10-11)** – Completed memo outline recommending dual-path (REST for incremental, CVRF for offline); implementation hinges on `FEEDCONN-MSRC-02-008` AAD onboarding for token acquisition.|
-|FEEDCONN-MSRC-02-008 Azure AD application onboarding|Ops, BE-Conn-MSRC|Ops|**DONE (2025-10-15)** – Coordinated Ops handoff; drafted AAD onboarding brief (`docs/ops/concelier-msrc-operations.md`) with app registration requirements, secret rotation policy, sample configuration, and CVRF mirroring guidance for Offline Kit.|
+|FEEDCONN-MSRC-02-008 Azure AD application onboarding|Ops, BE-Conn-MSRC|Ops|**DONE (2025-10-15)** – Coordinated Ops handoff; drafted AAD onboarding brief (`docs/modules/concelier/operations/connectors/msrc.md`) with app registration requirements, secret rotation policy, sample configuration, and CVRF mirroring guidance for Offline Kit.|
@@ -1,143 +0,0 @@
# Concelier Vulnerability Conflict Resolution Rules

This document defines the canonical, deterministic conflict resolution strategy for merging vulnerability data from **NVD**, **GHSA**, and **OSV** in Concelier.

---

## 🧭 Source Precedence

1. **Primary order:**
   `GHSA > NVD > OSV`

   **Rationale:** GHSA advisories are human-curated and fast to correct; NVD has the broadest CVE coverage; OSV excels in ecosystem-specific precision.

2. **Freshness override (≥48 h):**
   If a **lower-priority** source is **newer by at least 48 hours** for a freshness-sensitive field, its value overrides the higher-priority one. Always store the decision in a provenance record (see the sketch after this list).

3. **Merge scope:**
   Only merge data referring to the **same CVE ID** or the same GHSA/OSV advisory explicitly mapped to that CVE.
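A minimal sketch of the ≥48 h freshness override for a single freshness-sensitive field, assuming each candidate carries its source name and a `lastModified` timestamp (the `Candidate` type is illustrative):

```csharp
sealed record Candidate(string Source, string Value, DateTimeOffset LastModified);

// The lower-priority candidate wins only if it is newer by at least 48 hours;
// otherwise precedence holds. The caller records which branch fired as the
// provenance decisionReason ("freshness" vs "precedence").
static Candidate PickFreshnessSensitive(Candidate higherPriority, Candidate lowerPriority)
{
    var newerBy = lowerPriority.LastModified - higherPriority.LastModified;
    return newerBy >= TimeSpan.FromHours(48) ? lowerPriority : higherPriority;
}
```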
---

## 🧩 Field-Level Precedence

| Field | Priority | Freshness-Sensitive | Notes |
|-------|-----------|--------------------|-------|
| Title / Summary | GHSA → NVD → OSV | ✅ | Prefer concise structured titles |
| Description | GHSA → NVD → OSV | ✅ | |
| Severity (CVSS) | NVD → GHSA → OSV | ❌ | Keep all under `metrics[]`, mark `canonicalMetric` by order |
| Ecosystem Severity Label | GHSA → OSV | ❌ | Supplemental tag only |
| Affected Packages / Ranges | OSV → GHSA → NVD | ✅ | OSV strongest for SemVer normalization |
| CWE(s) | NVD → GHSA → OSV | ❌ | NVD taxonomy most stable |
| References / Links | Union of all | ✅ | Deduplicate by normalized URL |
| Credits / Acknowledgements | Union of all | ✅ | Sort by role, displayName |
| Published / Modified timestamps | Earliest published / Latest modified | ✅ | |
| EPSS / KEV / Exploit status | Specialized feed only | ❌ | Do not override manually |
---

## ⚖️ Deterministic Tie-Breakers

If precedence and freshness both tie:

1. **Source order:** GHSA > NVD > OSV
2. **Lexicographic stability:** Prefer shorter normalized text; if equal, ASCIIbetical
3. **Stable hash of payload:** Lowest hash wins (sketched after the example below)

Each chosen value must store the merge rationale:

```json
{
  "provenance": {
    "source": "GHSA",
    "kind": "merge",
    "value": "description",
    "decisionReason": "precedence"
  }
}
```
---
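For the tie-breaker chain itself, a hedged sketch reusing the hypothetical `FieldCandidate` from the sketch above:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

public static class TieBreakers
{
    public static FieldCandidate Apply(IReadOnlyList<FieldCandidate> tied) =>
        tied
            // 1. Source order: GHSA > NVD > OSV (lower rank wins).
            .OrderBy(c => c.Source switch { "GHSA" => 0, "NVD" => 1, _ => 2 })
            // 2. Shorter normalized text first; if equal length, ASCIIbetical (ordinal).
            .ThenBy(c => c.Value.Length)
            .ThenBy(c => c.Value, StringComparer.Ordinal)
            // 3. Lowest stable hash of the payload wins.
            .ThenBy(
                c => Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(c.Value))),
                StringComparer.Ordinal)
            .First();
}
```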
---

## 🧮 Merge Algorithm (Pseudocode)

```
inputs: records = {ghsa?, nvd?, osv?}
out = new CanonicalVuln(CVE)

foreach field in CANONICAL_SCHEMA:
    candidates = collect(values, source, lastModified)
    if freshnessSensitive(field) and newerBy48h(lowerPriority):
        pick newest
    else:
        pick by precedence(field)
        if tie:
            applyTieBreakers()
    out.field = normalize(field, value)
    out.provenance[field] = decisionTrail

out.references = dedupe(union(all.references))
out.affected = normalizeAndUnion(OSV, GHSA, NVD)
out.metrics = rankAndSetCanonical(NVDv3 → GHSA → OSV → v2)
return out
```
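The `rankAndSetCanonical` step reads as a stable ordering over the collected metrics. An illustrative C# rendering, with `VexMetric` as an assumed shape rather than the real model:

```csharp
using System.Collections.Generic;
using System.Linq;

public sealed record VexMetric(string Source, string Version, string Vector, double Score)
{
    public bool Canonical { get; init; }
}

public static class MetricRanker
{
    // NVD v3 → GHSA → OSV → v2; anything else sorts last.
    private static int Rank(VexMetric m) => (m.Source, m.Version) switch
    {
        ("NVD", "3.1") or ("NVD", "3.0") => 0,
        ("GHSA", _) => 1,
        ("OSV", _) => 2,
        (_, "2.0") => 3,
        _ => 4,
    };

    public static List<VexMetric> RankAndSetCanonical(IEnumerable<VexMetric> metrics) =>
        metrics
            .OrderBy(Rank)
            .Select((m, i) => m with { Canonical = i == 0 }) // first-ranked metric is canonical
            .ToList();
}
```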
---

## 🔧 Normalization Rules

* **SemVer:**
  Parse with tolerant builder; normalize `v` prefixes; map comparators (`<=`, `<`, `>=`, `>`); expand OSV events into continuous ranges (see the sketch after this list).

* **Packages:**
  Canonical key = `(ecosystem, packageName, language?)`; maintain aliases (purl, npm, Maven GAV, etc.).

* **CWE:**
  Store both ID and name; validate against the current CWE catalog.

* **CVSS:**
  Preserve the provided vector and base score; recompute only for validation.
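A minimal sketch of the OSV event expansion, using illustrative `OsvEvent`/`VersionRange` shapes rather than the actual OSV schema types:

```csharp
using System.Collections.Generic;

public sealed record OsvEvent(string? Introduced, string? Fixed);

public sealed record VersionRange(string Introduced, string? FixedExclusive);

public static class OsvRangeExpander
{
    public static IReadOnlyList<VersionRange> Expand(IEnumerable<OsvEvent> events)
    {
        var ranges = new List<VersionRange>();
        string? open = null;

        foreach (var evt in events)
        {
            if (evt.Introduced is not null)
            {
                // Normalize the "v" prefix before comparisons.
                open = evt.Introduced.TrimStart('v', 'V');
            }
            else if (evt.Fixed is not null && open is not null)
            {
                // A "fixed" event closes the currently open range (exclusive bound).
                ranges.Add(new VersionRange(open, evt.Fixed.TrimStart('v', 'V')));
                open = null;
            }
        }

        if (open is not null)
        {
            // No fix yet: the range stays open-ended.
            ranges.Add(new VersionRange(open, FixedExclusive: null));
        }

        return ranges;
    }
}
```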
---

## ✅ Output Guarantees

| Property | Description |
|----------|-------------|
| **Reproducible** | Same input → same canonical output |
| **Auditable** | Provenance stored per field |
| **Complete** | Unions with de-duplication |
| **Composable** | Future layers (KEV, EPSS, vendor advisories) can safely extend precedence rules |

---
## 🧠 Example

* GHSA summary updated on *2025-10-09*
* NVD last modified *2025-10-05*
* OSV updated *2025-10-12*

→ **Summary:** OSV wins (freshness override: newer than GHSA by more than 48 hours)
→ **CVSS:** NVD v3.1 remains canonical
→ **Affected:** OSV ranges canonical; GHSA aliases merged

---
## 🧰 Optional C# Helper Class

`StellaOps.Concelier.Core/CanonicalMerger.cs`

Implements:

* `FieldPrecedenceMap`
* `FreshnessSensitiveFields`
* `ApplyTieBreakers()`
* `NormalizeAndUnion()`

Deterministically builds `CanonicalVuln` with full provenance tracking.
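A skeleton of how `CanonicalMerger` could hang together, reusing the hypothetical `FieldCandidate` and `PrecedenceResolver` from the earlier sketches; the shipped class may differ:

```csharp
using System.Collections.Generic;

public sealed class CanonicalMerger
{
    // Field → ordered source precedence; the first listed source wins by default.
    public static readonly IReadOnlyDictionary<string, string[]> FieldPrecedenceMap =
        new Dictionary<string, string[]>
        {
            ["title"] = new[] { "GHSA", "NVD", "OSV" },
            ["description"] = new[] { "GHSA", "NVD", "OSV" },
            ["severity"] = new[] { "NVD", "GHSA", "OSV" },
            ["affected"] = new[] { "OSV", "GHSA", "NVD" },
            ["cwe"] = new[] { "NVD", "GHSA", "OSV" },
        };

    // Fields where a lower-priority source that is >= 48 h newer may override.
    public static readonly IReadOnlySet<string> FreshnessSensitiveFields =
        new HashSet<string> { "title", "description", "affected", "references", "credits" };

    private static readonly string[] DefaultOrder = { "GHSA", "NVD", "OSV" };

    public (FieldCandidate Winner, string Reason) Choose(
        string field, IReadOnlyList<FieldCandidate> candidates)
    {
        var order = FieldPrecedenceMap.TryGetValue(field, out var p) ? p : DefaultOrder;
        var sensitive = FreshnessSensitiveFields.Contains(field);
        // The reason string ("precedence" | "freshness") feeds the per-field provenance record.
        return PrecedenceResolver.Resolve(candidates, order, sensitive);
    }
}
```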
@@ -1,94 +1,94 @@
# TASKS — Epic 1: Aggregation-Only Contract

> **AOC Reminder:** Excititor WebService publishes raw statements/linksets only; derived precedence/severity belongs to Policy overlays.

| ID | Status | Owner(s) | Depends on | Notes |
|---|---|---|---|---|
| EXCITITOR-WEB-AOC-19-001 `Raw VEX ingestion APIs` | TODO | Excititor WebService Guild | EXCITITOR-CORE-AOC-19-001, EXCITITOR-STORE-AOC-19-001 | Implement `POST /ingest/vex`, `GET /vex/raw*`, and `POST /aoc/verify` endpoints. Enforce Authority scopes, tenant injection, and the guard pipeline to ensure only immutable VEX facts are persisted. |
> Docs alignment (2025-10-26): See AOC reference §4–5 and the authority scopes doc for required tokens/behaviour.
| EXCITITOR-WEB-AOC-19-002 `AOC observability + metrics` | TODO | Excititor WebService Guild, Observability Guild | EXCITITOR-WEB-AOC-19-001 | Export metrics (`ingestion_write_total`, `aoc_violation_total`, signature verification counters) and tracing spans matching Concelier naming. Ensure structured logging includes tenant, source vendor, upstream id, and content hash. |
> Docs alignment (2025-10-26): Metrics/traces/log schema in `docs/observability/observability.md`.
| EXCITITOR-WEB-AOC-19-003 `Guard + schema test harness` | TODO | QA Guild | EXCITITOR-WEB-AOC-19-001 | Add unit/integration tests for schema validation, forbidden field rejection (`ERR_AOC_001/006/007`), and supersedes behavior using CycloneDX-VEX & CSAF fixtures with deterministic expectations. |
> Docs alignment (2025-10-26): Error codes + CLI verification in `docs/cli/cli-reference.md`.
> Docs alignment (2025-10-26): Error codes + CLI verification in `docs/modules/cli/guides/cli-reference.md`.
| EXCITITOR-WEB-AOC-19-004 `Batch ingest validation` | TODO | Excititor WebService Guild, QA Guild | EXCITITOR-WEB-AOC-19-003, EXCITITOR-CORE-AOC-19-002 | Build large fixture ingest covering mixed VEX statuses, verifying raw storage parity, metrics, and CLI `aoc verify` compatibility. Document load test/runbook updates. |
> Docs alignment (2025-10-26): Offline/air-gap workflows captured in `docs/deploy/containers.md` §5.

## Policy Engine v2

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-POLICY-20-001 `Policy selection endpoints` | TODO | Excititor WebService Guild | WEB-POLICY-20-001, EXCITITOR-CORE-AOC-19-004 | Provide VEX lookup APIs supporting PURL/advisory batching, scope filtering, and tenant enforcement with deterministic ordering + pagination. |

## StellaOps Console (Sprint 23)

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-CONSOLE-23-001 `VEX aggregation views` | TODO | Excititor WebService Guild, BE-Base Platform Guild | EXCITITOR-LNM-21-201, EXCITITOR-LNM-21-202 | Expose `/console/vex` endpoints returning grouped VEX statements per advisory/component with status chips, justification metadata, precedence trace pointers, and tenant-scoped filters for Console explorer. |
| EXCITITOR-CONSOLE-23-002 `Dashboard VEX deltas` | TODO | Excititor WebService Guild | EXCITITOR-CONSOLE-23-001, EXCITITOR-LNM-21-203 | Provide aggregated counts for VEX overrides (new, not_affected, revoked) powering Console dashboard + live status ticker; emit metrics for policy explain integration. |
| EXCITITOR-CONSOLE-23-003 `VEX search helpers` | TODO | Excititor WebService Guild | EXCITITOR-CONSOLE-23-001 | Deliver rapid lookup endpoints for VEX by advisory/component for Console global search; ensure responses include provenance and precedence context; include caching and RBAC. |

## Graph Explorer v1

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|

## Link-Not-Merge v1

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-LNM-21-201 `Observation APIs` | TODO | Excititor WebService Guild, BE-Base Platform Guild | EXCITITOR-LNM-21-001 | Add VEX observation read endpoints with filters, pagination, RBAC, and tenant scoping. |
| EXCITITOR-LNM-21-202 `Linkset APIs` | TODO | Excititor WebService Guild | EXCITITOR-LNM-21-002, EXCITITOR-LNM-21-003 | Implement linkset read/export/evidence endpoints returning correlation/conflict payloads and map errors to `ERR_AGG_*`. |
| EXCITITOR-LNM-21-203 `Event publishing` | TODO | Excititor WebService Guild, Platform Events Guild | EXCITITOR-LNM-21-005 | Publish `vex.linkset.updated` events, document schema, and ensure idempotent delivery. |

## Graph & Vuln Explorer v1

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-GRAPH-24-101 `VEX summary API` | TODO | Excititor WebService Guild | EXCITITOR-GRAPH-24-001 | Provide endpoints delivering VEX status summaries per component/asset for Vuln Explorer integration. |
| EXCITITOR-GRAPH-24-102 `Evidence batch API` | TODO | Excititor WebService Guild | EXCITITOR-LNM-21-201 | Add batch VEX observation retrieval optimized for Graph overlays/tooltips. |

## VEX Lens (Sprint 30)

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-VEXLENS-30-001 `VEX evidence enrichers` | TODO | Excititor WebService Guild, VEX Lens Guild | EXCITITOR-VULN-29-001, VEXLENS-30-005 | Include issuer hints, signatures, and product trees in evidence payloads for VEX Lens; Label: VEX-Lens. |

## Vulnerability Explorer (Sprint 29)

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-VULN-29-001 `VEX key canonicalization` | TODO | Excititor WebService Guild | EXCITITOR-LNM-21-001 | Canonicalize (lossless) VEX advisory/product keys (map to `advisory_key`, capture product scopes); expose original sources in `links[]`; AOC-compliant: no merge, no derived fields, no suppression; backfill existing records. |
| EXCITITOR-VULN-29-002 `Evidence retrieval` | TODO | Excititor WebService Guild | EXCITITOR-VULN-29-001, VULN-API-29-003 | Provide `/vuln/evidence/vex/{advisory_key}` returning raw VEX statements filtered by tenant/product scope for Explorer evidence tabs. |
| EXCITITOR-VULN-29-004 `Observability` | TODO | Excititor WebService Guild, Observability Guild | EXCITITOR-VULN-29-001 | Add metrics/logs for VEX normalization, suppression scopes, withdrawn statements; emit events consumed by Vuln Explorer resolver. |

## Advisory AI (Sprint 31)

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-AIAI-31-001 `Justification enrichment` | TODO | Excititor WebService Guild | EXCITITOR-VULN-29-001 | Expose normalized VEX justifications, product trees, and paragraph anchors for Advisory AI conflict explanations. |
| EXCITITOR-AIAI-31-002 `VEX chunk API` | TODO | Excititor WebService Guild | EXCITITOR-AIAI-31-001, VEXLENS-30-006 | Provide `/vex/evidence/chunks` endpoint returning tenant-scoped VEX statements with signature metadata and scope scores for RAG. |
| EXCITITOR-AIAI-31-003 `Telemetry` | TODO | Excititor WebService Guild, Observability Guild | EXCITITOR-AIAI-31-001 | Emit metrics/logs for VEX chunk usage, signature verification failures, and guardrail triggers. |

## Observability & Forensics (Epic 15)

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-WEB-OBS-50-001 `Telemetry adoption` | TODO | Excititor WebService Guild | TELEMETRY-OBS-50-001, EXCITITOR-OBS-50-001 | Adopt telemetry core for VEX APIs, ensure responses include trace IDs & correlation headers, and update structured logging for read endpoints. |
| EXCITITOR-WEB-OBS-51-001 `Observability health endpoints` | TODO | Excititor WebService Guild | EXCITITOR-WEB-OBS-50-001, WEB-OBS-51-001 | Implement `/obs/excititor/health` summarizing ingest/link SLOs, signature failure counts, and conflict trends for Console dashboards. |
| EXCITITOR-WEB-OBS-52-001 `Timeline streaming` | TODO | Excititor WebService Guild | EXCITITOR-WEB-OBS-50-001, TIMELINE-OBS-52-003 | Provide SSE bridge for VEX timeline events with tenant filters, pagination, and guardrails. |
| EXCITITOR-WEB-OBS-53-001 `Evidence APIs` | TODO | Excititor WebService Guild, Evidence Locker Guild | EXCITITOR-OBS-53-001, EVID-OBS-53-003 | Expose `/evidence/vex/*` endpoints that fetch locker bundles, enforce scopes, and surface verification metadata. |
| EXCITITOR-WEB-OBS-54-001 `Attestation APIs` | TODO | Excititor WebService Guild | EXCITITOR-OBS-54-001, PROV-OBS-54-001 | Add `/attestations/vex/*` endpoints returning DSSE verification state, builder identity, and chain-of-custody links. |
| EXCITITOR-WEB-OBS-55-001 `Incident mode toggles` | TODO | Excititor WebService Guild, DevOps Guild | EXCITITOR-OBS-55-001, WEB-OBS-55-001 | Provide incident mode API for VEX pipelines with activation audit logs and retention override previews. |

## Air-Gapped Mode (Epic 16)

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-WEB-AIRGAP-56-001 | TODO | Excititor WebService Guild | AIRGAP-IMP-58-001, EXCITITOR-AIRGAP-56-001 | Support mirror bundle registration via APIs, expose bundle provenance in VEX responses, and block external connectors in sealed mode. |
| EXCITITOR-WEB-AIRGAP-56-002 | TODO | Excititor WebService Guild, AirGap Time Guild | EXCITITOR-WEB-AIRGAP-56-001, AIRGAP-TIME-58-001 | Return VEX staleness metrics and time anchor info in API responses for Console/CLI use. |
| EXCITITOR-WEB-AIRGAP-57-001 | TODO | Excititor WebService Guild, AirGap Policy Guild | AIRGAP-POL-56-001 | Map sealed-mode violations to a standardized error payload with remediation guidance. |
| EXCITITOR-WEB-AIRGAP-58-001 | TODO | Excititor WebService Guild, AirGap Importer Guild | EXCITITOR-WEB-AIRGAP-56-001, TIMELINE-OBS-53-001 | Emit timeline events for VEX bundle imports with bundle ID, scope, and actor metadata. |

## SDKs & OpenAPI (Epic 17)

| ID | Status | Owner(s) | Depends on | Notes |
|----|--------|----------|------------|-------|
| EXCITITOR-WEB-OAS-61-001 | TODO | Excititor WebService Guild | OAS-61-001 | Implement `/.well-known/openapi` discovery endpoint with spec version metadata. |
| EXCITITOR-WEB-OAS-61-002 | TODO | Excititor WebService Guild | APIGOV-61-001 | Standardize error envelope responses and update controller/unit tests. |
| EXCITITOR-WEB-OAS-62-001 | TODO | Excititor WebService Guild | EXCITITOR-OAS-61-002 | Add curated examples for VEX observation/linkset endpoints and ensure the portal displays them. |
| EXCITITOR-WEB-OAS-63-001 | TODO | Excititor WebService Guild, API Governance Guild | APIGOV-63-001 | Emit deprecation headers and update docs for retiring VEX APIs. |
@@ -1,9 +1,9 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-ATTEST-01-001 – In-toto predicate & DSSE builder|Team Excititor Attestation|EXCITITOR-CORE-01-001|**DONE (2025-10-16)** – Added deterministic in-toto predicate/statement models, DSSE envelope builder wired to signer abstraction, and attestation client producing metadata + diagnostics.|
|EXCITITOR-ATTEST-01-002 – Rekor v2 client integration|Team Excititor Attestation|EXCITITOR-ATTEST-01-001|**DONE (2025-10-16)** – Implemented Rekor HTTP client with retry/backoff, transparency log abstraction, DI helpers, and attestation client integration capturing Rekor metadata + diagnostics.|
|EXCITITOR-ATTEST-01-003 – Verification suite & observability|Team Excititor Attestation|EXCITITOR-ATTEST-01-002|**DOING (2025-10-22)** – Continuing implementation: build `IVexAttestationVerifier`, wire metrics/logging, and add regression tests. Draft plan in `EXCITITOR-ATTEST-01-003-plan.md` (2025-10-19) guides scope; updating with worknotes as progress lands.|

> Remark (2025-10-22): Added verifier implementation + metrics/tests; next steps include wiring into WebService/Worker flows and expanding negative-path coverage.
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-ABS-01-001 – Connector context & base classes|Team Excititor Connectors|EXCITITOR-CORE-01-003|**DONE (2025-10-17)** – Added `StellaOps.Excititor.Connectors.Abstractions` project with `VexConnectorBase`, deterministic logging scopes, metadata builder helpers, and connector descriptors; docs updated to highlight the shared abstractions.|
|EXCITITOR-CONN-ABS-01-002 – YAML options & validation|Team Excititor Connectors|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Delivered `VexConnectorOptionsBinder` + binder options/validators, environment-variable expansion, data-annotation checks, and custom validation hooks with documentation updates covering the workflow.|
|EXCITITOR-CONN-ABS-01-003 – Plugin packaging & docs|Team Excititor Connectors|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Authored `docs/dev/30_EXCITITOR_CONNECTOR_GUIDE.md`, added quick-start template under `docs/dev/templates/excititor-connector/`, and updated module docs to reference the packaging workflow.|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-CISCO-01-001 – Endpoint discovery & auth plumbing|Team Excititor Connectors – Cisco|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added `CiscoProviderMetadataLoader` with bearer token support, offline snapshot fallback, DI helpers, and tests covering network/offline discovery to unblock subsequent fetch work.|
|EXCITITOR-CONN-CISCO-01-002 – CSAF pull loop & pagination|Team Excititor Connectors – Cisco|EXCITITOR-CONN-CISCO-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-17)** – Implemented paginated advisory fetch using provider directories, raw document persistence with dedupe/state tracking, offline resiliency, and unit coverage.|
|EXCITITOR-CONN-CISCO-01-003 – Provider trust metadata|Team Excititor Connectors – Cisco|EXCITITOR-CONN-CISCO-01-002, EXCITITOR-POLICY-01-001|**DOING (2025-10-19)** – Prereqs confirmed (both DONE); implementing cosign/PGP trust metadata emission and advisory provenance hints for policy weighting.|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-MS-01-001 – AAD onboarding & token cache|Team Excititor Connectors – MSRC|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added MSRC connector project with configurable AAD options, token provider (offline/online modes), DI wiring, and unit tests covering caching and fallback scenarios.|
|EXCITITOR-CONN-MS-01-002 – CSAF download pipeline|Team Excititor Connectors – MSRC|EXCITITOR-CONN-MS-01-001, EXCITITOR-STORAGE-01-003|**DOING (2025-10-19)** – Prereqs verified (EXCITITOR-CONN-MS-01-001, EXCITITOR-STORAGE-01-003); drafting fetch/retry plan and storage wiring before implementation of CSAF package download, checksum validation, and quarantine flows.|
|EXCITITOR-CONN-MS-01-003 – Trust metadata & provenance hints|Team Excititor Connectors – MSRC|EXCITITOR-CONN-MS-01-002, EXCITITOR-POLICY-01-001|TODO – Emit cosign/AAD issuer metadata, attach provenance details, and document policy integration.|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-OCI-01-001 – OCI discovery & auth plumbing|Team Excititor Connectors – OCI|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-18)** – Added connector skeleton, options/validators, discovery caching, cosign/auth descriptors, offline bundle resolution, DI wiring, and regression tests.|
|EXCITITOR-CONN-OCI-01-002 – Attestation fetch & verify loop|Team Excititor Connectors – OCI|EXCITITOR-CONN-OCI-01-001, EXCITITOR-ATTEST-01-002|**DONE (2025-10-18)** – Added offline/registry fetch services, DSSE retrieval with retries, signature verification callout, and raw persistence coverage.|
|EXCITITOR-CONN-OCI-01-003 – Provenance metadata & policy hooks|Team Excititor Connectors – OCI|EXCITITOR-CONN-OCI-01-002, EXCITITOR-POLICY-01-001|**DONE (2025-10-18)** – Enriched attestation metadata with provenance hints, cosign expectations, registry auth context, and signature diagnostics for policy consumption.|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-ORACLE-01-001 – Oracle CSAF catalogue discovery|Team Excititor Connectors – Oracle|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-19)** – Implemented cached Oracle CSAF catalog loader with CPU calendar merge, offline snapshot ingest/persist, options validation + DI wiring, and regression tests; prerequisite EXCITITOR-CONN-ABS-01-001 verified DONE per Sprint 5 log (2025-10-19).|
|EXCITITOR-CONN-ORACLE-01-002 – CSAF download & dedupe pipeline|Team Excititor Connectors – Oracle|EXCITITOR-CONN-ORACLE-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-19)** – Added Oracle CSAF fetch loop with retry/backoff, checksum validation, resume-aware state persistence, digest dedupe, configurable throttling, and raw storage wiring; regression tests cover new ingestion and mismatch handling.|
|EXCITITOR-CONN-ORACLE-01-003 – Trust metadata + provenance|Team Excititor Connectors – Oracle|EXCITITOR-CONN-ORACLE-01-002, EXCITITOR-POLICY-01-001|TODO – Emit Oracle signing metadata (PGP/cosign) and provenance hints for consensus weighting.|
@@ -1,10 +1,10 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-RH-01-001 – Provider metadata discovery|Team Excititor Connectors – Red Hat|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added `RedHatProviderMetadataLoader` with HTTP/ETag caching, offline snapshot handling, and validation; exposed DI helper + tests covering live, cached, and offline scenarios.|
|EXCITITOR-CONN-RH-01-002 – Incremental CSAF pulls|Team Excititor Connectors – Red Hat|EXCITITOR-CONN-RH-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-17)** – Implemented `RedHatCsafConnector` with ROLIE feed parsing, incremental filtering via `context.Since`, CSAF document download + metadata capture, and persistence through `IVexRawDocumentSink`; tests cover live fetch/cache/offline scenarios with ETag handling.|
|EXCITITOR-CONN-RH-01-003 – Trust metadata emission|Team Excititor Connectors – Red Hat|EXCITITOR-CONN-RH-01-002, EXCITITOR-POLICY-01-001|**DONE (2025-10-17)** – Provider metadata loader now emits trust overrides (weight, cosign issuer/pattern, PGP fingerprints) and the connector surfaces provenance hints for policy/consensus layers.|
|EXCITITOR-CONN-RH-01-004 – Resume state persistence|Team Excititor Connectors – Red Hat|EXCITITOR-CONN-RH-01-002, EXCITITOR-STORAGE-01-003|**DONE (2025-10-17)** – Connector now loads/saves resume state via `IVexConnectorStateRepository`, tracking last update timestamp and recent document digests to avoid duplicate CSAF ingestion; regression covers state persistence and duplicate skips.|
|EXCITITOR-CONN-RH-01-005 – Worker/WebService integration|Team Excititor Connectors – Red Hat|EXCITITOR-CONN-RH-01-002|**DONE (2025-10-17)** – Worker/WebService now call `AddRedHatCsafConnector`, register the connector + state repo, and default worker scheduling adds the `excititor:redhat` provider so background jobs and orchestration can activate the connector without extra wiring.|
|EXCITITOR-CONN-RH-01-006 – CSAF normalization parity tests|Team Excititor Connectors – Red Hat|EXCITITOR-CONN-RH-01-002, EXCITITOR-FMT-CSAF-01-001|**DONE (2025-10-17)** – Added RHSA fixture-driven regression verifying CSAF normalizer retains Red Hat product metadata, tracking fields, and timestamps (`rhsa-sample.json` + `CsafNormalizerTests.NormalizeAsync_PreservesRedHatSpecificMetadata`).|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-SUSE-01-001 – Rancher hub discovery & auth|Team Excititor Connectors – SUSE|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added Rancher hub options/token provider, discovery metadata loader with offline snapshots + caching, connector shell, DI wiring, and unit tests covering network/offline paths.|
|EXCITITOR-CONN-SUSE-01-002 – Checkpointed event ingestion|Team Excititor Connectors – SUSE|EXCITITOR-CONN-SUSE-01-001, EXCITITOR-STORAGE-01-003|**DOING (2025-10-19)** – Process hub events with resume checkpoints, deduplication, and quarantine path for malformed payloads.<br>2025-10-19: Prereqs EXCITITOR-CONN-SUSE-01-001 & EXCITITOR-STORAGE-01-003 confirmed complete; initiating checkpoint/resume implementation plan.|
|EXCITITOR-CONN-SUSE-01-003 – Trust metadata & policy hints|Team Excititor Connectors – SUSE|EXCITITOR-CONN-SUSE-01-002, EXCITITOR-POLICY-01-001|TODO – Emit provider trust configuration (signers, weight overrides) and attach provenance hints for consensus engine.|
@@ -1,8 +1,8 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-CONN-UBUNTU-01-001 – Ubuntu CSAF discovery & channels|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added Ubuntu connector project with configurable channel options, catalog loader (network/offline), DI wiring, and discovery unit tests.|
|EXCITITOR-CONN-UBUNTU-01-002 – Incremental fetch & deduplication|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-UBUNTU-01-001, EXCITITOR-STORAGE-01-003|**DOING (2025-10-19)** – Fetch CSAF bundles with ETag handling, checksum validation, deduplication, and raw persistence.|
|EXCITITOR-CONN-UBUNTU-01-003 – Trust metadata & provenance|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-UBUNTU-01-002, EXCITITOR-POLICY-01-001|TODO – Emit Ubuntu signing metadata (GPG fingerprints) plus provenance hints for policy weighting and diagnostics.|

> Remark (2025-10-19, EXCITITOR-CONN-UBUNTU-01-002): Prerequisites EXCITITOR-CONN-UBUNTU-01-001 and EXCITITOR-STORAGE-01-003 verified as **DONE**; advancing to DOING per Wave 0 kickoff.
@@ -1,11 +1,11 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-EXPORT-01-001 – Export engine orchestration|Team Excititor Export|EXCITITOR-CORE-01-003|**DONE (2025-10-15)** – Export engine scaffolding with cache lookup, data source hooks, and deterministic manifest emission.|
|EXCITITOR-EXPORT-01-002 – Cache index & eviction hooks|Team Excititor Export|EXCITITOR-EXPORT-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-16)** – Export engine now invalidates cache entries on force refresh, cache services expose prune/invalidate APIs, and storage maintenance trims expired/dangling records with Mongo2Go coverage.|
|EXCITITOR-EXPORT-01-003 – Artifact store adapters|Team Excititor Export|EXCITITOR-EXPORT-01-001|**DONE (2025-10-16)** – Implemented multi-store pipeline with filesystem, S3-compatible, and offline bundle adapters (hash verification + manifest/zip output) plus unit coverage and DI hooks.|
|EXCITITOR-EXPORT-01-004 – Attestation handoff integration|Team Excititor Export|EXCITITOR-EXPORT-01-001, EXCITITOR-ATTEST-01-001|**DONE (2025-10-17)** – Export engine now invokes attestation client, logs diagnostics, and persists Rekor/envelope metadata on manifests; regression coverage added in `ExportEngineTests.ExportAsync_AttachesAttestationMetadata`.|
|EXCITITOR-EXPORT-01-005 – Score & resolve envelope surfaces|Team Excititor Export|EXCITITOR-EXPORT-01-004, EXCITITOR-CORE-02-001|**DONE (2025-10-21)** – Export engine now canonicalizes consensus/score envelopes, persists their SHA-256 digests into manifests/attestation metadata, and regression tests validate metadata wiring via `ExportEngineTests`.|
|EXCITITOR-EXPORT-01-006 – Quiet provenance packaging|Team Excititor Export|EXCITITOR-EXPORT-01-005, POLICY-CORE-09-005|**DONE (2025-10-21)** – Export manifests now carry quiet-provenance entries (statement digests, signers, justification codes); metadata flows into offline bundles & attestations with regression coverage in `ExportEngineTests`.|
|EXCITITOR-EXPORT-01-007 – Mirror bundle + domain manifest|Team Excititor Export|EXCITITOR-EXPORT-01-006|**DONE (2025-10-21)** – Created per-domain mirror bundles with consensus/score artefacts, published signed-ready manifests/index for downstream Excititor sync, and added regression coverage.|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-FMT-CSAF-01-001 – CSAF normalizer foundation|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – Implemented CSAF normalizer + DI hook, parsing tracking metadata, product tree branches/full names, and mapping product statuses into canonical `VexClaim`s with baseline precedence. Regression added in `CsafNormalizerTests`.|
|EXCITITOR-FMT-CSAF-01-002 – Status/justification mapping|Team Excititor Formats|EXCITITOR-FMT-CSAF-01-001, EXCITITOR-POLICY-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-FMT-CSAF-01-001 & EXCITITOR-POLICY-01-001 verified DONE; starting normalization of `product_status`/`justification` values with policy-aligned diagnostics.|
|EXCITITOR-FMT-CSAF-01-003 – CSAF export adapter|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-CSAF-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-EXPORT-01-001 & EXCITITOR-FMT-CSAF-01-001 confirmed DONE; drafting deterministic CSAF exporter and manifest metadata flow.|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-FMT-CYCLONE-01-001 – CycloneDX VEX normalizer|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – CycloneDX normalizer parses `analysis` data, resolves component references, and emits canonical `VexClaim`s; regression lives in `CycloneDxNormalizerTests`.|
|EXCITITOR-FMT-CYCLONE-01-002 – Component reference reconciliation|Team Excititor Formats|EXCITITOR-FMT-CYCLONE-01-001|**DOING (2025-10-19)** – Prereq EXCITITOR-FMT-CYCLONE-01-001 confirmed DONE; proceeding with reference reconciliation helpers and diagnostics for missing SBOM links.|
|EXCITITOR-FMT-CYCLONE-01-003 – CycloneDX export serializer|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-CYCLONE-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-EXPORT-01-001 & EXCITITOR-FMT-CYCLONE-01-001 verified DONE; initiating deterministic CycloneDX VEX exporter work.|
@@ -1,7 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.

# TASKS

| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-FMT-OPENVEX-01-001 – OpenVEX normalizer|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – OpenVEX normalizer parses statements/products, maps status/justification, and surfaces provenance metadata; coverage in `OpenVexNormalizerTests`.|
|EXCITITOR-FMT-OPENVEX-01-002 – Statement merge utilities|Team Excititor Formats|EXCITITOR-FMT-OPENVEX-01-001|**DOING (2025-10-19)** – Prereq EXCITITOR-FMT-OPENVEX-01-001 confirmed DONE; building deterministic merge reducers with policy diagnostics.|
|EXCITITOR-FMT-OPENVEX-01-003 – OpenVEX export writer|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-OPENVEX-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-EXPORT-01-001 & EXCITITOR-FMT-OPENVEX-01-001 verified DONE; starting canonical OpenVEX exporter with stable ordering/SBOM references.|

@@ -1,11 +1,11 @@
If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md and ./AGENTS.md.
If you are working on this file you need to read docs/modules/excititor/ARCHITECTURE.md and ./AGENTS.md.

# TASKS

| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|EXCITITOR-POLICY-01-001 – Policy schema & binding|Team Excititor Policy|EXCITITOR-CORE-01-001|DONE (2025-10-15) – Established `VexPolicyOptions`, options binding, and snapshot provider covering baseline weights/overrides.|
|EXCITITOR-POLICY-01-002 – Policy evaluator service|Team Excititor Policy|EXCITITOR-POLICY-01-001|DONE (2025-10-15) – `VexPolicyEvaluator` exposes immutable snapshots to consensus and normalizes rejection reasons.|
|EXCITITOR-POLICY-01-003 – Operator diagnostics & docs|Team Excititor Policy|EXCITITOR-POLICY-01-001|**DONE (2025-10-16)** – Surface structured diagnostics (CLI/WebService) and author policy upgrade guidance in docs/ARCHITECTURE_EXCITITOR.md appendix.<br>2025-10-16: Added `IVexPolicyDiagnostics`/`VexPolicyDiagnosticsReport`, sorted issue ordering, recommendations, and appendix guidance. Tests: `dotnet test src/Excititor/__Tests/StellaOps.Excititor.Core.Tests/StellaOps.Excititor.Core.Tests.csproj`.|
|EXCITITOR-POLICY-01-003 – Operator diagnostics & docs|Team Excititor Policy|EXCITITOR-POLICY-01-001|**DONE (2025-10-16)** – Surface structured diagnostics (CLI/WebService) and author policy upgrade guidance in docs/modules/excititor/ARCHITECTURE.md appendix.<br>2025-10-16: Added `IVexPolicyDiagnostics`/`VexPolicyDiagnosticsReport`, sorted issue ordering, recommendations, and appendix guidance. Tests: `dotnet test src/Excititor/__Tests/StellaOps.Excititor.Core.Tests/StellaOps.Excititor.Core.Tests.csproj`.|
|EXCITITOR-POLICY-01-004 – Policy schema validation & YAML binding|Team Excititor Policy|EXCITITOR-POLICY-01-001|**DONE (2025-10-16)** – Added strongly-typed YAML/JSON binding, schema validation, and deterministic diagnostics for operator-supplied policy bundles.|
|EXCITITOR-POLICY-01-005 – Policy change tracking & telemetry|Team Excititor Policy|EXCITITOR-POLICY-01-002|**DONE (2025-10-16)** – Emit revision history, expose snapshot digests via CLI/WebService, and add structured logging/metrics for policy reloads.<br>2025-10-16: `VexPolicySnapshot` now carries revision/digest, provider logs reloads, `vex.policy.reloads` metric emitted, binder/diagnostics expose digest metadata. Tests: `dotnet test src/Excititor/__Tests/StellaOps.Excititor.Core.Tests/StellaOps.Excititor.Core.Tests.csproj`.|
|EXCITITOR-POLICY-02-001 – Scoring coefficients & weight ceilings|Team Excititor Policy|EXCITITOR-POLICY-01-004|DONE (2025-10-19) – Added `weights.ceiling` + `scoring.{alpha,beta}` options with normalization warnings, extended consensus policy/digest, refreshed docs (`docs/ARCHITECTURE_EXCITITOR.md`, `docs/EXCITITOR_SCORRING.md`), and validated via `dotnet test` for core/policy suites.|
|EXCITITOR-POLICY-02-001 – Scoring coefficients & weight ceilings|Team Excititor Policy|EXCITITOR-POLICY-01-004|DONE (2025-10-19) – Added `weights.ceiling` + `scoring.{alpha,beta}` options with normalization warnings, extended consensus policy/digest, refreshed docs (`docs/modules/excititor/ARCHITECTURE.md`, `docs/modules/excititor/scoring.md`), and validated via `dotnet test` for core/policy suites.|
|EXCITITOR-POLICY-02-002 – Diagnostics for scoring signals|Team Excititor Policy|EXCITITOR-POLICY-02-001|BACKLOG – Update diagnostics reports to surface missing severity/KEV/EPSS mappings, coefficient overrides, and provide actionable recommendations for policy tuning.|

@@ -1,87 +1,87 @@
using System;
using System.Collections.Immutable;
using System.Linq;

namespace StellaOps.Excititor.Policy;

public interface IVexPolicyDiagnostics
{
    VexPolicyDiagnosticsReport GetDiagnostics();
}

public sealed record VexPolicyDiagnosticsReport(
    string Version,
    string RevisionId,
    string Digest,
    int ErrorCount,
    int WarningCount,
    DateTimeOffset GeneratedAt,
    ImmutableArray<VexPolicyIssue> Issues,
    ImmutableArray<string> Recommendations,
    ImmutableDictionary<string, double> ActiveOverrides);

public sealed class VexPolicyDiagnostics : IVexPolicyDiagnostics
{
    private readonly IVexPolicyProvider _policyProvider;
    private readonly TimeProvider _timeProvider;

    public VexPolicyDiagnostics(
        IVexPolicyProvider policyProvider,
        TimeProvider? timeProvider = null)
    {
        _policyProvider = policyProvider ?? throw new ArgumentNullException(nameof(policyProvider));
        _timeProvider = timeProvider ?? TimeProvider.System;
    }

    public VexPolicyDiagnosticsReport GetDiagnostics()
    {
        var snapshot = _policyProvider.GetSnapshot();
        var issues = snapshot.Issues;

        var errorCount = issues.Count(static issue => issue.Severity == VexPolicyIssueSeverity.Error);
        var warningCount = issues.Count(static issue => issue.Severity == VexPolicyIssueSeverity.Warning);
        var overrides = snapshot.ConsensusOptions.ProviderOverrides
            .OrderBy(static pair => pair.Key, StringComparer.Ordinal)
            .ToImmutableDictionary();

        var recommendations = BuildRecommendations(errorCount, warningCount, overrides);

        return new VexPolicyDiagnosticsReport(
            snapshot.Version,
            snapshot.RevisionId,
            snapshot.Digest,
            errorCount,
            warningCount,
            _timeProvider.GetUtcNow(),
            issues,
            recommendations,
            overrides);
    }

    private static ImmutableArray<string> BuildRecommendations(
        int errorCount,
        int warningCount,
        ImmutableDictionary<string, double> overrides)
    {
        var messages = ImmutableArray.CreateBuilder<string>();

        if (errorCount > 0)
        {
            messages.Add("Resolve policy errors before running consensus; defaults are used while errors persist.");
        }

        if (warningCount > 0)
        {
            messages.Add("Review policy warnings via CLI/Web diagnostics and adjust configuration as needed.");
        }

        if (overrides.Count > 0)
        {
            messages.Add($"Provider overrides active for: {string.Join(", ", overrides.Keys)}.");
        }

        messages.Add("Refer to docs/ARCHITECTURE_EXCITITOR.md for policy upgrade and diagnostics guidance.");
        messages.Add("Refer to docs/modules/excititor/architecture.md for policy upgrade and diagnostics guidance.");

        return messages.ToImmutable();
    }
}

@@ -1,169 +1,169 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using Microsoft.Extensions.Time.Testing;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Policy;
using System.Diagnostics.Metrics;

namespace StellaOps.Excititor.Core.Tests;

public class VexPolicyDiagnosticsTests
{
    [Fact]
    public void GetDiagnostics_ReportsCountsRecommendationsAndOverrides()
    {
        var overrides = new[]
        {
            new KeyValuePair<string, double>("provider-a", 0.8),
            new KeyValuePair<string, double>("provider-b", 0.6),
        };

        var snapshot = new VexPolicySnapshot(
            "custom/v1",
            new VexConsensusPolicyOptions(
                version: "custom/v1",
                providerOverrides: overrides),
            new BaselineVexConsensusPolicy(),
            ImmutableArray.Create(
                new VexPolicyIssue("sample.error", "Blocking issue.", VexPolicyIssueSeverity.Error),
                new VexPolicyIssue("sample.warning", "Non-blocking issue.", VexPolicyIssueSeverity.Warning)),
            "rev-test",
            "ABCDEF");

        var fakeProvider = new FakePolicyProvider(snapshot);
        var fakeTime = new FakeTimeProvider(new DateTimeOffset(2025, 10, 16, 17, 0, 0, TimeSpan.Zero));
        var diagnostics = new VexPolicyDiagnostics(fakeProvider, fakeTime);

        var report = diagnostics.GetDiagnostics();

        Assert.Equal("custom/v1", report.Version);
        Assert.Equal("rev-test", report.RevisionId);
        Assert.Equal("ABCDEF", report.Digest);
        Assert.Equal(1, report.ErrorCount);
        Assert.Equal(1, report.WarningCount);
        Assert.Equal(fakeTime.GetUtcNow(), report.GeneratedAt);
        Assert.Collection(report.Issues,
            issue => Assert.Equal("sample.error", issue.Code),
            issue => Assert.Equal("sample.warning", issue.Code));
        Assert.Equal(new[] { "provider-a", "provider-b" }, report.ActiveOverrides.Keys.OrderBy(static key => key, StringComparer.Ordinal));
        Assert.Contains(report.Recommendations, message => message.Contains("Resolve policy errors", StringComparison.OrdinalIgnoreCase));
        Assert.Contains(report.Recommendations, message => message.Contains("provider-a", StringComparison.OrdinalIgnoreCase));
        Assert.Contains(report.Recommendations, message => message.Contains("docs/ARCHITECTURE_EXCITITOR.md", StringComparison.OrdinalIgnoreCase));
        Assert.Contains(report.Recommendations, message => message.Contains("docs/modules/excititor/architecture.md", StringComparison.OrdinalIgnoreCase));
    }

    [Fact]
    public void GetDiagnostics_WhenNoIssues_StillReturnsDefaultRecommendation()
    {
        var fakeProvider = new FakePolicyProvider(VexPolicySnapshot.Default);
        var fakeTime = new FakeTimeProvider(new DateTimeOffset(2025, 10, 16, 17, 0, 0, TimeSpan.Zero));
        var diagnostics = new VexPolicyDiagnostics(fakeProvider, fakeTime);

        var report = diagnostics.GetDiagnostics();

        Assert.Equal(0, report.ErrorCount);
        Assert.Equal(0, report.WarningCount);
        Assert.Empty(report.ActiveOverrides);
        Assert.Single(report.Recommendations);
    }

    [Fact]
    public void PolicyProvider_ComputesRevisionAndDigest_AndEmitsTelemetry()
    {
        using var listener = new MeterListener();
        var reloadMeasurements = 0;
        string? lastRevision = null;
        listener.InstrumentPublished += (instrument, _) =>
        {
            if (instrument.Meter.Name == "StellaOps.Excititor.Policy" &&
                instrument.Name == "vex.policy.reloads")
            {
                listener.EnableMeasurementEvents(instrument);
            }
        };

        listener.SetMeasurementEventCallback<long>((instrument, measurement, tags, state) =>
        {
            reloadMeasurements++;
            foreach (var tag in tags)
            {
                if (tag.Key is "revision" && tag.Value is string revision)
                {
                    lastRevision = revision;
                    break;
                }
            }
        });

        listener.Start();

        var optionsMonitor = new MutableOptionsMonitor<VexPolicyOptions>(new VexPolicyOptions());
        var provider = new VexPolicyProvider(optionsMonitor, NullLogger<VexPolicyProvider>.Instance);

        var snapshot1 = provider.GetSnapshot();
        Assert.Equal("rev-1", snapshot1.RevisionId);
        Assert.False(string.IsNullOrWhiteSpace(snapshot1.Digest));

        var snapshot2 = provider.GetSnapshot();
        Assert.Equal("rev-1", snapshot2.RevisionId);
        Assert.Equal(snapshot1.Digest, snapshot2.Digest);

        optionsMonitor.Update(new VexPolicyOptions
        {
            ProviderOverrides = new Dictionary<string, double>
            {
                ["provider-a"] = 0.4
            }
        });

        var snapshot3 = provider.GetSnapshot();
        Assert.Equal("rev-2", snapshot3.RevisionId);
        Assert.NotEqual(snapshot1.Digest, snapshot3.Digest);

        listener.Dispose();

        Assert.True(reloadMeasurements >= 2);
        Assert.Equal("rev-2", lastRevision);
    }

    private sealed class FakePolicyProvider : IVexPolicyProvider
    {
        private readonly VexPolicySnapshot _snapshot;

        public FakePolicyProvider(VexPolicySnapshot snapshot)
        {
            _snapshot = snapshot;
        }

        public VexPolicySnapshot GetSnapshot() => _snapshot;
    }

    private sealed class MutableOptionsMonitor<T> : IOptionsMonitor<T>
    {
        private T _value;

        public MutableOptionsMonitor(T value)
        {
            _value = value;
        }

        public T CurrentValue => _value;

        public T Get(string? name) => _value;

        public void Update(T newValue) => _value = newValue;

        public IDisposable OnChange(Action<T, string?> listener) => NullDisposable.Instance;

        private sealed class NullDisposable : IDisposable
        {
            public static readonly NullDisposable Instance = new();
            public void Dispose()
            {
            }
        }
    }
}

@@ -10,4 +10,4 @@
| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| EXPORT-ATTEST-75-001 | TODO | Attestation Bundle Guild, CLI Attestor Guild | EXPORT-ATTEST-74-001 | Provide CLI command `stella attest bundle verify/import` for air-gap usage. | CLI verifies/signatures; import seeds attestor store; tests cover corrupted bundle. |
| EXPORT-ATTEST-75-002 | TODO | Attestation Bundle Guild, Docs Guild | EXPORT-ATTEST-75-001 | Document `/docs/attestor/airgap.md` with bundle workflows and verification steps. | Doc merged with banner; examples verified. |
| EXPORT-ATTEST-75-002 | TODO | Attestation Bundle Guild, Docs Guild | EXPORT-ATTEST-75-001 | Document `/docs/modules/attestor/airgap.md` with bundle workflows and verification steps. | Doc merged with banner; examples verified. |

@@ -1,140 +0,0 @@
Here’s a quick, practical idea to make your version-range modeling cleaner and faster to query.

# Rethinking `SemVerRangeBuilder` + MongoDB

**Problem (today):** Version normalization rules live as a nested object (and often as a bespoke structure per source). This can force awkward `$objectToArray`, `$map`, and conditional logic in pipelines when you need to:

* match “is version X affected?”
* flatten ranges for analytics
* de-duplicate across sources

**Proposal:** Store *normalized version rules as an embedded collection (array of small docs)* instead of a single nested object.

## Minimal background

* **SemVer normalization**: converting all source-specific version notations into a single, strict representation (e.g., `>=1.2.3 <2.0.0`, exact pins, wildcards).
* **Embedded collection**: an array of consistently shaped items inside the parent doc—great for `$unwind`-centric analytics and direct matches.

## Suggested shape

```json
{
  "_id": "VULN-123",
  "packageId": "pkg:npm/lodash",
  "source": "NVD",
  "normalizedVersions": [
    {
      "scheme": "semver",
      "type": "range",               // "range" | "exact" | "lt" | "lte" | "gt" | "gte"
      "min": "1.2.3",                // optional
      "minInclusive": true,          // optional
      "max": "2.0.0",                // optional
      "maxInclusive": false,         // optional
      "notes": "from GHSA GHSA-xxxx" // traceability
    },
    {
      "scheme": "semver",
      "type": "exact",
      "value": "1.5.0"
    }
  ],
  "metadata": { "ingestedAt": "2025-10-10T12:00:00Z" }
}
```

### Why this helps

* **Simpler queries**

  * *Is version v affected?*

    ```js
    db.vulns.aggregate([
      { $match: { packageId: "pkg:npm/lodash" } },
      { $unwind: "$normalizedVersions" },
      { $match: {
          $or: [
            { "normalizedVersions.type": "exact", "normalizedVersions.value": "1.5.0" },
            { "normalizedVersions.type": "range",
              "normalizedVersions.min": { $lte: "1.5.0" },
              "normalizedVersions.max": { $gt: "1.5.0" } }
          ]
      }},
      { $project: { _id: 1 } }
    ])
    ```

  * No `$objectToArray`, fewer `$cond`s. (One caveat: the range branch compares version strings lexicographically, which misorders multi-digit segments such as `1.10.0` vs `1.5.0`; the padded sort key sketched after this list keeps the comparison correct.)

* **Cheaper storage**

  * Arrays of tiny docs compress well and avoid wide nested structures with many nulls/keys.

* **Easier dedup/merge**

  * `$unwind` → normalize → `$group` by `{scheme,type,min,max,value}` to collapse equivalent rules across sources.
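
A minimal sketch of that padded sort key (this is the “computed sort key” idea mentioned under the aggregation patterns below; the `semverSortKey` name and the five-digit pad width are illustrative choices, and prerelease/build metadata would need extra handling):

```js
// Pad each numeric segment to a fixed width so that lexicographic
// comparison of the keys matches numeric SemVer ordering. Store the
// result next to min/max/value at ingest time and compare on it instead.
function semverSortKey(version) {
  return version
    .split(".")
    .map(segment => segment.padStart(5, "0"))
    .join(".");
}

semverSortKey("1.5.0");  // "00001.00005.00000"
semverSortKey("1.10.0"); // "00001.00010.00000" — now sorts after 1.5.0
```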
## Builder changes (`SemVerRangeBuilder`)

* **Emit items, not a monolith**: have the builder return `IEnumerable<NormalizedVersionRule>`.
* **Normalize early**: resolve “aliases” (`1.2.x`, `^1.2.3`, distro styles) into canonical `(type,min,max,…)` before persistence.
* **Traceability**: include `notes`/`sourceRef` on each rule so you can re-materialize provenance during audits.
* **Lean projection helper**: when you only need normalized rules (and not the intermediate primitives), prefer `SemVerRangeRuleBuilder.BuildNormalizedRules(rawRange, patchedVersion, provenanceNote)` to skip manual projections.

### C# sketch

```csharp
using System.Collections.Generic;

public record NormalizedVersionRule(
    string Scheme,               // "semver"
    string Type,                 // "range" | "exact" | ...
    string? Min = null,
    bool? MinInclusive = null,
    string? Max = null,
    bool? MaxInclusive = null,
    string? Value = null,
    string? Notes = null
);

public static class SemVerRangeBuilder
{
    public static IEnumerable<NormalizedVersionRule> Build(string raw)
    {
        // parse raw (^1.2.3, 1.2.x, <=2.0.0, etc.)
        // yield canonical rules:
        yield return new NormalizedVersionRule(
            Scheme: "semver",
            Type: "range",
            Min: "1.2.3",
            MinInclusive: true,
            Max: "2.0.0",
            MaxInclusive: false,
            Notes: "nvd:ABC-123"
        );
    }
}
```

## Aggregation patterns you unlock

* **Fast “affected version” lookups** via `$unwind + $match` (can complement with a computed sort key).
* **Rollups**: count of vulns per `(major,minor)` by mapping each rule into bucketed segments.
* **Cross-source reconciliation**: group identical rules to de-duplicate (see the sketch after this list).
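
A minimal sketch of the reconciliation grouping, assuming the `vulns` collection and field names from the suggested shape above:

```js
// Collapse rules that are identical across sources: each output doc
// lists which sources asserted the same (scheme, type, min, max, value)
// rule for a given package, plus the advisory ids that carried it.
db.vulns.aggregate([
  { $unwind: "$normalizedVersions" },
  { $group: {
      _id: {
        packageId: "$packageId",
        scheme: "$normalizedVersions.scheme",
        type: "$normalizedVersions.type",
        min: "$normalizedVersions.min",
        max: "$normalizedVersions.max",
        value: "$normalizedVersions.value"
      },
      sources: { $addToSet: "$source" },
      advisoryIds: { $addToSet: "$_id" }
  }}
])
```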
## Indexing tips

* Compound index on `{ packageId: 1, "normalizedVersions.scheme": 1, "normalizedVersions.type": 1 }`.
* If lookups by exact value are common: add a sparse index on `"normalizedVersions.value"` (both shown below).
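
As shell commands, again assuming the `vulns` collection name used in the examples above:

```js
// Compound index backing the $match + $unwind lookup path.
db.vulns.createIndex({
  packageId: 1,
  "normalizedVersions.scheme": 1,
  "normalizedVersions.type": 1
});

// Sparse, so documents whose rules carry no exact "value" stay out of it.
db.vulns.createIndex({ "normalizedVersions.value": 1 }, { sparse: true });
```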
## Migration path (safe + incremental)

1. **Dual-write**: keep the old nested object while writing the new `normalizedVersions` array.
2. **Backfill** existing docs with a one-time script using your current builder.
3. **Cutover** queries/aggregations to the new path (behind a feature flag).
4. **Clean up** the old field after soak.

If you want, I can draft:

* a one-time Mongo backfill script,
* the new EF/Mongo C# POCOs, and
* a test matrix (edge cases: prerelease tags, build metadata, `0.*` semantics, distro-style ranges).
@@ -1,4 +1,4 @@
# StellaOps.Notify.WebService — Agent Charter

## Mission
Implement Notify control plane per `docs/ARCHITECTURE_NOTIFY.md`.
Implement Notify control plane per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Worker — Agent Charter

## Mission
Consume events, evaluate rules, and dispatch deliveries per `docs/ARCHITECTURE_NOTIFY.md`.
Consume events, evaluate rules, and dispatch deliveries per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Email — Agent Charter

## Mission
Implement SMTP connector plug-in per `docs/ARCHITECTURE_NOTIFY.md`.
Implement SMTP connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Slack — Agent Charter

## Mission
Deliver Slack connector plug-in per `docs/ARCHITECTURE_NOTIFY.md`.
Deliver Slack connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Teams — Agent Charter

## Mission
Implement Microsoft Teams connector plug-in per `docs/ARCHITECTURE_NOTIFY.md`.
Implement Microsoft Teams connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Connectors.Webhook — Agent Charter

## Mission
Implement generic webhook connector plug-in per `docs/ARCHITECTURE_NOTIFY.md`.
Implement generic webhook connector plug-in per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Engine — Agent Charter

## Mission
Deliver rule evaluation, digest, and rendering logic per `docs/ARCHITECTURE_NOTIFY.md`.
Deliver rule evaluation, digest, and rendering logic per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Models — Agent Charter

## Mission
Define Notify DTOs and contracts per `docs/ARCHITECTURE_NOTIFY.md`.
Define Notify DTOs and contracts per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Queue — Agent Charter

## Mission
Provide event & delivery queues for Notify per `docs/ARCHITECTURE_NOTIFY.md`.
Provide event & delivery queues for Notify per `docs/modules/notify/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Notify.Storage.Mongo — Agent Charter

## Mission
Implement Mongo persistence (rules, channels, deliveries, digests, locks, audit) per `docs/ARCHITECTURE_NOTIFY.md`.
Implement Mongo persistence (rules, channels, deliveries, digests, locks, audit) per `docs/modules/notify/ARCHITECTURE.md`.

@@ -18,7 +18,7 @@
    <None Include="../../docs/events/*.json">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </None>
    <None Include="../../docs/notify/samples/*.json">
    <None Include="../../../../docs/modules/notify/resources/samples/*.json">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </None>
  </ItemGroup>

@@ -22,8 +22,8 @@
  </ItemGroup>

  <ItemGroup>
    <None Include="../../docs/notify/samples/*.json">
    <None Include="../../../../docs/modules/notify/resources/samples/*.json">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </None>
  </ItemGroup>
</Project>

@@ -12,8 +12,8 @@
  </ItemGroup>

  <ItemGroup>
    <None Include="../../docs/notify/samples/*.json">
    <None Include="../../../../docs/modules/notify/resources/samples/*.json">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </None>
  </ItemGroup>
</Project>

@@ -1,12 +1,12 @@
# StellaOps.Policy — Agent Charter

## Mission
Deliver the policy engine outlined in `docs/ARCHITECTURE_SCANNER.md` and related prose:
Deliver the policy engine outlined in `docs/modules/scanner/ARCHITECTURE.md` and related prose:
- Define YAML schema (ignore rules, VEX inclusion/exclusion, vendor precedence, license gates).
- Provide policy snapshot storage with revision digests and diagnostics.
- Offer preview APIs to compare policy impacts on existing reports.

## Expectations
- Coordinate with Scanner.WebService, Feedser, Vexer, UI, Notify.
- Maintain deterministic serialization and unit tests for precedence rules.
- Update `TASKS.md` and broadcast contract changes.

@@ -1,12 +1,12 @@
# StellaOps.Scanner.Sbomer.BuildXPlugin — Agent Charter

## Mission
Implement the build-time SBOM generator described in `docs/ARCHITECTURE_SCANNER.md` and new buildx dossier requirements:
Implement the build-time SBOM generator described in `docs/modules/scanner/ARCHITECTURE.md` and new buildx dossier requirements:
- Provide a deterministic BuildKit/Buildx generator that produces layer SBOM fragments and uploads them to local CAS.
- Emit OCI annotations (+provenance) compatible with Scanner.Emit and Attestor hand-offs.
- Respect restart-time plug-in policy (`plugins/scanner/buildx/` manifests) and keep CI overhead ≤300 ms per layer.

## Expectations
- Read architecture + upcoming Buildx addendum before coding.
- Ensure graceful fallback to post-build scan when generator unavailable.
- Provide integration tests with mock BuildKit, and update `TASKS.md` as states change.

@@ -1,7 +1,7 @@
# Scanner Core Task Board

| ID | Status | Owner(s) | Depends on | Description | Exit Criteria |
|----|--------|----------|------------|-------------|---------------|
| SCANNER-CORE-09-501 | DONE (2025-10-19) | Scanner Core Guild | — | Define shared DTOs (ScanJob, ProgressEvent), error taxonomy, and deterministic ID/timestamp helpers aligning with `ARCHITECTURE_SCANNER.md` §3–§4.<br>2025-10-19: Added golden fixtures + `ScannerCoreContractsTests` to lock canonical JSON.<br>2025-10-19: Published canonical JSON snippet + acceptance notes in `docs/scanner-core-contracts.md`. | DTOs serialize deterministically, helpers produce reproducible IDs/timestamps, tests cover round-trips and hash derivation. |
| SCANNER-CORE-09-501 | DONE (2025-10-19) | Scanner Core Guild | — | Define shared DTOs (ScanJob, ProgressEvent), error taxonomy, and deterministic ID/timestamp helpers aligning with `modules/scanner/ARCHITECTURE.md` §3–§4.<br>2025-10-19: Added golden fixtures + `ScannerCoreContractsTests` to lock canonical JSON.<br>2025-10-19: Published canonical JSON snippet + acceptance notes in `docs/scanner-core-contracts.md`. | DTOs serialize deterministically, helpers produce reproducible IDs/timestamps, tests cover round-trips and hash derivation. |
| SCANNER-CORE-09-502 | DONE (2025-10-19) | Scanner Core Guild | SCANNER-CORE-09-501 | Observability helpers (correlation IDs, logging scopes, metric namespacing, deterministic hashes) consumed by WebService/Worker.<br>2025-10-19: Verified progress scope serialisation via new fixtures/tests.<br>2025-10-19: Added `ScannerLogExtensionsPerformanceTests` to enforce ≤ 5 µs scope overhead + documented micro-bench results. | Logging/metrics helpers allocate minimally, correlation IDs stable, ActivitySource emitted; tests assert determinism. |
| SCANNER-CORE-09-503 | DONE (2025-10-18) | Scanner Core Guild | SCANNER-CORE-09-501, SCANNER-CORE-09-502 | Security utilities: Authority client factory, OpTok caching, DPoP verifier, restart-time plug-in guardrails for scanner components. | Authority helpers cache tokens, DPoP validator rejects invalid proofs, plug-in guard prevents runtime additions; tests cover happy/error paths. |

@@ -1,15 +1,15 @@
# StellaOps.Scanner.Queue — Agent Charter

## Mission
Deliver the scanner job queue backbone defined in `docs/ARCHITECTURE_SCANNER.md`, providing deterministic, offline-friendly leasing semantics for WebService producers and Worker consumers.
Deliver the scanner job queue backbone defined in `docs/modules/scanner/ARCHITECTURE.md`, providing deterministic, offline-friendly leasing semantics for WebService producers and Worker consumers.

## Responsibilities
- Define queue abstractions with idempotent enqueue tokens, acknowledgement, lease renewal, and claim support.
- Ship first-party adapters for Redis Streams and NATS JetStream, respecting offline deployments and allow-listed hosts.
- Surface health probes, structured diagnostics, and metrics needed by Scanner WebService/Worker.
- Document operational expectations and configuration binding hooks.

## Interfaces & Dependencies
- Consumes shared configuration primitives from `StellaOps.Configuration`.
- Exposes dependency injection extensions for `StellaOps.DependencyInjection`.
- Targets `net10.0` (preview) and aligns with scanner DTOs once `StellaOps.Scanner.Core` lands.

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.WebService — Agent Charter

## Mission
Implement Scheduler control plane per `docs/ARCHITECTURE_SCHEDULER.md`.
Implement Scheduler control plane per `docs/modules/scheduler/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.ImpactIndex — Agent Charter

## Mission
Build the global impact index per `docs/ARCHITECTURE_SCHEDULER.md` (roaring bitmaps, selectors, snapshotting).
Build the global impact index per `docs/modules/scheduler/ARCHITECTURE.md` (roaring bitmaps, selectors, snapshotting).

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Models — Agent Charter

## Mission
Define Scheduler DTOs (Schedule, Run, ImpactSet, Selector, DeltaSummary) per `docs/ARCHITECTURE_SCHEDULER.md`.
Define Scheduler DTOs (Schedule, Run, ImpactSet, Selector, DeltaSummary) per `docs/modules/scheduler/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Queue — Agent Charter

## Mission
Provide queue abstraction (Redis Streams / NATS JetStream) for planner inputs and runner segments per `docs/ARCHITECTURE_SCHEDULER.md`.
Provide queue abstraction (Redis Streams / NATS JetStream) for planner inputs and runner segments per `docs/modules/scheduler/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Storage.Mongo — Agent Charter

## Mission
Implement Mongo persistence (schedules, runs, impact cursors, locks, audit) per `docs/ARCHITECTURE_SCHEDULER.md`.
Implement Mongo persistence (schedules, runs, impact cursors, locks, audit) per `docs/modules/scheduler/ARCHITECTURE.md`.

@@ -1,4 +1,4 @@
# StellaOps.Scheduler.Worker — Agent Charter

## Mission
Implement Scheduler planners/runners per `docs/ARCHITECTURE_SCHEDULER.md`.
Implement Scheduler planners/runners per `docs/modules/scheduler/ARCHITECTURE.md`.

@@ -34,10 +34,10 @@ runner throughput, and backlog health.

## Dashboards & alerts

- **Grafana dashboard:** `docs/ops/scheduler-worker-grafana-dashboard.json`
- **Grafana dashboard:** `docs/modules/scheduler/operations/worker-grafana-dashboard.json`
  (import into Prometheus-backed Grafana). Panels mirror the metrics above with
  mode filters.
- **Prometheus rules:** `docs/ops/scheduler-worker-prometheus-rules.yaml`
- **Prometheus rules:** `docs/modules/scheduler/operations/worker-prometheus-rules.yaml`
  provides planner failure/latency, backlog, and stuck-run alerts.
- **Operations guide:** see `docs/ops/scheduler-worker-operations.md` for
- **Operations guide:** see `docs/modules/scheduler/operations/worker.md` for
  runbook steps, alert context, and dashboard wiring instructions.

src/Tools/FixtureUpdater/FixtureUpdater.csproj (new file, 20 lines)
@@ -0,0 +1,20 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Osv/StellaOps.Concelier.Connector.Osv.csproj" />
    <ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Ghsa/StellaOps.Concelier.Connector.Ghsa.csproj" />
    <ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Nvd/StellaOps.Concelier.Connector.Nvd.csproj" />
    <ProjectReference Include="../../Concelier/StellaOps.Concelier.PluginBinaries/StellaOps.Concelier.Connector.Common/StellaOps.Concelier.Connector.Common.csproj" />
    <ProjectReference Include="../../Concelier/__Libraries/StellaOps.Concelier.Storage.Mongo/StellaOps.Concelier.Storage.Mongo.csproj" />
    <ProjectReference Include="../../Concelier/__Libraries/StellaOps.Concelier.Models/StellaOps.Concelier.Models.csproj" />
    <ProjectReference Include="../../Concelier/__Libraries/StellaOps.Concelier.Testing/StellaOps.Concelier.Testing.csproj" />
  </ItemGroup>

</Project>
src/Tools/FixtureUpdater/Program.cs (new file, 378 lines)
@@ -0,0 +1,378 @@
|
||||
using System.Linq;
|
||||
using System.Text;
|
||||
using System.Text.Json;
|
||||
using System.Text.Json.Serialization;
|
||||
using MongoDB.Bson;
|
||||
using StellaOps.Concelier.Models;
|
||||
using StellaOps.Concelier.Connector.Ghsa;
|
||||
using StellaOps.Concelier.Connector.Common;
|
||||
using StellaOps.Concelier.Connector.Ghsa.Internal;
|
||||
using StellaOps.Concelier.Connector.Osv.Internal;
|
||||
using StellaOps.Concelier.Connector.Osv;
|
||||
using StellaOps.Concelier.Connector.Nvd;
|
||||
using StellaOps.Concelier.Storage.Mongo.Documents;
|
||||
using StellaOps.Concelier.Storage.Mongo.Dtos;
|
||||
|
||||
var serializerOptions = new JsonSerializerOptions(JsonSerializerDefaults.Web)
|
||||
{
|
||||
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
|
||||
};
|
||||
|
||||
var projectRoot = Path.GetFullPath(Path.Combine(AppContext.BaseDirectory, "..", "..", "..", "..", ".."));
|
||||
|
||||
var osvFixturesPath = Path.Combine(projectRoot, "src", "StellaOps.Concelier.Connector.Osv.Tests", "Fixtures");
|
||||
var ghsaFixturesPath = Path.Combine(projectRoot, "src", "StellaOps.Concelier.Connector.Ghsa.Tests", "Fixtures");
|
||||
var nvdFixturesPath = Path.Combine(projectRoot, "src", "StellaOps.Concelier.Connector.Nvd.Tests", "Nvd", "Fixtures");
|
||||
|
||||
RewriteOsvFixtures(osvFixturesPath);
|
||||
RewriteSnapshotFixtures(osvFixturesPath);
|
||||
RewriteGhsaFixtures(osvFixturesPath);
|
||||
RewriteCreditParityFixtures(ghsaFixturesPath, nvdFixturesPath);
|
||||
return;
|
||||
|
||||
void RewriteOsvFixtures(string fixturesPath)
|
||||
{
|
||||
var rawPath = Path.Combine(fixturesPath, "osv-ghsa.raw-osv.json");
|
||||
if (!File.Exists(rawPath))
|
||||
{
|
||||
Console.WriteLine($"[FixtureUpdater] OSV raw fixture missing: {rawPath}");
|
||||
return;
|
||||
}
|
||||
|
||||
using var document = JsonDocument.Parse(File.ReadAllText(rawPath));
|
||||
var advisories = new List<Advisory>();
|
||||
foreach (var element in document.RootElement.EnumerateArray())
|
||||
{
|
||||
var dto = JsonSerializer.Deserialize<OsvVulnerabilityDto>(element.GetRawText(), serializerOptions);
|
||||
if (dto is null)
|
||||
{
|
||||
continue;
|
||||
}
|
||||
|
||||
var ecosystem = dto.Affected?.FirstOrDefault()?.Package?.Ecosystem ?? "unknown";
|
||||
var uri = new Uri($"https://osv.dev/vulnerability/{dto.Id}");
|
||||
var documentRecord = new DocumentRecord(
|
||||
Guid.NewGuid(),
|
||||
OsvConnectorPlugin.SourceName,
|
||||
uri.ToString(),
|
||||
DateTimeOffset.UtcNow,
|
||||
"fixture-sha",
|
||||
DocumentStatuses.PendingMap,
|
||||
"application/json",
|
||||
null,
|
||||
new Dictionary<string, string>(StringComparer.Ordinal)
|
||||
{
|
||||
["osv.ecosystem"] = ecosystem,
|
||||
},
|
||||
null,
|
||||
DateTimeOffset.UtcNow,
|
||||
null,
|
||||
null);
|
||||
|
||||
var payload = BsonDocument.Parse(element.GetRawText());
|
||||
var dtoRecord = new DtoRecord(
|
||||
Guid.NewGuid(),
|
||||
documentRecord.Id,
|
||||
OsvConnectorPlugin.SourceName,
|
||||
"osv.v1",
|
||||
payload,
|
||||
DateTimeOffset.UtcNow);
|
||||
|
||||
var advisory = OsvMapper.Map(dto, documentRecord, dtoRecord, ecosystem);
|
||||
advisories.Add(advisory);
|
||||
}
|
||||
|
||||
advisories.Sort((left, right) => string.Compare(left.AdvisoryKey, right.AdvisoryKey, StringComparison.Ordinal));
|
||||
var snapshot = SnapshotSerializer.ToSnapshot(advisories);
|
||||
File.WriteAllText(Path.Combine(fixturesPath, "osv-ghsa.osv.json"), snapshot);
|
||||
Console.WriteLine($"[FixtureUpdater] Updated {Path.Combine(fixturesPath, "osv-ghsa.osv.json")}");
|
||||
}
|
||||
|
||||
void RewriteSnapshotFixtures(string fixturesPath)
|
||||
{
|
||||
var baselinePublished = new DateTimeOffset(2025, 1, 5, 12, 0, 0, TimeSpan.Zero);
|
||||
var baselineModified = new DateTimeOffset(2025, 1, 8, 6, 30, 0, TimeSpan.Zero);
|
||||
var baselineFetched = new DateTimeOffset(2025, 1, 8, 7, 0, 0, TimeSpan.Zero);
|
||||
|
||||
var cases = new (string Ecosystem, string Purl, string PackageName, string SnapshotFile)[]
|
||||
{
|
||||
("npm", "pkg:npm/%40scope%2Fleft-pad", "@scope/left-pad", "osv-npm.snapshot.json"),
|
||||
("PyPI", "pkg:pypi/requests", "requests", "osv-pypi.snapshot.json"),
|
||||
};
|
||||
|
||||
foreach (var (ecosystem, purl, packageName, snapshotFile) in cases)
|
||||
{
|
||||
var dto = new OsvVulnerabilityDto
|
||||
{
|
||||
Id = $"OSV-2025-{ecosystem}-0001",
|
||||
Summary = $"{ecosystem} package vulnerability",
|
||||
Details = $"Detailed description for {ecosystem} package {packageName}.",
|
||||
Published = baselinePublished,
|
||||
Modified = baselineModified,
|
||||
Aliases = new[] { $"CVE-2025-11{ecosystem.Length}", $"GHSA-{ecosystem.Length}abc-{ecosystem.Length}def-{ecosystem.Length}ghi" },
|
||||
Related = new[] { $"OSV-RELATED-{ecosystem}-42" },
|
||||
References = new[]
|
||||
{
|
||||
new OsvReferenceDto { Url = $"https://example.com/{ecosystem}/advisory", Type = "ADVISORY" },
|
||||
new OsvReferenceDto { Url = $"https://example.com/{ecosystem}/fix", Type = "FIX" },
|
||||
},
|
||||
Severity = new[]
|
||||
{
|
||||
new OsvSeverityDto { Type = "CVSS_V3", Score = "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H" },
|
||||
},
|
||||
Affected = new[]
|
||||
{
|
||||
new OsvAffectedPackageDto
|
||||
{
|
||||
Package = new OsvPackageDto
|
||||
{
|
||||
Ecosystem = ecosystem,
|
||||
Name = packageName,
|
||||
Purl = purl,
|
||||
},
|
||||
Ranges = new[]
|
||||
{
|
||||
new OsvRangeDto
|
||||
{
|
||||
Type = "SEMVER",
|
||||
Events = new[]
|
||||
{
|
||||
new OsvEventDto { Introduced = "0" },
|
||||
new OsvEventDto { Fixed = "2.0.0" },
|
||||
},
|
||||
},
|
||||
},
|
||||
Versions = new[] { "1.0.0", "1.5.0" },
|
||||
EcosystemSpecific = JsonDocument.Parse("{\"severity\":\"high\"}").RootElement.Clone(),
|
||||
},
|
||||
},
|
||||
DatabaseSpecific = JsonDocument.Parse("{\"source\":\"osv.dev\"}").RootElement.Clone(),
|
||||
};
|
||||
|
||||
var document = new DocumentRecord(
|
||||
Guid.NewGuid(),
|
||||
OsvConnectorPlugin.SourceName,
|
||||
$"https://osv.dev/vulnerability/{dto.Id}",
|
||||
baselineFetched,
|
||||
"fixture-sha",
|
||||
DocumentStatuses.PendingParse,
|
||||
"application/json",
|
||||
null,
|
||||
new Dictionary<string, string>(StringComparer.Ordinal) { ["osv.ecosystem"] = ecosystem },
|
||||
null,
|
||||
baselineModified,
|
||||
null);
|
||||
|
||||
var payload = BsonDocument.Parse(JsonSerializer.Serialize(dto, serializerOptions));
|
||||
var dtoRecord = new DtoRecord(Guid.NewGuid(), document.Id, OsvConnectorPlugin.SourceName, "osv.v1", payload, baselineModified);
|
||||
|
||||
var advisory = OsvMapper.Map(dto, document, dtoRecord, ecosystem);
|
||||
var snapshot = SnapshotSerializer.ToSnapshot(advisory);
|
||||
File.WriteAllText(Path.Combine(fixturesPath, snapshotFile), snapshot);
|
||||
Console.WriteLine($"[FixtureUpdater] Updated {Path.Combine(fixturesPath, snapshotFile)}");
|
||||
}
|
||||
}
|
||||
|
||||
void RewriteGhsaFixtures(string fixturesPath)
{
    var rawPath = Path.Combine(fixturesPath, "osv-ghsa.raw-ghsa.json");
    if (!File.Exists(rawPath))
    {
        Console.WriteLine($"[FixtureUpdater] GHSA raw fixture missing: {rawPath}");
        return;
    }

    JsonDocument document;
    try
    {
        document = JsonDocument.Parse(File.ReadAllText(rawPath));
    }
    catch (JsonException ex)
    {
        Console.WriteLine($"[FixtureUpdater] Failed to parse GHSA raw fixture '{rawPath}': {ex.Message}");
        return;
    }
    using (document)
    {
        var advisories = new List<Advisory>();
        foreach (var element in document.RootElement.EnumerateArray())
        {
            GhsaRecordDto dto;
            try
            {
                dto = GhsaRecordParser.Parse(Encoding.UTF8.GetBytes(element.GetRawText()));
            }
            catch (JsonException)
            {
                continue;
            }

            var uri = new Uri($"https://github.com/advisories/{dto.GhsaId}");
            var documentRecord = new DocumentRecord(
                Guid.NewGuid(),
                GhsaConnectorPlugin.SourceName,
                uri.ToString(),
                DateTimeOffset.UtcNow,
                "fixture-sha",
                DocumentStatuses.PendingMap,
                "application/json",
                null,
                new Dictionary<string, string>(StringComparer.Ordinal),
                null,
                DateTimeOffset.UtcNow,
                null,
                null);

            var advisory = GhsaMapper.Map(dto, documentRecord, DateTimeOffset.UtcNow);
            advisories.Add(advisory);
        }

        advisories.Sort((left, right) => string.Compare(left.AdvisoryKey, right.AdvisoryKey, StringComparison.Ordinal));
        var snapshot = SnapshotSerializer.ToSnapshot(advisories);
        File.WriteAllText(Path.Combine(fixturesPath, "osv-ghsa.ghsa.json"), snapshot);
        Console.WriteLine($"[FixtureUpdater] Updated {Path.Combine(fixturesPath, "osv-ghsa.ghsa.json")}");
    }
}

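// Writes identical credit sets under the ghsa, osv, and nvd source names to both fixture
// roots so credit-parity tests can diff the three connectors against the same baseline.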
void RewriteCreditParityFixtures(string ghsaFixturesPath, string nvdFixturesPath)
{
    Directory.CreateDirectory(ghsaFixturesPath);
    Directory.CreateDirectory(nvdFixturesPath);

    var advisoryKeyGhsa = "GHSA-credit-parity";
    var advisoryKeyNvd = "CVE-2025-5555";
    var recordedAt = new DateTimeOffset(2025, 10, 10, 15, 0, 0, TimeSpan.Zero);
    var published = new DateTimeOffset(2025, 10, 9, 18, 30, 0, TimeSpan.Zero);
    var modified = new DateTimeOffset(2025, 10, 10, 12, 0, 0, TimeSpan.Zero);

    AdvisoryCredit[] CreateCredits(string source) =>
    [
        CreateCredit("Alice Researcher", "reporter", new[] { "mailto:alice.researcher@example.com" }, source),
        CreateCredit("Bob Maintainer", "remediation_developer", new[] { "https://github.com/acme/bob-maintainer" }, source)
    ];

    AdvisoryCredit CreateCredit(string displayName, string role, IReadOnlyList<string> contacts, string source)
    {
        var provenance = new AdvisoryProvenance(
            source,
            "credit",
            $"{source}:{displayName.ToLowerInvariant().Replace(' ', '-')}",
            recordedAt,
            new[] { ProvenanceFieldMasks.Credits });

        return new AdvisoryCredit(displayName, role, contacts, provenance);
    }

    AdvisoryReference[] CreateReferences(string sourceName, params (string Url, string Kind)[] entries)
    {
        if (entries is null || entries.Length == 0)
        {
            return Array.Empty<AdvisoryReference>();
        }

        var references = new List<AdvisoryReference>(entries.Length);
        foreach (var entry in entries)
        {
            var provenance = new AdvisoryProvenance(
                sourceName,
                "reference",
                entry.Url,
                recordedAt,
                new[] { ProvenanceFieldMasks.References });

            references.Add(new AdvisoryReference(
                entry.Url,
                entry.Kind,
                sourceTag: null,
                summary: null,
                provenance));
        }

        return references.ToArray();
    }

    Advisory CreateAdvisory(
        string sourceName,
        string advisoryKey,
        IEnumerable<string> aliases,
        AdvisoryCredit[] credits,
        AdvisoryReference[] references,
        string documentValue)
    {
        var documentProvenance = new AdvisoryProvenance(
            sourceName,
            "document",
            documentValue,
            recordedAt,
            new[] { ProvenanceFieldMasks.Advisory });
        var mappingProvenance = new AdvisoryProvenance(
            sourceName,
            "mapping",
            advisoryKey,
            recordedAt,
            new[] { ProvenanceFieldMasks.Advisory });

        return new Advisory(
            advisoryKey,
            "Credit parity regression fixture",
            "Credit parity regression fixture",
            "en",
            published,
            modified,
            "moderate",
            exploitKnown: false,
            aliases,
            credits,
            references,
            Array.Empty<AffectedPackage>(),
            Array.Empty<CvssMetric>(),
            new[] { documentProvenance, mappingProvenance });
    }

    var ghsa = CreateAdvisory(
        "ghsa",
        advisoryKeyGhsa,
        new[] { advisoryKeyGhsa, advisoryKeyNvd },
        CreateCredits("ghsa"),
        CreateReferences(
            "ghsa",
            ($"https://github.com/advisories/{advisoryKeyGhsa}", "advisory"),
            ("https://example.com/ghsa/patch", "patch")),
        $"security/advisories/{advisoryKeyGhsa}");

    var osv = CreateAdvisory(
        OsvConnectorPlugin.SourceName,
        advisoryKeyGhsa,
        new[] { advisoryKeyGhsa, advisoryKeyNvd },
        CreateCredits(OsvConnectorPlugin.SourceName),
        CreateReferences(
            OsvConnectorPlugin.SourceName,
            ($"https://github.com/advisories/{advisoryKeyGhsa}", "advisory"),
            ($"https://osv.dev/vulnerability/{advisoryKeyGhsa}", "advisory")),
        $"https://osv.dev/vulnerability/{advisoryKeyGhsa}");

    var nvd = CreateAdvisory(
        NvdConnectorPlugin.SourceName,
        advisoryKeyNvd,
        new[] { advisoryKeyNvd, advisoryKeyGhsa },
        CreateCredits(NvdConnectorPlugin.SourceName),
        CreateReferences(
            NvdConnectorPlugin.SourceName,
            ($"https://services.nvd.nist.gov/vuln/detail/{advisoryKeyNvd}", "advisory"),
            ("https://example.com/nvd/reference", "report")),
        $"https://services.nvd.nist.gov/vuln/detail/{advisoryKeyNvd}");

    var ghsaSnapshot = SnapshotSerializer.ToSnapshot(ghsa);
    var osvSnapshot = SnapshotSerializer.ToSnapshot(osv);
    var nvdSnapshot = SnapshotSerializer.ToSnapshot(nvd);

    File.WriteAllText(Path.Combine(ghsaFixturesPath, "credit-parity.ghsa.json"), ghsaSnapshot);
    File.WriteAllText(Path.Combine(ghsaFixturesPath, "credit-parity.osv.json"), osvSnapshot);
    File.WriteAllText(Path.Combine(ghsaFixturesPath, "credit-parity.nvd.json"), nvdSnapshot);

    File.WriteAllText(Path.Combine(nvdFixturesPath, "credit-parity.ghsa.json"), ghsaSnapshot);
    File.WriteAllText(Path.Combine(nvdFixturesPath, "credit-parity.osv.json"), osvSnapshot);
    File.WriteAllText(Path.Combine(nvdFixturesPath, "credit-parity.nvd.json"), nvdSnapshot);

    Console.WriteLine($"[FixtureUpdater] Updated credit parity fixtures under {ghsaFixturesPath} and {nvdFixturesPath}");
}

18
src/Tools/LanguageAnalyzerSmoke/LanguageAnalyzerSmoke.csproj
Normal file
@@ -0,0 +1,18 @@
<?xml version="1.0" encoding="utf-8"?>
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.DependencyInjection" Version="10.0.0-rc.2.25502.107" />
    <PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="10.0.0-rc.2.25502.107" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\..\src\StellaOps.Scanner.Analyzers.Lang\StellaOps.Scanner.Analyzers.Lang.csproj" />
    <ProjectReference Include="..\..\src\StellaOps.Scanner.Core\StellaOps.Scanner.Core.csproj" />
  </ItemGroup>
</Project>
348
src/Tools/LanguageAnalyzerSmoke/Program.cs
Normal file
@@ -0,0 +1,348 @@
using System.Collections.Immutable;
using System.Diagnostics;
using System.Reflection;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Scanner.Analyzers.Lang;
using StellaOps.Scanner.Analyzers.Lang.Plugin;
using StellaOps.Scanner.Core.Security;

internal sealed record SmokeScenario(string Name, string[] UsageHintRelatives)
{
    public IReadOnlyList<string> ResolveUsageHints(string scenarioRoot)
        => UsageHintRelatives.Select(relative => Path.GetFullPath(Path.Combine(scenarioRoot, relative))).ToArray();
}

internal sealed class SmokeOptions
{
    public string RepoRoot { get; set; } = Directory.GetCurrentDirectory();
    public string PluginDirectoryName { get; set; } = "StellaOps.Scanner.Analyzers.Lang.Python";
    public string FixtureRelativePath { get; set; } = Path.Combine("src", "StellaOps.Scanner.Analyzers.Lang.Python.Tests", "Fixtures", "lang", "python");

    public static SmokeOptions Parse(string[] args)
    {
        var options = new SmokeOptions();

        for (var index = 0; index < args.Length; index++)
        {
            var current = args[index];
            switch (current)
            {
                case "--repo-root":
                case "-r":
                    options.RepoRoot = RequireValue(args, ref index, current);
                    break;
                case "--plugin-directory":
                case "-p":
                    options.PluginDirectoryName = RequireValue(args, ref index, current);
                    break;
                case "--fixture-path":
                case "-f":
                    options.FixtureRelativePath = RequireValue(args, ref index, current);
                    break;
                case "--help":
                case "-h":
                    PrintUsage();
                    Environment.Exit(0);
                    break;
                default:
                    throw new ArgumentException($"Unknown argument '{current}'. Use --help for usage.");
            }
        }

        options.RepoRoot = Path.GetFullPath(options.RepoRoot);
        return options;
    }

    private static string RequireValue(string[] args, ref int index, string switchName)
    {
        if (index + 1 >= args.Length)
        {
            throw new ArgumentException($"Missing value for '{switchName}'.");
        }

        index++;
        var value = args[index];
        if (string.IsNullOrWhiteSpace(value))
        {
            throw new ArgumentException($"Value for '{switchName}' cannot be empty.");
        }

        return value;
    }

    private static void PrintUsage()
    {
        Console.WriteLine("Language Analyzer Smoke Harness");
        Console.WriteLine("Usage: dotnet run --project src/Tools/LanguageAnalyzerSmoke -- [options]");
        Console.WriteLine();
        Console.WriteLine("Options:");
        Console.WriteLine("  -r, --repo-root <path>         Repository root (defaults to current working directory)");
        Console.WriteLine("  -p, --plugin-directory <name>  Analyzer plug-in directory under plugins/scanner/analyzers/lang (defaults to StellaOps.Scanner.Analyzers.Lang.Python)");
        Console.WriteLine("  -f, --fixture-path <path>      Relative path to fixtures root (defaults to src/StellaOps.Scanner.Analyzers.Lang.Python.Tests/Fixtures/lang/python)");
        Console.WriteLine("  -h, --help                     Show usage information");
    }
}

internal sealed record PluginManifest
{
    [JsonPropertyName("schemaVersion")]
    public string SchemaVersion { get; init; } = string.Empty;

    [JsonPropertyName("id")]
    public string Id { get; init; } = string.Empty;

    [JsonPropertyName("displayName")]
    public string DisplayName { get; init; } = string.Empty;

    [JsonPropertyName("version")]
    public string Version { get; init; } = string.Empty;

    [JsonPropertyName("requiresRestart")]
    public bool RequiresRestart { get; init; }

    [JsonPropertyName("entryPoint")]
    public PluginEntryPoint EntryPoint { get; init; } = new();

    [JsonPropertyName("capabilities")]
    public IReadOnlyList<string> Capabilities { get; init; } = Array.Empty<string>();

    [JsonPropertyName("metadata")]
    public IReadOnlyDictionary<string, string> Metadata { get; init; } = ImmutableDictionary<string, string>.Empty;
}

internal sealed record PluginEntryPoint
{
    [JsonPropertyName("type")]
    public string Type { get; init; } = string.Empty;

    [JsonPropertyName("assembly")]
    public string Assembly { get; init; } = string.Empty;

    [JsonPropertyName("typeName")]
    public string TypeName { get; init; } = string.Empty;
}

file static class Program
{
    private static readonly SmokeScenario[] PythonScenarios =
    {
        new("simple-venv", new[] { Path.Combine("bin", "simple-tool") }),
        new("pip-cache", new[] { Path.Combine("lib", "python3.11", "site-packages", "cache_pkg-1.2.3.data", "scripts", "cache-tool") }),
        new("layered-editable", new[] { Path.Combine("layer1", "usr", "bin", "layered-cli") })
    };

    public static async Task<int> Main(string[] args)
    {
        try
        {
            var options = SmokeOptions.Parse(args);
            await RunAsync(options).ConfigureAwait(false);
            Console.WriteLine("✅ Python analyzer smoke checks passed");
            return 0;
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine($"❌ {ex.Message}");
            return 1;
        }
    }

    private static async Task RunAsync(SmokeOptions options)
    {
        ValidateOptions(options);

        var pluginRoot = Path.Combine(options.RepoRoot, "plugins", "scanner", "analyzers", "lang", options.PluginDirectoryName);
        var manifestPath = Path.Combine(pluginRoot, "manifest.json");
        if (!File.Exists(manifestPath))
        {
            throw new FileNotFoundException($"Plug-in manifest not found at '{manifestPath}'.", manifestPath);
        }

        using var manifestStream = File.OpenRead(manifestPath);
        var manifest = JsonSerializer.Deserialize<PluginManifest>(manifestStream, new JsonSerializerOptions
        {
            PropertyNameCaseInsensitive = true,
            ReadCommentHandling = JsonCommentHandling.Skip
        }) ?? throw new InvalidOperationException($"Unable to parse manifest '{manifestPath}'.");

        ValidateManifest(manifest, options.PluginDirectoryName);

        var pluginAssemblyPath = Path.Combine(pluginRoot, manifest.EntryPoint.Assembly);
        if (!File.Exists(pluginAssemblyPath))
        {
            throw new FileNotFoundException($"Plug-in assembly '{manifest.EntryPoint.Assembly}' not found under '{pluginRoot}'.", pluginAssemblyPath);
        }

        var sha256 = ComputeSha256(pluginAssemblyPath);
        Console.WriteLine($"→ Plug-in assembly SHA-256: {sha256}");

        using var serviceProvider = BuildServiceProvider();
        var catalog = new LanguageAnalyzerPluginCatalog(new RestartOnlyPluginGuard(), NullLogger<LanguageAnalyzerPluginCatalog>.Instance);
        catalog.LoadFromDirectory(pluginRoot, seal: true);

        if (catalog.Plugins.Count == 0)
        {
            throw new InvalidOperationException($"No analyzer plug-ins were loaded from '{pluginRoot}'.");
        }

        var analyzerSet = catalog.CreateAnalyzers(serviceProvider);
        if (analyzerSet.Count == 0)
        {
            throw new InvalidOperationException("Language analyzer plug-ins reported no analyzers.");
        }

        var analyzerIds = analyzerSet.Select(analyzer => analyzer.Id).ToArray();
        Console.WriteLine($"→ Loaded analyzers: {string.Join(", ", analyzerIds)}");

        if (!analyzerIds.Contains("python", StringComparer.OrdinalIgnoreCase))
        {
            throw new InvalidOperationException("Python analyzer was not created by the plug-in.");
        }

        var fixtureRoot = Path.GetFullPath(Path.Combine(options.RepoRoot, options.FixtureRelativePath));
        if (!Directory.Exists(fixtureRoot))
        {
            throw new DirectoryNotFoundException($"Fixture directory '{fixtureRoot}' does not exist.");
        }

        foreach (var scenario in PythonScenarios)
        {
            await RunScenarioAsync(scenario, fixtureRoot, catalog, serviceProvider).ConfigureAwait(false);
        }
    }

    private static ServiceProvider BuildServiceProvider()
    {
        var services = new ServiceCollection();
        services.AddLogging();
        return services.BuildServiceProvider();
    }

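    // Runs each scenario twice with fresh analyzer instances: the cold run is compared
    // against the golden snapshot, and the warm run must byte-for-byte match the cold
    // output to prove the analyzer is deterministic.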
    private static async Task RunScenarioAsync(SmokeScenario scenario, string fixtureRoot, ILanguageAnalyzerPluginCatalog catalog, IServiceProvider services)
    {
        var scenarioRoot = Path.Combine(fixtureRoot, scenario.Name);
        if (!Directory.Exists(scenarioRoot))
        {
            throw new DirectoryNotFoundException($"Scenario '{scenario.Name}' directory missing at '{scenarioRoot}'.");
        }

        var goldenPath = Path.Combine(scenarioRoot, "expected.json");
        string? goldenNormalized = null;
        if (File.Exists(goldenPath))
        {
            goldenNormalized = NormalizeJson(await File.ReadAllTextAsync(goldenPath).ConfigureAwait(false));
        }

        var usageHints = new LanguageUsageHints(scenario.ResolveUsageHints(scenarioRoot));
        var context = new LanguageAnalyzerContext(scenarioRoot, TimeProvider.System, usageHints, services);

        var coldEngine = new LanguageAnalyzerEngine(catalog.CreateAnalyzers(services));
        var coldStopwatch = Stopwatch.StartNew();
        var coldResult = await coldEngine.AnalyzeAsync(context, CancellationToken.None).ConfigureAwait(false);
        coldStopwatch.Stop();

        if (coldResult.Components.Count == 0)
        {
            throw new InvalidOperationException($"Scenario '{scenario.Name}' produced no components during cold run.");
        }

        var coldJson = NormalizeJson(coldResult.ToJson(indent: true));
        if (goldenNormalized is string expected && !string.Equals(coldJson, expected, StringComparison.Ordinal))
        {
            Console.WriteLine($"⚠️ Scenario '{scenario.Name}' output deviates from repository golden snapshot.");
        }

        var warmEngine = new LanguageAnalyzerEngine(catalog.CreateAnalyzers(services));
        var warmStopwatch = Stopwatch.StartNew();
        var warmResult = await warmEngine.AnalyzeAsync(context, CancellationToken.None).ConfigureAwait(false);
        warmStopwatch.Stop();

        var warmJson = NormalizeJson(warmResult.ToJson(indent: true));
        if (!string.Equals(coldJson, warmJson, StringComparison.Ordinal))
        {
            throw new InvalidOperationException($"Scenario '{scenario.Name}' produced different outputs between cold and warm runs.");
        }

        EnsureDurationWithinBudget(scenario.Name, coldStopwatch.Elapsed, warmStopwatch.Elapsed);

        Console.WriteLine($"✓ Scenario '{scenario.Name}' — components {coldResult.Components.Count}, cold {coldStopwatch.Elapsed.TotalMilliseconds:F1} ms, warm {warmStopwatch.Elapsed.TotalMilliseconds:F1} ms");
    }

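    // Fixed wall-clock budgets for the smoke gate: 30 s for the cold run (which absorbs
    // first plug-in load) and 5 s for the warm run.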
    private static void EnsureDurationWithinBudget(string scenarioName, TimeSpan coldDuration, TimeSpan warmDuration)
    {
        var coldBudget = TimeSpan.FromSeconds(30);
        var warmBudget = TimeSpan.FromSeconds(5);

        if (coldDuration > coldBudget)
        {
            throw new InvalidOperationException($"Scenario '{scenarioName}' cold run exceeded budget ({coldDuration.TotalSeconds:F2}s > {coldBudget.TotalSeconds:F2}s).");
        }

        if (warmDuration > warmBudget)
        {
            throw new InvalidOperationException($"Scenario '{scenarioName}' warm run exceeded budget ({warmDuration.TotalSeconds:F2}s > {warmBudget.TotalSeconds:F2}s).");
        }
    }

    private static string NormalizeJson(string json)
        => json.Replace("\r\n", "\n", StringComparison.Ordinal).TrimEnd();

    private static void ValidateOptions(SmokeOptions options)
    {
        if (!Directory.Exists(options.RepoRoot))
        {
            throw new DirectoryNotFoundException($"Repository root '{options.RepoRoot}' does not exist.");
        }
    }

    private static void ValidateManifest(PluginManifest manifest, string expectedDirectory)
    {
        if (!string.Equals(manifest.SchemaVersion, "1.0", StringComparison.Ordinal))
        {
            throw new InvalidOperationException($"Unexpected manifest schema version '{manifest.SchemaVersion}'.");
        }

        if (!manifest.RequiresRestart)
        {
            throw new InvalidOperationException("Language analyzer plug-in must be marked as restart-only.");
        }

        if (!string.Equals(manifest.EntryPoint.Type, "dotnet", StringComparison.OrdinalIgnoreCase))
        {
            throw new InvalidOperationException($"Unsupported entry point type '{manifest.EntryPoint.Type}'.");
        }

        if (!manifest.Capabilities.Contains("python", StringComparer.OrdinalIgnoreCase))
        {
            throw new InvalidOperationException("Manifest capabilities do not include 'python'.");
        }

        if (!string.Equals(manifest.EntryPoint.TypeName, "StellaOps.Scanner.Analyzers.Lang.Python.PythonAnalyzerPlugin", StringComparison.Ordinal))
        {
            throw new InvalidOperationException($"Unexpected entry point type name '{manifest.EntryPoint.TypeName}'.");
        }

        if (!string.Equals(manifest.Id, "stellaops.analyzer.lang.python", StringComparison.OrdinalIgnoreCase))
        {
            throw new InvalidOperationException($"Manifest id '{manifest.Id}' does not match expected plug-in id for directory '{expectedDirectory}'.");
        }
    }

    private static string ComputeSha256(string path)
    {
        using var hash = SHA256.Create();
        using var stream = File.OpenRead(path);
        var digest = hash.ComputeHash(stream);
        var builder = new StringBuilder(digest.Length * 2);
        foreach (var b in digest)
        {
            builder.Append(b.ToString("x2"));
        }
        return builder.ToString();
    }
}
12
src/Tools/NotifySmokeCheck/NotifySmokeCheck.csproj
Normal file
@@ -0,0 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="StackExchange.Redis" Version="2.8.24" />
  </ItemGroup>
</Project>
198
src/Tools/NotifySmokeCheck/Program.cs
Normal file
@@ -0,0 +1,198 @@
using System.Globalization;
using System.Linq;
using System.Net.Http.Headers;
using System.Text.Json;
using StackExchange.Redis;

static string RequireEnv(string name)
{
    var value = Environment.GetEnvironmentVariable(name);
    if (string.IsNullOrWhiteSpace(value))
    {
        throw new InvalidOperationException($"Environment variable '{name}' is required for Notify smoke validation.");
    }

    return value;
}

static string? GetField(StreamEntry entry, string fieldName)
{
    foreach (var pair in entry.Values)
    {
        if (string.Equals(pair.Name, fieldName, StringComparison.OrdinalIgnoreCase))
        {
            return pair.Value.ToString();
        }
    }

    return null;
}

static void Ensure(bool condition, string message)
{
    if (!condition)
    {
        throw new InvalidOperationException(message);
    }
}

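// Phase 1: confirm the Redis event stream recently carried every expected event kind.
// All configuration comes from NOTIFY_SMOKE_* environment variables (see RequireEnv above).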
var redisDsn = RequireEnv("NOTIFY_SMOKE_REDIS_DSN");
var redisStream = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_STREAM");
if (string.IsNullOrWhiteSpace(redisStream))
{
    redisStream = "stella.events";
}

var expectedKindsEnv = RequireEnv("NOTIFY_SMOKE_EXPECT_KINDS");

var expectedKinds = expectedKindsEnv
    .Split(',', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries)
    .Select(kind => kind.ToLowerInvariant())
    .Distinct()
    .ToArray();
Ensure(expectedKinds.Length > 0, "Expected at least one event kind in NOTIFY_SMOKE_EXPECT_KINDS.");

var lookbackMinutesEnv = RequireEnv("NOTIFY_SMOKE_LOOKBACK_MINUTES");
if (!double.TryParse(lookbackMinutesEnv, NumberStyles.Any, CultureInfo.InvariantCulture, out var lookbackMinutes))
{
    throw new InvalidOperationException("NOTIFY_SMOKE_LOOKBACK_MINUTES must be numeric.");
}
Ensure(lookbackMinutes > 0, "NOTIFY_SMOKE_LOOKBACK_MINUTES must be greater than zero.");

var now = DateTimeOffset.UtcNow;
var sinceThreshold = now - TimeSpan.FromMinutes(Math.Max(1, lookbackMinutes));

Console.WriteLine($"ℹ️ Checking Redis stream '{redisStream}' for kinds [{string.Join(", ", expectedKinds)}] within the last {lookbackMinutes:F1} minutes.");

var redisConfig = ConfigurationOptions.Parse(redisDsn);
redisConfig.AbortOnConnectFail = false;

await using var redisConnection = await ConnectionMultiplexer.ConnectAsync(redisConfig);
var database = redisConnection.GetDatabase();

var streamEntries = await database.StreamRangeAsync(redisStream, "-", "+", count: 200);
if (streamEntries.Length > 1)
{
    Array.Reverse(streamEntries);
}
Ensure(streamEntries.Length > 0, $"Redis stream '{redisStream}' is empty.");

var recentEntries = new List<StreamEntry>();
foreach (var entry in streamEntries)
{
    var timestampText = GetField(entry, "ts");
    if (timestampText is null)
    {
        continue;
    }

    if (!DateTimeOffset.TryParse(timestampText, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal, out var entryTimestamp))
    {
        continue;
    }

    if (entryTimestamp >= sinceThreshold)
    {
        recentEntries.Add(entry);
    }
}

Ensure(recentEntries.Count > 0, $"No Redis events newer than {sinceThreshold:u} located in stream '{redisStream}'.");

var missingKinds = new List<string>();
foreach (var kind in expectedKinds)
{
    var match = recentEntries.FirstOrDefault(entry =>
    {
        var entryKind = GetField(entry, "kind")?.ToLowerInvariant();
        return entryKind == kind;
    });

    if (match.Equals(default(StreamEntry)))
    {
        missingKinds.Add(kind);
    }
}

Ensure(missingKinds.Count == 0, $"Missing expected Redis events for kinds: {string.Join(", ", missingKinds)}");

Console.WriteLine("✅ Redis event stream contains the expected scanner events.");

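// Phase 2: query the Notify API for deliveries in the same window and require a
// non-failed delivery record for each expected kind.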
var notifyBaseUrl = RequireEnv("NOTIFY_SMOKE_NOTIFY_BASEURL").TrimEnd('/');
var notifyToken = RequireEnv("NOTIFY_SMOKE_NOTIFY_TOKEN");
var notifyTenant = RequireEnv("NOTIFY_SMOKE_NOTIFY_TENANT");
var notifyTenantHeader = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_NOTIFY_TENANT_HEADER");
if (string.IsNullOrWhiteSpace(notifyTenantHeader))
{
    notifyTenantHeader = "X-StellaOps-Tenant";
}

var notifyTimeoutSeconds = 30;
var notifyTimeoutEnv = Environment.GetEnvironmentVariable("NOTIFY_SMOKE_NOTIFY_TIMEOUT_SECONDS");
if (!string.IsNullOrWhiteSpace(notifyTimeoutEnv) && int.TryParse(notifyTimeoutEnv, NumberStyles.Integer, CultureInfo.InvariantCulture, out var parsedTimeout))
{
    notifyTimeoutSeconds = Math.Max(5, parsedTimeout);
}

using var httpClient = new HttpClient
{
    Timeout = TimeSpan.FromSeconds(notifyTimeoutSeconds),
};

httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", notifyToken);
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
httpClient.DefaultRequestHeaders.Add(notifyTenantHeader, notifyTenant);

var sinceQuery = Uri.EscapeDataString(sinceThreshold.ToString("O", CultureInfo.InvariantCulture));
var deliveriesUrl = $"{notifyBaseUrl}/api/v1/deliveries?since={sinceQuery}&limit=200";

Console.WriteLine($"ℹ️ Querying Notify deliveries via {deliveriesUrl}.");

using var response = await httpClient.GetAsync(deliveriesUrl);
if (!response.IsSuccessStatusCode)
{
    var body = await response.Content.ReadAsStringAsync();
    throw new InvalidOperationException($"Notify deliveries request failed with {(int)response.StatusCode} {response.ReasonPhrase}: {body}");
}

var json = await response.Content.ReadAsStringAsync();
if (string.IsNullOrWhiteSpace(json))
{
    throw new InvalidOperationException("Notify deliveries response body was empty.");
}

using var document = JsonDocument.Parse(json);
var root = document.RootElement;

IEnumerable<JsonElement> EnumerateDeliveries(JsonElement element)
{
    return element.ValueKind switch
    {
        JsonValueKind.Array => element.EnumerateArray(),
        JsonValueKind.Object when element.TryGetProperty("items", out var items) && items.ValueKind == JsonValueKind.Array => items.EnumerateArray(),
        _ => throw new InvalidOperationException("Notify deliveries response was not an array or did not contain an 'items' collection.")
    };
}

var deliveries = EnumerateDeliveries(root).ToArray();
Ensure(deliveries.Length > 0, "Notify deliveries response did not return any records.");

var missingDeliveryKinds = new List<string>();
foreach (var kind in expectedKinds)
{
    var found = deliveries.Any(delivery =>
        delivery.TryGetProperty("kind", out var kindProperty) &&
        kindProperty.GetString()?.Equals(kind, StringComparison.OrdinalIgnoreCase) == true &&
        delivery.TryGetProperty("status", out var statusProperty) &&
        !string.Equals(statusProperty.GetString(), "failed", StringComparison.OrdinalIgnoreCase));

    if (!found)
    {
        missingDeliveryKinds.Add(kind);
    }
}

Ensure(missingDeliveryKinds.Count == 0, $"Notify deliveries missing successful records for kinds: {string.Join(", ", missingDeliveryKinds)}");

Console.WriteLine("✅ Notify deliveries include the expected scanner events.");
Console.WriteLine("🎉 Notify smoke validation completed successfully.");
14
src/Tools/PolicyDslValidator/PolicyDslValidator.csproj
Normal file
@@ -0,0 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>

  <ItemGroup>
    <ProjectReference Include="..\..\src\StellaOps.Policy\StellaOps.Policy.csproj" />
  </ItemGroup>

</Project>
56
src/Tools/PolicyDslValidator/Program.cs
Normal file
@@ -0,0 +1,56 @@
using StellaOps.Policy;

if (args.Length == 0)
{
    Console.Error.WriteLine("Usage: policy-dsl-validator [--strict] [--json] <path-or-glob> [<path-or-glob> ...]");
    Console.Error.WriteLine("Example: policy-dsl-validator --strict docs/examples/policies");
    return 64; // EX_USAGE
}

var inputs = new List<string>();
var strict = false;
var outputJson = false;

foreach (var arg in args)
{
    switch (arg)
    {
        case "--strict":
        case "-s":
            strict = true;
            break;

        case "--json":
        case "-j":
            outputJson = true;
            break;

        case "--help":
        case "-h":
        case "-?":
            Console.WriteLine("Usage: policy-dsl-validator [--strict] [--json] <path-or-glob> [<path-or-glob> ...]");
            Console.WriteLine("Example: policy-dsl-validator --strict docs/examples/policies");
            return 0;

        default:
            inputs.Add(arg);
            break;
    }
}

if (inputs.Count == 0)
{
    Console.Error.WriteLine("No input files or directories provided.");
    return 64; // EX_USAGE
}

var options = new PolicyValidationCliOptions
{
    Inputs = inputs,
    Strict = strict,
    OutputJson = outputJson,
};

var cli = new PolicyValidationCli();
var exitCode = await cli.RunAsync(options, CancellationToken.None);
return exitCode;
21
src/Tools/PolicySchemaExporter/PolicySchemaExporter.csproj
Normal file
@@ -0,0 +1,21 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="NJsonSchema" Version="11.5.1" />
    <PackageReference Include="NJsonSchema.SystemTextJson" Version="11.5.1" />
    <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\..\StellaOps.Scheduler.Models\StellaOps.Scheduler.Models.csproj" />
  </ItemGroup>

</Project>
48
src/Tools/PolicySchemaExporter/Program.cs
Normal file
@@ -0,0 +1,48 @@
using System.Collections.Immutable;
using System.Text.Json;
using System.Text.Json.Serialization;
using NJsonSchema;
using NJsonSchema.Generation;
using NJsonSchema.Generation.SystemTextJson;
using Newtonsoft.Json;
using StellaOps.Scheduler.Models;

var output = args.Length switch
{
    0 => Path.GetFullPath(Path.Combine(AppContext.BaseDirectory, "..", "..", "..", "docs", "schemas")),
    1 => Path.GetFullPath(args[0]),
    _ => throw new ArgumentException("Usage: dotnet run --project src/Tools/PolicySchemaExporter -- [outputDirectory]")
};

Directory.CreateDirectory(output);

var generatorSettings = new SystemTextJsonSchemaGeneratorSettings
{
    SchemaType = SchemaType.JsonSchema,
    DefaultReferenceTypeNullHandling = ReferenceTypeNullHandling.NotNull,
    SerializerOptions = new JsonSerializerOptions
    {
        PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
    },
};

var generator = new JsonSchemaGenerator(generatorSettings);

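// Each scheduler model below is exported as a standalone JSON Schema document;
// additionalProperties is disabled after generation so fields that drift from the
// C# models fail schema validation.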
var exports = ImmutableArray.Create(
    (FileName: "policy-run-request.schema.json", Type: typeof(PolicyRunRequest)),
    (FileName: "policy-run-status.schema.json", Type: typeof(PolicyRunStatus)),
    (FileName: "policy-diff-summary.schema.json", Type: typeof(PolicyDiffSummary)),
    (FileName: "policy-explain-trace.schema.json", Type: typeof(PolicyExplainTrace))
);

foreach (var export in exports)
{
    var schema = generator.Generate(export.Type);
    schema.Title = export.Type.Name;
    schema.AllowAdditionalProperties = false;

    var outputPath = Path.Combine(output, export.FileName);
    await File.WriteAllTextAsync(outputPath, schema.ToJson(Formatting.Indented) + Environment.NewLine);
    Console.WriteLine($"Wrote {outputPath}");
}
14
src/Tools/PolicySimulationSmoke/PolicySimulationSmoke.csproj
Normal file
@@ -0,0 +1,14 @@
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>

  <ItemGroup>
    <ProjectReference Include="..\..\src\StellaOps.Policy\StellaOps.Policy.csproj" />
  </ItemGroup>

</Project>
291
src/Tools/PolicySimulationSmoke/Program.cs
Normal file
@@ -0,0 +1,291 @@
using System.Collections.Immutable;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Policy;

var scenarioRoot = "samples/policy/simulations";
string? outputDir = null;

for (var i = 0; i < args.Length; i++)
{
    var arg = args[i];
    switch (arg)
    {
        case "--scenario-root":
        case "-r":
            if (i + 1 >= args.Length)
            {
                Console.Error.WriteLine("Missing value for --scenario-root.");
                return 64;
            }
            scenarioRoot = args[++i];
            break;
        case "--output":
        case "-o":
            if (i + 1 >= args.Length)
            {
                Console.Error.WriteLine("Missing value for --output.");
                return 64;
            }
            outputDir = args[++i];
            break;
        case "--help":
        case "-h":
        case "-?":
            PrintUsage();
            return 0;
        default:
            Console.Error.WriteLine($"Unknown argument '{arg}'.");
            PrintUsage();
            return 64;
    }
}

if (!Directory.Exists(scenarioRoot))
{
    Console.Error.WriteLine($"Scenario root '{scenarioRoot}' does not exist.");
    return 66;
}

var scenarioFiles = Directory.GetFiles(scenarioRoot, "scenario.json", SearchOption.AllDirectories);
if (scenarioFiles.Length == 0)
{
    Console.Error.WriteLine($"No scenario.json files found under '{scenarioRoot}'.");
    return 0;
}

var loggerFactory = NullLoggerFactory.Instance;
var snapshotStore = new PolicySnapshotStore(
    new NullPolicySnapshotRepository(),
    new NullPolicyAuditRepository(),
    TimeProvider.System,
    loggerFactory.CreateLogger<PolicySnapshotStore>());
var previewService = new PolicyPreviewService(snapshotStore, loggerFactory.CreateLogger<PolicyPreviewService>());

var serializerOptions = new JsonSerializerOptions(JsonSerializerDefaults.Web)
{
    PropertyNameCaseInsensitive = true,
    ReadCommentHandling = JsonCommentHandling.Skip,
};

var summary = new List<ScenarioResult>();
var success = true;

foreach (var scenarioFile in scenarioFiles.OrderBy(static f => f, StringComparer.OrdinalIgnoreCase))
{
    var scenarioText = await File.ReadAllTextAsync(scenarioFile);
    var scenario = JsonSerializer.Deserialize<PolicySimulationScenario>(scenarioText, serializerOptions);
    if (scenario is null)
    {
        Console.Error.WriteLine($"Failed to deserialize scenario '{scenarioFile}'.");
        success = false;
        continue;
    }

    var repoRoot = Directory.GetCurrentDirectory();
    var policyPath = Path.Combine(repoRoot, scenario.PolicyPath);
    if (!File.Exists(policyPath))
    {
        Console.Error.WriteLine($"Policy file '{scenario.PolicyPath}' referenced by scenario '{scenario.Name}' does not exist.");
        success = false;
        continue;
    }

    var policyContent = await File.ReadAllTextAsync(policyPath);
    var policyFormat = PolicySchema.DetectFormat(policyPath);
    var findings = scenario.Findings.Select(ToPolicyFinding).ToImmutableArray();
    var baseline = scenario.Baseline?.Select(ToPolicyVerdict).ToImmutableArray() ?? ImmutableArray<PolicyVerdict>.Empty;

    var request = new PolicyPreviewRequest(
        ImageDigest: $"sha256:simulation-{scenario.Name}",
        Findings: findings,
        BaselineVerdicts: baseline,
        SnapshotOverride: null,
        ProposedPolicy: new PolicySnapshotContent(
            Content: policyContent,
            Format: policyFormat,
            Actor: "ci",
            Source: "ci/simulation-smoke",
            Description: $"CI simulation for scenario '{scenario.Name}'"));

    var response = await previewService.PreviewAsync(request, CancellationToken.None);
    var scenarioResult = EvaluateScenario(scenario, response);
    summary.Add(scenarioResult);

    if (!scenarioResult.Success)
    {
        success = false;
    }
}

if (outputDir is not null)
{
    Directory.CreateDirectory(outputDir);
    var summaryPath = Path.Combine(outputDir, "policy-simulation-summary.json");
    await File.WriteAllTextAsync(summaryPath, JsonSerializer.Serialize(summary, new JsonSerializerOptions { WriteIndented = true }));
}

return success ? 0 : 1;

static void PrintUsage()
{
    Console.WriteLine("Usage: policy-simulation-smoke [--scenario-root <path>] [--output <dir>]");
    Console.WriteLine("Example: policy-simulation-smoke --scenario-root samples/policy/simulations --output artifacts/policy-simulations");
}

static PolicyFinding ToPolicyFinding(ScenarioFinding finding)
{
    var tags = finding.Tags is null ? ImmutableArray<string>.Empty : ImmutableArray.CreateRange(finding.Tags);
    var severity = Enum.Parse<PolicySeverity>(finding.Severity, ignoreCase: true);
    return new PolicyFinding(
        finding.FindingId,
        severity,
        finding.Environment,
        finding.Source,
        finding.Vendor,
        finding.License,
        finding.Image,
        finding.Repository,
        finding.Package,
        finding.Purl,
        finding.Cve,
        finding.Path,
        finding.LayerDigest,
        tags);
}

static PolicyVerdict ToPolicyVerdict(ScenarioBaseline baseline)
{
    var status = Enum.Parse<PolicyVerdictStatus>(baseline.Status, ignoreCase: true);
    var inputs = baseline.Inputs?.ToImmutableDictionary(StringComparer.OrdinalIgnoreCase) ?? ImmutableDictionary<string, double>.Empty;
    return new PolicyVerdict(
        baseline.FindingId,
        status,
        RuleName: baseline.RuleName,
        RuleAction: baseline.RuleAction,
        Notes: baseline.Notes,
        Score: baseline.Score,
        ConfigVersion: baseline.ConfigVersion ?? PolicyScoringConfig.Default.Version,
        Inputs: inputs,
        QuietedBy: null,
        Quiet: false,
        UnknownConfidence: null,
        ConfidenceBand: null,
        UnknownAgeDays: null,
        SourceTrust: null,
        Reachability: null);
}

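// Compares projected verdict statuses against the scenario's expectations and records
// every actual status so the summary JSON stays useful even when a scenario fails.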
static ScenarioResult EvaluateScenario(PolicySimulationScenario scenario, PolicyPreviewResponse response)
{
    var result = new ScenarioResult(scenario.Name);
    if (!response.Success)
    {
        result.Failures.Add("Preview failed.");
        return result with { Success = false, ChangedCount = response.ChangedCount };
    }

    var diffs = response.Diffs.ToDictionary(diff => diff.Projected.FindingId, StringComparer.OrdinalIgnoreCase);
    foreach (var expected in scenario.ExpectedDiffs)
    {
        if (!diffs.TryGetValue(expected.FindingId, out var diff))
        {
            result.Failures.Add($"Expected finding '{expected.FindingId}' missing from diff.");
            continue;
        }

        var projectedStatus = diff.Projected.Status.ToString();
        result.ActualStatuses[expected.FindingId] = projectedStatus;
        if (!string.Equals(projectedStatus, expected.Status, StringComparison.OrdinalIgnoreCase))
        {
            result.Failures.Add($"Finding '{expected.FindingId}' expected status '{expected.Status}' but was '{projectedStatus}'.");
        }
    }

    foreach (var diff in diffs.Values)
    {
        if (!result.ActualStatuses.ContainsKey(diff.Projected.FindingId))
        {
            result.ActualStatuses[diff.Projected.FindingId] = diff.Projected.Status.ToString();
        }
    }

    var success = result.Failures.Count == 0;
    return result with
    {
        Success = success,
        ChangedCount = response.ChangedCount
    };
}

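// The records below define the scenario.json shape (camelCase binding via
// JsonSerializerDefaults.Web). An illustrative minimal scenario — the file path and
// ids here are made up, not taken from the repository:
//
//   {
//     "name": "baseline-pass",
//     "policyPath": "samples/policy/simulations/baseline-pass/policy.yaml",
//     "findings": [ { "findingId": "F-1", "severity": "Low" } ],
//     "expectedDiffs": [ { "findingId": "F-1", "status": "Pass" } ]
//   }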
internal sealed record PolicySimulationScenario
{
    public string Name { get; init; } = "scenario";
    public string PolicyPath { get; init; } = string.Empty;
    public List<ScenarioFinding> Findings { get; init; } = new();
    public List<ScenarioExpectedDiff> ExpectedDiffs { get; init; } = new();
    public List<ScenarioBaseline>? Baseline { get; init; }
}

internal sealed record ScenarioFinding
{
    public string FindingId { get; init; } = string.Empty;
    public string Severity { get; init; } = "Low";
    public string? Environment { get; init; }
    public string? Source { get; init; }
    public string? Vendor { get; init; }
    public string? License { get; init; }
    public string? Image { get; init; }
    public string? Repository { get; init; }
    public string? Package { get; init; }
    public string? Purl { get; init; }
    public string? Cve { get; init; }
    public string? Path { get; init; }
    public string? LayerDigest { get; init; }
    public string[]? Tags { get; init; }
}

internal sealed record ScenarioExpectedDiff
{
    public string FindingId { get; init; } = string.Empty;
    public string Status { get; init; } = "Pass";
}

internal sealed record ScenarioBaseline
{
    public string FindingId { get; init; } = string.Empty;
    public string Status { get; init; } = "Pass";
    public string? RuleName { get; init; }
    public string? RuleAction { get; init; }
    public string? Notes { get; init; }
    public double Score { get; init; }
    public string? ConfigVersion { get; init; }
    public Dictionary<string, double>? Inputs { get; init; }
}

internal sealed record ScenarioResult(string ScenarioName)
{
    public bool Success { get; init; } = true;
    public int ChangedCount { get; init; }
    public List<string> Failures { get; } = new();
    public Dictionary<string, string> ActualStatuses { get; } = new(StringComparer.OrdinalIgnoreCase);
}

internal sealed class NullPolicySnapshotRepository : IPolicySnapshotRepository
{
    public Task AddAsync(PolicySnapshot snapshot, CancellationToken cancellationToken = default) => Task.CompletedTask;

    public Task<PolicySnapshot?> GetLatestAsync(CancellationToken cancellationToken = default) => Task.FromResult<PolicySnapshot?>(null);

    public Task<IReadOnlyList<PolicySnapshot>> ListAsync(int limit, CancellationToken cancellationToken = default)
        => Task.FromResult<IReadOnlyList<PolicySnapshot>>(Array.Empty<PolicySnapshot>());
}

internal sealed class NullPolicyAuditRepository : IPolicyAuditRepository
{
    public Task AddAsync(PolicyAuditEntry entry, CancellationToken cancellationToken = default) => Task.CompletedTask;

    public Task<IReadOnlyList<PolicyAuditEntry>> ListAsync(int limit, CancellationToken cancellationToken = default)
        => Task.FromResult<IReadOnlyList<PolicyAuditEntry>>(Array.Empty<PolicyAuditEntry>());
}
286
src/Tools/RustFsMigrator/Program.cs
Normal file
@@ -0,0 +1,286 @@
using Amazon;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Model;
using System.Net.Http.Headers;

var options = MigrationOptions.Parse(args);
if (options is null)
{
    MigrationOptions.PrintUsage();
    return 1;
}

Console.WriteLine($"RustFS migrator starting (prefix: '{options.Prefix ?? "<all>"}')");
if (options.DryRun)
{
    Console.WriteLine("Dry-run enabled. No objects will be written to RustFS.");
}

var s3Config = new AmazonS3Config
{
    ForcePathStyle = true,
};

if (!string.IsNullOrWhiteSpace(options.S3ServiceUrl))
{
    s3Config.ServiceURL = options.S3ServiceUrl;
    s3Config.UseHttp = options.S3ServiceUrl.StartsWith("http://", StringComparison.OrdinalIgnoreCase);
}

if (!string.IsNullOrWhiteSpace(options.S3Region))
{
    s3Config.RegionEndpoint = RegionEndpoint.GetBySystemName(options.S3Region);
}

using var s3Client = CreateS3Client(options, s3Config);
using var httpClient = CreateRustFsClient(options);

var listRequest = new ListObjectsV2Request
{
    BucketName = options.S3Bucket,
    Prefix = options.Prefix,
    MaxKeys = 1000,
};

var migrated = 0;
var skipped = 0;

do
{
    var response = await s3Client.ListObjectsV2Async(listRequest).ConfigureAwait(false);
    foreach (var entry in response.S3Objects)
    {
        if (entry.Size == 0 && entry.Key.EndsWith('/'))
        {
            skipped++;
            continue;
        }

        Console.WriteLine($"Migrating {entry.Key} ({entry.Size} bytes)...");

        if (options.DryRun)
        {
            migrated++;
            continue;
        }

        using var getResponse = await s3Client.GetObjectAsync(new GetObjectRequest
        {
            BucketName = options.S3Bucket,
            Key = entry.Key,
        }).ConfigureAwait(false);

        await using var memory = new MemoryStream();
        await getResponse.ResponseStream.CopyToAsync(memory).ConfigureAwait(false);
        memory.Position = 0;

        using var request = new HttpRequestMessage(HttpMethod.Put, BuildRustFsUri(options, entry.Key))
        {
            Content = new ByteArrayContent(memory.ToArray()),
        };
        request.Content.Headers.ContentType = MediaTypeHeaderValue.Parse("application/octet-stream");

        if (options.Immutable)
        {
            request.Headers.TryAddWithoutValidation("X-RustFS-Immutable", "true");
        }

        if (options.RetentionSeconds is { } retainSeconds)
        {
            request.Headers.TryAddWithoutValidation("X-RustFS-Retain-Seconds", retainSeconds.ToString());
        }

        if (!string.IsNullOrWhiteSpace(options.RustFsApiKeyHeader) && !string.IsNullOrWhiteSpace(options.RustFsApiKey))
        {
            request.Headers.TryAddWithoutValidation(options.RustFsApiKeyHeader!, options.RustFsApiKey!);
        }

        using var responseMessage = await httpClient.SendAsync(request).ConfigureAwait(false);
        if (!responseMessage.IsSuccessStatusCode)
        {
            var error = await responseMessage.Content.ReadAsStringAsync().ConfigureAwait(false);
            Console.Error.WriteLine($"Failed to upload {entry.Key}: {(int)responseMessage.StatusCode} {responseMessage.ReasonPhrase}\n{error}");
            return 2;
        }

        migrated++;
    }

    listRequest.ContinuationToken = response.NextContinuationToken;
} while (!string.IsNullOrEmpty(listRequest.ContinuationToken));

Console.WriteLine($"Migration complete. Migrated {migrated} objects. Skipped {skipped} directory markers.");
return 0;

static AmazonS3Client CreateS3Client(MigrationOptions options, AmazonS3Config config)
{
    if (!string.IsNullOrWhiteSpace(options.S3AccessKey) && !string.IsNullOrWhiteSpace(options.S3SecretKey))
    {
        var credentials = new BasicAWSCredentials(options.S3AccessKey, options.S3SecretKey);
        return new AmazonS3Client(credentials, config);
    }

    return new AmazonS3Client(config);
}

static HttpClient CreateRustFsClient(MigrationOptions options)
{
    var client = new HttpClient
    {
        BaseAddress = new Uri(options.RustFsEndpoint, UriKind.Absolute),
        Timeout = TimeSpan.FromMinutes(5),
    };

    if (!string.IsNullOrWhiteSpace(options.RustFsApiKeyHeader) && !string.IsNullOrWhiteSpace(options.RustFsApiKey))
    {
        client.DefaultRequestHeaders.TryAddWithoutValidation(options.RustFsApiKeyHeader, options.RustFsApiKey);
    }

    return client;
}

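// Object keys are escaped segment by segment so '/' separators survive while each
// individual path segment is percent-encoded for the RustFS object API.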
static Uri BuildRustFsUri(MigrationOptions options, string key)
|
||||
{
|
||||
var normalized = string.Join('/', key
|
||||
.Split('/', StringSplitOptions.RemoveEmptyEntries)
|
||||
.Select(Uri.EscapeDataString));
|
||||
|
||||
var builder = new UriBuilder(options.RustFsEndpoint)
|
||||
{
|
||||
Path = $"/api/v1/buckets/{Uri.EscapeDataString(options.RustFsBucket)}/objects/{normalized}",
|
||||
};
|
||||
|
||||
return builder.Uri;
|
||||
}
|
||||
|
||||
internal sealed record MigrationOptions
|
||||
{
|
||||
public string S3Bucket { get; init; } = string.Empty;
|
||||
|
||||
public string? S3ServiceUrl { get; init; }
|
||||
= null;
|
||||
|
||||
public string? S3Region { get; init; }
|
||||
= null;
|
||||
|
||||
public string? S3AccessKey { get; init; }
|
||||
= null;
|
||||
|
||||
public string? S3SecretKey { get; init; }
|
||||
= null;
|
||||
|
||||
public string RustFsEndpoint { get; init; } = string.Empty;
|
||||
|
||||
public string RustFsBucket { get; init; } = string.Empty;
|
||||
|
||||
public string? RustFsApiKeyHeader { get; init; }
|
||||
= null;
|
||||
|
||||
public string? RustFsApiKey { get; init; }
|
||||
= null;
|
||||
|
||||
public string? Prefix { get; init; }
|
||||
= null;
|
||||
|
||||
public bool Immutable { get; init; }
|
||||
= false;
|
||||
|
||||
public int? RetentionSeconds { get; init; }
|
||||
= null;
|
||||
|
||||
public bool DryRun { get; init; }
|
||||
= false;
|
||||
|
||||
public static MigrationOptions? Parse(string[] args)
|
||||
{
|
||||
var builder = new Dictionary<string, string?>(StringComparer.OrdinalIgnoreCase);
|
||||
|
||||
for (var i = 0; i < args.Length; i++)
|
||||
{
|
||||
var key = args[i];
|
||||
if (key.StartsWith("--", StringComparison.OrdinalIgnoreCase))
|
||||
{
|
||||
var normalized = key[2..];
|
||||
if (string.Equals(normalized, "immutable", StringComparison.OrdinalIgnoreCase) || string.Equals(normalized, "dry-run", StringComparison.OrdinalIgnoreCase))
|
||||
{
|
||||
builder[normalized] = "true";
|
||||
continue;
|
||||
}
|
||||
|
||||
if (i + 1 >= args.Length)
|
||||
{
|
||||
Console.Error.WriteLine($"Missing value for argument '{key}'.");
|
||||
return null;
|
||||
}
|
||||
|
||||
builder[normalized] = args[++i];
|
||||
}
|
||||
}
|
||||
|
||||
if (!builder.TryGetValue("s3-bucket", out var bucket) || string.IsNullOrWhiteSpace(bucket))
|
||||
{
|
||||
Console.Error.WriteLine("--s3-bucket is required.");
|
||||
return null;
|
||||
}
|
||||
|
||||
if (!builder.TryGetValue("rustfs-endpoint", out var rustFsEndpoint) || string.IsNullOrWhiteSpace(rustFsEndpoint))
|
||||
{
|
||||
Console.Error.WriteLine("--rustfs-endpoint is required.");
|
||||
return null;
|
||||
}
|
||||
|
||||
if (!builder.TryGetValue("rustfs-bucket", out var rustFsBucket) || string.IsNullOrWhiteSpace(rustFsBucket))
|
||||
{
|
||||
Console.Error.WriteLine("--rustfs-bucket is required.");
|
||||
return null;
|
||||
}
|
||||
|
||||
int? retentionSeconds = null;
|
||||
if (builder.TryGetValue("retain-days", out var retainStr) && !string.IsNullOrWhiteSpace(retainStr))
|
||||
{
|
||||
if (double.TryParse(retainStr, out var days) && days > 0)
|
||||
{
|
||||
retentionSeconds = (int)Math.Ceiling(days * 24 * 60 * 60);
|
||||
}
|
||||
else
|
||||
{
|
||||
Console.Error.WriteLine("--retain-days must be a positive number.");
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
return new MigrationOptions
|
||||
{
|
||||
S3Bucket = bucket,
|
||||
S3ServiceUrl = builder.TryGetValue("s3-endpoint", out var s3Endpoint) ? s3Endpoint : null,
|
||||
S3Region = builder.TryGetValue("s3-region", out var s3Region) ? s3Region : null,
|
||||
S3AccessKey = builder.TryGetValue("s3-access-key", out var s3AccessKey) ? s3AccessKey : null,
|
||||
S3SecretKey = builder.TryGetValue("s3-secret-key", out var s3SecretKey) ? s3SecretKey : null,
|
||||
RustFsEndpoint = rustFsEndpoint!,
|
||||
RustFsBucket = rustFsBucket!,
|
||||
RustFsApiKeyHeader = builder.TryGetValue("rustfs-api-key-header", out var apiKeyHeader) ? apiKeyHeader : null,
|
||||
RustFsApiKey = builder.TryGetValue("rustfs-api-key", out var apiKey) ? apiKey : null,
|
||||
Prefix = builder.TryGetValue("prefix", out var prefix) ? prefix : null,
|
||||
Immutable = builder.ContainsKey("immutable"),
|
||||
RetentionSeconds = retentionSeconds,
|
||||
DryRun = builder.ContainsKey("dry-run"),
|
||||
};
|
||||
}
|
||||
|
||||
public static void PrintUsage()
|
||||
{
|
||||
Console.WriteLine(@"Usage: dotnet run --project src/Tools/RustFsMigrator -- \
|
||||
--s3-bucket <name> \
|
||||
[--s3-endpoint http://minio:9000] \
|
||||
[--s3-region us-east-1] \
|
||||
[--s3-access-key key --s3-secret-key secret] \
|
||||
--rustfs-endpoint http://rustfs:8080 \
|
||||
--rustfs-bucket scanner-artifacts \
|
||||
[--rustfs-api-key-header X-API-Key --rustfs-api-key token] \
|
||||
[--prefix scanner/] \
|
||||
[--immutable] \
|
||||
[--retain-days 365] \
|
||||
[--dry-run]");
|
||||
}
|
||||
}
|
||||
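For orientation, a worked example of the escaping behaviour in `BuildRustFsUri` (endpoint and bucket values are illustrative placeholders, not defaults):

```csharp
// With RustFsEndpoint = "http://rustfs:8080" and RustFsBucket = "scanner-artifacts",
// the key "scanner/layers/sha256:abc" should produce:
//
//   http://rustfs:8080/api/v1/buckets/scanner-artifacts/objects/scanner/layers/sha256%3Aabc
//
// Each segment is escaped individually, so the "/" separators survive while
// characters such as ":" are percent-encoded within their segment.
```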
11  src/Tools/RustFsMigrator/RustFsMigrator.csproj  Normal file
@@ -0,0 +1,11 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="AWSSDK.S3" Version="3.7.305.6" />
  </ItemGroup>
</Project>
346  src/Tools/SourceStateSeeder/Program.cs  Normal file
@@ -0,0 +1,346 @@
using System.Globalization;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Abstractions;
using MongoDB.Driver;
using StellaOps.Concelier.Connector.Common;
using StellaOps.Concelier.Connector.Common.Fetch;
using StellaOps.Concelier.Connector.Common.State;
using StellaOps.Concelier.Storage.Mongo;
using StellaOps.Concelier.Storage.Mongo.Documents;

namespace SourceStateSeeder;

internal static class Program
{
    private static readonly JsonSerializerOptions JsonOptions = new(JsonSerializerDefaults.Web)
    {
        PropertyNameCaseInsensitive = true,
        ReadCommentHandling = JsonCommentHandling.Skip,
        AllowTrailingCommas = true,
    };

    public static async Task<int> Main(string[] args)
    {
        try
        {
            var options = SeedOptions.Parse(args);
            if (options is null)
            {
                SeedOptions.PrintUsage();
                return 1;
            }

            var seed = await LoadSpecificationAsync(options.InputPath).ConfigureAwait(false);
            var sourceName = seed.Source ?? options.SourceName;
            if (string.IsNullOrWhiteSpace(sourceName))
            {
                Console.Error.WriteLine("Source name must be supplied via --source or the seed file.");
                return 1;
            }

            var specification = await BuildSpecificationAsync(seed, sourceName, options.InputPath, CancellationToken.None).ConfigureAwait(false);

            var client = new MongoClient(options.ConnectionString);
            var database = client.GetDatabase(options.DatabaseName);
            var loggerFactory = NullLoggerFactory.Instance;

            var documentStore = new DocumentStore(database, loggerFactory.CreateLogger<DocumentStore>());
            var rawStorage = new RawDocumentStorage(database);
            var stateRepository = new MongoSourceStateRepository(database, loggerFactory.CreateLogger<MongoSourceStateRepository>());

            var processor = new SourceStateSeedProcessor(
                documentStore,
                rawStorage,
                stateRepository,
                TimeProvider.System,
                loggerFactory.CreateLogger<SourceStateSeedProcessor>());

            var result = await processor.ProcessAsync(specification, CancellationToken.None).ConfigureAwait(false);

            Console.WriteLine(
                $"Seeded {result.DocumentsProcessed} document(s) for {sourceName} " +
                $"(pendingDocuments += {result.PendingDocumentsAdded}, pendingMappings += {result.PendingMappingsAdded}, knownAdvisories += {result.KnownAdvisoriesAdded.Count}).");
            return 0;
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine($"Error: {ex.Message}");
            return 1;
        }
    }

    private static async Task<StateSeed> LoadSpecificationAsync(string inputPath)
    {
        await using var stream = File.OpenRead(inputPath);
        var seed = await JsonSerializer.DeserializeAsync<StateSeed>(stream, JsonOptions).ConfigureAwait(false)
            ?? throw new InvalidOperationException("Input file deserialized to null.");
        return seed;
    }

    private static async Task<SourceStateSeedSpecification> BuildSpecificationAsync(
        StateSeed seed,
        string sourceName,
        string inputPath,
        CancellationToken cancellationToken)
    {
        var baseDirectory = Path.GetDirectoryName(Path.GetFullPath(inputPath)) ?? Directory.GetCurrentDirectory();
        var documents = new List<SourceStateSeedDocument>(seed.Documents.Count);

        foreach (var documentSeed in seed.Documents)
        {
            documents.Add(await BuildDocumentAsync(documentSeed, baseDirectory, cancellationToken).ConfigureAwait(false));
        }

        return new SourceStateSeedSpecification
        {
            Source = sourceName,
            Documents = documents.AsReadOnly(),
            Cursor = BuildCursor(seed.Cursor),
            KnownAdvisories = NormalizeStrings(seed.KnownAdvisories),
            CompletedAt = seed.CompletedAt,
        };
    }

    private static async Task<SourceStateSeedDocument> BuildDocumentAsync(
        DocumentSeed seed,
        string baseDirectory,
        CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(seed.Uri))
        {
            throw new InvalidOperationException("Seed entry missing 'uri'.");
        }

        if (string.IsNullOrWhiteSpace(seed.ContentFile))
        {
            throw new InvalidOperationException($"Seed entry for '{seed.Uri}' missing 'contentFile'.");
        }

        var contentPath = ResolvePath(seed.ContentFile, baseDirectory);
        if (!File.Exists(contentPath))
        {
            throw new FileNotFoundException($"Content file not found for '{seed.Uri}'.", contentPath);
        }

        var contentBytes = await File.ReadAllBytesAsync(contentPath, cancellationToken).ConfigureAwait(false);

        var metadata = seed.Metadata is null
            ? null
            : new Dictionary<string, string>(seed.Metadata, StringComparer.OrdinalIgnoreCase);

        var headers = seed.Headers is null
            ? null
            : new Dictionary<string, string>(seed.Headers, StringComparer.OrdinalIgnoreCase);

        if (!string.IsNullOrWhiteSpace(seed.ContentType))
        {
            headers ??= new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
            if (!headers.ContainsKey("content-type"))
            {
                headers["content-type"] = seed.ContentType!;
            }
        }

        return new SourceStateSeedDocument
        {
            Uri = seed.Uri,
            DocumentId = seed.DocumentId,
            Content = contentBytes,
            ContentType = seed.ContentType,
            Status = string.IsNullOrWhiteSpace(seed.Status) ? DocumentStatuses.PendingParse : seed.Status,
            Headers = headers,
            Metadata = metadata,
            Etag = seed.Etag,
            LastModified = ParseOptionalDate(seed.LastModified),
            ExpiresAt = seed.ExpiresAt,
            FetchedAt = ParseOptionalDate(seed.FetchedAt),
            AddToPendingDocuments = seed.AddToPendingDocuments,
            AddToPendingMappings = seed.AddToPendingMappings,
            KnownIdentifiers = NormalizeStrings(seed.KnownIdentifiers),
        };
    }

    private static SourceStateSeedCursor? BuildCursor(CursorSeed? cursorSeed)
    {
        if (cursorSeed is null)
        {
            return null;
        }

        return new SourceStateSeedCursor
        {
            PendingDocuments = NormalizeGuids(cursorSeed.PendingDocuments),
            PendingMappings = NormalizeGuids(cursorSeed.PendingMappings),
            KnownAdvisories = NormalizeStrings(cursorSeed.KnownAdvisories),
            LastModifiedCursor = cursorSeed.LastModifiedCursor,
            LastFetchAt = cursorSeed.LastFetchAt,
            Additional = cursorSeed.Additional is null
                ? null
                : new Dictionary<string, string>(cursorSeed.Additional, StringComparer.OrdinalIgnoreCase),
        };
    }

    private static IReadOnlyCollection<Guid>? NormalizeGuids(IEnumerable<Guid>? values)
    {
        if (values is null)
        {
            return null;
        }

        var set = new HashSet<Guid>();
        foreach (var guid in values)
        {
            if (guid != Guid.Empty)
            {
                set.Add(guid);
            }
        }

        return set.Count == 0 ? null : set.ToList();
    }

    private static IReadOnlyCollection<string>? NormalizeStrings(IEnumerable<string>? values)
    {
        if (values is null)
        {
            return null;
        }

        var set = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        foreach (var value in values)
        {
            if (!string.IsNullOrWhiteSpace(value))
            {
                set.Add(value.Trim());
            }
        }

        return set.Count == 0 ? null : set.ToList();
    }

    private static DateTimeOffset? ParseOptionalDate(string? value)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            return null;
        }

        return DateTimeOffset.Parse(value, CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal);
    }

    private static string ResolvePath(string path, string baseDirectory)
        => Path.IsPathRooted(path) ? path : Path.GetFullPath(Path.Combine(baseDirectory, path));
}

internal sealed record SeedOptions
{
    public required string ConnectionString { get; init; }
    public required string DatabaseName { get; init; }
    public required string InputPath { get; init; }
    public string? SourceName { get; init; }

    public static SeedOptions? Parse(string[] args)
    {
        string? connectionString = null;
        string? database = null;
        string? input = null;
        string? source = null;

        for (var i = 0; i < args.Length; i++)
        {
            var arg = args[i];
            switch (arg)
            {
                case "--connection-string":
                case "-c":
                    connectionString = TakeValue(args, ref i, arg);
                    break;
                case "--database":
                case "-d":
                    database = TakeValue(args, ref i, arg);
                    break;
                case "--input":
                case "-i":
                    input = TakeValue(args, ref i, arg);
                    break;
                case "--source":
                case "-s":
                    source = TakeValue(args, ref i, arg);
                    break;
                case "--help":
                case "-h":
                    return null;
                default:
                    Console.Error.WriteLine($"Unrecognized argument '{arg}'.");
                    return null;
            }
        }

        if (string.IsNullOrWhiteSpace(connectionString) || string.IsNullOrWhiteSpace(database) || string.IsNullOrWhiteSpace(input))
        {
            return null;
        }

        return new SeedOptions
        {
            ConnectionString = connectionString,
            DatabaseName = database,
            InputPath = input,
            SourceName = source,
        };
    }

    public static void PrintUsage()
    {
        Console.WriteLine("Usage: dotnet run --project src/Tools/SourceStateSeeder -- --connection-string <connection> --database <name> --input <seed.json> [--source <source>]");
    }

    private static string TakeValue(string[] args, ref int index, string arg)
    {
        if (index + 1 >= args.Length)
        {
            throw new ArgumentException($"Missing value for {arg}.");
        }

        index++;
        return args[index];
    }
}

internal sealed record StateSeed
{
    public string? Source { get; init; }
    public List<DocumentSeed> Documents { get; init; } = new();
    public CursorSeed? Cursor { get; init; }
    public List<string>? KnownAdvisories { get; init; }
    public DateTimeOffset? CompletedAt { get; init; }
}

internal sealed record DocumentSeed
{
    public string Uri { get; init; } = string.Empty;
    public string ContentFile { get; init; } = string.Empty;
    public Guid? DocumentId { get; init; }
    public string? ContentType { get; init; }
    public Dictionary<string, string>? Metadata { get; init; }
    public Dictionary<string, string>? Headers { get; init; }
    public string Status { get; init; } = DocumentStatuses.PendingParse;
    public bool AddToPendingDocuments { get; init; } = true;
    public bool AddToPendingMappings { get; init; }
    public string? LastModified { get; init; }
    public string? FetchedAt { get; init; }
    public string? Etag { get; init; }
    public DateTimeOffset? ExpiresAt { get; init; }
    public List<string>? KnownIdentifiers { get; init; }
}

internal sealed record CursorSeed
{
    public List<Guid>? PendingDocuments { get; init; }
    public List<Guid>? PendingMappings { get; init; }
    public List<string>? KnownAdvisories { get; init; }
    public DateTimeOffset? LastModifiedCursor { get; init; }
    public DateTimeOffset? LastFetchAt { get; init; }
    public Dictionary<string, string>? Additional { get; init; }
}
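For reference, a minimal seed file matching the `StateSeed`/`DocumentSeed`/`CursorSeed` records above (web-default camelCase naming; every value is an illustrative placeholder, not shipped fixture data). `contentFile` is resolved relative to the seed file's directory, and `status` is omitted so the `DocumentStatuses.PendingParse` default applies:

```json
{
  "source": "cert-bund",
  "documents": [
    {
      "uri": "https://example.test/advisories/adv-0001.json",
      "contentFile": "payloads/adv-0001.json",
      "contentType": "application/json",
      "addToPendingDocuments": true,
      "addToPendingMappings": false,
      "knownIdentifiers": ["ADV-0001"]
    }
  ],
  "cursor": {
    "lastModifiedCursor": "2025-10-01T00:00:00+00:00",
    "knownAdvisories": ["ADV-0001"]
  },
  "completedAt": "2025-10-20T12:00:00+00:00"
}
```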
12  src/Tools/SourceStateSeeder/SourceStateSeeder.csproj  Normal file
@@ -0,0 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>
  <ItemGroup>
    <ProjectReference Include="..\..\src\StellaOps.Concelier.Connector.Common\StellaOps.Concelier.Connector.Common.csproj" />
    <ProjectReference Include="..\..\src\StellaOps.Concelier.Storage.Mongo\StellaOps.Concelier.Storage.Mongo.csproj" />
  </ItemGroup>
</Project>
444  src/Tools/certbund_offline_snapshot.py  Normal file
@@ -0,0 +1,444 @@
#!/usr/bin/env python3
"""
Capture CERT-Bund search/export JSON snapshots and generate Offline Kit manifests.

The script can bootstrap a session against https://wid.cert-bund.de, fetch
paginated search results plus per-year export payloads, and emit a manifest
that records source, date range, SHA-256, and capture timestamps for each artefact.
"""

from __future__ import annotations

import argparse
import datetime as dt
import hashlib
import json
import os
from pathlib import Path, PurePosixPath
import sys
import time
import urllib.error
import urllib.parse
import urllib.request
from http.cookiejar import MozillaCookieJar
from typing import Any, Dict, Iterable, List, Optional


PORTAL_ROOT = "https://wid.cert-bund.de/portal/"
SEARCH_ENDPOINT = "https://wid.cert-bund.de/portal/api/securityadvisory/search"
EXPORT_ENDPOINT = "https://wid.cert-bund.de/portal/api/securityadvisory/export"
CSRF_ENDPOINT = "https://wid.cert-bund.de/portal/api/security/csrf"
USER_AGENT = "StellaOps.CertBundOffline/0.1"

UTC = dt.timezone.utc


class CertBundClient:
    def __init__(
        self,
        cookie_file: Optional[Path] = None,
        xsrf_token: Optional[str] = None,
        auto_bootstrap: bool = True,
    ) -> None:
        self.cookie_path = cookie_file
        self.cookie_jar = MozillaCookieJar()

        if self.cookie_path and self.cookie_path.exists():
            self.cookie_jar.load(self.cookie_path, ignore_discard=True, ignore_expires=True)

        self.opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(self.cookie_jar))
        self.opener.addheaders = [("User-Agent", USER_AGENT)]

        self._xsrf_token = xsrf_token
        self.auto_bootstrap = auto_bootstrap

        if self.auto_bootstrap and not self._xsrf_token:
            self._bootstrap()

    @property
    def xsrf_token(self) -> str:
        if self._xsrf_token:
            return self._xsrf_token

        token = _extract_cookie_value(self.cookie_jar, "XSRF-TOKEN")
        if token:
            self._xsrf_token = token
            return token

        raise RuntimeError(
            "CERT-Bund XSRF token not available. Provide --xsrf-token or a cookie file "
            "containing XSRF-TOKEN (see docs/modules/concelier/operations/connectors/certbund.md)."
        )

    def fetch_search_pages(
        self,
        destination: Path,
        page_size: int,
        max_pages: int,
    ) -> None:
        destination.mkdir(parents=True, exist_ok=True)

        for page in range(max_pages):
            payload = {
                "page": page,
                "size": page_size,
                "sort": ["published,desc"],
            }
            try:
                document = self._post_json(SEARCH_ENDPOINT, payload)
            except urllib.error.HTTPError as exc:
                raise RuntimeError(
                    f"Failed to fetch CERT-Bund search page {page}: HTTP {exc.code}. "
                    "Double-check the XSRF token or portal cookies."
                ) from exc

            content = document.get("content") or []
            if not content and page > 0:
                break

            file_path = destination / f"certbund-search-page-{page:02d}.json"
            _write_pretty_json(file_path, document)
            print(f"[certbund] wrote search page {page:02d} → {file_path}")

            if not content:
                break

        self._persist_cookies()

    def fetch_exports(self, destination: Path, start_year: int, end_year: int) -> None:
        destination.mkdir(parents=True, exist_ok=True)

        for year in range(start_year, end_year + 1):
            from_value = f"{year}-01-01"
            to_value = f"{year}-12-31"
            query = urllib.parse.urlencode({"format": "json", "from": from_value, "to": to_value})
            url = f"{EXPORT_ENDPOINT}?{query}"
            try:
                document = self._get_json(url)
            except urllib.error.HTTPError as exc:
                raise RuntimeError(
                    f"Failed to fetch CERT-Bund export for {year}: HTTP {exc.code}. "
                    "Ensure the XSRF token and cookies are valid."
                ) from exc

            file_path = destination / f"certbund-export-{year}.json"
            _write_pretty_json(file_path, document)
            print(f"[certbund] wrote export {year} → {file_path}")

        self._persist_cookies()

    def _bootstrap(self) -> None:
        try:
            self._request("GET", PORTAL_ROOT, headers={"Accept": "text/html,application/xhtml+xml"})
        except urllib.error.HTTPError as exc:
            raise RuntimeError(f"Failed to bootstrap CERT-Bund session: HTTP {exc.code}") from exc

        # First attempt to obtain CSRF token directly.
        self._attempt_csrf_fetch()

        if _extract_cookie_value(self.cookie_jar, "XSRF-TOKEN"):
            return

        # If the token is still missing, trigger the search endpoint once (likely 403)
        # to make the portal materialise JSESSIONID, then retry token acquisition.
        try:
            payload = {"page": 0, "size": 1, "sort": ["published,desc"]}
            self._post_json(SEARCH_ENDPOINT, payload, include_token=False)
        except urllib.error.HTTPError:
            pass

        self._attempt_csrf_fetch()

        token = _extract_cookie_value(self.cookie_jar, "XSRF-TOKEN")
        if token:
            self._xsrf_token = token
        else:
            print(
                "[certbund] warning: automatic XSRF token retrieval failed. "
                "Supply --xsrf-token or reuse a browser-exported cookies file.",
                file=sys.stderr,
            )

    def _attempt_csrf_fetch(self) -> None:
        headers = {
            "Accept": "application/json, text/plain, */*",
            "X-Requested-With": "XMLHttpRequest",
            "Origin": "https://wid.cert-bund.de",
            "Referer": PORTAL_ROOT,
        }
        try:
            self._request("GET", CSRF_ENDPOINT, headers=headers)
        except urllib.error.HTTPError:
            pass

    def _request(self, method: str, url: str, data: Optional[bytes] = None, headers: Optional[Dict[str, str]] = None) -> bytes:
        request = urllib.request.Request(url, data=data, method=method)
        default_headers = {
            "User-Agent": USER_AGENT,
            "Accept": "application/json",
        }
        for key, value in default_headers.items():
            request.add_header(key, value)

        if headers:
            for key, value in headers.items():
                request.add_header(key, value)

        return self.opener.open(request, timeout=60).read()

    def _post_json(self, url: str, payload: Dict[str, Any], include_token: bool = True) -> Dict[str, Any]:
        data = json.dumps(payload).encode("utf-8")
        headers = {
            "Content-Type": "application/json",
            "Accept": "application/json",
            "X-Requested-With": "XMLHttpRequest",
            "Origin": "https://wid.cert-bund.de",
            "Referer": PORTAL_ROOT,
        }
        if include_token:
            headers["X-XSRF-TOKEN"] = self.xsrf_token

        raw = self._request("POST", url, data=data, headers=headers)
        return json.loads(raw.decode("utf-8"))

    def _get_json(self, url: str) -> Any:
        headers = {
            "Accept": "application/json",
            "X-Requested-With": "XMLHttpRequest",
            "Referer": PORTAL_ROOT,
        }
        headers["X-XSRF-TOKEN"] = self.xsrf_token

        raw = self._request("GET", url, headers=headers)
        return json.loads(raw.decode("utf-8"))

    def _persist_cookies(self) -> None:
        if not self.cookie_path:
            return

        self.cookie_path.parent.mkdir(parents=True, exist_ok=True)
        self.cookie_jar.save(self.cookie_path, ignore_discard=True, ignore_expires=True)


def _extract_cookie_value(jar: MozillaCookieJar, name: str) -> Optional[str]:
    for cookie in jar:
        if cookie.name == name:
            return cookie.value
    return None


def _write_pretty_json(path: Path, document: Any) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    with path.open("w", encoding="utf-8") as handle:
        json.dump(document, handle, ensure_ascii=False, indent=2, sort_keys=True)
        handle.write("\n")


def scan_artifacts(root: Path) -> List[Dict[str, Any]]:
    records: List[Dict[str, Any]] = []
    search_dir = root / "search"
    export_dir = root / "export"

    if search_dir.exists():
        for file_path in sorted(search_dir.glob("certbund-search-page-*.json")):
            record = _build_search_record(file_path)
            records.append(record)

    if export_dir.exists():
        for file_path in sorted(export_dir.glob("certbund-export-*.json")):
            record = _build_export_record(file_path)
            records.append(record)

    return records


def _build_search_record(path: Path) -> Dict[str, Any]:
    with path.open("r", encoding="utf-8") as handle:
        data = json.load(handle)

    content = data.get("content") or []
    published_values: List[str] = []
    for item in content:
        published = (
            item.get("published")
            or item.get("publishedAt")
            or item.get("datePublished")
            or item.get("published_date")
        )
        if isinstance(published, str):
            published_values.append(published)

    if published_values:
        try:
            ordered = sorted(_parse_iso_timestamp(value) for value in published_values if value)
            range_from = ordered[0].isoformat()
            range_to = ordered[-1].isoformat()
        except ValueError:
            range_from = range_to = None
    else:
        range_from = range_to = None

    return {
        "type": "search",
        "path": path,
        "source": "concelier.cert-bund.search",
        "itemCount": len(content),
        "from": range_from,
        "to": range_to,
        "capturedAt": _timestamp_from_stat(path),
    }


def _build_export_record(path: Path) -> Dict[str, Any]:
    year = _extract_year_from_filename(path.name)
    if year is not None:
        from_value = f"{year}-01-01"
        to_value = f"{year}-12-31"
    else:
        from_value = None
        to_value = None

    return {
        "type": "export",
        "path": path,
        "source": "concelier.cert-bund.export",
        "itemCount": None,
        "from": from_value,
        "to": to_value,
        "capturedAt": _timestamp_from_stat(path),
    }


def _timestamp_from_stat(path: Path) -> str:
    stat = path.stat()
    return dt.datetime.fromtimestamp(stat.st_mtime, tz=UTC).isoformat()


def _extract_year_from_filename(name: str) -> Optional[int]:
    stem = Path(name).stem
    parts = stem.split("-")
    if parts and parts[-1].isdigit() and len(parts[-1]) == 4:
        return int(parts[-1])
    return None


def _parse_iso_timestamp(value: str) -> dt.datetime:
    try:
        return dt.datetime.fromisoformat(value.replace("Z", "+00:00"))
    except ValueError:
        # Fallback for formats like 2025-10-14T06:24:49
        return dt.datetime.strptime(value, "%Y-%m-%dT%H:%M:%S").replace(tzinfo=UTC)


def build_manifest(root: Path, records: Iterable[Dict[str, Any]], manifest_path: Path) -> None:
    manifest_entries = []
    for record in records:
        path = record["path"]
        rel_path = PurePosixPath(path.relative_to(root).as_posix())
        sha256 = hashlib.sha256(path.read_bytes()).hexdigest()
        size = path.stat().st_size

        entry = {
            "source": record["source"],
            "type": record["type"],
            "path": str(rel_path),
            "sha256": sha256,
            "sizeBytes": size,
            "capturedAt": record["capturedAt"],
            "from": record.get("from"),
            "to": record.get("to"),
            "itemCount": record.get("itemCount"),
        }
        manifest_entries.append(entry)

        sha_file = path.with_suffix(path.suffix + ".sha256")
        _write_sha_file(sha_file, sha256, path.name)

    manifest_entries.sort(key=lambda item: item["path"])

    manifest_path.parent.mkdir(parents=True, exist_ok=True)
    manifest_document = {
        "source": "concelier.cert-bund",
        "generatedAt": dt.datetime.now(tz=UTC).isoformat(),
        "artifacts": manifest_entries,
    }

    with manifest_path.open("w", encoding="utf-8") as handle:
        json.dump(manifest_document, handle, ensure_ascii=False, indent=2, sort_keys=True)
        handle.write("\n")

    manifest_sha = hashlib.sha256(manifest_path.read_bytes()).hexdigest()
    _write_sha_file(manifest_path.with_suffix(".sha256"), manifest_sha, manifest_path.name)

    print(f"[certbund] manifest generated → {manifest_path}")


def _write_sha_file(path: Path, digest: str, filename: str) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    with path.open("w", encoding="utf-8") as handle:
        handle.write(f"{digest} {filename}\n")


def parse_args(argv: Optional[List[str]] = None) -> argparse.Namespace:
    parser = argparse.ArgumentParser(
        description="Capture CERT-Bund search/export snapshots for Offline Kit packaging.",
    )
    parser.add_argument("--output", default="seed-data/cert-bund", help="Destination directory for artefacts.")
    parser.add_argument("--start-year", type=int, default=2014, help="First year (inclusive) for export snapshots.")
    parser.add_argument(
        "--end-year",
        type=int,
        default=dt.datetime.now(tz=UTC).year,
        help="Last year (inclusive) for export snapshots.",
    )
    parser.add_argument("--page-size", type=int, default=100, help="Search page size.")
    parser.add_argument("--max-pages", type=int, default=12, help="Maximum number of search result pages to capture.")
    parser.add_argument("--cookie-file", type=Path, help="Path to a Netscape cookie file to reuse/persist session cookies.")
    parser.add_argument("--xsrf-token", help="Optional explicit XSRF token value (overrides cookie discovery).")
    parser.add_argument(
        "--skip-fetch",
        action="store_true",
        help="Skip HTTP fetches and only regenerate manifest from existing files.",
    )
    parser.add_argument(
        "--no-bootstrap",
        action="store_true",
        help="Do not attempt automatic session bootstrap (use with --skip-fetch or pre-populated cookies).",
    )
    return parser.parse_args(argv)


def main(argv: Optional[List[str]] = None) -> int:
    args = parse_args(argv)
    output_dir = Path(args.output).expanduser().resolve()

    if not args.skip_fetch:
        client = CertBundClient(
            cookie_file=args.cookie_file,
            xsrf_token=args.xsrf_token,
            auto_bootstrap=not args.no_bootstrap,
        )

        start_year = args.start_year
        end_year = args.end_year
        if start_year > end_year:
            raise SystemExit("start-year cannot be greater than end-year.")

        client.fetch_search_pages(output_dir / "search", args.page_size, args.max_pages)
        client.fetch_exports(output_dir / "export", start_year, end_year)

    records = scan_artifacts(output_dir)
    if not records:
        print(
            "[certbund] no artefacts discovered. Fetch data first or point --output to the dataset directory.",
            file=sys.stderr,
        )
        return 1

    manifest_path = output_dir / "manifest" / "certbund-offline-manifest.json"
    build_manifest(output_dir, records, manifest_path)
    return 0


if __name__ == "__main__":
    sys.exit(main())
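For orientation, the manifest emitted by `build_manifest` has the following shape (keys are sorted because the writer uses `sort_keys=True`; all values below are placeholders, not captured data):

```json
{
  "artifacts": [
    {
      "capturedAt": "2025-10-20T12:00:00+00:00",
      "from": "2024-01-01",
      "itemCount": null,
      "path": "export/certbund-export-2024.json",
      "sha256": "<hex digest>",
      "sizeBytes": 123456,
      "source": "concelier.cert-bund.export",
      "to": "2024-12-31",
      "type": "export"
    }
  ],
  "generatedAt": "2025-10-20T12:00:05+00:00",
  "source": "concelier.cert-bund"
}
```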
BIN  src/Tools/openssl/linux-x64/libcrypto.so.1.1  Normal file
Binary file not shown.
BIN  src/Tools/openssl/linux-x64/libssl.so.1.1  Normal file
Binary file not shown.
@@ -1,105 +1,105 @@
# Zastava Webhook · Wave 0 Implementation Notes

> Authored 2025-10-19 by Zastava Webhook Guild.

## ZASTAVA-WEBHOOK-12-101 — Admission Controller Host (TLS bootstrap + Authority auth)

**Objectives**
- Provide a deterministic, restart-safe .NET 10 host that exposes a Kubernetes ValidatingAdmissionWebhook endpoint.
- Load serving certificates at start-up only (per restart-time plug-in rule) and surface reload guidance via documentation rather than hot-reload.
- Authenticate outbound calls to Authority/Scanner using OpTok + DPoP as defined in `docs/modules/zastava/ARCHITECTURE.md`.

**Plan**
1. **Project scaffolding**
   - Create `StellaOps.Zastava.Webhook` project with minimal API pipeline (`Program.cs`, `Startup` equivalent via extension methods).
   - Reference shared helpers once `ZASTAVA-CORE-12-201/202` land; temporarily stub interfaces behind `IZastavaAdmissionRequest`/`IZastavaAdmissionResult`.
2. **TLS bootstrap**
   - Support two certificate sources:
     1. Mounted secret path (`/var/run/secrets/zastava-webhook/tls.{crt,key}`) with optional CA bundle.
     2. CSR workflow: generate CSR + private key, submit to Kubernetes Certificates API when `admission.tls.autoApprove` enabled; persist signed cert/key to mounted emptyDir for reuse across replicas.
   - Validate the cert/key pair on boot and abort start-up if invalid to preserve deterministic behavior (a minimal sketch follows this plan).
   - Configure Kestrel with mutual TLS off (the API server already performs client auth) but enforce minimum TLS 1.3 and a strong cipher suite list, with HTTP/2 disabled (Kubernetes uses HTTP/1.1).
3. **Authority auth**
   - Bootstrap Authority client via shared runtime core (`AddZastavaRuntimeCore` + `IZastavaAuthorityTokenProvider`) so webhook reuses multitenant OpTok caching and guardrails.
   - Implement DPoP proof generator bound to webhook host keypair (prefer Ed25519) with configurable rotation period (default 24h, triggered at restart).
   - Add background health check verifying token freshness and surfacing metrics (`zastava.authority_token_renew_failures_total`).
4. **Hosting concerns**
   - Configure structured logging with correlation id from AdmissionReview UID.
   - Expose `/healthz` (reads cert expiry, Authority token status) and `/metrics` (Prometheus).
   - Add readiness gate that requires initial TLS and Authority bootstrap to succeed.
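A rough illustration of the boot-time validation step in the TLS bootstrap item above. Only the secret paths come from the plan; the helper name and the fail-fast policy are assumptions, not the final host wiring:

```csharp
using System.Security.Cryptography.X509Certificates;

internal static class TlsBootstrap
{
    // Illustrative only: load the mounted PEM pair and refuse to start if it is unusable.
    public static X509Certificate2 LoadServingCertificateOrAbort(
        string certPath = "/var/run/secrets/zastava-webhook/tls.crt",
        string keyPath = "/var/run/secrets/zastava-webhook/tls.key")
    {
        try
        {
            var certificate = X509Certificate2.CreateFromPemFile(certPath, keyPath);
            if (DateTimeOffset.UtcNow > certificate.NotAfter)
            {
                throw new InvalidOperationException($"Serving certificate expired at {certificate.NotAfter:O}.");
            }

            return certificate;
        }
        catch (Exception ex)
        {
            // Deterministic behavior: abort start-up rather than serve with a bad cert.
            Console.Error.WriteLine($"TLS bootstrap failed: {ex.Message}");
            Environment.Exit(1);
            throw; // unreachable; satisfies definite-return analysis
        }
    }
}
```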

**Deliverables**
- Compilable host project with integration tests covering TLS load (mounted files + CSR mock) and Authority token acquisition.
- Documentation snippet for deploy charts describing secret/CSR wiring.

**Open Questions**
- Need confirmation from Core guild on DTO naming (`AdmissionReviewEnvelope`, `AdmissionDecision`) to avoid rework.
- Determine whether CSR auto-approval is acceptable for air-gapped clusters without Kubernetes cert-manager; may require fallback manual cert import path.
## ZASTAVA-WEBHOOK-12-102 — Backend policy query & digest resolution

**Objectives**
- Resolve all images within AdmissionReview to immutable digests before policy evaluation.
- Call Scanner WebService `/api/v1/scanner/policy/runtime` with namespace/labels/images payload, enforce verdicts with deterministic error messaging.

**Plan**
1. **Image resolution**
   - Implement resolver service with pluggable strategies:
     - Use existing digest if present.
     - Resolve tags via registry HEAD (respecting `admission.resolveTags` flag); fallback to Observer-provided digest once core DTOs available.
   - Cache per-registry auth to minimise latency; adhere to allow/deny lists from configuration.
2. **Scanner client**
   - Define typed request/response models mirroring the `docs/modules/zastava/ARCHITECTURE.md` structure (`ttlSeconds`, `results[digest] -> { signed, hasSbom, policyVerdict, reasons, rekor }`).
   - Implement retry policy (3 attempts, exponential backoff) and map HTTP errors to webhook fail-open/closed depending on namespace configuration.
   - Instrument latency (`zastava.backend_latency_seconds`) and failure counts.
3. **Verdict enforcement**
   - Evaluate per-image results: if any `policyVerdict != pass` (or `warn` when `enforceWarnings=false`), deny with aggregated reasons; see the sketch after this list.
   - Attach `ttlSeconds` to admission response annotations for auditing.
   - Record structured logs with namespace, pod, image digest, decision, reasons, backend latency.
4. **Contract coordination**
   - Schedule joint review with Scanner WebService guild once SCANNER-RUNTIME-12-302 schema stabilises; track in TASKS sub-items.
   - Provide sample payload fixtures for CLI team (`CLI-RUNTIME-13-005`) to validate table output; ensure field names stay aligned.
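A hedged sketch of the per-image verdict fold described in step 3. The `RuntimePolicyResult` shape and all names below are assumptions derived from the contract fields named above, not a settled DTO:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical result shape mirroring the contract fields above.
internal sealed record RuntimePolicyResult(string PolicyVerdict, IReadOnlyList<string> Reasons);

internal static class VerdictEvaluator
{
    // Returns allow/deny plus the aggregated reasons attached to the admission response.
    public static (bool Allowed, IReadOnlyList<string> Reasons) Evaluate(
        IReadOnlyDictionary<string, RuntimePolicyResult> resultsByDigest,
        bool enforceWarnings)
    {
        var reasons = new List<string>();

        foreach (var (digest, result) in resultsByDigest)
        {
            // "warn" is tolerated only when warnings are not enforced;
            // any other non-pass verdict denies the whole request.
            var acceptable = result.PolicyVerdict == "pass"
                || (result.PolicyVerdict == "warn" && !enforceWarnings);

            if (!acceptable)
            {
                reasons.AddRange(result.Reasons.Select(reason => $"{digest}: {reason}"));
            }
        }

        return (reasons.Count == 0, reasons);
    }
}
```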

**Deliverables**
- Registry resolver unit tests (tag->digest) with deterministic fixtures.
- HTTP client integration tests using a Scanner stub returning varied verdict combinations.
- Documentation update summarising contract and failure handling.

**Open Questions**
- Confirm expected policy verdict enumeration (`pass|warn|fail|error`?) and textual reason codes.
- TTL behaviour: should the webhook reduce the TTL when the backend returns more than the configured maximum?
## ZASTAVA-WEBHOOK-12-103 — Caching, fail-open/closed toggles, metrics/logging

**Objectives**
- Provide deterministic caching layer respecting backend TTL while ensuring eviction on policy mutation.
- Allow namespace-scoped fail-open behaviour with explicit metrics and alerts.
- Surface actionable metrics/logging aligned with the architecture doc.

**Plan**
1. **Cache design**
   - In-memory LRU keyed by image digest; value carries verdict payload + expiry timestamp.
   - Support optional persistent seed (read-only) to prime hot digests for offline clusters (config: `admission.cache.seedPath`).
   - On startup, load seed file and emit metric `zastava.cache_seed_entries_total`.
   - Evict entries on TTL or when `policyRevision` annotation in AdmissionReview changes (requires hook from Core DTO).
2. **Fail-open/closed toggles**
   - Configuration: global default + namespace overrides through `admission.failOpenNamespaces`, `admission.failClosedNamespaces`.
   - Decision matrix (see the sketch after this plan):
     - Backend success + verdict PASS → allow.
     - Backend success + non-pass → deny unless namespace override says warn allowed.
     - Backend failure → allow if namespace fail-open, deny otherwise; annotate response with `zastava.ops/fail-open=true`.
   - Implement policy change event hook (future) to clear cache if observer signals revocation.
3. **Metrics & logging**
   - Counters: `zastava.admission_requests_total{decision}`, `zastava.cache_hits_total{result=hit|miss}`, `zastava.fail_open_total`, `zastava.backend_failures_total{stage}`.
   - Histograms: `zastava.admission_latency_seconds` (overall), `zastava.resolve_latency_seconds`.
   - Logs: structured JSON with `decision`, `namespace`, `pod`, `imageDigest`, `reasons`, `cacheStatus`, `failMode`.
   - Optionally emit OpenTelemetry span for admission path with attributes capturing backend latency + cache path.
4. **Testing & ops hooks**
   - Unit tests for cache TTL, namespace override logic, fail-open metric increments.
   - Integration test simulating backend outage ensuring fail-open/closed behaviour matches config.
   - Document a runbook snippet describing how to interpret the metrics and toggle namespaces.
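A minimal sketch of the decision matrix in step 2, assuming a flattened view of the inputs (all names here are illustrative, not the shipped API):

```csharp
internal enum AdmissionDecision { Allow, Deny }

internal static class FailModePolicy
{
    // backendSucceeded: the runtime policy call returned a result for this image.
    // verdict: backend verdict ("pass", "warn", "fail", ...); ignored when the backend failed.
    // failOpen: effective namespace setting after failOpen/failClosed overrides.
    // warnAllowed: namespace override that tolerates "warn" verdicts.
    public static AdmissionDecision Decide(bool backendSucceeded, string? verdict, bool failOpen, bool warnAllowed)
    {
        if (!backendSucceeded)
        {
            // Backend failure: honour the toggle; callers also annotate zastava.ops/fail-open=true.
            return failOpen ? AdmissionDecision.Allow : AdmissionDecision.Deny;
        }

        return verdict switch
        {
            "pass" => AdmissionDecision.Allow,
            "warn" when warnAllowed => AdmissionDecision.Allow,
            _ => AdmissionDecision.Deny,
        };
    }
}
```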

**Open Questions**
- Confirm whether cache entries should include `policyRevision` to detect backend policy updates; requires coordination with Policy guild.
- Need guidance on maximum cache size (default suggestion: 5k entries per replica?) to avoid memory blow-up.