Introduce Vexer platform scaffolding and enrich Concelier merge
SPRINTS.md (42 changed lines)
@@ -52,7 +52,7 @@
 | Sprint 1 | Stabilize In-Progress Foundations | src/StellaOps.Cryptography/TASKS.md | DONE (2025-10-13) | Team Authority Platform & Security Guild | AUTHSEC-DOCS-01-002 | SEC3.B — Published `docs/security/rate-limits.md` with tuning matrix, alert thresholds, and lockout interplay guidance; Docs guild can lift copy into plugin guide. |
 | Sprint 1 | Stabilize In-Progress Foundations | src/StellaOps.Cryptography/TASKS.md | DONE (2025-10-14) | Team Authority Platform & Security Guild | AUTHSEC-CRYPTO-02-001 | SEC5.B1 — Introduce libsodium signing provider and parity tests to unblock CLI verification enhancements. |
 | Sprint 1 | Bootstrap & Replay Hardening | src/StellaOps.Cryptography/TASKS.md | DONE (2025-10-14) | Security Guild | AUTHSEC-CRYPTO-02-004 | SEC5.D/E — Finish bootstrap invite lifecycle (API/store/cleanup) and token device heuristics; build currently red due to pending handler integration. |
-| Sprint 1 | Developer Tooling | src/StellaOps.Cli/TASKS.md | TODO | DevEx/CLI | AUTHCLI-DIAG-01-001 | Surface password policy diagnostics in CLI startup/output so operators see weakened overrides immediately. |
+| Sprint 1 | Developer Tooling | src/StellaOps.Cli/TASKS.md | DONE (2025-10-15) | DevEx/CLI | AUTHCLI-DIAG-01-001 | Surface password policy diagnostics in CLI startup/output so operators see weakened overrides immediately.<br>CLI now loads Authority plug-ins at startup, logs weakened password policies (length/complexity), and regression coverage lives in `StellaOps.Cli.Tests/Services/AuthorityDiagnosticsReporterTests`. |
 | Sprint 1 | Stabilize In-Progress Foundations | src/StellaOps.Authority/StellaOps.Authority.Plugin.Standard/TASKS.md | DONE (2025-10-11) | Team Authority Platform & Security Guild | AUTHPLUG-DOCS-01-001 | PLG6.DOC — Developer guide copy + diagrams merged 2025-10-11; limiter guidance incorporated and handed to Docs guild for asset export. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Normalization/TASKS.md | DONE (2025-10-12) | Team Normalization & Storage Backbone | FEEDNORM-NORM-02-001 | SemVer normalized rule emitter<br>`SemVerRangeRuleBuilder` shipped 2025-10-12 with comparator/`||` support and fixtures aligning to `FASTER_MODELING_AND_NORMALIZATION.md`. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Storage.Mongo/TASKS.md | DONE (2025-10-11) | Team Normalization & Storage Backbone | FEEDSTORAGE-DATA-02-001 | Normalized range dual-write + backfill |
@@ -70,14 +70,14 @@
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Osv/TASKS.md | DONE (2025-10-11) | Team Connector Expansion – GHSA/NVD/OSV | FEEDCONN-OSV-02-003 | OSV normalized versions & freshness |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Osv/TASKS.md | DONE (2025-10-11) | Team Connector Expansion – GHSA/NVD/OSV | FEEDCONN-OSV-02-004 | OSV references & credits alignment |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Osv/TASKS.md | DONE (2025-10-12) | Team Connector Expansion – GHSA/NVD/OSV | FEEDCONN-OSV-02-005 | Fixture updater workflow<br>Resolved 2025-10-12: OSV mapper now derives canonical PURLs for Go + scoped npm packages when raw payloads omit `purl`; conflict fixtures unchanged for invalid npm names. Verified via `dotnet test src/StellaOps.Feedser.Source.Osv.Tests`, `src/StellaOps.Feedser.Source.Ghsa.Tests`, `src/StellaOps.Feedser.Source.Nvd.Tests`, and backbone normalization/storage suites. |
-| Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Acsc/TASKS.md | Implementation DONE (2025-10-12) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-ACSC-02-001 … 02-008 | Fetch→parse→map pipeline, fixtures, diagnostics, and README finished 2025-10-12; awaiting downstream export follow-ups tracked separately. |
+| Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Acsc/TASKS.md | DONE (2025-10-12) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-ACSC-02-001 … 02-008 | Fetch→parse→map pipeline, fixtures, diagnostics, and README finished 2025-10-12; downstream export parity captured via FEEDEXPORT-JSON-04-001 / FEEDEXPORT-TRIVY-04-001 (completed). |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Cccs/TASKS.md | DONE (2025-10-16) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-CCCS-02-001 … 02-008 | Observability meter, historical harvest plan, and DOM sanitizer refinements wrapped; ops notes live under `docs/ops/feedser-cccs-operations.md` with fixtures validating EN/FR list handling. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.CertBund/TASKS.md | DONE (2025-10-15) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-CERTBUND-02-001 … 02-008 | Telemetry/docs (02-006) and history/locale sweep (02-007) completed alongside pipeline; runbook `docs/ops/feedser-certbund-operations.md` captures locale guidance and offline packaging. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Kisa/TASKS.md | DONE (2025-10-14) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-KISA-02-001 … 02-007 | Connector, tests, and telemetry/docs (02-006) finalized; localisation notes in `docs/dev/kisa_connector_notes.md` complete rollout. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Ru.Bdu/TASKS.md | DONE (2025-10-14) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-RUBDU-02-001 … 02-008 | Fetch/parser/mapper refinements, regression fixtures, telemetry/docs, access options, and trusted root packaging all landed; README documents offline access strategy. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Ru.Nkcki/TASKS.md | DONE (2025-10-13) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-NKCKI-02-001 … 02-008 | Listing fetch, parser, mapper, fixtures, telemetry/docs, and archive plan finished; Mongo2Go/libcrypto dependency resolved via bundled OpenSSL noted in ops guide. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Ics.Cisa/TASKS.md | DONE (2025-10-16) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-ICSCISA-02-001 … 02-011 | Feed parser attachment fixes, SemVer exact values, regression suites, telemetry/docs updates, and handover complete; ops runbook now details attachment verification + proxy usage. |
-| Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Vndr.Cisco/TASKS.md | Implementation DONE (2025-10-14) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-CISCO-02-001 … 02-007 | OAuth fetch pipeline, DTO/mapping, tests, and telemetry/docs shipped; monitoring enablement now tracked via follow-up ops tasks (02-006+). |
+| Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Vndr.Cisco/TASKS.md | DONE (2025-10-14) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-CISCO-02-001 … 02-007 | OAuth fetch pipeline, DTO/mapping, tests, and telemetry/docs shipped; monitoring/export integration follow-ups recorded in Ops docs and exporter backlog (completed). |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Vndr.Msrc/TASKS.md | DONE (2025-10-15) | Team Connector Expansion – Regional & Vendor Feeds | FEEDCONN-MSRC-02-001 … 02-008 | Azure AD onboarding (02-008) unblocked fetch/parse/map pipeline; fixtures, telemetry/docs, and Offline Kit guidance published in `docs/ops/feedser-msrc-operations.md`. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Cve/TASKS.md | DONE (2025-10-15) | Team Connector Support & Monitoring | FEEDCONN-CVE-02-001 … 02-002 | CVE data-source selection, fetch pipeline, and docs landed 2025-10-10. 2025-10-15: smoke verified using the seeded mirror fallback; connector now logs a warning and pulls from `seed-data/cve/` until live CVE Services credentials arrive. |
 | Sprint 2 | Connector & Data Implementation Wave | src/StellaOps.Feedser.Source.Kev/TASKS.md | DONE (2025-10-12) | Team Connector Support & Monitoring | FEEDCONN-KEV-02-001 … 02-002 | KEV catalog ingestion, fixtures, telemetry, and schema validation completed 2025-10-12; ops dashboard published. |
@@ -98,3 +98,39 @@
 | Sprint 3 | Conflict Resolution Integration & Communications | src/StellaOps.Feedser.Source.Osv/TASKS.md | DONE (2025-10-12) | Team Connector Regression Fixtures | FEEDCONN-OSV-04-002 | OSV conflict regression fixtures<br>Instructions to work:<br>Read ./AGENTS.md and module AGENTS. Produce fixture triples supporting the precedence/tie-breaker paths defined in ./src/DEDUP_CONFLICTS_RESOLUTION_ALGO.md and hand them to Merge QA. |
 | Sprint 3 | Conflict Resolution Integration & Communications | docs/TASKS.md | DONE (2025-10-11) | Team Documentation Guild – Conflict Guidance | FEEDDOCS-DOCS-05-001 | Feedser Conflict Rules<br>Runbook published at `docs/ops/feedser-conflict-resolution.md`; metrics/log guidance aligned with Sprint 3 merge counters. |
 | Sprint 3 | Conflict Resolution Integration & Communications | docs/TASKS.md | DONE (2025-10-16) | Team Documentation Guild – Conflict Guidance | FEEDDOCS-DOCS-05-002 | Conflict runbook ops rollout<br>Ops review completed, alert thresholds applied, and change log appended in `docs/ops/feedser-conflict-resolution.md`; task closed after connector signals verified. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Models/TASKS.md | DONE (2025-10-15) | Team Models & Merge Leads | FEEDMODELS-SCHEMA-04-001 | Advisory schema parity (description/CWE/canonical metric)<br>Extend `Advisory` and related records with description text, CWE collection, and canonical metric pointer; refresh validation + serializer determinism tests. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Core/TASKS.md | DONE (2025-10-15) | Team Core Engine & Storage Analytics | FEEDCORE-ENGINE-04-003 | Canonical merger parity for new fields<br>Teach `CanonicalMerger` to populate description, CWEResults, and canonical metric pointer with provenance + regression coverage. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Core/TASKS.md | DONE (2025-10-15) | Team Core Engine & Storage Analytics | FEEDCORE-ENGINE-04-004 | Reference normalization & freshness instrumentation cleanup<br>Implement URL normalization for reference dedupe, align freshness-sensitive instrumentation, and add analytics tests. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Merge/TASKS.md | DONE (2025-10-15) | Team Merge & QA Enforcement | FEEDMERGE-ENGINE-04-004 | Merge pipeline parity for new advisory fields<br>Ensure merge service + merge events surface description/CWE/canonical metric decisions with updated metrics/tests. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Merge/TASKS.md | DONE (2025-10-15) | Team Merge & QA Enforcement | FEEDMERGE-ENGINE-04-005 | Connector coordination for new advisory fields<br>GHSA/NVD/OSV connectors now ship description, CWE, and canonical metric data with refreshed fixtures; merge coordination log updated and exporters notified. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Exporter.Json/TASKS.md | DONE (2025-10-15) | Team Exporters – JSON | FEEDEXPORT-JSON-04-001 | Surface new advisory fields in JSON exporter<br>Update schemas/offline bundle + fixtures once model/core parity lands.<br>2025-10-15: `dotnet test src/StellaOps.Feedser.Exporter.Json.Tests` validated canonical metric/CWE emission. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Exporter.TrivyDb/TASKS.md | DONE (2025-10-15) | Team Exporters – Trivy DB | FEEDEXPORT-TRIVY-04-001 | Propagate new advisory fields into Trivy DB package<br>Extend Bolt builder, metadata, and regression tests for the expanded schema.<br>2025-10-15: `dotnet test src/StellaOps.Feedser.Exporter.TrivyDb.Tests` confirmed canonical metric/CWE propagation. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Source.Ghsa/TASKS.md | TODO | Team Connector Regression Fixtures | FEEDCONN-GHSA-04-004 | Harden CVSS fallback so canonical metric ids persist when GitHub omits vectors; extend fixtures and document severity precedence hand-off to Merge. |
+| Sprint 4 | Schema Parity & Freshness Alignment | src/StellaOps.Feedser.Source.Osv/TASKS.md | TODO | Team Connector Expansion – GHSA/NVD/OSV | FEEDCONN-OSV-04-005 | Map OSV advisories lacking CVSS vectors to canonical metric ids/notes and document CWE provenance quirks; schedule parity fixture updates. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Core/TASKS.md | DONE (2025-10-15) | Team Vexer Core & Policy | VEXER-CORE-01-001 | Stand up canonical VEX claim/consensus records with deterministic serializers so Storage/Exports share a stable contract. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Core/TASKS.md | DONE (2025-10-15) | Team Vexer Core & Policy | VEXER-CORE-01-002 | Implement trust-weighted consensus resolver with baseline policy weights, justification gates, telemetry output, and majority/tie handling. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Core/TASKS.md | DONE (2025-10-15) | Team Vexer Core & Policy | VEXER-CORE-01-003 | Publish shared connector/exporter/attestation abstractions and deterministic query signature utilities for cache/attestation workflows. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Policy/TASKS.md | DONE (2025-10-15) | Team Vexer Policy | VEXER-POLICY-01-001 | Established policy options & snapshot provider covering baseline weights/overrides. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Policy/TASKS.md | DONE (2025-10-15) | Team Vexer Policy | VEXER-POLICY-01-002 | Policy evaluator now feeds consensus resolver with immutable snapshots. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Policy/TASKS.md | TODO | Team Vexer Policy | VEXER-POLICY-01-003 | Author policy diagnostics, CLI/WebService surfacing, and documentation updates. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Policy/TASKS.md | TODO | Team Vexer Policy | VEXER-POLICY-01-004 | Implement YAML/JSON schema validation and deterministic diagnostics for operator bundles. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Policy/TASKS.md | TODO | Team Vexer Policy | VEXER-POLICY-01-005 | Add policy change tracking, snapshot digests, and telemetry/logging hooks. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Storage.Mongo/TASKS.md | DONE (2025-10-15) | Team Vexer Storage | VEXER-STORAGE-01-001 | Mongo mapping registry plus raw/export entities and DI extensions in place. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Storage.Mongo/TASKS.md | TODO | Team Vexer Storage | VEXER-STORAGE-01-004 | Build provider/consensus/cache class maps and related collections. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Export/TASKS.md | DONE (2025-10-15) | Team Vexer Export | VEXER-EXPORT-01-001 | Export engine delivers cache lookup, manifest creation, and policy integration. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Export/TASKS.md | TODO | Team Vexer Export | VEXER-EXPORT-01-004 | Connect export engine to attestation client and persist Rekor metadata. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Attestation/TASKS.md | TODO | Team Vexer Attestation | VEXER-ATTEST-01-001 | Implement in-toto predicate + DSSE builder providing envelopes for export attestation. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.Connectors.Abstractions/TASKS.md | TODO | Team Vexer Connectors | VEXER-CONN-ABS-01-001 | Deliver shared connector context/base classes so provider plug-ins can be activated via WebService/Worker. |
+| Sprint 5 | Vexer Core Foundations | src/StellaOps.Vexer.WebService/TASKS.md | TODO | Team Vexer WebService | VEXER-WEB-01-001 | Scaffold minimal API host, DI, and `/vexer/status` endpoint integrating policy, storage, export, and attestation services. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Worker/TASKS.md | TODO | Team Vexer Worker | VEXER-WORKER-01-001 | Create Worker host with provider scheduling and logging to drive recurring pulls/reconciliation. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Formats.CSAF/TASKS.md | TODO | Team Vexer Formats | VEXER-FMT-CSAF-01-001 | Implement CSAF normalizer foundation translating provider documents into `VexClaim` entries. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Formats.CycloneDX/TASKS.md | TODO | Team Vexer Formats | VEXER-FMT-CYCLONE-01-001 | Implement CycloneDX VEX normalizer capturing `analysis` state and component references. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Formats.OpenVEX/TASKS.md | TODO | Team Vexer Formats | VEXER-FMT-OPENVEX-01-001 | Implement OpenVEX normalizer to ingest attestations into canonical claims with provenance. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Connectors.RedHat.CSAF/TASKS.md | TODO | Team Vexer Connectors – Red Hat | VEXER-CONN-RH-01-001 | Ship Red Hat CSAF provider metadata discovery enabling incremental pulls. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Connectors.Cisco.CSAF/TASKS.md | TODO | Team Vexer Connectors – Cisco | VEXER-CONN-CISCO-01-001 | Implement Cisco CSAF endpoint discovery/auth to unlock paginated pulls. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Connectors.SUSE.RancherVEXHub/TASKS.md | TODO | Team Vexer Connectors – SUSE | VEXER-CONN-SUSE-01-001 | Build Rancher VEX Hub discovery/subscription path with offline snapshot support. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Connectors.MSRC.CSAF/TASKS.md | TODO | Team Vexer Connectors – MSRC | VEXER-CONN-MS-01-001 | Deliver AAD onboarding/token cache for MSRC CSAF ingestion. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Connectors.Oracle.CSAF/TASKS.md | TODO | Team Vexer Connectors – Oracle | VEXER-CONN-ORACLE-01-001 | Implement Oracle CSAF catalogue discovery with CPU calendar awareness. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Connectors.Ubuntu.CSAF/TASKS.md | TODO | Team Vexer Connectors – Ubuntu | VEXER-CONN-UBUNTU-01-001 | Implement Ubuntu CSAF discovery and channel selection for USN ingestion. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Vexer.Connectors.OCI.OpenVEX.Attest/TASKS.md | TODO | Team Vexer Connectors – OCI | VEXER-CONN-OCI-01-001 | Wire OCI discovery/auth to fetch OpenVEX attestations for configured images. |
+| Sprint 6 | Vexer Ingest & Formats | src/StellaOps.Cli/TASKS.md | TODO | DevEx/CLI | VEXER-CLI-01-001 | Add `vexer` CLI verbs bridging to WebService with consistent auth and offline UX. |
SPRINTS_VEXER.md (new file, 2 lines)
@@ -0,0 +1,2 @@
| Sprint | Theme | Tasks File Path | Status | Type of Specialist | Task ID | Task Description |
| --- | --- | --- | --- | --- | --- | --- |
@@ -234,6 +234,11 @@ See `docs/dev/32_AUTH_CLIENT_GUIDE.md` for recommended profiles (online vs. air-
 When running on an interactive terminal without explicit override flags, the CLI uses Spectre.Console prompts to let you choose per-run ORAS/offline bundle behaviour.

+**Startup diagnostics**
+
+- `stellaops-cli` now loads Authority plug-in manifests during startup (respecting `Authority:Plugins:*`) and surfaces analyzer warnings when a plug-in weakens the baseline password policy (minimum length **12** and all character classes required).
+- Follow the log entry’s config path and raise `passwordPolicy.minimumLength` to at least 12 while keeping `requireUppercase`, `requireLowercase`, `requireDigit`, and `requireSymbol` set to `true` to clear the warning; weakened overrides are treated as actionable security deviations.

 **Logging & exit codes**

 - Structured logging via `Microsoft.Extensions.Logging` with single-line console output (timestamps in UTC).
docs/ARCHITECTURE_VEXER.md (new file, 72 lines)
@@ -0,0 +1,72 @@
# StellaOps Vexer Architecture

Vexer is StellaOps' vulnerability-exploitability (VEX) platform. It ingests VEX statements from multiple providers, normalizes them into canonical claims, projects trust-weighted consensus, and delivers deterministic export artifacts with signed attestations. This document summarizes the target architecture and how the current implementation maps to those goals.

## 1. Solution topology

| Module | Purpose | Key contracts |
| --- | --- | --- |
| `StellaOps.Vexer.Core` | Domain models (`VexClaim`, `VexConsensus`, `VexExportManifest`), deterministic JSON helpers, shared abstractions (connectors, exporters, attestations). | `IVexConnector`, `IVexExporter`, `IVexAttestationClient`, `VexCanonicalJsonSerializer` |
| `StellaOps.Vexer.Policy` | Loads operator policy (weights, overrides, justification gates) and exposes snapshots for consensus. | `IVexPolicyProvider`, `IVexPolicyEvaluator`, `VexPolicyOptions` |
| `StellaOps.Vexer.Storage.Mongo` | Persistence layer for providers, raw docs, claims, consensus, exports, cache. | `IVexRawStore`, `IVexExportStore`, Mongo class maps |
| `StellaOps.Vexer.Export` | Orchestrates export pipeline (query signature → cache lookup → snapshot build → attestation handoff). | `IExportEngine`, `IVexExportDataSource` |
| `StellaOps.Vexer.Attestation` *(planned)* | Builds in-toto/DSSE envelopes and communicates with Sigstore/Rekor. | `IVexAttestationClient` |
| `StellaOps.Vexer.WebService` *(planned)* | Minimal API host for ingest/export endpoints. | `AddVexerWebService()` |
| `StellaOps.Vexer.Worker` *(planned)* | Background executor for scheduled pulls, verification, reconciliation, cache GC. | Hosted services |

All modules target .NET 10 preview and follow the same deterministic logging and serialization conventions as Feedser.

## 2. Data model

MongoDB acts as the canonical store; collections (with logical responsibilities) are:

- `vex.providers` – provider metadata, trust tiers, discovery endpoints, and cosign/PGP details.
- `vex.raw` – immutable raw documents (CSAF, CycloneDX VEX, OpenVEX, OCI attestations) with digests, retrieval metadata, and signature state.
- `vex.claims` – normalized `VexClaim` rows; deduped on `(providerId, vulnId, productKey, docDigest)`.
- `vex.consensus` – consensus projections per `(vulnId, productKey)` capturing rollup status, source weights, conflicts, and policy revision.
- `vex.exports` – export manifests containing artifact digests, cache metadata, and attestation pointers.
- `vex.cache` – index from `querySignature`/`format` to export digest for fast reuse.

GridFS is used for large raw payloads when necessary, and artifact stores (S3/MinIO/file) hold serialized exports referenced by `vex.exports`.
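The `vex.claims` dedupe tuple can be illustrated as a composite value key. A minimal sketch (the `VexClaimKey` name and example values are hypothetical; the real store enforces the same tuple with a unique Mongo index):

```csharp
using System;
using System.Collections.Generic;

// Dedupe via value equality on (providerId, vulnId, productKey, docDigest),
// mirroring the unique key described for vex.claims above.
var seen = new HashSet<VexClaimKey>();
var key = new VexClaimKey("redhat", "CVE-2025-0001", "pkg:rpm/openssl", "sha256:abc");

bool stored = seen.Add(key);   // true: first sighting is persisted
bool dropped = !seen.Add(key); // true: duplicate digest is skipped
Console.WriteLine($"{stored} {dropped}");

public readonly record struct VexClaimKey(
    string ProviderId, string VulnId, string ProductKey, string DocDigest);
```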
## 3. Ingestion and reconciliation flow

1. **Discovery & configuration** – connectors load YAML/JSON settings via `StellaOps.Vexer.Policy` (provider enablement, trust overrides).
2. **Fetch** – each `IVexConnector` pulls source windows, writing raw documents through `IVexRawDocumentSink` (Mongo-backed) with dedupe on digest.
3. **Verification** – signatures/attestations validated through `IVexSignatureVerifier`; metadata stored alongside raw records.
4. **Normalization** – format-specific `IVexNormalizer` instances translate raw payloads to canonical `VexClaim` batches.
5. **Consensus** – `VexConsensusResolver` (Core) consumes claims with policy weights supplied by `IVexPolicyEvaluator`, producing deterministic consensus entries and conflict annotations.
6. **Export** – query requests pass through `VexExportEngine`, generating `VexExportManifest` instances, caching by `VexQuerySignature`, and emitting artifacts for attestation/signature.
7. **Attestation & transparency** *(planned)* – `IVexAttestationClient` signs exports (in-toto/DSSE) and records bundles in Rekor v2.

The Worker coordinates the long-running steps (fetch/verify/normalize/export), while the WebService exposes synchronous APIs for on-demand operations and status lookups.
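The fetch → store hand-off in steps 2–4 can be sketched as follows. The interface shapes below are assumptions modeled on the names used in this document; the shipped contracts live in `StellaOps.Vexer.Core` and may differ:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sink contract: dedupe on content digest, as described for vex.raw.
public interface IVexRawDocumentSink
{
    // Returns false when the digest was already stored.
    ValueTask<bool> StoreAsync(string digest, byte[] payload, CancellationToken ct);
}

// In-memory stand-in for the Mongo-backed sink, useful for connector tests.
public sealed class InMemoryRawSink : IVexRawDocumentSink
{
    private readonly HashSet<string> _digests = new(StringComparer.Ordinal);

    public ValueTask<bool> StoreAsync(string digest, byte[] payload, CancellationToken ct)
        => ValueTask.FromResult(_digests.Add(digest));
}
```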
## 4. Policy semantics

- **Weights** – default tiers (`vendor=1.0`, `distro=0.9`, `platform=0.7`, `hub=0.5`, `attestation=0.6`) loaded via `VexPolicyOptions.Weights`, with per-provider overrides.
- **Justification gates** – policy enforces that `not_affected` claims must provide a recognized justification; rejected claims are preserved as conflicts with reason metadata.
- **Diagnostics** – policy snapshots carry structured issues for misconfigurations (out-of-range weights, empty overrides) surfaced to operators via logs and future CLI/Web endpoints.

Policy snapshots are immutable and versioned so consensus records capture the policy revision used during evaluation.
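A minimal sketch of the trust-weighted tally, assuming the default tier weights listed above; the claim data is hypothetical, and the real `VexConsensusResolver` additionally applies justification gates and records conflicts:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Default tier weights from the policy section above.
var tierWeights = new Dictionary<string, double>
{
    ["vendor"] = 1.0, ["distro"] = 0.9, ["platform"] = 0.7,
    ["hub"] = 0.5, ["attestation"] = 0.6,
};

// Hypothetical claims for one (vulnId, productKey).
var claims = new[]
{
    (Status: "not_affected", Tier: "vendor"),
    (Status: "affected",     Tier: "hub"),
    (Status: "not_affected", Tier: "distro"),
};

// Sum weights per status; deterministic ordinal tie-break keeps output stable.
var winner = claims
    .GroupBy(c => c.Status)
    .Select(g => (Status: g.Key, Weight: g.Sum(c => tierWeights[c.Tier])))
    .OrderByDescending(t => t.Weight)
    .ThenBy(t => t.Status, StringComparer.Ordinal)
    .First();

Console.WriteLine($"{winner.Status} ({winner.Weight:0.0})"); // not_affected (1.9)
```

Vendor (1.0) plus distro (0.9) outweighs the lone hub claim (0.5), so the rollup lands on `not_affected`.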
## 5. Determinism & caching

- JSON serialization uses `VexCanonicalJsonSerializer`, enforcing property ordering and camelCase naming for reproducible snapshots and test fixtures.
- `VexQuerySignature` produces canonical filter/order strings and SHA-256 digests, enabling cache keys shared across services.
- Export manifests reuse cached artifacts when the same signature/format is requested unless `ForceRefresh` is explicitly set.
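The signature mechanics can be sketched as: sort the filter pairs, build one canonical string, hash it. `VexQuerySignature`'s actual canonical form may differ; this is an illustration of the order-insensitivity property, not the shipped implementation:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

// Canonicalize: ordinal-sort filter pairs, append the format, SHA-256 the result.
static string Signature(IEnumerable<KeyValuePair<string, string>> filters, string format)
{
    var canonical = string.Join("&",
        filters.OrderBy(kv => kv.Key, StringComparer.Ordinal)
               .ThenBy(kv => kv.Value, StringComparer.Ordinal)
               .Select(kv => $"{kv.Key}={kv.Value}"))
        + "|format=" + format;
    return Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(canonical)))
                  .ToLowerInvariant();
}

// Same filters in any insertion order produce the same cache key.
var a = Signature(new Dictionary<string, string>
    { ["vulnId"] = "CVE-2025-0001", ["productKey"] = "pkg:rpm/openssl" }, "openvex");
var b = Signature(new Dictionary<string, string>
    { ["productKey"] = "pkg:rpm/openssl", ["vulnId"] = "CVE-2025-0001" }, "openvex");
Console.WriteLine(a == b); // True
```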
## 6. Observability & offline posture

- Structured logs (`ILogger`) capture correlation IDs, query signatures, provider IDs, and policy revisions. Metrics/OTel instrumentation will mirror Feedser once tracing hooks are added.
- Offline-first: connectors, policy bundles, and export caches can be bundled inside the Offline Kit; no mandatory outbound calls beyond configured provider allowlists.
- Operator tooling (CLI/WebService) will expose diagnostics (policy issues, verification failures, cache status) so air-gapped deployments maintain visibility without external telemetry.

## 7. Roadmap highlights

- Complete storage mappings for providers/consensus/cache and add migrations/indices per collection.
- Implement Rekor/in-toto attestation clients and wire export engine to produce signed bundles.
- Build WebService endpoints (`/vexer/status`, `/vexer/claims`, `/vexer/exports`) plus CLI verbs mirroring Feedser patterns.
- Provide CSAF, CycloneDX VEX, and OpenVEX normalizers along with vendor-specific connectors (Red Hat, Cisco, SUSE, MSRC, Oracle, Ubuntu, OCI attestation).
- Extend policy diagnostics with schema validation, change tracking, and operator-facing diff reports.

This architecture keeps Vexer aligned with StellaOps' deterministic, offline-operable design while layering VEX-specific consensus and attestation capabilities on top of the Feedser foundations.
@@ -42,6 +42,10 @@ Legend: ✅ complete, ⚠️ in progress/partial, ❌ not started.
 - Merge now emits `feedser.merge.normalized_rules` (tags: `package_type`, `scheme`) and `feedser.merge.normalized_rules_missing` (tags: `package_type`). Track these counters to confirm normalized arrays land as connectors roll out.
 - Expect `normalized_rules_missing` to trend toward zero as each connector flips on normalized output. Investigate any sustained counts by checking the corresponding module `TASKS.md`.

+## Implementation tips
+
+- When a connector only needs to populate `AffectedPackage.NormalizedVersions` (without reusing range primitives), call `SemVerRangeRuleBuilder.BuildNormalizedRules(rawRange, patchedVersion, note)` to project the normalized rule list directly. This avoids re-wrapping `SemVerRangeBuildResult` instances and keeps provenance notes consistent with the shared builder.

 ## How to use this dashboard

 1. Before opening a connector PR, update the module `TASKS.md` entry and drop a short bullet here (status + timestamp).
@@ -80,6 +80,7 @@ Here’s a quick, practical idea to make your version-range modeling cleaner and
 * **Emit items, not a monolith**: have the builder return `IEnumerable<NormalizedVersionRule>`.
 * **Normalize early**: resolve “aliases” (`1.2.x`, `^1.2.3`, distro styles) into canonical `(type,min,max,…)` before persistence.
 * **Traceability**: include `notes`/`sourceRef` on each rule so you can re-materialize provenance during audits.
+* **Lean projection helper**: when you only need normalized rules (and not the intermediate primitives), prefer `SemVerRangeRuleBuilder.BuildNormalizedRules(rawRange, patchedVersion, provenanceNote)` to skip manual projections.

 ### C# sketch
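A hedged sketch of the item-emitting builder described in the bullets above. `NormalizedVersionRule`'s fields and the shipped `SemVerRangeRuleBuilder` API may differ; the caret handling here covers only the simple `^major.minor.patch` case for illustration:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical canonical rule shape: (type, min, max, inclusivity, provenance note).
public sealed record NormalizedVersionRule(
    string Type,
    string? Min,
    bool MinInclusive,
    string? Max,
    bool MaxInclusive,
    string? Notes);

public static class SemVerRangeRuleBuilderSketch
{
    // Resolve aliases like "^1.2.3" into canonical bounds early, before persistence.
    public static IEnumerable<NormalizedVersionRule> BuildNormalizedRules(
        string rawRange, string? patchedVersion, string? note)
    {
        if (rawRange.StartsWith('^'))
        {
            var min = rawRange[1..];
            var major = int.Parse(min.Split('.')[0]);
            yield return new NormalizedVersionRule(
                "semver", min, MinInclusive: true,
                Max: $"{major + 1}.0.0", MaxInclusive: false, Notes: note);
        }
        else if (patchedVersion is not null)
        {
            // "fixed in X" style: affected strictly below the patched version.
            yield return new NormalizedVersionRule(
                "semver", null, false, patchedVersion, false, note);
        }
    }
}
```

Returning `IEnumerable<NormalizedVersionRule>` keeps each rule independently persistable and carries the provenance note through to audits.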
@@ -0,0 +1,137 @@
```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.Json;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services;
using Xunit;

namespace StellaOps.Cli.Tests.Services;

public sealed class AuthorityDiagnosticsReporterTests : IDisposable
{
    private readonly string _originalDirectory = Directory.GetCurrentDirectory();
    private readonly string _tempDirectory = Path.Combine(Path.GetTempPath(), $"stellaops-cli-tests-{Guid.NewGuid():N}");

    public AuthorityDiagnosticsReporterTests()
    {
        Directory.CreateDirectory(_tempDirectory);
        Directory.SetCurrentDirectory(_tempDirectory);
    }

    [Fact]
    public void Emit_LogsWarning_WhenPasswordPolicyWeakened()
    {
        WriteAuthorityConfiguration(minimumLength: 8);

        var (_, configuration) = CliBootstrapper.Build(Array.Empty<string>());
        var logger = new ListLogger();

        AuthorityDiagnosticsReporter.Emit(configuration, logger);

        var warning = Assert.Single(logger.Entries, entry => entry.Level == LogLevel.Warning);
        Assert.Contains("minimum length 8 < 12", warning.Message, StringComparison.OrdinalIgnoreCase);
        Assert.Contains("standard.yaml", warning.Message, StringComparison.OrdinalIgnoreCase);
    }

    [Fact]
    public void Emit_EmitsNoWarnings_WhenPasswordPolicyMeetsBaseline()
    {
        WriteAuthorityConfiguration(minimumLength: 12);

        var (_, configuration) = CliBootstrapper.Build(Array.Empty<string>());
        var logger = new ListLogger();

        AuthorityDiagnosticsReporter.Emit(configuration, logger);

        Assert.DoesNotContain(logger.Entries, entry => entry.Level >= LogLevel.Warning);
    }

    public void Dispose()
    {
        Directory.SetCurrentDirectory(_originalDirectory);
        try
        {
            if (Directory.Exists(_tempDirectory))
            {
                Directory.Delete(_tempDirectory, recursive: true);
            }
        }
        catch
        {
            // Ignored.
        }
    }

    private static void WriteAuthorityConfiguration(int minimumLength)
    {
        var payload = new
        {
            Authority = new
            {
                Plugins = new
                {
                    ConfigurationDirectory = "plugins",
                    Descriptors = new
                    {
                        standard = new
                        {
                            AssemblyName = "StellaOps.Authority.Plugin.Standard",
                            Enabled = true,
                            ConfigFile = "standard.yaml"
                        }
                    }
                }
            }
        };

        var json = JsonSerializer.Serialize(payload, new JsonSerializerOptions { WriteIndented = true });
        File.WriteAllText("appsettings.json", json);

        var pluginDirectory = Path.Combine(Directory.GetCurrentDirectory(), "plugins");
        Directory.CreateDirectory(pluginDirectory);

        var pluginConfig = $"""
            bootstrapUser:
              username: "admin"
              password: "changeme"

            passwordPolicy:
              minimumLength: {minimumLength}
              requireUppercase: true
              requireLowercase: true
              requireDigit: true
              requireSymbol: true
            """;

        File.WriteAllText(Path.Combine(pluginDirectory, "standard.yaml"), pluginConfig);
    }

    private sealed class ListLogger : ILogger
    {
        public readonly record struct LogEntry(LogLevel Level, string Message);

        public List<LogEntry> Entries { get; } = new();

        public IDisposable? BeginScope<TState>(TState state) where TState : notnull => NullScope.Instance;

        public bool IsEnabled(LogLevel logLevel) => true;

        public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception, Func<TState, Exception?, string> formatter)
        {
            var message = formatter(state, exception);
            Entries.Add(new LogEntry(logLevel, message));
        }
    }

    private sealed class NullScope : IDisposable
    {
        public static NullScope Instance { get; } = new();
```
|
||||
|
||||
public void Dispose()
|
||||
{
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -105,12 +105,15 @@ internal static class Program
        services.AddSingleton<IScannerExecutor, ScannerExecutor>();
        services.AddSingleton<IScannerInstaller, ScannerInstaller>();

        await using var serviceProvider = services.BuildServiceProvider();
        using var cts = new CancellationTokenSource();
        Console.CancelKeyPress += (_, eventArgs) =>
        {
            eventArgs.Cancel = true;
            cts.Cancel();
        await using var serviceProvider = services.BuildServiceProvider();
        var loggerFactory = serviceProvider.GetRequiredService<ILoggerFactory>();
        var startupLogger = loggerFactory.CreateLogger("StellaOps.Cli.Startup");
        AuthorityDiagnosticsReporter.Emit(configuration, startupLogger);
        using var cts = new CancellationTokenSource();
        Console.CancelKeyPress += (_, eventArgs) =>
        {
            eventArgs.Cancel = true;
            cts.Cancel();
        };

        var rootCommand = CommandFactory.Create(serviceProvider, options, cts.Token);
123
src/StellaOps.Cli/Services/AuthorityDiagnosticsReporter.cs
Normal file
@@ -0,0 +1,123 @@
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Logging;
using StellaOps.Authority.Plugins.Abstractions;
using StellaOps.Configuration;

namespace StellaOps.Cli.Services;

/// <summary>
/// Emits Authority configuration diagnostics discovered during CLI startup.
/// </summary>
internal static class AuthorityDiagnosticsReporter
{
    public static void Emit(IConfiguration configuration, ILogger logger)
    {
        ArgumentNullException.ThrowIfNull(configuration);
        ArgumentNullException.ThrowIfNull(logger);

        var basePath = Directory.GetCurrentDirectory();
        EmitInternal(configuration, logger, basePath);
    }

    internal static void Emit(IConfiguration configuration, ILogger logger, string basePath)
    {
        ArgumentNullException.ThrowIfNull(configuration);
        ArgumentNullException.ThrowIfNull(logger);
        ArgumentNullException.ThrowIfNull(basePath);

        EmitInternal(configuration, logger, basePath);
    }

    private static void EmitInternal(IConfiguration configuration, ILogger logger, string basePath)
    {
        if (string.IsNullOrWhiteSpace(basePath))
        {
            basePath = Directory.GetCurrentDirectory();
        }

        var authoritySection = configuration.GetSection("Authority");
        if (!authoritySection.Exists())
        {
            return;
        }

        var authorityOptions = new StellaOpsAuthorityOptions();
        authoritySection.Bind(authorityOptions);

        if (authorityOptions.Plugins.Descriptors.Count == 0)
        {
            return;
        }

        var resolvedBasePath = Path.GetFullPath(basePath);
        IReadOnlyList<AuthorityPluginContext> contexts;

        try
        {
            contexts = AuthorityPluginConfigurationLoader.Load(authorityOptions, resolvedBasePath);
        }
        catch (Exception ex)
        {
            logger.LogDebug(ex, "Failed to load Authority plug-in configuration for diagnostics.");
            return;
        }

        if (contexts.Count == 0)
        {
            return;
        }

        IReadOnlyList<AuthorityConfigurationDiagnostic> diagnostics;
        try
        {
            diagnostics = AuthorityPluginConfigurationAnalyzer.Analyze(contexts);
        }
        catch (Exception ex)
        {
            logger.LogDebug(ex, "Failed to analyze Authority plug-in configuration for diagnostics.");
            return;
        }

        if (diagnostics.Count == 0)
        {
            return;
        }

        var contextLookup = new Dictionary<string, AuthorityPluginContext>(StringComparer.OrdinalIgnoreCase);
        foreach (var context in contexts)
        {
            if (context?.Manifest?.Name is { Length: > 0 } name && !contextLookup.ContainsKey(name))
            {
                contextLookup[name] = context;
            }
        }

        foreach (var diagnostic in diagnostics)
        {
            var level = diagnostic.Severity switch
            {
                AuthorityConfigurationDiagnosticSeverity.Error => LogLevel.Error,
                AuthorityConfigurationDiagnosticSeverity.Warning => LogLevel.Warning,
                _ => LogLevel.Information
            };

            if (!logger.IsEnabled(level))
            {
                continue;
            }

            if (contextLookup.TryGetValue(diagnostic.PluginName, out var context) &&
                context?.Manifest?.ConfigPath is { Length: > 0 } configPath)
            {
                logger.Log(level, "{DiagnosticMessage} (config: {ConfigPath})", diagnostic.Message, configPath);
            }
            else
            {
                logger.Log(level, "{DiagnosticMessage}", diagnostic.Message);
            }
        }
    }
}
@@ -1,10 +1,11 @@
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|Bootstrap configuration fallback (env → appsettings{{.json/.yaml}})|DevEx/CLI|Core|**DONE** – CLI loads `API_KEY`/`STELLAOPS_BACKEND_URL` from environment or local settings, defaulting to empty strings when unset.|
|Introduce command host & routing skeleton|DevEx/CLI|Configuration|**DONE** – System.CommandLine (v2.0.0-beta5) router stitched with `scanner`, `scan`, `db`, and `config` verbs.|
|Scanner artifact download/install commands|Ops Integrator|Backend contracts|**DONE** – `scanner download` caches bundles, validates SHA-256 (plus optional RSA signature), installs via `docker load`, persists metadata, and retries with exponential backoff.|
|Scan execution & result upload workflow|Ops Integrator, QA|Scanner cmd|**DONE** – `scan run` drives container scans against directories, emits artefacts in `ResultsDirectory`, auto-uploads on success, and `scan upload` covers manual retries.|
If you are working on this file you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|Bootstrap configuration fallback (env → appsettings{{.json/.yaml}})|DevEx/CLI|Core|**DONE** – CLI loads `API_KEY`/`STELLAOPS_BACKEND_URL` from environment or local settings, defaulting to empty strings when unset.|
|Introduce command host & routing skeleton|DevEx/CLI|Configuration|**DONE** – System.CommandLine (v2.0.0-beta5) router stitched with `scanner`, `scan`, `db`, and `config` verbs.|
|Scanner artifact download/install commands|Ops Integrator|Backend contracts|**DONE** – `scanner download` caches bundles, validates SHA-256 (plus optional RSA signature), installs via `docker load`, persists metadata, and retries with exponential backoff.|
|Scan execution & result upload workflow|Ops Integrator, QA|Scanner cmd|**DONE** – `scan run` drives container scans against directories, emits artefacts in `ResultsDirectory`, auto-uploads on success, and `scan upload` covers manual retries.|
|Feedser DB operations passthrough|DevEx/CLI|Backend, Feedser APIs|**DONE** – `db fetch|merge|export` trigger `/jobs/*` endpoints with parameter binding and consistent exit codes.|
|CLI observability & tests|QA|Command host|**DONE** – Added console logging defaults & configuration bootstrap tests; future metrics hooks tracked separately.|
|Authority auth commands|DevEx/CLI|Auth libraries|**DONE** – `auth login/logout/status` wrap the shared auth client, manage token cache, and surface status messages.|
@@ -12,4 +13,7 @@
|Authority whoami command|DevEx/CLI|Authority auth commands|**DONE (2025-10-10)** – Added `auth whoami` verb that displays subject/audience/expiry from cached tokens and handles opaque tokens gracefully.|
|Expose auth client resilience settings|DevEx/CLI|Auth libraries LIB5|**DONE (2025-10-10)** – CLI options now bind resilience knobs, `AddStellaOpsAuthClient` honours them, and tests cover env overrides.|
|Document advanced Authority tuning|Docs/CLI|Expose auth client resilience settings|**DONE (2025-10-10)** – docs/09 and docs/10 describe retry/offline settings with env examples and point to the integration guide.|
|Surface password policy diagnostics in CLI output|DevEx/CLI, Security Guild|AUTHSEC-CRYPTO-02-004|**TODO** – Bubble analyzer warnings during CLI startup (plugin load) and add tests/docs guiding operators to remediate weakened policies.|
|Surface password policy diagnostics in CLI output|DevEx/CLI, Security Guild|AUTHSEC-CRYPTO-02-004|**DONE (2025-10-15)** – CLI startup runs the Authority plug-in analyzer, logs weakened password policy warnings with manifest paths, added unit tests (`dotnet test src/StellaOps.Cli.Tests`) and updated docs/09 with remediation guidance.|
|VEXER-CLI-01-001 – Add `vexer` command group|DevEx/CLI|VEXER-WEB-01-001|TODO – Introduce `vexer` verb hierarchy (init/pull/resume/list-providers/export/verify/reconcile) forwarding to WebService with token auth and consistent exit codes.|
|VEXER-CLI-01-002 – Export download & attestation UX|DevEx/CLI|VEXER-CLI-01-001, VEXER-EXPORT-01-001|TODO – Display export metadata (sha256, size, Rekor link), support optional artifact download path, and handle cache hits gracefully.|
|VEXER-CLI-01-003 – CLI docs & examples for Vexer|Docs/CLI|VEXER-CLI-01-001|TODO – Update docs/09_API_CLI_REFERENCE.md and quickstart snippets to cover Vexer verbs, offline guidance, and attestation verification workflow.|
@@ -144,9 +144,9 @@ public sealed class CanonicalMergerTests
            provenance: new[] { CreateProvenance("osv", ProvenanceFieldMasks.AffectedPackages) },
            normalizedVersions: Array.Empty<NormalizedVersionRule>());

        var ghsa = CreateAdvisory("ghsa", "GHSA-1234", "GHSA Title", null, BaseTimestamp.AddHours(1), packages: new[] { ghsaPackage });
        var nvd = CreateAdvisory("nvd", "CVE-2025-1111", "NVD Title", null, BaseTimestamp.AddHours(2), packages: new[] { nvdPackage });
        var osv = CreateAdvisory("osv", "OSV-2025-xyz", "OSV Title", null, BaseTimestamp.AddHours(3), packages: new[] { osvPackage });
        var ghsa = CreateAdvisory("ghsa", "GHSA-1234", "GHSA Title", modified: BaseTimestamp.AddHours(1), packages: new[] { ghsaPackage });
        var nvd = CreateAdvisory("nvd", "CVE-2025-1111", "NVD Title", modified: BaseTimestamp.AddHours(2), packages: new[] { nvdPackage });
        var osv = CreateAdvisory("osv", "OSV-2025-xyz", "OSV Title", modified: BaseTimestamp.AddHours(3), packages: new[] { osvPackage });

        var result = merger.Merge("CVE-2025-1111", ghsa, nvd, osv);
@@ -181,6 +181,119 @@ public sealed class CanonicalMergerTests
        Assert.Equal("critical", result.Advisory.Severity);
        Assert.Contains(result.Advisory.CvssMetrics, metric => metric.Vector == "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H");
        Assert.Contains(result.Advisory.CvssMetrics, metric => metric.Vector == "CVSS:3.0/AV:L/AC:L/PR:L/UI:R/S:U/C:H/I:H/A:H");
        Assert.Equal("3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H", result.Advisory.CanonicalMetricId);
    }

    [Fact]
    public void Merge_ReferencesNormalizedAndFreshnessOverrides()
    {
        var merger = new CanonicalMerger(new FixedTimeProvider(BaseTimestamp.AddHours(80)));

        var ghsa = CreateAdvisory(
            source: "ghsa",
            advisoryKey: "GHSA-ref",
            title: "GHSA Title",
            references: new[]
            {
                new AdvisoryReference(
                    "http://Example.COM/path/resource?b=2&a=1#section",
                    kind: "advisory",
                    sourceTag: null,
                    summary: null,
                    CreateProvenance("ghsa", ProvenanceFieldMasks.References))
            },
            modified: BaseTimestamp);

        var osv = CreateAdvisory(
            source: "osv",
            advisoryKey: "OSV-ref",
            title: "OSV Title",
            references: new[]
            {
                new AdvisoryReference(
                    "https://example.com/path/resource?a=1&b=2",
                    kind: "advisory",
                    sourceTag: null,
                    summary: null,
                    CreateProvenance("osv", ProvenanceFieldMasks.References))
            },
            modified: BaseTimestamp.AddHours(80));

        var result = merger.Merge("CVE-REF-2025-01", ghsa, null, osv);

        var reference = Assert.Single(result.Advisory.References);
        Assert.Equal("https://example.com/path/resource?a=1&b=2", reference.Url);

        var unionDecision = Assert.Single(result.Decisions.Where(decision => decision.Field == "references"));
        Assert.Null(unionDecision.SelectedSource);
        Assert.Equal("union", unionDecision.DecisionReason);

        var itemDecision = Assert.Single(result.Decisions.Where(decision => decision.Field.StartsWith("references[", StringComparison.OrdinalIgnoreCase)));
        Assert.Equal("osv", itemDecision.SelectedSource);
        Assert.Equal("freshness_override", itemDecision.DecisionReason);
        Assert.Contains("https://example.com/path/resource?a=1&b=2", itemDecision.Field, StringComparison.OrdinalIgnoreCase);
    }

    [Fact]
    public void Merge_DescriptionFreshnessOverride()
    {
        var merger = new CanonicalMerger(new FixedTimeProvider(BaseTimestamp.AddHours(12)));

        var ghsa = CreateAdvisory(
            source: "ghsa",
            advisoryKey: "GHSA-desc",
            title: "GHSA Title",
            summary: "Summary",
            description: "Initial GHSA description",
            modified: BaseTimestamp.AddHours(1));

        var nvd = CreateAdvisory(
            source: "nvd",
            advisoryKey: "CVE-2025-5555",
            title: "NVD Title",
            summary: "Summary",
            description: "NVD baseline description",
            modified: BaseTimestamp.AddHours(2));

        var osv = CreateAdvisory(
            source: "osv",
            advisoryKey: "OSV-2025-desc",
            title: "OSV Title",
            summary: "Summary",
            description: "OSV fresher description",
            modified: BaseTimestamp.AddHours(72));

        var result = merger.Merge("CVE-2025-5555", ghsa, nvd, osv);

        Assert.Equal("OSV fresher description", result.Advisory.Description);
        Assert.Contains(result.Decisions, decision =>
            decision.Field == "description" &&
            string.Equals(decision.SelectedSource, "osv", StringComparison.OrdinalIgnoreCase) &&
            string.Equals(decision.DecisionReason, "freshness_override", StringComparison.OrdinalIgnoreCase));
    }

    [Fact]
    public void Merge_CwesPreferNvdPrecedence()
    {
        var merger = new CanonicalMerger(new FixedTimeProvider(BaseTimestamp.AddHours(6)));

        var ghsaWeakness = CreateWeakness("ghsa", "CWE-79", "cross-site scripting", BaseTimestamp.AddHours(1));
        var nvdWeakness = CreateWeakness("nvd", "CWE-79", "Cross-Site Scripting", BaseTimestamp.AddHours(2));
        var osvWeakness = CreateWeakness("osv", "CWE-79", "XSS", BaseTimestamp.AddHours(3));

        var ghsa = CreateAdvisory("ghsa", "GHSA-weakness", "GHSA Title", weaknesses: new[] { ghsaWeakness }, modified: BaseTimestamp.AddHours(1));
        var nvd = CreateAdvisory("nvd", "CVE-2025-7777", "NVD Title", weaknesses: new[] { nvdWeakness }, modified: BaseTimestamp.AddHours(2));
        var osv = CreateAdvisory("osv", "OSV-weakness", "OSV Title", weaknesses: new[] { osvWeakness }, modified: BaseTimestamp.AddHours(3));

        var result = merger.Merge("CVE-2025-7777", ghsa, nvd, osv);

        var weakness = Assert.Single(result.Advisory.Cwes);
        Assert.Equal("CWE-79", weakness.Identifier);
        Assert.Equal("Cross-Site Scripting", weakness.Name);
        Assert.Contains(result.Decisions, decision =>
            decision.Field == "cwes[cwe|CWE-79]" &&
            string.Equals(decision.SelectedSource, "nvd", StringComparison.OrdinalIgnoreCase) &&
            string.Equals(decision.DecisionReason, "precedence", StringComparison.OrdinalIgnoreCase));
    }

    private static Advisory CreateAdvisory(
@@ -188,10 +301,14 @@ public sealed class CanonicalMergerTests
        string advisoryKey,
        string title,
        string? summary = null,
        string? description = null,
        DateTimeOffset? modified = null,
        string? severity = null,
        IEnumerable<AffectedPackage>? packages = null,
        IEnumerable<CvssMetric>? metrics = null)
        IEnumerable<CvssMetric>? metrics = null,
        IEnumerable<AdvisoryReference>? references = null,
        IEnumerable<AdvisoryWeakness>? weaknesses = null,
        string? canonicalMetricId = null)
    {
        var provenance = new AdvisoryProvenance(
            source,
@@ -211,10 +328,13 @@ public sealed class CanonicalMergerTests
            exploitKnown: false,
            aliases: new[] { advisoryKey },
            credits: Array.Empty<AdvisoryCredit>(),
            references: Array.Empty<AdvisoryReference>(),
            references: references ?? Array.Empty<AdvisoryReference>(),
            affectedPackages: packages ?? Array.Empty<AffectedPackage>(),
            cvssMetrics: metrics ?? Array.Empty<CvssMetric>(),
            provenance: new[] { provenance });
            provenance: new[] { provenance },
            description: description,
            cwes: weaknesses ?? Array.Empty<AdvisoryWeakness>(),
            canonicalMetricId: canonicalMetricId);
    }

    private static AdvisoryProvenance CreateProvenance(string source, string fieldMask)
@@ -225,6 +345,18 @@ public sealed class CanonicalMergerTests
            recordedAt: BaseTimestamp,
            fieldMask: new[] { fieldMask });

    private static AdvisoryWeakness CreateWeakness(string source, string identifier, string? name, DateTimeOffset recordedAt)
    {
        var provenance = new AdvisoryProvenance(
            source,
            kind: "map",
            value: identifier,
            recordedAt: recordedAt,
            fieldMask: new[] { ProvenanceFieldMasks.Weaknesses });

        return new AdvisoryWeakness("cwe", identifier, name, uri: null, provenance: new[] { provenance });
    }

    private sealed class FixedTimeProvider : TimeProvider
    {
        private readonly DateTimeOffset _utcNow;
@@ -19,18 +19,21 @@ public sealed class CanonicalMerger
    {
        ["title"] = new[] { GhsaSource, NvdSource, OsvSource },
        ["summary"] = new[] { GhsaSource, NvdSource, OsvSource },
        ["description"] = new[] { GhsaSource, NvdSource, OsvSource },
        ["language"] = new[] { GhsaSource, NvdSource, OsvSource },
        ["severity"] = new[] { NvdSource, GhsaSource, OsvSource },
        ["references"] = new[] { GhsaSource, NvdSource, OsvSource },
        ["credits"] = new[] { GhsaSource, OsvSource, NvdSource },
        ["affectedPackages"] = new[] { OsvSource, GhsaSource, NvdSource },
        ["cvssMetrics"] = new[] { NvdSource, GhsaSource, OsvSource },
        ["cwes"] = new[] { NvdSource, GhsaSource, OsvSource },
    }.ToImmutableDictionary(StringComparer.OrdinalIgnoreCase);

    private static readonly ImmutableHashSet<string> FreshnessSensitiveFields = ImmutableHashSet.Create(
        StringComparer.OrdinalIgnoreCase,
        "title",
        "summary",
        "description",
        "references",
        "credits",
        "affectedPackages");
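The per-field precedence arrays above imply a rank lookup in the style of the `GetRank(source, precedence)` calls seen later in this diff: lower index wins, and a source missing from a field's list loses every tie. A minimal Python sketch of that rule (the helper name and the unknown-source handling are assumptions, not the shipped C# code):

```python
def get_rank(source: str, precedence: list[str]) -> int:
    """Lower rank wins; sources not listed for a field sort last (assumed)."""
    try:
        return precedence.index(source)
    except ValueError:
        return len(precedence)

# Field-specific precedence, mirroring the "severity" row of the table above.
severity_precedence = ["nvd", "ghsa", "osv"]
assert get_rank("nvd", severity_precedence) == 0
assert get_rank("osv", severity_precedence) == 2
assert get_rank("vendor-x", severity_precedence) == 3  # unknown source loses every tie
```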
@@ -87,6 +90,13 @@ public sealed class CanonicalMerger
            AddMergeProvenance(provenanceSet, summarySelection, now, ProvenanceFieldMasks.Advisory);
        }

        var descriptionSelection = SelectStringField("description", candidates, advisory => advisory.Description, isFreshnessSensitive: true);
        if (descriptionSelection.HasValue)
        {
            decisions.Add(descriptionSelection.Decision);
            AddMergeProvenance(provenanceSet, descriptionSelection, now, ProvenanceFieldMasks.Advisory);
        }

        var languageSelection = SelectStringField("language", candidates, advisory => advisory.Language, isFreshnessSensitive: false);
        if (languageSelection.HasValue)
        {
@@ -103,15 +113,24 @@ public sealed class CanonicalMerger

        var aliases = MergeAliases(candidates);
        var creditsResult = MergeCredits(candidates);
        if (creditsResult.Decision is not null)
        if (creditsResult.UnionDecision is not null)
        {
            decisions.Add(creditsResult.Decision);
            decisions.Add(creditsResult.UnionDecision);
        }
        decisions.AddRange(creditsResult.Decisions);

        var referencesResult = MergeReferences(candidates);
        if (referencesResult.Decision is not null)
        if (referencesResult.UnionDecision is not null)
        {
            decisions.Add(referencesResult.Decision);
            decisions.Add(referencesResult.UnionDecision);
        }
        decisions.AddRange(referencesResult.Decisions);

        var weaknessesResult = MergeWeaknesses(candidates, now);
        decisions.AddRange(weaknessesResult.Decisions);
        foreach (var weaknessProvenance in weaknessesResult.AdditionalProvenance)
        {
            provenanceSet.Add(weaknessProvenance);
        }

        var packagesResult = MergePackages(candidates, now);
@@ -143,8 +162,10 @@ public sealed class CanonicalMerger

        var title = titleSelection.Value ?? ghsa?.Title ?? nvd?.Title ?? osv?.Title ?? advisoryKey;
        var summary = summarySelection.Value ?? ghsa?.Summary ?? nvd?.Summary ?? osv?.Summary;
        var description = descriptionSelection.Value ?? ghsa?.Description ?? nvd?.Description ?? osv?.Description;
        var language = languageSelection.Value ?? ghsa?.Language ?? nvd?.Language ?? osv?.Language;
        var severity = topLevelSeveritySelection.Value ?? metricsResult.CanonicalSeverity ?? ghsa?.Severity ?? nvd?.Severity ?? osv?.Severity;
        var canonicalMetricId = metricsResult.CanonicalMetricId ?? ghsa?.CanonicalMetricId ?? nvd?.CanonicalMetricId ?? osv?.CanonicalMetricId;

        if (string.IsNullOrWhiteSpace(title))
        {
@@ -171,7 +192,10 @@ public sealed class CanonicalMerger
            referencesResult.References,
            packagesResult.Packages,
            metricsResult.Metrics,
            provenance);
            provenance,
            description,
            weaknessesResult.Weaknesses,
            canonicalMetricId);

        return new CanonicalMergeResult(
            advisory,
@@ -201,8 +225,10 @@ public sealed class CanonicalMerger
    private CreditsMergeResult MergeCredits(List<AdvisorySnapshot> candidates)
    {
        var precedence = GetPrecedence("credits");
        var isFreshnessSensitive = FreshnessSensitiveFields.Contains("credits");
        var map = new Dictionary<string, CreditSelection>(StringComparer.OrdinalIgnoreCase);
        var considered = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var decisions = new List<FieldDecision>();

        foreach (var candidate in candidates)
        {
@@ -219,12 +245,25 @@ public sealed class CanonicalMerger

                var candidateRank = GetRank(candidate.Source, precedence);
                var existingRank = GetRank(existing.Source, precedence);

                if (candidateRank < existingRank ||
                    (candidateRank == existingRank && candidate.Modified > existing.Modified))
                var reason = EvaluateReplacementReason(candidateRank, existingRank, candidate.Modified, existing.Modified, isFreshnessSensitive);
                if (reason is null)
                {
                    map[key] = new CreditSelection(credit, candidate.Source, candidate.Modified);
                    continue;
                }

                var consideredSources = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
                {
                    existing.Source,
                    candidate.Source,
                };

                map[key] = new CreditSelection(credit, candidate.Source, candidate.Modified);
                decisions.Add(new FieldDecision(
                    Field: $"credits[{key}]",
                    SelectedSource: candidate.Source,
                    DecisionReason: reason,
                    SelectedModified: candidate.Modified,
                    ConsideredSources: consideredSources.OrderBy(static value => value, StringComparer.OrdinalIgnoreCase).ToImmutableArray()));
            }
        }
@@ -241,14 +280,16 @@ public sealed class CanonicalMerger
                ConsideredSources: considered.OrderBy(static value => value, StringComparer.OrdinalIgnoreCase).ToImmutableArray());
        }

        return new CreditsMergeResult(credits, decision);
        return new CreditsMergeResult(credits, decision, decisions);
    }

    private ReferencesMergeResult MergeReferences(List<AdvisorySnapshot> candidates)
    {
        var precedence = GetPrecedence("references");
        var isFreshnessSensitive = FreshnessSensitiveFields.Contains("references");
        var map = new Dictionary<string, ReferenceSelection>(StringComparer.OrdinalIgnoreCase);
        var considered = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var decisions = new List<FieldDecision>();

        foreach (var candidate in candidates)
        {
@@ -259,7 +300,7 @@ public sealed class CanonicalMerger
                    continue;
                }

                var key = reference.Url.Trim();
                var key = NormalizeReferenceKey(reference.Url);
                considered.Add(candidate.Source);

                if (!map.TryGetValue(key, out var existing))
@@ -268,14 +309,27 @@ public sealed class CanonicalMerger
                    continue;
                }

                var candidateRank = GetRank(candidate.Source, precedence);
                var existingRank = GetRank(existing.Source, precedence);

                if (candidateRank < existingRank ||
                    (candidateRank == existingRank && candidate.Modified > existing.Modified))
                var candidateRank = GetRank(candidate.Source, precedence);
                var existingRank = GetRank(existing.Source, precedence);
                var reason = EvaluateReplacementReason(candidateRank, existingRank, candidate.Modified, existing.Modified, isFreshnessSensitive);
                if (reason is null)
                {
                    map[key] = new ReferenceSelection(reference, candidate.Source, candidate.Modified);
                    continue;
                }

                var consideredSources = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
                {
                    existing.Source,
                    candidate.Source,
                };

                map[key] = new ReferenceSelection(reference, candidate.Source, candidate.Modified);
                decisions.Add(new FieldDecision(
                    Field: $"references[{key}]",
                    SelectedSource: candidate.Source,
                    DecisionReason: reason,
                    SelectedModified: candidate.Modified,
                    ConsideredSources: consideredSources.OrderBy(static value => value, StringComparer.OrdinalIgnoreCase).ToImmutableArray()));
            }
        }
@@ -292,12 +346,13 @@ public sealed class CanonicalMerger
                ConsideredSources: considered.OrderBy(static value => value, StringComparer.OrdinalIgnoreCase).ToImmutableArray());
        }

        return new ReferencesMergeResult(references, decision);
        return new ReferencesMergeResult(references, decision, decisions);
    }

    private PackagesMergeResult MergePackages(List<AdvisorySnapshot> candidates, DateTimeOffset now)
    {
        var precedence = GetPrecedence("affectedPackages");
        var isFreshnessSensitive = FreshnessSensitiveFields.Contains("affectedPackages");
        var map = new Dictionary<string, PackageSelection>(StringComparer.OrdinalIgnoreCase);
        var decisions = new List<FieldDecision>();
        var additionalProvenance = new List<AdvisoryProvenance>();
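The `NormalizeReferenceKey` change above is what lets the `Merge_ReferencesNormalizedAndFreshnessOverrides` test treat `http://Example.COM/path/resource?b=2&a=1#section` and `https://example.com/path/resource?a=1&b=2` as one reference. A Python sketch of a normalization consistent with that test (case-folding scheme and host, sorting query parameters, dropping the fragment, and collapsing http/https into one key are inferences from the test, not the shipped C# code):

```python
from urllib.parse import urlsplit, parse_qsl, urlencode

def normalize_reference_key(url: str) -> str:
    """Build a dedup key for advisory reference URLs (sketch, assumptions noted above)."""
    parts = urlsplit(url.strip())
    # Collapse http/https into one scheme so the two test URLs share a key (assumption).
    scheme = "https" if parts.scheme.lower() in ("http", "https") else parts.scheme.lower()
    host = parts.netloc.lower()
    # Sort query parameters so ?b=2&a=1 and ?a=1&b=2 compare equal; fragment is dropped.
    query = urlencode(sorted(parse_qsl(parts.query)))
    return f"{scheme}://{host}{parts.path}?{query}" if query else f"{scheme}://{host}{parts.path}"

a = normalize_reference_key("http://Example.COM/path/resource?b=2&a=1#section")
b = normalize_reference_key("https://example.com/path/resource?a=1&b=2")
assert a == b  # both collapse to the same key
```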
@@ -328,27 +383,8 @@ public sealed class CanonicalMerger

            var candidateRank = GetRank(candidate.Source, precedence);
            var existingRank = GetRank(existing.Source, precedence);
            var freshness = candidate.Modified - existing.Modified;
            var reason = string.Empty;
            var shouldReplace = false;

            if (candidateRank < existingRank)
            {
                shouldReplace = true;
                reason = "precedence";
            }
            else if (candidateRank > existingRank && freshness >= _freshnessThreshold)
            {
                shouldReplace = true;
                reason = "freshness_override";
            }
            else if (candidateRank == existingRank && candidate.Modified > existing.Modified)
            {
                shouldReplace = true;
                reason = "tie_breaker";
            }

            if (!shouldReplace)
            var reason = EvaluateReplacementReason(candidateRank, existingRank, candidate.Modified, existing.Modified, isFreshnessSensitive);
            if (reason is null)
            {
                continue;
            }
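The hunk above folds the inline precedence / freshness / tie-breaker chain into the shared `EvaluateReplacementReason` helper. A Python sketch of what that consolidated rule appears to compute, returning the decision reason or `None` when the incumbent should be kept (the 48-hour threshold is an assumed stand-in for the configured `_freshnessThreshold`, and gating `freshness_override` on the freshness-sensitivity flag is inferred from the new parameter, not confirmed by this diff):

```python
from datetime import datetime, timedelta, timezone

FRESHNESS_THRESHOLD = timedelta(hours=48)  # assumption; _freshnessThreshold is configured elsewhere

def evaluate_replacement_reason(candidate_rank, existing_rank,
                                candidate_modified, existing_modified,
                                is_freshness_sensitive):
    """Return the replacement reason, or None to keep the existing selection."""
    if candidate_rank < existing_rank:
        return "precedence"
    if (is_freshness_sensitive
            and candidate_rank > existing_rank
            and candidate_modified - existing_modified >= FRESHNESS_THRESHOLD):
        return "freshness_override"
    if candidate_rank == existing_rank and candidate_modified > existing_modified:
        return "tie_breaker"
    return None

base = datetime(2025, 1, 1, tzinfo=timezone.utc)
assert evaluate_replacement_reason(0, 1, base, base, True) == "precedence"
assert evaluate_replacement_reason(2, 0, base + timedelta(hours=72), base, True) == "freshness_override"
assert evaluate_replacement_reason(1, 1, base + timedelta(hours=1), base, True) == "tie_breaker"
assert evaluate_replacement_reason(2, 0, base + timedelta(hours=1), base, True) is None
```

The 72-hour case matches the test fixtures above, where an OSV record modified 70+ hours after the GHSA record wins `freshness_override` despite a lower precedence rank.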
@@ -370,6 +406,85 @@ public sealed class CanonicalMerger
        return new PackagesMergeResult(packages, decisions, additionalProvenance);
    }

    private WeaknessMergeResult MergeWeaknesses(List<AdvisorySnapshot> candidates, DateTimeOffset now)
    {
        var precedence = GetPrecedence("cwes");
        var map = new Dictionary<string, WeaknessSelection>(StringComparer.OrdinalIgnoreCase);
        var decisions = new List<FieldDecision>();
        var additionalProvenance = new List<AdvisoryProvenance>();

        foreach (var candidate in candidates)
        {
            var candidateWeaknesses = candidate.Advisory.Cwes.IsDefaultOrEmpty
                ? ImmutableArray<AdvisoryWeakness>.Empty
                : candidate.Advisory.Cwes;

            foreach (var weakness in candidateWeaknesses)
            {
                var key = $"{weakness.Taxonomy}|{weakness.Identifier}";
                var consideredSources = new HashSet<string>(StringComparer.OrdinalIgnoreCase) { candidate.Source };

                if (!map.TryGetValue(key, out var existing))
                {
                    var enriched = AppendWeaknessProvenance(weakness, candidate.Source, "precedence", now);
                    map[key] = new WeaknessSelection(enriched.Weakness, candidate.Source, candidate.Modified);
                    additionalProvenance.Add(enriched.MergeProvenance);

                    decisions.Add(new FieldDecision(
                        Field: $"cwes[{key}]",
                        SelectedSource: candidate.Source,
                        DecisionReason: "precedence",
                        SelectedModified: candidate.Modified,
                        ConsideredSources: consideredSources.ToImmutableArray()));
                    continue;
                }

                consideredSources.Add(existing.Source);

                var candidateRank = GetRank(candidate.Source, precedence);
                var existingRank = GetRank(existing.Source, precedence);
                var decisionReason = string.Empty;
                var shouldReplace = false;

                if (candidateRank < existingRank)
                {
                    shouldReplace = true;
                    decisionReason = "precedence";
                }
                else if (candidateRank == existingRank && candidate.Modified > existing.Modified)
                {
                    shouldReplace = true;
                    decisionReason = "tie_breaker";
                }

                if (!shouldReplace)
                {
                    continue;
                }

                var enrichedWeakness = AppendWeaknessProvenance(weakness, candidate.Source, decisionReason, now);
                map[key] = new WeaknessSelection(enrichedWeakness.Weakness, candidate.Source, candidate.Modified);
                additionalProvenance.Add(enrichedWeakness.MergeProvenance);

                decisions.Add(new FieldDecision(
                    Field: $"cwes[{key}]",
                    SelectedSource: candidate.Source,
                    DecisionReason: decisionReason,
                    SelectedModified: candidate.Modified,
                    ConsideredSources: consideredSources.ToImmutableArray()));
            }
        }

        var mergedWeaknesses = map.Values
            .Select(static value => value.Weakness)
            .OrderBy(static value => value.Taxonomy, StringComparer.Ordinal)
            .ThenBy(static value => value.Identifier, StringComparer.Ordinal)
            .ThenBy(static value => value.Name, StringComparer.Ordinal)
            .ToImmutableArray();

        return new WeaknessMergeResult(mergedWeaknesses, decisions, additionalProvenance);
    }
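The selection rule in `MergeWeaknesses` can be summarised: weaknesses are deduplicated on a case-insensitive `taxonomy|identifier` key, the first source to mention a weakness wins by default, a lower precedence rank replaces it, and equal ranks fall back to the newer `Modified` timestamp. The following is an illustrative Python sketch of that logic (not the shipped C#; the tuple shapes and `rank` map are assumptions for the example):

```python
def merge_weaknesses(candidates, rank):
    """candidates: iterable of (source, modified, [(taxonomy, identifier), ...]);
    rank: source -> precedence rank (lower wins)."""
    selected = {}  # "taxonomy|identifier" -> (weakness, source, modified)
    for source, modified, weaknesses in candidates:
        for weakness in weaknesses:
            key = f"{weakness[0]}|{weakness[1]}".lower()  # case-insensitive key
            existing = selected.get(key)
            if existing is None:
                selected[key] = (weakness, source, modified)  # first sighting
                continue
            _, ex_source, ex_modified = existing
            if rank[source] < rank[ex_source]:
                selected[key] = (weakness, source, modified)  # "precedence"
            elif rank[source] == rank[ex_source] and modified > ex_modified:
                selected[key] = (weakness, source, modified)  # "tie_breaker"
    # deterministic output order: sorted by key (taxonomy, then identifier)
    return dict(sorted(selected.items()))
```

Running it with GHSA ranked above NVD shows the precedence replacement: an NVD-sourced CWE-79 is overwritten once the higher-precedence GHSA candidate arrives.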

    private CvssMergeResult MergeCvssMetrics(List<AdvisorySnapshot> candidates)
    {
        var precedence = GetPrecedence("cvssMetrics");
@@ -408,20 +523,33 @@ public sealed class CanonicalMerger
            .ToImmutableArray();

        FieldDecision? decision = null;
        string? canonicalMetricId = null;
        string? canonicalSelectedSource = null;
        DateTimeOffset? canonicalSelectedModified = null;

        var canonical = orderedMetrics.FirstOrDefault();
        if (canonical is not null)
        {
            canonicalMetricId = $"{canonical.Version}|{canonical.Vector}";
            if (map.TryGetValue(canonicalMetricId, out var selection))
            {
                canonicalSelectedSource = selection.Source;
                canonicalSelectedModified = selection.Modified;
            }
        }

        if (considered.Count > 0)
        {
            var canonical = orderedMetrics.FirstOrDefault();
            decision = new FieldDecision(
                Field: "cvssMetrics",
                SelectedSource: canonical is null ? null : map[$"{canonical.Version}|{canonical.Vector}"].Source,
                SelectedSource: canonicalSelectedSource,
                DecisionReason: "precedence",
                SelectedModified: canonical is null ? null : map[$"{canonical.Version}|{canonical.Vector}"].Modified,
                SelectedModified: canonicalSelectedModified,
                ConsideredSources: considered.OrderBy(static value => value, StringComparer.OrdinalIgnoreCase).ToImmutableArray());
        }

        var severity = orderedMetrics.FirstOrDefault()?.BaseSeverity;
        return new CvssMergeResult(orderedMetrics, severity, decision);
        var severity = canonical?.BaseSeverity;
        return new CvssMergeResult(orderedMetrics, severity, canonicalMetricId, decision);
    }
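The canonical-metric bookkeeping above reduces to a small rule: the first metric in precedence order becomes canonical, its `version|vector` string is recorded as the canonical metric id, and severity is taken from that same metric. A hedged Python sketch of that selection (tuple layout is an assumption for illustration):

```python
def pick_canonical(ordered_metrics):
    """ordered_metrics: list of (version, vector, base_severity) tuples,
    already sorted by source precedence; empty input yields no selection."""
    if not ordered_metrics:
        return None, None
    version, vector, severity = ordered_metrics[0]  # first == canonical
    return f"{version}|{vector}", severity
```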

    private static string CreatePackageKey(AffectedPackage package)
@@ -456,6 +584,99 @@ public sealed class CanonicalMerger
        return (packageWithProvenance, provenance);
    }

    private static string NormalizeReferenceKey(string url)
    {
        var trimmed = url?.Trim();
        if (string.IsNullOrEmpty(trimmed))
        {
            return string.Empty;
        }

        if (!Uri.TryCreate(trimmed, UriKind.Absolute, out var uri))
        {
            return trimmed;
        }

        var builder = new StringBuilder();
        var scheme = uri.Scheme.Equals("http", StringComparison.OrdinalIgnoreCase) ? "https" : uri.Scheme.ToLowerInvariant();
        builder.Append(scheme).Append("://").Append(uri.Host.ToLowerInvariant());

        if (!uri.IsDefaultPort)
        {
            builder.Append(':').Append(uri.Port);
        }

        var path = uri.AbsolutePath;
        if (!string.IsNullOrEmpty(path) && path != "/")
        {
            if (!path.StartsWith('/'))
            {
                builder.Append('/');
            }

            builder.Append(path.TrimEnd('/'));
        }

        var query = uri.Query;
        if (!string.IsNullOrEmpty(query))
        {
            var parameters = query.TrimStart('?')
                .Split('&', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);
            Array.Sort(parameters, StringComparer.Ordinal);
            builder.Append('?').Append(string.Join('&', parameters));
        }

        return builder.ToString();
    }
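`NormalizeReferenceKey` makes reference URLs comparable as keys: `http` is upgraded to `https`, scheme and host are lowercased, the default port is dropped, the trailing slash is trimmed, and query parameters are sorted. An approximate Python rendering for experimentation (it simplifies the default-port check to 80/443 and is not the shipped C#):

```python
from urllib.parse import urlsplit

def normalize_reference_key(url):
    trimmed = (url or "").strip()
    if not trimmed:
        return ""
    parts = urlsplit(trimmed)
    if not parts.scheme or not parts.netloc:
        return trimmed  # non-absolute URLs pass through untouched
    # upgrade http -> https, lowercase the scheme and host
    scheme = "https" if parts.scheme.lower() == "http" else parts.scheme.lower()
    key = f"{scheme}://{parts.hostname.lower()}"
    if parts.port and parts.port not in (80, 443):  # approximation of IsDefaultPort
        key += f":{parts.port}"
    path = parts.path
    if path and path != "/":
        if not path.startswith("/"):
            path = "/" + path
        key += path.rstrip("/")
    if parts.query:
        params = sorted(p for p in parts.query.split("&") if p)  # stable param order
        key += "?" + "&".join(params)
    return key
```

With the fixture URL from the tests below, `http://Example.com/path/resource?b=2&a=1` normalizes to `https://example.com/path/resource?a=1&b=2`.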

    private string? EvaluateReplacementReason(int candidateRank, int existingRank, DateTimeOffset candidateModified, DateTimeOffset existingModified, bool isFreshnessSensitive)
    {
        if (candidateRank < existingRank)
        {
            return "precedence";
        }

        if (isFreshnessSensitive && candidateRank > existingRank && candidateModified - existingModified >= _freshnessThreshold)
        {
            return "freshness_override";
        }

        if (candidateRank == existingRank && candidateModified > existingModified)
        {
            return "tie_breaker";
        }

        return null;
    }
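A direct Python transcription of `EvaluateReplacementReason` is handy for reasoning about the three outcomes; ranks are integers (lower means higher precedence) and timestamps/threshold are plain numbers in the same unit (an assumption standing in for `DateTimeOffset`/`TimeSpan`):

```python
def evaluate_replacement_reason(candidate_rank, existing_rank,
                                candidate_modified, existing_modified,
                                freshness_sensitive, freshness_threshold):
    # higher precedence always wins
    if candidate_rank < existing_rank:
        return "precedence"
    # a lower-precedence source can still win if it is much fresher
    if (freshness_sensitive and candidate_rank > existing_rank
            and candidate_modified - existing_modified >= freshness_threshold):
        return "freshness_override"
    # equal precedence: newer timestamp wins
    if candidate_rank == existing_rank and candidate_modified > existing_modified:
        return "tie_breaker"
    return None  # keep the existing value
```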

    private static (AdvisoryWeakness Weakness, AdvisoryProvenance MergeProvenance) AppendWeaknessProvenance(
        AdvisoryWeakness weakness,
        string source,
        string decisionReason,
        DateTimeOffset recordedAt)
    {
        var provenance = new AdvisoryProvenance(
            source,
            kind: "merge",
            value: $"{weakness.Taxonomy}:{weakness.Identifier}",
            recordedAt: recordedAt,
            fieldMask: new[] { ProvenanceFieldMasks.Weaknesses },
            decisionReason: decisionReason);

        var provenanceList = weakness.Provenance.IsDefaultOrEmpty
            ? ImmutableArray.Create(provenance)
            : weakness.Provenance.Add(provenance);

        var weaknessWithProvenance = new AdvisoryWeakness(
            weakness.Taxonomy,
            weakness.Identifier,
            weakness.Name,
            weakness.Uri,
            provenanceList);

        return (weaknessWithProvenance, provenance);
    }

    private FieldSelection<string> SelectStringField(
        string field,
        List<AdvisorySnapshot> candidates,
@@ -653,17 +874,25 @@ public sealed class CanonicalMerger

    private readonly record struct MetricSelection(CvssMetric Metric, string Source, DateTimeOffset Modified);

    private readonly record struct CreditsMergeResult(ImmutableArray<AdvisoryCredit> Credits, FieldDecision? Decision);
    private readonly record struct WeaknessSelection(AdvisoryWeakness Weakness, string Source, DateTimeOffset Modified);

    private readonly record struct ReferencesMergeResult(ImmutableArray<AdvisoryReference> References, FieldDecision? Decision);
    private readonly record struct CreditsMergeResult(ImmutableArray<AdvisoryCredit> Credits, FieldDecision? UnionDecision, IReadOnlyList<FieldDecision> Decisions);

    private readonly record struct ReferencesMergeResult(ImmutableArray<AdvisoryReference> References, FieldDecision? UnionDecision, IReadOnlyList<FieldDecision> Decisions);

    private readonly record struct PackagesMergeResult(
        ImmutableArray<AffectedPackage> Packages,
        IReadOnlyList<FieldDecision> Decisions,
        IReadOnlyList<AdvisoryProvenance> AdditionalProvenance);

    private readonly record struct WeaknessMergeResult(
        ImmutableArray<AdvisoryWeakness> Weaknesses,
        IReadOnlyList<FieldDecision> Decisions,
        IReadOnlyList<AdvisoryProvenance> AdditionalProvenance);

    private readonly record struct CvssMergeResult(
        ImmutableArray<CvssMetric> Metrics,
        string? CanonicalSeverity,
        string? CanonicalMetricId,
        FieldDecision? Decision);
}
@@ -14,3 +14,5 @@
|Validate job trigger parameters for serialization|BE-Core|WebService|DONE – trigger parameters normalized/serialized with defensive checks returning InvalidParameters on failure. Full-suite `dotnet test --no-build` currently red from live connector fixture drift (Oracle/JVN/RedHat).|
|FEEDCORE-ENGINE-03-001 Canonical merger implementation|BE-Core|Merge|DONE – `CanonicalMerger` applies GHSA/NVD/OSV conflict rules with deterministic provenance and comprehensive unit coverage. **Coordination:** Connector leads must align mapper outputs with the canonical field expectations before 2025-10-18 so Merge can activate the path globally.|
|FEEDCORE-ENGINE-03-002 Field precedence and tie-breaker map|BE-Core|Merge|DONE – field precedence and freshness overrides enforced via `FieldPrecedence` map with tie-breakers and analytics capture. **Reminder:** Storage/Merge owners review precedence overrides when onboarding new feeds to ensure `decisionReason` tagging stays consistent.|
|Canonical merger parity for description/CWE/canonical metric|BE-Core|Models|DONE (2025-10-15) – merger now populates description/CWEs/canonical metric id with provenance and regression tests cover the new decisions.|
|Reference normalization & freshness instrumentation cleanup|BE-Core, QA|Models|DONE (2025-10-15) – reference keys normalized, freshness overrides applied to union fields, and new tests assert decision logging.|
@@ -88,21 +88,52 @@ public sealed class JsonFeedExporterTests : IDisposable
    [Fact]
    public async Task ExportAsync_WritesManifestMetadata()
    {
        var exportedAt = DateTimeOffset.Parse("2024-08-10T00:00:00Z", CultureInfo.InvariantCulture);
        var advisory = new Advisory(
            advisoryKey: "CVE-2024-4321",
            title: "Manifest Test",
            summary: null,
            language: "en",
            published: DateTimeOffset.Parse("2024-07-01T00:00:00Z", CultureInfo.InvariantCulture),
            modified: DateTimeOffset.Parse("2024-07-02T00:00:00Z", CultureInfo.InvariantCulture),
            severity: "medium",
            exploitKnown: false,
            aliases: new[] { "CVE-2024-4321" },
            references: Array.Empty<AdvisoryReference>(),
            affectedPackages: Array.Empty<AffectedPackage>(),
            cvssMetrics: Array.Empty<CvssMetric>(),
            provenance: Array.Empty<AdvisoryProvenance>());
        var exportedAt = DateTimeOffset.Parse("2024-08-10T00:00:00Z", CultureInfo.InvariantCulture);
        var recordedAt = DateTimeOffset.Parse("2024-07-02T00:00:00Z", CultureInfo.InvariantCulture);
        var reference = new AdvisoryReference(
            "http://Example.com/path/resource?b=2&a=1",
            kind: "advisory",
            sourceTag: "REF-001",
            summary: "Primary vendor advisory",
            provenance: new AdvisoryProvenance("ghsa", "map", "REF-001", recordedAt, new[] { ProvenanceFieldMasks.References }));
        var weakness = new AdvisoryWeakness(
            taxonomy: "cwe",
            identifier: "CWE-79",
            name: "Cross-site Scripting",
            uri: "https://cwe.mitre.org/data/definitions/79.html",
            provenance: new[]
            {
                new AdvisoryProvenance("nvd", "map", "CWE-79", recordedAt, new[] { ProvenanceFieldMasks.Weaknesses })
            });
        var cvssMetric = new CvssMetric(
            "3.1",
            "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H",
            9.8,
            "critical",
            new AdvisoryProvenance("nvd", "map", "CVE-2024-4321", recordedAt, new[] { ProvenanceFieldMasks.CvssMetrics }));

        var advisory = new Advisory(
            advisoryKey: "CVE-2024-4321",
            title: "Manifest Test",
            summary: "Short summary",
            language: "en",
            published: DateTimeOffset.Parse("2024-07-01T00:00:00Z", CultureInfo.InvariantCulture),
            modified: recordedAt,
            severity: "medium",
            exploitKnown: false,
            aliases: new[] { "CVE-2024-4321", "GHSA-xxxx-yyyy-zzzz" },
            credits: Array.Empty<AdvisoryCredit>(),
            references: new[] { reference },
            affectedPackages: Array.Empty<AffectedPackage>(),
            cvssMetrics: new[] { cvssMetric },
            provenance: new[]
            {
                new AdvisoryProvenance("ghsa", "map", "GHSA-xxxx-yyyy-zzzz", recordedAt, new[] { ProvenanceFieldMasks.Advisory }),
                new AdvisoryProvenance("nvd", "map", "CVE-2024-4321", recordedAt, new[] { ProvenanceFieldMasks.Advisory })
            },
            description: "Detailed description capturing remediation steps.",
            cwes: new[] { weakness },
            canonicalMetricId: "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H");

        var advisoryStore = new StubAdvisoryStore(advisory);
        var optionsValue = new JsonExportOptions
@@ -149,18 +180,33 @@ public sealed class JsonFeedExporterTests : IDisposable
            .OrderBy(file => file.Relative, StringComparer.Ordinal)
            .ToArray();

        var filesElement = root.GetProperty("files")
            .EnumerateArray()
            .Select(element => new
            {
                Path = element.GetProperty("path").GetString(),
                Bytes = element.GetProperty("bytes").GetInt64(),
                Digest = element.GetProperty("digest").GetString(),
            })
            .OrderBy(file => file.Path, StringComparer.Ordinal)
            .ToArray();

        var dataFile = Assert.Single(exportedFiles);
        using (var advisoryDocument = JsonDocument.Parse(await File.ReadAllBytesAsync(dataFile.Absolute, CancellationToken.None)))
        {
            var advisoryRoot = advisoryDocument.RootElement;
            Assert.Equal("Detailed description capturing remediation steps.", advisoryRoot.GetProperty("description").GetString());
            Assert.Equal("3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:C/C:H/I:H/A:H", advisoryRoot.GetProperty("canonicalMetricId").GetString());

            var referenceElement = advisoryRoot.GetProperty("references").EnumerateArray().Single();
            Assert.Equal(reference.Url, referenceElement.GetProperty("url").GetString(), StringComparer.OrdinalIgnoreCase);

            var weaknessElement = advisoryRoot.GetProperty("cwes").EnumerateArray().Single();
            Assert.Equal("cwe", weaknessElement.GetProperty("taxonomy").GetString());
            Assert.Equal("CWE-79", weaknessElement.GetProperty("identifier").GetString());
        }

        Assert.Equal(exportedFiles.Select(file => file.Relative).ToArray(), filesElement.Select(file => file.Path).ToArray());

        long totalBytes = exportedFiles.Select(file => new FileInfo(file.Absolute).Length).Sum();
        Assert.Equal(totalBytes, root.GetProperty("totalBytes").GetInt64());
@@ -7,5 +7,6 @@
|JsonExportJob wiring|BE-Export|Core|DONE – Job scheduler options now configurable via DI; JSON job registered with scheduler.|
|Snapshot tests for file tree|QA|Exporters|DONE – Added resolver/exporter tests asserting tree layout and deterministic behavior.|
|Parity smoke vs upstream vuln-list|QA|Exporters|DONE – `JsonExporterParitySmokeTests` covers common ecosystems against vuln-list layout.|
|Stream advisories during export|BE-Export|Storage.Mongo|DONE – exporter + streaming-only test ensures single enumeration and per-file digest capture.|
|Emit export manifest with digest metadata|BE-Export|Exporters|DONE – manifest now includes per-file digests/sizes alongside tree digest.|
|Surface new advisory fields (description/CWEs/canonical metric)|BE-Export|Models, Core|DONE (2025-10-15) – JSON exporter validated with new fixtures ensuring description/CWEs/canonical metric are preserved in outputs; `dotnet test src/StellaOps.Feedser.Exporter.Json.Tests` run 2025-10-15 for regression coverage.|
@@ -663,20 +663,45 @@ public sealed class TrivyDbFeedExporterTests : IDisposable
        string advisoryKey = "CVE-2024-9999",
        string title = "Trivy Export Test")
    {
        var published = DateTimeOffset.Parse("2024-08-01T00:00:00Z", CultureInfo.InvariantCulture);
        var modified = DateTimeOffset.Parse("2024-08-02T00:00:00Z", CultureInfo.InvariantCulture);
        var reference = new AdvisoryReference(
            "https://example.org/advisories/CVE-2024-9999",
            kind: "advisory",
            sourceTag: "EXAMPLE",
            summary: null,
            provenance: new AdvisoryProvenance("ghsa", "map", "CVE-2024-9999", modified, new[] { ProvenanceFieldMasks.References }));
        var weakness = new AdvisoryWeakness(
            taxonomy: "cwe",
            identifier: "CWE-89",
            name: "SQL Injection",
            uri: "https://cwe.mitre.org/data/definitions/89.html",
            provenance: new[] { new AdvisoryProvenance("nvd", "map", "CWE-89", modified, new[] { ProvenanceFieldMasks.Weaknesses }) });
        var cvssMetric = new CvssMetric(
            "3.1",
            "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
            9.8,
            "critical",
            new AdvisoryProvenance("nvd", "map", "CVE-2024-9999", modified, new[] { ProvenanceFieldMasks.CvssMetrics }));

        return new Advisory(
            advisoryKey: advisoryKey,
            title: title,
            summary: null,
            summary: "Trivy export fixture",
            language: "en",
            published: DateTimeOffset.Parse("2024-08-01T00:00:00Z", CultureInfo.InvariantCulture),
            modified: DateTimeOffset.Parse("2024-08-02T00:00:00Z", CultureInfo.InvariantCulture),
            published: published,
            modified: modified,
            severity: "medium",
            exploitKnown: false,
            aliases: new[] { "CVE-2024-9999" },
            references: Array.Empty<AdvisoryReference>(),
            credits: Array.Empty<AdvisoryCredit>(),
            references: new[] { reference },
            affectedPackages: Array.Empty<AffectedPackage>(),
            cvssMetrics: Array.Empty<CvssMetric>(),
            provenance: Array.Empty<AdvisoryProvenance>());
            cvssMetrics: new[] { cvssMetric },
            provenance: new[] { new AdvisoryProvenance("nvd", "map", "CVE-2024-9999", modified, new[] { ProvenanceFieldMasks.Advisory }) },
            description: "Detailed description for Trivy exporter testing.",
            cwes: new[] { weakness },
            canonicalMetricId: "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H");
    }

    public void Dispose()
@@ -11,3 +11,4 @@
|ExportState persistence & idempotence|BE-Export|Storage.Mongo|DONE – baseline resets wired into `ExportStateManager`, planner signals resets after delta runs, and exporters update state w/ repository-aware baseline rotation + tests.|
|Streamed package building to avoid large copies|BE-Export|Exporters|DONE – metadata/config now reuse backing arrays and OCI writer streams directly without double buffering.|
|Plan incremental/delta exports|BE-Export|Exporters|DONE – state captures per-file manifests, planner schedules delta vs full resets, layer reuse smoke test verifies OCI reuse, and operator guide documents the validation flow.|
|Advisory schema parity export (description/CWEs/canonical metric)|BE-Export|Models, Core|DONE (2025-10-15) – exporter/test fixtures updated to handle description/CWEs/canonical metric fields during Trivy DB packaging; `dotnet test src/StellaOps.Feedser.Exporter.TrivyDb.Tests` re-run 2025-10-15 to confirm coverage.|
@@ -151,7 +151,10 @@ public sealed class AdvisoryMergeService
            source.References,
            source.AffectedPackages,
            source.CvssMetrics,
            source.Provenance);
            source.Provenance,
            source.Description,
            source.Cwes,
            source.CanonicalMetricId);

    private CanonicalMergeResult? ApplyCanonicalMergeIfNeeded(string canonicalKey, List<Advisory> inputs)
    {
@@ -16,3 +16,5 @@
|Override audit logging|BE-Merge|Observability|DONE – override audits now emit structured logs plus bounded-tag metrics suitable for prod telemetry.|
|Configurable precedence table|BE-Merge|Architecture|DONE – precedence options bind via feedser:merge:precedence:ranks with docs/tests covering operator workflow.|
|Range primitives backlog|BE-Merge|Connector WGs|**DOING** – Coordinate remaining connectors (`Acsc`, `Cccs`, `CertBund`, `CertCc`, `Cve`, `Ghsa`, `Ics.Cisa`, `Kisa`, `Ru.Bdu`, `Ru.Nkcki`, `Vndr.Apple`, `Vndr.Cisco`, `Vndr.Msrc`) to emit canonical RangePrimitives with provenance tags; track progress/fixtures here.<br>2025-10-11: Storage alignment notes + sample normalized rule JSON now captured in `RANGE_PRIMITIVES_COORDINATION.md` (see “Storage alignment quick reference”).<br>2025-10-11 18:45Z: GHSA normalized rules landed; OSV connector picked up next for rollout.<br>2025-10-11 21:10Z: `docs/dev/merge_semver_playbook.md` Section 8 now documents the persisted Mongo projection (SemVer + NEVRA) for connector reviewers.<br>2025-10-11 21:30Z: Added `docs/dev/normalized_versions_rollout.md` dashboard to centralize connector status and upcoming milestones.<br>2025-10-11 21:55Z: Merge now emits `feedser.merge.normalized_rules*` counters and unions connector-provided normalized arrays; see new test coverage in `AdvisoryPrecedenceMergerTests.Merge_RecordsNormalizedRuleMetrics`.<br>2025-10-12 17:05Z: CVE + KEV normalized rule verification complete; OSV parity fixtures revalidated—downstream parity/monitoring tasks may proceed.|
|Merge pipeline parity for new advisory fields|BE-Merge|Models, Core|DONE (2025-10-15) – merge service now surfaces description/CWE/canonical metric decisions with updated metrics/tests.|
|Connector coordination for new advisory fields|Connector Leads, BE-Merge|Models, Core|**DONE (2025-10-15)** – GHSA, NVD, and OSV connectors now emit advisory descriptions, CWE weaknesses, and canonical metric ids. Fixtures refreshed (GHSA connector regression suite, `conflict-nvd.canonical.json`, OSV parity snapshots) and completion recorded in coordination log.|
@@ -7,8 +7,8 @@ namespace StellaOps.Feedser.Models;
/// <summary>
/// Canonical advisory document produced after merge. Collections are pre-sorted for deterministic serialization.
/// </summary>
public sealed record Advisory
{
    public static Advisory Empty { get; } = new(
        advisoryKey: "unknown",
        title: "",
@@ -23,7 +23,10 @@ public sealed record Advisory
        references: Array.Empty<AdvisoryReference>(),
        affectedPackages: Array.Empty<AffectedPackage>(),
        cvssMetrics: Array.Empty<CvssMetric>(),
        provenance: Array.Empty<AdvisoryProvenance>());
        provenance: Array.Empty<AdvisoryProvenance>(),
        description: null,
        cwes: Array.Empty<AdvisoryWeakness>(),
        canonicalMetricId: null);

    public Advisory(
        string advisoryKey,
@@ -38,7 +41,10 @@ public sealed record Advisory
        IEnumerable<AdvisoryReference>? references,
        IEnumerable<AffectedPackage>? affectedPackages,
        IEnumerable<CvssMetric>? cvssMetrics,
        IEnumerable<AdvisoryProvenance>? provenance)
        IEnumerable<AdvisoryProvenance>? provenance,
        string? description = null,
        IEnumerable<AdvisoryWeakness>? cwes = null,
        string? canonicalMetricId = null)
        : this(
            advisoryKey,
            title,
@@ -53,7 +59,10 @@ public sealed record Advisory
            references,
            affectedPackages,
            cvssMetrics,
            provenance)
            provenance,
            description,
            cwes,
            canonicalMetricId)
    {
    }

@@ -61,26 +70,30 @@ public sealed record Advisory
        string advisoryKey,
        string title,
        string? summary,
        string? language,
        DateTimeOffset? published,
        DateTimeOffset? modified,
        string? severity,
        bool exploitKnown,
        IEnumerable<string>? aliases,
        IEnumerable<AdvisoryCredit>? credits,
        IEnumerable<AdvisoryReference>? references,
        IEnumerable<AffectedPackage>? affectedPackages,
        IEnumerable<CvssMetric>? cvssMetrics,
        IEnumerable<AdvisoryProvenance>? provenance)
    {
        AdvisoryKey = Validation.EnsureNotNullOrWhiteSpace(advisoryKey, nameof(advisoryKey));
        Title = Validation.EnsureNotNullOrWhiteSpace(title, nameof(title));
        Summary = Validation.TrimToNull(summary);
        Language = Validation.TrimToNull(language)?.ToLowerInvariant();
        Published = published?.ToUniversalTime();
        Modified = modified?.ToUniversalTime();
        Severity = SeverityNormalization.Normalize(severity);
        ExploitKnown = exploitKnown;
        IEnumerable<AdvisoryProvenance>? provenance,
        string? description = null,
        IEnumerable<AdvisoryWeakness>? cwes = null,
        string? canonicalMetricId = null)
    {
        AdvisoryKey = Validation.EnsureNotNullOrWhiteSpace(advisoryKey, nameof(advisoryKey));
        Title = Validation.EnsureNotNullOrWhiteSpace(title, nameof(title));
        Summary = Validation.TrimToNull(summary);
        Description = Validation.TrimToNull(description);
        Language = Validation.TrimToNull(language)?.ToLowerInvariant();
        Published = published?.ToUniversalTime();
        Modified = modified?.ToUniversalTime();
        Severity = SeverityNormalization.Normalize(severity);
        ExploitKnown = exploitKnown;

        Aliases = (aliases ?? Array.Empty<string>())
            .Select(static alias => Validation.TryNormalizeAlias(alias, out var normalized) ? normalized! : null)
@@ -101,46 +114,58 @@ public sealed record Advisory
            .OrderBy(static reference => reference.Url, StringComparer.Ordinal)
            .ThenBy(static reference => reference.Kind, StringComparer.Ordinal)
            .ThenBy(static reference => reference.SourceTag, StringComparer.Ordinal)
            .ThenBy(static reference => reference.Provenance.RecordedAt)
            .ToImmutableArray();

        AffectedPackages = (affectedPackages ?? Array.Empty<AffectedPackage>())
            .Where(static package => package is not null)
            .OrderBy(static package => package.Type, StringComparer.Ordinal)
            .ThenBy(static package => package.Identifier, StringComparer.Ordinal)
            .ThenBy(static package => package.Platform, StringComparer.Ordinal)
            .ToImmutableArray();

        CvssMetrics = (cvssMetrics ?? Array.Empty<CvssMetric>())
            .Where(static metric => metric is not null)
            .OrderBy(static metric => metric.Version, StringComparer.Ordinal)
            .ThenBy(static metric => metric.Vector, StringComparer.Ordinal)
            .ToImmutableArray();

        Cwes = (cwes ?? Array.Empty<AdvisoryWeakness>())
            .Where(static weakness => weakness is not null)
            .OrderBy(static weakness => weakness.Taxonomy, StringComparer.Ordinal)
            .ThenBy(static weakness => weakness.Identifier, StringComparer.Ordinal)
            .ThenBy(static weakness => weakness.Name, StringComparer.Ordinal)
            .ToImmutableArray();

        CanonicalMetricId = Validation.TrimToNull(canonicalMetricId);

        Provenance = (provenance ?? Array.Empty<AdvisoryProvenance>())
            .Where(static p => p is not null)
            .OrderBy(static p => p.Source, StringComparer.Ordinal)
            .ThenBy(static p => p.Kind, StringComparer.Ordinal)
            .ThenBy(static p => p.RecordedAt)
            .ToImmutableArray();
    }

    [JsonConstructor]
    public Advisory(
        string advisoryKey,
        string title,
        string? summary,
        string? language,
        DateTimeOffset? published,
        DateTimeOffset? modified,
        string? severity,
        bool exploitKnown,
        ImmutableArray<string> aliases,
        ImmutableArray<AdvisoryCredit> credits,
        ImmutableArray<AdvisoryReference> references,
        ImmutableArray<AffectedPackage> affectedPackages,
        ImmutableArray<CvssMetric> cvssMetrics,
        ImmutableArray<AdvisoryProvenance> provenance)
        ImmutableArray<AdvisoryProvenance> provenance,
        string? description,
        ImmutableArray<AdvisoryWeakness> cwes,
        string? canonicalMetricId)
        : this(
            advisoryKey,
            title,
@@ -155,21 +180,26 @@ public sealed record Advisory
            references.IsDefault ? null : references.AsEnumerable(),
            affectedPackages.IsDefault ? null : affectedPackages.AsEnumerable(),
            cvssMetrics.IsDefault ? null : cvssMetrics.AsEnumerable(),
            provenance.IsDefault ? null : provenance.AsEnumerable())
            provenance.IsDefault ? null : provenance.AsEnumerable(),
            description,
            cwes.IsDefault ? null : cwes.AsEnumerable(),
            canonicalMetricId)
    {
    }

    public string AdvisoryKey { get; }

    public string Title { get; }

    public string? Summary { get; }

    public string? Description { get; }

    public string? Language { get; }

    public DateTimeOffset? Published { get; }

    public DateTimeOffset? Modified { get; }

    public string? Severity { get; }

@@ -180,10 +210,14 @@ public sealed record Advisory
    public ImmutableArray<AdvisoryCredit> Credits { get; }

    public ImmutableArray<AdvisoryReference> References { get; }

    public ImmutableArray<AffectedPackage> AffectedPackages { get; }

    public ImmutableArray<CvssMetric> CvssMetrics { get; }

    public ImmutableArray<AdvisoryWeakness> Cwes { get; }

    public string? CanonicalMetricId { get; }

    public ImmutableArray<AdvisoryProvenance> Provenance { get; }
}
|
||||
|
||||
56
src/StellaOps.Feedser.Models/AdvisoryWeakness.cs
Normal file
@@ -0,0 +1,56 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Text.Json.Serialization;

namespace StellaOps.Feedser.Models;

/// <summary>
/// Canonical weakness (e.g., CWE entry) associated with an advisory.
/// </summary>
public sealed record AdvisoryWeakness
{
    public static AdvisoryWeakness Empty { get; } = new("cwe", "CWE-000", null, null, Array.Empty<AdvisoryProvenance>());

    [JsonConstructor]
    public AdvisoryWeakness(string taxonomy, string identifier, string? name, string? uri, ImmutableArray<AdvisoryProvenance> provenance)
        : this(taxonomy, identifier, name, uri, provenance.IsDefault ? null : provenance.AsEnumerable())
    {
    }

    public AdvisoryWeakness(string taxonomy, string identifier, string? name, string? uri, IEnumerable<AdvisoryProvenance>? provenance)
    {
        Taxonomy = NormalizeTaxonomy(taxonomy);
        Identifier = NormalizeIdentifier(identifier);
        Name = Validation.TrimToNull(name);
        Uri = Validation.TrimToNull(uri);
        Provenance = (provenance ?? Array.Empty<AdvisoryProvenance>())
            .Where(static value => value is not null)
            .OrderBy(static value => value.Source, StringComparer.Ordinal)
            .ThenBy(static value => value.Kind, StringComparer.Ordinal)
            .ThenBy(static value => value.RecordedAt)
            .ToImmutableArray();
    }

    public string Taxonomy { get; }

    public string Identifier { get; }

    public string? Name { get; }

    public string? Uri { get; }

    public ImmutableArray<AdvisoryProvenance> Provenance { get; }

    private static string NormalizeTaxonomy(string taxonomy)
    {
        var normalized = Validation.EnsureNotNullOrWhiteSpace(taxonomy, nameof(taxonomy)).Trim();
        return normalized.Length == 0 ? "cwe" : normalized.ToLowerInvariant();
    }

    private static string NormalizeIdentifier(string identifier)
    {
        var normalized = Validation.EnsureNotNullOrWhiteSpace(identifier, nameof(identifier)).Trim();
        return normalized.ToUpperInvariant();
    }
}

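Reviewer note: a minimal usage sketch of the new record (hypothetical input values; assumes the file above compiles alongside `Validation` and `AdvisoryProvenance` from `StellaOps.Feedser.Models`) showing the normalization the constructor applies:

```csharp
// Hypothetical example — not part of the diff. Taxonomy is trimmed/lowercased,
// identifier is trimmed/uppercased, name/uri are trimmed to null when blank.
var weakness = new AdvisoryWeakness(
    taxonomy: " CWE ",                  // -> "cwe"
    identifier: "cwe-79",               // -> "CWE-79"
    name: "  Cross-site Scripting  ",   // -> "Cross-site Scripting"
    uri: null,
    provenance: ImmutableArray<AdvisoryProvenance>.Empty);
```

The sorted, deduplicated provenance ordering (source, then kind, then recorded-at) is what keeps canonical serialization deterministic across runs.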
@@ -66,6 +66,17 @@ public static class CanonicalJsonSerializer
                "notes",
            }
        },
        {
            typeof(AdvisoryWeakness),
            new[]
            {
                "taxonomy",
                "identifier",
                "name",
                "uri",
                "provenance",
            }
        },
    };

    public static string Serialize<T>(T value)
@@ -89,7 +100,10 @@ public static class CanonicalJsonSerializer
            advisory.References,
            advisory.AffectedPackages,
            advisory.CvssMetrics,
            advisory.Provenance);
            advisory.Provenance,
            advisory.Description,
            advisory.Cwes,
            advisory.CanonicalMetricId);

    public static T Deserialize<T>(string json)
        => JsonSerializer.Deserialize<T>(json, PrettyOptions)!

@@ -13,4 +13,5 @@ public static class ProvenanceFieldMasks
    public const string NormalizedVersions = "affectedpackages[].normalizedversions[]";
    public const string PackageStatuses = "affectedpackages[].statuses[]";
    public const string CvssMetrics = "cvssmetrics[]";
    public const string Weaknesses = "cwes[]";
}

@@ -13,6 +13,7 @@
|Provenance envelope field masks|BE-Merge|Models|DONE – `AdvisoryProvenance.fieldMask` added with diagnostics/tests/docs refreshed; connectors can now emit canonical masks for QA dashboards.|
|Backward-compatibility playbook|BE-Merge, QA|Models|DONE – see `BACKWARD_COMPATIBILITY.md` for evolution policy/test checklist.|
|Golden canonical examples|QA|Models|DONE – added `/p:UpdateGoldens=true` test hook wiring `UPDATE_GOLDENS=1` so canonical fixtures regenerate via `dotnet test`; docs/tests unchanged.|
|Serialization determinism regression tests|QA|Models|DONE – locale-stability tests hash canonical serializer output across multiple cultures and runs.|
|Severity normalization helpers|BE-Merge|Models|DONE – helper now normalizes compound vendor labels/priority tiers with expanded synonym coverage and regression tests.|
|AffectedPackage status glossary & guardrails|BE-Merge|Models|DONE – catalog now exposes deterministic listing, TryNormalize helpers, and synonym coverage for vendor phrases (not vulnerable, workaround available, etc.).|
|Advisory schema parity (description, CWE collection, canonical metric id)|BE-Merge, BE-Core|Core, Exporters|DONE (2025-10-15) – extended `Advisory`/related records with description/CWEs/canonical metric id plus serializer/tests updated; exporters validated via new coverage.|

@@ -158,4 +158,26 @@ public sealed class SemVerRangeRuleBuilderTests
        Assert.Equal(Note, result.NormalizedRule.Notes);
    }
}

    [Fact]
    public void BuildNormalizedRules_ProjectsNormalizedRules()
    {
        var rules = SemVerRangeRuleBuilder.BuildNormalizedRules(">=1.0.0 <1.2.0", null, Note);
        var rule = Assert.Single(rules);

        Assert.Equal(NormalizedVersionSchemes.SemVer, rule.Scheme);
        Assert.Equal(NormalizedVersionRuleTypes.Range, rule.Type);
        Assert.Equal("1.0.0", rule.Min);
        Assert.True(rule.MinInclusive);
        Assert.Equal("1.2.0", rule.Max);
        Assert.False(rule.MaxInclusive);
        Assert.Equal(Note, rule.Notes);
    }

    [Fact]
    public void BuildNormalizedRules_ReturnsEmptyWhenNoRules()
    {
        var rules = SemVerRangeRuleBuilder.BuildNormalizedRules(" ", null, Note);
        Assert.Empty(rules);
    }
}

@@ -1,3 +1,4 @@
using System;
using System.Collections.Generic;
using System.Diagnostics.CodeAnalysis;
using System.Globalization;
@@ -58,6 +59,23 @@ public static class SemVerRangeRuleBuilder
        return results.Count == 0 ? Array.Empty<SemVerRangeBuildResult>() : results;
    }

    public static IReadOnlyList<NormalizedVersionRule> BuildNormalizedRules(string? rawRange, string? patchedVersion = null, string? provenanceNote = null)
    {
        var results = Build(rawRange, patchedVersion, provenanceNote);
        if (results.Count == 0)
        {
            return Array.Empty<NormalizedVersionRule>();
        }

        var rules = new NormalizedVersionRule[results.Count];
        for (var i = 0; i < results.Count; i++)
        {
            rules[i] = results[i].NormalizedRule;
        }

        return rules;
    }

    private static IEnumerable<string> SplitSegments(string rawRange)
        => rawRange.Split("||", StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);

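For callers, the new convenience API can be exercised as in this sketch (hypothetical call site; assumes the Feedser normalization and models assemblies are referenced, and that multi-segment `||` ranges yield one rule per segment as the tests above assert):

```csharp
// Sketch: project normalized rules directly, without materializing SemVer primitives.
var rules = SemVerRangeRuleBuilder.BuildNormalizedRules(
    ">=1.0.0 <1.2.0 || >=2.0.0",
    patchedVersion: null,
    provenanceNote: "example-note");

foreach (var rule in rules)
{
    // For the first segment: scheme=semver, type=range, min=1.0.0 (inclusive), max=1.2.0 (exclusive).
    Console.WriteLine($"{rule.Scheme} {rule.Type} {rule.Min}..{rule.Max}");
}
```

Returning `Array.Empty<NormalizedVersionRule>()` for blank input keeps the helper allocation-free on the common no-range path.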
@@ -7,3 +7,4 @@
|CVSS metric normalization & severity bands|BE-Norm (Risk WG)|Models|DONE – `CvssMetricNormalizer` unifies vectors, recomputes scores/severities, and is wired through NVD/RedHat/JVN mappers with unit coverage.|
|Description and locale normalization pipeline|BE-Norm (I18N)|Source connectors|DONE – `DescriptionNormalizer` strips markup, collapses whitespace, and provides locale fallback used by core mappers.|
|SemVer normalized rule emitter (FEEDNORM-NORM-02-001)|BE-Norm (SemVer WG)|Models, `FASTER_MODELING_AND_NORMALIZATION.md`|**DONE (2025-10-12)** – `SemVerRangeRuleBuilder` now parses comparator chains without comma delimiters, supports multi-segment `||` ranges, pushes exact-value metadata, and new tests document the contract for connector teams.|
|SemVer normalized rule convenience API|BE-Norm (SemVer WG)|SemVer normalized rule emitter|**DONE (2025-10-15)** – added `SemVerRangeRuleBuilder.BuildNormalizedRules` projection helper and unit coverage for empty/standard ranges so callers can access normalized rules without materializing primitives.|

@@ -89,6 +89,7 @@
    "CVE-2025-4242",
    "GHSA-qqqq-wwww-eeee"
  ],
  "canonicalMetricId": null,
  "credits": [
    {
      "displayName": "maintainer-team",
@@ -126,6 +127,8 @@
    }
  ],
  "cvssMetrics": [],
  "cwes": [],
  "description": "Container escape vulnerability allowing privilege escalation in conflict-package.",
  "exploitKnown": false,
  "language": "en",
  "modified": "2025-03-02T12:00:00+00:00",

@@ -89,6 +89,7 @@
    "CVE-2024-1111",
    "GHSA-xxxx-yyyy-zzzz"
  ],
  "canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
  "credits": [
    {
      "displayName": "maintainer-team",
@@ -125,7 +126,43 @@
      }
    }
  ],
  "cvssMetrics": [],
  "cvssMetrics": [
    {
      "baseScore": 9.8,
      "baseSeverity": "critical",
      "provenance": {
        "source": "ghsa",
        "kind": "cvss",
        "value": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
        "decisionReason": null,
        "recordedAt": "2024-10-02T00:00:00+00:00",
        "fieldMask": [
          "cvssmetrics[]"
        ]
      },
      "vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
      "version": "3.1"
    }
  ],
  "cwes": [
    {
      "taxonomy": "cwe",
      "identifier": "CWE-79",
      "name": "Cross-site Scripting",
      "uri": "https://cwe.mitre.org/data/definitions/79.html",
      "provenance": [
        {
          "source": "unknown",
          "kind": "unspecified",
          "value": null,
          "decisionReason": null,
          "recordedAt": "1970-01-01T00:00:00+00:00",
          "fieldMask": []
        }
      ]
    }
  ],
  "description": "An example advisory describing a supply chain risk.",
  "exploitKnown": false,
  "language": "en",
  "modified": "2024-09-20T12:00:00+00:00",

@@ -35,6 +35,17 @@
      }
    }
  ],
  "cvss": {
    "score": 9.8,
    "vector_string": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
    "severity": "CRITICAL"
  },
  "cwes": [
    {
      "cwe_id": "CWE-79",
      "name": "Cross-site Scripting"
    }
  ],
  "vulnerabilities": [
    {
      "package": {

@@ -78,6 +78,16 @@ public sealed class GhsaConnectorTests : IAsyncLifetime
            Assert.Contains("https://github.com/security-reporter", credit.Contacts);
        });

        var weakness = Assert.Single(advisory.Cwes);
        Assert.Equal("CWE-79", weakness.Identifier);
        Assert.Equal("https://cwe.mitre.org/data/definitions/79.html", weakness.Uri);

        var metric = Assert.Single(advisory.CvssMetrics);
        Assert.Equal("3.1", metric.Version);
        Assert.Equal("CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H", metric.Vector);
        Assert.Equal("critical", metric.BaseSeverity);
        Assert.Equal("3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H", advisory.CanonicalMetricId);

        var snapshot = SnapshotSerializer.ToSnapshot(advisory).Replace("\r\n", "\n").TrimEnd();
        var expected = ReadFixture("Fixtures/expected-GHSA-xxxx-yyyy-zzzz.json").Replace("\r\n", "\n").TrimEnd();

@@ -1,6 +1,9 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Text;
using StellaOps.Feedser.Models;
using StellaOps.Feedser.Normalization.Cvss;
using StellaOps.Feedser.Normalization.SemVer;
using StellaOps.Feedser.Storage.Mongo.Documents;

@@ -51,9 +54,12 @@ internal static class GhsaMapper

        var affected = CreateAffectedPackages(dto, recordedAt);
        var credits = CreateCredits(dto.Credits, recordedAt);
        var weaknesses = CreateWeaknesses(dto.Cwes, recordedAt);
        var cvssMetrics = CreateCvssMetrics(dto.Cvss, recordedAt, out var cvssSeverity, out var canonicalMetricId);

        var severity = dto.Severity?.ToLowerInvariant();
        var severity = SeverityNormalization.Normalize(dto.Severity) ?? cvssSeverity;
        var summary = dto.Summary ?? dto.Description;
        var description = Validation.TrimToNull(dto.Description);

        return new Advisory(
            advisoryKey: dto.GhsaId,
@@ -68,8 +74,11 @@ internal static class GhsaMapper
            credits: credits,
            references: references,
            affectedPackages: affected,
            cvssMetrics: Array.Empty<CvssMetric>(),
            provenance: new[] { fetchProvenance, mapProvenance });
            cvssMetrics: cvssMetrics,
            provenance: new[] { fetchProvenance, mapProvenance },
            description: description,
            cwes: weaknesses,
            canonicalMetricId: canonicalMetricId);
    }

    private static AdvisoryReference? CreateReference(GhsaReferenceDto reference, DateTimeOffset recordedAt)
@@ -189,6 +198,100 @@ internal static class GhsaMapper
        return results.Count == 0 ? Array.Empty<AdvisoryCredit>() : results;
    }

    private static IReadOnlyList<AdvisoryWeakness> CreateWeaknesses(IReadOnlyList<GhsaWeaknessDto> cwes, DateTimeOffset recordedAt)
    {
        if (cwes.Count == 0)
        {
            return Array.Empty<AdvisoryWeakness>();
        }

        var list = new List<AdvisoryWeakness>(cwes.Count);
        foreach (var cwe in cwes)
        {
            if (cwe is null || string.IsNullOrWhiteSpace(cwe.CweId))
            {
                continue;
            }

            var identifier = cwe.CweId.Trim();
            var provenance = new AdvisoryProvenance(
                GhsaConnectorPlugin.SourceName,
                "weakness",
                identifier,
                recordedAt,
                new[] { ProvenanceFieldMasks.Weaknesses });

            var provenanceArray = ImmutableArray.Create(provenance);
            list.Add(new AdvisoryWeakness(
                taxonomy: "cwe",
                identifier: identifier,
                name: Validation.TrimToNull(cwe.Name),
                uri: BuildCweUrl(identifier),
                provenance: provenanceArray));
        }

        return list.Count == 0 ? Array.Empty<AdvisoryWeakness>() : list;
    }

    private static IReadOnlyList<CvssMetric> CreateCvssMetrics(GhsaCvssDto? cvss, DateTimeOffset recordedAt, out string? severity, out string? canonicalMetricId)
    {
        severity = null;
        canonicalMetricId = null;

        if (cvss is null)
        {
            return Array.Empty<CvssMetric>();
        }

        var vector = Validation.TrimToNull(cvss.VectorString);
        if (!CvssMetricNormalizer.TryNormalize(null, vector, cvss.Score, cvss.Severity, out var normalized))
        {
            return Array.Empty<CvssMetric>();
        }

        severity = normalized.BaseSeverity;
        canonicalMetricId = $"{normalized.Version}|{normalized.Vector}";

        var provenance = new AdvisoryProvenance(
            GhsaConnectorPlugin.SourceName,
            "cvss",
            normalized.Vector,
            recordedAt,
            new[] { ProvenanceFieldMasks.CvssMetrics });

        return new[]
        {
            normalized.ToModel(provenance),
        };
    }

    private static string? BuildCweUrl(string? cweId)
    {
        if (string.IsNullOrWhiteSpace(cweId))
        {
            return null;
        }

        var trimmed = cweId.Trim();
        var dashIndex = trimmed.IndexOf('-');
        if (dashIndex < 0 || dashIndex == trimmed.Length - 1)
        {
            return null;
        }

        var digits = new StringBuilder();
        for (var i = dashIndex + 1; i < trimmed.Length; i++)
        {
            var ch = trimmed[i];
            if (char.IsDigit(ch))
            {
                digits.Append(ch);
            }
        }

        return digits.Length == 0 ? null : $"https://cwe.mitre.org/data/definitions/{digits}.html";
    }

    private static (IReadOnlyList<AffectedVersionRange> Ranges, IReadOnlyList<NormalizedVersionRule> Normalized) CreateSemVerVersionArtifacts(
        GhsaAffectedDto affected,
        string identifier,

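The canonical metric id produced above is just the normalized CVSS version and vector joined with a pipe, which is also what the connector test and fixtures assert. A sketch of the shape, using the vector from the GHSA fixture:

```csharp
// Sketch only — mirrors the `$"{normalized.Version}|{normalized.Vector}"` expression in the mapper.
var version = "3.1";
var vector = "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H";
var canonicalMetricId = $"{version}|{vector}";
// -> "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H"
```

Keeping the id derivable from version + vector alone means merge can compare metrics across connectors without carrying provider-specific keys.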
@@ -21,6 +21,10 @@ internal sealed record GhsaRecordDto
    public IReadOnlyList<GhsaAffectedDto> Affected { get; init; } = Array.Empty<GhsaAffectedDto>();

    public IReadOnlyList<GhsaCreditDto> Credits { get; init; } = Array.Empty<GhsaCreditDto>();

    public IReadOnlyList<GhsaWeaknessDto> Cwes { get; init; } = Array.Empty<GhsaWeaknessDto>();

    public GhsaCvssDto? Cvss { get; init; }
}

internal sealed record GhsaReferenceDto
@@ -53,3 +57,19 @@ internal sealed record GhsaCreditDto

    public string? ProfileUrl { get; init; }
}

internal sealed record GhsaWeaknessDto
{
    public string? CweId { get; init; }

    public string? Name { get; init; }
}

internal sealed record GhsaCvssDto
{
    public double? Score { get; init; }

    public string? VectorString { get; init; }

    public string? Severity { get; init; }
}

@@ -37,6 +37,8 @@ internal static class GhsaRecordParser
        var references = ParseReferences(root);
        var affected = ParseAffected(root);
        var credits = ParseCredits(root);
        var cwes = ParseCwes(root);
        var cvss = ParseCvss(root);

        return new GhsaRecordDto
        {
@@ -50,6 +52,8 @@ internal static class GhsaRecordParser
            References = references,
            Affected = affected,
            Credits = credits,
            Cwes = cwes,
            Cvss = cvss,
        };
    }

@@ -171,6 +175,66 @@ internal static class GhsaRecordParser
        return list;
    }

    private static IReadOnlyList<GhsaWeaknessDto> ParseCwes(JsonElement root)
    {
        if (!root.TryGetProperty("cwes", out var cwes) || cwes.ValueKind != JsonValueKind.Array)
        {
            return Array.Empty<GhsaWeaknessDto>();
        }

        var list = new List<GhsaWeaknessDto>(cwes.GetArrayLength());
        foreach (var entry in cwes.EnumerateArray())
        {
            if (entry.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            var cweId = GetString(entry, "cwe_id");
            if (string.IsNullOrWhiteSpace(cweId))
            {
                continue;
            }

            list.Add(new GhsaWeaknessDto
            {
                CweId = cweId,
                Name = GetString(entry, "name"),
            });
        }

        return list.Count == 0 ? Array.Empty<GhsaWeaknessDto>() : list;
    }

    private static GhsaCvssDto? ParseCvss(JsonElement root)
    {
        if (!root.TryGetProperty("cvss", out var cvss) || cvss.ValueKind != JsonValueKind.Object)
        {
            return null;
        }

        double? score = null;
        if (cvss.TryGetProperty("score", out var scoreElement) && scoreElement.ValueKind == JsonValueKind.Number)
        {
            score = scoreElement.GetDouble();
        }

        var vector = GetString(cvss, "vector_string") ?? GetString(cvss, "vectorString");
        var severity = GetString(cvss, "severity");

        if (score is null && string.IsNullOrWhiteSpace(vector) && string.IsNullOrWhiteSpace(severity))
        {
            return null;
        }

        return new GhsaCvssDto
        {
            Score = score,
            VectorString = vector,
            Severity = severity,
        };
    }

    private static string? GetString(JsonElement element, string propertyName)
    {
        if (element.ValueKind != JsonValueKind.Object)

@@ -15,3 +15,5 @@
|FEEDCONN-GHSA-02-001 Normalized versions rollout|BE-Conn-GHSA|Models `FEEDMODELS-SCHEMA-01-003`, Normalization playbook|**DONE (2025-10-11)** – GHSA mapper now emits SemVer primitives + normalized ranges, fixtures refreshed, connector tests passing; report logged via FEEDMERGE-COORD-02-900.|
|FEEDCONN-GHSA-02-005 Quota monitoring hardening|BE-Conn-GHSA, Observability|Source.Common metrics|**DONE (2025-10-12)** – Diagnostics expose headroom histograms/gauges, warning logs dedupe below the configured threshold, and the ops runbook gained alerting and mitigation guidance.|
|FEEDCONN-GHSA-02-006 Scheduler rollout integration|BE-Conn-GHSA, Ops|Job scheduler|**DONE (2025-10-12)** – Dependency routine tests assert cron/timeouts, and the runbook highlights cron overrides plus backoff toggles for staged rollouts.|
|FEEDCONN-GHSA-04-003 Description/CWE/metric parity rollout|BE-Conn-GHSA|Models, Core|**DONE (2025-10-15)** – Mapper emits advisory description, CWE weaknesses, and canonical CVSS metric id with updated fixtures (`osv-ghsa.osv.json` parity suite) and connector regression covers the new fields. Reported completion to Merge coordination.|
|FEEDCONN-GHSA-04-004 Canonical metric fallback coverage|BE-Conn-GHSA|Models, Merge|TODO – Ensure canonical metric ids remain populated when GitHub omits CVSS vectors/scores; add fixtures capturing severity-only advisories, document precedence with Merge, and emit analytics to track fallback usage.|

@@ -75,6 +75,7 @@
    "aliases": [
      "CVE-2025-4242"
    ],
    "canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
    "credits": [],
    "cvssMetrics": [
      {
@@ -94,6 +95,27 @@
        "version": "3.1"
      }
    ],
    "cwes": [
      {
        "taxonomy": "cwe",
        "identifier": "CWE-269",
        "name": null,
        "uri": "https://cwe.mitre.org/data/definitions/269.html",
        "provenance": [
          {
            "source": "nvd",
            "kind": "weakness",
            "value": "CWE-269",
            "decisionReason": null,
            "recordedAt": "2025-03-04T02:00:00+00:00",
            "fieldMask": [
              "cwes[]"
            ]
          }
        ]
      }
    ],
    "description": "NVD baseline summary for conflict-package allowing container escape.",
    "exploitKnown": false,
    "language": "en",
    "modified": "2025-03-03T09:45:00+00:00",

@@ -1,4 +1,5 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using System.Text;
using System.Text.Json;
@@ -48,9 +49,14 @@ internal static class NvdMapper
        var modified = TryGetDateTime(cve, "lastModified");
        var description = GetNormalizedDescription(cve);

        var references = GetReferences(cve, sourceDocument, recordedAt);
        var weaknessMetadata = GetWeaknessMetadata(cve);
        var references = GetReferences(cve, sourceDocument, recordedAt, weaknessMetadata);
        var affectedPackages = GetAffectedPackages(cve, cveId, sourceDocument, recordedAt);
        var cvssMetrics = GetCvssMetrics(cve, sourceDocument, recordedAt, out var severity);
        var weaknesses = BuildWeaknesses(weaknessMetadata, recordedAt);
        var canonicalMetricId = cvssMetrics.Count > 0
            ? $"{cvssMetrics[0].Version}|{cvssMetrics[0].Vector}"
            : null;

        var provenance = new[]
        {
@@ -77,21 +83,24 @@ internal static class NvdMapper
        }

        aliasCandidates.Add(advisoryKey);

        var advisory = new Advisory(
            advisoryKey: advisoryKey,
            title: title,
            summary: string.IsNullOrEmpty(description.Text) ? null : description.Text,
            language: description.Language,
            published: published,
            modified: modified,
            severity: severity,
            exploitKnown: false,
            aliases: aliasCandidates,
            references: references,
            affectedPackages: affectedPackages,
            cvssMetrics: cvssMetrics,
            provenance: provenance);

        var advisory = new Advisory(
            advisoryKey: advisoryKey,
            title: title,
            summary: string.IsNullOrEmpty(description.Text) ? null : description.Text,
            language: description.Language,
            published: published,
            modified: modified,
            severity: severity,
            exploitKnown: false,
            aliases: aliasCandidates,
            references: references,
            affectedPackages: affectedPackages,
            cvssMetrics: cvssMetrics,
            provenance: provenance,
            description: string.IsNullOrEmpty(description.Text) ? null : description.Text,
            cwes: weaknesses,
            canonicalMetricId: canonicalMetricId);

        advisories.Add(advisory);
        index++;
@@ -140,17 +149,22 @@ internal static class NvdMapper
        return DateTimeOffset.TryParse(property.GetString(), out var parsed) ? parsed : null;
    }

    private static IReadOnlyList<AdvisoryReference> GetReferences(JsonElement cve, DocumentRecord document, DateTimeOffset recordedAt)
    {
        var references = new List<AdvisoryReference>();
        if (!cve.TryGetProperty("references", out var referencesElement) || referencesElement.ValueKind != JsonValueKind.Array)
        {
            return references;
        }

        foreach (var reference in referencesElement.EnumerateArray())
        {
            if (!reference.TryGetProperty("url", out var urlElement) || urlElement.ValueKind != JsonValueKind.String)
    private static IReadOnlyList<AdvisoryReference> GetReferences(
        JsonElement cve,
        DocumentRecord document,
        DateTimeOffset recordedAt,
        IReadOnlyList<WeaknessMetadata> weaknesses)
    {
        var references = new List<AdvisoryReference>();
        if (!cve.TryGetProperty("references", out var referencesElement) || referencesElement.ValueKind != JsonValueKind.Array)
        {
            AppendWeaknessReferences(references, weaknesses, recordedAt);
            return references;
        }

        foreach (var reference in referencesElement.EnumerateArray())
        {
            if (!reference.TryGetProperty("url", out var urlElement) || urlElement.ValueKind != JsonValueKind.String)
            {
                continue;
            }
@@ -181,19 +195,18 @@ internal static class NvdMapper
                new[] { ProvenanceFieldMasks.References })));
        }

        AppendWeaknessReferences(cve, references, recordedAt);
        AppendWeaknessReferences(references, weaknesses, recordedAt);
        return references;
    }

    private static void AppendWeaknessReferences(JsonElement cve, List<AdvisoryReference> references, DateTimeOffset recordedAt)
    private static IReadOnlyList<WeaknessMetadata> GetWeaknessMetadata(JsonElement cve)
    {
        if (!cve.TryGetProperty("weaknesses", out var weaknesses) || weaknesses.ValueKind != JsonValueKind.Array)
        {
            return;
            return Array.Empty<WeaknessMetadata>();
        }

        var existing = new HashSet<string>(references.Select(reference => reference.Url), StringComparer.OrdinalIgnoreCase);

        var list = new List<WeaknessMetadata>(weaknesses.GetArrayLength());
        foreach (var weakness in weaknesses.EnumerateArray())
        {
            if (!weakness.TryGetProperty("description", out var descriptions) || descriptions.ValueKind != JsonValueKind.Array)
@@ -238,7 +251,56 @@ internal static class NvdMapper
                continue;
            }

            var url = BuildCweUrl(cweId);
            list.Add(new WeaknessMetadata(cweId, name));
        }

        return list.Count == 0 ? Array.Empty<WeaknessMetadata>() : list;
    }

    private static IReadOnlyList<AdvisoryWeakness> BuildWeaknesses(IReadOnlyList<WeaknessMetadata> metadata, DateTimeOffset recordedAt)
    {
        if (metadata.Count == 0)
        {
            return Array.Empty<AdvisoryWeakness>();
        }

        var list = new List<AdvisoryWeakness>(metadata.Count);
        foreach (var entry in metadata)
        {
            var provenance = new AdvisoryProvenance(
                NvdConnectorPlugin.SourceName,
                "weakness",
                entry.CweId,
                recordedAt,
                new[] { ProvenanceFieldMasks.Weaknesses });

            var provenanceArray = ImmutableArray.Create(provenance);
            list.Add(new AdvisoryWeakness(
                taxonomy: "cwe",
                identifier: entry.CweId,
                name: entry.Name,
                uri: BuildCweUrl(entry.CweId),
                provenance: provenanceArray));
        }

        return list;
    }

    private static void AppendWeaknessReferences(
        List<AdvisoryReference> references,
        IReadOnlyList<WeaknessMetadata> weaknesses,
        DateTimeOffset recordedAt)
    {
        if (weaknesses.Count == 0)
        {
            return;
        }

        var existing = new HashSet<string>(references.Select(reference => reference.Url), StringComparer.OrdinalIgnoreCase);

        foreach (var weakness in weaknesses)
        {
            var url = BuildCweUrl(weakness.CweId);
            if (url is null || existing.Contains(url))
            {
                continue;
@@ -251,7 +313,7 @@ internal static class NvdMapper
                recordedAt,
                new[] { ProvenanceFieldMasks.References });

            references.Add(new AdvisoryReference(url, "weakness", cweId, name, provenance));
            references.Add(new AdvisoryReference(url, "weakness", weakness.CweId, weakness.Name, provenance));
            existing.Add(url);
        }
    }
@@ -701,10 +763,12 @@ internal static class NvdMapper
        return version;
    }

    private sealed class PackageAccumulator
    {
        public List<AffectedVersionRange> Ranges { get; } = new();

        public List<AdvisoryProvenance> Provenance { get; } = new();
    }
    private readonly record struct WeaknessMetadata(string CweId, string? Name);

    private sealed class PackageAccumulator
    {
        public List<AffectedVersionRange> Ranges { get; } = new();

        public List<AdvisoryProvenance> Provenance { get; } = new();
    }
}

@@ -15,3 +15,4 @@
|FEEDCONN-NVD-02-004 NVD CVSS & CWE precedence payloads|BE-Conn-Nvd|Models `FEEDMODELS-SCHEMA-01-002`|**DONE (2025-10-11)** – CVSS metrics now carry provenance masks, CWE weaknesses emit normalized references, and fixtures cover the additional precedence data.|
|FEEDCONN-NVD-02-005 NVD merge/export parity regression|BE-Conn-Nvd, BE-Merge|Merge `FEEDMERGE-ENGINE-04-003`|**DONE (2025-10-12)** – Canonical merge parity fixtures captured, regression test validates credit/reference union, and exporter snapshot check guarantees parity through JSON exports.|
|FEEDCONN-NVD-02-002 Normalized versions rollout|BE-Conn-Nvd|Models `FEEDMODELS-SCHEMA-01-003`, Normalization playbook|**DONE (2025-10-11)** – SemVer primitives + normalized rules emitting for parseable ranges, fixtures/tests refreshed, coordination pinged via FEEDMERGE-COORD-02-900.|
|FEEDCONN-NVD-04-003 Description/CWE/metric parity rollout|BE-Conn-Nvd|Models, Core|**DONE (2025-10-15)** – Mapper now surfaces normalized description text, CWE weaknesses, and canonical CVSS metric id. Snapshots (`conflict-nvd.canonical.json`) refreshed and completion relayed to Merge coordination.|

@@ -73,6 +73,7 @@
"GHSA-qqqq-wwww-eeee",
"OSV-2025-4242"
],
+"canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:H/PR:L/UI:R/S:U/C:L/I:L/A:L",
"credits": [
{
"displayName": "osv-reporter",
@@ -108,6 +109,8 @@
"version": "3.1"
}
],
+"cwes": [],
+"description": "OSV captures the latest container escape details including patched version metadata.",
"exploitKnown": false,
"language": "en",
"modified": "2025-03-06T12:00:00+00:00",

File diff suppressed because it is too large
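The fixtures pin the new `canonicalMetricId` field, which the snapshots show as the CVSS version and vector joined with a `|` separator (e.g. `"3.1|CVSS:3.1/AV:N/AC:H/..."`). A minimal sketch that only restates that layout; `BuildCanonicalMetricId` is a hypothetical name, not the real builder:

```
// Sketch: canonical metric id layout as seen in the snapshot fixtures.
// "<cvss version>|<vector>", e.g. "3.1|CVSS:3.1/AV:N/AC:H/PR:L/UI:R/S:U/C:L/I:L/A:L"
private static string BuildCanonicalMetricId(string version, string vector)
    => $"{version}|{vector}";
```

Keeping the version prefix makes ids from different CVSS generations sortable and unambiguous even when two vectors would otherwise collide.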
@@ -33,7 +33,7 @@
"kind": "range",
"value": "pkg:golang/github.com/opencontainers/image-spec",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -61,7 +61,7 @@
"kind": "affected",
"value": "pkg:golang/github.com/opencontainers/image-spec",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -73,6 +73,7 @@
"CGA-j36r-723f-8c29",
"GHSA-77vh-xpmg-72qh"
],
+"canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:H/PR:L/UI:R/S:C/C:N/I:L/A:N",
"credits": [],
"cvssMetrics": [
{
@@ -83,13 +84,34 @@
"kind": "cvss",
"value": "CVSS_V3",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:N/AC:H/PR:L/UI:R/S:C/C:N/I:L/A:N",
"version": "3.1"
}
],
+"cwes": [
+{
+"taxonomy": "cwe",
+"identifier": "CWE-843",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/843.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-843",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+}
+],
"description": "### Impact\nIn the OCI Image Specification version 1.0.1 and prior, manifest and index documents are not self-describing and documents with a single digest could be interpreted as either a manifest or an index.\n\n### Patches\nThe Image Specification will be updated to recommend that both manifest and index documents contain a `mediaType` field to identify the type of document.\nRelease [v1.0.2](https://github.com/opencontainers/image-spec/releases/tag/v1.0.2) includes these updates.\n\n### Workarounds\nSoftware attempting to deserialize an ambiguous document may reject the document if it contains both “manifests” and “layers” fields or “manifests” and “config” fields.\n\n### References\nhttps://github.com/opencontainers/distribution-spec/security/advisories/GHSA-mc8v-mgrf-8f4m\n\n### For more information\nIf you have any questions or comments about this advisory:\n* Open an issue in https://github.com/opencontainers/image-spec\n* Email us at [security@opencontainers.org](mailto:security@opencontainers.org)\n* https://github.com/opencontainers/image-spec/commits/v1.0.2",
"exploitKnown": false,
"language": "en",
"modified": "2021-11-24T19:43:35+00:00",
@@ -99,7 +121,7 @@
"kind": "document",
"value": "https://osv.dev/vulnerability/GHSA-77vh-xpmg-72qh",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2181708+00:00",
+"recordedAt": "2021-11-18T16:02:41+00:00",
"fieldMask": [
"advisory"
]
@@ -109,7 +131,7 @@
"kind": "mapping",
"value": "GHSA-77vh-xpmg-72qh",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"advisory"
]
@@ -124,7 +146,7 @@
"kind": "reference",
"value": "https://github.com/opencontainers/distribution-spec/security/advisories/GHSA-mc8v-mgrf-8f4m",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"references[]"
]
@@ -140,7 +162,7 @@
"kind": "reference",
"value": "https://github.com/opencontainers/image-spec",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"references[]"
]
@@ -156,7 +178,7 @@
"kind": "reference",
"value": "https://github.com/opencontainers/image-spec/commit/693428a734f5bab1a84bd2f990d92ef1111cd60c",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"references[]"
]
@@ -172,7 +194,7 @@
"kind": "reference",
"value": "https://github.com/opencontainers/image-spec/releases/tag/v1.0.2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"references[]"
]
@@ -188,7 +210,7 @@
"kind": "reference",
"value": "https://github.com/opencontainers/image-spec/security/advisories/GHSA-77vh-xpmg-72qh",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2184266+00:00",
+"recordedAt": "2025-10-15T14:48:57.9970795+00:00",
"fieldMask": [
"references[]"
]
@@ -236,7 +258,7 @@
"kind": "range",
"value": "pkg:maven/org.apache.logging.log4j/log4j-core",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -264,7 +286,7 @@
"kind": "affected",
"value": "pkg:maven/org.apache.logging.log4j/log4j-core",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -302,7 +324,7 @@
"kind": "range",
"value": "pkg:maven/org.apache.logging.log4j/log4j-core",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -330,7 +352,7 @@
"kind": "affected",
"value": "pkg:maven/org.apache.logging.log4j/log4j-core",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -368,7 +390,7 @@
"kind": "range",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -396,7 +418,7 @@
"kind": "affected",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -434,7 +456,7 @@
"kind": "range",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -462,7 +484,7 @@
"kind": "affected",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -500,7 +522,7 @@
"kind": "range",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -528,7 +550,7 @@
"kind": "affected",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -566,7 +588,7 @@
"kind": "range",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -594,7 +616,7 @@
"kind": "affected",
"value": "pkg:maven/org.ops4j.pax.logging/pax-logging-log4j2",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -606,6 +628,7 @@
"CVE-2021-45046",
"GHSA-7rjr-3q55-vv33"
],
+"canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:H/PR:N/UI:N/S:C/C:H/I:H/A:H",
"credits": [],
"cvssMetrics": [
{
@@ -616,13 +639,52 @@
"kind": "cvss",
"value": "CVSS_V3",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:N/AC:H/PR:N/UI:N/S:C/C:H/I:H/A:H",
"version": "3.1"
}
],
+"cwes": [
+{
+"taxonomy": "cwe",
+"identifier": "CWE-502",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/502.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-502",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+},
+{
+"taxonomy": "cwe",
+"identifier": "CWE-917",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/917.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-917",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+}
+],
"description": "# Impact\n\nThe fix to address [CVE-2021-44228](https://nvd.nist.gov/vuln/detail/CVE-2021-44228) in Apache Log4j 2.15.0 was incomplete in certain non-default configurations. This could allow attackers with control over Thread Context Map (MDC) input data when the logging configuration uses a non-default Pattern Layout with either a Context Lookup (for example, $${ctx:loginId}) or a Thread Context Map pattern (%X, %mdc, or %MDC) to craft malicious input data using a JNDI Lookup pattern resulting in a remote code execution (RCE) attack. \n\n## Affected packages\nOnly the `org.apache.logging.log4j:log4j-core` package is directly affected by this vulnerability. The `org.apache.logging.log4j:log4j-api` should be kept at the same version as the `org.apache.logging.log4j:log4j-core` package to ensure compatability if in use.\n\n# Mitigation\n\nLog4j 2.16.0 fixes this issue by removing support for message lookup patterns and disabling JNDI functionality by default. This issue can be mitigated in prior releases (< 2.16.0) by removing the JndiLookup class from the classpath (example: zip -q -d log4j-core-*.jar org/apache/logging/log4j/core/lookup/JndiLookup.class).\n\nLog4j 2.15.0 restricts JNDI LDAP lookups to localhost by default. Note that previous mitigations involving configuration such as to set the system property `log4j2.formatMsgNoLookups` to `true` do NOT mitigate this specific vulnerability.",
"exploitKnown": false,
"language": "en",
"modified": "2025-05-09T13:13:16.169374+00:00",
@@ -632,7 +694,7 @@
"kind": "document",
"value": "https://osv.dev/vulnerability/GHSA-7rjr-3q55-vv33",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2355464+00:00",
+"recordedAt": "2021-12-14T18:01:28+00:00",
"fieldMask": [
"advisory"
]
@@ -642,7 +704,7 @@
"kind": "mapping",
"value": "GHSA-7rjr-3q55-vv33",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"advisory"
]
@@ -657,7 +719,7 @@
"kind": "reference",
"value": "http://www.openwall.com/lists/oss-security/2021/12/14/4",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -673,7 +735,7 @@
"kind": "reference",
"value": "http://www.openwall.com/lists/oss-security/2021/12/15/3",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -689,7 +751,7 @@
"kind": "reference",
"value": "http://www.openwall.com/lists/oss-security/2021/12/18/1",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -705,7 +767,7 @@
"kind": "reference",
"value": "https://cert-portal.siemens.com/productcert/pdf/ssa-397453.pdf",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -721,7 +783,7 @@
"kind": "reference",
"value": "https://cert-portal.siemens.com/productcert/pdf/ssa-479842.pdf",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -737,7 +799,7 @@
"kind": "reference",
"value": "https://cert-portal.siemens.com/productcert/pdf/ssa-661247.pdf",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -753,7 +815,7 @@
"kind": "reference",
"value": "https://cert-portal.siemens.com/productcert/pdf/ssa-714170.pdf",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -769,7 +831,7 @@
"kind": "reference",
"value": "https://github.com/advisories/GHSA-jfh8-c2jp-5v3q",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -785,7 +847,7 @@
"kind": "reference",
"value": "https://lists.fedoraproject.org/archives/list/package-announce@lists.fedoraproject.org/message/EOKPQGV24RRBBI4TBZUDQMM4MEH7MXCY",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -801,7 +863,7 @@
"kind": "reference",
"value": "https://lists.fedoraproject.org/archives/list/package-announce@lists.fedoraproject.org/message/SIG7FZULMNK2XF6FZRU4VWYDQXNMUGAJ",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -817,7 +879,7 @@
"kind": "reference",
"value": "https://logging.apache.org/log4j/2.x/security.html",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -833,7 +895,7 @@
"kind": "reference",
"value": "https://nvd.nist.gov/vuln/detail/CVE-2021-45046",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -849,7 +911,7 @@
"kind": "reference",
"value": "https://psirt.global.sonicwall.com/vuln-detail/SNWLID-2021-0032",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -865,7 +927,7 @@
"kind": "reference",
"value": "https://sec.cloudapps.cisco.com/security/center/content/CiscoSecurityAdvisory/cisco-sa-apache-log4j-qRuKNEbd",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -881,7 +943,7 @@
"kind": "reference",
"value": "https://security.gentoo.org/glsa/202310-16",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -897,7 +959,7 @@
"kind": "reference",
"value": "https://www.cve.org/CVERecord?id=CVE-2021-44228",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -913,7 +975,7 @@
"kind": "reference",
"value": "https://www.debian.org/security/2021/dsa-5022",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -929,7 +991,7 @@
"kind": "reference",
"value": "https://www.intel.com/content/www/us/en/security-center/advisory/intel-sa-00646.html",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -945,7 +1007,7 @@
"kind": "reference",
"value": "https://www.kb.cert.org/vuls/id/930724",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -961,7 +1023,7 @@
"kind": "reference",
"value": "https://www.openwall.com/lists/oss-security/2021/12/14/4",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -977,7 +1039,7 @@
"kind": "reference",
"value": "https://www.oracle.com/security-alerts/alert-cve-2021-44228.html",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -993,7 +1055,7 @@
"kind": "reference",
"value": "https://www.oracle.com/security-alerts/cpuapr2022.html",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -1009,7 +1071,7 @@
"kind": "reference",
"value": "https://www.oracle.com/security-alerts/cpujan2022.html",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -1025,7 +1087,7 @@
"kind": "reference",
"value": "https://www.oracle.com/security-alerts/cpujul2022.html",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2365076+00:00",
+"recordedAt": "2025-10-15T14:48:57.9980643+00:00",
"fieldMask": [
"references[]"
]
@@ -1073,7 +1135,7 @@
"kind": "range",
"value": "pkg:pypi/pyload-ng",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -1101,7 +1163,7 @@
"kind": "affected",
"value": "pkg:pypi/pyload-ng",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -1113,6 +1175,7 @@
"CVE-2025-61773",
"GHSA-cjjf-27cc-pvmv"
],
+"canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:N",
"credits": [],
"cvssMetrics": [
{
@@ -1123,13 +1186,88 @@
"kind": "cvss",
"value": "CVSS_V3",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": []
},
"vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:N",
"version": "3.1"
}
],
+"cwes": [
+{
+"taxonomy": "cwe",
+"identifier": "CWE-116",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/116.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-116",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+},
+{
+"taxonomy": "cwe",
+"identifier": "CWE-74",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/74.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-74",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+},
+{
+"taxonomy": "cwe",
+"identifier": "CWE-79",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/79.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-79",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+},
+{
+"taxonomy": "cwe",
+"identifier": "CWE-94",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/94.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-94",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+}
+],
"description": "### Summary\npyLoad web interface contained insufficient input validation in both the Captcha script endpoint and the Click'N'Load (CNL) Blueprint. This flaw allowed untrusted user input to be processed unsafely, which could be exploited by an attacker to inject arbitrary content into the web UI or manipulate request handling. The vulnerability could lead to client-side code execution (XSS) or other unintended behaviors when a malicious payload is submitted.\n\nuser-supplied parameters from HTTP requests were not adequately validated or sanitized before being passed into the application logic and response generation. This allowed crafted input to alter the expected execution flow.\n CNL (Click'N'Load) blueprint exposed unsafe handling of untrusted parameters in HTTP requests. The application did not consistently enforce input validation or encoding, making it possible for an attacker to craft malicious requests.\n\n### PoC\n\n1. Run a vulnerable version of pyLoad prior to commit [`f9d27f2`](https://github.com/pyload/pyload/pull/4624).\n2. Start the web UI and access the Captcha or CNL endpoints.\n3. Submit a crafted request containing malicious JavaScript payloads in unvalidated parameters (`/flash/addcrypted2?jk=function(){alert(1)}&crypted=12345`).\n4. Observe that the payload is reflected and executed in the client’s browser, demonstrating cross-site scripting (XSS).\n\nExample request:\n\n```http\nGET /flash/addcrypted2?jk=function(){alert(1)}&crypted=12345 HTTP/1.1\nHost: 127.0.0.1:8000\nContent-Type: application/x-www-form-urlencoded\nContent-Length: 107\n```\n\n### Impact\n\nExploiting this vulnerability allows an attacker to inject and execute arbitrary JavaScript within the browser session of a user accessing the pyLoad Web UI. In practice, this means an attacker could impersonate an administrator, steal authentication cookies or tokens, and perform unauthorized actions on behalf of the victim. Because the affected endpoints are part of the core interface, a successful attack undermines the trust and security of the entire application, potentially leading to a full compromise of the management interface and the data it controls. The impact is particularly severe in cases where the Web UI is exposed over a network without additional access restrictions, as it enables remote attackers to directly target users with crafted links or requests that trigger the vulnerability.",
"exploitKnown": false,
"language": "en",
"modified": "2025-10-09T15:59:13.250015+00:00",
@@ -1139,7 +1277,7 @@
"kind": "document",
"value": "https://osv.dev/vulnerability/GHSA-cjjf-27cc-pvmv",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2061911+00:00",
+"recordedAt": "2025-10-09T15:19:48+00:00",
"fieldMask": [
"advisory"
]
@@ -1149,7 +1287,7 @@
"kind": "mapping",
"value": "GHSA-cjjf-27cc-pvmv",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": [
"advisory"
]
@@ -1164,7 +1302,7 @@
"kind": "reference",
"value": "https://github.com/pyload/pyload",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": [
"references[]"
]
@@ -1180,7 +1318,7 @@
"kind": "reference",
"value": "https://github.com/pyload/pyload/commit/5823327d0b797161c7195a1f660266d30a69f0ca",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": [
"references[]"
]
@@ -1196,7 +1334,7 @@
"kind": "reference",
"value": "https://github.com/pyload/pyload/pull/4624",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": [
"references[]"
]
@@ -1212,7 +1350,7 @@
"kind": "reference",
"value": "https://github.com/pyload/pyload/security/advisories/GHSA-cjjf-27cc-pvmv",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.2065811+00:00",
+"recordedAt": "2025-10-15T14:48:57.995174+00:00",
"fieldMask": [
"references[]"
]
@@ -1260,7 +1398,7 @@
"kind": "range",
"value": "pkg:pypi/social-auth-app-django",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"affectedpackages[].versionranges[]"
]
@@ -1288,7 +1426,7 @@
"kind": "affected",
"value": "pkg:pypi/social-auth-app-django",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"affectedpackages[]"
]
@@ -1300,8 +1438,30 @@
"CVE-2025-61783",
"GHSA-wv4w-6qv2-qqfg"
],
+"canonicalMetricId": null,
"credits": [],
"cvssMetrics": [],
+"cwes": [
+{
+"taxonomy": "cwe",
+"identifier": "CWE-290",
+"name": null,
+"uri": "https://cwe.mitre.org/data/definitions/290.html",
+"provenance": [
+{
+"source": "osv",
+"kind": "weakness",
+"value": "CWE-290",
+"decisionReason": null,
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
+"fieldMask": [
+"cwes[]"
+]
+}
+]
+}
+],
"description": "### Impact\n\nUpon authentication, the user could be associated by e-mail even if the `associate_by_email` pipeline was not included. This could lead to account compromise when a third-party authentication service does not validate provided e-mail addresses or doesn't require unique e-mail addresses.\n\n### Patches\n\n* https://github.com/python-social-auth/social-app-django/pull/803\n\n### Workarounds\n\nReview the authentication service policy on e-mail addresses; many will not allow exploiting this vulnerability.",
"exploitKnown": false,
"language": "en",
"modified": "2025-10-09T17:57:29.916841+00:00",
@@ -1311,7 +1471,7 @@
"kind": "document",
"value": "https://osv.dev/vulnerability/GHSA-wv4w-6qv2-qqfg",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.0743113+00:00",
+"recordedAt": "2025-10-09T17:08:05+00:00",
"fieldMask": [
"advisory"
]
@@ -1321,7 +1481,7 @@
"kind": "mapping",
"value": "GHSA-wv4w-6qv2-qqfg",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"advisory"
]
@@ -1336,7 +1496,7 @@
"kind": "reference",
"value": "https://github.com/python-social-auth/social-app-django",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"references[]"
]
@@ -1352,7 +1512,7 @@
"kind": "reference",
"value": "https://github.com/python-social-auth/social-app-django/commit/10c80e2ebabeccd4e9c84ad0e16e1db74148ed4c",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"references[]"
]
@@ -1368,7 +1528,7 @@
"kind": "reference",
"value": "https://github.com/python-social-auth/social-app-django/issues/220",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"references[]"
]
@@ -1384,7 +1544,7 @@
"kind": "reference",
"value": "https://github.com/python-social-auth/social-app-django/issues/231",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"references[]"
]
@@ -1400,7 +1560,7 @@
"kind": "reference",
"value": "https://github.com/python-social-auth/social-app-django/issues/634",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"references[]"
]
@@ -1416,7 +1576,7 @@
"kind": "reference",
"value": "https://github.com/python-social-auth/social-app-django/pull/803",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"references[]"
]
@@ -1432,7 +1592,7 @@
"kind": "reference",
"value": "https://github.com/python-social-auth/social-app-django/security/advisories/GHSA-wv4w-6qv2-qqfg",
"decisionReason": null,
-"recordedAt": "2025-10-12T19:48:04.1231115+00:00",
+"recordedAt": "2025-10-15T14:48:57.9927932+00:00",
"fieldMask": [
"references[]"
]
|
||||
@@ -68,17 +68,18 @@
 ]
 }
 ],
 "aliases": [
 "CVE-2025-113",
 "GHSA-3abc-3def-3ghi",
 "OSV-2025-npm-0001",
 "OSV-RELATED-npm-42"
 ],
+"canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
 "credits": [],
 "cvssMetrics": [
 {
 "baseScore": 9.8,
 "baseSeverity": "critical",
 "provenance": {
 "source": "osv",
 "kind": "cvss",
@@ -88,12 +89,14 @@
 "fieldMask": []
 },
 "vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
 "version": "3.1"
 }
 ],
+"cwes": [],
+"description": "Detailed description for npm package @scope/left-pad.",
 "exploitKnown": false,
 "language": "en",
 "modified": "2025-01-08T06:30:00+00:00",
 "provenance": [
 {
 "source": "osv",
@@ -154,4 +157,4 @@
 "severity": "critical",
 "summary": "Detailed description for npm package @scope/left-pad.",
 "title": "npm package vulnerability"
 }
 }

@@ -68,17 +68,18 @@
 ]
 }
 ],
 "aliases": [
 "CVE-2025-114",
 "GHSA-4abc-4def-4ghi",
 "OSV-2025-PyPI-0001",
 "OSV-RELATED-PyPI-42"
 ],
+"canonicalMetricId": "3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
 "credits": [],
 "cvssMetrics": [
 {
 "baseScore": 9.8,
 "baseSeverity": "critical",
 "provenance": {
 "source": "osv",
 "kind": "cvss",
@@ -88,12 +89,14 @@
 "fieldMask": []
 },
 "vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
 "version": "3.1"
 }
 ],
+"cwes": [],
+"description": "Detailed description for PyPI package requests.",
 "exploitKnown": false,
 "language": "en",
 "modified": "2025-01-08T06:30:00+00:00",
 "provenance": [
 {
 "source": "osv",
@@ -154,4 +157,4 @@
 "severity": "critical",
 "summary": "Detailed description for PyPI package requests.",
 "title": "PyPI package vulnerability"
 }
 }

@@ -1,6 +1,8 @@
 using System;
 using System.Collections.Generic;
+using System.Collections.Immutable;
 using System.Linq;
+using System.Text;
 using System.Text.Json;
 using StellaOps.Feedser.Models;
 using StellaOps.Feedser.Normalization.Cvss;
@@ -66,21 +68,30 @@ internal static class OsvMapper
         var credits = BuildCredits(dto, recordedAt);
         var affectedPackages = BuildAffectedPackages(dto, ecosystem, recordedAt);
         var cvssMetrics = BuildCvssMetrics(dto, recordedAt, out var severity);
+        var weaknesses = BuildWeaknesses(dto, recordedAt);
+        var canonicalMetricId = cvssMetrics.Count > 0
+            ? $"{cvssMetrics[0].Version}|{cvssMetrics[0].Vector}"
+            : null;
 
         var normalizedDescription = DescriptionNormalizer.Normalize(new[]
         {
             new LocalizedText(dto.Details, "en"),
             new LocalizedText(dto.Summary, "en"),
         });
 
         var title = string.IsNullOrWhiteSpace(dto.Summary) ? dto.Id : dto.Summary!.Trim();
         var summary = string.IsNullOrWhiteSpace(normalizedDescription.Text) ? dto.Summary : normalizedDescription.Text;
         var language = string.IsNullOrWhiteSpace(normalizedDescription.Language) ? null : normalizedDescription.Language;
+        var descriptionText = Validation.TrimToNull(dto.Details);
+        if (string.IsNullOrWhiteSpace(summary) && !string.IsNullOrWhiteSpace(descriptionText))
+        {
+            summary = descriptionText;
+        }
 
         return new Advisory(
             dto.Id,
             title,
             summary,
             language,
             dto.Published?.ToUniversalTime(),
             dto.Modified?.ToUniversalTime(),
@@ -91,7 +102,10 @@ internal static class OsvMapper
             references,
             affectedPackages,
             cvssMetrics,
-            new[] { fetchProvenance, mappingProvenance });
+            new[] { fetchProvenance, mappingProvenance },
+            descriptionText,
+            weaknesses,
+            canonicalMetricId);
     }
 
     private static IEnumerable<string> BuildAliases(OsvVulnerabilityDto dto)
@@ -465,11 +479,81 @@ internal static class OsvMapper
         var fallbackEcosystem = ecosystemHint ?? Validation.TrimToNull(ecosystem) ?? "osv";
         return $"{fallbackEcosystem}:{name}";
     }
 
+    private static IReadOnlyList<AdvisoryWeakness> BuildWeaknesses(OsvVulnerabilityDto dto, DateTimeOffset recordedAt)
+    {
+        if (dto.DatabaseSpecific.ValueKind != JsonValueKind.Object ||
+            !dto.DatabaseSpecific.TryGetProperty("cwe_ids", out var cweIds) ||
+            cweIds.ValueKind != JsonValueKind.Array)
+        {
+            return Array.Empty<AdvisoryWeakness>();
+        }
+
+        var list = new List<AdvisoryWeakness>(cweIds.GetArrayLength());
+        foreach (var element in cweIds.EnumerateArray())
+        {
+            if (element.ValueKind != JsonValueKind.String)
+            {
+                continue;
+            }
+
+            var raw = element.GetString();
+            if (string.IsNullOrWhiteSpace(raw))
+            {
+                continue;
+            }
+
+            var identifier = raw.Trim().ToUpperInvariant();
+            var provenance = new AdvisoryProvenance(
+                OsvConnectorPlugin.SourceName,
+                "weakness",
+                identifier,
+                recordedAt,
+                new[] { ProvenanceFieldMasks.Weaknesses });
+
+            var provenanceArray = ImmutableArray.Create(provenance);
+            list.Add(new AdvisoryWeakness(
+                taxonomy: "cwe",
+                identifier: identifier,
+                name: null,
+                uri: BuildCweUrl(identifier),
+                provenance: provenanceArray));
+        }
+
+        return list.Count == 0 ? Array.Empty<AdvisoryWeakness>() : list;
+    }
+
+    private static string? BuildCweUrl(string? cweId)
+    {
+        if (string.IsNullOrWhiteSpace(cweId))
+        {
+            return null;
+        }
+
+        var trimmed = cweId.Trim();
+        var dashIndex = trimmed.IndexOf('-');
+        if (dashIndex < 0 || dashIndex == trimmed.Length - 1)
+        {
+            return null;
+        }
+
+        var digits = new StringBuilder();
+        for (var i = dashIndex + 1; i < trimmed.Length; i++)
+        {
+            var ch = trimmed[i];
+            if (char.IsDigit(ch))
+            {
+                digits.Append(ch);
+            }
+        }
+
+        return digits.Length == 0 ? null : $"https://cwe.mitre.org/data/definitions/{digits}.html";
+    }
+
     private static IReadOnlyList<CvssMetric> BuildCvssMetrics(OsvVulnerabilityDto dto, DateTimeOffset recordedAt, out string? severity)
     {
         severity = null;
         if (dto.Severity is null || dto.Severity.Count == 0)
         {
             return Array.Empty<CvssMetric>();
         }
|
||||
|
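The `BuildWeaknesses`/`BuildCweUrl` logic above (uppercase the raw `database_specific.cwe_ids` entries, then derive the MITRE definition URL from the digits after the dash) can be mirrored in a short runnable sketch. Python is used here for illustration rather than the mapper's C#, and the plain dict stands in for the real `AdvisoryWeakness` model:

```python
import json


def build_cwe_url(cwe_id):
    # "CWE-290" -> "https://cwe.mitre.org/data/definitions/290.html"
    if not cwe_id or not cwe_id.strip():
        return None
    trimmed = cwe_id.strip()
    dash = trimmed.find("-")
    if dash < 0 or dash == len(trimmed) - 1:
        return None
    digits = "".join(ch for ch in trimmed[dash + 1:] if ch.isdigit())
    return f"https://cwe.mitre.org/data/definitions/{digits}.html" if digits else None


def build_weaknesses(database_specific_json):
    # Mirrors BuildWeaknesses: tolerate missing/odd-shaped cwe_ids, skip blanks.
    doc = json.loads(database_specific_json)
    ids = doc.get("cwe_ids")
    if not isinstance(ids, list):
        return []
    out = []
    for raw in ids:
        if not isinstance(raw, str) or not raw.strip():
            continue
        identifier = raw.strip().upper()
        out.append({"taxonomy": "cwe", "identifier": identifier, "uri": build_cwe_url(identifier)})
    return out
```

Feeding `'{"cwe_ids": ["cwe-290"]}'` through `build_weaknesses` yields a single `CWE-290` weakness pointing at the MITRE definition page, matching the `"value": "CWE-290"` provenance entry in the GHSA fixture.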
||||
@@ -16,3 +16,5 @@
 |FEEDCONN-OSV-02-003 Normalized versions rollout|BE-Conn-OSV|Models `FEEDMODELS-SCHEMA-01-003`, Normalization playbook|**DONE (2025-10-11)** – `OsvMapper` now emits SemVer primitives + normalized rules with `osv:{ecosystem}:{advisoryId}:{identifier}` notes; npm/PyPI/Parity fixtures refreshed; merge coordination pinged (OSV handoff).|
 |FEEDCONN-OSV-04-003 Parity fixture refresh|QA, BE-Conn-OSV|Normalized versions rollout, GHSA parity tests|**DONE (2025-10-12)** – Parity fixtures include normalizedVersions notes (`osv:<ecosystem>:<id>:<purl>`); regression math rerun via `dotnet test src/StellaOps.Feedser.Source.Osv.Tests` and docs flagged for workflow sync.|
 |FEEDCONN-OSV-04-002 Conflict regression fixtures|BE-Conn-OSV, QA|Merge `FEEDMERGE-ENGINE-04-001`|**DONE (2025-10-12)** – Added `conflict-osv.canonical.json` + regression asserting SemVer range + CVSS medium severity; dataset matches GHSA/NVD fixtures for merge tests. Validation: `dotnet test src/StellaOps.Feedser.Source.Osv.Tests/StellaOps.Feedser.Source.Osv.Tests.csproj --filter OsvConflictFixtureTests`.|
+|FEEDCONN-OSV-04-004 Description/CWE/metric parity rollout|BE-Conn-OSV|Models, Core|**DONE (2025-10-15)** – OSV mapper writes advisory descriptions, `database_specific.cwe_ids` weaknesses, and canonical CVSS metric id. Parity fixtures (`osv-ghsa.*`, `osv-npm.snapshot.json`, `osv-pypi.snapshot.json`) refreshed and status communicated to Merge coordination.|
+|FEEDCONN-OSV-04-005 Canonical metric fallbacks & CWE notes|BE-Conn-OSV|Models, Merge|TODO – Add fallback logic and metrics for advisories lacking CVSS vectors, enrich CWE provenance notes, and document merge/export expectations; refresh parity fixtures accordingly.|
|
||||
|
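The description/CWE/metric parity rollout tracked above hinges on two small mapper behaviours: the canonical metric id is `"<version>|<vector>"` taken from the first CVSS metric, and the summary falls back to the raw details text when normalization produces nothing. A minimal sketch (Python rather than the mapper's C#; function names are illustrative, not part of the codebase):

```python
def canonical_metric_id(cvss_metrics):
    # First metric wins; None when the advisory carries no CVSS data,
    # e.g. ("3.1", "CVSS:3.1/...") -> "3.1|CVSS:3.1/..."
    if not cvss_metrics:
        return None
    version, vector = cvss_metrics[0]
    return f"{version}|{vector}"


def resolve_summary(summary, details):
    # Fall back to the trimmed details text when the normalized summary is blank.
    description = details.strip() if details and details.strip() else None
    if (not summary or not summary.strip()) and description:
        return description
    return summary
```

With the fixture's vector this reproduces the `canonicalMetricId` seen in the npm/PyPI snapshots: `3.1|CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H`.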
||||
@@ -1,335 +1,335 @@
|
||||
[
|
||||
{
|
||||
"advisoryKey": "BDU:2025-00001",
|
||||
"affectedPackages": [
|
||||
{
|
||||
"type": "vendor",
|
||||
"identifier": "ООО «1С-Софт» 1С:Предприятие",
|
||||
"platform": null,
|
||||
"versionRanges": [
|
||||
{
|
||||
"fixedVersion": null,
|
||||
"introducedVersion": null,
|
||||
"lastAffectedVersion": null,
|
||||
"primitives": null,
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "package-range",
|
||||
"value": "8.2.19.116",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[].versionranges[]"
|
||||
]
|
||||
},
|
||||
"rangeExpression": "8.2.19.116",
|
||||
"rangeKind": "string"
|
||||
}
|
||||
],
|
||||
"normalizedVersions": [
|
||||
{
|
||||
"scheme": "ru-bdu.raw",
|
||||
"type": "exact",
|
||||
"min": null,
|
||||
"minInclusive": null,
|
||||
"max": null,
|
||||
"maxInclusive": null,
|
||||
"value": "8.2.19.116",
|
||||
"notes": null
|
||||
}
|
||||
],
|
||||
"statuses": [
|
||||
{
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "package-status",
|
||||
"value": "Подтверждена производителем",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[].statuses[]"
|
||||
]
|
||||
},
|
||||
"status": "affected"
|
||||
},
|
||||
{
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "package-fix-status",
|
||||
"value": "Уязвимость устранена",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[].statuses[]"
|
||||
]
|
||||
},
|
||||
"status": "fixed"
|
||||
}
|
||||
],
|
||||
"provenance": [
|
||||
{
|
||||
"source": "ru-bdu",
|
||||
"kind": "package",
|
||||
"value": "ООО «1С-Софт» 1С:Предприятие",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[]"
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "vendor",
|
||||
"identifier": "ООО «1С-Софт» 1С:Предприятие",
|
||||
"platform": "Windows",
|
||||
"versionRanges": [
|
||||
{
|
||||
"fixedVersion": null,
|
||||
"introducedVersion": null,
|
||||
"lastAffectedVersion": null,
|
||||
"primitives": null,
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "package-range",
|
||||
"value": "8.2.18.96",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[].versionranges[]"
|
||||
]
|
||||
},
|
||||
"rangeExpression": "8.2.18.96",
|
||||
"rangeKind": "string"
|
||||
}
|
||||
],
|
||||
"normalizedVersions": [
|
||||
{
|
||||
"scheme": "ru-bdu.raw",
|
||||
"type": "exact",
|
||||
"min": null,
|
||||
"minInclusive": null,
|
||||
"max": null,
|
||||
"maxInclusive": null,
|
||||
"value": "8.2.18.96",
|
||||
"notes": null
|
||||
}
|
||||
],
|
||||
"statuses": [
|
||||
{
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "package-status",
|
||||
"value": "Подтверждена производителем",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[].statuses[]"
|
||||
]
|
||||
},
|
||||
"status": "affected"
|
||||
},
|
||||
{
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "package-fix-status",
|
||||
"value": "Уязвимость устранена",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[].statuses[]"
|
||||
]
|
||||
},
|
||||
"status": "fixed"
|
||||
}
|
||||
],
|
||||
"provenance": [
|
||||
{
|
||||
"source": "ru-bdu",
|
||||
"kind": "package",
|
||||
"value": "ООО «1С-Софт» 1С:Предприятие",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"affectedpackages[]"
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
],
|
||||
"aliases": [
|
||||
"BDU:2025-00001",
|
||||
"CVE-2009-3555",
|
||||
"CVE-2015-0206",
|
||||
"PT-2015-0206"
|
||||
],
|
||||
"credits": [],
|
||||
"cvssMetrics": [
|
||||
{
|
||||
"baseScore": 7.5,
|
||||
"baseSeverity": "high",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "cvss",
|
||||
"value": "CVSS:2.0/AV:N/AC:L/AU:N/C:P/I:P/A:P",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"cvssmetrics[]"
|
||||
]
|
||||
},
|
||||
"vector": "CVSS:2.0/AV:N/AC:L/AU:N/C:P/I:P/A:P",
|
||||
"version": "2.0"
|
||||
},
|
||||
{
|
||||
"baseScore": 9.8,
|
||||
"baseSeverity": "critical",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "cvss",
|
||||
"value": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"cvssmetrics[]"
|
||||
]
|
||||
},
|
||||
"vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
|
||||
"version": "3.1"
|
||||
}
|
||||
],
|
||||
"exploitKnown": true,
|
||||
"language": "ru",
|
||||
"modified": "2013-01-12T00:00:00+00:00",
|
||||
"provenance": [
|
||||
{
|
||||
"source": "ru-bdu",
|
||||
"kind": "advisory",
|
||||
"value": "BDU:2025-00001",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"advisory"
|
||||
]
|
||||
}
|
||||
],
|
||||
"published": "2013-01-12T00:00:00+00:00",
|
||||
"references": [
|
||||
{
|
||||
"kind": "source",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "reference",
|
||||
"value": "http://mirror.example/ru-bdu/BDU-2025-00001",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"references[]"
|
||||
]
|
||||
},
|
||||
"sourceTag": "ru-bdu",
|
||||
"summary": null,
|
||||
"url": "http://mirror.example/ru-bdu/BDU-2025-00001"
|
||||
},
|
||||
{
|
||||
"kind": "source",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "reference",
|
||||
"value": "https://advisories.example/BDU-2025-00001",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"references[]"
|
||||
]
|
||||
},
|
||||
"sourceTag": "ru-bdu",
|
||||
"summary": null,
|
||||
"url": "https://advisories.example/BDU-2025-00001"
|
||||
},
|
||||
{
|
||||
"kind": "details",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "reference",
|
||||
"value": "https://bdu.fstec.ru/vul/2025-00001",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"references[]"
|
||||
]
|
||||
},
|
||||
"sourceTag": "ru-bdu",
|
||||
"summary": null,
|
||||
"url": "https://bdu.fstec.ru/vul/2025-00001"
|
||||
},
|
||||
{
|
||||
"kind": "cwe",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "reference",
|
||||
"value": "https://cwe.mitre.org/data/definitions/310.html",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"references[]"
|
||||
]
|
||||
},
|
||||
"sourceTag": "cwe",
|
||||
"summary": "Проблемы использования криптографии",
|
||||
"url": "https://cwe.mitre.org/data/definitions/310.html"
|
||||
},
|
||||
{
|
||||
"kind": "cve",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "reference",
|
||||
"value": "https://nvd.nist.gov/vuln/detail/CVE-2009-3555",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"references[]"
|
||||
]
|
||||
},
|
||||
"sourceTag": "cve",
|
||||
"summary": "CVE-2009-3555",
|
||||
"url": "https://nvd.nist.gov/vuln/detail/CVE-2009-3555"
|
||||
},
|
||||
{
|
||||
"kind": "cve",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "reference",
|
||||
"value": "https://nvd.nist.gov/vuln/detail/CVE-2015-0206",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"references[]"
|
||||
]
|
||||
},
|
||||
"sourceTag": "cve",
|
||||
"summary": "CVE-2015-0206",
|
||||
"url": "https://nvd.nist.gov/vuln/detail/CVE-2015-0206"
|
||||
},
|
||||
{
|
||||
"kind": "external",
|
||||
"provenance": {
|
||||
"source": "ru-bdu",
|
||||
"kind": "reference",
|
||||
"value": "https://ptsecurity.com/PT-2015-0206",
|
||||
"decisionReason": null,
|
||||
"recordedAt": "2025-10-14T08:00:00+00:00",
|
||||
"fieldMask": [
|
||||
"references[]"
|
||||
]
|
||||
},
|
||||
"sourceTag": "positivetechnologiesadvisory",
|
||||
"summary": "PT-2015-0206",
|
||||
"url": "https://ptsecurity.com/PT-2015-0206"
|
||||
}
|
||||
],
|
||||
"severity": "critical",
|
||||
"summary": "Удалённый злоумышленник может вызвать отказ в обслуживании или получить доступ к данным.",
|
||||
"title": "Множественные уязвимости криптопровайдера"
|
||||
}
]
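In the BDU fixtures, the canonical records carry version-prefixed vectors (`CVSS:2.0/AV:N/...`, `CVSS:3.1/AV:N/...`), while the connector's DTO payload stores the bare vectors as published (`AV:N/AC:L/Au:N/...`). A hedged sketch of that normalization step (Python; the helper name is an assumption, not the connector's actual API):

```python
def normalize_bdu_vector(vector, version):
    # Prefix a bare BDU vector with its CVSS version so canonical records
    # are self-describing; already-prefixed vectors pass through unchanged.
    if not vector:
        return None
    if vector.startswith("CVSS:"):
        return vector
    return f"CVSS:{version}/{vector}"
```

Applied to the fixture's `cvssVector`, this turns `AV:N/AC:L/Au:N/C:P/I:P/A:P` into the canonical `CVSS:2.0/AV:N/AC:L/Au:N/C:P/I:P/A:P`.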
|
||||
@@ -1,11 +1,11 @@
[
  {
    "metadata": {
      "ru-bdu.identifier": "BDU:2025-00001",
      "ru-bdu.name": "Множественные уязвимости криптопровайдера"
    },
    "sha256": "c43df9c4a75a74b281ff09122bb8f63096a0a73b30df74d73c3bc997019bd4d4",
    "status": "mapped",
    "uri": "https://bdu.fstec.ru/vul/2025-00001"
  }
]
|
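The `sha256` recorded in the snapshot above is the hex digest of the raw fetched document, which lets regression runs detect fixture drift. A minimal sketch of computing such a digest (Python; the helper name is illustrative):

```python
import hashlib


def document_digest(raw_bytes):
    # The snapshot's "sha256" field holds the lowercase hex digest
    # of the document bytes exactly as fetched.
    return hashlib.sha256(raw_bytes).hexdigest()
```

Re-fetching `https://bdu.fstec.ru/vul/2025-00001` and comparing `document_digest(body)` against the stored value is enough to decide whether the fixture needs refreshing.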
||||
@@ -1,86 +1,86 @@
[
  {
    "documentUri": "https://bdu.fstec.ru/vul/2025-00001",
    "payload": {
      "identifier": "BDU:2025-00001",
      "name": "Множественные уязвимости криптопровайдера",
      "description": "Удалённый злоумышленник может вызвать отказ в обслуживании или получить доступ к данным.",
      "solution": "Установить обновление 8.2.19.116 защищённого комплекса.",
      "identifyDate": "2013-01-12T00:00:00+00:00",
      "severityText": "Высокий уровень опасности (базовая оценка CVSS 2.0 составляет 7,5)",
      "cvssVector": "AV:N/AC:L/Au:N/C:P/I:P/A:P",
      "cvssScore": 7.5,
      "cvss3Vector": "AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
      "cvss3Score": 9.8,
      "exploitStatus": "Существует в открытом доступе",
      "incidentCount": 0,
      "fixStatus": "Уязвимость устранена",
      "vulStatus": "Подтверждена производителем",
      "vulClass": "Уязвимость кода",
      "vulState": "Опубликована",
      "other": "Язык разработки ПО – С",
      "software": [
        {
          "vendor": "ООО «1С-Софт»",
          "name": "1С:Предприятие",
          "version": "8.2.18.96",
          "platform": "Windows",
          "types": [
            "Прикладное ПО информационных систем"
          ]
        },
        {
          "vendor": "ООО «1С-Софт»",
          "name": "1С:Предприятие",
          "version": "8.2.19.116",
          "platform": "Не указана",
          "types": [
            "Прикладное ПО информационных систем"
          ]
        }
      ],
      "environment": [
        {
          "vendor": "Microsoft Corp",
          "name": "Windows",
          "version": "-",
          "platform": "64-bit"
        },
        {
          "vendor": "Microsoft Corp",
          "name": "Windows",
          "version": "-",
          "platform": "32-bit"
        }
      ],
      "cwes": [
        {
          "identifier": "CWE-310",
          "name": "Проблемы использования криптографии"
        }
      ],
      "sources": [
        "https://advisories.example/BDU-2025-00001",
        "http://mirror.example/ru-bdu/BDU-2025-00001"
      ],
      "identifiers": [
        {
          "type": "CVE",
          "value": "CVE-2015-0206",
          "link": "https://nvd.nist.gov/vuln/detail/CVE-2015-0206"
        },
        {
          "type": "CVE",
          "value": "CVE-2009-3555",
          "link": "https://nvd.nist.gov/vuln/detail/CVE-2009-3555"
        },
        {
          "type": "Positive Technologies Advisory",
          "value": "PT-2015-0206",
          "link": "https://ptsecurity.com/PT-2015-0206"
        }
      ]
    },
    "schemaVersion": "ru-bdu.v1"
  }
|
||||
[
|
||||
{
|
||||
"documentUri": "https://bdu.fstec.ru/vul/2025-00001",
|
||||
"payload": {
|
||||
"identifier": "BDU:2025-00001",
|
||||
"name": "Множественные уязвимости криптопровайдера",
|
||||
"description": "Удалённый злоумышленник может вызвать отказ в обслуживании или получить доступ к данным.",
|
||||
"solution": "Установить обновление 8.2.19.116 защищённого комплекса.",
|
||||
"identifyDate": "2013-01-12T00:00:00+00:00",
|
||||
"severityText": "Высокий уровень опасности (базовая оценка CVSS 2.0 составляет 7,5)",
|
||||
"cvssVector": "AV:N/AC:L/Au:N/C:P/I:P/A:P",
|
||||
"cvssScore": 7.5,
|
||||
"cvss3Vector": "AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H",
|
||||
"cvss3Score": 9.8,
|
||||
"exploitStatus": "Существует в открытом доступе",
|
||||
"incidentCount": 0,
|
||||
"fixStatus": "Уязвимость устранена",
|
||||
"vulStatus": "Подтверждена производителем",
|
||||
"vulClass": "Уязвимость кода",
|
||||
"vulState": "Опубликована",
|
||||
"other": "Язык разработки ПО – С",
|
||||
"software": [
|
||||
{
|
||||
"vendor": "ООО «1С-Софт»",
|
||||
"name": "1С:Предприятие",
|
||||
"version": "8.2.18.96",
|
||||
"platform": "Windows",
|
||||
"types": [
|
||||
"Прикладное ПО информационных систем"
|
||||
]
|
||||
},
|
||||
{
|
||||
"vendor": "ООО «1С-Софт»",
|
||||
"name": "1С:Предприятие",
|
||||
"version": "8.2.19.116",
|
||||
"platform": "Не указана",
|
||||
"types": [
|
||||
"Прикладное ПО информационных систем"
|
||||
]
|
||||
}
|
||||
],
|
||||
"environment": [
|
||||
{
|
||||
"vendor": "Microsoft Corp",
|
||||
"name": "Windows",
|
||||
"version": "-",
|
||||
"platform": "64-bit"
|
||||
},
|
||||
{
|
||||
"vendor": "Microsoft Corp",
|
||||
"name": "Windows",
|
||||
"version": "-",
|
||||
"platform": "32-bit"
|
||||
}
|
||||
],
|
||||
"cwes": [
|
||||
{
|
||||
"identifier": "CWE-310",
|
||||
"name": "Проблемы использования криптографии"
|
||||
}
|
||||
],
|
||||
"sources": [
|
||||
"https://advisories.example/BDU-2025-00001",
|
||||
"http://mirror.example/ru-bdu/BDU-2025-00001"
|
||||
],
|
||||
"identifiers": [
|
||||
{
|
||||
"type": "CVE",
|
||||
"value": "CVE-2015-0206",
|
||||
"link": "https://nvd.nist.gov/vuln/detail/CVE-2015-0206"
|
||||
},
|
||||
{
|
||||
"type": "CVE",
|
||||
"value": "CVE-2009-3555",
|
||||
"link": "https://nvd.nist.gov/vuln/detail/CVE-2009-3555"
|
||||
},
|
||||
{
|
||||
"type": "Positive Technologies Advisory",
|
||||
"value": "PT-2015-0206",
|
||||
"link": "https://ptsecurity.com/PT-2015-0206"
|
||||
}
|
||||
]
|
||||
},
|
||||
"schemaVersion": "ru-bdu.v1"
|
||||
}
|
||||
]
|
||||
@@ -1,11 +1,11 @@
[
  {
    "headers": {
      "accept": "application/zip,application/octet-stream,application/x-zip-compressed",
      "accept-Language": "ru-RU,ru; q=0.9,en-US; q=0.6,en; q=0.4",
      "user-Agent": "StellaOps/Feedser,(+https://stella-ops.org)"
    },
    "method": "GET",
    "uri": "https://bdu.fstec.ru/files/documents/vulxml.zip"
  }
]
@@ -1,5 +1,5 @@
{
  "lastSuccessfulFetch": "2025-10-14T08:00:00.0000000+00:00",
  "pendingDocuments": [],
  "pendingMappings": []
}
@@ -1,495 +1,495 @@
[
  {
    "advisoryKey": "BDU:2025-01001",
    "affectedPackages": [
      {
        "type": "ics-vendor",
        "identifier": "SampleVendor SampleGateway",
        "platform": "Energy, ICS",
        "versionRanges": [
          {
            "fixedVersion": null,
            "introducedVersion": "2.0",
            "lastAffectedVersion": null,
            "primitives": {
              "evr": null,
              "hasVendorExtensions": false,
              "nevra": null,
              "semVer": {
                "constraintExpression": ">= 2.0",
                "exactValue": null,
                "fixed": null,
                "fixedInclusive": false,
                "introduced": "2.0",
                "introducedInclusive": true,
                "lastAffected": null,
                "lastAffectedInclusive": false,
                "style": "greaterThanOrEqual"
              },
              "vendorExtensions": null
            },
            "provenance": {
              "source": "ru-nkcki",
              "kind": "package-range",
              "value": "SampleVendor SampleGateway >= 2.0 All platforms",
              "decisionReason": null,
              "recordedAt": "2025-10-12T00:01:00+00:00",
              "fieldMask": [
                "affectedpackages[].versionranges[]"
              ]
            },
            "rangeExpression": ">= 2.0",
            "rangeKind": "semver"
          }
        ],
        "normalizedVersions": [
          {
            "scheme": "semver",
            "type": "gte",
            "min": "2.0",
            "minInclusive": true,
            "max": null,
            "maxInclusive": null,
            "value": null,
            "notes": "SampleVendor SampleGateway >= 2.0 All platforms"
          }
        ],
        "statuses": [
          {
            "provenance": {
              "source": "ru-nkcki",
              "kind": "package-status",
              "value": "patch_available",
              "decisionReason": null,
              "recordedAt": "2025-10-12T00:01:00+00:00",
              "fieldMask": [
                "affectedpackages[].statuses[]"
              ]
            },
            "status": "fixed"
          }
        ],
        "provenance": [
          {
            "source": "ru-nkcki",
            "kind": "package",
            "value": "SampleVendor SampleGateway >= 2.0 All platforms",
            "decisionReason": null,
            "recordedAt": "2025-10-12T00:01:00+00:00",
            "fieldMask": [
              "affectedpackages[]"
            ]
          }
        ]
      },
      {
        "type": "ics-vendor",
        "identifier": "SampleVendor SampleSCADA",
        "platform": "Energy, ICS",
        "versionRanges": [
          {
            "fixedVersion": null,
            "introducedVersion": null,
            "lastAffectedVersion": "4.2",
            "primitives": {
              "evr": null,
              "hasVendorExtensions": false,
              "nevra": null,
              "semVer": {
                "constraintExpression": "<= 4.2",
                "exactValue": null,
                "fixed": null,
                "fixedInclusive": false,
                "introduced": null,
                "introducedInclusive": true,
                "lastAffected": "4.2",
                "lastAffectedInclusive": true,
                "style": "lessThanOrEqual"
              },
              "vendorExtensions": null
            },
            "provenance": {
              "source": "ru-nkcki",
              "kind": "package-range",
              "value": "SampleVendor SampleSCADA <= 4.2",
              "decisionReason": null,
              "recordedAt": "2025-10-12T00:01:00+00:00",
              "fieldMask": [
                "affectedpackages[].versionranges[]"
              ]
            },
            "rangeExpression": "<= 4.2",
            "rangeKind": "semver"
          }
        ],
        "normalizedVersions": [
          {
            "scheme": "semver",
            "type": "lte",
            "min": null,
            "minInclusive": null,
            "max": "4.2",
            "maxInclusive": true,
            "value": null,
            "notes": "SampleVendor SampleSCADA <= 4.2"
          }
        ],
        "statuses": [
          {
            "provenance": {
              "source": "ru-nkcki",
              "kind": "package-status",
              "value": "patch_available",
              "decisionReason": null,
              "recordedAt": "2025-10-12T00:01:00+00:00",
              "fieldMask": [
                "affectedpackages[].statuses[]"
              ]
            },
            "status": "fixed"
          }
        ],
        "provenance": [
          {
            "source": "ru-nkcki",
            "kind": "package",
            "value": "SampleVendor SampleSCADA <= 4.2",
            "decisionReason": null,
            "recordedAt": "2025-10-12T00:01:00+00:00",
            "fieldMask": [
              "affectedpackages[]"
            ]
          }
        ]
      }
    ],
    "aliases": [
      "BDU:2025-01001",
      "CVE-2025-0101"
    ],
    "credits": [],
    "cvssMetrics": [
      {
        "baseScore": 8.5,
        "baseSeverity": "high",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "cvss",
          "value": "CVSS:3.1/AV:N/AC:H/PR:L/UI:N/S:C/C:H/I:H/A:H",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "cvssmetrics[]"
          ]
        },
        "vector": "CVSS:3.1/AV:N/AC:H/PR:L/UI:N/S:C/C:H/I:H/A:H",
        "version": "3.1"
      },
      {
        "baseScore": 6.4,
        "baseSeverity": "medium",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "cvss",
          "value": "CVSS:4.0/AV:N/AC:H/AT:N/PR:L/UI:N/VC:H/VI:H/VA:H",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "cvssmetrics[]"
          ]
        },
        "vector": "CVSS:4.0/AV:N/AC:H/AT:N/PR:L/UI:N/VC:H/VI:H/VA:H",
        "version": "4.0"
      }
    ],
    "exploitKnown": true,
    "language": "ru",
    "modified": "2025-09-22T00:00:00+00:00",
    "provenance": [
      {
        "source": "ru-nkcki",
        "kind": "advisory",
        "value": "BDU:2025-01001",
        "decisionReason": null,
        "recordedAt": "2025-10-12T00:01:00+00:00",
        "fieldMask": [
          "advisory"
        ]
      }
    ],
    "published": "2025-09-20T00:00:00+00:00",
    "references": [
      {
        "kind": "details",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "reference",
          "value": "https://bdu.fstec.ru/vul/2025-01001",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "references[]"
          ]
        },
        "sourceTag": "bdu",
        "summary": null,
        "url": "https://bdu.fstec.ru/vul/2025-01001"
      },
      {
        "kind": "details",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "reference",
          "value": "https://cert.gov.ru/materialy/uyazvimosti/2025-01001",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "references[]"
          ]
        },
        "sourceTag": "ru-nkcki",
        "summary": null,
        "url": "https://cert.gov.ru/materialy/uyazvimosti/2025-01001"
      },
      {
        "kind": "cwe",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "reference",
          "value": "https://cwe.mitre.org/data/definitions/321.html",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "references[]"
          ]
        },
        "sourceTag": "cwe",
        "summary": "Use of Hard-coded Cryptographic Key",
        "url": "https://cwe.mitre.org/data/definitions/321.html"
      },
      {
        "kind": "external",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "reference",
          "value": "https://vendor.example/advisories/sample-scada",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "references[]"
          ]
        },
        "sourceTag": null,
        "summary": null,
        "url": "https://vendor.example/advisories/sample-scada"
      }
    ],
    "severity": "critical",
    "summary": "Authenticated RCE in Sample SCADA",
    "title": "Authenticated RCE in Sample SCADA"
  },
  {
    "advisoryKey": "BDU:2024-00011",
    "affectedPackages": [
      {
        "type": "cpe",
        "identifier": "LegacyPanel",
        "platform": "Software",
        "versionRanges": [
          {
            "fixedVersion": null,
            "introducedVersion": null,
            "lastAffectedVersion": "2.5",
            "primitives": {
              "evr": null,
              "hasVendorExtensions": false,
              "nevra": null,
              "semVer": {
                "constraintExpression": "<= 2.5",
                "exactValue": null,
                "fixed": null,
                "fixedInclusive": false,
                "introduced": null,
                "introducedInclusive": true,
                "lastAffected": "2.5",
                "lastAffectedInclusive": true,
                "style": "lessThanOrEqual"
              },
              "vendorExtensions": null
            },
            "provenance": {
              "source": "ru-nkcki",
              "kind": "package-range",
              "value": "LegacyPanel 1.0 - 2.5",
              "decisionReason": null,
              "recordedAt": "2025-10-12T00:01:00+00:00",
              "fieldMask": [
                "affectedpackages[].versionranges[]"
              ]
            },
            "rangeExpression": "<= 2.5",
            "rangeKind": "semver"
          },
          {
            "fixedVersion": null,
            "introducedVersion": "1.0",
            "lastAffectedVersion": null,
            "primitives": {
              "evr": null,
              "hasVendorExtensions": false,
              "nevra": null,
              "semVer": {
                "constraintExpression": ">= 1.0",
                "exactValue": null,
                "fixed": null,
                "fixedInclusive": false,
                "introduced": "1.0",
                "introducedInclusive": true,
                "lastAffected": null,
                "lastAffectedInclusive": false,
                "style": "greaterThanOrEqual"
              },
              "vendorExtensions": null
            },
            "provenance": {
              "source": "ru-nkcki",
              "kind": "package-range",
              "value": "LegacyPanel 1.0 - 2.5",
              "decisionReason": null,
              "recordedAt": "2025-10-12T00:01:00+00:00",
              "fieldMask": [
                "affectedpackages[].versionranges[]"
              ]
            },
            "rangeExpression": ">= 1.0",
            "rangeKind": "semver"
          }
        ],
        "normalizedVersions": [
          {
            "scheme": "semver",
            "type": "gte",
            "min": "1.0",
            "minInclusive": true,
            "max": null,
            "maxInclusive": null,
            "value": null,
            "notes": "LegacyPanel 1.0 - 2.5"
          },
          {
            "scheme": "semver",
            "type": "lte",
            "min": null,
            "minInclusive": null,
            "max": "2.5",
            "maxInclusive": true,
            "value": null,
            "notes": "LegacyPanel 1.0 - 2.5"
          }
        ],
        "statuses": [
          {
            "provenance": {
              "source": "ru-nkcki",
              "kind": "package-status",
              "value": "affected",
              "decisionReason": null,
              "recordedAt": "2025-10-12T00:01:00+00:00",
              "fieldMask": [
                "affectedpackages[].statuses[]"
              ]
            },
            "status": "affected"
          }
        ],
        "provenance": [
          {
            "source": "ru-nkcki",
            "kind": "package",
            "value": "LegacyPanel 1.0 - 2.5",
            "decisionReason": null,
            "recordedAt": "2025-10-12T00:01:00+00:00",
            "fieldMask": [
              "affectedpackages[]"
            ]
          }
        ]
      }
    ],
    "aliases": [
      "BDU:2024-00011"
    ],
    "credits": [],
    "cvssMetrics": [
      {
        "baseScore": 8.8,
        "baseSeverity": "high",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "cvss",
          "value": "CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "cvssmetrics[]"
          ]
        },
        "vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:U/C:H/I:H/A:H",
        "version": "3.1"
      }
    ],
    "exploitKnown": true,
    "language": "ru",
    "modified": "2024-08-02T00:00:00+00:00",
    "provenance": [
      {
        "source": "ru-nkcki",
        "kind": "advisory",
        "value": "BDU:2024-00011",
        "decisionReason": null,
        "recordedAt": "2025-10-12T00:01:00+00:00",
        "fieldMask": [
          "advisory"
        ]
      }
    ],
    "published": "2024-08-01T00:00:00+00:00",
    "references": [
      {
        "kind": "details",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "reference",
          "value": "https://bdu.fstec.ru/vul/2024-00011",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "references[]"
          ]
        },
        "sourceTag": "bdu",
        "summary": null,
        "url": "https://bdu.fstec.ru/vul/2024-00011"
      },
      {
        "kind": "details",
        "provenance": {
          "source": "ru-nkcki",
          "kind": "reference",
          "value": "https://cert.gov.ru/materialy/uyazvimosti/2024-00011",
          "decisionReason": null,
          "recordedAt": "2025-10-12T00:01:00+00:00",
          "fieldMask": [
            "references[]"
          ]
        },
        "sourceTag": "ru-nkcki",
        "summary": null,
        "url": "https://cert.gov.ru/materialy/uyazvimosti/2024-00011"
      }
    ],
    "severity": "high",
    "summary": "Legacy panel overflow",
    "title": "Legacy panel overflow"
  }
]
@@ -145,12 +145,13 @@ public sealed class AdvisoryStore : IAdvisoryStore
         ?? throw new InvalidOperationException("advisoryKey missing from payload.");
     var title = payload.GetValue("title", defaultValue: null)?.AsString ?? advisoryKey;
 
     string? summary = payload.TryGetValue("summary", out var summaryValue) && summaryValue.IsString ? summaryValue.AsString : null;
+    string? description = payload.TryGetValue("description", out var descriptionValue) && descriptionValue.IsString ? descriptionValue.AsString : null;
     string? language = payload.TryGetValue("language", out var languageValue) && languageValue.IsString ? languageValue.AsString : null;
     DateTimeOffset? published = TryReadDateTime(payload, "published");
     DateTimeOffset? modified = TryReadDateTime(payload, "modified");
     string? severity = payload.TryGetValue("severity", out var severityValue) && severityValue.IsString ? severityValue.AsString : null;
     var exploitKnown = payload.TryGetValue("exploitKnown", out var exploitValue) && exploitValue.IsBoolean && exploitValue.AsBoolean;
 
     var aliases = payload.TryGetValue("aliases", out var aliasValue) && aliasValue is BsonArray aliasArray
         ? aliasArray.OfType<BsonValue>().Where(static x => x.IsString).Select(static x => x.AsString)
@@ -168,14 +169,22 @@ public sealed class AdvisoryStore : IAdvisoryStore
|
||||
? affectedArray.OfType<BsonDocument>().Select(DeserializeAffectedPackage).ToArray()
|
||||
: Array.Empty<AffectedPackage>();
|
||||
|
||||
var cvssMetrics = payload.TryGetValue("cvssMetrics", out var cvssValue) && cvssValue is BsonArray cvssArray
|
||||
? cvssArray.OfType<BsonDocument>().Select(DeserializeCvssMetric).ToArray()
|
||||
: Array.Empty<CvssMetric>();
|
||||
|
||||
var provenance = payload.TryGetValue("provenance", out var provenanceValue) && provenanceValue is BsonArray provenanceArray
|
||||
? provenanceArray.OfType<BsonDocument>().Select(DeserializeProvenance).ToArray()
|
||||
: Array.Empty<AdvisoryProvenance>();
|
||||
|
||||
var cvssMetrics = payload.TryGetValue("cvssMetrics", out var cvssValue) && cvssValue is BsonArray cvssArray
|
||||
? cvssArray.OfType<BsonDocument>().Select(DeserializeCvssMetric).ToArray()
|
||||
: Array.Empty<CvssMetric>();
|
||||
|
||||
var cwes = payload.TryGetValue("cwes", out var cweValue) && cweValue is BsonArray cweArray
|
||||
? cweArray.OfType<BsonDocument>().Select(DeserializeWeakness).ToArray()
|
||||
: Array.Empty<AdvisoryWeakness>();
|
||||
|
||||
string? canonicalMetricId = payload.TryGetValue("canonicalMetricId", out var canonicalMetricValue) && canonicalMetricValue.IsString
|
||||
? canonicalMetricValue.AsString
|
||||
: null;
|
||||
|
||||
var provenance = payload.TryGetValue("provenance", out var provenanceValue) && provenanceValue is BsonArray provenanceArray
|
||||
? provenanceArray.OfType<BsonDocument>().Select(DeserializeProvenance).ToArray()
|
||||
: Array.Empty<AdvisoryProvenance>();
|
||||
|
||||
return new Advisory(
|
||||
advisoryKey,
|
||||
title,
|
||||
@@ -190,7 +199,10 @@ public sealed class AdvisoryStore : IAdvisoryStore
|
||||
references,
|
||||
affectedPackages,
|
||||
cvssMetrics,
|
||||
provenance);
|
||||
provenance,
|
||||
description,
|
||||
cwes,
|
||||
canonicalMetricId);
|
||||
}
|
||||
|
||||
private static AdvisoryReference DeserializeReference(BsonDocument document)
|
||||
@@ -280,15 +292,15 @@ public sealed class AdvisoryStore : IAdvisoryStore
|
||||
return new AffectedPackageStatus(status, provenance);
|
||||
}
|
||||
|
||||
private static CvssMetric DeserializeCvssMetric(BsonDocument document)
|
||||
{
|
||||
var version = document.GetValue("version", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cvssMetrics.version missing from payload.");
|
||||
var vector = document.GetValue("vector", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cvssMetrics.vector missing from payload.");
|
||||
var baseScore = document.TryGetValue("baseScore", out var scoreValue) && scoreValue.IsNumeric ? scoreValue.ToDouble() : 0d;
|
||||
var baseSeverity = document.GetValue("baseSeverity", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cvssMetrics.baseSeverity missing from payload.");
|
||||
private static CvssMetric DeserializeCvssMetric(BsonDocument document)
|
||||
{
|
||||
var version = document.GetValue("version", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cvssMetrics.version missing from payload.");
|
||||
var vector = document.GetValue("vector", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cvssMetrics.vector missing from payload.");
|
||||
var baseScore = document.TryGetValue("baseScore", out var scoreValue) && scoreValue.IsNumeric ? scoreValue.ToDouble() : 0d;
|
||||
var baseSeverity = document.GetValue("baseSeverity", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cvssMetrics.baseSeverity missing from payload.");
|
||||
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
|
||||
? DeserializeProvenance(provenanceValue.AsBsonDocument)
|
||||
: AdvisoryProvenance.Empty;
|
||||
@@ -296,6 +308,21 @@ public sealed class AdvisoryStore : IAdvisoryStore
|
||||
return new CvssMetric(version, vector, baseScore, baseSeverity, provenance);
|
||||
}
|
||||
|
||||
private static AdvisoryWeakness DeserializeWeakness(BsonDocument document)
|
||||
{
|
||||
var taxonomy = document.GetValue("taxonomy", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cwes.taxonomy missing from payload.");
|
||||
var identifier = document.GetValue("identifier", defaultValue: null)?.AsString
|
||||
?? throw new InvalidOperationException("cwes.identifier missing from payload.");
|
||||
string? name = document.TryGetValue("name", out var nameValue) && nameValue.IsString ? nameValue.AsString : null;
|
||||
string? uri = document.TryGetValue("uri", out var uriValue) && uriValue.IsString ? uriValue.AsString : null;
|
||||
var provenance = document.TryGetValue("provenance", out var provenanceValue) && provenanceValue.IsBsonDocument
|
||||
? DeserializeProvenance(provenanceValue.AsBsonDocument)
|
||||
: AdvisoryProvenance.Empty;
|
||||
|
||||
return new AdvisoryWeakness(taxonomy, identifier, name, uri, new[] { provenance });
|
||||
}
|
||||
|
||||
private static AdvisoryProvenance DeserializeProvenance(BsonDocument document)
|
||||
{
|
||||
var source = document.GetValue("source", defaultValue: null)?.AsString
|
||||
|
||||
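The repeated `TryGetValue` guards in this hunk all follow one pattern: probe an optional field, check its BSON type, and fall back to `null` or an empty array. A minimal sketch of how that pattern could be factored, assuming only MongoDB.Bson (the helper names `ReadOptionalString` and `ReadDocumentArray` are illustrative, not part of the actual diff):

```csharp
using System;
using System.Linq;
using MongoDB.Bson;

internal static class BsonReadHelpers
{
    // Returns the string value when the field is present and string-typed; otherwise null.
    // Mirrors the repeated TryGetValue/IsString guard used for summary, description, etc.
    public static string? ReadOptionalString(BsonDocument payload, string field)
        => payload.TryGetValue(field, out var value) && value.IsString ? value.AsString : null;

    // Deserializes an optional array of sub-documents with a per-item mapper,
    // falling back to an empty array when the field is absent or mistyped.
    public static T[] ReadDocumentArray<T>(BsonDocument payload, string field, Func<BsonDocument, T> map)
        => payload.TryGetValue(field, out var value) && value is BsonArray array
            ? array.OfType<BsonDocument>().Select(map).ToArray()
            : Array.Empty<T>();
}
```

With helpers like these, the new `description`, `cwes`, and `canonicalMetricId` reads would each collapse to a single call, though the diff keeps the inline form for consistency with the surrounding code.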
@@ -19,3 +19,4 @@
 |FEEDSTORAGE-TESTS-02-004 Restore AdvisoryStore build after normalized versions refactor|QA|Storage.Mongo|DONE – storage tests updated to cover normalized version payloads and new provenance fields. **Heads-up:** QA to watch for fixture bumps touching normalized rule arrays when connectors roll out support.|
 |FEEDSTORAGE-DATA-02-002 Provenance decision persistence|BE-Storage|Models `FEEDMODELS-SCHEMA-01-002`|**DONE (2025-10-12)** – Normalized documents carry decision reasons/source/timestamps with regression coverage verifying SemVer notes + provenance fallbacks.|
 |FEEDSTORAGE-DATA-02-003 Normalized versions index creation|BE-Storage|Normalization, Mongo bootstrapper|**DONE (2025-10-12)** – Bootstrapper seeds `normalizedVersions.*` indexes when SemVer style is enabled; docs/tests confirm index presence.|
+|FEEDSTORAGE-DATA-04-001 Advisory payload parity (description/CWEs/canonical metric)|BE-Storage|Models, Core|DONE (2025-10-15) – Mongo payloads round-trip new advisory fields; serializer/tests updated, no migration required beyond optional backfill.|
23
src/StellaOps.Vexer.Attestation/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Builds and verifies in-toto/DSSE attestations for Vexer exports and integrates with Rekor v2 transparency logs.
## Scope
- Attestation envelope builders, signing workflows (keyless/keyed), and predicate model definitions.
- Rekor v2 client implementation (submit, verify, poll inclusion) with retry/backoff policies.
- Verification utilities reused by Worker for periodic revalidation.
- Configuration bindings for signer identity, Rekor endpoints, and offline bundle operation.
## Participants
- Export module calls into this layer to generate attestations after export artifacts are produced.
- WebService and Worker consume verification helpers to ensure stored envelopes remain valid.
- CLI `vexer verify` leverages verification services through WebService endpoints.
## Interfaces & contracts
- `IExportAttestor`, `ITransparencyLogClient`, predicate DTOs, and verification result records.
- Extension methods to register attestation services in DI across WebService/Worker.
## In/Out of scope
In: attestation creation, verification, Rekor integration, signer configuration.
Out: export artifact generation, storage persistence, CLI interaction layers.
## Observability & security expectations
- Structured logs for signing/verification with envelope digest, Rekor URI, and latency; never log private keys.
- Metrics for attestation successes/failures and Rekor submission durations.
## Tests
- Unit tests and integration stubs (with fake Rekor) will live in `../StellaOps.Vexer.Attestation.Tests`.
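The contracts named under "Interfaces & contracts" above might take a shape like the following. This is a hedged sketch only: `IExportAttestor` and `ITransparencyLogClient` are named in the module doc, but every member signature and the `AttestationEnvelope` record are assumptions, not the module's actual API.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Placeholder for an export artifact produced by the Export module (assumed type).
public sealed record ExportArtifact(string Digest, byte[] Content);

// DSSE-style envelope: payload type, payload bytes, and one or more signatures.
public sealed record AttestationEnvelope(string PayloadType, byte[] Payload, IReadOnlyList<byte[]> Signatures);

public interface IExportAttestor
{
    // Builds and signs a DSSE envelope for a finished export artifact.
    Task<AttestationEnvelope> AttestAsync(ExportArtifact artifact, CancellationToken ct = default);
}

public interface ITransparencyLogClient
{
    // Submits an envelope to Rekor v2 and returns the log entry location.
    Task<Uri> SubmitAsync(AttestationEnvelope envelope, CancellationToken ct = default);

    // Checks that a previously submitted envelope is still included in the log,
    // supporting the Worker's periodic revalidation described above.
    Task<bool> VerifyInclusionAsync(Uri entryUri, CancellationToken ct = default);
}
```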
7
src/StellaOps.Vexer.Attestation/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-ATTEST-01-001 – In-toto predicate & DSSE builder|Team Vexer Attestation|VEXER-CORE-01-001|TODO – Implement export attestation predicates and DSSE envelope builder with deterministic hashing and signer abstraction.|
|VEXER-ATTEST-01-002 – Rekor v2 client integration|Team Vexer Attestation|VEXER-ATTEST-01-001|TODO – Provide `ITransparencyLogClient` with submit/verify operations, retries, and offline queue fallback matching architecture guidance.|
|VEXER-ATTEST-01-003 – Verification suite & observability|Team Vexer Attestation|VEXER-ATTEST-01-002|TODO – Add verification helpers for Worker/WebService, metrics/logging hooks, and negative-path regression tests.|
22
src/StellaOps.Vexer.Connectors.Abstractions/AGENTS.md
Normal file
@@ -0,0 +1,22 @@
# AGENTS
## Role
Defines shared connector infrastructure for Vexer, including base contexts, result contracts, configuration binding, and helper utilities reused by all connector plug-ins.
## Scope
- `IVexConnector` context implementation, raw store helpers, verification hooks, and telemetry utilities.
- Configuration primitives (YAML parsing, secrets handling guidelines) and options validation.
- Connector lifecycle helpers for retries, paging, `.well-known` discovery, and resume markers.
- Documentation for connector packaging, plugin manifest metadata, and DI registration.
## Participants
- All Vexer connector projects reference this module to obtain base classes and context services.
- WebService/Worker instantiate connectors via plugin loader leveraging abstractions defined here.
## Interfaces & contracts
- Connector context, result, and telemetry interfaces; `ConnectorDescriptor`, `ConnectorOptions`, authentication helpers.
- Utility classes for HTTP clients, throttling, and deterministic logging.
## In/Out of scope
In: shared abstractions, helper utilities, configuration binding, documentation for connector authors.
Out: provider-specific logic (implemented in individual connector modules), storage persistence, HTTP host code.
## Observability & security expectations
- Provide structured logging helpers, correlation IDs, and metrics instrumentation toggles for connectors.
- Enforce redaction of secrets in logs and config dumps.
## Tests
- Abstraction/unit tests will live in `../StellaOps.Vexer.Connectors.Abstractions.Tests`, covering default behaviors and sample harness.
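From a connector author's point of view, the shared abstractions above could look roughly like this. A hedged sketch: only `IVexConnector` is named by the module doc; the context members, `RawVexDocument`, and the persist/dedupe contract are assumptions for illustration.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Raw document handed to storage before normalization (assumed shape).
public sealed record RawVexDocument(string SourceUri, string Sha256, byte[] Content);

// Services the host supplies to every connector run (assumed members).
public sealed class VexConnectorContext
{
    // Opaque checkpoint from the previous run (ETag, cursor, or timestamp).
    public string? ResumeMarker { get; init; }

    // Stores a raw document; returns false when it was deduplicated away.
    public Func<RawVexDocument, Task<bool>> PersistAsync { get; init; }
        = _ => Task.FromResult(true);
}

public interface IVexConnector
{
    string ProviderId { get; }

    // Pulls raw documents since the last resume marker and persists them via the context.
    Task FetchAsync(VexConnectorContext context, CancellationToken ct = default);
}
```

Provider-specific connectors (Red Hat, Cisco, MSRC, and the rest below) would then only implement discovery, paging, and metadata mapping on top of this base.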
7
src/StellaOps.Vexer.Connectors.Abstractions/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-ABS-01-001 – Connector context & base classes|Team Vexer Connectors|VEXER-CORE-01-003|TODO – Implement `VexConnectorContext`, result types, helper base classes, and deterministic logging helpers for connectors.|
|VEXER-CONN-ABS-01-002 – YAML options & validation|Team Vexer Connectors|VEXER-CONN-ABS-01-001|TODO – Provide strongly-typed options binding/validation for connector YAML definitions with offline-safe defaults.|
|VEXER-CONN-ABS-01-003 – Plugin packaging & docs|Team Vexer Connectors|VEXER-CONN-ABS-01-001|TODO – Document connector packaging (NuGet manifest, plugin loader metadata) and supply reference templates for downstream connector modules.|
23
src/StellaOps.Vexer.Connectors.Cisco.CSAF/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Connector responsible for ingesting Cisco CSAF VEX advisories and handing raw documents to normalizers with Cisco-specific metadata.
## Scope
- Discovery of Cisco CSAF collection endpoints, authentication (when required), and pagination routines.
- HTTP retries/backoff, checksum verification, and document deduplication before storage.
- Mapping Cisco advisory identifiers, product hierarchies, and severity hints into connector metadata.
- Surfacing provider trust configuration aligned with policy expectations.
## Participants
- Worker drives scheduled pulls; WebService may trigger manual runs.
- CSAF normalizer consumes raw documents to emit claims.
- Policy module references connector trust hints (e.g., Cisco signing identities).
## Interfaces & contracts
- Implements `IVexConnector` using shared abstractions for HTTP/resume handling.
- Provides options for API tokens, rate limits, and concurrency.
## In/Out of scope
In: data fetching, provider metadata, retry controls, raw document persistence.
Out: normalization/export, attestation, Mongo wiring (handled in other modules).
## Observability & security expectations
- Log fetch batches with document counts/durations; mask credentials.
- Emit metrics for rate-limit hits, retries, and quarantine events.
## Tests
- Unit tests plus HTTP harness fixtures will live in `../StellaOps.Vexer.Connectors.Cisco.CSAF.Tests`.
7
src/StellaOps.Vexer.Connectors.Cisco.CSAF/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-CISCO-01-001 – Endpoint discovery & auth plumbing|Team Vexer Connectors – Cisco|VEXER-CONN-ABS-01-001|TODO – Resolve Cisco CSAF collection URLs, configure optional token auth, and validate discovery metadata for offline caching.|
|VEXER-CONN-CISCO-01-002 – CSAF pull loop & pagination|Team Vexer Connectors – Cisco|VEXER-CONN-CISCO-01-001, VEXER-STORAGE-01-003|TODO – Implement paginated fetch with retries/backoff, checksum validation, and raw document persistence.|
|VEXER-CONN-CISCO-01-003 – Provider trust metadata|Team Vexer Connectors – Cisco|VEXER-CONN-CISCO-01-002, VEXER-POLICY-01-001|TODO – Emit cosign/PGP trust metadata and advisory provenance hints for policy weighting.|
23
src/StellaOps.Vexer.Connectors.MSRC.CSAF/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Connector for Microsoft Security Response Center (MSRC) CSAF advisories, handling authenticated downloads, throttling, and raw document persistence.
## Scope
- MSRC API onboarding (AAD client credentials), metadata discovery, and CSAF listing retrieval.
- Download pipeline with retry/backoff, checksum validation, and document deduplication.
- Mapping MSRC-specific identifiers (CVE, ADV, KB) and remediation guidance into connector metadata.
- Emitting trust metadata (AAD issuer, signing certificates) for policy weighting.
## Participants
- Worker schedules MSRC pulls honoring rate limits; WebService may trigger manual runs for urgent updates.
- CSAF normalizer processes retrieved documents into claims.
- Policy subsystem references connector trust hints for consensus scoring.
## Interfaces & contracts
- Implements `IVexConnector`, requires configuration options for tenant/client/secret or managed identity.
- Uses shared HTTP helpers, resume markers, and telemetry from Abstractions module.
## In/Out of scope
In: authenticated fetching, raw document storage, metadata mapping, retry logic.
Out: normalization/export, attestation, storage implementations (handled elsewhere).
## Observability & security expectations
- Log request batches, rate-limit responses, and token refresh events without leaking secrets.
- Track metrics for documents fetched, retries, and failure categories.
## Tests
- Connector tests with mocked MSRC endpoints and AAD token flow will live in `../StellaOps.Vexer.Connectors.MSRC.CSAF.Tests`.
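The AAD client-credentials onboarding described above reduces to a standard OAuth2 token request against the Microsoft identity platform. A hedged sketch, not the connector's actual implementation; the scope value is an assumption to confirm against MSRC documentation, and real code would cache the token until near expiry:

```csharp
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;

// Acquires an app-only access token via the OAuth2 client-credentials grant.
// Tenant/client values come from connector options; the secret must never be logged.
static async Task<string> AcquireMsrcTokenAsync(
    HttpClient http, string tenantId, string clientId, string clientSecret, CancellationToken ct)
{
    var tokenEndpoint = $"https://login.microsoftonline.com/{tenantId}/oauth2/v2.0/token";
    using var content = new FormUrlEncodedContent(new Dictionary<string, string>
    {
        ["grant_type"] = "client_credentials",
        ["client_id"] = clientId,
        ["client_secret"] = clientSecret,
        ["scope"] = "https://api.msrc.microsoft.com/.default", // assumed scope
    });

    using var response = await http.PostAsync(tokenEndpoint, content, ct);
    response.EnsureSuccessStatusCode();

    using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync(ct));
    return json.RootElement.GetProperty("access_token").GetString()!;
}
```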
7
src/StellaOps.Vexer.Connectors.MSRC.CSAF/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-MS-01-001 – AAD onboarding & token cache|Team Vexer Connectors – MSRC|VEXER-CONN-ABS-01-001|TODO – Implement Azure AD credential flow, token caching, and validation for MSRC CSAF access with offline fallback guidance.|
|VEXER-CONN-MS-01-002 – CSAF download pipeline|Team Vexer Connectors – MSRC|VEXER-CONN-MS-01-001, VEXER-STORAGE-01-003|TODO – Fetch CSAF packages with retry/backoff, checksum verification, and raw document persistence plus quarantine for schema failures.|
|VEXER-CONN-MS-01-003 – Trust metadata & provenance hints|Team Vexer Connectors – MSRC|VEXER-CONN-MS-01-002, VEXER-POLICY-01-001|TODO – Emit cosign/AAD issuer metadata, attach provenance details, and document policy integration.|
23
src/StellaOps.Vexer.Connectors.OCI.OpenVEX.Attest/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Connector for OCI registry OpenVEX attestations, discovering images, downloading attestations, and projecting statements into raw storage.
## Scope
- OCI registry discovery, authentication (cosign OIDC/key), and ref resolution for provided image digests/tags.
- Fetching DSSE envelopes, verifying signatures (delegated to Attestation module), and persisting raw statements.
- Mapping OCI manifest metadata (repository, digest, subject) to connector provenance.
- Managing offline bundles that seed attestations without registry access.
## Participants
- Worker schedules polls for configured registries/images; WebService supports manual refresh.
- OpenVEX normalizer consumes statements to create claims.
- Attestation module is reused to verify upstream envelopes prior to storage.
## Interfaces & contracts
- Implements `IVexConnector` with options for image list, auth, parallelism, and offline file seeds.
- Utilizes shared abstractions for retries, telemetry, and resume markers.
## In/Out of scope
In: OCI interaction, attestation retrieval, verification trigger, raw persistence.
Out: normalization/export, policy evaluation, storage implementation.
## Observability & security expectations
- Log image references, attestation counts, verification outcomes; redact credentials.
- Emit metrics for attestation reuse ratio, verification duration, and failures.
## Tests
- Connector tests with mock OCI registry/attestation responses will live in `../StellaOps.Vexer.Connectors.OCI.OpenVEX.Attest.Tests`.
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-OCI-01-001 – OCI discovery & auth plumbing|Team Vexer Connectors – OCI|VEXER-CONN-ABS-01-001|TODO – Resolve OCI references, configure cosign auth (keyless/keyed), and support offline attestation bundles.|
|VEXER-CONN-OCI-01-002 – Attestation fetch & verify loop|Team Vexer Connectors – OCI|VEXER-CONN-OCI-01-001, VEXER-ATTEST-01-002|TODO – Download DSSE attestations, trigger verification, handle retries/backoff, and persist raw statements with metadata.|
|VEXER-CONN-OCI-01-003 – Provenance metadata & policy hooks|Team Vexer Connectors – OCI|VEXER-CONN-OCI-01-002, VEXER-POLICY-01-001|TODO – Emit provenance hints (image, subject digest, issuer) and trust metadata for policy weighting/logging.|
23
src/StellaOps.Vexer.Connectors.Oracle.CSAF/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Connector for Oracle CSAF advisories, including CPU and other bulletin releases, projecting documents into raw storage for normalization.
## Scope
- Discovery of Oracle CSAF catalogue, navigation of quarterly CPU bundles, and delta detection.
- HTTP fetch with retry/backoff, checksum validation, and deduplication across revisions.
- Mapping Oracle advisory metadata (CPU ID, component families) into connector context.
- Publishing trust metadata (PGP keys/cosign options) aligned with policy expectations.
## Participants
- Worker orchestrates regular pulls respecting Oracle publication cadence; WebService offers manual triggers.
- CSAF normalizer processes raw documents to claims.
- Policy engine leverages trust metadata and provenance hints.
## Interfaces & contracts
- Implements `IVexConnector` using shared abstractions for HTTP/resume and telemetry.
- Configuration options for CPU schedule, credentials (if required), and offline snapshot ingestion.
## In/Out of scope
In: fetching, metadata mapping, raw persistence, trust hints.
Out: normalization, storage internals, export/attestation flows.
## Observability & security expectations
- Log CPU release windows, document counts, and fetch durations; redact any secrets.
- Emit metrics for deduped vs new documents and quarantine rates.
## Tests
- Harness tests with mocked Oracle catalogues will live in `../StellaOps.Vexer.Connectors.Oracle.CSAF.Tests`.
7
src/StellaOps.Vexer.Connectors.Oracle.CSAF/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-ORACLE-01-001 – Oracle CSAF catalogue discovery|Team Vexer Connectors – Oracle|VEXER-CONN-ABS-01-001|TODO – Implement catalogue discovery, CPU calendar awareness, and offline snapshot import for Oracle CSAF feeds.|
|VEXER-CONN-ORACLE-01-002 – CSAF download & dedupe pipeline|Team Vexer Connectors – Oracle|VEXER-CONN-ORACLE-01-001, VEXER-STORAGE-01-003|TODO – Fetch CSAF documents with retry/backoff, checksum validation, revision deduplication, and raw persistence.|
|VEXER-CONN-ORACLE-01-003 – Trust metadata + provenance|Team Vexer Connectors – Oracle|VEXER-CONN-ORACLE-01-002, VEXER-POLICY-01-001|TODO – Emit Oracle signing metadata (PGP/cosign) and provenance hints for consensus weighting.|
23
src/StellaOps.Vexer.Connectors.RedHat.CSAF/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Connector for Red Hat CSAF VEX feeds, fetching provider metadata, CSAF documents, and projecting them into raw storage for normalization.
## Scope
- Discovery via `/.well-known/csaf/provider-metadata.json`, scheduling windows, and ETag-aware HTTP fetches.
- Mapping Red Hat CSAF specifics (product tree aliases, RHSA identifiers, revision history) into raw documents.
- Emitting structured telemetry and resume markers for incremental pulls.
- Supplying Red Hat-specific trust overrides and provenance hints to normalization.
## Participants
- Worker schedules pulls using this connector; WebService triggers ad-hoc runs.
- CSAF normalizer consumes fetched documents to produce claims.
- Policy/consensus rely on Red Hat trust metadata captured here.
## Interfaces & contracts
- Implements `IVexConnector` with Red Hat-specific options (parallelism, token auth if configured).
- Uses abstractions from `StellaOps.Vexer.Connectors.Abstractions` for HTTP/resume helpers.
## In/Out of scope
In: data acquisition, HTTP retries, raw document persistence, provider metadata population.
Out: normalization, storage internals, attestation, general connector abstractions (covered elsewhere).
## Observability & security expectations
- Log provider metadata URL, revision ids, fetch durations; redact tokens.
- Emit counters for documents fetched, skipped (304), quarantined.
## Tests
- Connector harness tests (mock HTTP) and resume regression cases will live in `../StellaOps.Vexer.Connectors.RedHat.CSAF.Tests`.
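The ETag-aware incremental fetch described above (and the "skipped (304)" counter) follows standard HTTP conditional-request semantics. A hedged sketch of the core loop step, assuming nothing about the connector's real types beyond plain `HttpClient` usage:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Fetches a CSAF resource conditionally, returning null when the server
// answers 304 Not Modified (document unchanged since the stored ETag).
static async Task<string?> FetchIfChangedAsync(
    HttpClient http, Uri url, string? previousEtag, CancellationToken ct)
{
    using var request = new HttpRequestMessage(HttpMethod.Get, url);
    if (previousEtag is not null)
    {
        request.Headers.TryAddWithoutValidation("If-None-Match", previousEtag);
    }

    using var response = await http.SendAsync(request, ct);
    if (response.StatusCode == HttpStatusCode.NotModified)
    {
        return null; // nothing new: bump the "skipped (304)" counter and move on
    }

    response.EnsureSuccessStatusCode();
    // Caller would persist the new ETag (response.Headers.ETag) as the resume marker.
    return await response.Content.ReadAsStringAsync(ct);
}
```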
7
src/StellaOps.Vexer.Connectors.RedHat.CSAF/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-RH-01-001 – Provider metadata discovery|Team Vexer Connectors – Red Hat|VEXER-CONN-ABS-01-001|TODO – Implement `.well-known` metadata loader with caching, schema validation, and offline snapshot support.|
|VEXER-CONN-RH-01-002 – Incremental CSAF pulls|Team Vexer Connectors – Red Hat|VEXER-CONN-RH-01-001, VEXER-STORAGE-01-003|TODO – Fetch CSAF windows with ETag handling, resume tokens, quarantine on schema errors, and persist raw docs.|
|VEXER-CONN-RH-01-003 – Trust metadata emission|Team Vexer Connectors – Red Hat|VEXER-CONN-RH-01-002, VEXER-POLICY-01-001|TODO – Populate provider trust overrides (cosign issuer, identity regex) and provenance hints for policy evaluation/logging.|
23
src/StellaOps.Vexer.Connectors.SUSE.RancherVEXHub/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Connector targeting SUSE Rancher VEX Hub feeds, ingesting hub events and translating them into raw documents for normalization.
## Scope
- Hub discovery, authentication, and subscription handling for Rancher VEX updates.
- HTTP/WebSocket (if provided) ingestion, checkpoint tracking, and deduplication.
- Mapping Rancher-specific status fields and product identifiers into connector metadata.
- Integration with offline bundles to allow snapshot imports.
## Participants
- Worker manages scheduled syncs using this connector; WebService can trigger manual reconcile pulls.
- Normalizers convert retrieved documents via CSAF/OpenVEX workflows depending on payload.
- Policy module uses trust metadata produced here for weight evaluation.
## Interfaces & contracts
- Implements `IVexConnector` with options for hub URL, credentials, and poll intervals.
- Uses shared abstractions for resume markers and telemetry.
## In/Out of scope
In: hub connectivity, message processing, raw persistence, provider metadata.
Out: normalization/export tasks, storage layer implementation, attestation.
## Observability & security expectations
- Log subscription IDs, batch sizes, and checkpoint updates while redacting secrets.
- Emit metrics for messages processed, lag, and retries.
## Tests
- Connector harness tests with simulated hub responses will live in `../StellaOps.Vexer.Connectors.SUSE.RancherVEXHub.Tests`.
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-SUSE-01-001 – Rancher hub discovery & auth|Team Vexer Connectors – SUSE|VEXER-CONN-ABS-01-001|TODO – Implement hub discovery/subscription setup with credential handling and offline snapshot support.|
|VEXER-CONN-SUSE-01-002 – Checkpointed event ingestion|Team Vexer Connectors – SUSE|VEXER-CONN-SUSE-01-001, VEXER-STORAGE-01-003|TODO – Process hub events with resume checkpoints, deduplication, and quarantine path for malformed payloads.|
|VEXER-CONN-SUSE-01-003 – Trust metadata & policy hints|Team Vexer Connectors – SUSE|VEXER-CONN-SUSE-01-002, VEXER-POLICY-01-001|TODO – Emit provider trust configuration (signers, weight overrides) and attach provenance hints for consensus engine.|
23
src/StellaOps.Vexer.Connectors.Ubuntu.CSAF/AGENTS.md
Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Connector for Ubuntu CSAF advisories (USN VEX data), managing discovery, incremental pulls, and raw document persistence.
## Scope
- Ubuntu CSAF metadata discovery, release channel awareness, and pagination handling.
- HTTP client with retries/backoff, checksum validation, and deduplication.
- Mapping Ubuntu identifiers (USN numbers, package metadata) into connector metadata for downstream policy.
- Emitting trust configuration (GPG fingerprints, cosign options) for policy weighting.
## Participants
- Worker schedules regular pulls; WebService can initiate manual ingest/resume.
- CSAF normalizer converts raw documents into claims.
- Policy engine leverages connector-supplied trust metadata.
## Interfaces & contracts
- Implements `IVexConnector`, using shared abstractions for HTTP/resume markers and telemetry.
- Provides options for release channels (stable/LTS) and offline seed bundles.
## In/Out of scope
In: data fetching, metadata mapping, raw persistence, trust hints.
Out: normalization/export, storage internals, attestation.
## Observability & security expectations
- Log release window fetch metrics, rate limits, and deduplication stats; mask secrets.
- Emit counters for newly ingested vs unchanged USNs and quota usage.
## Tests
- Connector tests with mocked Ubuntu CSAF endpoints will live in `../StellaOps.Vexer.Connectors.Ubuntu.CSAF.Tests`.
7
src/StellaOps.Vexer.Connectors.Ubuntu.CSAF/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file, you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CONN-UBUNTU-01-001 – Ubuntu CSAF discovery & channels|Team Vexer Connectors – Ubuntu|VEXER-CONN-ABS-01-001|TODO – Implement discovery of Ubuntu CSAF catalogs, channel selection (stable/LTS), and offline snapshot import.|
|VEXER-CONN-UBUNTU-01-002 – Incremental fetch & deduplication|Team Vexer Connectors – Ubuntu|VEXER-CONN-UBUNTU-01-001, VEXER-STORAGE-01-003|TODO – Fetch CSAF bundles with ETag handling, checksum validation, deduplication, and raw persistence.|
|VEXER-CONN-UBUNTU-01-003 – Trust metadata & provenance|Team Vexer Connectors – Ubuntu|VEXER-CONN-UBUNTU-01-002, VEXER-POLICY-01-001|TODO – Emit Ubuntu signing metadata (GPG fingerprints) plus provenance hints for policy weighting and diagnostics.|
@@ -0,0 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <LangVersion>preview</LangVersion>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
  <ItemGroup>
    <ProjectReference Include="..\StellaOps.Vexer.Core\StellaOps.Vexer.Core.csproj" />
  </ItemGroup>
</Project>
@@ -0,0 +1,126 @@
using System.Collections.Generic;
using System.Collections.Immutable;
using StellaOps.Vexer.Core;
using Xunit;

namespace StellaOps.Vexer.Core.Tests;

public sealed class VexCanonicalJsonSerializerTests
{
    [Fact]
    public void SerializeClaim_ProducesDeterministicOrder()
    {
        var product = new VexProduct(
            key: "pkg:redhat/demo",
            name: "Demo App",
            version: "1.2.3",
            purl: "pkg:rpm/redhat/demo@1.2.3",
            cpe: "cpe:2.3:a:redhat:demo:1.2.3",
            componentIdentifiers: new[] { "componentB", "componentA" });

        var document = new VexClaimDocument(
            format: VexDocumentFormat.Csaf,
            digest: "sha256:6d5a",
            sourceUri: new Uri("https://example.org/vex/csaf.json"),
            revision: "2024-09-15",
            signature: new VexSignatureMetadata(
                type: "pgp",
                subject: "CN=Red Hat",
                issuer: "CN=Red Hat Root",
                keyId: "0xABCD",
                verifiedAt: new DateTimeOffset(2025, 10, 14, 9, 30, 0, TimeSpan.Zero)));

        var claim = new VexClaim(
            vulnerabilityId: "CVE-2025-12345",
            providerId: "redhat",
            product: product,
            status: VexClaimStatus.NotAffected,
            document: document,
            firstSeen: new DateTimeOffset(2025, 10, 10, 12, 0, 0, TimeSpan.Zero),
            lastSeen: new DateTimeOffset(2025, 10, 11, 12, 0, 0, TimeSpan.Zero),
            justification: VexJustification.ComponentNotPresent,
            detail: "Package not shipped in this channel.",
            confidence: new VexConfidence("high", 0.95, "policy/default"),
            additionalMetadata: ImmutableDictionary<string, string>.Empty
                .Add("source", "csaf")
                .Add("revision", "2024-09-15"));

        var json = VexCanonicalJsonSerializer.Serialize(claim);

        Assert.Equal(
            "{\"vulnerabilityId\":\"CVE-2025-12345\",\"providerId\":\"redhat\",\"product\":{\"key\":\"pkg:redhat/demo\",\"name\":\"Demo App\",\"version\":\"1.2.3\",\"purl\":\"pkg:rpm/redhat/demo@1.2.3\",\"cpe\":\"cpe:2.3:a:redhat:demo:1.2.3\",\"componentIdentifiers\":[\"componentA\",\"componentB\"]},\"status\":\"not_affected\",\"justification\":\"component_not_present\",\"detail\":\"Package not shipped in this channel.\",\"document\":{\"format\":\"csaf\",\"digest\":\"sha256:6d5a\",\"sourceUri\":\"https://example.org/vex/csaf.json\",\"revision\":\"2024-09-15\",\"signature\":{\"type\":\"pgp\",\"subject\":\"CN=Red Hat\",\"issuer\":\"CN=Red Hat Root\",\"keyId\":\"0xABCD\",\"verifiedAt\":\"2025-10-14T09:30:00+00:00\",\"transparencyLogReference\":null}},\"firstSeen\":\"2025-10-10T12:00:00+00:00\",\"lastSeen\":\"2025-10-11T12:00:00+00:00\",\"confidence\":{\"level\":\"high\",\"score\":0.95,\"method\":\"policy/default\"},\"additionalMetadata\":{\"revision\":\"2024-09-15\",\"source\":\"csaf\"}}",
            json);
    }

    [Fact]
    public void QuerySignature_FromFilters_SortsAndNormalizesKeys()
    {
        var signature = VexQuerySignature.FromFilters(new[]
        {
            new KeyValuePair<string, string>(" provider ", " redhat "),
            new KeyValuePair<string, string>("vulnId", "CVE-2025-12345"),
            new KeyValuePair<string, string>("provider", "canonical"),
        });

        Assert.Equal("provider=canonical&provider=redhat&vulnId=CVE-2025-12345", signature.Value);
    }

    [Fact]
    public void SerializeExportManifest_OrdersArraysAndNestedObjects()
    {
        var manifest = new VexExportManifest(
            exportId: "export/2025/10/15/1",
            querySignature: new VexQuerySignature("provider=redhat&format=consensus"),
            format: VexExportFormat.OpenVex,
            createdAt: new DateTimeOffset(2025, 10, 15, 8, 45, 0, TimeSpan.Zero),
            artifact: new VexContentAddress("sha256", "abcd1234"),
            claimCount: 42,
            sourceProviders: new[] { "cisco", "redhat", "redhat" },
            fromCache: true,
            consensusRevision: "rev-7",
            attestation: new VexAttestationMetadata(
                predicateType: "https://in-toto.io/Statement/v0.1",
                rekor: new VexRekorReference("v2", "rekor://uuid/1234", "17", new Uri("https://rekor.example/log/17")),
                envelopeDigest: "sha256:deadbeef",
                signedAt: new DateTimeOffset(2025, 10, 15, 8, 46, 0, TimeSpan.Zero)),
            sizeBytes: 4096);

        var json = VexCanonicalJsonSerializer.SerializeIndented(manifest);

        const string expected = """
            {
              "exportId": "export/2025/10/15/1",
              "querySignature": {
                "value": "provider=redhat&format=consensus"
              },
              "format": "openvex",
              "createdAt": "2025-10-15T08:45:00+00:00",
              "artifact": {
                "algorithm": "sha256",
                "digest": "abcd1234"
              },
              "claimCount": 42,
              "fromCache": true,
              "sourceProviders": [
                "cisco",
                "redhat"
              ],
              "consensusRevision": "rev-7",
              "attestation": {
                "predicateType": "https://in-toto.io/Statement/v0.1",
                "rekor": {
                  "apiVersion": "v2",
                  "location": "rekor://uuid/1234",
                  "logIndex": "17",
                  "inclusionProofUri": "https://rekor.example/log/17"
                },
                "envelopeDigest": "sha256:deadbeef",
                "signedAt": "2025-10-15T08:46:00+00:00"
              },
              "sizeBytes": 4096
            }
            """;

        Assert.Equal(expected.ReplaceLineEndings(), json.ReplaceLineEndings());
    }
}
200
src/StellaOps.Vexer.Core.Tests/VexConsensusResolverTests.cs
Normal file
@@ -0,0 +1,200 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;
using StellaOps.Vexer.Core;
using Xunit;

namespace StellaOps.Vexer.Core.Tests;

public sealed class VexConsensusResolverTests
{
    private static readonly VexProduct DemoProduct = new(
        key: "pkg:demo/app",
        name: "Demo App",
        version: "1.0.0",
        purl: "pkg:demo/app@1.0.0",
        cpe: "cpe:2.3:a:demo:app:1.0.0");

    [Fact]
    public void Resolve_SingleAcceptedClaim_SelectsStatus()
    {
        var provider = CreateProvider("redhat", VexProviderKind.Vendor);
        var claim = CreateClaim(
            "CVE-2025-0001",
            provider.Id,
            VexClaimStatus.Affected,
            justification: null);

        var resolver = new VexConsensusResolver(new BaselineVexConsensusPolicy());

        var result = resolver.Resolve(new VexConsensusRequest(
            claim.VulnerabilityId,
            DemoProduct,
            new[] { claim },
            new Dictionary<string, VexProvider> { [provider.Id] = provider },
            DateTimeOffset.Parse("2025-10-15T12:00:00Z")));

        Assert.Equal(VexConsensusStatus.Affected, result.Consensus.Status);
        Assert.Equal("baseline/v1", result.Consensus.PolicyVersion);
        Assert.Single(result.Consensus.Sources);
        Assert.Empty(result.Consensus.Conflicts);
        Assert.NotNull(result.Consensus.Summary);
        Assert.Contains("affected", result.Consensus.Summary!, StringComparison.Ordinal);

        var decision = Assert.Single(result.DecisionLog);
        Assert.True(decision.Included);
        Assert.Equal(provider.Id, decision.ProviderId);
        Assert.Null(decision.Reason);
    }

    [Fact]
    public void Resolve_NotAffectedWithoutJustification_IsRejected()
    {
        var provider = CreateProvider("cisco", VexProviderKind.Vendor);
        var claim = CreateClaim(
            "CVE-2025-0002",
            provider.Id,
            VexClaimStatus.NotAffected,
            justification: null);

        var resolver = new VexConsensusResolver(new BaselineVexConsensusPolicy());

        var result = resolver.Resolve(new VexConsensusRequest(
            claim.VulnerabilityId,
            DemoProduct,
            new[] { claim },
            new Dictionary<string, VexProvider> { [provider.Id] = provider },
            DateTimeOffset.Parse("2025-10-15T12:00:00Z")));

        Assert.Equal(VexConsensusStatus.UnderInvestigation, result.Consensus.Status);
        Assert.Empty(result.Consensus.Sources);
        var conflict = Assert.Single(result.Consensus.Conflicts);
        Assert.Equal("missing_justification", conflict.Reason);

        var decision = Assert.Single(result.DecisionLog);
        Assert.False(decision.Included);
        Assert.Equal("missing_justification", decision.Reason);
    }

    [Fact]
    public void Resolve_MajorityWeightWins_WithConflictingSources()
    {
        var vendor = CreateProvider("redhat", VexProviderKind.Vendor);
        var distro = CreateProvider("fedora", VexProviderKind.Distro);

        var claims = new[]
        {
            CreateClaim(
                "CVE-2025-0003",
                vendor.Id,
                VexClaimStatus.Affected,
                detail: "Vendor advisory",
                documentDigest: "sha256:vendor"),
            CreateClaim(
                "CVE-2025-0003",
                distro.Id,
                VexClaimStatus.NotAffected,
                justification: VexJustification.ComponentNotPresent,
                detail: "Distro package not shipped",
                documentDigest: "sha256:distro"),
        };

        var resolver = new VexConsensusResolver(new BaselineVexConsensusPolicy());

        var result = resolver.Resolve(new VexConsensusRequest(
            "CVE-2025-0003",
            DemoProduct,
            claims,
            new Dictionary<string, VexProvider>
            {
                [vendor.Id] = vendor,
                [distro.Id] = distro,
            },
            DateTimeOffset.Parse("2025-10-15T12:00:00Z")));

        Assert.Equal(VexConsensusStatus.Affected, result.Consensus.Status);
        Assert.Equal(2, result.Consensus.Sources.Length);
        Assert.Equal(1.0, result.Consensus.Sources.First(s => s.ProviderId == vendor.Id).Weight);
        Assert.Contains(result.Consensus.Conflicts, c => c.ProviderId == distro.Id && c.Reason == "status_conflict");
        Assert.NotNull(result.Consensus.Summary);
        Assert.Contains("affected", result.Consensus.Summary!, StringComparison.Ordinal);
    }

    [Fact]
    public void Resolve_TieFallsBackToUnderInvestigation()
    {
        var hub = CreateProvider("hub", VexProviderKind.Hub);
        var platform = CreateProvider("platform", VexProviderKind.Platform);

        var claims = new[]
        {
            CreateClaim(
                "CVE-2025-0004",
                hub.Id,
                VexClaimStatus.Affected,
                detail: "Hub escalation",
                documentDigest: "sha256:hub"),
            CreateClaim(
                "CVE-2025-0004",
                platform.Id,
                VexClaimStatus.NotAffected,
                justification: VexJustification.ProtectedByMitigatingControl,
                detail: "Runtime mitigations",
                documentDigest: "sha256:platform"),
        };

        var resolver = new VexConsensusResolver(new BaselineVexConsensusPolicy(
            new VexConsensusPolicyOptions(
                hubWeight: 0.5,
                platformWeight: 0.5)));

        var result = resolver.Resolve(new VexConsensusRequest(
            "CVE-2025-0004",
            DemoProduct,
            claims,
            new Dictionary<string, VexProvider>
            {
                [hub.Id] = hub,
                [platform.Id] = platform,
            },
            DateTimeOffset.Parse("2025-10-15T12:00:00Z")));

        Assert.Equal(VexConsensusStatus.UnderInvestigation, result.Consensus.Status);
        Assert.Equal(2, result.Consensus.Conflicts.Length);
        Assert.NotNull(result.Consensus.Summary);
        Assert.Contains("No majority consensus", result.Consensus.Summary!, StringComparison.Ordinal);
    }

    private static VexProvider CreateProvider(string id, VexProviderKind kind)
        => new(
            id,
            displayName: id.ToUpperInvariant(),
            kind,
            baseUris: Array.Empty<Uri>(),
            trust: new VexProviderTrust(weight: 1.0, cosign: null));

    private static VexClaim CreateClaim(
        string vulnerabilityId,
        string providerId,
        VexClaimStatus status,
        VexJustification? justification = null,
        string? detail = null,
        string? documentDigest = null)
        => new(
            vulnerabilityId,
            providerId,
            DemoProduct,
            status,
            new VexClaimDocument(
                VexDocumentFormat.Csaf,
                documentDigest ?? $"sha256:{providerId}",
                new Uri($"https://example.org/{providerId}/{vulnerabilityId}.json"),
                "1"),
            firstSeen: DateTimeOffset.Parse("2025-10-10T12:00:00Z"),
            lastSeen: DateTimeOffset.Parse("2025-10-11T12:00:00Z"),
            justification,
            detail,
            confidence: null,
            additionalMetadata: ImmutableDictionary<string, string>.Empty);
}
59
src/StellaOps.Vexer.Core.Tests/VexQuerySignatureTests.cs
Normal file
@@ -0,0 +1,59 @@
using System.Collections.Generic;
using StellaOps.Vexer.Core;
using Xunit;

namespace StellaOps.Vexer.Core.Tests;

public sealed class VexQuerySignatureTests
{
    [Fact]
    public void FromFilters_SortsAlphabetically()
    {
        var filters = new[]
        {
            new KeyValuePair<string, string>("provider", "redhat"),
            new KeyValuePair<string, string>("vulnId", "CVE-2025-0001"),
            new KeyValuePair<string, string>("provider", "cisco"),
        };

        var signature = VexQuerySignature.FromFilters(filters);

        Assert.Equal("provider=cisco&provider=redhat&vulnId=CVE-2025-0001", signature.Value);
    }

    [Fact]
    public void FromQuery_NormalizesFiltersAndSort()
    {
        var query = VexQuery.Create(
            filters: new[]
            {
                new VexQueryFilter(" provider ", " redhat "),
                new VexQueryFilter("vulnId", "CVE-2025-0002"),
            },
            sort: new[]
            {
                new VexQuerySort("published", true),
                new VexQuerySort("severity", false),
            },
            limit: 200,
            offset: 10,
            view: "consensus");

        var signature = VexQuerySignature.FromQuery(query);

        Assert.Equal(
            "provider=redhat&vulnId=CVE-2025-0002&sort=-published&sort=+severity&limit=200&offset=10&view=consensus",
            signature.Value);
    }

    [Fact]
    public void ComputeHash_ReturnsStableSha256()
    {
        var signature = new VexQuerySignature("provider=redhat&vulnId=CVE-2025-0003");

        var address = signature.ComputeHash();

        Assert.Equal("sha256", address.Algorithm);
        Assert.Equal("44c9881aaa79050ae943eaaf78afa697b1a4d3e38b03e20db332f2bd1e5b1029", address.Digest);
    }
}
26
src/StellaOps.Vexer.Core/AGENTS.md
Normal file
@@ -0,0 +1,26 @@
# AGENTS
## Role
Domain source of truth for VEX statements, consensus rollups, and trust policy orchestration across all Vexer services.
## Scope
- Records for raw document metadata, normalized claims, consensus projections, and export descriptors.
- Policy + weighting engine that projects provider trust tiers into consensus status outcomes.
- Connector, normalizer, export, and attestation contracts shared by WebService, Worker, and plug-ins.
- Deterministic hashing utilities (query signatures, artifact digests, attestation subjects).
## Participants
- Vexer WebService uses the models to persist ingress/egress payloads and to perform consensus mutations.
- Vexer Worker executes reconciliation and verification routines using policy helpers defined here.
- Export/Attestation modules depend on record definitions for envelopes and manifest payloads.
## Interfaces & contracts
- `IVexConnector`, `INormalizer`, `IExportEngine`, `ITransparencyLogClient`, `IArtifactStore`, and policy abstractions for consensus resolution.
- Value objects for provider metadata, VexClaim, VexConsensusEntry, ExportManifest, QuerySignature.
- Deterministic comparer utilities and stable JSON serialization helpers for tests and cache keys.
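The consensus contracts above compose as sketched below, mirroring the request/resolver shapes exercised in `StellaOps.Vexer.Core.Tests`; `product`, `claims`, and `providers` stand in for caller-supplied data.

```csharp
// Sketch of the consensus flow these contracts enable. The baseline policy
// applies tier weights and rejects not_affected claims without justification.
var resolver = new VexConsensusResolver(new BaselineVexConsensusPolicy());

var result = resolver.Resolve(new VexConsensusRequest(
    "CVE-2025-0001",
    product,            // VexProduct under evaluation
    claims,             // normalized VexClaim instances from connectors
    providers,          // Dictionary<string, VexProvider> keyed by provider id
    DateTimeOffset.UtcNow));

// result.Consensus carries the rolled-up status plus sources/conflicts;
// result.DecisionLog records why each claim was included or rejected.
```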
## In/Out of scope
In: domain invariants, policy evaluation helpers, deterministic serialization, shared abstractions.
Out: Mongo persistence implementations, HTTP endpoints, background scheduling, concrete connector logic.
## Observability & security expectations
- Avoid secret handling; provide structured logging extension methods for consensus decisions.
- Emit correlation identifiers and query signatures without embedding PII.
- Ensure deterministic logging order to keep reproducibility guarantees intact.
## Tests
- Unit coverage lives in `../StellaOps.Vexer.Core.Tests` focusing on consensus, policy gates, and serialization determinism.
- Golden fixtures must rely on canonical JSON snapshots produced via stable serializers.
61
src/StellaOps.Vexer.Core/BaselineVexConsensusPolicy.cs
Normal file
@@ -0,0 +1,61 @@
namespace StellaOps.Vexer.Core;

/// <summary>
/// Baseline consensus policy applying tier-based weights and enforcing justification gates.
/// </summary>
public sealed class BaselineVexConsensusPolicy : IVexConsensusPolicy
{
    private readonly VexConsensusPolicyOptions _options;

    public BaselineVexConsensusPolicy(VexConsensusPolicyOptions? options = null)
    {
        _options = options ?? new VexConsensusPolicyOptions();
    }

    public string Version => _options.Version;

    public double GetProviderWeight(VexProvider provider)
    {
        if (provider is null)
        {
            throw new ArgumentNullException(nameof(provider));
        }

        if (_options.ProviderOverrides.TryGetValue(provider.Id, out var overrideWeight))
        {
            return overrideWeight;
        }

        return provider.Kind switch
        {
            VexProviderKind.Vendor => _options.VendorWeight,
            VexProviderKind.Distro => _options.DistroWeight,
            VexProviderKind.Platform => _options.PlatformWeight,
            VexProviderKind.Hub => _options.HubWeight,
            VexProviderKind.Attestation => _options.AttestationWeight,
            _ => 0,
        };
    }

    public bool IsClaimEligible(VexClaim claim, VexProvider provider, out string? rejectionReason)
    {
        if (claim is null)
        {
            throw new ArgumentNullException(nameof(claim));
        }

        if (provider is null)
        {
            throw new ArgumentNullException(nameof(provider));
        }

        if (claim.Status is VexClaimStatus.NotAffected && claim.Justification is null)
        {
            rejectionReason = "missing_justification";
            return false;
        }

        rejectionReason = null;
        return true;
    }
}
26
src/StellaOps.Vexer.Core/IVexConsensusPolicy.cs
Normal file
@@ -0,0 +1,26 @@
namespace StellaOps.Vexer.Core;

/// <summary>
/// Policy abstraction supplying trust weights and gating logic for consensus decisions.
/// </summary>
public interface IVexConsensusPolicy
{
    /// <summary>
    /// Semantic version describing the active policy.
    /// </summary>
    string Version { get; }

    /// <summary>
    /// Returns the effective weight (0-1) to apply for the provided VEX source.
    /// </summary>
    double GetProviderWeight(VexProvider provider);

    /// <summary>
    /// Determines whether the claim is eligible to participate in consensus.
    /// </summary>
    /// <param name="claim">Normalized claim to evaluate.</param>
    /// <param name="provider">Provider metadata for the claim.</param>
    /// <param name="rejectionReason">Textual reason when the claim is rejected.</param>
    /// <returns><c>true</c> if the claim should participate; <c>false</c> otherwise.</returns>
    bool IsClaimEligible(VexClaim claim, VexProvider provider, out string? rejectionReason);
}
9
src/StellaOps.Vexer.Core/StellaOps.Vexer.Core.csproj
Normal file
@@ -0,0 +1,9 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <LangVersion>preview</LangVersion>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
</Project>
7
src/StellaOps.Vexer.Core/TASKS.md
Normal file
@@ -0,0 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.
# TASKS
| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-CORE-01-001 – Canonical VEX domain records|Team Vexer Core & Policy|docs/ARCHITECTURE_VEXER.md|DONE (2025-10-15) – Introduced `VexClaim`, `VexConsensus`, provider metadata, export manifest records, and deterministic JSON serialization with tests covering canonical ordering and query signatures.|
|VEXER-CORE-01-002 – Trust-weighted consensus resolver|Team Vexer Core & Policy|VEXER-CORE-01-001|DONE (2025-10-15) – Added consensus resolver, baseline policy (tier weights + justification gate), telemetry output, and tests covering acceptance, conflict ties, and determinism.|
|VEXER-CORE-01-003 – Shared contracts & query signatures|Team Vexer Core & Policy|VEXER-CORE-01-001|DONE (2025-10-15) – Published connector/normalizer/exporter/attestation abstractions and expanded deterministic `VexQuerySignature`/hash utilities with test coverage.|
28
src/StellaOps.Vexer.Core/VexAttestationAbstractions.cs
Normal file
@@ -0,0 +1,28 @@
using System;
using System.Collections.Immutable;
using System.Threading;
using System.Threading.Tasks;

namespace StellaOps.Vexer.Core;

public interface IVexAttestationClient
{
    ValueTask<VexAttestationResponse> SignAsync(VexAttestationRequest request, CancellationToken cancellationToken);

    ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationRequest request, CancellationToken cancellationToken);
}

public sealed record VexAttestationRequest(
    VexQuerySignature QuerySignature,
    VexContentAddress Artifact,
    VexExportFormat Format,
    DateTimeOffset CreatedAt,
    ImmutableDictionary<string, string> Metadata);

public sealed record VexAttestationResponse(
    VexAttestationMetadata Attestation,
    ImmutableDictionary<string, string> Diagnostics);

public sealed record VexAttestationVerification(
    bool IsValid,
    ImmutableDictionary<string, string> Diagnostics);
494
src/StellaOps.Vexer.Core/VexCanonicalJsonSerializer.cs
Normal file
@@ -0,0 +1,494 @@
using System.Collections.Generic;
using System.Reflection;
using System.Runtime.Serialization;
using System.Text.Encodings.Web;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Text.Json.Serialization.Metadata;

namespace StellaOps.Vexer.Core;

public static class VexCanonicalJsonSerializer
{
    private static readonly JsonSerializerOptions CompactOptions = CreateOptions(writeIndented: false);
    private static readonly JsonSerializerOptions PrettyOptions = CreateOptions(writeIndented: true);

    private static readonly IReadOnlyDictionary<Type, string[]> PropertyOrderOverrides = new Dictionary<Type, string[]>
    {
        { typeof(VexProvider), new[] { "id", "displayName", "kind", "baseUris", "discovery", "trust", "enabled" } },
        { typeof(VexProviderDiscovery), new[] { "wellKnownMetadata", "rolieService" } },
        { typeof(VexProviderTrust), new[] { "weight", "cosign", "pgpFingerprints" } },
        { typeof(VexCosignTrust), new[] { "issuer", "identityPattern" } },
        { typeof(VexClaim), new[] { "vulnerabilityId", "providerId", "product", "status", "justification", "detail", "document", "firstSeen", "lastSeen", "confidence", "additionalMetadata" } },
        { typeof(VexProduct), new[] { "key", "name", "version", "purl", "cpe", "componentIdentifiers" } },
        { typeof(VexClaimDocument), new[] { "format", "digest", "sourceUri", "revision", "signature" } },
        { typeof(VexSignatureMetadata), new[] { "type", "subject", "issuer", "keyId", "verifiedAt", "transparencyLogReference" } },
        { typeof(VexConfidence), new[] { "level", "score", "method" } },
        { typeof(VexConsensus), new[] { "vulnerabilityId", "product", "status", "calculatedAt", "sources", "conflicts", "policyVersion", "summary" } },
        { typeof(VexConsensusSource), new[] { "providerId", "status", "documentDigest", "weight", "justification", "detail", "confidence" } },
        { typeof(VexConsensusConflict), new[] { "providerId", "status", "documentDigest", "justification", "detail", "reason" } },
        { typeof(VexConnectorSettings), new[] { "values" } },
        { typeof(VexConnectorContext), new[] { "since", "settings", "rawSink", "signatureVerifier", "normalizers", "services" } },
        { typeof(VexRawDocument), new[] { "documentId", "providerId", "format", "sourceUri", "retrievedAt", "digest", "content", "metadata" } },
        { typeof(VexClaimBatch), new[] { "source", "claims", "diagnostics" } },
        { typeof(VexExportManifest), new[] { "exportId", "querySignature", "format", "createdAt", "artifact", "claimCount", "fromCache", "sourceProviders", "consensusRevision", "attestation", "sizeBytes" } },
        { typeof(VexContentAddress), new[] { "algorithm", "digest" } },
        { typeof(VexAttestationMetadata), new[] { "predicateType", "rekor", "envelopeDigest", "signedAt" } },
        { typeof(VexRekorReference), new[] { "apiVersion", "location", "logIndex", "inclusionProofUri" } },
        { typeof(VexQuerySignature), new[] { "value" } },
        { typeof(VexQuery), new[] { "filters", "sort", "limit", "offset", "view" } },
        { typeof(VexQueryFilter), new[] { "key", "value" } },
        { typeof(VexQuerySort), new[] { "field", "descending" } },
        { typeof(VexExportRequest), new[] { "query", "consensus", "claims", "generatedAt" } },
        { typeof(VexExportResult), new[] { "digest", "bytesWritten", "metadata" } },
        { typeof(VexAttestationRequest), new[] { "querySignature", "artifact", "format", "createdAt", "metadata" } },
        { typeof(VexAttestationResponse), new[] { "attestation", "diagnostics" } },
        { typeof(VexAttestationVerification), new[] { "isValid", "diagnostics" } },
    };

    public static string Serialize<T>(T value)
        => JsonSerializer.Serialize(value, CompactOptions);

    public static string SerializeIndented<T>(T value)
        => JsonSerializer.Serialize(value, PrettyOptions);

    public static T Deserialize<T>(string json)
        => JsonSerializer.Deserialize<T>(json, PrettyOptions)
            ?? throw new InvalidOperationException($"Unable to deserialize type {typeof(T).Name}.");

    private static JsonSerializerOptions CreateOptions(bool writeIndented)
    {
        var options = new JsonSerializerOptions
        {
            PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
            DictionaryKeyPolicy = JsonNamingPolicy.CamelCase,
            DefaultIgnoreCondition = JsonIgnoreCondition.Never,
            WriteIndented = writeIndented,
            Encoder = JavaScriptEncoder.UnsafeRelaxedJsonEscaping,
        };

        var baselineResolver = options.TypeInfoResolver ?? new DefaultJsonTypeInfoResolver();
        options.TypeInfoResolver = new DeterministicTypeInfoResolver(baselineResolver);
        options.Converters.Add(new EnumMemberJsonConverterFactory());
        return options;
    }

    private sealed class DeterministicTypeInfoResolver : IJsonTypeInfoResolver
    {
        private readonly IJsonTypeInfoResolver _inner;

        public DeterministicTypeInfoResolver(IJsonTypeInfoResolver inner)
        {
            _inner = inner ?? throw new ArgumentNullException(nameof(inner));
        }

        public JsonTypeInfo? GetTypeInfo(Type type, JsonSerializerOptions options)
        {
            var info = _inner.GetTypeInfo(type, options);
            if (info is null)
            {
                return null;
            }

            if (info.Kind is JsonTypeInfoKind.Object && info.Properties is { Count: > 1 })
            {
                var ordered = info.Properties
                    .OrderBy(property => GetPropertyOrder(type, property.Name))
                    .ThenBy(property => property.Name, StringComparer.Ordinal)
                    .ToArray();

                info.Properties.Clear();
                foreach (var property in ordered)
                {
                    info.Properties.Add(property);
                }
            }

            return info;
        }

        private static int GetPropertyOrder(Type type, string propertyName)
        {
            if (PropertyOrderOverrides.TryGetValue(type, out var order) &&
                Array.IndexOf(order, propertyName) is var index &&
                index >= 0)
            {
                return index;
            }

            return int.MaxValue;
        }
    }

    private sealed class EnumMemberJsonConverterFactory : JsonConverterFactory
    {
        public override bool CanConvert(Type typeToConvert)
        {
            var type = Nullable.GetUnderlyingType(typeToConvert) ?? typeToConvert;
            return type.IsEnum;
        }

        public override JsonConverter? CreateConverter(Type typeToConvert, JsonSerializerOptions options)
        {
            var underlying = Nullable.GetUnderlyingType(typeToConvert);
            if (underlying is not null)
            {
                var nullableConverterType = typeof(NullableEnumMemberJsonConverter<>).MakeGenericType(underlying);
                return (JsonConverter)Activator.CreateInstance(nullableConverterType)!;
            }

            var converterType = typeof(EnumMemberJsonConverter<>).MakeGenericType(typeToConvert);
            return (JsonConverter)Activator.CreateInstance(converterType)!;
        }

        private sealed class EnumMemberJsonConverter<T> : JsonConverter<T>
            where T : struct, Enum
        {
            private readonly Dictionary<string, T> _nameToValue;
            private readonly Dictionary<T, string> _valueToName;

            public EnumMemberJsonConverter()
            {
                _nameToValue = new Dictionary<string, T>(StringComparer.Ordinal);
                _valueToName = new Dictionary<T, string>();
                foreach (var value in Enum.GetValues<T>())
                {
                    var name = value.ToString();
                    var enumMember = typeof(T).GetField(name)!.GetCustomAttribute<EnumMemberAttribute>();
|
||||
var text = enumMember?.Value ?? name;
|
||||
_nameToValue[text] = value;
|
||||
_valueToName[value] = text;
|
||||
}
|
||||
}
|
||||
|
||||
public override T Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
|
||||
{
|
||||
if (reader.TokenType != JsonTokenType.String)
|
||||
{
|
||||
throw new JsonException($"Unexpected token '{reader.TokenType}' when parsing enum '{typeof(T).Name}'.");
|
||||
}
|
||||
|
||||
var text = reader.GetString();
|
||||
if (text is null || !_nameToValue.TryGetValue(text, out var value))
|
||||
{
|
||||
throw new JsonException($"Value '{text}' is not defined for enum '{typeof(T).Name}'.");
|
||||
}
|
||||
|
||||
return value;
|
||||
}
|
||||
|
||||
public override void Write(Utf8JsonWriter writer, T value, JsonSerializerOptions options)
|
||||
{
|
||||
if (!_valueToName.TryGetValue(value, out var text))
|
||||
{
|
||||
throw new JsonException($"Value '{value}' is not defined for enum '{typeof(T).Name}'.");
|
||||
}
|
||||
|
||||
writer.WriteStringValue(text);
|
||||
}
|
||||
}
|
||||
|
||||
private sealed class NullableEnumMemberJsonConverter<T> : JsonConverter<T?>
|
||||
where T : struct, Enum
|
||||
{
|
||||
private readonly EnumMemberJsonConverter<T> _inner = new();
|
||||
|
||||
public override T? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
|
||||
{
|
||||
if (reader.TokenType == JsonTokenType.Null)
|
||||
{
|
||||
return null;
|
||||
}
|
||||
|
||||
return _inner.Read(ref reader, typeof(T), options);
|
||||
}
|
||||
|
||||
public override void Write(Utf8JsonWriter writer, T? value, JsonSerializerOptions options)
|
||||
{
|
||||
if (value is null)
|
||||
{
|
||||
writer.WriteNullValue();
|
||||
return;
|
||||
}
|
||||
|
||||
_inner.Write(writer, value.Value, options);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
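The `DeterministicTypeInfoResolver` above forces a stable property order by clearing and re-adding `JsonTypeInfo.Properties`. The same contract-customization hook can be sketched standalone with `System.Text.Json`'s `DefaultJsonTypeInfoResolver.Modifiers` (the `Demo` type here is illustrative, not part of the commit):

```csharp
using System;
using System.Linq;
using System.Text.Json;
using System.Text.Json.Serialization.Metadata;

var options = new JsonSerializerOptions
{
    TypeInfoResolver = new DefaultJsonTypeInfoResolver
    {
        Modifiers =
        {
            static typeInfo =>
            {
                if (typeInfo.Kind != JsonTypeInfoKind.Object)
                {
                    return;
                }

                // Re-add properties in ordinal name order so output is deterministic.
                var ordered = typeInfo.Properties
                    .OrderBy(p => p.Name, StringComparer.Ordinal)
                    .ToArray();
                typeInfo.Properties.Clear();
                foreach (var property in ordered)
                {
                    typeInfo.Properties.Add(property);
                }
            }
        }
    }
};

Console.WriteLine(JsonSerializer.Serialize(new Demo(), options));
// {"Alpha":2,"Zebra":1} regardless of declaration order

sealed class Demo
{
    public int Zebra { get; set; } = 1;
    public int Alpha { get; set; } = 2;
}
```

The commit's resolver additionally consults `PropertyOrderOverrides` before falling back to ordinal name order, so pinned properties sort first.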
src/StellaOps.Vexer.Core/VexClaim.cs (new file, 326 lines)
@@ -0,0 +1,326 @@
using System.Collections.Immutable;
using System.Runtime.Serialization;

namespace StellaOps.Vexer.Core;

public sealed record VexClaim
{
    public VexClaim(
        string vulnerabilityId,
        string providerId,
        VexProduct product,
        VexClaimStatus status,
        VexClaimDocument document,
        DateTimeOffset firstSeen,
        DateTimeOffset lastSeen,
        VexJustification? justification = null,
        string? detail = null,
        VexConfidence? confidence = null,
        ImmutableDictionary<string, string>? additionalMetadata = null)
    {
        if (string.IsNullOrWhiteSpace(vulnerabilityId))
        {
            throw new ArgumentException("Vulnerability id must be provided.", nameof(vulnerabilityId));
        }

        if (string.IsNullOrWhiteSpace(providerId))
        {
            throw new ArgumentException("Provider id must be provided.", nameof(providerId));
        }

        if (lastSeen < firstSeen)
        {
            throw new ArgumentOutOfRangeException(nameof(lastSeen), "Last seen timestamp cannot be earlier than first seen.");
        }

        VulnerabilityId = vulnerabilityId.Trim();
        ProviderId = providerId.Trim();
        Product = product ?? throw new ArgumentNullException(nameof(product));
        Status = status;
        Document = document ?? throw new ArgumentNullException(nameof(document));
        FirstSeen = firstSeen;
        LastSeen = lastSeen;
        Justification = justification;
        Detail = string.IsNullOrWhiteSpace(detail) ? null : detail.Trim();
        Confidence = confidence;
        AdditionalMetadata = NormalizeMetadata(additionalMetadata);
    }

    public string VulnerabilityId { get; }

    public string ProviderId { get; }

    public VexProduct Product { get; }

    public VexClaimStatus Status { get; }

    public VexJustification? Justification { get; }

    public string? Detail { get; }

    public VexClaimDocument Document { get; }

    public DateTimeOffset FirstSeen { get; }

    public DateTimeOffset LastSeen { get; }

    public VexConfidence? Confidence { get; }

    public ImmutableSortedDictionary<string, string> AdditionalMetadata { get; }

    private static ImmutableSortedDictionary<string, string> NormalizeMetadata(
        ImmutableDictionary<string, string>? additionalMetadata)
    {
        if (additionalMetadata is null || additionalMetadata.Count == 0)
        {
            return ImmutableSortedDictionary<string, string>.Empty;
        }

        var builder = ImmutableSortedDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        foreach (var (key, value) in additionalMetadata)
        {
            if (string.IsNullOrWhiteSpace(key))
            {
                continue;
            }

            builder[key.Trim()] = value?.Trim() ?? string.Empty;
        }

        return builder.ToImmutable();
    }
}

public sealed record VexProduct
{
    public VexProduct(
        string key,
        string? name,
        string? version = null,
        string? purl = null,
        string? cpe = null,
        IEnumerable<string>? componentIdentifiers = null)
    {
        if (string.IsNullOrWhiteSpace(key))
        {
            throw new ArgumentException("Product key must be provided.", nameof(key));
        }

        Key = key.Trim();
        Name = string.IsNullOrWhiteSpace(name) ? null : name.Trim();
        Version = string.IsNullOrWhiteSpace(version) ? null : version.Trim();
        Purl = string.IsNullOrWhiteSpace(purl) ? null : purl.Trim();
        Cpe = string.IsNullOrWhiteSpace(cpe) ? null : cpe.Trim();
        ComponentIdentifiers = NormalizeComponentIdentifiers(componentIdentifiers);
    }

    public string Key { get; }

    public string? Name { get; }

    public string? Version { get; }

    public string? Purl { get; }

    public string? Cpe { get; }

    public ImmutableArray<string> ComponentIdentifiers { get; }

    private static ImmutableArray<string> NormalizeComponentIdentifiers(IEnumerable<string>? identifiers)
    {
        if (identifiers is null)
        {
            return ImmutableArray<string>.Empty;
        }

        var set = new SortedSet<string>(StringComparer.Ordinal);
        foreach (var identifier in identifiers)
        {
            if (string.IsNullOrWhiteSpace(identifier))
            {
                continue;
            }

            set.Add(identifier.Trim());
        }

        return set.Count == 0 ? ImmutableArray<string>.Empty : set.ToImmutableArray();
    }
}

public sealed record VexClaimDocument
{
    public VexClaimDocument(
        VexDocumentFormat format,
        string digest,
        Uri sourceUri,
        string? revision = null,
        VexSignatureMetadata? signature = null)
    {
        if (string.IsNullOrWhiteSpace(digest))
        {
            throw new ArgumentException("Document digest must be provided.", nameof(digest));
        }

        Format = format;
        Digest = digest.Trim();
        SourceUri = sourceUri ?? throw new ArgumentNullException(nameof(sourceUri));
        Revision = string.IsNullOrWhiteSpace(revision) ? null : revision.Trim();
        Signature = signature;
    }

    public VexDocumentFormat Format { get; }

    public string Digest { get; }

    public Uri SourceUri { get; }

    public string? Revision { get; }

    public VexSignatureMetadata? Signature { get; }
}

public sealed record VexSignatureMetadata
{
    public VexSignatureMetadata(
        string type,
        string? subject = null,
        string? issuer = null,
        string? keyId = null,
        DateTimeOffset? verifiedAt = null,
        string? transparencyLogReference = null)
    {
        if (string.IsNullOrWhiteSpace(type))
        {
            throw new ArgumentException("Signature type must be provided.", nameof(type));
        }

        Type = type.Trim();
        Subject = string.IsNullOrWhiteSpace(subject) ? null : subject.Trim();
        Issuer = string.IsNullOrWhiteSpace(issuer) ? null : issuer.Trim();
        KeyId = string.IsNullOrWhiteSpace(keyId) ? null : keyId.Trim();
        VerifiedAt = verifiedAt;
        TransparencyLogReference = string.IsNullOrWhiteSpace(transparencyLogReference)
            ? null
            : transparencyLogReference.Trim();
    }

    public string Type { get; }

    public string? Subject { get; }

    public string? Issuer { get; }

    public string? KeyId { get; }

    public DateTimeOffset? VerifiedAt { get; }

    public string? TransparencyLogReference { get; }
}

public sealed record VexConfidence
{
    public VexConfidence(string level, double? score = null, string? method = null)
    {
        if (string.IsNullOrWhiteSpace(level))
        {
            throw new ArgumentException("Confidence level must be provided.", nameof(level));
        }

        if (score is not null && (double.IsNaN(score.Value) || double.IsInfinity(score.Value)))
        {
            throw new ArgumentOutOfRangeException(nameof(score), "Confidence score must be a finite number.");
        }

        Level = level.Trim();
        Score = score;
        Method = string.IsNullOrWhiteSpace(method) ? null : method.Trim();
    }

    public string Level { get; }

    public double? Score { get; }

    public string? Method { get; }
}

[DataContract]
public enum VexDocumentFormat
{
    [EnumMember(Value = "csaf")]
    Csaf,

    [EnumMember(Value = "cyclonedx")]
    CycloneDx,

    [EnumMember(Value = "openvex")]
    OpenVex,

    [EnumMember(Value = "oci_attestation")]
    OciAttestation,
}

[DataContract]
public enum VexClaimStatus
{
    [EnumMember(Value = "affected")]
    Affected,

    [EnumMember(Value = "not_affected")]
    NotAffected,

    [EnumMember(Value = "fixed")]
    Fixed,

    [EnumMember(Value = "under_investigation")]
    UnderInvestigation,
}

[DataContract]
public enum VexJustification
{
    [EnumMember(Value = "component_not_present")]
    ComponentNotPresent,

    [EnumMember(Value = "component_not_configured")]
    ComponentNotConfigured,

    [EnumMember(Value = "vulnerable_code_not_present")]
    VulnerableCodeNotPresent,

    [EnumMember(Value = "vulnerable_code_not_in_execute_path")]
    VulnerableCodeNotInExecutePath,

    [EnumMember(Value = "vulnerable_code_cannot_be_controlled_by_adversary")]
    VulnerableCodeCannotBeControlledByAdversary,

    [EnumMember(Value = "inline_mitigations_already_exist")]
    InlineMitigationsAlreadyExist,

    [EnumMember(Value = "protected_by_mitigating_control")]
    ProtectedByMitigatingControl,

    [EnumMember(Value = "code_not_present")]
    CodeNotPresent,

    [EnumMember(Value = "code_not_reachable")]
    CodeNotReachable,

    [EnumMember(Value = "requires_configuration")]
    RequiresConfiguration,

    [EnumMember(Value = "requires_dependency")]
    RequiresDependency,

    [EnumMember(Value = "requires_environment")]
    RequiresEnvironment,

    [EnumMember(Value = "protected_by_compensating_control")]
    ProtectedByCompensatingControl,

    [EnumMember(Value = "protected_at_perimeter")]
    ProtectedAtPerimeter,

    [EnumMember(Value = "protected_at_runtime")]
    ProtectedAtRuntime,
}
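Assembling a claim from these records looks like this (a usage sketch against the constructors above; the product key, provider id, digest, and URL are illustrative values, not taken from the commit):

```csharp
using System;
using StellaOps.Vexer.Core;

var product = new VexProduct(
    key: "pkg:nuget/Example.Lib@1.2.3",   // illustrative product key
    name: "Example.Lib",
    version: "1.2.3",
    purl: "pkg:nuget/Example.Lib@1.2.3");

var document = new VexClaimDocument(
    VexDocumentFormat.OpenVex,
    digest: "sha256:deadbeef",            // illustrative digest
    sourceUri: new Uri("https://vendor.example/vex/CVE-2025-0001.json"));

var claim = new VexClaim(
    vulnerabilityId: "CVE-2025-0001",
    providerId: "vendor-example",
    product: product,
    status: VexClaimStatus.NotAffected,
    document: document,
    firstSeen: DateTimeOffset.UtcNow.AddDays(-7),
    lastSeen: DateTimeOffset.UtcNow,
    justification: VexJustification.VulnerableCodeNotPresent,
    detail: "  Affected symbol is compiled out.  "); // constructor trims whitespace

// claim.Detail == "Affected symbol is compiled out."
// claim.AdditionalMetadata is an empty ImmutableSortedDictionary
```

Note that swapping `firstSeen` and `lastSeen` would throw `ArgumentOutOfRangeException`, and a blank `vulnerabilityId` or `providerId` throws `ArgumentException`.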
src/StellaOps.Vexer.Core/VexConnectorAbstractions.cs (new file, 88 lines)
@@ -0,0 +1,88 @@
using System;
using System.Collections.Immutable;
using System.Threading;
using System.Threading.Tasks;

namespace StellaOps.Vexer.Core;

/// <summary>
/// Shared connector contract for fetching and normalizing provider-specific VEX data.
/// </summary>
public interface IVexConnector
{
    string Id { get; }

    VexProviderKind Kind { get; }

    ValueTask ValidateAsync(VexConnectorSettings settings, CancellationToken cancellationToken);

    IAsyncEnumerable<VexRawDocument> FetchAsync(VexConnectorContext context, CancellationToken cancellationToken);

    ValueTask<VexClaimBatch> NormalizeAsync(VexRawDocument document, CancellationToken cancellationToken);
}

/// <summary>
/// Connector context populated by the orchestrator/worker.
/// </summary>
public sealed record VexConnectorContext(
    DateTimeOffset? Since,
    VexConnectorSettings Settings,
    IVexRawDocumentSink RawSink,
    IVexSignatureVerifier SignatureVerifier,
    IVexNormalizerRouter Normalizers,
    IServiceProvider Services);

/// <summary>
/// Normalized connector configuration values.
/// </summary>
public sealed record VexConnectorSettings(ImmutableDictionary<string, string> Values)
{
    public static VexConnectorSettings Empty { get; } = new(ImmutableDictionary<string, string>.Empty);
}

/// <summary>
/// Raw document retrieved from a connector pull.
/// </summary>
public sealed record VexRawDocument(
    string ProviderId,
    VexDocumentFormat Format,
    Uri SourceUri,
    DateTimeOffset RetrievedAt,
    string Digest,
    ReadOnlyMemory<byte> Content,
    ImmutableDictionary<string, string> Metadata)
{
    public Guid DocumentId { get; init; } = Guid.NewGuid();
}

/// <summary>
/// Batch of normalized claims derived from a raw document.
/// </summary>
public sealed record VexClaimBatch(
    VexRawDocument Source,
    ImmutableArray<VexClaim> Claims,
    ImmutableDictionary<string, string> Diagnostics);

/// <summary>
/// Sink abstraction allowing connectors to stream raw documents for persistence.
/// </summary>
public interface IVexRawDocumentSink
{
    ValueTask StoreAsync(VexRawDocument document, CancellationToken cancellationToken);
}

/// <summary>
/// Signature/attestation verification service used while ingesting documents.
/// </summary>
public interface IVexSignatureVerifier
{
    ValueTask<VexSignatureMetadata?> VerifyAsync(VexRawDocument document, CancellationToken cancellationToken);
}

/// <summary>
/// Normalizer router providing format-specific normalization helpers.
/// </summary>
public interface IVexNormalizerRouter
{
    ValueTask<VexClaimBatch> NormalizeAsync(VexRawDocument document, CancellationToken cancellationToken);
}
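A minimal no-op skeleton makes the connector contract concrete. This is a sketch only: the class name and id are made up, and `VexProviderKind.Vendor` is assumed to exist (the `VexProviderKind` enum is defined elsewhere in the module, not in this file):

```csharp
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Vexer.Core;

internal sealed class NoOpConnector : IVexConnector
{
    public string Id => "noop";                             // illustrative connector id
    public VexProviderKind Kind => VexProviderKind.Vendor;  // assumed member; enum lives outside this file

    public ValueTask ValidateAsync(VexConnectorSettings settings, CancellationToken cancellationToken)
        => ValueTask.CompletedTask; // nothing to validate for a no-op

    public async IAsyncEnumerable<VexRawDocument> FetchAsync(
        VexConnectorContext context,
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        // A real connector would pull provider documents (optionally since context.Since)
        // and push each through context.RawSink for persistence before yielding it.
        await Task.CompletedTask;
        yield break;
    }

    public ValueTask<VexClaimBatch> NormalizeAsync(VexRawDocument document, CancellationToken cancellationToken)
        => ValueTask.FromResult(new VexClaimBatch(
            document,
            ImmutableArray<VexClaim>.Empty,
            ImmutableDictionary<string, string>.Empty));
}
```

The orchestrator supplies the `VexConnectorContext`, so connectors stay free of storage and signature-verification concerns.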
src/StellaOps.Vexer.Core/VexConsensus.cs (new file, 194 lines)
@@ -0,0 +1,194 @@
using System.Collections.Immutable;
using System.Runtime.Serialization;

namespace StellaOps.Vexer.Core;

public sealed record VexConsensus
{
    public VexConsensus(
        string vulnerabilityId,
        VexProduct product,
        VexConsensusStatus status,
        DateTimeOffset calculatedAt,
        IEnumerable<VexConsensusSource> sources,
        IEnumerable<VexConsensusConflict>? conflicts = null,
        string? policyVersion = null,
        string? summary = null)
    {
        if (string.IsNullOrWhiteSpace(vulnerabilityId))
        {
            throw new ArgumentException("Vulnerability id must be provided.", nameof(vulnerabilityId));
        }

        VulnerabilityId = vulnerabilityId.Trim();
        Product = product ?? throw new ArgumentNullException(nameof(product));
        Status = status;
        CalculatedAt = calculatedAt;
        Sources = NormalizeSources(sources);
        Conflicts = NormalizeConflicts(conflicts);
        PolicyVersion = string.IsNullOrWhiteSpace(policyVersion) ? null : policyVersion.Trim();
        Summary = string.IsNullOrWhiteSpace(summary) ? null : summary.Trim();
    }

    public string VulnerabilityId { get; }

    public VexProduct Product { get; }

    public VexConsensusStatus Status { get; }

    public DateTimeOffset CalculatedAt { get; }

    public ImmutableArray<VexConsensusSource> Sources { get; }

    public ImmutableArray<VexConsensusConflict> Conflicts { get; }

    public string? PolicyVersion { get; }

    public string? Summary { get; }

    private static ImmutableArray<VexConsensusSource> NormalizeSources(IEnumerable<VexConsensusSource> sources)
    {
        if (sources is null)
        {
            throw new ArgumentNullException(nameof(sources));
        }

        var builder = ImmutableArray.CreateBuilder<VexConsensusSource>();
        builder.AddRange(sources);
        if (builder.Count == 0)
        {
            return ImmutableArray<VexConsensusSource>.Empty;
        }

        return builder
            .OrderBy(static x => x.ProviderId, StringComparer.Ordinal)
            .ThenBy(static x => x.DocumentDigest, StringComparer.Ordinal)
            .ToImmutableArray();
    }

    private static ImmutableArray<VexConsensusConflict> NormalizeConflicts(IEnumerable<VexConsensusConflict>? conflicts)
    {
        if (conflicts is null)
        {
            return ImmutableArray<VexConsensusConflict>.Empty;
        }

        var items = conflicts.ToArray();
        return items.Length == 0
            ? ImmutableArray<VexConsensusConflict>.Empty
            : items
                .OrderBy(static x => x.ProviderId, StringComparer.Ordinal)
                .ThenBy(static x => x.DocumentDigest, StringComparer.Ordinal)
                .ToImmutableArray();
    }
}

public sealed record VexConsensusSource
{
    public VexConsensusSource(
        string providerId,
        VexClaimStatus status,
        string documentDigest,
        double weight,
        VexJustification? justification = null,
        string? detail = null,
        VexConfidence? confidence = null)
    {
        if (string.IsNullOrWhiteSpace(providerId))
        {
            throw new ArgumentException("Provider id must be provided.", nameof(providerId));
        }

        if (string.IsNullOrWhiteSpace(documentDigest))
        {
            throw new ArgumentException("Document digest must be provided.", nameof(documentDigest));
        }

        if (double.IsNaN(weight) || double.IsInfinity(weight) || weight < 0)
        {
            throw new ArgumentOutOfRangeException(nameof(weight), "Weight must be a finite, non-negative number.");
        }

        ProviderId = providerId.Trim();
        Status = status;
        DocumentDigest = documentDigest.Trim();
        Weight = weight;
        Justification = justification;
        Detail = string.IsNullOrWhiteSpace(detail) ? null : detail.Trim();
        Confidence = confidence;
    }

    public string ProviderId { get; }

    public VexClaimStatus Status { get; }

    public string DocumentDigest { get; }

    public double Weight { get; }

    public VexJustification? Justification { get; }

    public string? Detail { get; }

    public VexConfidence? Confidence { get; }
}

public sealed record VexConsensusConflict
{
    public VexConsensusConflict(
        string providerId,
        VexClaimStatus status,
        string documentDigest,
        VexJustification? justification = null,
        string? detail = null,
        string? reason = null)
    {
        if (string.IsNullOrWhiteSpace(providerId))
        {
            throw new ArgumentException("Provider id must be provided.", nameof(providerId));
        }

        if (string.IsNullOrWhiteSpace(documentDigest))
        {
            throw new ArgumentException("Document digest must be provided.", nameof(documentDigest));
        }

        ProviderId = providerId.Trim();
        Status = status;
        DocumentDigest = documentDigest.Trim();
        Justification = justification;
        Detail = string.IsNullOrWhiteSpace(detail) ? null : detail.Trim();
        Reason = string.IsNullOrWhiteSpace(reason) ? null : reason.Trim();
    }

    public string ProviderId { get; }

    public VexClaimStatus Status { get; }

    public string DocumentDigest { get; }

    public VexJustification? Justification { get; }

    public string? Detail { get; }

    public string? Reason { get; }
}

[DataContract]
public enum VexConsensusStatus
{
    [EnumMember(Value = "affected")]
    Affected,

    [EnumMember(Value = "not_affected")]
    NotAffected,

    [EnumMember(Value = "fixed")]
    Fixed,

    [EnumMember(Value = "under_investigation")]
    UnderInvestigation,

    [EnumMember(Value = "divergent")]
    Divergent,
}
src/StellaOps.Vexer.Core/VexConsensusPolicyOptions.cs (new file, 82 lines)
@@ -0,0 +1,82 @@
using System.Collections.Immutable;

namespace StellaOps.Vexer.Core;

public sealed record VexConsensusPolicyOptions
{
    public const string BaselineVersion = "baseline/v1";

    public VexConsensusPolicyOptions(
        string? version = null,
        double vendorWeight = 1.0,
        double distroWeight = 0.9,
        double platformWeight = 0.7,
        double hubWeight = 0.5,
        double attestationWeight = 0.6,
        IEnumerable<KeyValuePair<string, double>>? providerOverrides = null)
    {
        Version = string.IsNullOrWhiteSpace(version) ? BaselineVersion : version.Trim();
        VendorWeight = NormalizeWeight(vendorWeight);
        DistroWeight = NormalizeWeight(distroWeight);
        PlatformWeight = NormalizeWeight(platformWeight);
        HubWeight = NormalizeWeight(hubWeight);
        AttestationWeight = NormalizeWeight(attestationWeight);
        ProviderOverrides = NormalizeOverrides(providerOverrides);
    }

    public string Version { get; }

    public double VendorWeight { get; }

    public double DistroWeight { get; }

    public double PlatformWeight { get; }

    public double HubWeight { get; }

    public double AttestationWeight { get; }

    public ImmutableDictionary<string, double> ProviderOverrides { get; }

    private static double NormalizeWeight(double weight)
    {
        if (double.IsNaN(weight) || double.IsInfinity(weight))
        {
            throw new ArgumentOutOfRangeException(nameof(weight), "Weight must be a finite number.");
        }

        if (weight <= 0)
        {
            return 0;
        }

        if (weight >= 1)
        {
            return 1;
        }

        return weight;
    }

    private static ImmutableDictionary<string, double> NormalizeOverrides(
        IEnumerable<KeyValuePair<string, double>>? overrides)
    {
        if (overrides is null)
        {
            return ImmutableDictionary<string, double>.Empty;
        }

        var builder = ImmutableDictionary.CreateBuilder<string, double>(StringComparer.Ordinal);
        foreach (var (key, weight) in overrides)
        {
            if (string.IsNullOrWhiteSpace(key))
            {
                continue;
            }

            builder[key.Trim()] = NormalizeWeight(weight);
        }

        return builder.ToImmutable();
    }
}
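The policy can then be tuned per deployment (a usage sketch; the version label and provider ids are illustrative). Note that out-of-range weights are clamped into [0, 1] rather than rejected, while NaN or infinite values throw:

```csharp
using System.Collections.Generic;
using StellaOps.Vexer.Core;

var policy = new VexConsensusPolicyOptions(
    version: "acme/v2",     // custom label; null falls back to VexConsensusPolicyOptions.BaselineVersion
    vendorWeight: 1.0,
    distroWeight: 0.8,
    hubWeight: 1.7,         // clamped to 1.0 by NormalizeWeight
    providerOverrides: new Dictionary<string, double>
    {
        ["redhat"] = 1.0,        // illustrative provider ids
        ["community-hub"] = 0.25,
    });

// policy.HubWeight == 1.0
// policy.ProviderOverrides["community-hub"] == 0.25
```

Per-provider overrides win over the kind-level weights, which gives operators an escape hatch without forking the baseline policy.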
src/StellaOps.Vexer.Core/VexConsensusResolver.cs (new file, 289 lines)
@@ -0,0 +1,289 @@
using System.Collections.Immutable;
using System.Globalization;

namespace StellaOps.Vexer.Core;

public sealed class VexConsensusResolver
{
    private readonly IVexConsensusPolicy _policy;

    public VexConsensusResolver(IVexConsensusPolicy policy)
    {
        _policy = policy ?? throw new ArgumentNullException(nameof(policy));
    }

    public VexConsensusResolution Resolve(VexConsensusRequest request)
    {
        if (request is null)
        {
            throw new ArgumentNullException(nameof(request));
        }

        var orderedClaims = request.Claims
            .OrderBy(static claim => claim.ProviderId, StringComparer.Ordinal)
            .ThenBy(static claim => claim.Document.Digest, StringComparer.Ordinal)
            .ThenBy(static claim => claim.Document.SourceUri.ToString(), StringComparer.Ordinal)
            .ToArray();

        var decisions = ImmutableArray.CreateBuilder<VexConsensusDecisionTelemetry>(orderedClaims.Length);
        var acceptedSources = new List<VexConsensusSource>(orderedClaims.Length);
        var conflicts = new List<VexConsensusConflict>();
        var conflictKeys = new HashSet<string>(StringComparer.Ordinal);
        var weightByStatus = new Dictionary<VexClaimStatus, double>();

        foreach (var claim in orderedClaims)
        {
            request.Providers.TryGetValue(claim.ProviderId, out var provider);

            string? rejectionReason = null;
            double weight = 0;
            var included = false;

            if (provider is null)
            {
                rejectionReason = "provider_not_registered";
            }
            else
            {
                weight = NormalizeWeight(_policy.GetProviderWeight(provider));
                if (weight <= 0)
                {
                    rejectionReason = "weight_not_positive";
                }
                else if (!_policy.IsClaimEligible(claim, provider, out rejectionReason))
                {
                    rejectionReason ??= "rejected_by_policy";
                }
                else
                {
                    included = true;
                    TrackStatusWeight(weightByStatus, claim.Status, weight);
                    acceptedSources.Add(new VexConsensusSource(
                        claim.ProviderId,
                        claim.Status,
                        claim.Document.Digest,
                        weight,
                        claim.Justification,
                        claim.Detail,
                        claim.Confidence));
                }
            }

            if (!included)
            {
                var conflict = new VexConsensusConflict(
                    claim.ProviderId,
                    claim.Status,
                    claim.Document.Digest,
                    claim.Justification,
                    claim.Detail,
                    rejectionReason);
                if (conflictKeys.Add(CreateConflictKey(conflict.ProviderId, conflict.DocumentDigest)))
                {
                    conflicts.Add(conflict);
                }
            }

            decisions.Add(new VexConsensusDecisionTelemetry(
                claim.ProviderId,
                claim.Document.Digest,
                claim.Status,
                included,
                weight,
                rejectionReason,
                claim.Justification,
                claim.Detail));
        }

        var consensusStatus = DetermineConsensusStatus(weightByStatus);
        var summary = BuildSummary(weightByStatus, consensusStatus);

        var consensus = new VexConsensus(
            request.VulnerabilityId,
            request.Product,
            consensusStatus,
            request.CalculatedAt,
            acceptedSources,
            AttachConflictDetails(conflicts, acceptedSources, consensusStatus, conflictKeys),
            _policy.Version,
            summary);

        return new VexConsensusResolution(consensus, decisions.ToImmutable());
    }

    private static Dictionary<VexClaimStatus, double> TrackStatusWeight(
        Dictionary<VexClaimStatus, double> accumulator,
        VexClaimStatus status,
        double weight)
    {
        if (accumulator.TryGetValue(status, out var current))
        {
            accumulator[status] = current + weight;
        }
        else
        {
            accumulator[status] = weight;
        }

        return accumulator;
    }

    private static double NormalizeWeight(double weight)
    {
        if (double.IsNaN(weight) || double.IsInfinity(weight) || weight <= 0)
        {
            return 0;
        }

        if (weight >= 1)
        {
            return 1;
        }

        return weight;
    }

    private static VexConsensusStatus DetermineConsensusStatus(
        IReadOnlyDictionary<VexClaimStatus, double> weights)
    {
        if (weights.Count == 0)
        {
            return VexConsensusStatus.UnderInvestigation;
        }

        var ordered = weights
            .OrderByDescending(static pair => pair.Value)
            .ThenBy(static pair => pair.Key)
            .ToArray();

        var topStatus = ordered[0].Key;
        var topWeight = ordered[0].Value;
        var totalWeight = ordered.Sum(static pair => pair.Value);
        var remainder = totalWeight - topWeight;

        if (topWeight <= 0)
        {
            return VexConsensusStatus.UnderInvestigation;
        }

        if (topWeight > remainder)
        {
            return topStatus switch
            {
                VexClaimStatus.Affected => VexConsensusStatus.Affected,
                VexClaimStatus.Fixed => VexConsensusStatus.Fixed,
                VexClaimStatus.NotAffected => VexConsensusStatus.NotAffected,
                _ => VexConsensusStatus.UnderInvestigation,
            };
        }

        return VexConsensusStatus.UnderInvestigation;
    }

    private static string BuildSummary(
        IReadOnlyDictionary<VexClaimStatus, double> weights,
        VexConsensusStatus status)
    {
        if (weights.Count == 0)
        {
            return "No eligible claims met policy requirements.";
        }

        var breakdown = string.Join(
            ", ",
            weights
                .OrderByDescending(static pair => pair.Value)
                .ThenBy(static pair => pair.Key)
                .Select(pair => $"{FormatStatus(pair.Key)}={pair.Value.ToString("0.###", CultureInfo.InvariantCulture)}"));

        if (status == VexConsensusStatus.UnderInvestigation)
        {
            return $"No majority consensus; weighted breakdown {breakdown}.";
        }

        return $"{FormatStatus(status)} determined via weighted majority; breakdown {breakdown}.";
    }

    private static List<VexConsensusConflict> AttachConflictDetails(
        List<VexConsensusConflict> conflicts,
        IEnumerable<VexConsensusSource> acceptedSources,
        VexConsensusStatus status,
        HashSet<string> conflictKeys)
    {
        var consensusClaimStatus = status switch
        {
            VexConsensusStatus.Affected => VexClaimStatus.Affected,
            VexConsensusStatus.NotAffected => VexClaimStatus.NotAffected,
            VexConsensusStatus.Fixed => VexClaimStatus.Fixed,
            VexConsensusStatus.UnderInvestigation => (VexClaimStatus?)null,
            VexConsensusStatus.Divergent => (VexClaimStatus?)null,
            _ => null,
        };

        foreach (var source in acceptedSources)
        {
            if (consensusClaimStatus is null || source.Status != consensusClaimStatus.Value)
            {
                var conflict = new VexConsensusConflict(
                    source.ProviderId,
                    source.Status,
                    source.DocumentDigest,
                    source.Justification,
                    source.Detail,
                    consensusClaimStatus is null ? "no_majority" : "status_conflict");

                if (conflictKeys.Add(CreateConflictKey(conflict.ProviderId, conflict.DocumentDigest)))
                {
                    conflicts.Add(conflict);
                }
            }
        }

        return conflicts;
    }

    private static string FormatStatus(VexClaimStatus status)
        => status switch
        {
            VexClaimStatus.Affected => "affected",
            VexClaimStatus.NotAffected => "not_affected",
            VexClaimStatus.Fixed => "fixed",
            VexClaimStatus.UnderInvestigation => "under_investigation",
            _ => status.ToString().ToLowerInvariant(),
        };

    // Overload for the consensus-level status used by BuildSummary; without it the
    // FormatStatus(status) call above would not compile (no implicit enum conversion).
    private static string FormatStatus(VexConsensusStatus status)
        => status switch
        {
            VexConsensusStatus.Affected => "affected",
            VexConsensusStatus.NotAffected => "not_affected",
            VexConsensusStatus.Fixed => "fixed",
            VexConsensusStatus.UnderInvestigation => "under_investigation",
            VexConsensusStatus.Divergent => "divergent",
            _ => status.ToString().ToLowerInvariant(),
        };

    private static string CreateConflictKey(string providerId, string documentDigest)
|
||||
=> $"{providerId}|{documentDigest}";
|
||||
|
||||
private static string FormatStatus(VexConsensusStatus status)
|
||||
=> status switch
|
||||
{
|
||||
VexConsensusStatus.Affected => "affected",
|
||||
VexConsensusStatus.NotAffected => "not_affected",
|
||||
VexConsensusStatus.Fixed => "fixed",
|
||||
VexConsensusStatus.UnderInvestigation => "under_investigation",
|
||||
VexConsensusStatus.Divergent => "divergent",
|
||||
_ => status.ToString().ToLowerInvariant(),
|
||||
};
|
||||
}
|
||||
|
||||
public sealed record VexConsensusRequest(
|
||||
string VulnerabilityId,
|
||||
VexProduct Product,
|
||||
IReadOnlyList<VexClaim> Claims,
|
||||
IReadOnlyDictionary<string, VexProvider> Providers,
|
||||
DateTimeOffset CalculatedAt);
|
||||
|
||||
public sealed record VexConsensusResolution(
|
||||
VexConsensus Consensus,
|
||||
ImmutableArray<VexConsensusDecisionTelemetry> DecisionLog);
|
||||
|
||||
public sealed record VexConsensusDecisionTelemetry(
|
||||
string ProviderId,
|
||||
string DocumentDigest,
|
||||
VexClaimStatus Status,
|
||||
bool Included,
|
||||
double Weight,
|
||||
string? Reason,
|
||||
VexJustification? Justification,
|
||||
string? Detail);
|
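The resolver above applies a strict weighted-majority rule: the top-weighted status wins only when its weight exceeds the combined weight of every other status, otherwise consensus falls back to `under_investigation`. A minimal standalone sketch of that rule over a plain status/weight map (the statuses and weights here are illustrative, not taken from the codebase):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static string ResolveMajority(IReadOnlyDictionary<string, double> weights)
{
    if (weights.Count == 0)
    {
        return "under_investigation";
    }

    var top = weights.OrderByDescending(pair => pair.Value).First();
    var remainder = weights.Values.Sum() - top.Value;

    // The winner must strictly outweigh all other statuses combined.
    return top.Value > 0 && top.Value > remainder ? top.Key : "under_investigation";
}

// 0.6 > (0.3 + 0.2), so "affected" carries the majority here.
var example = new Dictionary<string, double> { ["affected"] = 0.6, ["not_affected"] = 0.3, ["fixed"] = 0.2 };
Console.WriteLine(ResolveMajority(example)); // affected
```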
257 src/StellaOps.Vexer.Core/VexExportManifest.cs Normal file
@@ -0,0 +1,257 @@
using System.Collections.Immutable;
using System.Runtime.Serialization;
using System.Text;

namespace StellaOps.Vexer.Core;

public sealed record VexExportManifest
{
    public VexExportManifest(
        string exportId,
        VexQuerySignature querySignature,
        VexExportFormat format,
        DateTimeOffset createdAt,
        VexContentAddress artifact,
        int claimCount,
        IEnumerable<string> sourceProviders,
        bool fromCache = false,
        string? consensusRevision = null,
        VexAttestationMetadata? attestation = null,
        long sizeBytes = 0)
    {
        if (string.IsNullOrWhiteSpace(exportId))
        {
            throw new ArgumentException("Export id must be provided.", nameof(exportId));
        }

        if (claimCount < 0)
        {
            throw new ArgumentOutOfRangeException(nameof(claimCount), "Claim count cannot be negative.");
        }

        if (sizeBytes < 0)
        {
            throw new ArgumentOutOfRangeException(nameof(sizeBytes), "Export size cannot be negative.");
        }

        ExportId = exportId.Trim();
        QuerySignature = querySignature ?? throw new ArgumentNullException(nameof(querySignature));
        Format = format;
        CreatedAt = createdAt;
        Artifact = artifact ?? throw new ArgumentNullException(nameof(artifact));
        ClaimCount = claimCount;
        FromCache = fromCache;
        SourceProviders = NormalizeProviders(sourceProviders);
        ConsensusRevision = string.IsNullOrWhiteSpace(consensusRevision) ? null : consensusRevision.Trim();
        Attestation = attestation;
        SizeBytes = sizeBytes;
    }

    public string ExportId { get; }

    public VexQuerySignature QuerySignature { get; }

    public VexExportFormat Format { get; }

    public DateTimeOffset CreatedAt { get; }

    public VexContentAddress Artifact { get; }

    public int ClaimCount { get; }

    // Init-only so callers can mark cached reuse via `manifest with { FromCache = true }`.
    public bool FromCache { get; init; }

    public ImmutableArray<string> SourceProviders { get; }

    public string? ConsensusRevision { get; }

    public VexAttestationMetadata? Attestation { get; }

    public long SizeBytes { get; }

    private static ImmutableArray<string> NormalizeProviders(IEnumerable<string> providers)
    {
        if (providers is null)
        {
            throw new ArgumentNullException(nameof(providers));
        }

        var set = new SortedSet<string>(StringComparer.Ordinal);
        foreach (var provider in providers)
        {
            if (string.IsNullOrWhiteSpace(provider))
            {
                continue;
            }

            set.Add(provider.Trim());
        }

        return set.Count == 0
            ? ImmutableArray<string>.Empty
            : set.ToImmutableArray();
    }
}

public sealed record VexContentAddress
{
    public VexContentAddress(string algorithm, string digest)
    {
        if (string.IsNullOrWhiteSpace(algorithm))
        {
            throw new ArgumentException("Content algorithm must be provided.", nameof(algorithm));
        }

        if (string.IsNullOrWhiteSpace(digest))
        {
            throw new ArgumentException("Content digest must be provided.", nameof(digest));
        }

        Algorithm = algorithm.Trim();
        Digest = digest.Trim();
    }

    public string Algorithm { get; }

    public string Digest { get; }

    public string ToUri() => $"{Algorithm}:{Digest}";

    public override string ToString() => ToUri();
}

public sealed record VexAttestationMetadata
{
    public VexAttestationMetadata(
        string predicateType,
        VexRekorReference? rekor = null,
        string? envelopeDigest = null,
        DateTimeOffset? signedAt = null)
    {
        if (string.IsNullOrWhiteSpace(predicateType))
        {
            throw new ArgumentException("Predicate type must be provided.", nameof(predicateType));
        }

        PredicateType = predicateType.Trim();
        Rekor = rekor;
        EnvelopeDigest = string.IsNullOrWhiteSpace(envelopeDigest) ? null : envelopeDigest.Trim();
        SignedAt = signedAt;
    }

    public string PredicateType { get; }

    public VexRekorReference? Rekor { get; }

    public string? EnvelopeDigest { get; }

    public DateTimeOffset? SignedAt { get; }
}

public sealed record VexRekorReference
{
    public VexRekorReference(string apiVersion, string location, string? logIndex = null, Uri? inclusionProofUri = null)
    {
        if (string.IsNullOrWhiteSpace(apiVersion))
        {
            throw new ArgumentException("Rekor API version must be provided.", nameof(apiVersion));
        }

        if (string.IsNullOrWhiteSpace(location))
        {
            throw new ArgumentException("Rekor location must be provided.", nameof(location));
        }

        ApiVersion = apiVersion.Trim();
        Location = location.Trim();
        LogIndex = string.IsNullOrWhiteSpace(logIndex) ? null : logIndex.Trim();
        InclusionProofUri = inclusionProofUri;
    }

    public string ApiVersion { get; }

    public string Location { get; }

    public string? LogIndex { get; }

    public Uri? InclusionProofUri { get; }
}

public sealed partial record VexQuerySignature
{
    public VexQuerySignature(string value)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            throw new ArgumentException("Query signature must be provided.", nameof(value));
        }

        Value = value.Trim();
    }

    public string Value { get; }

    public static VexQuerySignature FromFilters(IEnumerable<KeyValuePair<string, string>> filters)
    {
        if (filters is null)
        {
            throw new ArgumentNullException(nameof(filters));
        }

        var builder = ImmutableArray.CreateBuilder<KeyValuePair<string, string>>();
        foreach (var pair in filters)
        {
            if (string.IsNullOrWhiteSpace(pair.Key))
            {
                continue;
            }

            var key = pair.Key.Trim();
            var value = pair.Value?.Trim() ?? string.Empty;
            builder.Add(new KeyValuePair<string, string>(key, value));
        }

        if (builder.Count == 0)
        {
            throw new ArgumentException("At least one filter is required to build a query signature.", nameof(filters));
        }

        var ordered = builder
            .OrderBy(static pair => pair.Key, StringComparer.Ordinal)
            .ThenBy(static pair => pair.Value, StringComparer.Ordinal)
            .ToImmutableArray();

        var sb = new StringBuilder();
        for (var i = 0; i < ordered.Length; i++)
        {
            if (i > 0)
            {
                sb.Append('&');
            }

            sb.Append(ordered[i].Key);
            sb.Append('=');
            sb.Append(ordered[i].Value);
        }

        return new VexQuerySignature(sb.ToString());
    }

    public override string ToString() => Value;
}

[DataContract]
public enum VexExportFormat
{
    [EnumMember(Value = "json")]
    Json,

    [EnumMember(Value = "jsonl")]
    JsonLines,

    [EnumMember(Value = "openvex")]
    OpenVex,

    [EnumMember(Value = "csaf")]
    Csaf,
}
30 src/StellaOps.Vexer.Core/VexExporterAbstractions.cs Normal file
@@ -0,0 +1,30 @@
using System;
using System.Collections.Immutable;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

namespace StellaOps.Vexer.Core;

public interface IVexExporter
{
    VexExportFormat Format { get; }

    VexContentAddress Digest(VexExportRequest request);

    ValueTask<VexExportResult> SerializeAsync(
        VexExportRequest request,
        Stream output,
        CancellationToken cancellationToken);
}

public sealed record VexExportRequest(
    VexQuery Query,
    ImmutableArray<VexConsensus> Consensus,
    ImmutableArray<VexClaim> Claims,
    DateTimeOffset GeneratedAt);

public sealed record VexExportResult(
    VexContentAddress Digest,
    long BytesWritten,
    ImmutableDictionary<string, string> Metadata);
28 src/StellaOps.Vexer.Core/VexNormalizerAbstractions.cs Normal file
@@ -0,0 +1,28 @@
using System;
using System.Collections.Immutable;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

namespace StellaOps.Vexer.Core;

/// <summary>
/// Normalizer contract for translating raw connector documents into canonical claims.
/// </summary>
public interface IVexNormalizer
{
    string Format { get; }

    bool CanHandle(VexRawDocument document);

    ValueTask<VexClaimBatch> NormalizeAsync(VexRawDocument document, VexProvider provider, CancellationToken cancellationToken);
}

/// <summary>
/// Registry that maps formats to registered normalizers.
/// </summary>
public sealed record VexNormalizerRegistry(ImmutableArray<IVexNormalizer> Normalizers)
{
    public IVexNormalizer? Resolve(VexRawDocument document)
        => Normalizers.FirstOrDefault(normalizer => normalizer.CanHandle(document));
}
206 src/StellaOps.Vexer.Core/VexProvider.cs Normal file
@@ -0,0 +1,206 @@
using System.Collections.Immutable;
using System.Runtime.Serialization;

namespace StellaOps.Vexer.Core;

/// <summary>
/// Metadata describing a VEX provider (vendor, distro, hub, platform).
/// </summary>
public sealed record VexProvider
{
    public VexProvider(
        string id,
        string displayName,
        VexProviderKind kind,
        IEnumerable<Uri>? baseUris = null,
        VexProviderDiscovery? discovery = null,
        VexProviderTrust? trust = null,
        bool enabled = true)
    {
        if (string.IsNullOrWhiteSpace(id))
        {
            throw new ArgumentException("Provider id must be non-empty.", nameof(id));
        }

        if (string.IsNullOrWhiteSpace(displayName))
        {
            throw new ArgumentException("Provider display name must be non-empty.", nameof(displayName));
        }

        Id = id.Trim();
        DisplayName = displayName.Trim();
        Kind = kind;
        BaseUris = NormalizeUris(baseUris);
        Discovery = discovery ?? VexProviderDiscovery.Empty;
        Trust = trust ?? VexProviderTrust.Default;
        Enabled = enabled;
    }

    public string Id { get; }

    public string DisplayName { get; }

    public VexProviderKind Kind { get; }

    public ImmutableArray<Uri> BaseUris { get; }

    public VexProviderDiscovery Discovery { get; }

    public VexProviderTrust Trust { get; }

    public bool Enabled { get; }

    private static ImmutableArray<Uri> NormalizeUris(IEnumerable<Uri>? baseUris)
    {
        if (baseUris is null)
        {
            return ImmutableArray<Uri>.Empty;
        }

        var distinct = new HashSet<string>(StringComparer.Ordinal);
        var builder = ImmutableArray.CreateBuilder<Uri>();
        foreach (var uri in baseUris)
        {
            if (uri is null)
            {
                continue;
            }

            var canonical = uri.ToString();
            if (distinct.Add(canonical))
            {
                builder.Add(uri);
            }
        }

        if (builder.Count == 0)
        {
            return ImmutableArray<Uri>.Empty;
        }

        return builder
            .OrderBy(static x => x.ToString(), StringComparer.Ordinal)
            .ToImmutableArray();
    }
}

public sealed record VexProviderDiscovery
{
    public static readonly VexProviderDiscovery Empty = new(null, null);

    public VexProviderDiscovery(Uri? wellKnownMetadata, Uri? rolieService)
    {
        WellKnownMetadata = wellKnownMetadata;
        RolieService = rolieService;
    }

    public Uri? WellKnownMetadata { get; }

    public Uri? RolieService { get; }
}

public sealed record VexProviderTrust
{
    public static readonly VexProviderTrust Default = new(1.0, null, ImmutableArray<string>.Empty);

    public VexProviderTrust(
        double weight,
        VexCosignTrust? cosign,
        IEnumerable<string>? pgpFingerprints = null)
    {
        Weight = NormalizeWeight(weight);
        Cosign = cosign;
        PgpFingerprints = NormalizeFingerprints(pgpFingerprints);
    }

    public double Weight { get; }

    public VexCosignTrust? Cosign { get; }

    public ImmutableArray<string> PgpFingerprints { get; }

    private static double NormalizeWeight(double weight)
    {
        if (double.IsNaN(weight) || double.IsInfinity(weight))
        {
            throw new ArgumentOutOfRangeException(nameof(weight), "Weight must be a finite number.");
        }

        if (weight <= 0)
        {
            return 0.0;
        }

        if (weight >= 1.0)
        {
            return 1.0;
        }

        return weight;
    }

    private static ImmutableArray<string> NormalizeFingerprints(IEnumerable<string>? values)
    {
        if (values is null)
        {
            return ImmutableArray<string>.Empty;
        }

        var set = new SortedSet<string>(StringComparer.Ordinal);
        foreach (var value in values)
        {
            if (string.IsNullOrWhiteSpace(value))
            {
                continue;
            }

            set.Add(value.Trim());
        }

        return set.Count == 0
            ? ImmutableArray<string>.Empty
            : set.ToImmutableArray();
    }
}

public sealed record VexCosignTrust
{
    public VexCosignTrust(string issuer, string identityPattern)
    {
        if (string.IsNullOrWhiteSpace(issuer))
        {
            throw new ArgumentException("Issuer must be provided for cosign trust metadata.", nameof(issuer));
        }

        if (string.IsNullOrWhiteSpace(identityPattern))
        {
            throw new ArgumentException("Identity pattern must be provided for cosign trust metadata.", nameof(identityPattern));
        }

        Issuer = issuer.Trim();
        IdentityPattern = identityPattern.Trim();
    }

    public string Issuer { get; }

    public string IdentityPattern { get; }
}

[DataContract]
public enum VexProviderKind
{
    [EnumMember(Value = "vendor")]
    Vendor,

    [EnumMember(Value = "distro")]
    Distro,

    [EnumMember(Value = "hub")]
    Hub,

    [EnumMember(Value = "platform")]
    Platform,

    [EnumMember(Value = "attestation")]
    Attestation,
}
143 src/StellaOps.Vexer.Core/VexQuery.cs Normal file
@@ -0,0 +1,143 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Globalization;
using System.Linq;
using System.Security.Cryptography;
using System.Text;

namespace StellaOps.Vexer.Core;

public sealed record VexQuery(
    ImmutableArray<VexQueryFilter> Filters,
    ImmutableArray<VexQuerySort> Sort,
    int? Limit = null,
    int? Offset = null,
    string? View = null)
{
    public static VexQuery Empty { get; } = new(
        ImmutableArray<VexQueryFilter>.Empty,
        ImmutableArray<VexQuerySort>.Empty);

    public static VexQuery Create(
        IEnumerable<VexQueryFilter>? filters = null,
        IEnumerable<VexQuerySort>? sort = null,
        int? limit = null,
        int? offset = null,
        string? view = null)
    {
        var normalizedFilters = NormalizeFilters(filters);
        var normalizedSort = NormalizeSort(sort);
        return new VexQuery(normalizedFilters, normalizedSort, NormalizeBound(limit), NormalizeBound(offset), NormalizeView(view));
    }

    public VexQuery WithFilters(IEnumerable<VexQueryFilter> filters)
        => this with { Filters = NormalizeFilters(filters) };

    public VexQuery WithSort(IEnumerable<VexQuerySort> sort)
        => this with { Sort = NormalizeSort(sort) };

    public VexQuery WithBounds(int? limit = null, int? offset = null)
        => this with { Limit = NormalizeBound(limit), Offset = NormalizeBound(offset) };

    public VexQuery WithView(string? view)
        => this with { View = NormalizeView(view) };

    private static ImmutableArray<VexQueryFilter> NormalizeFilters(IEnumerable<VexQueryFilter>? filters)
    {
        if (filters is null)
        {
            return ImmutableArray<VexQueryFilter>.Empty;
        }

        return filters
            .Where(filter => !string.IsNullOrWhiteSpace(filter.Key))
            .Select(filter => new VexQueryFilter(filter.Key.Trim(), filter.Value?.Trim() ?? string.Empty))
            .OrderBy(filter => filter.Key, StringComparer.Ordinal)
            .ThenBy(filter => filter.Value, StringComparer.Ordinal)
            .ToImmutableArray();
    }

    private static ImmutableArray<VexQuerySort> NormalizeSort(IEnumerable<VexQuerySort>? sort)
    {
        if (sort is null)
        {
            return ImmutableArray<VexQuerySort>.Empty;
        }

        return sort
            .Where(s => !string.IsNullOrWhiteSpace(s.Field))
            .Select(s => new VexQuerySort(s.Field.Trim(), s.Descending))
            .OrderBy(s => s.Field, StringComparer.Ordinal)
            .ThenBy(s => s.Descending)
            .ToImmutableArray();
    }

    private static int? NormalizeBound(int? value)
    {
        if (value is null)
        {
            return null;
        }

        if (value.Value < 0)
        {
            return 0;
        }

        return value.Value;
    }

    private static string? NormalizeView(string? view)
        => string.IsNullOrWhiteSpace(view) ? null : view.Trim();
}

public sealed record VexQueryFilter(string Key, string Value);

public sealed record VexQuerySort(string Field, bool Descending);

public sealed partial record VexQuerySignature
{
    public static VexQuerySignature FromQuery(VexQuery query)
    {
        if (query is null)
        {
            throw new ArgumentNullException(nameof(query));
        }

        var components = new List<string>(query.Filters.Length + query.Sort.Length + 3);
        components.AddRange(query.Filters.Select(filter => $"{filter.Key}={filter.Value}"));
        components.AddRange(query.Sort.Select(sort => sort.Descending ? $"sort=-{sort.Field}" : $"sort=+{sort.Field}"));

        if (query.Limit is not null)
        {
            components.Add($"limit={query.Limit.Value.ToString(CultureInfo.InvariantCulture)}");
        }

        if (query.Offset is not null)
        {
            components.Add($"offset={query.Offset.Value.ToString(CultureInfo.InvariantCulture)}");
        }

        if (!string.IsNullOrWhiteSpace(query.View))
        {
            components.Add($"view={query.View}");
        }

        return new VexQuerySignature(string.Join('&', components));
    }

    public VexContentAddress ComputeHash()
    {
        using var sha = SHA256.Create();
        var bytes = Encoding.UTF8.GetBytes(Value);
        var digest = sha.ComputeHash(bytes);
        var builder = new StringBuilder(digest.Length * 2);
        foreach (var b in digest)
        {
            _ = builder.Append(b.ToString("x2", CultureInfo.InvariantCulture));
        }

        return new VexContentAddress("sha256", builder.ToString());
    }
}
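`FromQuery` canonicalizes a query into an ordered `key=value` string (normalized filters, then sort directives, then bounds and view), so structurally identical queries yield the same signature and therefore the same export cache key. A usage sketch, with illustrative filter values:

```csharp
var query = VexQuery.Create(
    filters: new[]
    {
        new VexQueryFilter("vulnId", "CVE-2025-0001"),
        new VexQueryFilter("provider", "vendor"),
    },
    limit: 100);

var signature = VexQuerySignature.FromQuery(query);
// Filters were sorted ordinally during normalization, so input order is irrelevant:
// signature.Value == "provider=vendor&vulnId=CVE-2025-0001&limit=100"

var address = signature.ComputeHash();
// address.ToUri() is "sha256:" followed by the lowercase hex SHA-256 of the signature string.
```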
132 src/StellaOps.Vexer.Export.Tests/ExportEngineTests.cs Normal file
@@ -0,0 +1,132 @@
using System;
using System.Collections.Immutable;
using System.IO;
using System.Text;
using Microsoft.Extensions.Logging.Abstractions;
using StellaOps.Vexer.Core;
using StellaOps.Vexer.Export;
using StellaOps.Vexer.Policy;
using StellaOps.Vexer.Storage.Mongo;
using Xunit;

namespace StellaOps.Vexer.Export.Tests;

public sealed class ExportEngineTests
{
    [Fact]
    public async Task ExportAsync_GeneratesAndCachesManifest()
    {
        var store = new InMemoryExportStore();
        var evaluator = new StaticPolicyEvaluator("baseline/v1");
        var dataSource = new InMemoryExportDataSource();
        var exporter = new DummyExporter(VexExportFormat.Json);
        var engine = new VexExportEngine(store, evaluator, dataSource, new[] { exporter }, NullLogger<VexExportEngine>.Instance);

        var query = VexQuery.Create(new[] { new VexQueryFilter("vulnId", "CVE-2025-0001") });
        var context = new VexExportRequestContext(query, VexExportFormat.Json, DateTimeOffset.UtcNow, ForceRefresh: false);

        var manifest = await engine.ExportAsync(context, CancellationToken.None);

        Assert.False(manifest.FromCache);
        Assert.Equal(VexExportFormat.Json, manifest.Format);
        Assert.Equal("baseline/v1", manifest.ConsensusRevision);
        Assert.Equal(1, manifest.ClaimCount);

        // The second call hits the cache.
        var cached = await engine.ExportAsync(context, CancellationToken.None);
        Assert.True(cached.FromCache);
        Assert.Equal(manifest.ExportId, cached.ExportId);
    }

    private sealed class InMemoryExportStore : IVexExportStore
    {
        private readonly Dictionary<string, VexExportManifest> _store = new(StringComparer.Ordinal);

        public ValueTask<VexExportManifest?> FindAsync(VexQuerySignature signature, VexExportFormat format, CancellationToken cancellationToken)
        {
            var key = CreateKey(signature.Value, format);
            _store.TryGetValue(key, out var manifest);
            return ValueTask.FromResult<VexExportManifest?>(manifest);
        }

        public ValueTask SaveAsync(VexExportManifest manifest, CancellationToken cancellationToken)
        {
            var key = CreateKey(manifest.QuerySignature.Value, manifest.Format);
            _store[key] = manifest;
            return ValueTask.CompletedTask;
        }

        private static string CreateKey(string signature, VexExportFormat format)
            => FormattableString.Invariant($"{signature}|{format}");
    }

    private sealed class StaticPolicyEvaluator : IVexPolicyEvaluator
    {
        public StaticPolicyEvaluator(string version)
        {
            Version = version;
        }

        public string Version { get; }

        public VexPolicySnapshot Snapshot => VexPolicySnapshot.Default;

        public double GetProviderWeight(VexProvider provider) => 1.0;

        public bool IsClaimEligible(VexClaim claim, VexProvider provider, out string? rejectionReason)
        {
            rejectionReason = null;
            return true;
        }
    }

    private sealed class InMemoryExportDataSource : IVexExportDataSource
    {
        public ValueTask<VexExportDataSet> FetchAsync(VexQuery query, CancellationToken cancellationToken)
        {
            var claim = new VexClaim(
                "CVE-2025-0001",
                "vendor",
                new VexProduct("pkg:demo/app", "Demo"),
                VexClaimStatus.Affected,
                new VexClaimDocument(VexDocumentFormat.Csaf, "sha256:demo", new Uri("https://example.org/demo")),
                DateTimeOffset.UtcNow,
                DateTimeOffset.UtcNow);

            var consensus = new VexConsensus(
                "CVE-2025-0001",
                claim.Product,
                VexConsensusStatus.Affected,
                DateTimeOffset.UtcNow,
                new[] { new VexConsensusSource("vendor", VexClaimStatus.Affected, "sha256:demo", 1.0) },
                conflicts: null,
                policyVersion: "baseline/v1",
                summary: "affected");

            return ValueTask.FromResult(new VexExportDataSet(
                ImmutableArray.Create(consensus),
                ImmutableArray.Create(claim),
                ImmutableArray.Create("vendor")));
        }
    }

    private sealed class DummyExporter : IVexExporter
    {
        public DummyExporter(VexExportFormat format)
        {
            Format = format;
        }

        public VexExportFormat Format { get; }

        public VexContentAddress Digest(VexExportRequest request)
            => new("sha256", "deadbeef");

        public ValueTask<VexExportResult> SerializeAsync(VexExportRequest request, Stream output, CancellationToken cancellationToken)
        {
            var bytes = Encoding.UTF8.GetBytes("{}");
            output.Write(bytes);
            return ValueTask.FromResult(new VexExportResult(new VexContentAddress("sha256", "deadbeef"), bytes.Length, ImmutableDictionary<string, string>.Empty));
        }
    }
}
@@ -0,0 +1,12 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <LangVersion>preview</LangVersion>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
  <ItemGroup>
    <ProjectReference Include="..\StellaOps.Vexer.Export\StellaOps.Vexer.Export.csproj" />
  </ItemGroup>
</Project>
23 src/StellaOps.Vexer.Export/AGENTS.md Normal file
@@ -0,0 +1,23 @@
# AGENTS

## Role
Produces deterministic VEX export artifacts, coordinates cache lookups, and bridges artifact storage with attestation generation.

## Scope
- Export orchestration pipeline: query signature resolution, cache lookup, snapshot building, attestation handoff.
- Format-neutral builder interfaces consumed by format-specific plug-ins.
- Artifact store abstraction wiring (S3/MinIO/filesystem) with offline-friendly packaging.
- Export metrics/logging and deterministic manifest emission.

## Participants
- WebService invokes the export engine to service `/vexer/export` requests.
- Attestation module receives built artifacts through this layer for signing.
- Worker reuses caching and artifact utilities for scheduled exports and GC routines.

## Interfaces & contracts
- `IExportEngine`, `IExportSnapshotBuilder`, cache provider interfaces, and artifact store adapters.
- Hook points for format plug-ins (JSON, JSONL, OpenVEX, CSAF, ZIP bundle).

## In/Out of scope
In: orchestration, caching, artifact store interactions, manifest metadata.
Out: format-specific serialization (lives in Formats.*), policy evaluation (Policy), HTTP presentation (WebService).

## Observability & security expectations
- Emit cache hit/miss counters, export durations, artifact sizes, and attestation timing logs.
- Ensure no sensitive tokens/URIs are logged.

## Tests
- Engine orchestration tests, cache behavior, and artifact lifecycle coverage live in `../StellaOps.Vexer.Export.Tests`.
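The cache lookup described above keys export manifests on the pair of query signature and export format. A minimal sketch of that key shape, mirroring the in-memory store used by the tests (the helper name is illustrative; the production adapters live in Storage.Mongo and may compose keys differently):

```csharp
// Illustrative cache-key composition for (signature, format) lookups.
static string CacheKey(VexQuerySignature signature, VexExportFormat format)
    => FormattableString.Invariant($"{signature.Value}|{format}");
```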
128 src/StellaOps.Vexer.Export/ExportEngine.cs Normal file
@@ -0,0 +1,128 @@
using System.Collections.Immutable;
using System.IO;
using System.Linq;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using StellaOps.Vexer.Core;
using StellaOps.Vexer.Policy;
using StellaOps.Vexer.Storage.Mongo;

namespace StellaOps.Vexer.Export;

public interface IExportEngine
{
    ValueTask<VexExportManifest> ExportAsync(VexExportRequestContext context, CancellationToken cancellationToken);
}

public sealed record VexExportRequestContext(
    VexQuery Query,
    VexExportFormat Format,
    DateTimeOffset RequestedAt,
    bool ForceRefresh = false);

public interface IVexExportDataSource
{
    ValueTask<VexExportDataSet> FetchAsync(VexQuery query, CancellationToken cancellationToken);
}

public sealed record VexExportDataSet(
    ImmutableArray<VexConsensus> Consensus,
    ImmutableArray<VexClaim> Claims,
    ImmutableArray<string> SourceProviders);

public sealed class VexExportEngine : IExportEngine
{
    private readonly IVexExportStore _exportStore;
    private readonly IVexPolicyEvaluator _policyEvaluator;
    private readonly IVexExportDataSource _dataSource;
    private readonly IReadOnlyDictionary<VexExportFormat, IVexExporter> _exporters;
    private readonly ILogger<VexExportEngine> _logger;

    public VexExportEngine(
        IVexExportStore exportStore,
        IVexPolicyEvaluator policyEvaluator,
        IVexExportDataSource dataSource,
        IEnumerable<IVexExporter> exporters,
        ILogger<VexExportEngine> logger)
    {
        _exportStore = exportStore ?? throw new ArgumentNullException(nameof(exportStore));
        _policyEvaluator = policyEvaluator ?? throw new ArgumentNullException(nameof(policyEvaluator));
        _dataSource = dataSource ?? throw new ArgumentNullException(nameof(dataSource));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));

        if (exporters is null)
        {
            throw new ArgumentNullException(nameof(exporters));
        }

        _exporters = exporters.ToDictionary(x => x.Format);
    }

    public async ValueTask<VexExportManifest> ExportAsync(VexExportRequestContext context, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);
        var signature = VexQuerySignature.FromQuery(context.Query);

        if (!context.ForceRefresh)
        {
            var cached = await _exportStore.FindAsync(signature, context.Format, cancellationToken).ConfigureAwait(false);
            if (cached is not null)
            {
                _logger.LogInformation("Reusing cached export for {Signature} ({Format})", signature.Value, context.Format);
                return cached with { FromCache = true };
            }
        }

        var dataset = await _dataSource.FetchAsync(context.Query, cancellationToken).ConfigureAwait(false);
        var exporter = ResolveExporter(context.Format);

        var exportRequest = new VexExportRequest(
            context.Query,
            dataset.Consensus,
            dataset.Claims,
            context.RequestedAt);

        var digest = exporter.Digest(exportRequest);

        await using var buffer = new MemoryStream();
        var result = await exporter.SerializeAsync(exportRequest, buffer, cancellationToken).ConfigureAwait(false);

        var exportId = FormattableString.Invariant($"exports/{context.RequestedAt:yyyyMMddTHHmmssfffZ}/{digest.Digest}");
        var manifest = new VexExportManifest(
            exportId,
            signature,
            context.Format,
            context.RequestedAt,
            digest,
            dataset.Claims.Length,
            dataset.SourceProviders,
            fromCache: false,
            consensusRevision: _policyEvaluator.Version,
            attestation: null,
            sizeBytes: result.BytesWritten);

        await _exportStore.SaveAsync(manifest, cancellationToken).ConfigureAwait(false);

        _logger.LogInformation(
            "Export generated for {Signature} ({Format}) size={SizeBytes} bytes",
            signature.Value,
            context.Format,
            result.BytesWritten);

        return manifest;
    }

    private IVexExporter ResolveExporter(VexExportFormat format)
        => _exporters.TryGetValue(format, out var exporter)
            ? exporter
            : throw new InvalidOperationException($"No exporter registered for format '{format}'.");
}

public static class VexExportServiceCollectionExtensions
{
    public static IServiceCollection AddVexExportEngine(this IServiceCollection services)
    {
        services.AddSingleton<IExportEngine, VexExportEngine>();
        return services;
    }
}
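The cache reuse above hinges on the export identifier being deterministic: identical `(RequestedAt, digest)` inputs always yield the same `exports/...` path, so a manifest written once can be found again by any node. A standalone sketch mirroring the `FormattableString.Invariant` line in `ExportEngine.cs` (the digest value here is a placeholder, not a real content hash):

```csharp
using System;

// Mirrors the exportId construction in VexExportEngine.ExportAsync:
// a culture-invariant timestamp plus the content digest, so the same
// request always maps to the same artifact path.
string BuildExportId(DateTimeOffset requestedAt, string digest)
    => FormattableString.Invariant($"exports/{requestedAt:yyyyMMddTHHmmssfffZ}/{digest}");

// Placeholder digest; a real run would use the exporter's computed digest.
Console.WriteLine(BuildExportId(
    new DateTimeOffset(2025, 10, 15, 12, 30, 45, TimeSpan.Zero),
    "sha256:abc"));
// → exports/20251015T123045000Z/sha256:abc
```

`Invariant` matters here: formatting with the current culture could produce locale-dependent digits or separators and silently break cache lookups.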
18  src/StellaOps.Vexer.Export/StellaOps.Vexer.Export.csproj  Normal file
@@ -0,0 +1,18 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <LangVersion>preview</LangVersion>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Logging.Abstractions" Version="8.0.0" />
    <PackageReference Include="Microsoft.Extensions.Options" Version="8.0.0" />
  </ItemGroup>
  <ItemGroup>
    <ProjectReference Include="..\StellaOps.Vexer.Core\StellaOps.Vexer.Core.csproj" />
    <ProjectReference Include="..\StellaOps.Vexer.Policy\StellaOps.Vexer.Policy.csproj" />
    <ProjectReference Include="..\StellaOps.Vexer.Storage.Mongo\StellaOps.Vexer.Storage.Mongo.csproj" />
  </ItemGroup>
</Project>
8  src/StellaOps.Vexer.Export/TASKS.md  Normal file
@@ -0,0 +1,8 @@
If you are working on this file you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.

# TASKS

| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-EXPORT-01-001 – Export engine orchestration|Team Vexer Export|VEXER-CORE-01-003|DONE (2025-10-15) – Export engine scaffolding with cache lookup, data source hooks, and deterministic manifest emission.|
|VEXER-EXPORT-01-002 – Cache index & eviction hooks|Team Vexer Export|VEXER-EXPORT-01-001, VEXER-STORAGE-01-003|TODO – Wire cache lookup/write path against `vex.cache` collection and add GC utilities for Worker to prune stale entries deterministically.|
|VEXER-EXPORT-01-003 – Artifact store adapters|Team Vexer Export|VEXER-EXPORT-01-001|TODO – Provide pluggable storage adapters (filesystem, S3/MinIO) with offline bundle packaging and hash verification.|
|VEXER-EXPORT-01-004 – Attestation handoff integration|Team Vexer Export|VEXER-EXPORT-01-001, VEXER-ATTEST-01-001|TODO – Connect export engine to attestation client, persist Rekor metadata, and reuse cached attestations.|
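VEXER-EXPORT-01-002 asks for deterministic pruning of stale cache entries. A minimal sketch of what "deterministic" could mean here, assuming a TTL cutoff and entries keyed by query signature — the entry shape and TTL policy are hypothetical, not the actual `vex.cache` schema:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical GC selection: expire by age, then order by signature so the
// Worker deletes in the same order on every run, regardless of the order in
// which the store happened to enumerate the entries.
IReadOnlyList<string> SelectExpired(
    IEnumerable<(string Signature, DateTimeOffset CreatedAt)> entries,
    DateTimeOffset now,
    TimeSpan ttl)
    => entries
        .Where(e => now - e.CreatedAt > ttl)
        .OrderBy(e => e.Signature, StringComparer.Ordinal)
        .Select(e => e.Signature)
        .ToList();
```

Ordinal ordering (rather than culture-aware comparison) keeps the prune sequence byte-stable across machines and locales.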
23  src/StellaOps.Vexer.Formats.CSAF/AGENTS.md  Normal file
@@ -0,0 +1,23 @@
# AGENTS
## Role
Normalize CSAF VEX profile documents into Vexer claims and provide CSAF export adapters.
## Scope
- CSAF ingestion helpers: provider metadata parsing, document revision handling, vulnerability/action mappings.
- Normalizer implementation fulfilling `INormalizer` for CSAF sources (Red Hat, Cisco, SUSE, MSRC, Oracle, Ubuntu).
- Export adapters producing CSAF-compliant output slices from consensus data.
- Schema/version compatibility checks (CSAF 2.0 profile validation).
## Participants
- Connectors deliver raw CSAF documents to this module for normalization.
- Export module leverages adapters when producing CSAF exports.
- Policy engine consumes normalized justification/status fields for gating.
## Interfaces & contracts
- Parser/normalizer classes, helper utilities for `product_tree`, `vulnerabilities`, and `notes`.
- Export writer interfaces for per-provider/per-product CSAF packaging.
## In/Out of scope
In: CSAF parsing/normalization/export, schema validation, mapping to canonical claims.
Out: HTTP fetching (connectors), storage persistence, attestation logic.
## Observability & security expectations
- Emit structured diagnostics when CSAF documents fail schema validation, including source URI and revision.
- Provide counters for normalization outcomes (status distribution, justification coverage).
## Tests
- Fixture-driven parsing/export tests will live in `../StellaOps.Vexer.Formats.CSAF.Tests` using real CSAF samples.
7  src/StellaOps.Vexer.Formats.CSAF/TASKS.md  Normal file
@@ -0,0 +1,7 @@
If you are working on this file you need to read docs/ARCHITECTURE_VEXER.md and ./AGENTS.md.

# TASKS

| Task | Owner(s) | Depends on | Notes |
|---|---|---|---|
|VEXER-FMT-CSAF-01-001 – CSAF normalizer foundation|Team Vexer Formats|VEXER-CORE-01-001|TODO – Implement CSAF parser covering provider metadata, document tracking, and vulnerability/product mapping into `VexClaim`.|
|VEXER-FMT-CSAF-01-002 – Status/justification mapping|Team Vexer Formats|VEXER-FMT-CSAF-01-001, VEXER-POLICY-01-001|TODO – Normalize CSAF `product_status` + `justification` values into policy-aware enums with audit diagnostics for unsupported codes.|
|VEXER-FMT-CSAF-01-003 – CSAF export adapter|Team Vexer Formats|VEXER-EXPORT-01-001, VEXER-FMT-CSAF-01-001|TODO – Provide CSAF export writer producing deterministic documents (per vuln/product) and manifest metadata for attestation.|
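For VEXER-FMT-CSAF-01-002, a hedged sketch of the bucket mapping: the `product_status` group names come from the CSAF 2.0 specification, but the target status strings and the fall-through behavior are assumptions about Vexer's policy-aware enums, not the actual contract.

```csharp
// Hypothetical mapping from CSAF 2.0 product_status groups to VEX-style
// statuses. Unrecognized buckets fall through to "unknown" so the normalizer
// can emit an audit diagnostic instead of silently dropping the claim.
string MapProductStatus(string bucket) => bucket switch
{
    "known_affected" or "first_affected" or "last_affected" => "affected",
    "known_not_affected" => "not_affected",
    "fixed" or "first_fixed" => "fixed",
    "under_investigation" => "under_investigation",
    _ => "unknown",
};
```

Note that CSAF also defines the `recommended` group, which carries remediation advice rather than a VEX status; in this sketch it deliberately lands in the `"unknown"` diagnostic path.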
Some files were not shown because too many files have changed in this diff.