Add unit tests for RancherHubConnector and various exporters
- Implemented tests for RancherHubConnector to validate fetching documents, handling errors, and managing state.
- Added tests for CsafExporter to ensure deterministic serialization of CSAF documents.
- Created tests for CycloneDX exporters and reconciler to verify correct handling of VEX claims and output structure.
- Developed OpenVEX exporter tests to confirm the generation of canonical OpenVEX documents and statement merging logic.
- Introduced Rust file caching and license scanning functionality, including a cache key structure and hash computation.
- Added sample Cargo.toml and LICENSE files for testing Rust license scanning functionality.
docs/dev/aoc-normalization-removal-notes.md (new file, +21 lines)
							| @@ -0,0 +1,21 @@ | |||||||
|  | # AOC Normalization Removal Notes | ||||||
|  |  | ||||||
|  | _Last updated: 2025-10-29_ | ||||||
|  |  | ||||||
|  | ## Goal | ||||||
|  |  | ||||||
|  | Document follow-up actions for CONCELIER-CORE-AOC-19-004 as we unwind the final pieces of normalization from the ingestion/runtime path. | ||||||
|  |  | ||||||
|  | ## Current Findings | ||||||
|  |  | ||||||
|  | - `AdvisoryRawService` and `MongoAdvisoryRawRepository` already preserve upstream ordering and duplicate aliases (trim-only). No additional code changes required there. | ||||||
|  | - Observation layers (`AdvisoryObservationFactory`, `AdvisoryObservationQueryService`) still canonicalise aliases, PURLs, CPEs, and references. These need to be relaxed so Policy/overlays receive raw linksets and can own dedupe logic. | ||||||
|  | - Linkset mapper continues to emit deterministic hints. We will keep the mapper but ensure observation output can surface both raw and canonical views for downstream services. | ||||||
|  |  | ||||||
|  | ## Next Steps | ||||||
|  |  | ||||||
|  | 1. Introduce a raw linkset projection alongside the existing canonical mapper so Policy Engine can choose which flavour to consume. | ||||||
|  | 2. Update observation factory/query tests to assert duplicate handling and ordering with the relaxed projection. | ||||||
|  | 3. Refresh docs (`docs/ingestion/aggregation-only-contract.md`) once behaviour lands to explain the “raw vs canonical linkset” split. | ||||||
|  | 4. Coordinate with Policy Guild to validate consumers against the new raw projection before flipping defaults. | ||||||
|  |  | ||||||
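The notes above call for a raw linkset projection that coexists with the canonical mapper so Policy Engine overlays can own dedupe. A minimal sketch of what such a projection could look like, assuming hypothetical record names (`RawLinkset`, `AdvisoryObservationProjection`); the actual observation factory types in this commit are not shown here:

```csharp
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Linq;

// Hypothetical shape: preserves upstream ordering and duplicates (trim-only),
// leaving dedupe/canonicalisation to the Policy Engine overlay.
public sealed record RawLinkset(
    ImmutableArray<string> Aliases,
    ImmutableArray<string> Purls,
    ImmutableArray<string> Cpes,
    ImmutableArray<string> References);

public sealed record AdvisoryObservationProjection(
    RawLinkset Raw,         // as received from the source (trim-only)
    RawLinkset Canonical);  // output of the existing deterministic mapper

public static class LinksetProjections
{
    // Trim entries but keep order and duplicates so downstream consumers decide
    // how to dedupe; only pure-whitespace entries are dropped as noise.
    public static ImmutableArray<string> TrimOnly(IEnumerable<string?> values)
        => values
            .Select(static v => v?.Trim())
            .Where(static v => !string.IsNullOrEmpty(v))
            .Select(static v => v!)
            .ToImmutableArray();
}
```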
| @@ -115,7 +115,7 @@ Generated from SPRINTS.md and module TASKS.md files on 2025-10-19. Waves cluster | |||||||
| - Team Notify Engine Guild: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Notify/__Libraries/StellaOps.Notify.Engine/TASKS.md`. Focus on NOTIFY-ENGINE-15-304 (TODO). Confirm prerequisites (internal: NOTIFY-ENGINE-15-303 (Wave 3)) before starting and report status in module TASKS.md. | - Team Notify Engine Guild: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Notify/__Libraries/StellaOps.Notify.Engine/TASKS.md`. Focus on NOTIFY-ENGINE-15-304 (TODO). Confirm prerequisites (internal: NOTIFY-ENGINE-15-303 (Wave 3)) before starting and report status in module TASKS.md. | ||||||
| - Team Notify Worker Guild: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Notify/StellaOps.Notify.Worker/TASKS.md`. Focus on NOTIFY-WORKER-15-204 (TODO). Confirm prerequisites (internal: NOTIFY-WORKER-15-203 (Wave 3)) before starting and report status in module TASKS.md. | - Team Notify Worker Guild: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Notify/StellaOps.Notify.Worker/TASKS.md`. Focus on NOTIFY-WORKER-15-204 (TODO). Confirm prerequisites (internal: NOTIFY-WORKER-15-203 (Wave 3)) before starting and report status in module TASKS.md. | ||||||
| - Team Scheduler Worker Guild: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Scheduler/__Libraries/StellaOps.Scheduler.Worker/TASKS.md`. Focus on SCHED-WORKER-16-204 (TODO). Confirm prerequisites (internal: SCHED-WORKER-16-203 (Wave 3)) before starting and report status in module TASKS.md. | - Team Scheduler Worker Guild: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Scheduler/__Libraries/StellaOps.Scheduler.Worker/TASKS.md`. Focus on SCHED-WORKER-16-204 (TODO). Confirm prerequisites (internal: SCHED-WORKER-16-203 (Wave 3)) before starting and report status in module TASKS.md. | ||||||
| - Team TBD: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`, `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`, `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Python/TASKS.md`, `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md`. SCANNER-ANALYZERS-LANG-10-307D/G/P are DONE (latest 2025-10-23); remaining focus is SCANNER-ANALYZERS-LANG-10-307R (TODO). Confirm prerequisites (internal: SCANNER-ANALYZERS-LANG-10-303C (Wave 3), SCANNER-ANALYZERS-LANG-10-304C (Wave 3), SCANNER-ANALYZERS-LANG-10-305C (Wave 3), SCANNER-ANALYZERS-LANG-10-306C (Wave 3)) before progressing and report status in module TASKS.md. | - Team TBD: read EXECPLAN.md Wave 4 and SPRINTS.md rows for `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`, `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`, `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Python/TASKS.md`, `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md`. SCANNER-ANALYZERS-LANG-10-307D/G/P are DONE (latest 2025-10-23); remaining focus is SCANNER-ANALYZERS-LANG-10-307R (DOING). Confirm prerequisites (internal: SCANNER-ANALYZERS-LANG-10-303C (Wave 3), SCANNER-ANALYZERS-LANG-10-304C (Wave 3), SCANNER-ANALYZERS-LANG-10-305C (Wave 3), SCANNER-ANALYZERS-LANG-10-306C (Wave 3)) before progressing and report status in module TASKS.md. | ||||||
|  |  | ||||||
| ### Wave 5 | ### Wave 5 | ||||||
| - **Sprint 23-28** · StellaOps Console, Policy Studio, Graph Explorer | - **Sprint 23-28** · StellaOps Console, Policy Studio, Graph Explorer | ||||||
| @@ -438,26 +438,26 @@ Generated from SPRINTS.md and module TASKS.md files on 2025-10-19. Waves cluster | |||||||
|          • Current: TODO – Fetch CSAF bundles with ETag handling, checksum validation, deduplication, and raw persistence. |          • Current: TODO – Fetch CSAF bundles with ETag handling, checksum validation, deduplication, and raw persistence. | ||||||
|   - Team: Team Excititor Formats |   - Team: Team Excititor Formats | ||||||
|     - Path: `src/Excititor/__Libraries/StellaOps.Excititor.Formats.CSAF/TASKS.md` |     - Path: `src/Excititor/__Libraries/StellaOps.Excititor.Formats.CSAF/TASKS.md` | ||||||
|       1. [TODO] EXCITITOR-FMT-CSAF-01-002 — EXCITITOR-FMT-CSAF-01-002 – Status/justification mapping |       1. [DONE 2025-10-29] EXCITITOR-FMT-CSAF-01-002 — EXCITITOR-FMT-CSAF-01-002 – Status/justification mapping | ||||||
|          • Prereqs: EXCITITOR-FMT-CSAF-01-001 (external/completed), EXCITITOR-POLICY-01-001 (external/completed) |          • Prereqs: EXCITITOR-FMT-CSAF-01-001 (external/completed), EXCITITOR-POLICY-01-001 (external/completed) | ||||||
|          • Current: TODO – Normalize CSAF `product_status` + `justification` values into policy-aware enums with audit diagnostics for unsupported codes. |          • Current: DONE – Normalizer now emits policy-safe status/justification mappings and flags unsupported or missing evidence for audit diagnostics. | ||||||
|       2. [TODO] EXCITITOR-FMT-CSAF-01-003 — EXCITITOR-FMT-CSAF-01-003 – CSAF export adapter |       2. [DONE 2025-10-29] EXCITITOR-FMT-CSAF-01-003 — EXCITITOR-FMT-CSAF-01-003 – CSAF export adapter | ||||||
|          • Prereqs: EXCITITOR-EXPORT-01-001 (external/completed), EXCITITOR-FMT-CSAF-01-001 (external/completed) |          • Prereqs: EXCITITOR-EXPORT-01-001 (external/completed), EXCITITOR-FMT-CSAF-01-001 (external/completed) | ||||||
|          • Current: TODO – Provide CSAF export writer producing deterministic documents (per vuln/product) and manifest metadata for attestation. |          • Current: DONE – CSAF exporter produces deterministic documents with reconciled product tree, vulnerability statuses, and export metadata. | ||||||
|     - Path: `src/Excititor/__Libraries/StellaOps.Excititor.Formats.CycloneDX/TASKS.md` |     - Path: `src/Excititor/__Libraries/StellaOps.Excititor.Formats.CycloneDX/TASKS.md` | ||||||
|       1. [TODO] EXCITITOR-FMT-CYCLONE-01-002 — EXCITITOR-FMT-CYCLONE-01-002 – Component reference reconciliation |       1. [DONE 2025-10-29] EXCITITOR-FMT-CYCLONE-01-002 — EXCITITOR-FMT-CYCLONE-01-002 – Component reference reconciliation | ||||||
|          • Prereqs: EXCITITOR-FMT-CYCLONE-01-001 (external/completed) |          • Prereqs: EXCITITOR-FMT-CYCLONE-01-001 (external/completed) | ||||||
|          • Current: TODO – Implement helpers to reconcile component/service references against policy expectations and emit diagnostics for missing SBOM links. |          • Current: DONE – Component reconciler issues stable bom-refs, aggregates identifiers, and records diagnostics for missing SBOM linkage. | ||||||
|       2. [TODO] EXCITITOR-FMT-CYCLONE-01-003 — EXCITITOR-FMT-CYCLONE-01-003 – CycloneDX export serializer |       2. [DONE 2025-10-29] EXCITITOR-FMT-CYCLONE-01-003 — EXCITITOR-FMT-CYCLONE-01-003 – CycloneDX export serializer | ||||||
|          • Prereqs: EXCITITOR-EXPORT-01-001 (external/completed), EXCITITOR-FMT-CYCLONE-01-001 (external/completed) |          • Prereqs: EXCITITOR-EXPORT-01-001 (external/completed), EXCITITOR-FMT-CYCLONE-01-001 (external/completed) | ||||||
|          • Current: TODO – Provide exporters producing CycloneDX VEX output with canonical ordering and hash-stable manifests. |          • Current: DONE – CycloneDX exporter delivers canonical VEX payloads with reconciled components, per-claim analyses, and metadata for caching. | ||||||
|     - Path: `src/Excititor/__Libraries/StellaOps.Excititor.Formats.OpenVEX/TASKS.md` |     - Path: `src/Excititor/__Libraries/StellaOps.Excititor.Formats.OpenVEX/TASKS.md` | ||||||
|       1. [TODO] EXCITITOR-FMT-OPENVEX-01-002 — EXCITITOR-FMT-OPENVEX-01-002 – Statement merge utilities |       1. [DONE 2025-10-29] EXCITITOR-FMT-OPENVEX-01-002 — EXCITITOR-FMT-OPENVEX-01-002 – Statement merge utilities | ||||||
|          • Prereqs: EXCITITOR-FMT-OPENVEX-01-001 (external/completed) |          • Prereqs: EXCITITOR-FMT-OPENVEX-01-001 (external/completed) | ||||||
|          • Current: TODO – Add reducers merging multiple OpenVEX statements, resolving conflicts deterministically, and emitting policy diagnostics. |          • Current: DONE – Merge utilities combine statements deterministically, highlight conflicts, and preserve source diagnostics for policy checks. | ||||||
|       2. [TODO] EXCITITOR-FMT-OPENVEX-01-003 — EXCITITOR-FMT-OPENVEX-01-003 – OpenVEX export writer |       2. [DONE 2025-10-29] EXCITITOR-FMT-OPENVEX-01-003 — EXCITITOR-FMT-OPENVEX-01-003 – OpenVEX export writer | ||||||
|          • Prereqs: EXCITITOR-EXPORT-01-001 (external/completed), EXCITITOR-FMT-OPENVEX-01-001 (external/completed) |          • Prereqs: EXCITITOR-EXPORT-01-001 (external/completed), EXCITITOR-FMT-OPENVEX-01-001 (external/completed) | ||||||
|          • Current: TODO – Provide export serializer generating canonical OpenVEX documents with optional SBOM references and hash-stable ordering. |          • Current: DONE – OpenVEX exporter serializes merged statements with canonical ordering, provenance metadata, and deterministic digests. | ||||||
|  |  | ||||||
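EXCITITOR-FMT-CSAF-01-002 above describes mapping CSAF `product_status` and `justification` values into policy-aware enums with audit diagnostics for unsupported codes. A rough sketch of that mapping, assuming the usual four VEX statuses on `VexClaimStatus` (only `NotAffected` is confirmed elsewhere in this commit) and treating unknown categories as diagnostics rather than hard failures; the real normalizer in `StellaOps.Excititor.Formats.CSAF` may differ:

```csharp
using StellaOps.Excititor.Core; // VexClaimStatus lives here per CsafExporter.cs

public static class CsafStatusMapper
{
    // Sketch only: enum members other than NotAffected are assumed.
    public static bool TryMapProductStatus(string category, out VexClaimStatus status)
    {
        switch (category)
        {
            case "known_affected":
            case "first_affected":
            case "last_affected":
                status = VexClaimStatus.Affected;           // assumed member
                return true;
            case "known_not_affected":
                status = VexClaimStatus.NotAffected;
                return true;
            case "fixed":
            case "first_fixed":
                status = VexClaimStatus.Fixed;              // assumed member
                return true;
            case "under_investigation":
                status = VexClaimStatus.UnderInvestigation; // assumed member
                return true;
            default:
                status = default;
                return false; // caller records an audit diagnostic for the unsupported code
        }
    }
}
```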
| - **Sprint 7** · Contextual Truth Foundations | - **Sprint 7** · Contextual Truth Foundations | ||||||
|   - Team: Team Excititor Export |   - Team: Team Excititor Export | ||||||
| @@ -956,7 +956,7 @@ Generated from SPRINTS.md and module TASKS.md files on 2025-10-19. Waves cluster | |||||||
|          • Prereqs: SCANNER-ANALYZERS-LANG-10-303C (Wave 3) |          • Prereqs: SCANNER-ANALYZERS-LANG-10-303C (Wave 3) | ||||||
|          • Current: TODO |          • Current: TODO | ||||||
|     - Path: `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md` |     - Path: `src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md` | ||||||
|       1. [TODO] SCANNER-ANALYZERS-LANG-10-307R — Finalize shared helper usage (license, usage flags) and concurrency-safe caches. |       1. [DOING] SCANNER-ANALYZERS-LANG-10-307R — Finalize shared helper usage (license, usage flags) and concurrency-safe caches. | ||||||
|          • Prereqs: SCANNER-ANALYZERS-LANG-10-306C (Wave 3) |          • Prereqs: SCANNER-ANALYZERS-LANG-10-306C (Wave 3) | ||||||
|          • Current: TODO |          • Current: TODO | ||||||
| - **Sprint 13** · UX & CLI Experience | - **Sprint 13** · UX & CLI Experience | ||||||
|   | |||||||
| @@ -603,7 +603,7 @@ This file describe implementation of Stella Ops (docs/README.md). Implementation | |||||||
| | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-001 | Java input normalizer (jar/war/ear/fat/jmod/jimage) with MR overlay selection. | | | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-001 | Java input normalizer (jar/war/ear/fat/jmod/jimage) with MR overlay selection. | | ||||||
| | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | Module/classpath builder with duplicate & split-package detection. | | | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | Module/classpath builder with duplicate & split-package detection. | | ||||||
| | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-003 | SPI scanner & provider selection with warnings. | | | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-003 | SPI scanner & provider selection with warnings. | | ||||||
| | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-004 | Reflection/TCCL heuristics emitting reason-coded edges. | | | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | DONE | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-004 | Reflection/TCCL heuristics emitting reason-coded edges. | | ||||||
| | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-005 | Framework config extraction (Spring, Jakarta, MicroProfile, logging, Graal configs). | | | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-005 | Framework config extraction (Spring, Jakarta, MicroProfile, logging, Graal configs). | | ||||||
| | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-006 | JNI/native hint detection for Java artifacts. | | | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-006 | JNI/native hint detection for Java artifacts. | | ||||||
| | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-007 | Manifest/signature metadata collector (main/start/agent classes, signers). | | | Sprint 39 | Java Analyzer Core | src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/TASKS.md | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-007 | Manifest/signature metadata collector (main/start/agent classes, signers). | | ||||||
|   | |||||||
| @@ -12,7 +12,7 @@ | |||||||
| | CONCELIER-CORE-AOC-19-003 `Idempotent append-only upsert` | DONE (2025-10-28) | Concelier Core Guild | CONCELIER-STORE-AOC-19-002 | Implement idempotent upsert path using `(vendor, upstreamId, contentHash, tenant)` key, emitting supersedes pointers for new revisions and preventing duplicate inserts. | | | CONCELIER-CORE-AOC-19-003 `Idempotent append-only upsert` | DONE (2025-10-28) | Concelier Core Guild | CONCELIER-STORE-AOC-19-002 | Implement idempotent upsert path using `(vendor, upstreamId, contentHash, tenant)` key, emitting supersedes pointers for new revisions and preventing duplicate inserts. | | ||||||
| > 2025-10-28: Advisory raw ingestion now strips client-supplied supersedes hints, logs ignored pointers, and surfaces repository-supplied supersedes identifiers; service tests cover duplicate handling and append-only semantics. | > 2025-10-28: Advisory raw ingestion now strips client-supplied supersedes hints, logs ignored pointers, and surfaces repository-supplied supersedes identifiers; service tests cover duplicate handling and append-only semantics. | ||||||
| > Docs alignment (2025-10-26): Deployment guide + observability guide describe supersedes metrics; ensure implementation emits `aoc_violation_total` on failure. | > Docs alignment (2025-10-26): Deployment guide + observability guide describe supersedes metrics; ensure implementation emits `aoc_violation_total` on failure. | ||||||
| | CONCELIER-CORE-AOC-19-004 `Remove ingestion normalization` | DOING (2025-10-28) | Concelier Core Guild | CONCELIER-CORE-AOC-19-002, POLICY-AOC-19-003 | Strip normalization/dedup/severity logic from ingestion pipelines, delegate derived computations to Policy Engine, and update exporters/tests to consume raw documents only. | | | CONCELIER-CORE-AOC-19-004 `Remove ingestion normalization` | DOING (2025-10-28) | Concelier Core Guild | CONCELIER-CORE-AOC-19-002, POLICY-AOC-19-003 | Strip normalization/dedup/severity logic from ingestion pipelines, delegate derived computations to Policy Engine, and update exporters/tests to consume raw documents only.<br>2025-10-29 19:05Z: Audit completed for `AdvisoryRawService`/Mongo repo to confirm alias order/dedup removal persists; identified remaining normalization in observation/linkset factory that will be revised to surface raw duplicates for Policy ingestion. Change sketch + regression matrix drafted under `docs/dev/aoc-normalization-removal-notes.md` (pending commit). | | ||||||
| > Docs alignment (2025-10-26): Architecture overview emphasises policy-only derivation; coordinate with Policy Engine guild for rollout. | > Docs alignment (2025-10-26): Architecture overview emphasises policy-only derivation; coordinate with Policy Engine guild for rollout. | ||||||
| > 2025-10-29: `AdvisoryRawService` now preserves upstream alias/linkset ordering (trim-only) and updated AOC documentation reflects the behaviour; follow-up to ensure policy consumers handle duplicates remains open. | > 2025-10-29: `AdvisoryRawService` now preserves upstream alias/linkset ordering (trim-only) and updated AOC documentation reflects the behaviour; follow-up to ensure policy consumers handle duplicates remains open. | ||||||
| | CONCELIER-CORE-AOC-19-013 `Authority tenant scope smoke coverage` | TODO | Concelier Core Guild | AUTH-AOC-19-002 | Extend Concelier smoke/e2e fixtures to configure `requiredTenants` and assert cross-tenant rejection with updated Authority tokens. | Coordinate deliverable so Authority docs (`AUTH-AOC-19-003`) can close once tests are in place. | | | CONCELIER-CORE-AOC-19-013 `Authority tenant scope smoke coverage` | TODO | Concelier Core Guild | AUTH-AOC-19-002 | Extend Concelier smoke/e2e fixtures to configure `requiredTenants` and assert cross-tenant rejection with updated Authority tokens. | Coordinate deliverable so Authority docs (`AUTH-AOC-19-003`) can close once tests are in place. | | ||||||
|   | |||||||
| @@ -18,6 +18,10 @@ | |||||||
|     <ScannerLangAnalyzerPluginOutputRoot Condition="'$(ScannerLangAnalyzerPluginOutputRoot)' == ''">$([System.IO.Path]::GetFullPath('$(MSBuildThisFileDirectory)..\plugins\scanner\analyzers\lang\'))</ScannerLangAnalyzerPluginOutputRoot> |     <ScannerLangAnalyzerPluginOutputRoot Condition="'$(ScannerLangAnalyzerPluginOutputRoot)' == ''">$([System.IO.Path]::GetFullPath('$(MSBuildThisFileDirectory)..\plugins\scanner\analyzers\lang\'))</ScannerLangAnalyzerPluginOutputRoot> | ||||||
|     <IsScannerLangAnalyzerPlugin Condition="'$(IsScannerLangAnalyzerPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Scanner.Analyzers.Lang.'))">true</IsScannerLangAnalyzerPlugin> |     <IsScannerLangAnalyzerPlugin Condition="'$(IsScannerLangAnalyzerPlugin)' == '' and $([System.String]::Copy('$(MSBuildProjectName)').StartsWith('StellaOps.Scanner.Analyzers.Lang.'))">true</IsScannerLangAnalyzerPlugin> | ||||||
|     <UseConcelierTestInfra Condition="'$(UseConcelierTestInfra)' == ''">true</UseConcelierTestInfra> |     <UseConcelierTestInfra Condition="'$(UseConcelierTestInfra)' == ''">true</UseConcelierTestInfra> | ||||||
|  |     <ConcelierTestingPath Condition="'$(ConcelierTestingPath)' == '' and Exists('$(MSBuildThisFileDirectory)StellaOps.Concelier.Testing\StellaOps.Concelier.Testing.csproj')">$(MSBuildThisFileDirectory)StellaOps.Concelier.Testing\</ConcelierTestingPath> | ||||||
|  |     <ConcelierTestingPath Condition="'$(ConcelierTestingPath)' == '' and Exists('$(MSBuildThisFileDirectory)Concelier\__Libraries\StellaOps.Concelier.Testing\StellaOps.Concelier.Testing.csproj')">$(MSBuildThisFileDirectory)Concelier\__Libraries\StellaOps.Concelier.Testing\</ConcelierTestingPath> | ||||||
|  |     <ConcelierSharedTestsPath Condition="'$(ConcelierSharedTestsPath)' == '' and Exists('$(MSBuildThisFileDirectory)StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs')">$(MSBuildThisFileDirectory)StellaOps.Concelier.Tests.Shared\</ConcelierSharedTestsPath> | ||||||
|  |     <ConcelierSharedTestsPath Condition="'$(ConcelierSharedTestsPath)' == '' and Exists('$(MSBuildThisFileDirectory)Concelier\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs')">$(MSBuildThisFileDirectory)Concelier\StellaOps.Concelier.Tests.Shared\</ConcelierSharedTestsPath> | ||||||
|   </PropertyGroup> |   </PropertyGroup> | ||||||
|  |  | ||||||
|   <ItemGroup> |   <ItemGroup> | ||||||
| @@ -40,9 +44,9 @@ | |||||||
|     <PackageReference Include="xunit" Version="2.9.2" /> |     <PackageReference Include="xunit" Version="2.9.2" /> | ||||||
|     <PackageReference Include="xunit.runner.visualstudio" Version="2.8.2" /> |     <PackageReference Include="xunit.runner.visualstudio" Version="2.8.2" /> | ||||||
|     <PackageReference Include="Microsoft.Extensions.TimeProvider.Testing" Version="9.10.0" /> |     <PackageReference Include="Microsoft.Extensions.TimeProvider.Testing" Version="9.10.0" /> | ||||||
|     <Compile Include="$(MSBuildThisFileDirectory)StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs" Link="Shared\AssemblyInfo.cs" /> |     <Compile Include="$(ConcelierSharedTestsPath)AssemblyInfo.cs" Link="Shared\AssemblyInfo.cs" Condition="'$(ConcelierSharedTestsPath)' != ''" /> | ||||||
|     <Compile Include="$(MSBuildThisFileDirectory)StellaOps.Concelier.Tests.Shared\MongoFixtureCollection.cs" Link="Shared\MongoFixtureCollection.cs" /> |     <Compile Include="$(ConcelierSharedTestsPath)MongoFixtureCollection.cs" Link="Shared\MongoFixtureCollection.cs" Condition="'$(ConcelierSharedTestsPath)' != ''" /> | ||||||
|     <ProjectReference Include="$(MSBuildThisFileDirectory)StellaOps.Concelier.Testing\StellaOps.Concelier.Testing.csproj" /> |     <ProjectReference Include="$(ConcelierTestingPath)StellaOps.Concelier.Testing.csproj" Condition="'$(ConcelierTestingPath)' != ''" /> | ||||||
|     <Using Include="StellaOps.Concelier.Testing" /> |     <Using Include="StellaOps.Concelier.Testing" /> | ||||||
|     <Using Include="Xunit" /> |     <Using Include="Xunit" /> | ||||||
|   </ItemGroup> |   </ItemGroup> | ||||||
|   | |||||||
| @@ -299,6 +299,7 @@ internal static class MirrorEndpoints | |||||||
|             VexExportFormat.JsonLines => "application/jsonl", |             VexExportFormat.JsonLines => "application/jsonl", | ||||||
|             VexExportFormat.OpenVex => "application/json", |             VexExportFormat.OpenVex => "application/json", | ||||||
|             VexExportFormat.Csaf => "application/json", |             VexExportFormat.Csaf => "application/json", | ||||||
|  |             VexExportFormat.CycloneDx => "application/json", | ||||||
|             _ => "application/octet-stream", |             _ => "application/octet-stream", | ||||||
|         }; |         }; | ||||||
|  |  | ||||||
| @@ -312,6 +313,7 @@ internal static class MirrorEndpoints | |||||||
|             VexExportFormat.JsonLines => ".jsonl", |             VexExportFormat.JsonLines => ".jsonl", | ||||||
|             VexExportFormat.OpenVex => ".openvex.json", |             VexExportFormat.OpenVex => ".openvex.json", | ||||||
|             VexExportFormat.Csaf => ".csaf.json", |             VexExportFormat.Csaf => ".csaf.json", | ||||||
|  |             VexExportFormat.CycloneDx => ".cyclonedx.json", | ||||||
|             _ => ".bin", |             _ => ".bin", | ||||||
|         }); |         }); | ||||||
|         return builder.ToString(); |         return builder.ToString(); | ||||||
|   | |||||||
| @@ -1,9 +1,11 @@ | |||||||
| using System.Collections.Generic; | using System.Collections.Generic; | ||||||
| using System.Collections.Immutable; | using System.Collections.Immutable; | ||||||
|  | using System.Globalization; | ||||||
| using System.Linq; | using System.Linq; | ||||||
| using System.Net.Http; | using System.Net.Http; | ||||||
| using System.Runtime.CompilerServices; | using System.Runtime.CompilerServices; | ||||||
| using System.Text.Json; | using System.Text.Json; | ||||||
|  | using Microsoft.Extensions.DependencyInjection; | ||||||
| using Microsoft.Extensions.Logging; | using Microsoft.Extensions.Logging; | ||||||
| using StellaOps.Excititor.Connectors.Abstractions; | using StellaOps.Excititor.Connectors.Abstractions; | ||||||
| using StellaOps.Excititor.Connectors.Cisco.CSAF.Configuration; | using StellaOps.Excititor.Connectors.Cisco.CSAF.Configuration; | ||||||
| @@ -76,6 +78,8 @@ public sealed class CiscoCsafConnector : VexConnectorBase | |||||||
|             _providerMetadata = await _metadataLoader.LoadAsync(cancellationToken).ConfigureAwait(false); |             _providerMetadata = await _metadataLoader.LoadAsync(cancellationToken).ConfigureAwait(false); | ||||||
|         } |         } | ||||||
|  |  | ||||||
|  |         await UpsertProviderAsync(context.Services, _providerMetadata.Provider, cancellationToken).ConfigureAwait(false); | ||||||
|  |  | ||||||
|         var state = await _stateRepository.GetAsync(Descriptor.Id, cancellationToken).ConfigureAwait(false); |         var state = await _stateRepository.GetAsync(Descriptor.Id, cancellationToken).ConfigureAwait(false); | ||||||
|         var knownDigests = state?.DocumentDigests ?? ImmutableArray<string>.Empty; |         var knownDigests = state?.DocumentDigests ?? ImmutableArray<string>.Empty; | ||||||
|         var digestSet = new HashSet<string>(knownDigests, StringComparer.OrdinalIgnoreCase); |         var digestSet = new HashSet<string>(knownDigests, StringComparer.OrdinalIgnoreCase); | ||||||
| @@ -99,16 +103,23 @@ public sealed class CiscoCsafConnector : VexConnectorBase | |||||||
|                 contentResponse.EnsureSuccessStatusCode(); |                 contentResponse.EnsureSuccessStatusCode(); | ||||||
|                 var payload = await contentResponse.Content.ReadAsByteArrayAsync(cancellationToken).ConfigureAwait(false); |                 var payload = await contentResponse.Content.ReadAsByteArrayAsync(cancellationToken).ConfigureAwait(false); | ||||||
|  |  | ||||||
|                 var rawDocument = CreateRawDocument( |                 var metadata = BuildMetadata(builder => | ||||||
|                     VexDocumentFormat.Csaf, |                 { | ||||||
|                     advisory.DocumentUri, |                     builder | ||||||
|                     payload, |  | ||||||
|                     BuildMetadata(builder => builder |  | ||||||
|                         .Add("cisco.csaf.advisoryId", advisory.Id) |                         .Add("cisco.csaf.advisoryId", advisory.Id) | ||||||
|                         .Add("cisco.csaf.revision", advisory.Revision) |                         .Add("cisco.csaf.revision", advisory.Revision) | ||||||
|                         .Add("cisco.csaf.published", advisory.Published?.ToString("O")) |                         .Add("cisco.csaf.published", advisory.Published?.ToString("O")) | ||||||
|                         .Add("cisco.csaf.modified", advisory.LastModified?.ToString("O")) |                         .Add("cisco.csaf.modified", advisory.LastModified?.ToString("O")) | ||||||
|                         .Add("cisco.csaf.sha256", advisory.Sha256))); |                         .Add("cisco.csaf.sha256", advisory.Sha256); | ||||||
|  |  | ||||||
|  |                     AddProvenanceMetadata(builder); | ||||||
|  |                 }); | ||||||
|  |  | ||||||
|  |                 var rawDocument = CreateRawDocument( | ||||||
|  |                     VexDocumentFormat.Csaf, | ||||||
|  |                     advisory.DocumentUri, | ||||||
|  |                     payload, | ||||||
|  |                     metadata); | ||||||
|  |  | ||||||
|                 if (!digestSet.Add(rawDocument.Digest)) |                 if (!digestSet.Add(rawDocument.Digest)) | ||||||
|                 { |                 { | ||||||
| @@ -231,6 +242,46 @@ public sealed class CiscoCsafConnector : VexConnectorBase | |||||||
|         return BuildIndexUri(directory, next); |         return BuildIndexUri(directory, next); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     private void AddProvenanceMetadata(VexConnectorMetadataBuilder builder) | ||||||
|  |     { | ||||||
|  |         if (_providerMetadata is null) | ||||||
|  |         { | ||||||
|  |             return; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var provider = _providerMetadata.Provider; | ||||||
|  |         builder.Add("vex.provenance.provider", provider.Id); | ||||||
|  |         builder.Add("vex.provenance.providerName", provider.DisplayName); | ||||||
|  |         builder.Add("vex.provenance.trust.weight", provider.Trust.Weight.ToString("0.###", CultureInfo.InvariantCulture)); | ||||||
|  |  | ||||||
|  |         if (provider.Trust.Cosign is { } cosign) | ||||||
|  |         { | ||||||
|  |             builder.Add("vex.provenance.cosign.issuer", cosign.Issuer); | ||||||
|  |             builder.Add("vex.provenance.cosign.identityPattern", cosign.IdentityPattern); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         if (!provider.Trust.PgpFingerprints.IsDefaultOrEmpty && provider.Trust.PgpFingerprints.Length > 0) | ||||||
|  |         { | ||||||
|  |             builder.Add("vex.provenance.pgp.fingerprints", string.Join(",", provider.Trust.PgpFingerprints)); | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static async ValueTask UpsertProviderAsync(IServiceProvider services, VexProvider provider, CancellationToken cancellationToken) | ||||||
|  |     { | ||||||
|  |         if (services is null) | ||||||
|  |         { | ||||||
|  |             return; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var store = services.GetService<IVexProviderStore>(); | ||||||
|  |         if (store is null) | ||||||
|  |         { | ||||||
|  |             return; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         await store.SaveAsync(provider, cancellationToken).ConfigureAwait(false); | ||||||
|  |     } | ||||||
|  |  | ||||||
|     private sealed record CiscoAdvisoryIndex |     private sealed record CiscoAdvisoryIndex | ||||||
|     { |     { | ||||||
|         public List<CiscoAdvisory>? Advisories { get; init; } |         public List<CiscoAdvisory>? Advisories { get; init; } | ||||||
|   | |||||||
| @@ -4,4 +4,4 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC | |||||||
| |---|---|---|---| | |---|---|---|---| | ||||||
| |EXCITITOR-CONN-CISCO-01-001 – Endpoint discovery & auth plumbing|Team Excititor Connectors – Cisco|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added `CiscoProviderMetadataLoader` with bearer token support, offline snapshot fallback, DI helpers, and tests covering network/offline discovery to unblock subsequent fetch work.| | |EXCITITOR-CONN-CISCO-01-001 – Endpoint discovery & auth plumbing|Team Excititor Connectors – Cisco|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added `CiscoProviderMetadataLoader` with bearer token support, offline snapshot fallback, DI helpers, and tests covering network/offline discovery to unblock subsequent fetch work.| | ||||||
| |EXCITITOR-CONN-CISCO-01-002 – CSAF pull loop & pagination|Team Excititor Connectors – Cisco|EXCITITOR-CONN-CISCO-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-17)** – Implemented paginated advisory fetch using provider directories, raw document persistence with dedupe/state tracking, offline resiliency, and unit coverage.| | |EXCITITOR-CONN-CISCO-01-002 – CSAF pull loop & pagination|Team Excititor Connectors – Cisco|EXCITITOR-CONN-CISCO-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-17)** – Implemented paginated advisory fetch using provider directories, raw document persistence with dedupe/state tracking, offline resiliency, and unit coverage.| | ||||||
| |EXCITITOR-CONN-CISCO-01-003 – Provider trust metadata|Team Excititor Connectors – Cisco|EXCITITOR-CONN-CISCO-01-002, EXCITITOR-POLICY-01-001|**DOING (2025-10-19)** – Prereqs confirmed (both DONE); implementing cosign/PGP trust metadata emission and advisory provenance hints for policy weighting.| | |EXCITITOR-CONN-CISCO-01-003 – Provider trust metadata|Team Excititor Connectors – Cisco|EXCITITOR-CONN-CISCO-01-002, EXCITITOR-POLICY-01-001|**DONE (2025-10-29)** – Connector now annotates raw documents with provider trust + cosign/PGP provenance and upserts `VexProvider` entries; new unit coverage asserts metadata emission and provider-store invocation.| | ||||||
|   | |||||||
| @@ -3,5 +3,5 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC | |||||||
| | Task | Owner(s) | Depends on | Notes | | | Task | Owner(s) | Depends on | Notes | | ||||||
| |---|---|---|---| | |---|---|---|---| | ||||||
| |EXCITITOR-CONN-MS-01-001 – AAD onboarding & token cache|Team Excititor Connectors – MSRC|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added MSRC connector project with configurable AAD options, token provider (offline/online modes), DI wiring, and unit tests covering caching and fallback scenarios.| | |EXCITITOR-CONN-MS-01-001 – AAD onboarding & token cache|Team Excititor Connectors – MSRC|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added MSRC connector project with configurable AAD options, token provider (offline/online modes), DI wiring, and unit tests covering caching and fallback scenarios.| | ||||||
| |EXCITITOR-CONN-MS-01-002 – CSAF download pipeline|Team Excititor Connectors – MSRC|EXCITITOR-CONN-MS-01-001, EXCITITOR-STORAGE-01-003|**DOING (2025-10-19)** – Prereqs verified (EXCITITOR-CONN-MS-01-001, EXCITITOR-STORAGE-01-003); drafting fetch/retry plan and storage wiring before implementation of CSAF package download, checksum validation, and quarantine flows.| | |EXCITITOR-CONN-MS-01-002 – CSAF download pipeline|Team Excititor Connectors – MSRC|EXCITITOR-CONN-MS-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-29)** – Implemented authenticated CSAF retrieval with retry/backoff, checksum enforcement, quarantine for invalid archives, and regression tests covering dedupe + idempotent state updates.| | ||||||
| |EXCITITOR-CONN-MS-01-003 – Trust metadata & provenance hints|Team Excititor Connectors – MSRC|EXCITITOR-CONN-MS-01-002, EXCITITOR-POLICY-01-001|TODO – Emit cosign/AAD issuer metadata, attach provenance details, and document policy integration.| | |EXCITITOR-CONN-MS-01-003 – Trust metadata & provenance hints|Team Excititor Connectors – MSRC|EXCITITOR-CONN-MS-01-002, EXCITITOR-POLICY-01-001|TODO – Emit cosign/AAD issuer metadata, attach provenance details, and document policy integration.| | ||||||
|   | |||||||
| @@ -7,7 +7,9 @@ using Microsoft.Extensions.DependencyInjection.Extensions; | |||||||
| using StellaOps.Excititor.Connectors.Abstractions; | using StellaOps.Excititor.Connectors.Abstractions; | ||||||
| using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Authentication; | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Authentication; | ||||||
| using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Configuration; | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Configuration; | ||||||
|  | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Events; | ||||||
| using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Metadata; | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Metadata; | ||||||
|  | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.State; | ||||||
| using StellaOps.Excititor.Core; | using StellaOps.Excititor.Core; | ||||||
| using System.IO.Abstractions; | using System.IO.Abstractions; | ||||||
|  |  | ||||||
| @@ -29,6 +31,8 @@ public static class RancherHubConnectorServiceCollectionExtensions | |||||||
|             }); |             }); | ||||||
|  |  | ||||||
|         services.AddSingleton<IVexConnectorOptionsValidator<RancherHubConnectorOptions>, RancherHubConnectorOptionsValidator>(); |         services.AddSingleton<IVexConnectorOptionsValidator<RancherHubConnectorOptions>, RancherHubConnectorOptionsValidator>(); | ||||||
|  |         services.AddSingleton<RancherHubCheckpointManager>(); | ||||||
|  |         services.AddSingleton<RancherHubEventClient>(); | ||||||
|         services.AddSingleton<RancherHubTokenProvider>(); |         services.AddSingleton<RancherHubTokenProvider>(); | ||||||
|         services.AddSingleton<RancherHubMetadataLoader>(); |         services.AddSingleton<RancherHubMetadataLoader>(); | ||||||
|         services.AddSingleton<IVexConnector, RancherHubConnector>(); |         services.AddSingleton<IVexConnector, RancherHubConnector>(); | ||||||
|   | |||||||
| @@ -22,6 +22,8 @@ namespace StellaOps.Excititor.Connectors.SUSE.RancherVEXHub; | |||||||
|  |  | ||||||
| public sealed class RancherHubConnector : VexConnectorBase | public sealed class RancherHubConnector : VexConnectorBase | ||||||
| { | { | ||||||
|  |     private const int MaxDigestHistory = 200; | ||||||
|  |  | ||||||
|     private static readonly VexConnectorDescriptor StaticDescriptor = new( |     private static readonly VexConnectorDescriptor StaticDescriptor = new( | ||||||
|             id: "excititor:suse.rancher", |             id: "excititor:suse.rancher", | ||||||
|             kind: VexProviderKind.Hub, |             kind: VexProviderKind.Hub, | ||||||
| @@ -153,6 +155,12 @@ public sealed class RancherHubConnector : VexConnectorBase | |||||||
|             } |             } | ||||||
|         } |         } | ||||||
|  |  | ||||||
|  |         var trimmed = TrimHistory(digestHistory); | ||||||
|  |         if (trimmed) | ||||||
|  |         { | ||||||
|  |             stateChanged = true; | ||||||
|  |         } | ||||||
|  |  | ||||||
|         if (stateChanged || !string.Equals(latestCursor, checkpoint.Cursor, StringComparison.Ordinal) || latestPublishedAt != checkpoint.LastPublishedAt) |         if (stateChanged || !string.Equals(latestCursor, checkpoint.Cursor, StringComparison.Ordinal) || latestPublishedAt != checkpoint.LastPublishedAt) | ||||||
|         { |         { | ||||||
|             await _checkpointManager.SaveAsync( |             await _checkpointManager.SaveAsync( | ||||||
| @@ -236,6 +244,18 @@ public sealed class RancherHubConnector : VexConnectorBase | |||||||
|         return new EventProcessingResult(document, false, publishedAt); |         return new EventProcessingResult(document, false, publishedAt); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     private static bool TrimHistory(List<string> digestHistory) | ||||||
|  |     { | ||||||
|  |         if (digestHistory.Count <= MaxDigestHistory) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var excess = digestHistory.Count - MaxDigestHistory; | ||||||
|  |         digestHistory.RemoveRange(0, excess); | ||||||
|  |         return true; | ||||||
|  |     } | ||||||
|  |  | ||||||
|     private async Task<HttpRequestMessage> CreateDocumentRequestAsync(Uri documentUri, CancellationToken cancellationToken) |     private async Task<HttpRequestMessage> CreateDocumentRequestAsync(Uri documentUri, CancellationToken cancellationToken) | ||||||
|     { |     { | ||||||
|         var request = new HttpRequestMessage(HttpMethod.Get, documentUri); |         var request = new HttpRequestMessage(HttpMethod.Get, documentUri); | ||||||
|   | |||||||
| @@ -3,5 +3,5 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC | |||||||
| | Task | Owner(s) | Depends on | Notes | | | Task | Owner(s) | Depends on | Notes | | ||||||
| |---|---|---|---| | |---|---|---|---| | ||||||
| |EXCITITOR-CONN-SUSE-01-001 – Rancher hub discovery & auth|Team Excititor Connectors – SUSE|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added Rancher hub options/token provider, discovery metadata loader with offline snapshots + caching, connector shell, DI wiring, and unit tests covering network/offline paths.| | |EXCITITOR-CONN-SUSE-01-001 – Rancher hub discovery & auth|Team Excititor Connectors – SUSE|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added Rancher hub options/token provider, discovery metadata loader with offline snapshots + caching, connector shell, DI wiring, and unit tests covering network/offline paths.| | ||||||
| |EXCITITOR-CONN-SUSE-01-002 – Checkpointed event ingestion|Team Excititor Connectors – SUSE|EXCITITOR-CONN-SUSE-01-001, EXCITITOR-STORAGE-01-003|**DOING (2025-10-19)** – Process hub events with resume checkpoints, deduplication, and quarantine path for malformed payloads.<br>2025-10-19: Prereqs EXCITITOR-CONN-SUSE-01-001 & EXCITITOR-STORAGE-01-003 confirmed complete; initiating checkpoint/resume implementation plan.| | |EXCITITOR-CONN-SUSE-01-002 – Checkpointed event ingestion|Team Excititor Connectors – SUSE|EXCITITOR-CONN-SUSE-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-29)** – Wired checkpoint manager/event client into DI, bounded digest history, and exercised offline snapshot/dedup/quarantine flows with new connector tests ensuring state persistence and replay determinism.| | ||||||
| |EXCITITOR-CONN-SUSE-01-003 – Trust metadata & policy hints|Team Excititor Connectors – SUSE|EXCITITOR-CONN-SUSE-01-002, EXCITITOR-POLICY-01-001|TODO – Emit provider trust configuration (signers, weight overrides) and attach provenance hints for consensus engine.| | |EXCITITOR-CONN-SUSE-01-003 – Trust metadata & policy hints|Team Excititor Connectors – SUSE|EXCITITOR-CONN-SUSE-01-002, EXCITITOR-POLICY-01-001|TODO – Emit provider trust configuration (signers, weight overrides) and attach provenance hints for consensus engine.| | ||||||
|   | |||||||
| @@ -3,6 +3,6 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC | |||||||
| | Task | Owner(s) | Depends on | Notes | | | Task | Owner(s) | Depends on | Notes | | ||||||
| |---|---|---|---| | |---|---|---|---| | ||||||
| |EXCITITOR-CONN-UBUNTU-01-001 – Ubuntu CSAF discovery & channels|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added Ubuntu connector project with configurable channel options, catalog loader (network/offline), DI wiring, and discovery unit tests.| | |EXCITITOR-CONN-UBUNTU-01-001 – Ubuntu CSAF discovery & channels|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-ABS-01-001|**DONE (2025-10-17)** – Added Ubuntu connector project with configurable channel options, catalog loader (network/offline), DI wiring, and discovery unit tests.| | ||||||
| |EXCITITOR-CONN-UBUNTU-01-002 – Incremental fetch & deduplication|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-UBUNTU-01-001, EXCITITOR-STORAGE-01-003|**DOING (2025-10-19)** – Fetch CSAF bundles with ETag handling, checksum validation, deduplication, and raw persistence.| | |EXCITITOR-CONN-UBUNTU-01-002 – Incremental fetch & deduplication|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-UBUNTU-01-001, EXCITITOR-STORAGE-01-003|**DONE (2025-10-29)** – Incremental pull loop now enforces ETag/sha validation, resumes from persisted state, and includes regression tests covering checksum mismatch quarantine and If-None-Match replay.| | ||||||
| |EXCITITOR-CONN-UBUNTU-01-003 – Trust metadata & provenance|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-UBUNTU-01-002, EXCITITOR-POLICY-01-001|TODO – Emit Ubuntu signing metadata (GPG fingerprints) plus provenance hints for policy weighting and diagnostics.| | |EXCITITOR-CONN-UBUNTU-01-003 – Trust metadata & provenance|Team Excititor Connectors – Ubuntu|EXCITITOR-CONN-UBUNTU-01-002, EXCITITOR-POLICY-01-001|TODO – Emit Ubuntu signing metadata (GPG fingerprints) plus provenance hints for policy weighting and diagnostics.| | ||||||
| > Remark (2025-10-19, EXCITITOR-CONN-UBUNTU-01-002): Prerequisites EXCITITOR-CONN-UBUNTU-01-001 and EXCITITOR-STORAGE-01-003 verified as **DONE**; advancing to DOING per Wave 0 kickoff. | > Remark (2025-10-29, EXCITITOR-CONN-UBUNTU-01-002): Offline + network regression pass validated resume tokens, dedupe skips, checksum enforcement, and ETag handling before closing the task. | ||||||
|   | |||||||
| @@ -288,4 +288,7 @@ public enum VexExportFormat | |||||||
|  |  | ||||||
|     [EnumMember(Value = "csaf")] |     [EnumMember(Value = "csaf")] | ||||||
|     Csaf, |     Csaf, | ||||||
|  |  | ||||||
|  |     [EnumMember(Value = "cyclonedx")] | ||||||
|  |     CycloneDx, | ||||||
| } | } | ||||||
|   | |||||||
| @@ -140,6 +140,7 @@ public sealed class FileSystemArtifactStore : IVexArtifactStore | |||||||
|             VexExportFormat.JsonLines => ".jsonl", |             VexExportFormat.JsonLines => ".jsonl", | ||||||
|             VexExportFormat.OpenVex => ".json", |             VexExportFormat.OpenVex => ".json", | ||||||
|             VexExportFormat.Csaf => ".json", |             VexExportFormat.Csaf => ".json", | ||||||
|  |             VexExportFormat.CycloneDx => ".json", | ||||||
|             _ => ".bin", |             _ => ".bin", | ||||||
|         }; |         }; | ||||||
| } | } | ||||||
|   | |||||||
| @@ -220,6 +220,7 @@ public sealed class OfflineBundleArtifactStore : IVexArtifactStore | |||||||
|             VexExportFormat.JsonLines => ".jsonl", |             VexExportFormat.JsonLines => ".jsonl", | ||||||
|             VexExportFormat.OpenVex => ".json", |             VexExportFormat.OpenVex => ".json", | ||||||
|             VexExportFormat.Csaf => ".json", |             VexExportFormat.Csaf => ".json", | ||||||
|  |             VexExportFormat.CycloneDx => ".json", | ||||||
|             _ => ".bin", |             _ => ".bin", | ||||||
|         }; |         }; | ||||||
|  |  | ||||||
|   | |||||||
| @@ -146,6 +146,7 @@ public sealed class S3ArtifactStore : IVexArtifactStore | |||||||
|                 VexExportFormat.JsonLines => "application/json", |                 VexExportFormat.JsonLines => "application/json", | ||||||
|                 VexExportFormat.OpenVex => "application/vnd.openvex+json", |                 VexExportFormat.OpenVex => "application/vnd.openvex+json", | ||||||
|                 VexExportFormat.Csaf => "application/json", |                 VexExportFormat.Csaf => "application/json", | ||||||
|  |                 VexExportFormat.CycloneDx => "application/vnd.cyclonedx+json", | ||||||
|                 _ => "application/octet-stream", |                 _ => "application/octet-stream", | ||||||
|             }, |             }, | ||||||
|         }; |         }; | ||||||
| @@ -165,6 +166,7 @@ public sealed class S3ArtifactStore : IVexArtifactStore | |||||||
|             VexExportFormat.JsonLines => ".jsonl", |             VexExportFormat.JsonLines => ".jsonl", | ||||||
|             VexExportFormat.OpenVex => ".json", |             VexExportFormat.OpenVex => ".json", | ||||||
|             VexExportFormat.Csaf => ".json", |             VexExportFormat.Csaf => ".json", | ||||||
|  |             VexExportFormat.CycloneDx => ".json", | ||||||
|             _ => ".bin", |             _ => ".bin", | ||||||
|         }; |         }; | ||||||
| } | } | ||||||
|   | |||||||
| @@ -0,0 +1,512 @@ | |||||||
|  | using System.Collections.Generic; | ||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Globalization; | ||||||
|  | using System.IO; | ||||||
|  | using System.Linq; | ||||||
|  | using System.Security.Cryptography; | ||||||
|  | using System.Text; | ||||||
|  | using System.Text.Json.Serialization; | ||||||
|  | using System.Threading; | ||||||
|  | using System.Threading.Tasks; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.CSAF; | ||||||
|  |  | ||||||
|  | /// <summary> | ||||||
|  | /// Emits deterministic CSAF 2.0 VEX documents summarising normalized claims. | ||||||
|  | /// </summary> | ||||||
|  | public sealed class CsafExporter : IVexExporter | ||||||
|  | { | ||||||
|  |     public CsafExporter() | ||||||
|  |     { | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public VexExportFormat Format => VexExportFormat.Csaf; | ||||||
|  |  | ||||||
|  |     public VexContentAddress Digest(VexExportRequest request) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(request); | ||||||
|  |         var document = BuildDocument(request, out _); | ||||||
|  |         var json = VexCanonicalJsonSerializer.Serialize(document); | ||||||
|  |         return ComputeDigest(json); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public async ValueTask<VexExportResult> SerializeAsync( | ||||||
|  |         VexExportRequest request, | ||||||
|  |         Stream output, | ||||||
|  |         CancellationToken cancellationToken) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(request); | ||||||
|  |         ArgumentNullException.ThrowIfNull(output); | ||||||
|  |  | ||||||
|  |         var document = BuildDocument(request, out var metadata); | ||||||
|  |         var json = VexCanonicalJsonSerializer.Serialize(document); | ||||||
|  |         var digest = ComputeDigest(json); | ||||||
|  |         var buffer = Encoding.UTF8.GetBytes(json); | ||||||
|  |         await output.WriteAsync(buffer, 0, buffer.Length, cancellationToken).ConfigureAwait(false); | ||||||
|  |         return new VexExportResult(digest, buffer.LongLength, metadata); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private CsafExportDocument BuildDocument(VexExportRequest request, out ImmutableDictionary<string, string> metadata) | ||||||
|  |     { | ||||||
|  |         var signature = VexQuerySignature.FromQuery(request.Query); | ||||||
|  |         var signatureHash = signature.ComputeHash(); | ||||||
|  |         var generatedAt = request.GeneratedAt.UtcDateTime.ToString("O", CultureInfo.InvariantCulture); | ||||||
|  |  | ||||||
|  |         var productCatalog = new ProductCatalog(); | ||||||
|  |         var missingJustifications = new SortedSet<string>(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         var vulnerabilityBuilders = new Dictionary<string, CsafVulnerabilityBuilder>(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         foreach (var claim in request.Claims) | ||||||
|  |         { | ||||||
|  |             var productId = productCatalog.GetOrAddProductId(claim.Product); | ||||||
|  |  | ||||||
|  |             if (!vulnerabilityBuilders.TryGetValue(claim.VulnerabilityId, out var builder)) | ||||||
|  |             { | ||||||
|  |                 builder = new CsafVulnerabilityBuilder(claim.VulnerabilityId); | ||||||
|  |                 vulnerabilityBuilders[claim.VulnerabilityId] = builder; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             builder.AddClaim(claim, productId); | ||||||
|  |  | ||||||
|  |             if (claim.Status == VexClaimStatus.NotAffected && claim.Justification is null) | ||||||
|  |             { | ||||||
|  |                 missingJustifications.Add(FormattableString.Invariant($"{claim.VulnerabilityId}:{productId}")); | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var products = productCatalog.Build(); | ||||||
|  |         var vulnerabilities = vulnerabilityBuilders.Values | ||||||
|  |             .Select(builder => builder.ToVulnerability()) | ||||||
|  |             .Where(static vulnerability => vulnerability is not null) | ||||||
|  |             .Select(static vulnerability => vulnerability!) | ||||||
|  |             .OrderBy(static vulnerability => vulnerability.Cve ?? vulnerability.Id ?? string.Empty, StringComparer.Ordinal) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         var sourceProviders = request.Claims | ||||||
|  |             .Select(static claim => claim.ProviderId) | ||||||
|  |             .Distinct(StringComparer.Ordinal) | ||||||
|  |             .OrderBy(static provider => provider, StringComparer.Ordinal) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         var documentSection = new CsafDocumentSection( | ||||||
|  |             Category: "vex", | ||||||
|  |             Title: "StellaOps VEX CSAF Export", | ||||||
|  |             Tracking: new CsafTrackingSection( | ||||||
|  |                 Id: FormattableString.Invariant($"stellaops:csaf:{signatureHash.Digest}"), | ||||||
|  |                 Status: "final", | ||||||
|  |                 Version: "1", | ||||||
|  |                 Revision: "1", | ||||||
|  |                 InitialReleaseDate: generatedAt, | ||||||
|  |                 CurrentReleaseDate: generatedAt, | ||||||
|  |                 Generator: new CsafGeneratorSection("StellaOps Excititor")), | ||||||
|  |             Publisher: new CsafPublisherSection("StellaOps Excititor", "coordinator")); | ||||||
|  |  | ||||||
|  |         var metadataSection = new CsafExportMetadata( | ||||||
|  |             generatedAt, | ||||||
|  |             signature.Value, | ||||||
|  |             sourceProviders, | ||||||
|  |             missingJustifications.Count == 0 | ||||||
|  |                 ? ImmutableDictionary<string, string>.Empty | ||||||
|  |                 : ImmutableDictionary<string, string>.Empty.Add( | ||||||
|  |                     "policy.justification_missing", | ||||||
|  |                     string.Join(",", missingJustifications))); | ||||||
|  |  | ||||||
|  |         metadata = BuildMetadata(signature, vulnerabilities.Length, products.Length, missingJustifications, sourceProviders, generatedAt); | ||||||
|  |  | ||||||
|  |         var productTree = new CsafProductTreeSection(products); | ||||||
|  |         return new CsafExportDocument(documentSection, productTree, vulnerabilities, metadataSection); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static ImmutableDictionary<string, string> BuildMetadata( | ||||||
|  |         VexQuerySignature signature, | ||||||
|  |         int vulnerabilityCount, | ||||||
|  |         int productCount, | ||||||
|  |         IEnumerable<string> missingJustifications, | ||||||
|  |         ImmutableArray<string> sourceProviders, | ||||||
|  |         string generatedAt) | ||||||
|  |     { | ||||||
|  |         var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal); | ||||||
|  |         builder["csaf.querySignature"] = signature.Value; | ||||||
|  |         builder["csaf.generatedAt"] = generatedAt; | ||||||
|  |         builder["csaf.vulnerabilityCount"] = vulnerabilityCount.ToString(CultureInfo.InvariantCulture); | ||||||
|  |         builder["csaf.productCount"] = productCount.ToString(CultureInfo.InvariantCulture); | ||||||
|  |         builder["csaf.providerCount"] = sourceProviders.Length.ToString(CultureInfo.InvariantCulture); | ||||||
|  |  | ||||||
|  |         var missing = missingJustifications.ToArray(); | ||||||
|  |         if (missing.Length > 0) | ||||||
|  |         { | ||||||
|  |             builder["policy.justification_missing"] = string.Join(",", missing); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return builder.ToImmutable(); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static VexContentAddress ComputeDigest(string json) | ||||||
|  |     { | ||||||
|  |         var bytes = Encoding.UTF8.GetBytes(json); | ||||||
|  |         Span<byte> hash = stackalloc byte[SHA256.HashSizeInBytes]; | ||||||
|  |         SHA256.HashData(bytes, hash); | ||||||
|  |         var digest = Convert.ToHexString(hash).ToLowerInvariant(); | ||||||
|  |         return new VexContentAddress("sha256", digest); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class ProductCatalog | ||||||
|  |     { | ||||||
|  |         private readonly Dictionary<string, MutableProduct> _products = new(StringComparer.Ordinal); | ||||||
|  |         private readonly HashSet<string> _usedIds = new(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         public string GetOrAddProductId(VexProduct product) | ||||||
|  |         { | ||||||
|  |             if (_products.TryGetValue(product.Key, out var existing)) | ||||||
|  |             { | ||||||
|  |                 existing.Update(product); | ||||||
|  |                 return existing.ProductId; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var productId = GenerateProductId(product.Key); | ||||||
|  |             var mutable = new MutableProduct(productId); | ||||||
|  |             mutable.Update(product); | ||||||
|  |             _products[product.Key] = mutable; | ||||||
|  |             return productId; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public ImmutableArray<CsafProductEntry> Build() | ||||||
|  |             => _products.Values | ||||||
|  |                 .Select(static product => product.ToEntry()) | ||||||
|  |                 .OrderBy(static entry => entry.ProductId, StringComparer.Ordinal) | ||||||
|  |                 .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         private string GenerateProductId(string key) | ||||||
|  |         { | ||||||
|  |             var sanitized = SanitizeIdentifier(key); | ||||||
|  |             if (_usedIds.Add(sanitized)) | ||||||
|  |             { | ||||||
|  |                 return sanitized; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var hash = ComputeShortHash(key); | ||||||
|  |             var candidate = FormattableString.Invariant($"{sanitized}-{hash}"); | ||||||
|  |             while (!_usedIds.Add(candidate)) | ||||||
|  |             { | ||||||
|  |                 candidate = FormattableString.Invariant($"{candidate}-{hash}"); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return candidate; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static string SanitizeIdentifier(string value) | ||||||
|  |         { | ||||||
|  |             if (string.IsNullOrWhiteSpace(value)) | ||||||
|  |             { | ||||||
|  |                 return "product"; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var builder = new StringBuilder(value.Length); | ||||||
|  |             foreach (var ch in value) | ||||||
|  |             { | ||||||
|  |                 builder.Append(char.IsLetterOrDigit(ch) ? char.ToLowerInvariant(ch) : '-'); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var sanitized = builder.ToString().Trim('-'); | ||||||
|  |             return string.IsNullOrEmpty(sanitized) ? "product" : sanitized; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static string ComputeShortHash(string value) | ||||||
|  |         { | ||||||
|  |             var bytes = Encoding.UTF8.GetBytes(value); | ||||||
|  |             Span<byte> hash = stackalloc byte[SHA256.HashSizeInBytes]; | ||||||
|  |             SHA256.HashData(bytes, hash); | ||||||
|  |             return Convert.ToHexString(hash[..6]).ToLowerInvariant(); | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class MutableProduct | ||||||
|  |     { | ||||||
|  |         public MutableProduct(string productId) | ||||||
|  |         { | ||||||
|  |             ProductId = productId; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public string ProductId { get; } | ||||||
|  |  | ||||||
|  |         private string? _name; | ||||||
|  |         private string? _version; | ||||||
|  |         private string? _purl; | ||||||
|  |         private string? _cpe; | ||||||
|  |         private readonly SortedSet<string> _componentIdentifiers = new(StringComparer.OrdinalIgnoreCase); | ||||||
|  |  | ||||||
|  |         public void Update(VexProduct product) | ||||||
|  |         { | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Name) && ShouldReplace(_name, product.Name)) | ||||||
|  |             { | ||||||
|  |                 _name = product.Name; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Version) && ShouldReplace(_version, product.Version)) | ||||||
|  |             { | ||||||
|  |                 _version = product.Version; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Purl) && ShouldReplace(_purl, product.Purl)) | ||||||
|  |             { | ||||||
|  |                 _purl = product.Purl; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Cpe) && ShouldReplace(_cpe, product.Cpe)) | ||||||
|  |             { | ||||||
|  |                 _cpe = product.Cpe; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             foreach (var identifier in product.ComponentIdentifiers) | ||||||
|  |             { | ||||||
|  |                 if (!string.IsNullOrWhiteSpace(identifier)) | ||||||
|  |                 { | ||||||
|  |                     _componentIdentifiers.Add(identifier.Trim()); | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static bool ShouldReplace(string? existing, string candidate) | ||||||
|  |         { | ||||||
|  |             if (string.IsNullOrWhiteSpace(candidate)) | ||||||
|  |             { | ||||||
|  |                 return false; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (string.IsNullOrWhiteSpace(existing)) | ||||||
|  |             { | ||||||
|  |                 return true; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return candidate.Length > existing.Length; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public CsafProductEntry ToEntry() | ||||||
|  |         { | ||||||
|  |             var helper = new CsafProductIdentificationHelper( | ||||||
|  |                 _purl, | ||||||
|  |                 _cpe, | ||||||
|  |                 _version, | ||||||
|  |                 _componentIdentifiers.Count == 0 ? null : _componentIdentifiers.ToImmutableArray()); | ||||||
|  |  | ||||||
|  |             return new CsafProductEntry(ProductId, _name ?? ProductId, helper); | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class CsafVulnerabilityBuilder | ||||||
|  |     { | ||||||
|  |         private readonly string _vulnerabilityId; | ||||||
|  |         private string? _title; | ||||||
|  |         private readonly Dictionary<string, SortedSet<string>> _statusMap = new(StringComparer.Ordinal); | ||||||
|  |         private readonly Dictionary<string, SortedSet<string>> _flags = new(StringComparer.Ordinal); | ||||||
|  |         private readonly Dictionary<string, CsafReference> _references = new(StringComparer.Ordinal); | ||||||
|  |         private readonly Dictionary<string, CsafNote> _notes = new(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         public CsafVulnerabilityBuilder(string vulnerabilityId) | ||||||
|  |         { | ||||||
|  |             _vulnerabilityId = vulnerabilityId; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public void AddClaim(VexClaim claim, string productId) | ||||||
|  |         { | ||||||
|  |             var statusField = MapStatus(claim.Status); | ||||||
|  |             if (!string.IsNullOrEmpty(statusField)) | ||||||
|  |             { | ||||||
|  |                 GetSet(_statusMap, statusField!).Add(productId); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (claim.Justification is not null) | ||||||
|  |             { | ||||||
|  |                 var label = claim.Justification.Value.ToString().ToLowerInvariant(); | ||||||
|  |                 GetSet(_flags, label).Add(productId); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!string.IsNullOrWhiteSpace(claim.Detail)) | ||||||
|  |             { | ||||||
|  |                 var noteKey = FormattableString.Invariant($"{claim.ProviderId}|{productId}"); | ||||||
|  |                 var text = claim.Detail!.Trim(); | ||||||
|  |                 _notes[noteKey] = new CsafNote("description", claim.ProviderId, text, "external"); | ||||||
|  |  | ||||||
|  |                 if (string.IsNullOrWhiteSpace(_title)) | ||||||
|  |                 { | ||||||
|  |                     _title = text; | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var referenceKey = claim.Document.Digest; | ||||||
|  |             if (!_references.ContainsKey(referenceKey)) | ||||||
|  |             { | ||||||
|  |                 _references[referenceKey] = new CsafReference( | ||||||
|  |                     claim.Document.SourceUri.ToString(), | ||||||
|  |                     claim.ProviderId, | ||||||
|  |                     "advisory"); | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public CsafExportVulnerability? ToVulnerability() | ||||||
|  |         { | ||||||
|  |             if (_statusMap.Count == 0 && _flags.Count == 0 && _references.Count == 0 && _notes.Count == 0) | ||||||
|  |             { | ||||||
|  |                 return null; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var productStatus = BuildProductStatus(); | ||||||
|  |             ImmutableArray<CsafFlag>? flags = _flags.Count == 0 | ||||||
|  |                 ? null | ||||||
|  |                 : _flags | ||||||
|  |                     .OrderBy(static pair => pair.Key, StringComparer.Ordinal) | ||||||
|  |                     .Select(pair => new CsafFlag(pair.Key, pair.Value.ToImmutableArray())) | ||||||
|  |                     .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |             ImmutableArray<CsafNote>? notes = _notes.Count == 0 | ||||||
|  |                 ? null | ||||||
|  |                 : _notes.Values | ||||||
|  |                     .OrderBy(static note => note.Title, StringComparer.Ordinal) | ||||||
|  |                     .ThenBy(static note => note.Text, StringComparer.Ordinal) | ||||||
|  |                     .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |             ImmutableArray<CsafReference>? references = _references.Count == 0 | ||||||
|  |                 ? null | ||||||
|  |                 : _references.Values | ||||||
|  |                     .OrderBy(static reference => reference.Url, StringComparer.Ordinal) | ||||||
|  |                     .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |             var isCve = _vulnerabilityId.StartsWith("CVE-", StringComparison.OrdinalIgnoreCase); | ||||||
|  |  | ||||||
|  |             return new CsafExportVulnerability( | ||||||
|  |                 Cve: isCve ? _vulnerabilityId.ToUpperInvariant() : null, | ||||||
|  |                 Id: isCve ? null : _vulnerabilityId, | ||||||
|  |                 Title: _title, | ||||||
|  |                 ProductStatus: productStatus, | ||||||
|  |                 Flags: flags, | ||||||
|  |                 Notes: notes, | ||||||
|  |                 References: references); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private CsafProductStatus? BuildProductStatus() | ||||||
|  |         { | ||||||
|  |             var knownAffected = GetStatusArray("known_affected"); | ||||||
|  |             var knownNotAffected = GetStatusArray("known_not_affected"); | ||||||
|  |             var fixedProducts = GetStatusArray("fixed"); | ||||||
|  |             var underInvestigation = GetStatusArray("under_investigation"); | ||||||
|  |  | ||||||
|  |             if (knownAffected is null && knownNotAffected is null && fixedProducts is null && underInvestigation is null) | ||||||
|  |             { | ||||||
|  |                 return null; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return new CsafProductStatus(knownAffected, knownNotAffected, fixedProducts, underInvestigation); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private ImmutableArray<string>? GetStatusArray(string statusKey) | ||||||
|  |         { | ||||||
|  |             if (_statusMap.TryGetValue(statusKey, out var entries) && entries.Count > 0) | ||||||
|  |             { | ||||||
|  |                 return entries.ToImmutableArray(); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return null; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static SortedSet<string> GetSet(Dictionary<string, SortedSet<string>> map, string key) | ||||||
|  |         { | ||||||
|  |             if (!map.TryGetValue(key, out var set)) | ||||||
|  |             { | ||||||
|  |                 set = new SortedSet<string>(StringComparer.Ordinal); | ||||||
|  |                 map[key] = set; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return set; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static string? MapStatus(VexClaimStatus status) | ||||||
|  |             => status switch | ||||||
|  |             { | ||||||
|  |                 VexClaimStatus.Affected => "known_affected", | ||||||
|  |                 VexClaimStatus.NotAffected => "known_not_affected", | ||||||
|  |                 VexClaimStatus.Fixed => "fixed", | ||||||
|  |                 VexClaimStatus.UnderInvestigation => "under_investigation", | ||||||
|  |                 _ => null, | ||||||
|  |             }; | ||||||
|  |     } | ||||||
|  | } | ||||||
|  |  | ||||||
|  | internal sealed record CsafExportDocument( | ||||||
|  |     CsafDocumentSection Document, | ||||||
|  |     CsafProductTreeSection ProductTree, | ||||||
|  |     ImmutableArray<CsafExportVulnerability> Vulnerabilities, | ||||||
|  |     CsafExportMetadata Metadata); | ||||||
|  |  | ||||||
|  | internal sealed record CsafDocumentSection( | ||||||
|  |     [property: JsonPropertyName("category")] string Category, | ||||||
|  |     [property: JsonPropertyName("title")] string Title, | ||||||
|  |     [property: JsonPropertyName("tracking")] CsafTrackingSection Tracking, | ||||||
|  |     [property: JsonPropertyName("publisher")] CsafPublisherSection Publisher); | ||||||
|  |  | ||||||
|  | internal sealed record CsafTrackingSection( | ||||||
|  |     [property: JsonPropertyName("id")] string Id, | ||||||
|  |     [property: JsonPropertyName("status")] string Status, | ||||||
|  |     [property: JsonPropertyName("version")] string Version, | ||||||
|  |     [property: JsonPropertyName("revision")] string Revision, | ||||||
|  |     [property: JsonPropertyName("initial_release_date")] string InitialReleaseDate, | ||||||
|  |     [property: JsonPropertyName("current_release_date")] string CurrentReleaseDate, | ||||||
|  |     [property: JsonPropertyName("generator")] CsafGeneratorSection Generator); | ||||||
|  |  | ||||||
|  | internal sealed record CsafGeneratorSection( | ||||||
|  |     [property: JsonPropertyName("engine")] string Engine); | ||||||
|  |  | ||||||
|  | internal sealed record CsafPublisherSection( | ||||||
|  |     [property: JsonPropertyName("name")] string Name, | ||||||
|  |     [property: JsonPropertyName("category")] string Category); | ||||||
|  |  | ||||||
|  | internal sealed record CsafProductTreeSection( | ||||||
|  |     [property: JsonPropertyName("full_product_names")] ImmutableArray<CsafProductEntry> FullProductNames); | ||||||
|  |  | ||||||
|  | internal sealed record CsafProductEntry( | ||||||
|  |     [property: JsonPropertyName("product_id")] string ProductId, | ||||||
|  |     [property: JsonPropertyName("name")] string Name, | ||||||
|  |     [property: JsonPropertyName("product_identification_helper"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] CsafProductIdentificationHelper? IdentificationHelper); | ||||||
|  |  | ||||||
|  | internal sealed record CsafProductIdentificationHelper( | ||||||
|  |     [property: JsonPropertyName("purl"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Purl, | ||||||
|  |     [property: JsonPropertyName("cpe"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Cpe, | ||||||
|  |     [property: JsonPropertyName("product_version"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? ProductVersion, | ||||||
|  |     [property: JsonPropertyName("x_stellaops_component_identifiers"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<string>? ComponentIdentifiers); | ||||||
|  |  | ||||||
|  | internal sealed record CsafExportVulnerability( | ||||||
|  |     [property: JsonPropertyName("cve"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Cve, | ||||||
|  |     [property: JsonPropertyName("id"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Id, | ||||||
|  |     [property: JsonPropertyName("title"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Title, | ||||||
|  |     [property: JsonPropertyName("product_status"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] CsafProductStatus? ProductStatus, | ||||||
|  |     [property: JsonPropertyName("flags"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<CsafFlag>? Flags, | ||||||
|  |     [property: JsonPropertyName("notes"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<CsafNote>? Notes, | ||||||
|  |     [property: JsonPropertyName("references"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<CsafReference>? References); | ||||||
|  |  | ||||||
|  | internal sealed record CsafProductStatus( | ||||||
|  |     [property: JsonPropertyName("known_affected"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<string>? KnownAffected, | ||||||
|  |     [property: JsonPropertyName("known_not_affected"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<string>? KnownNotAffected, | ||||||
|  |     [property: JsonPropertyName("fixed"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<string>? Fixed, | ||||||
|  |     [property: JsonPropertyName("under_investigation"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<string>? UnderInvestigation); | ||||||
|  |  | ||||||
|  | internal sealed record CsafFlag( | ||||||
|  |     [property: JsonPropertyName("label")] string Label, | ||||||
|  |     [property: JsonPropertyName("product_ids")] ImmutableArray<string> ProductIds); | ||||||
|  |  | ||||||
|  | internal sealed record CsafNote( | ||||||
|  |     [property: JsonPropertyName("category")] string Category, | ||||||
|  |     [property: JsonPropertyName("title")] string Title, | ||||||
|  |     [property: JsonPropertyName("text")] string Text, | ||||||
|  |     [property: JsonPropertyName("audience"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Audience); | ||||||
|  |  | ||||||
|  | internal sealed record CsafReference( | ||||||
|  |     [property: JsonPropertyName("url")] string Url, | ||||||
|  |     [property: JsonPropertyName("summary")] string Summary, | ||||||
|  |     [property: JsonPropertyName("type")] string Type); | ||||||
|  |  | ||||||
|  | internal sealed record CsafExportMetadata( | ||||||
|  |     [property: JsonPropertyName("generated_at")] string GeneratedAt, | ||||||
|  |     [property: JsonPropertyName("query_signature")] string QuerySignature, | ||||||
|  |     [property: JsonPropertyName("source_providers")] ImmutableArray<string> SourceProviders, | ||||||
|  |     [property: JsonPropertyName("diagnostics")] ImmutableDictionary<string, string> Diagnostics); | ||||||
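Both exporters in this commit derive their content address the same way: hash the canonical JSON with SHA-256 and emit lower-case hex. A minimal, self-contained sketch of that pattern, with a plain tuple standing in for `VexContentAddress` so the snippet carries no StellaOps dependencies:

```csharp
// Sketch of the ComputeDigest pattern used by the CSAF and CycloneDX exporters.
// The (algorithm, digest) tuple is a stand-in for VexContentAddress.
using System;
using System.Security.Cryptography;
using System.Text;

static class ContentAddressSketch
{
    public static (string Algorithm, string Digest) Compute(string canonicalJson)
    {
        // Hash the UTF-8 bytes of the canonical JSON payload.
        byte[] hash = SHA256.HashData(Encoding.UTF8.GetBytes(canonicalJson));

        // Lower-case hex keeps the address stable across runs and platforms.
        return ("sha256", Convert.ToHexString(hash).ToLowerInvariant());
    }
}
```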
| @@ -141,17 +141,22 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|             var diagnosticsBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal); |             var diagnosticsBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal); | ||||||
|             if (!result.UnsupportedStatuses.IsDefaultOrEmpty && result.UnsupportedStatuses.Length > 0) |             if (!result.UnsupportedStatuses.IsDefaultOrEmpty && result.UnsupportedStatuses.Length > 0) | ||||||
|             { |             { | ||||||
|                 diagnosticsBuilder["csaf.unsupported_statuses"] = string.Join(",", result.UnsupportedStatuses); |                 diagnosticsBuilder["policy.unsupported_statuses"] = string.Join(",", result.UnsupportedStatuses); | ||||||
|             } |             } | ||||||
|  |  | ||||||
|             if (!result.UnsupportedJustifications.IsDefaultOrEmpty && result.UnsupportedJustifications.Length > 0) |             if (!result.UnsupportedJustifications.IsDefaultOrEmpty && result.UnsupportedJustifications.Length > 0) | ||||||
|             { |             { | ||||||
|                 diagnosticsBuilder["csaf.unsupported_justifications"] = string.Join(",", result.UnsupportedJustifications); |                 diagnosticsBuilder["policy.unsupported_justifications"] = string.Join(",", result.UnsupportedJustifications); | ||||||
|             } |             } | ||||||
|  |  | ||||||
|             if (!result.ConflictingJustifications.IsDefaultOrEmpty && result.ConflictingJustifications.Length > 0) |             if (!result.ConflictingJustifications.IsDefaultOrEmpty && result.ConflictingJustifications.Length > 0) | ||||||
|             { |             { | ||||||
|                 diagnosticsBuilder["csaf.justification_conflicts"] = string.Join(",", result.ConflictingJustifications); |                 diagnosticsBuilder["policy.justification_conflicts"] = string.Join(",", result.ConflictingJustifications); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!result.MissingRequiredJustifications.IsDefaultOrEmpty && result.MissingRequiredJustifications.Length > 0) | ||||||
|  |             { | ||||||
|  |                 diagnosticsBuilder["policy.justification_missing"] = string.Join(",", result.MissingRequiredJustifications); | ||||||
|             } |             } | ||||||
|  |  | ||||||
|             var diagnostics = diagnosticsBuilder.Count == 0 |             var diagnostics = diagnosticsBuilder.Count == 0 | ||||||
| @@ -202,6 +207,7 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|             var unsupportedStatuses = new HashSet<string>(StringComparer.OrdinalIgnoreCase); |             var unsupportedStatuses = new HashSet<string>(StringComparer.OrdinalIgnoreCase); | ||||||
|             var unsupportedJustifications = new HashSet<string>(StringComparer.OrdinalIgnoreCase); |             var unsupportedJustifications = new HashSet<string>(StringComparer.OrdinalIgnoreCase); | ||||||
|             var conflictingJustifications = new HashSet<string>(StringComparer.OrdinalIgnoreCase); |             var conflictingJustifications = new HashSet<string>(StringComparer.OrdinalIgnoreCase); | ||||||
|  |             var missingRequiredJustifications = new HashSet<string>(StringComparer.OrdinalIgnoreCase); | ||||||
|  |  | ||||||
|             var claimsBuilder = ImmutableArray.CreateBuilder<CsafClaimEntry>(); |             var claimsBuilder = ImmutableArray.CreateBuilder<CsafClaimEntry>(); | ||||||
|  |  | ||||||
| @@ -230,7 +236,8 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|                         productCatalog, |                         productCatalog, | ||||||
|                         justifications, |                         justifications, | ||||||
|                         detail, |                         detail, | ||||||
|                         unsupportedStatuses); |                         unsupportedStatuses, | ||||||
|  |                         missingRequiredJustifications); | ||||||
|  |  | ||||||
|                     claimsBuilder.AddRange(productClaims); |                     claimsBuilder.AddRange(productClaims); | ||||||
|                 } |                 } | ||||||
| @@ -244,7 +251,8 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|                 claimsBuilder.ToImmutable(), |                 claimsBuilder.ToImmutable(), | ||||||
|                 unsupportedStatuses.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase).ToImmutableArray(), |                 unsupportedStatuses.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase).ToImmutableArray(), | ||||||
|                 unsupportedJustifications.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase).ToImmutableArray(), |                 unsupportedJustifications.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase).ToImmutableArray(), | ||||||
|                 conflictingJustifications.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase).ToImmutableArray()); |                 conflictingJustifications.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase).ToImmutableArray(), | ||||||
|  |                 missingRequiredJustifications.OrderBy(static s => s, StringComparer.OrdinalIgnoreCase).ToImmutableArray()); | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         private static IReadOnlyList<CsafClaimEntry> BuildClaimsForVulnerability( |         private static IReadOnlyList<CsafClaimEntry> BuildClaimsForVulnerability( | ||||||
| @@ -253,7 +261,8 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|             IReadOnlyDictionary<string, CsafProductInfo> productCatalog, |             IReadOnlyDictionary<string, CsafProductInfo> productCatalog, | ||||||
|             ImmutableDictionary<string, CsafJustificationInfo> justifications, |             ImmutableDictionary<string, CsafJustificationInfo> justifications, | ||||||
|             string? detail, |             string? detail, | ||||||
|             ISet<string> unsupportedStatuses) |             ISet<string> unsupportedStatuses, | ||||||
|  |             ISet<string> missingRequiredJustifications) | ||||||
|         { |         { | ||||||
|             if (!vulnerability.TryGetProperty("product_status", out var statusElement) || |             if (!vulnerability.TryGetProperty("product_status", out var statusElement) || | ||||||
|                 statusElement.ValueKind != JsonValueKind.Object) |                 statusElement.ValueKind != JsonValueKind.Object) | ||||||
| @@ -297,7 +306,7 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|                 return Array.Empty<CsafClaimEntry>(); |                 return Array.Empty<CsafClaimEntry>(); | ||||||
|             } |             } | ||||||
|  |  | ||||||
|             return claims.Values |             var builtClaims = claims.Values | ||||||
|                 .Select(builder => new CsafClaimEntry( |                 .Select(builder => new CsafClaimEntry( | ||||||
|                     vulnerabilityId, |                     vulnerabilityId, | ||||||
|                     builder.Product, |                     builder.Product, | ||||||
| @@ -307,6 +316,16 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|                     builder.Justification, |                     builder.Justification, | ||||||
|                     builder.RawJustification)) |                     builder.RawJustification)) | ||||||
|                 .ToArray(); |                 .ToArray(); | ||||||
|  |  | ||||||
|  |             foreach (var entry in builtClaims) | ||||||
|  |             { | ||||||
|  |                 if (entry.Status == VexClaimStatus.NotAffected && entry.Justification is null) | ||||||
|  |                 { | ||||||
|  |                     missingRequiredJustifications.Add(FormattableString.Invariant($"{entry.VulnerabilityId}:{entry.Product.ProductId}")); | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return builtClaims; | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         private static void UpdateClaim( |         private static void UpdateClaim( | ||||||
| @@ -855,7 +874,8 @@ public sealed class CsafNormalizer : IVexNormalizer | |||||||
|         ImmutableArray<CsafClaimEntry> Claims, |         ImmutableArray<CsafClaimEntry> Claims, | ||||||
|         ImmutableArray<string> UnsupportedStatuses, |         ImmutableArray<string> UnsupportedStatuses, | ||||||
|         ImmutableArray<string> UnsupportedJustifications, |         ImmutableArray<string> UnsupportedJustifications, | ||||||
|         ImmutableArray<string> ConflictingJustifications); |         ImmutableArray<string> ConflictingJustifications, | ||||||
|  |         ImmutableArray<string> MissingRequiredJustifications); | ||||||
|  |  | ||||||
|     private sealed record CsafJustificationInfo( |     private sealed record CsafJustificationInfo( | ||||||
|         string RawValue, |         string RawValue, | ||||||
|   | |||||||
| @@ -9,6 +9,7 @@ public static class CsafFormatsServiceCollectionExtensions | |||||||
|     { |     { | ||||||
|         ArgumentNullException.ThrowIfNull(services); |         ArgumentNullException.ThrowIfNull(services); | ||||||
|         services.AddSingleton<IVexNormalizer, CsafNormalizer>(); |         services.AddSingleton<IVexNormalizer, CsafNormalizer>(); | ||||||
|  |         services.AddSingleton<IVexExporter, CsafExporter>(); | ||||||
|         return services; |         return services; | ||||||
|     } |     } | ||||||
| } | } | ||||||
|   | |||||||
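With this registration both format packages now contribute an `IVexExporter` alongside their normalizer, so consumers resolve several exporters side by side. A hedged sketch of selecting one by format; the direct `AddSingleton` calls mirror the DI hooks in this commit, while the CSAF namespace and the surrounding host are assumptions:

```csharp
// Illustrative only: resolve every registered exporter and pick one by format.
// IVexExporter.Format is shown on the CycloneDX exporter in this commit; the
// interface is assumed to declare it. The CSAF namespace below is a guess.
using System.Linq;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Excititor.Core;
using StellaOps.Excititor.Formats.CSAF;       // assumed namespace
using StellaOps.Excititor.Formats.CycloneDX;

var provider = new ServiceCollection()
    .AddSingleton<IVexExporter, CsafExporter>()
    .AddSingleton<IVexExporter, CycloneDxExporter>()
    .BuildServiceProvider();

// GetServices<T>() yields all IVexExporter registrations in registration order.
var cycloneDx = provider.GetServices<IVexExporter>()
    .First(exporter => exporter.Format == VexExportFormat.CycloneDx);
```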
| @@ -3,5 +3,5 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC | |||||||
| | Task | Owner(s) | Depends on | Notes | | | Task | Owner(s) | Depends on | Notes | | ||||||
| |---|---|---|---| | |---|---|---|---| | ||||||
| |EXCITITOR-FMT-CSAF-01-001 – CSAF normalizer foundation|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – Implemented CSAF normalizer + DI hook, parsing tracking metadata, product tree branches/full names, and mapping product statuses into canonical `VexClaim`s with baseline precedence. Regression added in `CsafNormalizerTests`.| | |EXCITITOR-FMT-CSAF-01-001 – CSAF normalizer foundation|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – Implemented CSAF normalizer + DI hook, parsing tracking metadata, product tree branches/full names, and mapping product statuses into canonical `VexClaim`s with baseline precedence. Regression added in `CsafNormalizerTests`.| | ||||||
| |EXCITITOR-FMT-CSAF-01-002 – Status/justification mapping|Team Excititor Formats|EXCITITOR-FMT-CSAF-01-001, EXCITITOR-POLICY-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-FMT-CSAF-01-001 & EXCITITOR-POLICY-01-001 verified DONE; starting normalization of `product_status`/`justification` values with policy-aligned diagnostics.| | |EXCITITOR-FMT-CSAF-01-002 – Status/justification mapping|Team Excititor Formats|EXCITITOR-FMT-CSAF-01-001, EXCITITOR-POLICY-01-001|**DONE (2025-10-29)** – Added policy-aligned diagnostics for unsupported statuses/justifications and flagged missing not_affected evidence inside normalizer outputs.| | ||||||
| |EXCITITOR-FMT-CSAF-01-003 – CSAF export adapter|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-CSAF-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-EXPORT-01-001 & EXCITITOR-FMT-CSAF-01-001 confirmed DONE; drafting deterministic CSAF exporter and manifest metadata flow.| | |EXCITITOR-FMT-CSAF-01-003 – CSAF export adapter|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-CSAF-01-001|**DONE (2025-10-29)** – Implemented deterministic CSAF exporter with product tree reconciliation, vulnerability status mapping, and metadata for downstream attestation.| | ||||||
|   | |||||||
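The normalizer and both exporters now flag `not_affected` claims that lack a justification under the shared `policy.justification_missing` diagnostics key (a comma-separated list of vulnerability/product pairs). A small sketch of how a downstream policy gate might read it; the key and value format come from this commit, the gate itself is hypothetical:

```csharp
// Hypothetical policy gate over normalizer/exporter metadata. Only the
// "policy.justification_missing" key and its comma-separated value format
// are taken from this commit.
using System;
using System.Collections.Generic;

static class JustificationGate
{
    public static IReadOnlyList<string> MissingJustifications(IReadOnlyDictionary<string, string> metadata)
        => metadata.TryGetValue("policy.justification_missing", out var value) && !string.IsNullOrWhiteSpace(value)
            ? value.Split(',', StringSplitOptions.RemoveEmptyEntries)
            : Array.Empty<string>();
}
```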
| @@ -0,0 +1,242 @@ | |||||||
|  | using System.Collections.Generic; | ||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Globalization; | ||||||
|  | using System.Linq; | ||||||
|  | using System.Security.Cryptography; | ||||||
|  | using System.Text; | ||||||
|  | using System.Text.Json.Serialization; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.CycloneDX; | ||||||
|  |  | ||||||
|  | internal static class CycloneDxComponentReconciler | ||||||
|  | { | ||||||
|  |     public static CycloneDxReconciliationResult Reconcile(IEnumerable<VexClaim> claims) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(claims); | ||||||
|  |  | ||||||
|  |         var catalog = new ComponentCatalog(); | ||||||
|  |         var diagnostics = new Dictionary<string, SortedSet<string>>(StringComparer.Ordinal); | ||||||
|  |         var componentRefs = new Dictionary<(string VulnerabilityId, string ProductKey), string>(); | ||||||
|  |  | ||||||
|  |         foreach (var claim in claims) | ||||||
|  |         { | ||||||
|  |             if (claim is null) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var component = catalog.GetOrAdd(claim.Product, claim.ProviderId, diagnostics); | ||||||
|  |             componentRefs[(claim.VulnerabilityId, claim.Product.Key)] = component.BomRef; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var components = catalog.Build(); | ||||||
|  |         var orderedDiagnostics = diagnostics.Count == 0 | ||||||
|  |             ? ImmutableDictionary<string, string>.Empty | ||||||
|  |             : diagnostics.ToImmutableDictionary( | ||||||
|  |                 static pair => pair.Key, | ||||||
|  |                 pair => string.Join(",", pair.Value.OrderBy(static value => value, StringComparer.Ordinal)), | ||||||
|  |                 StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         return new CycloneDxReconciliationResult( | ||||||
|  |             components, | ||||||
|  |             componentRefs.ToImmutableDictionary(), | ||||||
|  |             orderedDiagnostics); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class ComponentCatalog | ||||||
|  |     { | ||||||
|  |         private readonly Dictionary<string, MutableComponent> _components = new(StringComparer.Ordinal); | ||||||
|  |         private readonly HashSet<string> _bomRefs = new(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         public MutableComponent GetOrAdd(VexProduct product, string providerId, IDictionary<string, SortedSet<string>> diagnostics) | ||||||
|  |         { | ||||||
|  |             if (_components.TryGetValue(product.Key, out var existing)) | ||||||
|  |             { | ||||||
|  |                 existing.Update(product, providerId, diagnostics); | ||||||
|  |                 return existing; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var bomRef = GenerateBomRef(product); | ||||||
|  |             var component = new MutableComponent(product.Key, bomRef); | ||||||
|  |             component.Update(product, providerId, diagnostics); | ||||||
|  |             _components[product.Key] = component; | ||||||
|  |             return component; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public ImmutableArray<CycloneDxComponentEntry> Build() | ||||||
|  |             => _components.Values | ||||||
|  |                 .Select(static component => component.ToEntry()) | ||||||
|  |                 .OrderBy(static entry => entry.BomRef, StringComparer.Ordinal) | ||||||
|  |                 .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         private string GenerateBomRef(VexProduct product) | ||||||
|  |         { | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Purl)) | ||||||
|  |             { | ||||||
|  |                 var normalized = product.Purl!.Trim(); | ||||||
|  |                 if (_bomRefs.Add(normalized)) | ||||||
|  |                 { | ||||||
|  |                     return normalized; | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var baseRef = Sanitize(product.Key); | ||||||
|  |             if (_bomRefs.Add(baseRef)) | ||||||
|  |             { | ||||||
|  |                 return baseRef; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var hash = ComputeShortHash(product.Key + product.Name); | ||||||
|  |             var candidate = FormattableString.Invariant($"{baseRef}-{hash}"); | ||||||
|  |             while (!_bomRefs.Add(candidate)) | ||||||
|  |             { | ||||||
|  |                 candidate = FormattableString.Invariant($"{candidate}-{hash}"); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return candidate; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static string Sanitize(string value) | ||||||
|  |         { | ||||||
|  |             if (string.IsNullOrWhiteSpace(value)) | ||||||
|  |             { | ||||||
|  |                 return "component"; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var builder = new StringBuilder(value.Length); | ||||||
|  |             foreach (var ch in value) | ||||||
|  |             { | ||||||
|  |                 builder.Append(char.IsLetterOrDigit(ch) ? char.ToLowerInvariant(ch) : '-'); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var sanitized = builder.ToString().Trim('-'); | ||||||
|  |             return string.IsNullOrEmpty(sanitized) ? "component" : sanitized; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static string ComputeShortHash(string value) | ||||||
|  |         { | ||||||
|  |             var bytes = Encoding.UTF8.GetBytes(value); | ||||||
|  |             Span<byte> hash = stackalloc byte[SHA256.HashSizeInBytes]; | ||||||
|  |             SHA256.HashData(bytes, hash); | ||||||
|  |             return Convert.ToHexString(hash[..6]).ToLowerInvariant(); | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class MutableComponent | ||||||
|  |     { | ||||||
|  |         public MutableComponent(string key, string bomRef) | ||||||
|  |         { | ||||||
|  |             ProductKey = key; | ||||||
|  |             BomRef = bomRef; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public string ProductKey { get; } | ||||||
|  |  | ||||||
|  |         public string BomRef { get; } | ||||||
|  |  | ||||||
|  |         private string? _name; | ||||||
|  |         private string? _version; | ||||||
|  |         private string? _purl; | ||||||
|  |         private string? _cpe; | ||||||
|  |         private readonly SortedDictionary<string, string> _properties = new(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         public void Update(VexProduct product, string providerId, IDictionary<string, SortedSet<string>> diagnostics) | ||||||
|  |         { | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Name) && ShouldReplace(_name, product.Name)) | ||||||
|  |             { | ||||||
|  |                 _name = product.Name; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Version) && ShouldReplace(_version, product.Version)) | ||||||
|  |             { | ||||||
|  |                 _version = product.Version; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Purl)) | ||||||
|  |             { | ||||||
|  |                 var trimmed = product.Purl!.Trim(); | ||||||
|  |                 if (string.IsNullOrWhiteSpace(_purl)) | ||||||
|  |                 { | ||||||
|  |                     _purl = trimmed; | ||||||
|  |                 } | ||||||
|  |                 else if (!string.Equals(_purl, trimmed, StringComparison.OrdinalIgnoreCase)) | ||||||
|  |                 { | ||||||
|  |                     AddDiagnostic(diagnostics, "purl_conflict", FormattableString.Invariant($"{ProductKey}:{_purl}->{trimmed}")); | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |             else | ||||||
|  |             { | ||||||
|  |                 AddDiagnostic(diagnostics, "missing_purl", FormattableString.Invariant($"{ProductKey}:{providerId}")); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!string.IsNullOrWhiteSpace(product.Cpe)) | ||||||
|  |             { | ||||||
|  |                 _cpe = product.Cpe; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (product.ComponentIdentifiers.Length > 0) | ||||||
|  |             { | ||||||
|  |                 _properties["stellaops/componentIdentifiers"] = string.Join(';', product.ComponentIdentifiers.OrderBy(static identifier => identifier, StringComparer.OrdinalIgnoreCase)); | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public CycloneDxComponentEntry ToEntry() | ||||||
|  |         { | ||||||
|  |             ImmutableArray<CycloneDxProperty>? properties = _properties.Count == 0 | ||||||
|  |                 ? null | ||||||
|  |                 : _properties.Select(static pair => new CycloneDxProperty(pair.Key, pair.Value)).ToImmutableArray(); | ||||||
|  |  | ||||||
|  |             return new CycloneDxComponentEntry( | ||||||
|  |                 BomRef, | ||||||
|  |                 _name ?? ProductKey, | ||||||
|  |                 _version, | ||||||
|  |                 _purl, | ||||||
|  |                 _cpe, | ||||||
|  |                 properties); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static bool ShouldReplace(string? existing, string candidate) | ||||||
|  |         { | ||||||
|  |             if (string.IsNullOrWhiteSpace(candidate)) | ||||||
|  |             { | ||||||
|  |                 return false; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (string.IsNullOrWhiteSpace(existing)) | ||||||
|  |             { | ||||||
|  |                 return true; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return candidate.Length > existing.Length; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static void AddDiagnostic(IDictionary<string, SortedSet<string>> diagnostics, string key, string value) | ||||||
|  |         { | ||||||
|  |             if (!diagnostics.TryGetValue(key, out var set)) | ||||||
|  |             { | ||||||
|  |                 set = new SortedSet<string>(StringComparer.Ordinal); | ||||||
|  |                 diagnostics[key] = set; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             set.Add(value); | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  | } | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxReconciliationResult( | ||||||
|  |     ImmutableArray<CycloneDxComponentEntry> Components, | ||||||
|  |     ImmutableDictionary<(string VulnerabilityId, string ProductKey), string> ComponentRefs, | ||||||
|  |     ImmutableDictionary<string, string> Diagnostics); | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxComponentEntry( | ||||||
|  |     [property: JsonPropertyName("bom-ref")] string BomRef, | ||||||
|  |     [property: JsonPropertyName("name")] string Name, | ||||||
|  |     [property: JsonPropertyName("version"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Version, | ||||||
|  |     [property: JsonPropertyName("purl"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Purl, | ||||||
|  |     [property: JsonPropertyName("cpe"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Cpe, | ||||||
|  |     [property: JsonPropertyName("properties"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<CycloneDxProperty>? Properties); | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxProperty( | ||||||
|  |     [property: JsonPropertyName("name")] string Name, | ||||||
|  |     [property: JsonPropertyName("value")] string Value); | ||||||
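The reconciler's `GenerateBomRef` and the CSAF exporter's `GenerateProductId` share most of their identifier strategy: the CycloneDX variant prefers the upstream PURL, and both then fall back to a sanitized product key disambiguated by a short SHA-256 suffix on collision. For readers comparing the two implementations, a standalone distillation of that strategy (sketch only; the real classes keep their own state):

```csharp
// Self-contained sketch of the stable-identifier strategy shared by
// GenerateProductId (CSAF) and GenerateBomRef (CycloneDX) in this commit.
// The PURL preference applies only to the CycloneDX variant.
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;

static class StableIdSketch
{
    public static string Next(string? purl, string key, HashSet<string> used)
    {
        // CycloneDX only: prefer the upstream PURL when present and not already taken.
        if (!string.IsNullOrWhiteSpace(purl) && used.Add(purl.Trim()))
        {
            return purl.Trim();
        }

        var sanitized = Sanitize(key);
        if (used.Add(sanitized))
        {
            return sanitized;
        }

        // Collision: suffix the first six bytes of SHA-256(key) as lower-case hex.
        var hash = Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(key))[..6]).ToLowerInvariant();
        var candidate = $"{sanitized}-{hash}";
        while (!used.Add(candidate))
        {
            candidate = $"{candidate}-{hash}";
        }

        return candidate;
    }

    private static string Sanitize(string value)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            return "component";
        }

        var builder = new StringBuilder(value.Length);
        foreach (var ch in value)
        {
            // Lower-case alphanumerics survive; everything else becomes '-'.
            builder.Append(char.IsLetterOrDigit(ch) ? char.ToLowerInvariant(ch) : '-');
        }

        var trimmed = builder.ToString().Trim('-');
        return trimmed.Length == 0 ? "component" : trimmed;
    }
}
```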
| @@ -0,0 +1,228 @@ | |||||||
|  | using System.Collections.Generic; | ||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Globalization; | ||||||
|  | using System.IO; | ||||||
|  | using System.Linq; | ||||||
|  | using System.Security.Cryptography; | ||||||
|  | using System.Text; | ||||||
|  | using System.Text.Json.Serialization; | ||||||
|  | using System.Threading; | ||||||
|  | using System.Threading.Tasks; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.CycloneDX; | ||||||
|  |  | ||||||
|  | /// <summary> | ||||||
|  | /// Serialises normalized VEX claims into CycloneDX VEX documents with reconciled component references. | ||||||
|  | /// </summary> | ||||||
|  | public sealed class CycloneDxExporter : IVexExporter | ||||||
|  | { | ||||||
|  |     public VexExportFormat Format => VexExportFormat.CycloneDx; | ||||||
|  |  | ||||||
|  |     public VexContentAddress Digest(VexExportRequest request) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(request); | ||||||
|  |         var document = BuildDocument(request, out _); | ||||||
|  |         var json = VexCanonicalJsonSerializer.Serialize(document); | ||||||
|  |         return ComputeDigest(json); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public async ValueTask<VexExportResult> SerializeAsync( | ||||||
|  |         VexExportRequest request, | ||||||
|  |         Stream output, | ||||||
|  |         CancellationToken cancellationToken) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(request); | ||||||
|  |         ArgumentNullException.ThrowIfNull(output); | ||||||
|  |  | ||||||
|  |         var document = BuildDocument(request, out var metadata); | ||||||
|  |         var json = VexCanonicalJsonSerializer.Serialize(document); | ||||||
|  |         var digest = ComputeDigest(json); | ||||||
|  |         var buffer = Encoding.UTF8.GetBytes(json); | ||||||
|  |         await output.WriteAsync(buffer, 0, buffer.Length, cancellationToken).ConfigureAwait(false); | ||||||
|  |         return new VexExportResult(digest, buffer.LongLength, metadata); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private CycloneDxExportDocument BuildDocument(VexExportRequest request, out ImmutableDictionary<string, string> metadata) | ||||||
|  |     { | ||||||
|  |         var signature = VexQuerySignature.FromQuery(request.Query); | ||||||
|  |         var signatureHash = signature.ComputeHash(); | ||||||
|  |         var generatedAt = request.GeneratedAt.UtcDateTime.ToString("O", CultureInfo.InvariantCulture); | ||||||
|  |  | ||||||
|  |         var reconciliation = CycloneDxComponentReconciler.Reconcile(request.Claims); | ||||||
|  |         var vulnerabilityEntries = BuildVulnerabilities(request.Claims, reconciliation.ComponentRefs); | ||||||
|  |  | ||||||
|  |         var missingJustifications = request.Claims | ||||||
|  |             .Where(static claim => claim.Status == VexClaimStatus.NotAffected && claim.Justification is null) | ||||||
|  |             .Select(static claim => FormattableString.Invariant($"{claim.VulnerabilityId}:{claim.Product.Key}")) | ||||||
|  |             .Distinct(StringComparer.Ordinal) | ||||||
|  |             .OrderBy(static key => key, StringComparer.Ordinal) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         var properties = ImmutableArray.Create(new CycloneDxProperty("stellaops/querySignature", signature.Value)); | ||||||
|  |  | ||||||
|  |         metadata = BuildMetadata(signature, reconciliation.Diagnostics, generatedAt, vulnerabilityEntries.Length, reconciliation.Components.Length, missingJustifications); | ||||||
|  |  | ||||||
|  |         var document = new CycloneDxExportDocument( | ||||||
|  |             BomFormat: "CycloneDX", | ||||||
|  |             SpecVersion: "1.6", | ||||||
|  |             SerialNumber: FormattableString.Invariant($"urn:uuid:{BuildDeterministicGuid(signatureHash.Digest)}"), | ||||||
|  |             Version: 1, | ||||||
|  |             Metadata: new CycloneDxMetadata(generatedAt), | ||||||
|  |             Components: reconciliation.Components, | ||||||
|  |             Vulnerabilities: vulnerabilityEntries, | ||||||
|  |             Properties: properties); | ||||||
|  |  | ||||||
|  |         return document; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static ImmutableArray<CycloneDxVulnerabilityEntry> BuildVulnerabilities( | ||||||
|  |         ImmutableArray<VexClaim> claims, | ||||||
|  |         ImmutableDictionary<(string VulnerabilityId, string ProductKey), string> componentRefs) | ||||||
|  |     { | ||||||
|  |         var entries = ImmutableArray.CreateBuilder<CycloneDxVulnerabilityEntry>(); | ||||||
|  |  | ||||||
|  |         foreach (var claim in claims) | ||||||
|  |         { | ||||||
|  |             if (!componentRefs.TryGetValue((claim.VulnerabilityId, claim.Product.Key), out var componentRef)) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var analysis = new CycloneDxAnalysis( | ||||||
|  |                 State: MapStatus(claim.Status), | ||||||
|  |                 Justification: claim.Justification?.ToString().ToLowerInvariant(), | ||||||
|  |                 Responses: null); | ||||||
|  |  | ||||||
|  |             var affects = ImmutableArray.Create(new CycloneDxAffectEntry(componentRef)); | ||||||
|  |  | ||||||
|  |             var properties = ImmutableArray.Create( | ||||||
|  |                 new CycloneDxProperty("stellaops/providerId", claim.ProviderId), | ||||||
|  |                 new CycloneDxProperty("stellaops/documentDigest", claim.Document.Digest)); | ||||||
|  |  | ||||||
|  |             var vulnerabilityId = claim.VulnerabilityId; | ||||||
|  |             var bomRef = FormattableString.Invariant($"{vulnerabilityId}#{Normalize(componentRef)}"); | ||||||
|  |  | ||||||
|  |             entries.Add(new CycloneDxVulnerabilityEntry( | ||||||
|  |                 Id: vulnerabilityId, | ||||||
|  |                 BomRef: bomRef, | ||||||
|  |                 Description: claim.Detail, | ||||||
|  |                 Analysis: analysis, | ||||||
|  |                 Affects: affects, | ||||||
|  |                 Properties: properties)); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return entries | ||||||
|  |             .ToImmutable() | ||||||
|  |             .OrderBy(static entry => entry.Id, StringComparer.Ordinal) | ||||||
|  |             .ThenBy(static entry => entry.BomRef, StringComparer.Ordinal) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string Normalize(string value) | ||||||
|  |     { | ||||||
|  |         if (string.IsNullOrWhiteSpace(value)) | ||||||
|  |         { | ||||||
|  |             return "component"; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var builder = new StringBuilder(value.Length); | ||||||
|  |         foreach (var ch in value) | ||||||
|  |         { | ||||||
|  |             builder.Append(char.IsLetterOrDigit(ch) ? char.ToLowerInvariant(ch) : '-'); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var normalized = builder.ToString().Trim('-'); | ||||||
|  |         return string.IsNullOrEmpty(normalized) ? "component" : normalized; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string MapStatus(VexClaimStatus status) | ||||||
|  |         => status switch | ||||||
|  |         { | ||||||
|  |             VexClaimStatus.Affected => "affected", | ||||||
|  |             VexClaimStatus.NotAffected => "not_affected", | ||||||
|  |             VexClaimStatus.Fixed => "resolved", | ||||||
|  |             VexClaimStatus.UnderInvestigation => "under_investigation", | ||||||
|  |             _ => "unknown", | ||||||
|  |         }; | ||||||
|  |  | ||||||
|  |     private static ImmutableDictionary<string, string> BuildMetadata( | ||||||
|  |         VexQuerySignature signature, | ||||||
|  |         ImmutableDictionary<string, string> diagnostics, | ||||||
|  |         string generatedAt, | ||||||
|  |         int vulnerabilityCount, | ||||||
|  |         int componentCount, | ||||||
|  |         ImmutableArray<string> missingJustifications) | ||||||
|  |     { | ||||||
|  |         var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal); | ||||||
|  |         builder["cyclonedx.querySignature"] = signature.Value; | ||||||
|  |         builder["cyclonedx.generatedAt"] = generatedAt; | ||||||
|  |         builder["cyclonedx.vulnerabilityCount"] = vulnerabilityCount.ToString(CultureInfo.InvariantCulture); | ||||||
|  |         builder["cyclonedx.componentCount"] = componentCount.ToString(CultureInfo.InvariantCulture); | ||||||
|  |  | ||||||
|  |         foreach (var diagnostic in diagnostics.OrderBy(static pair => pair.Key, StringComparer.Ordinal)) | ||||||
|  |         { | ||||||
|  |             builder[$"cyclonedx.{diagnostic.Key}"] = diagnostic.Value; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         if (!missingJustifications.IsDefaultOrEmpty && missingJustifications.Length > 0) | ||||||
|  |         { | ||||||
|  |             builder["policy.justification_missing"] = string.Join(",", missingJustifications); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return builder.ToImmutable(); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string BuildDeterministicGuid(string digest) | ||||||
|  |     { | ||||||
|  |         if (string.IsNullOrWhiteSpace(digest) || digest.Length < 32) | ||||||
|  |         { | ||||||
|  |             return Guid.NewGuid().ToString(); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var hex = digest[..32]; | ||||||
|  |         var bytes = Enumerable.Range(0, hex.Length / 2) | ||||||
|  |             .Select(i => byte.Parse(hex.Substring(i * 2, 2), NumberStyles.HexNumber, CultureInfo.InvariantCulture)) | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         return new Guid(bytes).ToString(); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static VexContentAddress ComputeDigest(string json) | ||||||
|  |     { | ||||||
|  |         var bytes = Encoding.UTF8.GetBytes(json); | ||||||
|  |         Span<byte> hash = stackalloc byte[SHA256.HashSizeInBytes]; | ||||||
|  |         SHA256.HashData(bytes, hash); | ||||||
|  |         var digest = Convert.ToHexString(hash).ToLowerInvariant(); | ||||||
|  |         return new VexContentAddress("sha256", digest); | ||||||
|  |     } | ||||||
|  | } | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxExportDocument( | ||||||
|  |     [property: JsonPropertyName("bomFormat")] string BomFormat, | ||||||
|  |     [property: JsonPropertyName("specVersion")] string SpecVersion, | ||||||
|  |     [property: JsonPropertyName("serialNumber")] string SerialNumber, | ||||||
|  |     [property: JsonPropertyName("version")] int Version, | ||||||
|  |     [property: JsonPropertyName("metadata")] CycloneDxMetadata Metadata, | ||||||
|  |     [property: JsonPropertyName("components")] ImmutableArray<CycloneDxComponentEntry> Components, | ||||||
|  |     [property: JsonPropertyName("vulnerabilities")] ImmutableArray<CycloneDxVulnerabilityEntry> Vulnerabilities, | ||||||
|  |     [property: JsonPropertyName("properties"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<CycloneDxProperty>? Properties); | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxMetadata( | ||||||
|  |     [property: JsonPropertyName("timestamp")] string Timestamp); | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxVulnerabilityEntry( | ||||||
|  |     [property: JsonPropertyName("id")] string Id, | ||||||
|  |     [property: JsonPropertyName("bom-ref")] string BomRef, | ||||||
|  |     [property: JsonPropertyName("description"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Description, | ||||||
|  |     [property: JsonPropertyName("analysis")] CycloneDxAnalysis Analysis, | ||||||
|  |     [property: JsonPropertyName("affects")] ImmutableArray<CycloneDxAffectEntry> Affects, | ||||||
|  |     [property: JsonPropertyName("properties")] ImmutableArray<CycloneDxProperty> Properties); | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxAnalysis( | ||||||
|  |     [property: JsonPropertyName("state")] string State, | ||||||
|  |     [property: JsonPropertyName("justification"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Justification, | ||||||
|  |     [property: JsonPropertyName("response"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] ImmutableArray<string>? Responses); | ||||||
|  |  | ||||||
|  | internal sealed record CycloneDxAffectEntry( | ||||||
|  |     [property: JsonPropertyName("ref")] string Reference); | ||||||
| @@ -9,6 +9,7 @@ public static class CycloneDxFormatsServiceCollectionExtensions | |||||||
|     { |     { | ||||||
|         ArgumentNullException.ThrowIfNull(services); |         ArgumentNullException.ThrowIfNull(services); | ||||||
|         services.AddSingleton<IVexNormalizer, CycloneDxNormalizer>(); |         services.AddSingleton<IVexNormalizer, CycloneDxNormalizer>(); | ||||||
|  |         services.AddSingleton<IVexExporter, CycloneDxExporter>(); | ||||||
|         return services; |         return services; | ||||||
|     } |     } | ||||||
| } | } | ||||||
|   | |||||||
| @@ -3,5 +3,5 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC | |||||||
| | Task | Owner(s) | Depends on | Notes | | | Task | Owner(s) | Depends on | Notes | | ||||||
| |---|---|---|---| | |---|---|---|---| | ||||||
| |EXCITITOR-FMT-CYCLONE-01-001 – CycloneDX VEX normalizer|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – CycloneDX normalizer parses `analysis` data, resolves component references, and emits canonical `VexClaim`s; regression lives in `CycloneDxNormalizerTests`.| | |EXCITITOR-FMT-CYCLONE-01-001 – CycloneDX VEX normalizer|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – CycloneDX normalizer parses `analysis` data, resolves component references, and emits canonical `VexClaim`s; regression lives in `CycloneDxNormalizerTests`.| | ||||||
| |EXCITITOR-FMT-CYCLONE-01-002 – Component reference reconciliation|Team Excititor Formats|EXCITITOR-FMT-CYCLONE-01-001|**DOING (2025-10-19)** – Prereq EXCITITOR-FMT-CYCLONE-01-001 confirmed DONE; proceeding with reference reconciliation helpers and diagnostics for missing SBOM links.| | |EXCITITOR-FMT-CYCLONE-01-002 – Component reference reconciliation|Team Excititor Formats|EXCITITOR-FMT-CYCLONE-01-001|**DONE (2025-10-29)** – Added reconciler producing stable bom-refs, aggregating component metadata, and reporting missing PURL diagnostics for policy gating.| | ||||||
| |EXCITITOR-FMT-CYCLONE-01-003 – CycloneDX export serializer|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-CYCLONE-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-EXPORT-01-001 & EXCITITOR-FMT-CYCLONE-01-001 verified DONE; initiating deterministic CycloneDX VEX exporter work.| | |EXCITITOR-FMT-CYCLONE-01-003 – CycloneDX export serializer|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-CYCLONE-01-001|**DONE (2025-10-29)** – Implemented CycloneDX VEX exporter emitting reconciled components, vulnerability analysis blocks, and canonical metadata.| | ||||||
|   | |||||||
| @@ -0,0 +1,217 @@ | |||||||
|  | using System.Collections.Generic; | ||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Globalization; | ||||||
|  | using System.IO; | ||||||
|  | using System.Linq; | ||||||
|  | using System.Security.Cryptography; | ||||||
|  | using System.Text; | ||||||
|  | using System.Text.Json.Serialization; | ||||||
|  | using System.Threading; | ||||||
|  | using System.Threading.Tasks; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.OpenVEX; | ||||||
|  |  | ||||||
|  | /// <summary> | ||||||
|  | /// Serializes merged VEX statements into canonical OpenVEX export documents. | ||||||
|  | /// </summary> | ||||||
|  | public sealed class OpenVexExporter : IVexExporter | ||||||
|  | { | ||||||
|  |     public OpenVexExporter() | ||||||
|  |     { | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public VexExportFormat Format => VexExportFormat.OpenVex; | ||||||
|  |  | ||||||
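|  |     // Computes the content address for the canonical OpenVEX document without writing it to a stream.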
|  |     public VexContentAddress Digest(VexExportRequest request) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(request); | ||||||
|  |         var document = BuildDocument(request, out _); | ||||||
|  |         var json = VexCanonicalJsonSerializer.Serialize(document); | ||||||
|  |         return ComputeDigest(json); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public async ValueTask<VexExportResult> SerializeAsync( | ||||||
|  |         VexExportRequest request, | ||||||
|  |         Stream output, | ||||||
|  |         CancellationToken cancellationToken) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(request); | ||||||
|  |         ArgumentNullException.ThrowIfNull(output); | ||||||
|  |  | ||||||
|  |         var document = BuildDocument(request, out var exportMetadata);
|  |         var json = VexCanonicalJsonSerializer.Serialize(document);
|  |         var digest = ComputeDigest(json); | ||||||
|  |         var buffer = Encoding.UTF8.GetBytes(json); | ||||||
|  |         await output.WriteAsync(buffer.AsMemory(), cancellationToken).ConfigureAwait(false);
|  |         return new VexExportResult(digest, buffer.LongLength, exportMetadata); | ||||||
|  |     } | ||||||
|  |  | ||||||
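|  |     // Merges the raw claims into canonical statements, assembles the document and metadata
|  |     // sections, and surfaces export metadata (signature, counts, diagnostics) for the caller.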
|  |     private OpenVexExportDocument BuildDocument(VexExportRequest request, out ImmutableDictionary<string, string> metadata) | ||||||
|  |     { | ||||||
|  |         var mergeResult = OpenVexStatementMerger.Merge(request.Claims); | ||||||
|  |         var signature = VexQuerySignature.FromQuery(request.Query); | ||||||
|  |         var signatureHash = signature.ComputeHash(); | ||||||
|  |         var generatedAt = request.GeneratedAt.UtcDateTime.ToString("O", CultureInfo.InvariantCulture); | ||||||
|  |         var sourceProviders = request.Claims | ||||||
|  |             .Select(static claim => claim.ProviderId) | ||||||
|  |             .Distinct(StringComparer.Ordinal) | ||||||
|  |             .OrderBy(static provider => provider, StringComparer.Ordinal) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         var statements = mergeResult.Statements | ||||||
|  |             .Select(statement => MapStatement(statement)) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         var document = new OpenVexDocumentSection( | ||||||
|  |             Id: FormattableString.Invariant($"openvex:export:{signatureHash.Digest}"), | ||||||
|  |             Author: "StellaOps Excititor", | ||||||
|  |             Version: "1", | ||||||
|  |             Created: generatedAt, | ||||||
|  |             LastUpdated: generatedAt, | ||||||
|  |             Profile: "stellaops-export/v1"); | ||||||
|  |  | ||||||
|  |         var metadataSection = new OpenVexExportMetadata( | ||||||
|  |             generatedAt, | ||||||
|  |             signature.Value, | ||||||
|  |             sourceProviders, | ||||||
|  |             mergeResult.Diagnostics); | ||||||
|  |  | ||||||
|  |         metadata = BuildMetadata(signature, mergeResult, sourceProviders, generatedAt); | ||||||
|  |  | ||||||
|  |         return new OpenVexExportDocument(document, statements, metadataSection); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static ImmutableDictionary<string, string> BuildMetadata( | ||||||
|  |         VexQuerySignature signature, | ||||||
|  |         OpenVexMergeResult mergeResult, | ||||||
|  |         ImmutableArray<string> sourceProviders, | ||||||
|  |         string generatedAt) | ||||||
|  |     { | ||||||
|  |         var metadataBuilder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal); | ||||||
|  |         metadataBuilder["openvex.querySignature"] = signature.Value; | ||||||
|  |         metadataBuilder["openvex.generatedAt"] = generatedAt; | ||||||
|  |         metadataBuilder["openvex.statementCount"] = mergeResult.Statements.Length.ToString(CultureInfo.InvariantCulture); | ||||||
|  |         metadataBuilder["openvex.providerCount"] = sourceProviders.Length.ToString(CultureInfo.InvariantCulture); | ||||||
|  |  | ||||||
|  |         var sourceCount = mergeResult.Statements.Sum(static statement => statement.Sources.Length); | ||||||
|  |         metadataBuilder["openvex.sourceCount"] = sourceCount.ToString(CultureInfo.InvariantCulture); | ||||||
|  |  | ||||||
|  |         foreach (var diagnostic in mergeResult.Diagnostics.OrderBy(static pair => pair.Key, StringComparer.Ordinal)) | ||||||
|  |         { | ||||||
|  |             metadataBuilder[$"openvex.diagnostic.{diagnostic.Key}"] = diagnostic.Value; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return metadataBuilder.ToImmutable(); | ||||||
|  |     } | ||||||
|  |  | ||||||
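|  |     // Projects a merged statement into the export shape: lowercase status/justification values,
|  |     // ISO-8601 timestamps, and per-provider source entries carrying provenance.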
|  |     private static OpenVexExportStatement MapStatement(OpenVexMergedStatement statement) | ||||||
|  |     { | ||||||
|  |         var products = ImmutableArray.Create( | ||||||
|  |             new OpenVexExportProduct( | ||||||
|  |                 Id: statement.Product.Key, | ||||||
|  |                 Name: statement.Product.Name ?? statement.Product.Key, | ||||||
|  |                 Version: statement.Product.Version, | ||||||
|  |                 Purl: statement.Product.Purl, | ||||||
|  |                 Cpe: statement.Product.Cpe)); | ||||||
|  |  | ||||||
|  |         var sources = statement.Sources | ||||||
|  |             .Select(source => new OpenVexExportSource( | ||||||
|  |                 Provider: source.ProviderId, | ||||||
|  |                 Status: source.Status.ToString().ToLowerInvariant(), | ||||||
|  |                 Justification: source.Justification?.ToString().ToLowerInvariant(), | ||||||
|  |                 DocumentDigest: source.DocumentDigest, | ||||||
|  |                 SourceUri: source.DocumentSource.ToString(), | ||||||
|  |                 Detail: source.Detail, | ||||||
|  |                 FirstObserved: source.FirstSeen.UtcDateTime.ToString("O", CultureInfo.InvariantCulture), | ||||||
|  |                 LastObserved: source.LastSeen.UtcDateTime.ToString("O", CultureInfo.InvariantCulture))) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         var statementId = FormattableString.Invariant($"{statement.VulnerabilityId}#{NormalizeProductKey(statement.Product.Key)}"); | ||||||
|  |  | ||||||
|  |         return new OpenVexExportStatement( | ||||||
|  |             Id: statementId, | ||||||
|  |             Vulnerability: statement.VulnerabilityId, | ||||||
|  |             Status: statement.Status.ToString().ToLowerInvariant(), | ||||||
|  |             Justification: statement.Justification?.ToString().ToLowerInvariant(), | ||||||
|  |             Timestamp: statement.FirstObserved.UtcDateTime.ToString("O", CultureInfo.InvariantCulture), | ||||||
|  |             LastUpdated: statement.LastObserved.UtcDateTime.ToString("O", CultureInfo.InvariantCulture), | ||||||
|  |             Products: products, | ||||||
|  |             Statement: statement.Detail, | ||||||
|  |             Sources: sources); | ||||||
|  |     } | ||||||
|  |  | ||||||
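|  |     // Lowercases the product key and replaces non-alphanumeric characters with '-' so statement
|  |     // identifiers remain stable across exports.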
|  |     private static string NormalizeProductKey(string key) | ||||||
|  |     { | ||||||
|  |         if (string.IsNullOrWhiteSpace(key)) | ||||||
|  |         { | ||||||
|  |             return "unknown"; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var builder = new StringBuilder(key.Length); | ||||||
|  |         foreach (var ch in key) | ||||||
|  |         { | ||||||
|  |             builder.Append(char.IsLetterOrDigit(ch) ? char.ToLowerInvariant(ch) : '-'); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var normalized = builder.ToString().Trim('-'); | ||||||
|  |         return string.IsNullOrEmpty(normalized) ? "unknown" : normalized; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static VexContentAddress ComputeDigest(string json) | ||||||
|  |     { | ||||||
|  |         var bytes = Encoding.UTF8.GetBytes(json); | ||||||
|  |         Span<byte> hash = stackalloc byte[SHA256.HashSizeInBytes]; | ||||||
|  |         SHA256.HashData(bytes, hash); | ||||||
|  |         var digest = Convert.ToHexString(hash).ToLowerInvariant(); | ||||||
|  |         return new VexContentAddress("sha256", digest); | ||||||
|  |     } | ||||||
|  | } | ||||||
|  |  | ||||||
|  | internal sealed record OpenVexExportDocument( | ||||||
|  |     OpenVexDocumentSection Document, | ||||||
|  |     ImmutableArray<OpenVexExportStatement> Statements, | ||||||
|  |     OpenVexExportMetadata Metadata); | ||||||
|  |  | ||||||
|  | internal sealed record OpenVexDocumentSection( | ||||||
|  |     [property: JsonPropertyName("@context")] string Context = "https://openvex.dev/ns/v0.2", | ||||||
|  |     [property: JsonPropertyName("id")] string Id = "", | ||||||
|  |     [property: JsonPropertyName("author")] string Author = "", | ||||||
|  |     [property: JsonPropertyName("version")] string Version = "1", | ||||||
|  |     [property: JsonPropertyName("created")] string Created = "", | ||||||
|  |     [property: JsonPropertyName("last_updated")] string LastUpdated = "", | ||||||
|  |     [property: JsonPropertyName("profile")] string Profile = ""); | ||||||
|  |  | ||||||
|  | internal sealed record OpenVexExportStatement( | ||||||
|  |     [property: JsonPropertyName("id")] string Id, | ||||||
|  |     [property: JsonPropertyName("vulnerability")] string Vulnerability, | ||||||
|  |     [property: JsonPropertyName("status")] string Status, | ||||||
|  |     [property: JsonPropertyName("justification"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Justification, | ||||||
|  |     [property: JsonPropertyName("timestamp")] string Timestamp, | ||||||
|  |     [property: JsonPropertyName("last_updated")] string LastUpdated, | ||||||
|  |     [property: JsonPropertyName("products")] ImmutableArray<OpenVexExportProduct> Products, | ||||||
|  |     [property: JsonPropertyName("statement"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Statement, | ||||||
|  |     [property: JsonPropertyName("sources")] ImmutableArray<OpenVexExportSource> Sources); | ||||||
|  |  | ||||||
|  | internal sealed record OpenVexExportProduct( | ||||||
|  |     [property: JsonPropertyName("id")] string Id, | ||||||
|  |     [property: JsonPropertyName("name")] string Name, | ||||||
|  |     [property: JsonPropertyName("version"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Version, | ||||||
|  |     [property: JsonPropertyName("purl"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Purl, | ||||||
|  |     [property: JsonPropertyName("cpe"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Cpe); | ||||||
|  |  | ||||||
|  | internal sealed record OpenVexExportSource( | ||||||
|  |     [property: JsonPropertyName("provider")] string Provider, | ||||||
|  |     [property: JsonPropertyName("status")] string Status, | ||||||
|  |     [property: JsonPropertyName("justification"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Justification, | ||||||
|  |     [property: JsonPropertyName("document_digest")] string DocumentDigest, | ||||||
|  |     [property: JsonPropertyName("source_uri")] string SourceUri, | ||||||
|  |     [property: JsonPropertyName("detail"), JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)] string? Detail, | ||||||
|  |     [property: JsonPropertyName("first_observed")] string FirstObserved, | ||||||
|  |     [property: JsonPropertyName("last_observed")] string LastObserved); | ||||||
|  |  | ||||||
|  | internal sealed record OpenVexExportMetadata( | ||||||
|  |     [property: JsonPropertyName("generated_at")] string GeneratedAt, | ||||||
|  |     [property: JsonPropertyName("query_signature")] string QuerySignature, | ||||||
|  |     [property: JsonPropertyName("source_providers")] ImmutableArray<string> SourceProviders, | ||||||
|  |     [property: JsonPropertyName("diagnostics")] ImmutableDictionary<string, string> Diagnostics); | ||||||
| @@ -0,0 +1,282 @@ | |||||||
|  | using System.Collections.Generic; | ||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Globalization; | ||||||
|  | using System.Linq; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.OpenVEX; | ||||||
|  |  | ||||||
|  | /// <summary> | ||||||
|  | /// Provides deterministic merging utilities for OpenVEX statements derived from normalized VEX claims. | ||||||
|  | /// </summary> | ||||||
|  | public static class OpenVexStatementMerger | ||||||
|  | { | ||||||
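|  |     // Conflict resolution favours the highest-risk status: affected > under_investigation > fixed > not_affected.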
|  |     private static readonly ImmutableDictionary<VexClaimStatus, int> StatusRiskPrecedence = new Dictionary<VexClaimStatus, int> | ||||||
|  |     { | ||||||
|  |         [VexClaimStatus.Affected] = 3, | ||||||
|  |         [VexClaimStatus.UnderInvestigation] = 2, | ||||||
|  |         [VexClaimStatus.Fixed] = 1, | ||||||
|  |         [VexClaimStatus.NotAffected] = 0, | ||||||
|  |     }.ToImmutableDictionary(); | ||||||
|  |  | ||||||
|  |     public static OpenVexMergeResult Merge(IEnumerable<VexClaim> claims) | ||||||
|  |     { | ||||||
|  |         ArgumentNullException.ThrowIfNull(claims); | ||||||
|  |  | ||||||
|  |         var statements = new List<OpenVexMergedStatement>(); | ||||||
|  |         var diagnostics = new Dictionary<string, SortedSet<string>>(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         foreach (var group in claims | ||||||
|  |                      .Where(static claim => claim is not null) | ||||||
|  |                      .GroupBy(static claim => (claim.VulnerabilityId, claim.Product.Key))) | ||||||
|  |         { | ||||||
|  |             var orderedClaims = group | ||||||
|  |                 .OrderBy(static claim => claim.ProviderId, StringComparer.Ordinal) | ||||||
|  |                 .ThenBy(static claim => claim.Document.Digest, StringComparer.Ordinal) | ||||||
|  |                 .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |             if (orderedClaims.IsDefaultOrEmpty) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var mergedProduct = MergeProduct(orderedClaims); | ||||||
|  |             var sources = BuildSources(orderedClaims); | ||||||
|  |             var firstSeen = orderedClaims.Min(static claim => claim.FirstSeen); | ||||||
|  |             var lastSeen = orderedClaims.Max(static claim => claim.LastSeen); | ||||||
|  |             var statusSet = orderedClaims | ||||||
|  |                 .Select(static claim => claim.Status) | ||||||
|  |                 .Distinct() | ||||||
|  |                 .ToArray(); | ||||||
|  |  | ||||||
|  |             if (statusSet.Length > 1) | ||||||
|  |             { | ||||||
|  |                 AddDiagnostic( | ||||||
|  |                     diagnostics, | ||||||
|  |                     "openvex.status_conflict", | ||||||
|  |                     FormattableString.Invariant($"{group.Key.VulnerabilityId}:{group.Key.Key}={string.Join('|', statusSet.Select(static status => status.ToString().ToLowerInvariant()))}")); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var canonicalStatus = SelectCanonicalStatus(statusSet); | ||||||
|  |             var justification = SelectJustification(canonicalStatus, orderedClaims, diagnostics, group.Key); | ||||||
|  |  | ||||||
|  |             if (canonicalStatus == VexClaimStatus.NotAffected && justification is null) | ||||||
|  |             { | ||||||
|  |                 AddDiagnostic( | ||||||
|  |                     diagnostics, | ||||||
|  |                     "policy.justification_missing", | ||||||
|  |                     FormattableString.Invariant($"{group.Key.VulnerabilityId}:{group.Key.Key}")); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var detail = BuildDetail(orderedClaims); | ||||||
|  |  | ||||||
|  |             statements.Add(new OpenVexMergedStatement( | ||||||
|  |                 group.Key.VulnerabilityId, | ||||||
|  |                 mergedProduct, | ||||||
|  |                 canonicalStatus, | ||||||
|  |                 justification, | ||||||
|  |                 detail, | ||||||
|  |                 sources, | ||||||
|  |                 firstSeen, | ||||||
|  |                 lastSeen)); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var orderedStatements = statements | ||||||
|  |             .OrderBy(static statement => statement.VulnerabilityId, StringComparer.Ordinal) | ||||||
|  |             .ThenBy(static statement => statement.Product.Key, StringComparer.Ordinal) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         var orderedDiagnostics = diagnostics.Count == 0 | ||||||
|  |             ? ImmutableDictionary<string, string>.Empty | ||||||
|  |             : diagnostics.ToImmutableDictionary( | ||||||
|  |                 static pair => pair.Key, | ||||||
|  |                 pair => string.Join(",", pair.Value.OrderBy(static entry => entry, StringComparer.Ordinal)), | ||||||
|  |                 StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |         return new OpenVexMergeResult(orderedStatements, orderedDiagnostics); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static VexClaimStatus SelectCanonicalStatus(IReadOnlyCollection<VexClaimStatus> statuses) | ||||||
|  |     { | ||||||
|  |         if (statuses.Count == 0) | ||||||
|  |         { | ||||||
|  |             return VexClaimStatus.UnderInvestigation; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return statuses | ||||||
|  |             .OrderByDescending(static status => StatusRiskPrecedence.GetValueOrDefault(status, -1)) | ||||||
|  |             .ThenBy(static status => status.ToString(), StringComparer.Ordinal) | ||||||
|  |             .First(); | ||||||
|  |     } | ||||||
|  |  | ||||||
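|  |     // Picks a justification from claims matching the canonical status (falling back to all claims),
|  |     // recording a diagnostic when providers disagree and breaking ties by ordinal name order.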
|  |     private static VexJustification? SelectJustification( | ||||||
|  |         VexClaimStatus canonicalStatus, | ||||||
|  |         ImmutableArray<VexClaim> claims, | ||||||
|  |         IDictionary<string, SortedSet<string>> diagnostics, | ||||||
|  |         (string Vulnerability, string ProductKey) groupKey) | ||||||
|  |     { | ||||||
|  |         var relevantClaims = claims | ||||||
|  |             .Where(claim => claim.Status == canonicalStatus) | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         if (relevantClaims.Length == 0) | ||||||
|  |         { | ||||||
|  |             relevantClaims = claims.ToArray(); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var justifications = relevantClaims | ||||||
|  |             .Select(static claim => claim.Justification) | ||||||
|  |             .Where(static justification => justification is not null) | ||||||
|  |             .Cast<VexJustification>() | ||||||
|  |             .Distinct() | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         if (justifications.Length == 0) | ||||||
|  |         { | ||||||
|  |             return null; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         if (justifications.Length > 1) | ||||||
|  |         { | ||||||
|  |             AddDiagnostic( | ||||||
|  |                 diagnostics, | ||||||
|  |                 "openvex.justification_conflict", | ||||||
|  |                 FormattableString.Invariant($"{groupKey.Vulnerability}:{groupKey.ProductKey}={string.Join('|', justifications.Select(static justification => justification.ToString().ToLowerInvariant()))}")); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return justifications | ||||||
|  |             .OrderBy(static justification => justification.ToString(), StringComparer.Ordinal) | ||||||
|  |             .First(); | ||||||
|  |     } | ||||||
|  |  | ||||||
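|  |     // Deduplicates non-empty claim details and joins them, ordinally sorted, into a single string.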
|  |     private static string? BuildDetail(ImmutableArray<VexClaim> claims) | ||||||
|  |     { | ||||||
|  |         var details = claims | ||||||
|  |             .Select(static claim => claim.Detail) | ||||||
|  |             .Where(static detail => !string.IsNullOrWhiteSpace(detail)) | ||||||
|  |             .Select(static detail => detail!.Trim()) | ||||||
|  |             .Distinct(StringComparer.Ordinal) | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         if (details.Length == 0) | ||||||
|  |         { | ||||||
|  |             return null; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return string.Join("; ", details.OrderBy(static detail => detail, StringComparer.Ordinal)); | ||||||
|  |     } | ||||||
|  |  | ||||||
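|  |     // Captures one provenance entry per claim, ordered by provider then document digest for determinism.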
|  |     private static ImmutableArray<OpenVexSourceEntry> BuildSources(ImmutableArray<VexClaim> claims) | ||||||
|  |     { | ||||||
|  |         var builder = ImmutableArray.CreateBuilder<OpenVexSourceEntry>(claims.Length); | ||||||
|  |         foreach (var claim in claims) | ||||||
|  |         { | ||||||
|  |             builder.Add(new OpenVexSourceEntry( | ||||||
|  |                 claim.ProviderId, | ||||||
|  |                 claim.Status, | ||||||
|  |                 claim.Justification, | ||||||
|  |                 claim.Document.Digest, | ||||||
|  |                 claim.Document.SourceUri, | ||||||
|  |                 claim.Detail, | ||||||
|  |                 claim.FirstSeen, | ||||||
|  |                 claim.LastSeen)); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return builder | ||||||
|  |             .ToImmutable() | ||||||
|  |             .OrderBy(static source => source.ProviderId, StringComparer.Ordinal) | ||||||
|  |             .ThenBy(static source => source.DocumentDigest, StringComparer.Ordinal) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |     } | ||||||
|  |  | ||||||
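|  |     // Collapses per-claim product metadata into one record, preferring the most descriptive
|  |     // (longest) name/version and the lexicographically first purl/cpe when providers differ.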
|  |     private static VexProduct MergeProduct(ImmutableArray<VexClaim> claims) | ||||||
|  |     { | ||||||
|  |         var key = claims[0].Product.Key; | ||||||
|  |         var names = claims | ||||||
|  |             .Select(static claim => claim.Product.Name) | ||||||
|  |             .Where(static name => !string.IsNullOrWhiteSpace(name)) | ||||||
|  |             .Select(static name => name!) | ||||||
|  |             .Distinct(StringComparer.Ordinal) | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         var versions = claims | ||||||
|  |             .Select(static claim => claim.Product.Version) | ||||||
|  |             .Where(static version => !string.IsNullOrWhiteSpace(version)) | ||||||
|  |             .Select(static version => version!) | ||||||
|  |             .Distinct(StringComparer.Ordinal) | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         var purls = claims | ||||||
|  |             .Select(static claim => claim.Product.Purl) | ||||||
|  |             .Where(static purl => !string.IsNullOrWhiteSpace(purl)) | ||||||
|  |             .Select(static purl => purl!) | ||||||
|  |             .Distinct(StringComparer.OrdinalIgnoreCase) | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         var cpes = claims | ||||||
|  |             .Select(static claim => claim.Product.Cpe) | ||||||
|  |             .Where(static cpe => !string.IsNullOrWhiteSpace(cpe)) | ||||||
|  |             .Select(static cpe => cpe!) | ||||||
|  |             .Distinct(StringComparer.OrdinalIgnoreCase) | ||||||
|  |             .ToArray(); | ||||||
|  |  | ||||||
|  |         var identifiers = claims | ||||||
|  |             .SelectMany(static claim => claim.Product.ComponentIdentifiers) | ||||||
|  |             .Where(static identifier => !string.IsNullOrWhiteSpace(identifier)) | ||||||
|  |             .Select(static identifier => identifier!) | ||||||
|  |             .Distinct(StringComparer.OrdinalIgnoreCase) | ||||||
|  |             .OrderBy(static identifier => identifier, StringComparer.OrdinalIgnoreCase) | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |  | ||||||
|  |         return new VexProduct( | ||||||
|  |             key, | ||||||
|  |             names.Length == 0 ? claims[0].Product.Name : names.OrderByDescending(static name => name.Length).ThenBy(static name => name, StringComparer.Ordinal).First(), | ||||||
|  |             versions.Length == 0 ? claims[0].Product.Version : versions.OrderByDescending(static version => version.Length).ThenBy(static version => version, StringComparer.Ordinal).First(), | ||||||
|  |             purls.Length == 0 ? claims[0].Product.Purl : purls.OrderBy(static purl => purl, StringComparer.OrdinalIgnoreCase).First(), | ||||||
|  |             cpes.Length == 0 ? claims[0].Product.Cpe : cpes.OrderBy(static cpe => cpe, StringComparer.OrdinalIgnoreCase).First(), | ||||||
|  |             identifiers); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static void AddDiagnostic( | ||||||
|  |         IDictionary<string, SortedSet<string>> diagnostics, | ||||||
|  |         string code, | ||||||
|  |         string value) | ||||||
|  |     { | ||||||
|  |         if (!diagnostics.TryGetValue(code, out var entries)) | ||||||
|  |         { | ||||||
|  |             entries = new SortedSet<string>(StringComparer.Ordinal); | ||||||
|  |             diagnostics[code] = entries; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         entries.Add(value); | ||||||
|  |     } | ||||||
|  | } | ||||||
|  |  | ||||||
|  | public sealed record OpenVexMergeResult( | ||||||
|  |     ImmutableArray<OpenVexMergedStatement> Statements, | ||||||
|  |     ImmutableDictionary<string, string> Diagnostics); | ||||||
|  |  | ||||||
|  | public sealed record OpenVexMergedStatement( | ||||||
|  |     string VulnerabilityId, | ||||||
|  |     VexProduct Product, | ||||||
|  |     VexClaimStatus Status, | ||||||
|  |     VexJustification? Justification, | ||||||
|  |     string? Detail, | ||||||
|  |     ImmutableArray<OpenVexSourceEntry> Sources, | ||||||
|  |     DateTimeOffset FirstObserved, | ||||||
|  |     DateTimeOffset LastObserved); | ||||||
|  |  | ||||||
|  | public sealed record OpenVexSourceEntry( | ||||||
|  |     string ProviderId, | ||||||
|  |     VexClaimStatus Status, | ||||||
|  |     VexJustification? Justification, | ||||||
|  |     string DocumentDigest, | ||||||
|  |     Uri DocumentSource, | ||||||
|  |     string? Detail, | ||||||
|  |     DateTimeOffset FirstSeen, | ||||||
|  |     DateTimeOffset LastSeen) | ||||||
|  | { | ||||||
|  |     public string DocumentDigest { get; } = string.IsNullOrWhiteSpace(DocumentDigest) | ||||||
|  |         ? throw new ArgumentException("Document digest must be provided.", nameof(DocumentDigest)) | ||||||
|  |         : DocumentDigest.Trim(); | ||||||
|  | } | ||||||
| @@ -9,6 +9,7 @@ public static class OpenVexFormatsServiceCollectionExtensions | |||||||
|     { |     { | ||||||
|         ArgumentNullException.ThrowIfNull(services); |         ArgumentNullException.ThrowIfNull(services); | ||||||
|         services.AddSingleton<IVexNormalizer, OpenVexNormalizer>(); |         services.AddSingleton<IVexNormalizer, OpenVexNormalizer>(); | ||||||
|  |         services.AddSingleton<IVexExporter, OpenVexExporter>(); | ||||||
|         return services; |         return services; | ||||||
|     } |     } | ||||||
| } | } | ||||||
|   | |||||||
| @@ -3,5 +3,5 @@ If you are working on this file you need to read docs/modules/excititor/ARCHITEC | |||||||
| | Task | Owner(s) | Depends on | Notes | | | Task | Owner(s) | Depends on | Notes | | ||||||
| |---|---|---|---| | |---|---|---|---| | ||||||
| |EXCITITOR-FMT-OPENVEX-01-001 – OpenVEX normalizer|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – OpenVEX normalizer parses statements/products, maps status/justification, and surfaces provenance metadata; coverage in `OpenVexNormalizerTests`.| | |EXCITITOR-FMT-OPENVEX-01-001 – OpenVEX normalizer|Team Excititor Formats|EXCITITOR-CORE-01-001|**DONE (2025-10-17)** – OpenVEX normalizer parses statements/products, maps status/justification, and surfaces provenance metadata; coverage in `OpenVexNormalizerTests`.| | ||||||
| |EXCITITOR-FMT-OPENVEX-01-002 – Statement merge utilities|Team Excititor Formats|EXCITITOR-FMT-OPENVEX-01-001|**DOING (2025-10-19)** – Prereq EXCITITOR-FMT-OPENVEX-01-001 confirmed DONE; building deterministic merge reducers with policy diagnostics.| | |EXCITITOR-FMT-OPENVEX-01-002 – Statement merge utilities|Team Excititor Formats|EXCITITOR-FMT-OPENVEX-01-001|**DONE (2025-10-29)** – Delivered deterministic statement merger prioritising risk status, preserving source provenance, and surfacing conflict diagnostics.| | ||||||
| |EXCITITOR-FMT-OPENVEX-01-003 – OpenVEX export writer|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-OPENVEX-01-001|**DOING (2025-10-19)** – Prereqs EXCITITOR-EXPORT-01-001 & EXCITITOR-FMT-OPENVEX-01-001 verified DONE; starting canonical OpenVEX exporter with stable ordering/SBOM references.| | |EXCITITOR-FMT-OPENVEX-01-003 – OpenVEX export writer|Team Excititor Formats|EXCITITOR-EXPORT-01-001, EXCITITOR-FMT-OPENVEX-01-001|**DONE (2025-10-29)** – Shipped canonical OpenVEX exporter emitting merged statements, metadata, and stable digests for attested distribution.| | ||||||
|   | |||||||
| @@ -6,7 +6,12 @@ | |||||||
|     <Nullable>enable</Nullable> |     <Nullable>enable</Nullable> | ||||||
|     <ImplicitUsings>enable</ImplicitUsings> |     <ImplicitUsings>enable</ImplicitUsings> | ||||||
|     <TreatWarningsAsErrors>true</TreatWarningsAsErrors> |     <TreatWarningsAsErrors>true</TreatWarningsAsErrors> | ||||||
|  |     <UseConcelierTestInfra>false</UseConcelierTestInfra> | ||||||
|   </PropertyGroup> |   </PropertyGroup> | ||||||
|  |   <ItemGroup> | ||||||
|  |     <Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs" /> | ||||||
|  |     <Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\MongoFixtureCollection.cs" /> | ||||||
|  |   </ItemGroup> | ||||||
|   <ItemGroup> |   <ItemGroup> | ||||||
|     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Attestation/StellaOps.Excititor.Attestation.csproj" /> |     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Attestation/StellaOps.Excititor.Attestation.csproj" /> | ||||||
|     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Core/StellaOps.Excititor.Core.csproj" /> |     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Core/StellaOps.Excititor.Core.csproj" /> | ||||||
|   | |||||||
| @@ -67,6 +67,45 @@ public sealed class VexAttestationVerifierTests : IDisposable | |||||||
|         Assert.Equal("offline", verification.Diagnostics["rekor.state"]); |         Assert.Equal("offline", verification.Diagnostics["rekor.state"]); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task VerifyAsync_ReturnsInvalid_WhenTransparencyRequiredAndMissing() | ||||||
|  |     { | ||||||
|  |         var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: false); | ||||||
|  |         var verifier = CreateVerifier(options => | ||||||
|  |         { | ||||||
|  |             options.RequireTransparencyLog = true; | ||||||
|  |             options.AllowOfflineTransparency = false; | ||||||
|  |         }); | ||||||
|  |  | ||||||
|  |         var verification = await verifier.VerifyAsync( | ||||||
|  |             new VexAttestationVerificationRequest(request, metadata, envelope), | ||||||
|  |             CancellationToken.None); | ||||||
|  |  | ||||||
|  |         Assert.False(verification.IsValid); | ||||||
|  |         Assert.Equal("missing", verification.Diagnostics["rekor.state"]); | ||||||
|  |         Assert.Equal("invalid", verification.Diagnostics["result"]); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task VerifyAsync_ReturnsInvalid_WhenTransparencyUnavailableAndOfflineDisallowed() | ||||||
|  |     { | ||||||
|  |         var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: true); | ||||||
|  |         var transparency = new ThrowingTransparencyLogClient(); | ||||||
|  |         var verifier = CreateVerifier(options => | ||||||
|  |         { | ||||||
|  |             options.RequireTransparencyLog = true; | ||||||
|  |             options.AllowOfflineTransparency = false; | ||||||
|  |         }, transparency); | ||||||
|  |  | ||||||
|  |         var verification = await verifier.VerifyAsync( | ||||||
|  |             new VexAttestationVerificationRequest(request, metadata, envelope), | ||||||
|  |             CancellationToken.None); | ||||||
|  |  | ||||||
|  |         Assert.False(verification.IsValid); | ||||||
|  |         Assert.Equal("unreachable", verification.Diagnostics["rekor.state"]); | ||||||
|  |         Assert.Equal("invalid", verification.Diagnostics["result"]); | ||||||
|  |     } | ||||||
|  |  | ||||||
|     private async Task<(VexAttestationRequest Request, VexAttestationMetadata Metadata, string Envelope)> CreateSignedAttestationAsync(bool includeRekor = false) |     private async Task<(VexAttestationRequest Request, VexAttestationMetadata Metadata, string Envelope)> CreateSignedAttestationAsync(bool includeRekor = false) | ||||||
|     { |     { | ||||||
|         var signer = new FakeSigner(); |         var signer = new FakeSigner(); | ||||||
|   | |||||||
| @@ -111,6 +111,121 @@ public sealed class CiscoCsafConnectorTests | |||||||
|         sink.Documents.Should().BeEmpty(); |         sink.Documents.Should().BeEmpty(); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task FetchAsync_EmitsTrustMetadataAndUpsertsProvider() | ||||||
|  |     { | ||||||
|  |         var metadataResponse = """ | ||||||
|  |             { | ||||||
|  |               "metadata": { | ||||||
|  |                 "publisher": { | ||||||
|  |                   "name": "Cisco", | ||||||
|  |                   "category": "vendor", | ||||||
|  |                   "contact_details": { "id": "excititor:cisco" } | ||||||
|  |                 } | ||||||
|  |               }, | ||||||
|  |               "trust": { | ||||||
|  |                 "weight": 0.75, | ||||||
|  |                 "cosign": { | ||||||
|  |                   "issuer": "https://issuer.example.com", | ||||||
|  |                   "identity_pattern": "https://sig.example.com/*" | ||||||
|  |                 }, | ||||||
|  |                 "pgp_fingerprints": [ | ||||||
|  |                   "0123456789ABCDEF", | ||||||
|  |                   "FEDCBA9876543210" | ||||||
|  |                 ] | ||||||
|  |               }, | ||||||
|  |               "distributions": { | ||||||
|  |                 "directories": [ "https://api.cisco.test/csaf/" ] | ||||||
|  |               } | ||||||
|  |             } | ||||||
|  |             """; | ||||||
|  |  | ||||||
|  |         var responses = new Dictionary<Uri, Queue<HttpResponseMessage>> | ||||||
|  |         { | ||||||
|  |             [new Uri("https://api.cisco.test/.well-known/csaf/provider-metadata.json")] = QueueResponses(metadataResponse), | ||||||
|  |             [new Uri("https://api.cisco.test/csaf/index.json")] = QueueResponses(""" | ||||||
|  |                 { | ||||||
|  |                   "advisories": [ | ||||||
|  |                     { | ||||||
|  |                       "id": "cisco-sa-2025", | ||||||
|  |                       "url": "https://api.cisco.test/csaf/cisco-sa-2025.json", | ||||||
|  |                       "published": "2025-10-01T00:00:00Z", | ||||||
|  |                       "lastModified": "2025-10-02T00:00:00Z", | ||||||
|  |                       "sha256": "cafebabe" | ||||||
|  |                     } | ||||||
|  |                   ] | ||||||
|  |                 } | ||||||
|  |                 """), | ||||||
|  |             [new Uri("https://api.cisco.test/csaf/cisco-sa-2025.json")] = QueueResponses("{ \"document\": \"payload\" }") | ||||||
|  |         }; | ||||||
|  |  | ||||||
|  |         var handler = new RoutingHttpMessageHandler(responses); | ||||||
|  |         var httpClient = new HttpClient(handler); | ||||||
|  |         var factory = new SingleHttpClientFactory(httpClient); | ||||||
|  |         var connectorOptions = new CiscoConnectorOptions | ||||||
|  |         { | ||||||
|  |             MetadataUri = "https://api.cisco.test/.well-known/csaf/provider-metadata.json", | ||||||
|  |             PersistOfflineSnapshot = false, | ||||||
|  |         }; | ||||||
|  |  | ||||||
|  |         var metadataLoader = new CiscoProviderMetadataLoader( | ||||||
|  |             factory, | ||||||
|  |             new MemoryCache(new MemoryCacheOptions()), | ||||||
|  |             Options.Create(connectorOptions), | ||||||
|  |             NullLogger<CiscoProviderMetadataLoader>.Instance, | ||||||
|  |             new MockFileSystem()); | ||||||
|  |  | ||||||
|  |         var stateRepository = new InMemoryConnectorStateRepository(); | ||||||
|  |         var connector = new CiscoCsafConnector( | ||||||
|  |             metadataLoader, | ||||||
|  |             factory, | ||||||
|  |             stateRepository, | ||||||
|  |             new[] { new CiscoConnectorOptionsValidator() }, | ||||||
|  |             NullLogger<CiscoCsafConnector>.Instance, | ||||||
|  |             TimeProvider.System); | ||||||
|  |  | ||||||
|  |         await connector.ValidateAsync(VexConnectorSettings.Empty, CancellationToken.None); | ||||||
|  |  | ||||||
|  |         var providerStore = new StubProviderStore(); | ||||||
|  |         var services = new ServiceCollection() | ||||||
|  |             .AddSingleton<IVexProviderStore>(providerStore) | ||||||
|  |             .BuildServiceProvider(); | ||||||
|  |  | ||||||
|  |         var sink = new InMemoryRawSink(); | ||||||
|  |         var context = new VexConnectorContext( | ||||||
|  |             null, | ||||||
|  |             VexConnectorSettings.Empty, | ||||||
|  |             sink, | ||||||
|  |             new NoopSignatureVerifier(), | ||||||
|  |             new NoopNormalizerRouter(), | ||||||
|  |             services, | ||||||
|  |             ImmutableDictionary<string, string>.Empty); | ||||||
|  |  | ||||||
|  |         var documents = new List<VexRawDocument>(); | ||||||
|  |         await foreach (var doc in connector.FetchAsync(context, CancellationToken.None)) | ||||||
|  |         { | ||||||
|  |             documents.Add(doc); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         documents.Should().HaveCount(1); | ||||||
|  |         var metadata = documents[0].Metadata; | ||||||
|  |         metadata.Should().Contain("vex.provenance.provider", "excititor:cisco"); | ||||||
|  |         metadata.Should().Contain("vex.provenance.providerName", "Cisco"); | ||||||
|  |         metadata.Should().Contain("vex.provenance.trust.weight", "0.75"); | ||||||
|  |         metadata.Should().Contain("vex.provenance.cosign.issuer", "https://issuer.example.com"); | ||||||
|  |         metadata.Should().Contain("vex.provenance.cosign.identityPattern", "https://sig.example.com/*"); | ||||||
|  |         metadata.Should().Contain("vex.provenance.pgp.fingerprints", "0123456789ABCDEF,FEDCBA9876543210"); | ||||||
|  |  | ||||||
|  |         providerStore.SavedProviders.Should().HaveCount(1); | ||||||
|  |         var savedProvider = providerStore.SavedProviders[0]; | ||||||
|  |         savedProvider.Id.Should().Be("excititor:cisco"); | ||||||
|  |         savedProvider.Trust.Weight.Should().Be(0.75); | ||||||
|  |         savedProvider.Trust.Cosign.Should().NotBeNull(); | ||||||
|  |         savedProvider.Trust.Cosign!.Issuer.Should().Be("https://issuer.example.com"); | ||||||
|  |         savedProvider.Trust.Cosign.IdentityPattern.Should().Be("https://sig.example.com/*"); | ||||||
|  |         savedProvider.Trust.PgpFingerprints.Should().Contain(new[] { "0123456789ABCDEF", "FEDCBA9876543210" }); | ||||||
|  |     } | ||||||
|  |  | ||||||
|     private static Queue<HttpResponseMessage> QueueResponses(string payload) |     private static Queue<HttpResponseMessage> QueueResponses(string payload) | ||||||
|         => new(new[] |         => new(new[] | ||||||
|         { |         { | ||||||
| @@ -170,6 +285,23 @@ public sealed class CiscoCsafConnectorTests | |||||||
|         } |         } | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     private sealed class StubProviderStore : IVexProviderStore | ||||||
|  |     { | ||||||
|  |         public List<VexProvider> SavedProviders { get; } = new(); | ||||||
|  |  | ||||||
|  |         public ValueTask<VexProvider?> FindAsync(string id, CancellationToken cancellationToken, IClientSessionHandle? session = null) | ||||||
|  |             => ValueTask.FromResult<VexProvider?>(null); | ||||||
|  |  | ||||||
|  |         public ValueTask<IReadOnlyCollection<VexProvider>> ListAsync(CancellationToken cancellationToken, IClientSessionHandle? session = null) | ||||||
|  |             => ValueTask.FromResult<IReadOnlyCollection<VexProvider>>(Array.Empty<VexProvider>()); | ||||||
|  |  | ||||||
|  |         public ValueTask SaveAsync(VexProvider provider, CancellationToken cancellationToken, IClientSessionHandle? session = null) | ||||||
|  |         { | ||||||
|  |             SavedProviders.Add(provider); | ||||||
|  |             return ValueTask.CompletedTask; | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|     private sealed class InMemoryRawSink : IVexRawDocumentSink |     private sealed class InMemoryRawSink : IVexRawDocumentSink | ||||||
|     { |     { | ||||||
|         public List<VexRawDocument> Documents { get; } = new(); |         public List<VexRawDocument> Documents { get; } = new(); | ||||||
|   | |||||||
| @@ -6,11 +6,16 @@ | |||||||
|     <Nullable>enable</Nullable> |     <Nullable>enable</Nullable> | ||||||
|     <ImplicitUsings>enable</ImplicitUsings> |     <ImplicitUsings>enable</ImplicitUsings> | ||||||
|     <TreatWarningsAsErrors>true</TreatWarningsAsErrors> |     <TreatWarningsAsErrors>true</TreatWarningsAsErrors> | ||||||
|  |     <UseConcelierTestInfra>false</UseConcelierTestInfra> | ||||||
|   </PropertyGroup> |   </PropertyGroup> | ||||||
|   <ItemGroup> |   <ItemGroup> | ||||||
|     <PackageReference Include="FluentAssertions" Version="6.12.0" /> |     <PackageReference Include="FluentAssertions" Version="6.12.0" /> | ||||||
|     <PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="20.0.28" /> |     <PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="20.0.28" /> | ||||||
|   </ItemGroup> |   </ItemGroup> | ||||||
|  |   <ItemGroup> | ||||||
|  |     <Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs" /> | ||||||
|  |     <Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\MongoFixtureCollection.cs" /> | ||||||
|  |   </ItemGroup> | ||||||
|   <ItemGroup> |   <ItemGroup> | ||||||
|     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Connectors.Cisco.CSAF/StellaOps.Excititor.Connectors.Cisco.CSAF.csproj" /> |     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Connectors.Cisco.CSAF/StellaOps.Excititor.Connectors.Cisco.CSAF.csproj" /> | ||||||
|   </ItemGroup> |   </ItemGroup> | ||||||
|   | |||||||
| @@ -0,0 +1,429 @@ | |||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Globalization; | ||||||
|  | using System.Net; | ||||||
|  | using System.Net.Http; | ||||||
|  | using System.Security.Cryptography; | ||||||
|  | using System.Text; | ||||||
|  | using FluentAssertions; | ||||||
|  | using Microsoft.Extensions.Caching.Memory; | ||||||
|  | using Microsoft.Extensions.DependencyInjection; | ||||||
|  | using Microsoft.Extensions.Logging.Abstractions; | ||||||
|  | using StellaOps.Excititor.Connectors.Abstractions; | ||||||
|  | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub; | ||||||
|  | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Configuration; | ||||||
|  | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Events; | ||||||
|  | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Metadata; | ||||||
|  | using StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.State; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  | using StellaOps.Excititor.Storage.Mongo; | ||||||
|  | using Xunit; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.Tests.Connectors; | ||||||
|  |  | ||||||
|  | public sealed class RancherHubConnectorTests | ||||||
|  | { | ||||||
|  |     [Fact] | ||||||
|  |     public async Task FetchAsync_OfflineSnapshot_StoresDocumentAndUpdatesCheckpoint() | ||||||
|  |     { | ||||||
|  |         using var fixture = await ConnectorFixture.CreateAsync(); | ||||||
|  |  | ||||||
|  |         var sink = new InMemoryRawSink(); | ||||||
|  |         var context = fixture.CreateContext(sink); | ||||||
|  |  | ||||||
|  |         var documents = await CollectAsync(fixture.Connector.FetchAsync(context, CancellationToken.None)); | ||||||
|  |  | ||||||
|  |         documents.Should().HaveCount(1); | ||||||
|  |         var document = documents[0]; | ||||||
|  |         document.Digest.Should().Be(fixture.ExpectedDocumentDigest); | ||||||
|  |         document.Metadata.Should().ContainKey("rancher.event.id").WhoseValue.Should().Be("evt-1"); | ||||||
|  |         document.Metadata.Should().ContainKey("rancher.event.cursor").WhoseValue.Should().Be("cursor-2"); | ||||||
|  |         sink.Documents.Should().HaveCount(1); | ||||||
|  |  | ||||||
|  |         var state = fixture.StateRepository.State; | ||||||
|  |         state.Should().NotBeNull(); | ||||||
|  |         state!.LastUpdated.Should().Be(DateTimeOffset.Parse("2025-10-19T12:00:00Z", CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal)); | ||||||
|  |         state.DocumentDigests.Should().Contain(fixture.ExpectedDocumentDigest); | ||||||
|  |         state.DocumentDigests.Should().Contain("checkpoint:cursor-2"); | ||||||
|  |         state.DocumentDigests.Count.Should().BeLessOrEqualTo(ConnectorFixture.MaxDigestHistory + 1); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task FetchAsync_WhenDocumentDownloadFails_QuarantinesEvent() | ||||||
|  |     { | ||||||
|  |         using var fixture = await ConnectorFixture.CreateAsync(); | ||||||
|  |  | ||||||
|  |         fixture.Handler.SetRoute(fixture.DocumentUri, () => new HttpResponseMessage(HttpStatusCode.InternalServerError)); | ||||||
|  |  | ||||||
|  |         var sink = new InMemoryRawSink(); | ||||||
|  |         var context = fixture.CreateContext(sink); | ||||||
|  |  | ||||||
|  |         var documents = await CollectAsync(fixture.Connector.FetchAsync(context, CancellationToken.None)); | ||||||
|  |  | ||||||
|  |         documents.Should().BeEmpty(); | ||||||
|  |         sink.Documents.Should().HaveCount(1); | ||||||
|  |         var quarantined = sink.Documents[0]; | ||||||
|  |         quarantined.Metadata.Should().Contain("rancher.event.quarantine", "true"); | ||||||
|  |         quarantined.Metadata.Should().ContainKey("rancher.event.error").WhoseValue.Should().Contain("document fetch failed"); | ||||||
|  |  | ||||||
|  |         var state = fixture.StateRepository.State; | ||||||
|  |         state.Should().NotBeNull(); | ||||||
|  |         state!.DocumentDigests.Should().Contain(d => d.StartsWith("quarantine:", StringComparison.Ordinal)); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task FetchAsync_ReplayingSnapshot_SkipsDuplicateDocuments() | ||||||
|  |     { | ||||||
|  |         using var fixture = await ConnectorFixture.CreateAsync(); | ||||||
|  |  | ||||||
|  |         var firstSink = new InMemoryRawSink(); | ||||||
|  |         var firstContext = fixture.CreateContext(firstSink); | ||||||
|  |         await CollectAsync(fixture.Connector.FetchAsync(firstContext, CancellationToken.None)); | ||||||
|  |  | ||||||
|  |         var secondSink = new InMemoryRawSink(); | ||||||
|  |         var secondContext = fixture.CreateContext(secondSink); | ||||||
|  |         var secondRunDocuments = await CollectAsync(fixture.Connector.FetchAsync(secondContext, CancellationToken.None)); | ||||||
|  |  | ||||||
|  |         secondRunDocuments.Should().BeEmpty(); | ||||||
|  |         secondSink.Documents.Should().BeEmpty(); | ||||||
|  |  | ||||||
|  |         var state = fixture.StateRepository.State; | ||||||
|  |         state.Should().NotBeNull(); | ||||||
|  |         state!.DocumentDigests.Should().Contain(fixture.ExpectedDocumentDigest); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task FetchAsync_TrimsPersistedDigestHistory() | ||||||
|  |     { | ||||||
|  |         var existingDigests = Enumerable.Range(0, ConnectorFixture.MaxDigestHistory + 5) | ||||||
|  |             .Select(i => $"sha256:{i:X32}") | ||||||
|  |             .ToImmutableArray(); | ||||||
|  |         var initialState = new VexConnectorState( | ||||||
|  |             "excititor:suse.rancher", | ||||||
|  |             DateTimeOffset.Parse("2025-10-18T00:00:00Z", CultureInfo.InvariantCulture, DateTimeStyles.AssumeUniversal), | ||||||
|  |             ImmutableArray.CreateBuilder<string>() | ||||||
|  |                 .Add("checkpoint:cursor-old") | ||||||
|  |                 .AddRange(existingDigests) | ||||||
|  |                 .ToImmutable()); | ||||||
|  |  | ||||||
|  |         using var fixture = await ConnectorFixture.CreateAsync(initialState); | ||||||
|  |  | ||||||
|  |         var sink = new InMemoryRawSink(); | ||||||
|  |         var context = fixture.CreateContext(sink); | ||||||
|  |         await CollectAsync(fixture.Connector.FetchAsync(context, CancellationToken.None)); | ||||||
|  |  | ||||||
|  |         var state = fixture.StateRepository.State; | ||||||
|  |         state.Should().NotBeNull(); | ||||||
|  |         state!.DocumentDigests.Should().Contain(d => d.StartsWith("checkpoint:", StringComparison.Ordinal)); | ||||||
|  |         state.DocumentDigests.Count.Should().Be(ConnectorFixture.MaxDigestHistory + 1); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static async Task<List<VexRawDocument>> CollectAsync(IAsyncEnumerable<VexRawDocument> source) | ||||||
|  |     { | ||||||
|  |         var list = new List<VexRawDocument>(); | ||||||
|  |         await foreach (var document in source.ConfigureAwait(false)) | ||||||
|  |         { | ||||||
|  |             list.Add(document); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return list; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     #region helpers | ||||||
|  |  | ||||||
|  |     private sealed class ConnectorFixture : IDisposable | ||||||
|  |     { | ||||||
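|  |         // Digest-history cap the trimming test expects the connector to enforce (plus one checkpoint entry).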
|  |         public const int MaxDigestHistory = 200; | ||||||
|  |  | ||||||
|  |         private readonly IServiceProvider _serviceProvider; | ||||||
|  |         private readonly TempDirectory _tempDirectory; | ||||||
|  |         private readonly HttpClient _httpClient; | ||||||
|  |  | ||||||
|  |         private ConnectorFixture( | ||||||
|  |             RancherHubConnector connector, | ||||||
|  |             InMemoryConnectorStateRepository stateRepository, | ||||||
|  |             RoutingHttpMessageHandler handler, | ||||||
|  |             IServiceProvider serviceProvider, | ||||||
|  |             TempDirectory tempDirectory, | ||||||
|  |             HttpClient httpClient, | ||||||
|  |             Uri documentUri, | ||||||
|  |             string documentDigest) | ||||||
|  |         { | ||||||
|  |             Connector = connector; | ||||||
|  |             StateRepository = stateRepository; | ||||||
|  |             Handler = handler; | ||||||
|  |             _serviceProvider = serviceProvider; | ||||||
|  |             _tempDirectory = tempDirectory; | ||||||
|  |             _httpClient = httpClient; | ||||||
|  |             DocumentUri = documentUri; | ||||||
|  |             ExpectedDocumentDigest = $"sha256:{documentDigest}"; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public RancherHubConnector Connector { get; } | ||||||
|  |  | ||||||
|  |         public InMemoryConnectorStateRepository StateRepository { get; } | ||||||
|  |  | ||||||
|  |         public RoutingHttpMessageHandler Handler { get; } | ||||||
|  |  | ||||||
|  |         public Uri DocumentUri { get; } | ||||||
|  |  | ||||||
|  |         public string ExpectedDocumentDigest { get; } | ||||||
|  |  | ||||||
|  |         public VexConnectorContext CreateContext(InMemoryRawSink sink, DateTimeOffset? since = null) | ||||||
|  |             => new( | ||||||
|  |                 since, | ||||||
|  |                 VexConnectorSettings.Empty, | ||||||
|  |                 sink, | ||||||
|  |                 new NoopSignatureVerifier(), | ||||||
|  |                 new NoopNormalizerRouter(), | ||||||
|  |                 _serviceProvider, | ||||||
|  |                 ImmutableDictionary<string, string>.Empty); | ||||||
|  |  | ||||||
|  |         public void Dispose() | ||||||
|  |         { | ||||||
|  |             _httpClient.Dispose(); | ||||||
|  |             _tempDirectory.Dispose(); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public static async Task<ConnectorFixture> CreateAsync(VexConnectorState? initialState = null) | ||||||
|  |         { | ||||||
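|  |             // Writes the discovery and events snapshots to disk, points the connector at the offline snapshot, and serves the referenced CSAF document through the routing handler. | ||||||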
|  |             var tempDirectory = new TempDirectory(); | ||||||
|  |             var documentPayload = "{\"document\":\"payload\"}"; | ||||||
|  |             var documentDigest = ComputeSha256Hex(documentPayload); | ||||||
|  |  | ||||||
|  |             var documentUri = new Uri("https://hub.test/events/evt-1.json"); | ||||||
|  |             var eventsPayload = """ | ||||||
|  |             { | ||||||
|  |               "cursor": "cursor-1", | ||||||
|  |               "nextCursor": "cursor-2", | ||||||
|  |               "events": [ | ||||||
|  |                 { | ||||||
|  |                   "id": "evt-1", | ||||||
|  |                   "type": "vex.statement.published", | ||||||
|  |                   "channel": "rancher/rke2", | ||||||
|  |                   "publishedAt": "2025-10-19T12:00:00Z", | ||||||
|  |                   "document": { | ||||||
|  |                     "uri": "https://hub.test/events/evt-1.json", | ||||||
|  |                     "sha256": "DOC_DIGEST", | ||||||
|  |                     "format": "csaf" | ||||||
|  |                   } | ||||||
|  |                 } | ||||||
|  |               ] | ||||||
|  |             } | ||||||
|  |             """.Replace("DOC_DIGEST", documentDigest, StringComparison.Ordinal); | ||||||
|  |  | ||||||
|  |             var eventsPath = tempDirectory.Combine("events.json"); | ||||||
|  |             await File.WriteAllTextAsync(eventsPath, eventsPayload, Encoding.UTF8).ConfigureAwait(false); | ||||||
|  |             var eventsChecksum = ComputeSha256Hex(eventsPayload); | ||||||
|  |  | ||||||
|  |             var discoveryPayload = """ | ||||||
|  |             { | ||||||
|  |               "hubId": "excititor:suse.rancher", | ||||||
|  |               "title": "SUSE Rancher VEX Hub", | ||||||
|  |               "subscription": { | ||||||
|  |                 "eventsUri": "https://hub.test/events", | ||||||
|  |                 "checkpointUri": "https://hub.test/checkpoint", | ||||||
|  |                 "channels": [ "rancher/rke2" ], | ||||||
|  |                 "requiresAuthentication": false | ||||||
|  |               }, | ||||||
|  |               "offline": { | ||||||
|  |                 "snapshotUri": "EVENTS_URI", | ||||||
|  |                 "sha256": "EVENTS_DIGEST" | ||||||
|  |               } | ||||||
|  |             } | ||||||
|  |             """ | ||||||
|  |             .Replace("EVENTS_URI", new Uri(eventsPath).ToString(), StringComparison.Ordinal) | ||||||
|  |             .Replace("EVENTS_DIGEST", eventsChecksum, StringComparison.Ordinal); | ||||||
|  |  | ||||||
|  |             var discoveryPath = tempDirectory.Combine("discovery.json"); | ||||||
|  |             await File.WriteAllTextAsync(discoveryPath, discoveryPayload, Encoding.UTF8).ConfigureAwait(false); | ||||||
|  |  | ||||||
|  |             var handler = new RoutingHttpMessageHandler(); | ||||||
|  |             handler.SetRoute(documentUri, () => JsonResponse(documentPayload)); | ||||||
|  |             var httpClient = new HttpClient(handler) | ||||||
|  |             { | ||||||
|  |                 Timeout = TimeSpan.FromSeconds(10), | ||||||
|  |             }; | ||||||
|  |             var httpFactory = new SingletonHttpClientFactory(httpClient); | ||||||
|  |  | ||||||
|  |             var memoryCache = new MemoryCache(new MemoryCacheOptions()); | ||||||
|  |             var fileSystem = new System.IO.Abstractions.FileSystem(); | ||||||
|  |             var tokenProvider = new RancherHubTokenProvider(httpFactory, memoryCache, NullLogger<RancherHubTokenProvider>.Instance); | ||||||
|  |             var metadataLoader = new RancherHubMetadataLoader(httpFactory, memoryCache, tokenProvider, fileSystem, NullLogger<RancherHubMetadataLoader>.Instance); | ||||||
|  |             var eventClient = new RancherHubEventClient(httpFactory, tokenProvider, fileSystem, NullLogger<RancherHubEventClient>.Instance); | ||||||
|  |  | ||||||
|  |             var stateRepository = new InMemoryConnectorStateRepository(initialState); | ||||||
|  |             var checkpointManager = new RancherHubCheckpointManager(stateRepository); | ||||||
|  |  | ||||||
|  |             var validators = new[] { new RancherHubConnectorOptionsValidator(fileSystem) }; | ||||||
|  |             var connector = new RancherHubConnector( | ||||||
|  |                 metadataLoader, | ||||||
|  |                 eventClient, | ||||||
|  |                 checkpointManager, | ||||||
|  |                 tokenProvider, | ||||||
|  |                 httpFactory, | ||||||
|  |                 NullLogger<RancherHubConnector>.Instance, | ||||||
|  |                 TimeProvider.System, | ||||||
|  |                 validators); | ||||||
|  |  | ||||||
|  |             var settingsValues = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.OrdinalIgnoreCase); | ||||||
|  |             settingsValues["DiscoveryUri"] = "https://hub.test/.well-known/rancher-hub.json"; | ||||||
|  |             settingsValues["OfflineSnapshotPath"] = discoveryPath; | ||||||
|  |             settingsValues["PreferOfflineSnapshot"] = "true"; | ||||||
|  |             var settings = new VexConnectorSettings(settingsValues.ToImmutable()); | ||||||
|  |             await connector.ValidateAsync(settings, CancellationToken.None).ConfigureAwait(false); | ||||||
|  |  | ||||||
|  |             var services = new ServiceCollection().BuildServiceProvider(); | ||||||
|  |  | ||||||
|  |             return new ConnectorFixture( | ||||||
|  |                 connector, | ||||||
|  |                 stateRepository, | ||||||
|  |                 handler, | ||||||
|  |                 services, | ||||||
|  |                 tempDirectory, | ||||||
|  |                 httpClient, | ||||||
|  |                 documentUri, | ||||||
|  |                 documentDigest); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         private static HttpResponseMessage JsonResponse(string payload) | ||||||
|  |         { | ||||||
|  |             var response = new HttpResponseMessage(HttpStatusCode.OK) | ||||||
|  |             { | ||||||
|  |                 Content = new StringContent(payload, Encoding.UTF8, "application/json"), | ||||||
|  |             }; | ||||||
|  |             return response; | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class SingletonHttpClientFactory : IHttpClientFactory | ||||||
|  |     { | ||||||
|  |         private readonly HttpClient _client; | ||||||
|  |  | ||||||
|  |         public SingletonHttpClientFactory(HttpClient client) | ||||||
|  |         { | ||||||
|  |             _client = client; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public HttpClient CreateClient(string name) => _client; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class RoutingHttpMessageHandler : HttpMessageHandler | ||||||
|  |     { | ||||||
|  |         private readonly Dictionary<Uri, Queue<Func<HttpResponseMessage>>> _routes = new(); | ||||||
|  |  | ||||||
|  |         public void SetRoute(Uri uri, params Func<HttpResponseMessage>[] responders) | ||||||
|  |         { | ||||||
|  |             ArgumentNullException.ThrowIfNull(uri); | ||||||
|  |             if (responders is null || responders.Length == 0) | ||||||
|  |             { | ||||||
|  |                 _routes.Remove(uri); | ||||||
|  |                 return; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             _routes[uri] = new Queue<Func<HttpResponseMessage>>(responders); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) | ||||||
|  |         { | ||||||
|  |             if (request.RequestUri is not null && | ||||||
|  |                 _routes.TryGetValue(request.RequestUri, out var queue) && | ||||||
|  |                 queue.Count > 0) | ||||||
|  |             { | ||||||
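|  |                 // Consume queued responders until one remains; the final responder is replayed for any further requests to this route. | ||||||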
|  |                 var responder = queue.Count > 1 ? queue.Dequeue() : queue.Peek(); | ||||||
|  |                 var response = responder(); | ||||||
|  |                 response.RequestMessage = request; | ||||||
|  |                 return Task.FromResult(response); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             return Task.FromResult(new HttpResponseMessage(HttpStatusCode.NotFound) | ||||||
|  |             { | ||||||
|  |                 Content = new StringContent($"No response configured for {request.RequestUri}", Encoding.UTF8, "text/plain"), | ||||||
|  |             }); | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class InMemoryConnectorStateRepository : IVexConnectorStateRepository | ||||||
|  |     { | ||||||
|  |         public InMemoryConnectorStateRepository(VexConnectorState? initialState = null) | ||||||
|  |         { | ||||||
|  |             State = initialState; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public VexConnectorState? State { get; private set; } | ||||||
|  |  | ||||||
|  |         public ValueTask<VexConnectorState?> GetAsync(string connectorId, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null) | ||||||
|  |             => ValueTask.FromResult(State); | ||||||
|  |  | ||||||
|  |         public ValueTask SaveAsync(VexConnectorState state, CancellationToken cancellationToken, MongoDB.Driver.IClientSessionHandle? session = null) | ||||||
|  |         { | ||||||
|  |             State = state; | ||||||
|  |             return ValueTask.CompletedTask; | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class InMemoryRawSink : IVexRawDocumentSink | ||||||
|  |     { | ||||||
|  |         public List<VexRawDocument> Documents { get; } = new(); | ||||||
|  |  | ||||||
|  |         public ValueTask StoreAsync(VexRawDocument document, CancellationToken cancellationToken) | ||||||
|  |         { | ||||||
|  |             Documents.Add(document); | ||||||
|  |             return ValueTask.CompletedTask; | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class NoopSignatureVerifier : IVexSignatureVerifier | ||||||
|  |     { | ||||||
|  |         public ValueTask<VexSignatureMetadata?> VerifyAsync(VexRawDocument document, CancellationToken cancellationToken) | ||||||
|  |             => ValueTask.FromResult<VexSignatureMetadata?>(null); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class NoopNormalizerRouter : IVexNormalizerRouter | ||||||
|  |     { | ||||||
|  |         public ValueTask<VexClaimBatch> NormalizeAsync(VexRawDocument document, CancellationToken cancellationToken) | ||||||
|  |             => ValueTask.FromResult(new VexClaimBatch(document, ImmutableArray<VexClaim>.Empty, ImmutableDictionary<string, string>.Empty)); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private sealed class TempDirectory : IDisposable | ||||||
|  |     { | ||||||
|  |         private readonly string _path; | ||||||
|  |  | ||||||
|  |         public TempDirectory() | ||||||
|  |         { | ||||||
|  |             _path = Path.Combine(Path.GetTempPath(), "stellaops-excititor-tests", Guid.NewGuid().ToString("n")); | ||||||
|  |             Directory.CreateDirectory(_path); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         public string Combine(string relative) => Path.Combine(_path, relative); | ||||||
|  |  | ||||||
|  |         public void Dispose() | ||||||
|  |         { | ||||||
|  |             try | ||||||
|  |             { | ||||||
|  |                 if (Directory.Exists(_path)) | ||||||
|  |                 { | ||||||
|  |                     Directory.Delete(_path, recursive: true); | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |             catch | ||||||
|  |             { | ||||||
|  |                 // Best-effort cleanup. | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string ComputeSha256Hex(string payload) | ||||||
|  |     { | ||||||
|  |         var bytes = Encoding.UTF8.GetBytes(payload); | ||||||
|  |         return ComputeSha256Hex(bytes); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string ComputeSha256Hex(ReadOnlySpan<byte> payload) | ||||||
|  |     { | ||||||
|  |         Span<byte> buffer = stackalloc byte[32]; | ||||||
|  |         SHA256.HashData(payload, buffer); | ||||||
|  |         return Convert.ToHexString(buffer).ToLowerInvariant(); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     #endregion | ||||||
|  | } | ||||||
| @@ -6,11 +6,16 @@ | |||||||
|     <Nullable>enable</Nullable> |     <Nullable>enable</Nullable> | ||||||
|     <ImplicitUsings>enable</ImplicitUsings> |     <ImplicitUsings>enable</ImplicitUsings> | ||||||
|     <TreatWarningsAsErrors>true</TreatWarningsAsErrors> |     <TreatWarningsAsErrors>true</TreatWarningsAsErrors> | ||||||
|  |     <UseConcelierTestInfra>false</UseConcelierTestInfra> | ||||||
|   </PropertyGroup> |   </PropertyGroup> | ||||||
|   <ItemGroup> |   <ItemGroup> | ||||||
|     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Connectors.SUSE.RancherVEXHub/StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.csproj" /> |     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Connectors.SUSE.RancherVEXHub/StellaOps.Excititor.Connectors.SUSE.RancherVEXHub.csproj" /> | ||||||
|     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Storage.Mongo/StellaOps.Excititor.Storage.Mongo.csproj" /> |     <ProjectReference Include="../../__Libraries/StellaOps.Excititor.Storage.Mongo/StellaOps.Excititor.Storage.Mongo.csproj" /> | ||||||
|   </ItemGroup> |   </ItemGroup> | ||||||
|  |   <ItemGroup> | ||||||
|  |     <Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\AssemblyInfo.cs" /> | ||||||
|  |     <Compile Remove="..\..\..\StellaOps.Concelier.Tests.Shared\MongoFixtureCollection.cs" /> | ||||||
|  |   </ItemGroup> | ||||||
|   <ItemGroup> |   <ItemGroup> | ||||||
|     <PackageReference Include="FluentAssertions" Version="6.12.0" /> |     <PackageReference Include="FluentAssertions" Version="6.12.0" /> | ||||||
|     <PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="20.0.28" /> |     <PackageReference Include="System.IO.Abstractions.TestingHelpers" Version="20.0.28" /> | ||||||
|   | |||||||
| @@ -0,0 +1,73 @@ | |||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Text.Json; | ||||||
|  | using FluentAssertions; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  | using StellaOps.Excititor.Formats.CSAF; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.CSAF.Tests; | ||||||
|  |  | ||||||
|  | public sealed class CsafExporterTests | ||||||
|  | { | ||||||
|  |     [Fact] | ||||||
|  |     public async Task SerializeAsync_WritesDeterministicCsafDocument() | ||||||
|  |     { | ||||||
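|  |         // Two claims on one CVE plus a not_affected advisory claim without a justification: expect two products, two vulnerabilities, and a policy diagnostic. | ||||||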
|  |         var claims = ImmutableArray.Create( | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-3000", | ||||||
|  |                 "vendor:example", | ||||||
|  |                 new VexProduct("pkg:example/app@1.0.0", "Example App", "1.0.0", "pkg:example/app@1.0.0"), | ||||||
|  |                 VexClaimStatus.Affected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.Csaf, "sha256:doc1", new Uri("https://example.com/csaf/advisory1.json")), | ||||||
|  |                 new DateTimeOffset(2025, 10, 10, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 new DateTimeOffset(2025, 10, 11, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 detail: "Impact on Example App 1.0.0"), | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-3000", | ||||||
|  |                 "vendor:example", | ||||||
|  |                 new VexProduct("pkg:example/app@1.0.0", "Example App", "1.0.0", "pkg:example/app@1.0.0"), | ||||||
|  |                 VexClaimStatus.NotAffected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.Csaf, "sha256:doc2", new Uri("https://example.com/csaf/advisory2.json")), | ||||||
|  |                 new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 justification: VexJustification.ComponentNotPresent), | ||||||
|  |             new VexClaim( | ||||||
|  |                 "ADVISORY-1", | ||||||
|  |                 "vendor:example", | ||||||
|  |                 new VexProduct("pkg:example/lib@2.0.0", "Example Lib", "2.0.0"), | ||||||
|  |                 VexClaimStatus.NotAffected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.Csaf, "sha256:doc3", new Uri("https://example.com/csaf/advisory3.json")), | ||||||
|  |                 new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 justification: null)); | ||||||
|  |  | ||||||
|  |         var request = new VexExportRequest( | ||||||
|  |             VexQuery.Empty, | ||||||
|  |             ImmutableArray<VexConsensus>.Empty, | ||||||
|  |             claims, | ||||||
|  |             new DateTimeOffset(2025, 10, 13, 0, 0, 0, TimeSpan.Zero)); | ||||||
|  |  | ||||||
|  |         var exporter = new CsafExporter(); | ||||||
|  |         var digest = exporter.Digest(request); | ||||||
|  |  | ||||||
|  |         await using var stream = new MemoryStream(); | ||||||
|  |         var result = await exporter.SerializeAsync(request, stream, CancellationToken.None); | ||||||
|  |  | ||||||
|  |         digest.Should().NotBeNull(); | ||||||
|  |         digest.Should().Be(result.Digest); | ||||||
|  |  | ||||||
|  |         stream.Position = 0; | ||||||
|  |         using var document = JsonDocument.Parse(stream); | ||||||
|  |         var root = document.RootElement; | ||||||
|  |  | ||||||
|  |         root.GetProperty("document").GetProperty("tracking").GetProperty("id").GetString()!.Should().StartWith("stellaops:csaf"); | ||||||
|  |         root.GetProperty("product_tree").GetProperty("full_product_names").GetArrayLength().Should().Be(2); | ||||||
|  |         root.GetProperty("vulnerabilities").EnumerateArray().Should().HaveCount(2); | ||||||
|  |  | ||||||
|  |         var metadata = root.GetProperty("metadata"); | ||||||
|  |         metadata.GetProperty("query_signature").GetString().Should().NotBeNull(); | ||||||
|  |         metadata.GetProperty("diagnostics").EnumerateObject().Select(p => p.Name).Should().Contain("policy.justification_missing"); | ||||||
|  |  | ||||||
|  |         result.Metadata.Should().ContainKey("csaf.vulnerabilityCount"); | ||||||
|  |         result.Metadata["csaf.productCount"].Should().Be("2"); | ||||||
|  |     } | ||||||
|  | } | ||||||
| @@ -128,4 +128,52 @@ public sealed class CsafNormalizerTests | |||||||
|         claim.AdditionalMetadata["csaf.tracking.status"].Should().Be("final"); |         claim.AdditionalMetadata["csaf.tracking.status"].Should().Be("final"); | ||||||
|         claim.AdditionalMetadata["csaf.publisher.name"].Should().Be("Red Hat Product Security"); |         claim.AdditionalMetadata["csaf.publisher.name"].Should().Be("Red Hat Product Security"); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task NormalizeAsync_MissingJustification_AddsPolicyDiagnostic() | ||||||
|  |     { | ||||||
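|  |         // A known_not_affected product with no justification should surface a policy.justification_missing diagnostic keyed by vulnerability and product. | ||||||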
|  |         var json = """ | ||||||
|  |         { | ||||||
|  |           "document": { | ||||||
|  |             "tracking": { | ||||||
|  |               "initial_release_date": "2025-10-02T00:00:00Z", | ||||||
|  |               "current_release_date": "2025-10-03T00:00:00Z" | ||||||
|  |             } | ||||||
|  |           }, | ||||||
|  |           "product_tree": { | ||||||
|  |             "full_product_names": [ | ||||||
|  |               { | ||||||
|  |                 "product_id": "pkg:example/app@1.0.0", | ||||||
|  |                 "name": "Example App" | ||||||
|  |               } | ||||||
|  |             ] | ||||||
|  |           }, | ||||||
|  |           "vulnerabilities": [ | ||||||
|  |             { | ||||||
|  |               "id": "VULN-1", | ||||||
|  |               "product_status": { | ||||||
|  |                 "known_not_affected": [ "pkg:example/app@1.0.0" ] | ||||||
|  |               } | ||||||
|  |             } | ||||||
|  |           ] | ||||||
|  |         } | ||||||
|  |         """; | ||||||
|  |  | ||||||
|  |         var rawDocument = new VexRawDocument( | ||||||
|  |             "excititor:example", | ||||||
|  |             VexDocumentFormat.Csaf, | ||||||
|  |             new Uri("https://example.com/csaf.json"), | ||||||
|  |             new DateTimeOffset(2025, 10, 4, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |             "sha256:digest", | ||||||
|  |             Encoding.UTF8.GetBytes(json), | ||||||
|  |             ImmutableDictionary<string, string>.Empty); | ||||||
|  |  | ||||||
|  |         var provider = new VexProvider("excititor:example", "Example CSAF", VexProviderKind.Vendor); | ||||||
|  |         var normalizer = new CsafNormalizer(NullLogger<CsafNormalizer>.Instance); | ||||||
|  |  | ||||||
|  |         var batch = await normalizer.NormalizeAsync(rawDocument, provider, CancellationToken.None); | ||||||
|  |  | ||||||
|  |         batch.Diagnostics.Should().ContainKey("policy.justification_missing"); | ||||||
|  |         batch.Diagnostics["policy.justification_missing"].Should().Contain("VULN-1:pkg:example/app@1.0.0"); | ||||||
|  |     } | ||||||
| } | } | ||||||
|   | |||||||
| @@ -0,0 +1,37 @@ | |||||||
|  | using System.Collections.Immutable; | ||||||
|  | using FluentAssertions; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  | using StellaOps.Excititor.Formats.CycloneDX; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.CycloneDX.Tests; | ||||||
|  |  | ||||||
|  | public sealed class CycloneDxComponentReconcilerTests | ||||||
|  | { | ||||||
|  |     [Fact] | ||||||
|  |     public void Reconcile_AssignsBomRefsAndDiagnostics() | ||||||
|  |     { | ||||||
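|  |         // The second claim has no PURL, so the reconciler should fall back to the product key and report a missing_purl diagnostic. | ||||||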
|  |         var claims = ImmutableArray.Create( | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-7000", | ||||||
|  |                 "vendor:one", | ||||||
|  |                 new VexProduct("pkg:demo/component@1.0.0", "Demo Component", "1.0.0", "pkg:demo/component@1.0.0"), | ||||||
|  |                 VexClaimStatus.Affected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.CycloneDx, "sha256:doc1", new Uri("https://example.com/vex/1")), | ||||||
|  |                 DateTimeOffset.UtcNow, | ||||||
|  |                 DateTimeOffset.UtcNow), | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-7000", | ||||||
|  |                 "vendor:two", | ||||||
|  |                 new VexProduct("component-key", "Component Key"), | ||||||
|  |                 VexClaimStatus.NotAffected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.CycloneDx, "sha256:doc2", new Uri("https://example.com/vex/2")), | ||||||
|  |                 DateTimeOffset.UtcNow, | ||||||
|  |                 DateTimeOffset.UtcNow)); | ||||||
|  |  | ||||||
|  |         var result = CycloneDxComponentReconciler.Reconcile(claims); | ||||||
|  |  | ||||||
|  |         result.Components.Should().HaveCount(2); | ||||||
|  |         result.ComponentRefs.Should().ContainKey(("CVE-2025-7000", "component-key")); | ||||||
|  |         result.Diagnostics.Keys.Should().Contain("missing_purl"); | ||||||
|  |     } | ||||||
|  | } | ||||||
| @@ -0,0 +1,47 @@ | |||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Text.Json; | ||||||
|  | using FluentAssertions; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  | using StellaOps.Excititor.Formats.CycloneDX; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.CycloneDX.Tests; | ||||||
|  |  | ||||||
|  | public sealed class CycloneDxExporterTests | ||||||
|  | { | ||||||
|  |     [Fact] | ||||||
|  |     public async Task SerializeAsync_WritesCycloneDxVexDocument() | ||||||
|  |     { | ||||||
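|  |         // A single fixed claim should yield one component, one vulnerability, and a sha256 output digest. | ||||||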
|  |         var claims = ImmutableArray.Create( | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-6000", | ||||||
|  |                 "vendor:demo", | ||||||
|  |                 new VexProduct("pkg:demo/component@1.2.3", "Demo Component", "1.2.3", "pkg:demo/component@1.2.3"), | ||||||
|  |                 VexClaimStatus.Fixed, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.CycloneDx, "sha256:doc1", new Uri("https://example.com/cyclonedx/1")), | ||||||
|  |                 new DateTimeOffset(2025, 10, 10, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 new DateTimeOffset(2025, 10, 11, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 detail: "Issue resolved in 1.2.3")); | ||||||
|  |  | ||||||
|  |         var request = new VexExportRequest( | ||||||
|  |             VexQuery.Empty, | ||||||
|  |             ImmutableArray<VexConsensus>.Empty, | ||||||
|  |             claims, | ||||||
|  |             new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero)); | ||||||
|  |  | ||||||
|  |         var exporter = new CycloneDxExporter(); | ||||||
|  |         await using var stream = new MemoryStream(); | ||||||
|  |         var result = await exporter.SerializeAsync(request, stream, CancellationToken.None); | ||||||
|  |  | ||||||
|  |         stream.Position = 0; | ||||||
|  |         using var document = JsonDocument.Parse(stream); | ||||||
|  |         var root = document.RootElement; | ||||||
|  |  | ||||||
|  |         root.GetProperty("bomFormat").GetString().Should().Be("CycloneDX"); | ||||||
|  |         root.GetProperty("components").EnumerateArray().Should().HaveCount(1); | ||||||
|  |         root.GetProperty("vulnerabilities").EnumerateArray().Should().HaveCount(1); | ||||||
|  |  | ||||||
|  |         result.Metadata.Should().ContainKey("cyclonedx.vulnerabilityCount"); | ||||||
|  |         result.Metadata["cyclonedx.componentCount"].Should().Be("1"); | ||||||
|  |         result.Digest.Algorithm.Should().Be("sha256"); | ||||||
|  |     } | ||||||
|  | } | ||||||
| @@ -0,0 +1,49 @@ | |||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Text.Json; | ||||||
|  | using FluentAssertions; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  | using StellaOps.Excititor.Formats.OpenVEX; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.OpenVEX.Tests; | ||||||
|  |  | ||||||
|  | public sealed class OpenVexExporterTests | ||||||
|  | { | ||||||
|  |     [Fact] | ||||||
|  |     public async Task SerializeAsync_ProducesCanonicalOpenVexDocument() | ||||||
|  |     { | ||||||
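|  |         // A single not_affected claim should serialize into one canonical OpenVEX statement that preserves the product id. | ||||||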
|  |         var claims = ImmutableArray.Create( | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-5000", | ||||||
|  |                 "vendor:alpha", | ||||||
|  |                 new VexProduct("pkg:alpha/app@2.0.0", "Alpha App", "2.0.0", "pkg:alpha/app@2.0.0"), | ||||||
|  |                 VexClaimStatus.NotAffected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.OpenVex, "sha256:doc1", new Uri("https://example.com/openvex/alpha")), | ||||||
|  |                 new DateTimeOffset(2025, 10, 10, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 new DateTimeOffset(2025, 10, 11, 0, 0, 0, TimeSpan.Zero), | ||||||
|  |                 justification: VexJustification.ComponentNotPresent, | ||||||
|  |                 detail: "Component not shipped.")); | ||||||
|  |  | ||||||
|  |         var request = new VexExportRequest( | ||||||
|  |             VexQuery.Empty, | ||||||
|  |             ImmutableArray<VexConsensus>.Empty, | ||||||
|  |             claims, | ||||||
|  |             new DateTimeOffset(2025, 10, 12, 0, 0, 0, TimeSpan.Zero)); | ||||||
|  |  | ||||||
|  |         var exporter = new OpenVexExporter(); | ||||||
|  |         await using var stream = new MemoryStream(); | ||||||
|  |         var result = await exporter.SerializeAsync(request, stream, CancellationToken.None); | ||||||
|  |  | ||||||
|  |         stream.Position = 0; | ||||||
|  |         using var document = JsonDocument.Parse(stream); | ||||||
|  |         var root = document.RootElement; | ||||||
|  |         root.GetProperty("document").GetProperty("author").GetString().Should().Be("StellaOps Excititor"); | ||||||
|  |         root.GetProperty("statements").GetArrayLength().Should().Be(1); | ||||||
|  |         var statement = root.GetProperty("statements")[0]; | ||||||
|  |         statement.GetProperty("status").GetString().Should().Be("not_affected"); | ||||||
|  |         statement.GetProperty("products")[0].GetProperty("id").GetString().Should().Be("pkg:alpha/app@2.0.0"); | ||||||
|  |  | ||||||
|  |         result.Metadata.Should().ContainKey("openvex.statementCount"); | ||||||
|  |         result.Metadata["openvex.statementCount"].Should().Be("1"); | ||||||
|  |         result.Digest.Algorithm.Should().Be("sha256"); | ||||||
|  |     } | ||||||
|  | } | ||||||
| @@ -0,0 +1,39 @@ | |||||||
|  | using System.Collections.Immutable; | ||||||
|  | using FluentAssertions; | ||||||
|  | using StellaOps.Excititor.Core; | ||||||
|  | using StellaOps.Excititor.Formats.OpenVEX; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Excititor.Formats.OpenVEX.Tests; | ||||||
|  |  | ||||||
|  | public sealed class OpenVexStatementMergerTests | ||||||
|  | { | ||||||
|  |     [Fact] | ||||||
|  |     public void Merge_DetectsConflictsAndSelectsCanonicalStatus() | ||||||
|  |     { | ||||||
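|  |         // Conflicting not_affected/affected claims for the same vulnerability and product should merge into one statement (affected selected) with an openvex.status_conflict diagnostic. | ||||||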
|  |         var claims = ImmutableArray.Create( | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-4000", | ||||||
|  |                 "vendor:one", | ||||||
|  |                 new VexProduct("pkg:demo/app@1.0.0", "Demo App", "1.0.0"), | ||||||
|  |                 VexClaimStatus.NotAffected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.OpenVex, "sha256:doc1", new Uri("https://example.com/openvex/1")), | ||||||
|  |                 DateTimeOffset.UtcNow, | ||||||
|  |                 DateTimeOffset.UtcNow, | ||||||
|  |                 justification: VexJustification.ComponentNotPresent), | ||||||
|  |             new VexClaim( | ||||||
|  |                 "CVE-2025-4000", | ||||||
|  |                 "vendor:two", | ||||||
|  |                 new VexProduct("pkg:demo/app@1.0.0", "Demo App", "1.0.0"), | ||||||
|  |                 VexClaimStatus.Affected, | ||||||
|  |                 new VexClaimDocument(VexDocumentFormat.OpenVex, "sha256:doc2", new Uri("https://example.com/openvex/2")), | ||||||
|  |                 DateTimeOffset.UtcNow, | ||||||
|  |                 DateTimeOffset.UtcNow)); | ||||||
|  |  | ||||||
|  |         var result = OpenVexStatementMerger.Merge(claims); | ||||||
|  |  | ||||||
|  |         result.Statements.Should().HaveCount(1); | ||||||
|  |         var statement = result.Statements[0]; | ||||||
|  |         statement.Status.Should().Be(VexClaimStatus.Affected); | ||||||
|  |         result.Diagnostics.Should().ContainKey("openvex.status_conflict"); | ||||||
|  |     } | ||||||
|  | } | ||||||
| @@ -256,6 +256,21 @@ internal static class JavaReflectionAnalyzer | |||||||
|                 instructionOffset, |                 instructionOffset, | ||||||
|                 null)); |                 null)); | ||||||
|         } |         } | ||||||
|  |         else if (normalizedOwner == "java/lang/Class" && (name == "getResource" || name == "getResourceAsStream")) | ||||||
|  |         { | ||||||
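|  |             // Class#getResource / getResourceAsStream calls emit ResourceLookup edges; confidence is Low when no pending string target was captured. | ||||||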
|  |             var target = pendingString; | ||||||
|  |             var confidence = pendingString is null ? JavaReflectionConfidence.Low : JavaReflectionConfidence.High; | ||||||
|  |             edges.Add(new JavaReflectionEdge( | ||||||
|  |                 normalizedSource, | ||||||
|  |                 segmentIdentifier, | ||||||
|  |                 target, | ||||||
|  |                 JavaReflectionReason.ResourceLookup, | ||||||
|  |                 confidence, | ||||||
|  |                 method.Name, | ||||||
|  |                 method.Descriptor, | ||||||
|  |                 instructionOffset, | ||||||
|  |                 null)); | ||||||
|  |         } | ||||||
|         else if (normalizedOwner == "java/lang/Thread" && name == "currentThread") |         else if (normalizedOwner == "java/lang/Thread" && name == "currentThread") | ||||||
|         { |         { | ||||||
|             sawCurrentThread = true; |             sawCurrentThread = true; | ||||||
|   | |||||||
| @@ -7,7 +7,7 @@ | |||||||
| | SCANNER-ANALYZERS-JAVA-21-001 | DONE (2025-10-27) | Java Analyzer Guild | SCANNER-CORE-09-501 | Build input normalizer and virtual file system for JAR/WAR/EAR/fat-jar/JMOD/jimage/container roots. Detect packaging type, layered dirs (BOOT-INF/WEB-INF), multi-release overlays, and jlink runtime metadata. | Normalizer walks fixtures without extraction, classifies packaging, selects MR overlays deterministically, records java version + vendor from runtime images. | | | SCANNER-ANALYZERS-JAVA-21-001 | DONE (2025-10-27) | Java Analyzer Guild | SCANNER-CORE-09-501 | Build input normalizer and virtual file system for JAR/WAR/EAR/fat-jar/JMOD/jimage/container roots. Detect packaging type, layered dirs (BOOT-INF/WEB-INF), multi-release overlays, and jlink runtime metadata. | Normalizer walks fixtures without extraction, classifies packaging, selects MR overlays deterministically, records java version + vendor from runtime images. | | ||||||
| | SCANNER-ANALYZERS-JAVA-21-002 | DONE (2025-10-27) | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-001 | Implement module/classpath builder: JPMS graph parser (`module-info.class`), classpath order rules (fat jar, war, ear), duplicate & split-package detection, package fingerprinting. | Classpath order reproduced for fixtures; module graph serialized; duplicate provider + split-package warnings emitted deterministically. | | | SCANNER-ANALYZERS-JAVA-21-002 | DONE (2025-10-27) | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-001 | Implement module/classpath builder: JPMS graph parser (`module-info.class`), classpath order rules (fat jar, war, ear), duplicate & split-package detection, package fingerprinting. | Classpath order reproduced for fixtures; module graph serialized; duplicate provider + split-package warnings emitted deterministically. | | ||||||
| | SCANNER-ANALYZERS-JAVA-21-003 | DONE (2025-10-27) | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | SPI scanner covering META-INF/services, provider selection, and warning generation. Include configurable SPI corpus (JDK, Spring, logging, Jackson, MicroProfile). | SPI tables produced with selected provider + candidates; fixtures show first-wins behaviour; warnings recorded for duplicate providers. | | | SCANNER-ANALYZERS-JAVA-21-003 | DONE (2025-10-27) | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | SPI scanner covering META-INF/services, provider selection, and warning generation. Include configurable SPI corpus (JDK, Spring, logging, Jackson, MicroProfile). | SPI tables produced with selected provider + candidates; fixtures show first-wins behaviour; warnings recorded for duplicate providers. | | ||||||
| | SCANNER-ANALYZERS-JAVA-21-004 | DOING (2025-10-27) | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | Reflection/dynamic loader heuristics: scan constant pools, bytecode sites (Class.forName, loadClass, TCCL usage), resource-based plugin hints, manifest loader hints. Emit edges with reason codes + confidence. | Reflection edges generated for fixtures (classpath, boot, war); includes call site metadata and confidence scoring; TCCL warning emitted where detected. | | | SCANNER-ANALYZERS-JAVA-21-004 | DONE (2025-10-29) | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | Reflection/dynamic loader heuristics: scan constant pools, bytecode sites (Class.forName, loadClass, TCCL usage), resource-based plugin hints, manifest loader hints. Emit edges with reason codes + confidence. | Reflection edges generated for fixtures (classpath, boot, war); includes call site metadata and confidence scoring; TCCL warning emitted where detected. | | ||||||
| | SCANNER-ANALYZERS-JAVA-21-005 | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | Framework config extraction: Spring Boot imports, spring.factories, application properties/yaml, Jakarta web.xml & fragments, JAX-RS/JPA/CDI/JAXB configs, logging files, Graal native-image configs. | Framework fixtures parsed; relevant class FQCNs surfaced with reasons (`config-spring`, `config-jaxrs`, etc.); non-class config ignored; determinism guard passes. | | | SCANNER-ANALYZERS-JAVA-21-005 | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | Framework config extraction: Spring Boot imports, spring.factories, application properties/yaml, Jakarta web.xml & fragments, JAX-RS/JPA/CDI/JAXB configs, logging files, Graal native-image configs. | Framework fixtures parsed; relevant class FQCNs surfaced with reasons (`config-spring`, `config-jaxrs`, etc.); non-class config ignored; determinism guard passes. | | ||||||
| | SCANNER-ANALYZERS-JAVA-21-006 | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | JNI/native hint scanner: detect native methods, System.load/Library literals, bundled native libs, Graal JNI configs; emit `jni-load` edges for native analyzer correlation. | JNI fixtures produce hint edges pointing at embedded libs; metadata includes candidate paths and reason `jni`. | | | SCANNER-ANALYZERS-JAVA-21-006 | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-002 | JNI/native hint scanner: detect native methods, System.load/Library literals, bundled native libs, Graal JNI configs; emit `jni-load` edges for native analyzer correlation. | JNI fixtures produce hint edges pointing at embedded libs; metadata includes candidate paths and reason `jni`. | | ||||||
| | SCANNER-ANALYZERS-JAVA-21-007 | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-003 | Signature and manifest metadata collector: verify JAR signature structure, capture signers, manifest loader attributes (Main-Class, Agent-Class, Start-Class, Class-Path). | Signed jar fixture reports signer info and structural validation result; manifest metadata attached to entrypoints. | | | SCANNER-ANALYZERS-JAVA-21-007 | TODO | Java Analyzer Guild | SCANNER-ANALYZERS-JAVA-21-003 | Signature and manifest metadata collector: verify JAR signature structure, capture signers, manifest loader attributes (Main-Class, Agent-Class, Start-Class, Class-Path). | Signed jar fixture reports signer info and structural validation result; manifest metadata attached to entrypoints. | | ||||||
|   | |||||||
| @@ -1,6 +1,5 @@ | |||||||
| using System.Collections.Immutable; | using System.Collections.Immutable; | ||||||
| using System.Linq; | using System.Linq; | ||||||
| using System.Security.Cryptography; |  | ||||||
|  |  | ||||||
| namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | ||||||
|  |  | ||||||
| @@ -30,6 +29,7 @@ internal static class RustAnalyzerCollector | |||||||
|         private readonly Dictionary<string, List<RustCrateBuilder>> _cratesByName = new(StringComparer.Ordinal); |         private readonly Dictionary<string, List<RustCrateBuilder>> _cratesByName = new(StringComparer.Ordinal); | ||||||
|         private readonly Dictionary<string, RustHeuristicBuilder> _heuristics = new(StringComparer.Ordinal); |         private readonly Dictionary<string, RustHeuristicBuilder> _heuristics = new(StringComparer.Ordinal); | ||||||
|         private readonly Dictionary<string, RustBinaryRecord> _binaries = new(StringComparer.Ordinal); |         private readonly Dictionary<string, RustBinaryRecord> _binaries = new(StringComparer.Ordinal); | ||||||
|  |         private RustLicenseIndex _licenseIndex = RustLicenseIndex.Empty; | ||||||
|  |  | ||||||
|         public Collector(LanguageAnalyzerContext context) |         public Collector(LanguageAnalyzerContext context) | ||||||
|         { |         { | ||||||
| @@ -38,6 +38,7 @@ internal static class RustAnalyzerCollector | |||||||
|  |  | ||||||
|         public void Execute(CancellationToken cancellationToken) |         public void Execute(CancellationToken cancellationToken) | ||||||
|         { |         { | ||||||
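|  |             // Build the license index for this analysis root up front so crate builders created below can attach license metadata. | ||||||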
|  |             _licenseIndex = RustLicenseScanner.GetOrCreate(_context.RootPath, cancellationToken); | ||||||
|             CollectCargoLocks(cancellationToken); |             CollectCargoLocks(cancellationToken); | ||||||
|             CollectFingerprints(cancellationToken); |             CollectFingerprints(cancellationToken); | ||||||
|             CollectBinaries(cancellationToken); |             CollectBinaries(cancellationToken); | ||||||
| @@ -81,6 +82,7 @@ internal static class RustAnalyzerCollector | |||||||
|                 { |                 { | ||||||
|                     var builder = GetOrCreateCrate(package.Name, package.Version); |                     var builder = GetOrCreateCrate(package.Name, package.Version); | ||||||
|                     builder.ApplyCargoPackage(package, relativePath); |                     builder.ApplyCargoPackage(package, relativePath); | ||||||
|  |                     TryApplyLicense(builder); | ||||||
|                 } |                 } | ||||||
|             } |             } | ||||||
|         } |         } | ||||||
| @@ -95,6 +97,7 @@ internal static class RustAnalyzerCollector | |||||||
|                 var builder = GetOrCreateCrate(record.Name, record.Version); |                 var builder = GetOrCreateCrate(record.Name, record.Version); | ||||||
|                 var relative = NormalizeRelative(_context.GetRelativePath(record.AbsolutePath)); |                 var relative = NormalizeRelative(_context.GetRelativePath(record.AbsolutePath)); | ||||||
|                 builder.ApplyFingerprint(record, relative); |                 builder.ApplyFingerprint(record, relative); | ||||||
|  |                 TryApplyLicense(builder); | ||||||
|             } |             } | ||||||
|         } |         } | ||||||
|  |  | ||||||
| @@ -153,6 +156,10 @@ internal static class RustAnalyzerCollector | |||||||
|                 if (_crates.TryGetValue(key, out var existing)) |                 if (_crates.TryGetValue(key, out var existing)) | ||||||
|                 { |                 { | ||||||
|                     existing.EnsureVersion(version); |                     existing.EnsureVersion(version); | ||||||
|  |                     if (!existing.HasLicenseMetadata) | ||||||
|  |                     { | ||||||
|  |                         TryApplyLicense(existing); | ||||||
|  |                     } | ||||||
|                     return existing; |                     return existing; | ||||||
|                 } |                 } | ||||||
|  |  | ||||||
| @@ -166,6 +173,7 @@ internal static class RustAnalyzerCollector | |||||||
|                 } |                 } | ||||||
|  |  | ||||||
|                 list.Add(builder); |                 list.Add(builder); | ||||||
|  |                 TryApplyLicense(builder); | ||||||
|                 return builder; |                 return builder; | ||||||
|             } |             } | ||||||
|  |  | ||||||
| @@ -250,6 +258,15 @@ internal static class RustAnalyzerCollector | |||||||
|  |  | ||||||
|             return relativePath.Replace('\\', '/'); |             return relativePath.Replace('\\', '/'); | ||||||
|         } |         } | ||||||
|  |  | ||||||
|  |         private void TryApplyLicense(RustCrateBuilder builder) | ||||||
|  |         { | ||||||
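|  |             // Apply any license metadata recorded for this crate name/version; no-op when the scan found nothing. | ||||||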
|  |             var info = _licenseIndex.Find(builder.Name, builder.Version); | ||||||
|  |             if (info is not null) | ||||||
|  |             { | ||||||
|  |                 builder.ApplyLicense(info); | ||||||
|  |             } | ||||||
|  |         } | ||||||
|     } |     } | ||||||
| } | } | ||||||
|  |  | ||||||
| @@ -274,6 +291,8 @@ internal sealed class RustCrateBuilder | |||||||
|     private readonly HashSet<LanguageComponentEvidence> _evidence = new(new LanguageComponentEvidenceComparer()); |     private readonly HashSet<LanguageComponentEvidence> _evidence = new(new LanguageComponentEvidenceComparer()); | ||||||
|     private readonly SortedSet<string> _binaryPaths = new(StringComparer.Ordinal); |     private readonly SortedSet<string> _binaryPaths = new(StringComparer.Ordinal); | ||||||
|     private readonly SortedSet<string> _binaryHashes = new(StringComparer.Ordinal); |     private readonly SortedSet<string> _binaryHashes = new(StringComparer.Ordinal); | ||||||
|  |     private readonly SortedSet<string> _licenseExpressions = new(StringComparer.OrdinalIgnoreCase); | ||||||
|  |     private readonly SortedDictionary<string, string?> _licenseFiles = new(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|     private string? _version; |     private string? _version; | ||||||
|     private string? _source; |     private string? _source; | ||||||
| @@ -290,6 +309,8 @@ internal sealed class RustCrateBuilder | |||||||
|  |  | ||||||
|     public string? Version => _version; |     public string? Version => _version; | ||||||
|  |  | ||||||
|  |     public bool HasLicenseMetadata => _licenseExpressions.Count > 0 || _licenseFiles.Count > 0; | ||||||
|  |  | ||||||
|     public static string NormalizeName(string value) |     public static string NormalizeName(string value) | ||||||
|     { |     { | ||||||
|         if (string.IsNullOrWhiteSpace(value)) |         if (string.IsNullOrWhiteSpace(value)) | ||||||
| @@ -399,9 +420,40 @@ internal sealed class RustCrateBuilder | |||||||
|  |  | ||||||
|         var metadata = _metadata |         var metadata = _metadata | ||||||
|             .Select(static pair => new KeyValuePair<string, string?>(pair.Key, pair.Value)) |             .Select(static pair => new KeyValuePair<string, string?>(pair.Key, pair.Value)) | ||||||
|             .OrderBy(static pair => pair.Key, StringComparer.Ordinal) |  | ||||||
|             .ToList(); |             .ToList(); | ||||||
|  |  | ||||||
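|  |         // Emit license expressions and license files as indexed metadata entries; the sort below keeps ordering deterministic. | ||||||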
|  |         if (_licenseExpressions.Count > 0) | ||||||
|  |         { | ||||||
|  |             var index = 0; | ||||||
|  |             foreach (var expression in _licenseExpressions) | ||||||
|  |             { | ||||||
|  |                 if (string.IsNullOrWhiteSpace(expression)) | ||||||
|  |                 { | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 metadata.Add(new KeyValuePair<string, string?>($"license.expression[{index}]", expression)); | ||||||
|  |                 index++; | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         if (_licenseFiles.Count > 0) | ||||||
|  |         { | ||||||
|  |             var index = 0; | ||||||
|  |             foreach (var pair in _licenseFiles) | ||||||
|  |             { | ||||||
|  |                 metadata.Add(new KeyValuePair<string, string?>($"license.file[{index}]", pair.Key)); | ||||||
|  |                 if (!string.IsNullOrWhiteSpace(pair.Value)) | ||||||
|  |                 { | ||||||
|  |                     metadata.Add(new KeyValuePair<string, string?>($"license.file.sha256[{index}]", pair.Value)); | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 index++; | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         metadata.Sort(static (left, right) => string.CompareOrdinal(left.Key, right.Key)); | ||||||
|  |  | ||||||
|         var evidence = _evidence |         var evidence = _evidence | ||||||
|             .OrderBy(static item => item.ComparisonKey, StringComparer.Ordinal) |             .OrderBy(static item => item.ComparisonKey, StringComparer.Ordinal) | ||||||
|             .ToImmutableArray(); |             .ToImmutableArray(); | ||||||
| @@ -422,6 +474,45 @@ internal sealed class RustCrateBuilder | |||||||
|             UsedByEntrypoint: _usedByEntrypoint); |             UsedByEntrypoint: _usedByEntrypoint); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     public void ApplyLicense(RustLicenseInfo info) | ||||||
|  |     { | ||||||
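|  |         // Accumulate license expressions and file references, normalising path separators and keeping the first hash recorded per file. | ||||||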
|  |         if (info is null) | ||||||
|  |         { | ||||||
|  |             return; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         foreach (var expression in info.Expressions) | ||||||
|  |         { | ||||||
|  |             if (!string.IsNullOrWhiteSpace(expression)) | ||||||
|  |             { | ||||||
|  |                 _licenseExpressions.Add(expression.Trim()); | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         foreach (var file in info.Files) | ||||||
|  |         { | ||||||
|  |             if (string.IsNullOrWhiteSpace(file.RelativePath)) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var normalized = file.RelativePath.Replace('\\', '/'); | ||||||
|  |             if (_licenseFiles.ContainsKey(normalized)) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (string.IsNullOrWhiteSpace(file.Sha256)) | ||||||
|  |             { | ||||||
|  |                 _licenseFiles[normalized] = null; | ||||||
|  |             } | ||||||
|  |             else | ||||||
|  |             { | ||||||
|  |                 _licenseFiles[normalized] = file.Sha256!.Trim(); | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|     private void AddMetadataIfEmpty(string key, string? value) |     private void AddMetadataIfEmpty(string key, string? value) | ||||||
|     { |     { | ||||||
|         if (string.IsNullOrWhiteSpace(key) || string.IsNullOrWhiteSpace(value)) |         if (string.IsNullOrWhiteSpace(key) || string.IsNullOrWhiteSpace(value)) | ||||||
| @@ -577,28 +668,14 @@ internal sealed class RustBinaryRecord | |||||||
|             _hash ??= hash; |             _hash ??= hash; | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         if (_hash is null) |         if (!string.IsNullOrEmpty(_hash)) | ||||||
|         { |         { | ||||||
|             _hash = ComputeHashSafely(); |             return; | ||||||
|         } |  | ||||||
|         } |         } | ||||||
|  |  | ||||||
|     private string? ComputeHashSafely() |         if (RustFileHashCache.TryGetSha256(AbsolutePath, out var computed) && !string.IsNullOrEmpty(computed)) | ||||||
|         { |         { | ||||||
|         try |             _hash = computed; | ||||||
|         { |  | ||||||
|             using var stream = new FileStream(AbsolutePath, FileMode.Open, FileAccess.Read, FileShare.Read); |  | ||||||
|             using var sha = SHA256.Create(); |  | ||||||
|             var hash = sha.ComputeHash(stream); |  | ||||||
|             return Convert.ToHexString(hash).ToLowerInvariant(); |  | ||||||
|         } |  | ||||||
|         catch (IOException) |  | ||||||
|         { |  | ||||||
|             return null; |  | ||||||
|         } |  | ||||||
|         catch (UnauthorizedAccessException) |  | ||||||
|         { |  | ||||||
|             return null; |  | ||||||
|         } |         } | ||||||
|     } |     } | ||||||
| } | } | ||||||
|   | |||||||
| @@ -1,7 +1,7 @@ | |||||||
| using System.Buffers; | using System.Buffers; | ||||||
|  | using System.Collections.Concurrent; | ||||||
| using System.Collections.Immutable; | using System.Collections.Immutable; | ||||||
| using System.Linq; | using System.Linq; | ||||||
| using System.Security.Cryptography; |  | ||||||
| using System.Text; | using System.Text; | ||||||
|  |  | ||||||
| namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | ||||||
| @@ -32,6 +32,8 @@ internal static class RustBinaryClassifier | |||||||
|         AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint, |         AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint, | ||||||
|     }; |     }; | ||||||
|  |  | ||||||
|  |     private static readonly ConcurrentDictionary<RustFileCacheKey, ImmutableArray<string>> CandidateCache = new(); | ||||||
|  |  | ||||||
|     public static IReadOnlyList<RustBinaryInfo> Scan(string rootPath, CancellationToken cancellationToken) |     public static IReadOnlyList<RustBinaryInfo> Scan(string rootPath, CancellationToken cancellationToken) | ||||||
|     { |     { | ||||||
|         if (string.IsNullOrWhiteSpace(rootPath)) |         if (string.IsNullOrWhiteSpace(rootPath)) | ||||||
| @@ -49,7 +51,16 @@ internal static class RustBinaryClassifier | |||||||
|                 continue; |                 continue; | ||||||
|             } |             } | ||||||
|  |  | ||||||
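|  |             // Crate-name extraction is memoised per file cache key so repeat scans skip re-reading the binary. | ||||||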
|             var candidates = ExtractCrateNames(path, cancellationToken); |             if (!RustFileCacheKey.TryCreate(path, out var key)) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var candidates = CandidateCache.GetOrAdd( | ||||||
|  |                 key, | ||||||
|  |                 static (_, state) => ExtractCrateNames(state.Path, state.CancellationToken), | ||||||
|  |                 (Path: path, CancellationToken: cancellationToken)); | ||||||
|  |  | ||||||
|             binaries.Add(new RustBinaryInfo(path, candidates)); |             binaries.Add(new RustBinaryInfo(path, candidates)); | ||||||
|         } |         } | ||||||
|  |  | ||||||
| @@ -220,31 +231,13 @@ internal static class RustBinaryClassifier | |||||||
|  |  | ||||||
| internal sealed record RustBinaryInfo(string AbsolutePath, ImmutableArray<string> CrateCandidates) | internal sealed record RustBinaryInfo(string AbsolutePath, ImmutableArray<string> CrateCandidates) | ||||||
| { | { | ||||||
|     private string? _sha256; |  | ||||||
|  |  | ||||||
|     public string ComputeSha256() |     public string ComputeSha256() | ||||||
|     { |     { | ||||||
|         if (_sha256 is not null) |         if (RustFileHashCache.TryGetSha256(AbsolutePath, out var sha256) && !string.IsNullOrEmpty(sha256)) | ||||||
|         { |         { | ||||||
|             return _sha256; |             return sha256; | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         try |         return string.Empty; | ||||||
|         { |  | ||||||
|             using var stream = new FileStream(AbsolutePath, FileMode.Open, FileAccess.Read, FileShare.Read); |  | ||||||
|             using var sha = SHA256.Create(); |  | ||||||
|             var hash = sha.ComputeHash(stream); |  | ||||||
|             _sha256 = Convert.ToHexString(hash).ToLowerInvariant(); |  | ||||||
|         } |  | ||||||
|         catch (IOException) |  | ||||||
|         { |  | ||||||
|             _sha256 = string.Empty; |  | ||||||
|         } |  | ||||||
|         catch (UnauthorizedAccessException) |  | ||||||
|         { |  | ||||||
|             _sha256 = string.Empty; |  | ||||||
|         } |  | ||||||
|  |  | ||||||
|         return _sha256 ?? string.Empty; |  | ||||||
|     } |     } | ||||||
| } | } | ||||||
|   | |||||||
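The classifier (and the Cargo.lock / fingerprint parsers further down) memoises per-file work through `ConcurrentDictionary.GetOrAdd` with the state-passing overload, so the value factory can stay `static` and avoids capturing `path`/`cancellationToken` in a closure on the hot scan loop. A minimal sketch of that pattern in isolation; the cache name, key type, and `Extract` helper here are illustrative stand-ins, not code from this commit:

```csharp
using System.Collections.Concurrent;
using System.Collections.Immutable;
using System.Threading;

internal static class GetOrAddPatternSketch
{
    // Stand-in key/value types; the real cache keys on RustFileCacheKey (introduced below).
    private static readonly ConcurrentDictionary<string, ImmutableArray<string>> Cache = new();

    public static ImmutableArray<string> Get(string path, CancellationToken cancellationToken)
        // Func<TKey, TArg, TValue> overload: the lambda is static and state rides in the tuple,
        // so no per-call closure allocation.
        => Cache.GetOrAdd(
            path,
            static (_, state) => Extract(state.Path, state.CancellationToken),
            (Path: path, CancellationToken: cancellationToken));

    private static ImmutableArray<string> Extract(string path, CancellationToken cancellationToken)
        => ImmutableArray<string>.Empty; // placeholder for the real extraction work
}
```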
| @@ -1,7 +1,12 @@ | |||||||
|  | using System.Collections.Concurrent; | ||||||
|  | using System.Collections.Immutable; | ||||||
|  |  | ||||||
| namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | ||||||
|  |  | ||||||
| internal static class RustCargoLockParser | internal static class RustCargoLockParser | ||||||
| { | { | ||||||
|  |     private static readonly ConcurrentDictionary<RustFileCacheKey, ImmutableArray<RustCargoPackage>> Cache = new(); | ||||||
|  |  | ||||||
|     public static IReadOnlyList<RustCargoPackage> Parse(string path, CancellationToken cancellationToken) |     public static IReadOnlyList<RustCargoPackage> Parse(string path, CancellationToken cancellationToken) | ||||||
|     { |     { | ||||||
|         if (string.IsNullOrWhiteSpace(path)) |         if (string.IsNullOrWhiteSpace(path)) | ||||||
| @@ -9,17 +14,26 @@ internal static class RustCargoLockParser | |||||||
|             throw new ArgumentException("Lock path is required", nameof(path)); |             throw new ArgumentException("Lock path is required", nameof(path)); | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         var info = new FileInfo(path); |         if (!RustFileCacheKey.TryCreate(path, out var key)) | ||||||
|         if (!info.Exists) |  | ||||||
|         { |         { | ||||||
|             return Array.Empty<RustCargoPackage>(); |             return Array.Empty<RustCargoPackage>(); | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         var packages = new List<RustCargoPackage>(); |         var packages = Cache.GetOrAdd( | ||||||
|  |             key, | ||||||
|  |             static (_, state) => ParseInternal(state.Path, state.CancellationToken), | ||||||
|  |             (Path: path, CancellationToken: cancellationToken)); | ||||||
|  |  | ||||||
|  |         return packages.IsDefaultOrEmpty ? Array.Empty<RustCargoPackage>() : packages; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static ImmutableArray<RustCargoPackage> ParseInternal(string path, CancellationToken cancellationToken) | ||||||
|  |     { | ||||||
|  |         var resultBuilder = ImmutableArray.CreateBuilder<RustCargoPackage>(); | ||||||
|  |  | ||||||
|         using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read); |         using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read); | ||||||
|         using var reader = new StreamReader(stream); |         using var reader = new StreamReader(stream); | ||||||
|  |         RustCargoPackageBuilder? packageBuilder = null; | ||||||
|         RustCargoPackageBuilder? builder = null; |  | ||||||
|         string? currentArrayKey = null; |         string? currentArrayKey = null; | ||||||
|         var arrayValues = new List<string>(); |         var arrayValues = new List<string>(); | ||||||
|  |  | ||||||
| @@ -41,14 +55,14 @@ internal static class RustCargoLockParser | |||||||
|  |  | ||||||
|             if (IsPackageHeader(trimmed)) |             if (IsPackageHeader(trimmed)) | ||||||
|             { |             { | ||||||
|                 FlushCurrent(builder, packages); |                 FlushCurrent(packageBuilder, resultBuilder); | ||||||
|                 builder = new RustCargoPackageBuilder(); |                 packageBuilder = new RustCargoPackageBuilder(); | ||||||
|                 currentArrayKey = null; |                 currentArrayKey = null; | ||||||
|                 arrayValues.Clear(); |                 arrayValues.Clear(); | ||||||
|                 continue; |                 continue; | ||||||
|             } |             } | ||||||
|  |  | ||||||
|             if (builder is null) |             if (packageBuilder is null) | ||||||
|             { |             { | ||||||
|                 continue; |                 continue; | ||||||
|             } |             } | ||||||
| @@ -113,7 +127,7 @@ internal static class RustCargoLockParser | |||||||
|                         } |                         } | ||||||
|                     } |                     } | ||||||
|  |  | ||||||
|                     builder.SetArray(currentArrayKey, arrayValues); |                     packageBuilder.SetArray(currentArrayKey, arrayValues); | ||||||
|                     currentArrayKey = null; |                     currentArrayKey = null; | ||||||
|                     arrayValues.Clear(); |                     arrayValues.Clear(); | ||||||
|                 } |                 } | ||||||
| @@ -124,17 +138,17 @@ internal static class RustCargoLockParser | |||||||
|             var parsed = ExtractString(valuePart); |             var parsed = ExtractString(valuePart); | ||||||
|             if (parsed is not null) |             if (parsed is not null) | ||||||
|             { |             { | ||||||
|                 builder.SetField(key, parsed); |                 packageBuilder.SetField(key, parsed); | ||||||
|             } |             } | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         if (currentArrayKey is not null && arrayValues.Count > 0) |         if (currentArrayKey is not null && arrayValues.Count > 0) | ||||||
|         { |         { | ||||||
|             builder?.SetArray(currentArrayKey, arrayValues); |             packageBuilder?.SetArray(currentArrayKey, arrayValues); | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         FlushCurrent(builder, packages); |         FlushCurrent(packageBuilder, resultBuilder); | ||||||
|         return packages; |         return resultBuilder.ToImmutable(); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|     private static ReadOnlySpan<char> TrimComments(ReadOnlySpan<char> line) |     private static ReadOnlySpan<char> TrimComments(ReadOnlySpan<char> line) | ||||||
| @@ -204,14 +218,14 @@ internal static class RustCargoLockParser | |||||||
|         return trimmed.Length == 0 ? null : trimmed.ToString(); |         return trimmed.Length == 0 ? null : trimmed.ToString(); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|     private static void FlushCurrent(RustCargoPackageBuilder? builder, List<RustCargoPackage> packages) |     private static void FlushCurrent(RustCargoPackageBuilder? packageBuilder, ImmutableArray<RustCargoPackage>.Builder packages) | ||||||
|     { |     { | ||||||
|         if (builder is null || !builder.HasData) |         if (packageBuilder is null || !packageBuilder.HasData) | ||||||
|         { |         { | ||||||
|             return; |             return; | ||||||
|         } |         } | ||||||
|  |  | ||||||
|         if (builder.TryBuild(out var package)) |         if (packageBuilder.TryBuild(out var package)) | ||||||
|         { |         { | ||||||
|             packages.Add(package); |             packages.Add(package); | ||||||
|         } |         } | ||||||
|   | |||||||
| @@ -0,0 +1,74 @@ | |||||||
|  | using System.Security; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | ||||||
|  |  | ||||||
|  | internal readonly struct RustFileCacheKey : IEquatable<RustFileCacheKey> | ||||||
|  | { | ||||||
|  |     private readonly string _normalizedPath; | ||||||
|  |     private readonly long _length; | ||||||
|  |     private readonly long _lastWriteTicks; | ||||||
|  |  | ||||||
|  |     private RustFileCacheKey(string normalizedPath, long length, long lastWriteTicks) | ||||||
|  |     { | ||||||
|  |         _normalizedPath = normalizedPath; | ||||||
|  |         _length = length; | ||||||
|  |         _lastWriteTicks = lastWriteTicks; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public static bool TryCreate(string path, out RustFileCacheKey key) | ||||||
|  |     { | ||||||
|  |         key = default; | ||||||
|  |  | ||||||
|  |         if (string.IsNullOrWhiteSpace(path)) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         try | ||||||
|  |         { | ||||||
|  |             var info = new FileInfo(path); | ||||||
|  |             if (!info.Exists) | ||||||
|  |             { | ||||||
|  |                 return false; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var normalizedPath = OperatingSystem.IsWindows() | ||||||
|  |                 ? info.FullName.ToLowerInvariant() | ||||||
|  |                 : info.FullName; | ||||||
|  |  | ||||||
|  |             key = new RustFileCacheKey(normalizedPath, info.Length, info.LastWriteTimeUtc.Ticks); | ||||||
|  |             return true; | ||||||
|  |         } | ||||||
|  |         catch (IOException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (UnauthorizedAccessException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (SecurityException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (ArgumentException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (NotSupportedException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public bool Equals(RustFileCacheKey other) | ||||||
|  |         => _length == other._length | ||||||
|  |            && _lastWriteTicks == other._lastWriteTicks | ||||||
|  |            && string.Equals(_normalizedPath, other._normalizedPath, StringComparison.Ordinal); | ||||||
|  |  | ||||||
|  |     public override bool Equals(object? obj) | ||||||
|  |         => obj is RustFileCacheKey other && Equals(other); | ||||||
|  |  | ||||||
|  |     public override int GetHashCode() | ||||||
|  |         => HashCode.Combine(_normalizedPath, _length, _lastWriteTicks); | ||||||
|  | } | ||||||
| @@ -0,0 +1,45 @@ | |||||||
|  | using System.Collections.Concurrent; | ||||||
|  | using System.Security; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | ||||||
|  |  | ||||||
|  | internal static class RustFileHashCache | ||||||
|  | { | ||||||
|  |     private static readonly ConcurrentDictionary<RustFileCacheKey, string> Sha256Cache = new(); | ||||||
|  |  | ||||||
|  |     public static bool TryGetSha256(string path, out string? sha256) | ||||||
|  |     { | ||||||
|  |         sha256 = null; | ||||||
|  |  | ||||||
|  |         if (!RustFileCacheKey.TryCreate(path, out var key)) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         try | ||||||
|  |         { | ||||||
|  |             sha256 = Sha256Cache.GetOrAdd(key, static (_, state) => ComputeSha256(state), path); | ||||||
|  |             return !string.IsNullOrEmpty(sha256); | ||||||
|  |         } | ||||||
|  |         catch (IOException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (UnauthorizedAccessException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (SecurityException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string ComputeSha256(string path) | ||||||
|  |     { | ||||||
|  |         using var stream = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read); | ||||||
|  |         using var sha = System.Security.Cryptography.SHA256.Create(); | ||||||
|  |         var hash = sha.ComputeHash(stream); | ||||||
|  |         return Convert.ToHexString(hash).ToLowerInvariant(); | ||||||
|  |     } | ||||||
|  | } | ||||||
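Taken together, the two new files above give the Rust analyzer a process-wide, staleness-aware memo: `RustFileCacheKey` identifies a file by normalized full path plus length and last-write ticks (a rewritten file yields a new key), and `RustFileHashCache` stores one lowercase-hex SHA-256 per key. A hedged usage sketch based only on the signatures introduced above; the wrapper class is illustrative:

```csharp
using StellaOps.Scanner.Analyzers.Lang.Rust.Internal;

internal static class HashLookupSketch
{
    public static string ResolveSha256(string path)
    {
        // A miss (missing or unreadable file, I/O failure) degrades to an empty string,
        // mirroring how the reworked RustBinaryInfo.ComputeSha256 behaves in the diff above.
        return RustFileHashCache.TryGetSha256(path, out var sha256) && !string.IsNullOrEmpty(sha256)
            ? sha256 // lowercase hex digest
            : string.Empty;
    }
}
```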
| @@ -1,3 +1,4 @@ | |||||||
|  | using System.Collections.Concurrent; | ||||||
| using System.Text.Json; | using System.Text.Json; | ||||||
|  |  | ||||||
| namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | ||||||
| @@ -13,6 +14,7 @@ internal static class RustFingerprintScanner | |||||||
|     }; |     }; | ||||||
|  |  | ||||||
|     private static readonly string FingerprintSegment = $"{Path.DirectorySeparatorChar}.fingerprint{Path.DirectorySeparatorChar}"; |     private static readonly string FingerprintSegment = $"{Path.DirectorySeparatorChar}.fingerprint{Path.DirectorySeparatorChar}"; | ||||||
|  |     private static readonly ConcurrentDictionary<RustFileCacheKey, RustFingerprintRecord?> Cache = new(); | ||||||
|  |  | ||||||
|     public static IReadOnlyList<RustFingerprintRecord> Scan(string rootPath, CancellationToken cancellationToken) |     public static IReadOnlyList<RustFingerprintRecord> Scan(string rootPath, CancellationToken cancellationToken) | ||||||
|     { |     { | ||||||
| @@ -31,7 +33,17 @@ internal static class RustFingerprintScanner | |||||||
|                 continue; |                 continue; | ||||||
|             } |             } | ||||||
|  |  | ||||||
|             if (TryParse(path, out var record)) |             if (!RustFileCacheKey.TryCreate(path, out var key)) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var record = Cache.GetOrAdd( | ||||||
|  |                 key, | ||||||
|  |                 static (_, state) => ParseFingerprint(state), | ||||||
|  |                 path); | ||||||
|  |  | ||||||
|  |             if (record is not null) | ||||||
|             { |             { | ||||||
|                 results.Add(record); |                 results.Add(record); | ||||||
|             } |             } | ||||||
| @@ -40,10 +52,8 @@ internal static class RustFingerprintScanner | |||||||
|         return results; |         return results; | ||||||
|     } |     } | ||||||
|  |  | ||||||
|     private static bool TryParse(string path, out RustFingerprintRecord record) |     private static RustFingerprintRecord? ParseFingerprint(string path) | ||||||
|     { |     { | ||||||
|         record = default!; |  | ||||||
|  |  | ||||||
|         try |         try | ||||||
|         { |         { | ||||||
|             using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read); |             using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read); | ||||||
| @@ -57,33 +67,31 @@ internal static class RustFingerprintScanner | |||||||
|             var (name, version, source) = ParseIdentity(pkgId, path); |             var (name, version, source) = ParseIdentity(pkgId, path); | ||||||
|             if (string.IsNullOrWhiteSpace(name)) |             if (string.IsNullOrWhiteSpace(name)) | ||||||
|             { |             { | ||||||
|                 return false; |                 return null; | ||||||
|             } |             } | ||||||
|  |  | ||||||
|             var profile = TryGetString(root, "profile"); |             var profile = TryGetString(root, "profile"); | ||||||
|             var targetKind = TryGetKind(root); |             var targetKind = TryGetKind(root); | ||||||
|  |  | ||||||
|             record = new RustFingerprintRecord( |             return new RustFingerprintRecord( | ||||||
|                 Name: name!, |                 Name: name!, | ||||||
|                 Version: version, |                 Version: version, | ||||||
|                 Source: source, |                 Source: source, | ||||||
|                 TargetKind: targetKind, |                 TargetKind: targetKind, | ||||||
|                 Profile: profile, |                 Profile: profile, | ||||||
|                 AbsolutePath: path); |                 AbsolutePath: path); | ||||||
|  |  | ||||||
|             return true; |  | ||||||
|         } |         } | ||||||
|         catch (JsonException) |         catch (JsonException) | ||||||
|         { |         { | ||||||
|             return false; |             return null; | ||||||
|         } |         } | ||||||
|         catch (IOException) |         catch (IOException) | ||||||
|         { |         { | ||||||
|             return false; |             return null; | ||||||
|         } |         } | ||||||
|         catch (UnauthorizedAccessException) |         catch (UnauthorizedAccessException) | ||||||
|         { |         { | ||||||
|             return false; |             return null; | ||||||
|         } |         } | ||||||
|     } |     } | ||||||
|  |  | ||||||
|   | |||||||
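One behavioural detail worth calling out: the fingerprint cache stores `RustFingerprintRecord?`, so a file that fails to parse is cached as `null` and skipped on later scans instead of being re-read; only a changed length or last-write time (hence a new `RustFileCacheKey`) forces a re-parse. A compressed, self-contained sketch of that negative-caching shape, using hypothetical types rather than the scanner's own:

```csharp
using System.Collections.Concurrent;
using System.IO;

internal static class NegativeCacheSketch
{
    private sealed record Fingerprint(string Name);

    private static readonly ConcurrentDictionary<string, Fingerprint?> Cache = new();

    public static Fingerprint? Get(string path)
        // A parse failure is cached as null, so the file is not re-read on later scans;
        // the real scanner keys on RustFileCacheKey, so a modified file gets a fresh entry.
        => Cache.GetOrAdd(path, static p => TryParse(p));

    private static Fingerprint? TryParse(string path)
        => File.Exists(path) ? new Fingerprint(Path.GetFileName(path)) : null;
}
```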
| @@ -0,0 +1,298 @@ | |||||||
|  | using System.Collections.Concurrent; | ||||||
|  | using System.Collections.Immutable; | ||||||
|  | using System.Security; | ||||||
|  |  | ||||||
|  | namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal; | ||||||
|  |  | ||||||
|  | internal static class RustLicenseScanner | ||||||
|  | { | ||||||
|  |     private static readonly ConcurrentDictionary<string, RustLicenseIndex> IndexCache = new(StringComparer.Ordinal); | ||||||
|  |  | ||||||
|  |     public static RustLicenseIndex GetOrCreate(string rootPath, CancellationToken cancellationToken) | ||||||
|  |     { | ||||||
|  |         if (string.IsNullOrWhiteSpace(rootPath) || !Directory.Exists(rootPath)) | ||||||
|  |         { | ||||||
|  |             return RustLicenseIndex.Empty; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var normalizedRoot = NormalizeRoot(rootPath); | ||||||
|  |         return IndexCache.GetOrAdd( | ||||||
|  |             normalizedRoot, | ||||||
|  |             static (_, state) => BuildIndex(state.RootPath, state.CancellationToken), | ||||||
|  |             (RootPath: rootPath, CancellationToken: cancellationToken)); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static RustLicenseIndex BuildIndex(string rootPath, CancellationToken cancellationToken) | ||||||
|  |     { | ||||||
|  |         var byName = new Dictionary<string, List<RustLicenseInfo>>(StringComparer.Ordinal); | ||||||
|  |         var enumeration = new EnumerationOptions | ||||||
|  |         { | ||||||
|  |             MatchCasing = MatchCasing.CaseSensitive, | ||||||
|  |             IgnoreInaccessible = true, | ||||||
|  |             RecurseSubdirectories = true, | ||||||
|  |             AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint, | ||||||
|  |         }; | ||||||
|  |  | ||||||
|  |         foreach (var cargoTomlPath in Directory.EnumerateFiles(rootPath, "Cargo.toml", enumeration)) | ||||||
|  |         { | ||||||
|  |             cancellationToken.ThrowIfCancellationRequested(); | ||||||
|  |  | ||||||
|  |             if (IsUnderTargetDirectory(cargoTomlPath)) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (!TryParseCargoToml(rootPath, cargoTomlPath, out var info)) | ||||||
|  |             { | ||||||
|  |                 continue; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var normalizedName = RustCrateBuilder.NormalizeName(info.Name); | ||||||
|  |             if (!byName.TryGetValue(normalizedName, out var entries)) | ||||||
|  |             { | ||||||
|  |                 entries = new List<RustLicenseInfo>(); | ||||||
|  |                 byName[normalizedName] = entries; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             entries.Add(info); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         foreach (var entry in byName.Values) | ||||||
|  |         { | ||||||
|  |             entry.Sort(static (left, right) => | ||||||
|  |             { | ||||||
|  |                 var versionCompare = string.Compare(left.Version, right.Version, StringComparison.OrdinalIgnoreCase); | ||||||
|  |                 if (versionCompare != 0) | ||||||
|  |                 { | ||||||
|  |                     return versionCompare; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 return string.Compare(left.CargoTomlRelativePath, right.CargoTomlRelativePath, StringComparison.Ordinal); | ||||||
|  |             }); | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return new RustLicenseIndex(byName); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static bool TryParseCargoToml(string rootPath, string cargoTomlPath, out RustLicenseInfo info) | ||||||
|  |     { | ||||||
|  |         info = default!; | ||||||
|  |  | ||||||
|  |         try | ||||||
|  |         { | ||||||
|  |             using var stream = new FileStream(cargoTomlPath, FileMode.Open, FileAccess.Read, FileShare.Read); | ||||||
|  |             using var reader = new StreamReader(stream, leaveOpen: false); | ||||||
|  |  | ||||||
|  |             string? name = null; | ||||||
|  |             string? version = null; | ||||||
|  |             string? licenseExpression = null; | ||||||
|  |             string? licenseFile = null; | ||||||
|  |             var inPackageSection = false; | ||||||
|  |  | ||||||
|  |             while (reader.ReadLine() is { } line) | ||||||
|  |             { | ||||||
|  |                 line = StripComment(line).Trim(); | ||||||
|  |                 if (line.Length == 0) | ||||||
|  |                 { | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 if (line.StartsWith("[", StringComparison.Ordinal)) | ||||||
|  |                 { | ||||||
|  |                     inPackageSection = string.Equals(line, "[package]", StringComparison.OrdinalIgnoreCase); | ||||||
|  |                     if (!inPackageSection && line.StartsWith("[dependency", StringComparison.OrdinalIgnoreCase)) | ||||||
|  |                     { | ||||||
|  |                         // Dependency tables follow; no further [package] metadata is expected.
|  |                         break; | ||||||
|  |                     } | ||||||
|  |  | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 if (!inPackageSection) | ||||||
|  |                 { | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 if (TryParseStringAssignment(line, "name", out var parsedName)) | ||||||
|  |                 { | ||||||
|  |                     name ??= parsedName; | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 if (TryParseStringAssignment(line, "version", out var parsedVersion)) | ||||||
|  |                 { | ||||||
|  |                     version ??= parsedVersion; | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 if (TryParseStringAssignment(line, "license", out var parsedLicense)) | ||||||
|  |                 { | ||||||
|  |                     licenseExpression ??= parsedLicense; | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |  | ||||||
|  |                 if (TryParseStringAssignment(line, "license-file", out var parsedLicenseFile)) | ||||||
|  |                 { | ||||||
|  |                     licenseFile ??= parsedLicenseFile; | ||||||
|  |                     continue; | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             if (string.IsNullOrWhiteSpace(name)) | ||||||
|  |             { | ||||||
|  |                 return false; | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var expressions = ImmutableArray<string>.Empty; | ||||||
|  |             if (!string.IsNullOrWhiteSpace(licenseExpression)) | ||||||
|  |             { | ||||||
|  |                 expressions = ImmutableArray.Create(licenseExpression!); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var files = ImmutableArray<RustLicenseFileReference>.Empty; | ||||||
|  |             if (!string.IsNullOrWhiteSpace(licenseFile)) | ||||||
|  |             { | ||||||
|  |                 var directory = Path.GetDirectoryName(cargoTomlPath) ?? string.Empty; | ||||||
|  |                 var absolute = Path.GetFullPath(Path.Combine(directory, licenseFile!)); | ||||||
|  |                 if (File.Exists(absolute)) | ||||||
|  |                 { | ||||||
|  |                     var relative = NormalizeRelativePath(rootPath, absolute); | ||||||
|  |                     if (RustFileHashCache.TryGetSha256(absolute, out var sha256)) | ||||||
|  |                     { | ||||||
|  |                         files = ImmutableArray.Create(new RustLicenseFileReference(relative, sha256)); | ||||||
|  |                     } | ||||||
|  |                     else | ||||||
|  |                     { | ||||||
|  |                         files = ImmutableArray.Create(new RustLicenseFileReference(relative, null)); | ||||||
|  |                     } | ||||||
|  |                 } | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var cargoRelative = NormalizeRelativePath(rootPath, cargoTomlPath); | ||||||
|  |  | ||||||
|  |             info = new RustLicenseInfo( | ||||||
|  |                 name!.Trim(), | ||||||
|  |                 string.IsNullOrWhiteSpace(version) ? null : version!.Trim(), | ||||||
|  |                 expressions, | ||||||
|  |                 files, | ||||||
|  |                 cargoRelative); | ||||||
|  |  | ||||||
|  |             return true; | ||||||
|  |         } | ||||||
|  |         catch (IOException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (UnauthorizedAccessException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |         catch (SecurityException) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string NormalizeRoot(string rootPath) | ||||||
|  |     { | ||||||
|  |         var full = Path.GetFullPath(rootPath); | ||||||
|  |         return OperatingSystem.IsWindows() | ||||||
|  |             ? full.ToLowerInvariant() | ||||||
|  |             : full; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static bool TryParseStringAssignment(string line, string key, out string? value) | ||||||
|  |     { | ||||||
|  |         value = null; | ||||||
|  |  | ||||||
|  |         if (!line.StartsWith(key, StringComparison.Ordinal)) | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var remaining = line[key.Length..].TrimStart(); | ||||||
|  |         if (remaining.Length == 0 || remaining[0] != '=') | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         remaining = remaining[1..].TrimStart(); | ||||||
|  |         if (remaining.Length < 2 || remaining[0] != '"' || remaining[^1] != '"') | ||||||
|  |         { | ||||||
|  |             return false; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         value = remaining[1..^1]; | ||||||
|  |         return true; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string StripComment(string line) | ||||||
|  |     { | ||||||
|  |         var index = line.IndexOf('#'); | ||||||
|  |         return index < 0 ? line : line[..index]; | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static bool IsUnderTargetDirectory(string path) | ||||||
|  |     { | ||||||
|  |         var segment = $"{Path.DirectorySeparatorChar}target{Path.DirectorySeparatorChar}"; | ||||||
|  |         return path.Contains(segment, OperatingSystem.IsWindows() ? StringComparison.OrdinalIgnoreCase : StringComparison.Ordinal); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     private static string NormalizeRelativePath(string rootPath, string absolutePath) | ||||||
|  |     { | ||||||
|  |         var relative = Path.GetRelativePath(rootPath, absolutePath); | ||||||
|  |         if (string.IsNullOrWhiteSpace(relative) || relative == ".") | ||||||
|  |         { | ||||||
|  |             return "."; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return relative.Replace('\\', '/'); | ||||||
|  |     } | ||||||
|  | } | ||||||
|  |  | ||||||
|  | internal sealed class RustLicenseIndex | ||||||
|  | { | ||||||
|  |     private readonly Dictionary<string, List<RustLicenseInfo>> _byName; | ||||||
|  |  | ||||||
|  |     public static readonly RustLicenseIndex Empty = new(new Dictionary<string, List<RustLicenseInfo>>(StringComparer.Ordinal)); | ||||||
|  |  | ||||||
|  |     public RustLicenseIndex(Dictionary<string, List<RustLicenseInfo>> byName) | ||||||
|  |     { | ||||||
|  |         _byName = byName ?? throw new ArgumentNullException(nameof(byName)); | ||||||
|  |     } | ||||||
|  |  | ||||||
|  |     public RustLicenseInfo? Find(string crateName, string? version) | ||||||
|  |     { | ||||||
|  |         if (string.IsNullOrWhiteSpace(crateName)) | ||||||
|  |         { | ||||||
|  |             return null; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         var normalized = RustCrateBuilder.NormalizeName(crateName); | ||||||
|  |         if (!_byName.TryGetValue(normalized, out var list) || list.Count == 0) | ||||||
|  |         { | ||||||
|  |             return null; | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         if (!string.IsNullOrWhiteSpace(version)) | ||||||
|  |         { | ||||||
|  |             var match = list.FirstOrDefault(entry => string.Equals(entry.Version, version, StringComparison.OrdinalIgnoreCase)); | ||||||
|  |             if (match is not null) | ||||||
|  |             { | ||||||
|  |                 return match; | ||||||
|  |             } | ||||||
|  |         } | ||||||
|  |  | ||||||
|  |         return list[0]; | ||||||
|  |     } | ||||||
|  | } | ||||||
|  |  | ||||||
|  | internal sealed record RustLicenseInfo( | ||||||
|  |     string Name, | ||||||
|  |     string? Version, | ||||||
|  |     ImmutableArray<string> Expressions, | ||||||
|  |     ImmutableArray<RustLicenseFileReference> Files, | ||||||
|  |     string CargoTomlRelativePath); | ||||||
|  |  | ||||||
|  | internal sealed record RustLicenseFileReference(string RelativePath, string? Sha256); | ||||||
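For orientation, `RustLicenseScanner.GetOrCreate` memoises one `RustLicenseIndex` per normalized root, and `RustLicenseIndex.Find` prefers an exact version match before falling back to the first deterministically sorted entry. A hedged usage sketch relying only on the APIs defined above; the root path and crate coordinates are illustrative (the serde/Apache-2.0 pairing matches the fixture added later in this commit):

```csharp
using System.Threading;
using StellaOps.Scanner.Analyzers.Lang.Rust.Internal;

internal static class LicenseLookupSketch
{
    public static string? ResolveLicenseExpression(string rootPath, CancellationToken cancellationToken)
    {
        var index = RustLicenseScanner.GetOrCreate(rootPath, cancellationToken);

        // Exact-version match wins; otherwise the lowest-sorted entry for the crate is returned.
        var info = index.Find("serde", "1.0.188");

        return info is { Expressions.Length: > 0 }
            ? info.Expressions[0] // e.g. "Apache-2.0"
            : null;
    }
}
```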
| @@ -5,6 +5,6 @@ | |||||||
| | 1 | SCANNER-ANALYZERS-LANG-10-306A | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-307 | Parse Cargo metadata (`Cargo.lock`, `.fingerprint`, `.metadata`) and map crates to components with evidence. | Fixtures confirm crate attribution ≥85 % coverage; metadata normalized; evidence includes path + hash. | | | 1 | SCANNER-ANALYZERS-LANG-10-306A | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-307 | Parse Cargo metadata (`Cargo.lock`, `.fingerprint`, `.metadata`) and map crates to components with evidence. | Fixtures confirm crate attribution ≥85 % coverage; metadata normalized; evidence includes path + hash. | | ||||||
| | 2 | SCANNER-ANALYZERS-LANG-10-306B | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-306A | Implement heuristic classifier using ELF section names, symbol mangling, and `.comment` data for stripped binaries. | Heuristic output flagged as `heuristic`; regression tests ensure no false “observed” classifications. | | | 2 | SCANNER-ANALYZERS-LANG-10-306B | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-306A | Implement heuristic classifier using ELF section names, symbol mangling, and `.comment` data for stripped binaries. | Heuristic output flagged as `heuristic`; regression tests ensure no false “observed” classifications. | | ||||||
| | 3 | SCANNER-ANALYZERS-LANG-10-306C | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-306B | Integrate binary hash fallback (`bin:{sha256}`) and tie into shared quiet provenance helpers. | Fallback path deterministic; shared helpers reused; tests verify consistent hashing. | | | 3 | SCANNER-ANALYZERS-LANG-10-306C | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-306B | Integrate binary hash fallback (`bin:{sha256}`) and tie into shared quiet provenance helpers. | Fallback path deterministic; shared helpers reused; tests verify consistent hashing. | | ||||||
| | 4 | SCANNER-ANALYZERS-LANG-10-307R | DOING (2025-10-23) | SCANNER-ANALYZERS-LANG-10-306C | Finalize shared helper usage (license, usage flags) and concurrency-safe caches. | Analyzer uses shared utilities; concurrency tests pass; no race conditions. | | | 4 | SCANNER-ANALYZERS-LANG-10-307R | DONE (2025-10-29) | SCANNER-ANALYZERS-LANG-10-306C | Finalize shared helper usage (license, usage flags) and concurrency-safe caches. | Analyzer uses shared utilities; concurrency tests pass; no race conditions. | | ||||||
| | 5 | SCANNER-ANALYZERS-LANG-10-308R | TODO | SCANNER-ANALYZERS-LANG-10-307R | Determinism fixtures + performance benchmarks; compare against competitor heuristic coverage. | Fixtures `Fixtures/lang/rust/` committed; determinism guard; benchmark shows ≥15 % better coverage vs competitor. | | | 5 | SCANNER-ANALYZERS-LANG-10-308R | TODO | SCANNER-ANALYZERS-LANG-10-307R | Determinism fixtures + performance benchmarks; compare against competitor heuristic coverage. | Fixtures `Fixtures/lang/rust/` committed; determinism guard; benchmark shows ≥15 % better coverage vs competitor. | | ||||||
| | 6 | SCANNER-ANALYZERS-LANG-10-309R | TODO | SCANNER-ANALYZERS-LANG-10-308R | Package plug-in manifest + Offline Kit documentation; ensure Worker integration. | Manifest copied; Worker loads analyzer; Offline Kit doc updated. | | | 6 | SCANNER-ANALYZERS-LANG-10-309R | TODO | SCANNER-ANALYZERS-LANG-10-308R | Package plug-in manifest + Offline Kit documentation; ensure Worker integration. | Manifest copied; Worker loads analyzer; Offline Kit doc updated. | | ||||||
|   | |||||||
| @@ -99,4 +99,37 @@ public sealed class JavaReflectionAnalyzerTests | |||||||
|             TestPaths.SafeDelete(root); |             TestPaths.SafeDelete(root); | ||||||
|         } |         } | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public void Analyze_ClassResourceLookup_ProducesResourceEdge() | ||||||
|  |     { | ||||||
|  |         var root = TestPaths.CreateTemporaryDirectory(); | ||||||
|  |         try | ||||||
|  |         { | ||||||
|  |             var jarPath = Path.Combine(root, "libs", "resources.jar"); | ||||||
|  |             Directory.CreateDirectory(Path.GetDirectoryName(jarPath)!); | ||||||
|  |             using (var archive = new ZipArchive(new FileStream(jarPath, FileMode.Create, FileAccess.ReadWrite, FileShare.None), ZipArchiveMode.Create, leaveOpen: false)) | ||||||
|  |             { | ||||||
|  |                 var entry = archive.CreateEntry("com/example/Resources.class"); | ||||||
|  |                 var bytes = JavaClassFileFactory.CreateClassResourceLookup("com/example/Resources", "/META-INF/plugin.properties"); | ||||||
|  |                 using var stream = entry.Open(); | ||||||
|  |                 stream.Write(bytes); | ||||||
|  |             } | ||||||
|  |  | ||||||
|  |             var cancellationToken = TestContext.Current.CancellationToken; | ||||||
|  |             var context = new LanguageAnalyzerContext(root, TimeProvider.System); | ||||||
|  |             var workspace = JavaWorkspaceNormalizer.Normalize(context, cancellationToken); | ||||||
|  |             var classPath = JavaClassPathBuilder.Build(workspace, cancellationToken); | ||||||
|  |             var analysis = JavaReflectionAnalyzer.Analyze(classPath, cancellationToken); | ||||||
|  |  | ||||||
|  |             var edge = Assert.Single(analysis.Edges.Where(edge => edge.Reason == JavaReflectionReason.ResourceLookup)); | ||||||
|  |             Assert.Equal("com.example.Resources", edge.SourceClass); | ||||||
|  |             Assert.Equal("/META-INF/plugin.properties", edge.TargetType); | ||||||
|  |             Assert.Equal(JavaReflectionConfidence.High, edge.Confidence); | ||||||
|  |         } | ||||||
|  |         finally | ||||||
|  |         { | ||||||
|  |             TestPaths.SafeDelete(root); | ||||||
|  |         } | ||||||
|  |     } | ||||||
| } | } | ||||||
|   | |||||||
| @@ -0,0 +1,4 @@ | |||||||
|  | [package] | ||||||
|  | name = "my_app" | ||||||
|  | version = "0.1.0" | ||||||
|  | license = "MIT" | ||||||
| @@ -0,0 +1,16 @@ | |||||||
|  | MIT License | ||||||
|  |  | ||||||
|  | Permission is hereby granted, free of charge, to any person obtaining a copy | ||||||
|  | of this software and associated documentation files (the "Software"), to deal | ||||||
|  | in the Software without restriction, including without limitation the rights | ||||||
|  | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell | ||||||
|  | copies of the Software, and to permit persons to whom the Software is | ||||||
|  | furnished to do so, subject to the following conditions: | ||||||
|  |  | ||||||
|  | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR | ||||||
|  | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, | ||||||
|  | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE | ||||||
|  | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER | ||||||
|  | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, | ||||||
|  | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE | ||||||
|  | SOFTWARE. | ||||||
| @@ -11,6 +11,7 @@ | |||||||
|       "cargo.lock.path": "Cargo.lock", |       "cargo.lock.path": "Cargo.lock", | ||||||
|       "fingerprint.profile": "debug", |       "fingerprint.profile": "debug", | ||||||
|       "fingerprint.targetKind": "bin", |       "fingerprint.targetKind": "bin", | ||||||
|  |       "license.expression[0]": "MIT", | ||||||
|       "source": "registry\u002Bhttps://github.com/rust-lang/crates.io-index" |       "source": "registry\u002Bhttps://github.com/rust-lang/crates.io-index" | ||||||
|     }, |     }, | ||||||
|     "evidence": [ |     "evidence": [ | ||||||
| @@ -41,6 +42,7 @@ | |||||||
|       "checksum": "abc123", |       "checksum": "abc123", | ||||||
|       "fingerprint.profile": "release", |       "fingerprint.profile": "release", | ||||||
|       "fingerprint.targetKind": "lib", |       "fingerprint.targetKind": "lib", | ||||||
|  |       "license.expression[0]": "Apache-2.0", | ||||||
|       "source": "registry\u002Bhttps://github.com/rust-lang/crates.io-index" |       "source": "registry\u002Bhttps://github.com/rust-lang/crates.io-index" | ||||||
|     }, |     }, | ||||||
|     "evidence": [ |     "evidence": [ | ||||||
|   | |||||||
| @@ -0,0 +1,4 @@ | |||||||
|  | [package] | ||||||
|  | name = "serde" | ||||||
|  | version = "1.0.188" | ||||||
|  | license = "Apache-2.0" | ||||||
| @@ -1,4 +1,6 @@ | |||||||
|  | using System; | ||||||
| using System.IO; | using System.IO; | ||||||
|  | using System.Linq; | ||||||
| using StellaOps.Scanner.Analyzers.Lang.Rust; | using StellaOps.Scanner.Analyzers.Lang.Rust; | ||||||
| using StellaOps.Scanner.Analyzers.Lang.Tests.Harness; | using StellaOps.Scanner.Analyzers.Lang.Tests.Harness; | ||||||
| using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities; | using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities; | ||||||
| @@ -31,4 +33,27 @@ public sealed class RustLanguageAnalyzerTests | |||||||
|             cancellationToken, |             cancellationToken, | ||||||
|             usageHints); |             usageHints); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     [Fact] | ||||||
|  |     public async Task AnalyzerIsThreadSafeUnderConcurrencyAsync() | ||||||
|  |     { | ||||||
|  |         var cancellationToken = TestContext.Current.CancellationToken; | ||||||
|  |         var fixturePath = TestPaths.ResolveFixture("lang", "rust", "simple"); | ||||||
|  |  | ||||||
|  |         var analyzers = new ILanguageAnalyzer[] | ||||||
|  |         { | ||||||
|  |             new RustLanguageAnalyzer() | ||||||
|  |         }; | ||||||
|  |  | ||||||
|  |         var workers = Math.Max(Environment.ProcessorCount, 4); | ||||||
|  |         var tasks = Enumerable.Range(0, workers) | ||||||
|  |             .Select(_ => LanguageAnalyzerTestHarness.RunToJsonAsync(fixturePath, analyzers, cancellationToken)); | ||||||
|  |  | ||||||
|  |         var results = await Task.WhenAll(tasks); | ||||||
|  |         var baseline = results[0]; | ||||||
|  |         foreach (var result in results) | ||||||
|  |         { | ||||||
|  |             Assert.Equal(baseline, result); | ||||||
|  |         } | ||||||
|  |     } | ||||||
| } | } | ||||||
|   | |||||||
| @@ -43,6 +43,44 @@ public static class JavaClassFileFactory | |||||||
|         return buffer.ToArray(); |         return buffer.ToArray(); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     public static byte[] CreateClassResourceLookup(string internalClassName, string resourcePath) | ||||||
|  |     { | ||||||
|  |         using var buffer = new MemoryStream(); | ||||||
|  |         using var writer = new BigEndianWriter(buffer); | ||||||
|  |  | ||||||
|  |         WriteClassFileHeader(writer, constantPoolCount: 18); | ||||||
|  |  | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8(internalClassName); // #1 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Class); writer.WriteUInt16(1); // #2 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8("java/lang/Object"); // #3 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Class); writer.WriteUInt16(3); // #4 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8("load"); // #5 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8("()V"); // #6 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8("Code"); // #7 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8(resourcePath); // #8 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.String); writer.WriteUInt16(8); // #9 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8("java/lang/Class"); // #10 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Class); writer.WriteUInt16(10); // #11 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8("getResource"); // #12 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Utf8); writer.WriteUtf8("(Ljava/lang/String;)Ljava/net/URL;"); // #13 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.NameAndType); writer.WriteUInt16(12); writer.WriteUInt16(13); // #14 | ||||||
|  |         writer.WriteByte((byte)ConstantTag.Methodref); writer.WriteUInt16(11); writer.WriteUInt16(14); // #15 | ||||||
|  |  | ||||||
|  |         writer.WriteUInt16(0x0001); // public | ||||||
|  |         writer.WriteUInt16(2); // this class | ||||||
|  |         writer.WriteUInt16(4); // super class | ||||||
|  |  | ||||||
|  |         writer.WriteUInt16(0); // interfaces | ||||||
|  |         writer.WriteUInt16(0); // fields | ||||||
|  |         writer.WriteUInt16(1); // methods | ||||||
|  |  | ||||||
|  |         WriteResourceLookupMethod(writer, methodNameIndex: 5, descriptorIndex: 6, classConstantIndex: 4, stringIndex: 9, methodRefIndex: 15); | ||||||
|  |  | ||||||
|  |         writer.WriteUInt16(0); // class attributes | ||||||
|  |  | ||||||
|  |         return buffer.ToArray(); | ||||||
|  |     } | ||||||
|  |  | ||||||
|     public static byte[] CreateTcclChecker(string internalClassName) |     public static byte[] CreateTcclChecker(string internalClassName) | ||||||
|     { |     { | ||||||
|         using var buffer = new MemoryStream(); |         using var buffer = new MemoryStream(); | ||||||
| @@ -148,6 +186,37 @@ public static class JavaClassFileFactory | |||||||
|         writer.WriteBytes(codeBytes); |         writer.WriteBytes(codeBytes); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|  |     private static void WriteResourceLookupMethod(BigEndianWriter writer, ushort methodNameIndex, ushort descriptorIndex, ushort classConstantIndex, ushort stringIndex, ushort methodRefIndex) | ||||||
|  |     { | ||||||
|  |         writer.WriteUInt16(0x0009); | ||||||
|  |         writer.WriteUInt16(methodNameIndex); | ||||||
|  |         writer.WriteUInt16(descriptorIndex); | ||||||
|  |         writer.WriteUInt16(1); | ||||||
|  |  | ||||||
|  |         writer.WriteUInt16(7); | ||||||
|  |         using var codeBuffer = new MemoryStream(); | ||||||
|  |         using (var codeWriter = new BigEndianWriter(codeBuffer)) | ||||||
|  |         { | ||||||
|  |             codeWriter.WriteUInt16(2); | ||||||
|  |             codeWriter.WriteUInt16(0); | ||||||
|  |             codeWriter.WriteUInt32(8); | ||||||
|  |             codeWriter.WriteByte(0x13); // ldc_w for class literal | ||||||
|  |             codeWriter.WriteUInt16(classConstantIndex); | ||||||
|  |             codeWriter.WriteByte(0x12); // ldc: push the resource-path string constant
|  |             codeWriter.WriteByte((byte)stringIndex);
|  |             codeWriter.WriteByte(0xB6); // invokevirtual Class.getResource(String)
|  |             codeWriter.WriteUInt16(methodRefIndex);
|  |             codeWriter.WriteByte(0x57); // pop the returned URL
|  |             codeWriter.WriteByte(0xB1); // return
|  |             codeWriter.WriteUInt16(0); // exception_table_length
|  |             codeWriter.WriteUInt16(0); // Code attribute attributes_count
|  |         } | ||||||
|  |  | ||||||
|  |         var codeBytes = codeBuffer.ToArray(); | ||||||
|  |         writer.WriteUInt32((uint)codeBytes.Length); | ||||||
|  |         writer.WriteBytes(codeBytes); | ||||||
|  |     } | ||||||
|  |  | ||||||
|     private sealed class BigEndianWriter : IDisposable |     private sealed class BigEndianWriter : IDisposable | ||||||
|     { |     { | ||||||
|         private readonly BinaryWriter _writer; |         private readonly BinaryWriter _writer; | ||||||
|   | |||||||
| @@ -18,6 +18,7 @@ public sealed class FileKmsClient : IKmsClient, IDisposable | |||||||
|             new JsonStringEnumConverter(), |             new JsonStringEnumConverter(), | ||||||
|         }, |         }, | ||||||
|     }; |     }; | ||||||
|  |     private const int MinKeyDerivationIterations = 600_000; | ||||||
|  |  | ||||||
|     private readonly FileKmsOptions _options; |     private readonly FileKmsOptions _options; | ||||||
|     private readonly SemaphoreSlim _mutex = new(1, 1); |     private readonly SemaphoreSlim _mutex = new(1, 1); | ||||||
| @@ -36,6 +37,13 @@ public sealed class FileKmsClient : IKmsClient, IDisposable | |||||||
|         } |         } | ||||||
|  |  | ||||||
|         _options = options; |         _options = options; | ||||||
|  |         if (_options.KeyDerivationIterations < MinKeyDerivationIterations) | ||||||
|  |         { | ||||||
|  |             throw new ArgumentOutOfRangeException( | ||||||
|  |                 nameof(options.KeyDerivationIterations), | ||||||
|  |                 _options.KeyDerivationIterations, | ||||||
|  |                 $"PBKDF2 iterations must be at least {MinKeyDerivationIterations:N0} to satisfy cryptographic guidance."); | ||||||
|  |         } | ||||||
|         Directory.CreateDirectory(_options.RootPath); |         Directory.CreateDirectory(_options.RootPath); | ||||||
|     } |     } | ||||||
|  |  | ||||||
| @@ -415,7 +423,7 @@ public sealed class FileKmsClient : IKmsClient, IDisposable | |||||||
|  |  | ||||||
|         using var ecdsa = ECDsa.Create(); |         using var ecdsa = ECDsa.Create(); | ||||||
|         ecdsa.ImportParameters(parameters); |         ecdsa.ImportParameters(parameters); | ||||||
|         return ecdsa.SignData(data.ToArray(), HashAlgorithmName.SHA256); |         return ecdsa.SignData(data, HashAlgorithmName.SHA256); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|     private bool VerifyData(string curveName, string publicKeyBase64, ReadOnlySpan<byte> data, ReadOnlySpan<byte> signature) |     private bool VerifyData(string curveName, string publicKeyBase64, ReadOnlySpan<byte> data, ReadOnlySpan<byte> signature) | ||||||
| @@ -442,7 +450,7 @@ public sealed class FileKmsClient : IKmsClient, IDisposable | |||||||
|  |  | ||||||
|         using var ecdsa = ECDsa.Create(); |         using var ecdsa = ECDsa.Create(); | ||||||
|         ecdsa.ImportParameters(parameters); |         ecdsa.ImportParameters(parameters); | ||||||
|         return ecdsa.VerifyData(data.ToArray(), signature.ToArray(), HashAlgorithmName.SHA256); |         return ecdsa.VerifyData(data, signature, HashAlgorithmName.SHA256); | ||||||
|     } |     } | ||||||
|  |  | ||||||
|     private KeyEnvelope EncryptPrivateKey(ReadOnlySpan<byte> privateKey) |     private KeyEnvelope EncryptPrivateKey(ReadOnlySpan<byte> privateKey) | ||||||
| @@ -457,9 +465,10 @@ public sealed class FileKmsClient : IKmsClient, IDisposable | |||||||
|             var tag = new byte[16]; |             var tag = new byte[16]; | ||||||
|             var plaintextCopy = privateKey.ToArray(); |             var plaintextCopy = privateKey.ToArray(); | ||||||
|  |  | ||||||
|  |             using var aesGcm = new AesGcm(key, tag.Length); | ||||||
|             try |             try | ||||||
|             { |             { | ||||||
|                 AesGcm.Encrypt(key, nonce, plaintextCopy, ciphertext, tag); |                 aesGcm.Encrypt(nonce, plaintextCopy, ciphertext, tag); | ||||||
|             } |             } | ||||||
|             finally |             finally | ||||||
|             { |             { | ||||||
| @@ -489,7 +498,8 @@ public sealed class FileKmsClient : IKmsClient, IDisposable | |||||||
|         try |         try | ||||||
|         { |         { | ||||||
|             var plaintext = new byte[ciphertext.Length]; |             var plaintext = new byte[ciphertext.Length]; | ||||||
|             AesGcm.Decrypt(key, nonce, ciphertext, tag, plaintext); |             using var aesGcm = new AesGcm(key, tag.Length); | ||||||
|  |             aesGcm.Decrypt(nonce, ciphertext, tag, plaintext); | ||||||
|  |  | ||||||
|             return plaintext; |             return plaintext; | ||||||
|         } |         } | ||||||
|   | |||||||
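The `FileKmsClient` changes above pin the envelope-encryption parameters: construction now rejects PBKDF2 iteration counts below 600,000, and `AesGcm` is created with an explicit 16-byte tag size and used through instance `Encrypt`/`Decrypt` calls. The key-derivation call itself sits outside this hunk, so the following is a minimal, hedged sketch of the same primitives in isolation; the salt/nonce handling and return shape are illustrative, not the driver's actual envelope format:

```csharp
using System.Security.Cryptography;
using System.Text;

internal static class EnvelopeSketch
{
    private const int Iterations = 600_000; // matches the new MinKeyDerivationIterations floor

    public static (byte[] Nonce, byte[] Ciphertext, byte[] Tag) Encrypt(string password, byte[] salt, byte[] plaintext)
    {
        // PBKDF2-SHA256 with the hardened iteration count.
        var key = Rfc2898DeriveBytes.Pbkdf2(
            Encoding.UTF8.GetBytes(password), salt, Iterations, HashAlgorithmName.SHA256, outputLength: 32);

        var nonce = RandomNumberGenerator.GetBytes(12); // standard AES-GCM nonce size
        var ciphertext = new byte[plaintext.Length];
        var tag = new byte[16];

        // Tag size passed explicitly, as in the updated FileKmsClient.
        using var aesGcm = new AesGcm(key, tagSizeInBytes: tag.Length);
        aesGcm.Encrypt(nonce, plaintext, ciphertext, tag);

        CryptographicOperations.ZeroMemory(key);
        return (nonce, ciphertext, tag);
    }
}
```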
| @@ -16,12 +16,12 @@ public sealed class FileKmsOptions | |||||||
|     public required string Password { get; set; } |     public required string Password { get; set; } | ||||||
|  |  | ||||||
|     /// <summary> |     /// <summary> | ||||||
|     /// Signing algorithm identifier (default ED25519). |     /// Signing algorithm identifier (default ES256). | ||||||
|     /// </summary> |     /// </summary> | ||||||
|     public string Algorithm { get; set; } = KmsAlgorithms.Es256; |     public string Algorithm { get; set; } = KmsAlgorithms.Es256; | ||||||
|  |  | ||||||
|     /// <summary> |     /// <summary> | ||||||
|     /// PBKDF2 iteration count for envelope encryption. |     /// PBKDF2 iteration count for envelope encryption. | ||||||
|     /// </summary> |     /// </summary> | ||||||
|     public int KeyDerivationIterations { get; set; } = 100_000; |     public int KeyDerivationIterations { get; set; } = 600_000; | ||||||
| } | } | ||||||
|   | |||||||
| @@ -3,7 +3,7 @@ | |||||||
| ## Sprint 72 – Abstractions & File Driver | ## Sprint 72 – Abstractions & File Driver | ||||||
| | ID | Status | Owner(s) | Depends on | Description | Exit Criteria | | | ID | Status | Owner(s) | Depends on | Description | Exit Criteria | | ||||||
| |----|--------|----------|------------|-------------|---------------| | |----|--------|----------|------------|-------------|---------------| | ||||||
| | KMS-72-001 | DOING (2025-10-29) | KMS Guild | — | Implement KMS interface (sign, verify, metadata, rotate, revoke) and file-based key driver with encrypted at-rest storage. | Interface + file driver operational; unit tests cover sign/verify/rotation; lint passes.<br>2025-10-29: `FileKmsClient` (ES256) file driver scaffolding committed under `StellaOps.Cryptography.Kms`; includes disk encryption + unit tests. Follow-up: address PBKDF2/AesGcm warnings and wire into Authority services. | | | KMS-72-001 | DOING (2025-10-29) | KMS Guild | — | Implement KMS interface (sign, verify, metadata, rotate, revoke) and file-based key driver with encrypted at-rest storage. | Interface + file driver operational; unit tests cover sign/verify/rotation; lint passes.<br>2025-10-29: `FileKmsClient` (ES256) file driver scaffolding committed under `StellaOps.Cryptography.Kms`; includes disk encryption + unit tests. Follow-up: address PBKDF2/AesGcm warnings and wire into Authority services.<br>2025-10-29 18:40Z: Hardened PBKDF2 iteration floor (≥600k), switched to tag-size explicit `AesGcm` usage, removed transient array allocations, and refreshed unit tests (`StellaOps.Cryptography.Kms.Tests`). | | ||||||
| | KMS-72-002 | TODO | KMS Guild | KMS-72-001 | Add CLI support for importing/exporting file-based keys with password protection. | CLI commands functional; docs updated; integration tests pass. | | | KMS-72-002 | TODO | KMS Guild | KMS-72-001 | Add CLI support for importing/exporting file-based keys with password protection. | CLI commands functional; docs updated; integration tests pass. | | ||||||
|  |  | ||||||
| ## Sprint 73 – Cloud & HSM Integration | ## Sprint 73 – Cloud & HSM Integration | ||||||
|   | |||||||