feat(rust): Implement RustCargoLockParser and RustFingerprintScanner

- Added RustCargoLockParser to parse Cargo.lock files and extract package information (see the parsing sketch below).
- Introduced RustFingerprintScanner to scan for Rust fingerprint records in JSON files.
- Created test fixtures for Rust language analysis, including Cargo.lock and fingerprint JSON files.
- Developed tests for RustLanguageAnalyzer to ensure deterministic output based on provided fixtures.
- Added expected output files for both simple and signed Rust applications.
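Neither parser's source appears in the hunks excerpted below, so the following is a minimal, illustrative C# sketch of the Cargo.lock shape that RustCargoLockParser has to handle; the `CargoPackage` and `CargoLockReader` names are assumptions made for this note, not the repository's types.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Minimal record for a Cargo.lock [[package]] entry; field names mirror the TOML keys.
public sealed record CargoPackage(string Name, string Version, string? Source, string? Checksum);

public static class CargoLockReader
{
    // Walks Cargo.lock line by line and collects name/version/source/checksum from each
    // [[package]] block. A full TOML parser is not required for this simple shape.
    public static IReadOnlyList<CargoPackage> Parse(string cargoLockPath)
    {
        var packages = new List<CargoPackage>();
        string? name = null, version = null, source = null, checksum = null;
        var inPackage = false;

        foreach (var raw in File.ReadLines(cargoLockPath))
        {
            var line = raw.Trim();
            if (line == "[[package]]")
            {
                Flush();
                inPackage = true;
                continue;
            }
            if (line.StartsWith("[", StringComparison.Ordinal))
            {
                // Any other section header ([metadata], etc.) ends the current package block.
                Flush();
                inPackage = false;
                continue;
            }
            if (!inPackage) continue;

            if (TryReadValue(line, "name", out var v)) name = v;
            else if (TryReadValue(line, "version", out v)) version = v;
            else if (TryReadValue(line, "source", out v)) source = v;
            else if (TryReadValue(line, "checksum", out v)) checksum = v;
        }
        Flush();
        return packages;

        void Flush()
        {
            if (name is not null && version is not null)
            {
                packages.Add(new CargoPackage(name, version, source, checksum));
            }
            name = version = source = checksum = null;
        }

        static bool TryReadValue(string line, string key, out string? value)
        {
            value = null;
            if (!line.StartsWith(key + " = \"", StringComparison.Ordinal)) return false;
            var start = key.Length + 4;
            var end = line.LastIndexOf('"');
            if (end <= start) return false;
            value = line[start..end];
            return true;
        }
    }
}
```

Entries recovered this way would typically map to `pkg:cargo/<name>@<version>` identifiers, with the `bin:{sha256}` fallback reserved for binaries that ship no crate metadata, per the Sprint 10 task description further down.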
Vladimir Moushkov
2025-10-22 18:11:01 +03:00
parent 323bf5844f
commit 224c76c276
66 changed files with 4200 additions and 217 deletions

View File

@@ -94,7 +94,7 @@ Generated from SPRINTS.md and module TASKS.md files on 2025-10-19. Waves cluster
- Team Scheduler ImpactIndex Guild: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Scheduler.ImpactIndex/TASKS.md`. Focus on SCHED-IMPACT-16-303 (TODO), SCHED-IMPACT-16-302 (TODO). Confirm prerequisites (internal: SCHED-IMPACT-16-301 (Wave 1)) before starting and report status in module TASKS.md.
- Team Scheduler WebService Guild: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Scheduler.WebService/TASKS.md`. Focus on SCHED-WEB-16-103 (TODO). Confirm prerequisites (internal: SCHED-WEB-16-102 (Wave 1)) before starting and report status in module TASKS.md.
- Team Scheduler Worker Guild: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Scheduler.Worker/TASKS.md`. Focus on SCHED-WORKER-16-202 (TODO), SCHED-WORKER-16-205 (TODO). Confirm prerequisites (internal: SCHED-IMPACT-16-301 (Wave 1), SCHED-WORKER-16-201 (Wave 1)) before starting and report status in module TASKS.md.
-- Team TBD: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Node/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Python/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md`. Focus on SCANNER-ANALYZERS-LANG-10-305B (TODO), SCANNER-ANALYZERS-LANG-10-304B (DONE 2025-10-22), SCANNER-ANALYZERS-LANG-10-303B (DONE 2025-10-21), SCANNER-ANALYZERS-LANG-10-306B (TODO); Node packaging milestone 10-308N closed 2025-10-21. Confirm prerequisites (internal: SCANNER-ANALYZERS-LANG-10-303A (Wave 1), SCANNER-ANALYZERS-LANG-10-304A (Wave 1), SCANNER-ANALYZERS-LANG-10-305A (Wave 1), SCANNER-ANALYZERS-LANG-10-306A (Wave 1), SCANNER-ANALYZERS-LANG-10-307N (Wave 1)) before starting and report status in module TASKS.md.
+- Team TBD: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Node/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Python/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md`. Focus on SCANNER-ANALYZERS-LANG-10-305B (DONE 2025-10-22), SCANNER-ANALYZERS-LANG-10-304B (DONE 2025-10-22), SCANNER-ANALYZERS-LANG-10-303B (DONE 2025-10-21), SCANNER-ANALYZERS-LANG-10-306B (TODO); Node packaging milestone 10-308N closed 2025-10-21. Confirm prerequisites (internal: SCANNER-ANALYZERS-LANG-10-303A (Wave 1), SCANNER-ANALYZERS-LANG-10-304A (Wave 1), SCANNER-ANALYZERS-LANG-10-305A (Wave 1), SCANNER-ANALYZERS-LANG-10-306A (Wave 1), SCANNER-ANALYZERS-LANG-10-307N (Wave 1)) before starting and report status in module TASKS.md.
- Team Team Excititor Connectors Oracle: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Excititor.Connectors.Oracle.CSAF/TASKS.md`. Focus on EXCITITOR-CONN-ORACLE-01-003 (TODO). Confirm prerequisites (internal: EXCITITOR-CONN-ORACLE-01-002 (Wave 1); external: EXCITITOR-POLICY-01-001) before starting and report status in module TASKS.md.
- Team Team Excititor Export: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Excititor.Export/TASKS.md`. Focus on EXCITITOR-EXPORT-01-007 (DONE 2025-10-21). Confirm prerequisites (internal: EXCITITOR-EXPORT-01-006 (Wave 1)) before starting and report status in module TASKS.md.
- Team Zastava Observer Guild: read EXECPLAN.md Wave 2 and SPRINTS.md rows for `src/StellaOps.Zastava.Observer/TASKS.md`. Focus on ZASTAVA-OBS-12-002 (TODO). Confirm prerequisites (internal: ZASTAVA-OBS-12-001 (Wave 1)) before starting and report status in module TASKS.md.
@@ -106,7 +106,7 @@ Generated from SPRINTS.md and module TASKS.md files on 2025-10-19. Waves cluster
- Team Notify Engine Guild: read EXECPLAN.md Wave 3 and SPRINTS.md rows for `src/StellaOps.Notify.Engine/TASKS.md`. Focus on NOTIFY-ENGINE-15-303 (TODO). Confirm prerequisites (internal: NOTIFY-ENGINE-15-302 (Wave 2)) before starting and report status in module TASKS.md.
- Team Notify Worker Guild: read EXECPLAN.md Wave 3 and SPRINTS.md rows for `src/StellaOps.Notify.Worker/TASKS.md`. Focus on NOTIFY-WORKER-15-203 (TODO). Confirm prerequisites (internal: NOTIFY-ENGINE-15-302 (Wave 2)) before starting and report status in module TASKS.md.
- Team Scheduler Worker Guild: read EXECPLAN.md Wave 3 and SPRINTS.md rows for `src/StellaOps.Scheduler.Worker/TASKS.md`. Focus on SCHED-WORKER-16-203 (TODO). Confirm prerequisites (internal: SCHED-WORKER-16-202 (Wave 2)) before starting and report status in module TASKS.md.
-- Team TBD: read EXECPLAN.md Wave 3 and SPRINTS.md rows for `src/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Node/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Python/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md`. Focus on SCANNER-ANALYZERS-LANG-10-305C (TODO), SCANNER-ANALYZERS-LANG-10-304C (TODO), SCANNER-ANALYZERS-LANG-10-309N (TODO), SCANNER-ANALYZERS-LANG-10-303C (DONE 2025-10-21), SCANNER-ANALYZERS-LANG-10-306C (TODO). Confirm prerequisites (internal: SCANNER-ANALYZERS-LANG-10-303B (Wave 2), SCANNER-ANALYZERS-LANG-10-304B (Wave 2), SCANNER-ANALYZERS-LANG-10-305B (Wave 2), SCANNER-ANALYZERS-LANG-10-306B (Wave 2), SCANNER-ANALYZERS-LANG-10-308N (Wave 2)) before starting and report status in module TASKS.md.
+- Team TBD: read EXECPLAN.md Wave 3 and SPRINTS.md rows for `src/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Node/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Python/TASKS.md`, `src/StellaOps.Scanner.Analyzers.Lang.Rust/TASKS.md`. Focus on SCANNER-ANALYZERS-LANG-10-305C (DONE 2025-10-22), SCANNER-ANALYZERS-LANG-10-304C (TODO), SCANNER-ANALYZERS-LANG-10-309N (TODO), SCANNER-ANALYZERS-LANG-10-303C (DONE 2025-10-21), SCANNER-ANALYZERS-LANG-10-306C (TODO). Confirm prerequisites (internal: SCANNER-ANALYZERS-LANG-10-303B (Wave 2), SCANNER-ANALYZERS-LANG-10-304B (Wave 2), SCANNER-ANALYZERS-LANG-10-305B (Wave 2), SCANNER-ANALYZERS-LANG-10-306B (Wave 2), SCANNER-ANALYZERS-LANG-10-308N (Wave 2)) before starting and report status in module TASKS.md.
- Team Zastava Observer Guild: read EXECPLAN.md Wave 3 and SPRINTS.md rows for `src/StellaOps.Zastava.Observer/TASKS.md`. Focus on ZASTAVA-OBS-12-003 (TODO), ZASTAVA-OBS-12-004 (TODO), ZASTAVA-OBS-17-005 (TODO). Confirm prerequisites (internal: ZASTAVA-OBS-12-002 (Wave 2)) before starting and report status in module TASKS.md.
### Wave 4
@@ -721,9 +721,9 @@ Generated from SPRINTS.md and module TASKS.md files on 2025-10-19. Waves cluster
- **Sprint 10** · Backlog
- Team: TBD
- Path: `src/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`
-1. [TODO] SCANNER-ANALYZERS-LANG-10-305B — Extract assembly metadata (strong name, file/product info) and optional Authenticode details when offline cert bundle provided.
+1. [DONE 2025-10-22] SCANNER-ANALYZERS-LANG-10-305B — Extract assembly metadata (strong name, file/product info) and optional Authenticode details when offline cert bundle provided.
• Prereqs: SCANNER-ANALYZERS-LANG-10-305A (Wave 1)
-• Current: TODO
+• Current: DONE — Assembly metadata now emits strong-name, file/product info, and optional Authenticode signals with deterministic fixtures/tests.
- Path: `src/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`
1. [DONE 2025-10-22] SCANNER-ANALYZERS-LANG-10-304B — Implement DWARF-lite reader for VCS metadata + dirty flag; add cache to avoid re-reading identical binaries.
• Prereqs: SCANNER-ANALYZERS-LANG-10-304A (Wave 1)
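The `Current: DONE` note above records that the .NET analyzer (SCANNER-ANALYZERS-LANG-10-305B) now surfaces strong-name and file/product metadata. That code is not part of this excerpt; as a rough orientation, the strong-name token and version strings are available from the BCL as sketched below. The `AssemblyMetadataProbe` name is illustrative only, and Authenticode inspection would additionally require the offline certificate bundle the task mentions.

```csharp
using System;
using System.Diagnostics;
using System.Reflection;

// Illustrative only: reads the public-key token (strong name) and file/product
// version strings for a managed assembly on disk.
public static class AssemblyMetadataProbe
{
    public static (string? PublicKeyToken, string? FileVersion, string? ProductVersion) Inspect(string assemblyPath)
    {
        var name = AssemblyName.GetAssemblyName(assemblyPath);
        var tokenBytes = name.GetPublicKeyToken();
        var token = tokenBytes is { Length: > 0 }
            ? Convert.ToHexString(tokenBytes).ToLowerInvariant()
            : null;

        var info = FileVersionInfo.GetVersionInfo(assemblyPath);
        return (token, info.FileVersion, info.ProductVersion);
    }
}
```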
@@ -852,8 +852,8 @@ Generated from SPRINTS.md and module TASKS.md files on 2025-10-19. Waves cluster
- **Sprint 10** · Backlog
- Team: TBD
- Path: `src/StellaOps.Scanner.Analyzers.Lang.DotNet/TASKS.md`
-1. [TODO] SCANNER-ANALYZERS-LANG-10-305C — Handle self-contained apps and native assets; merge with EntryTrace usage hints.
+1. [DONE 2025-10-22] SCANNER-ANALYZERS-LANG-10-305C — Handle self-contained apps and native assets; merge with EntryTrace usage hints.
-• Prereqs: SCANNER-ANALYZERS-LANG-10-305B (Wave 2)
+• Prereqs: SCANNER-ANALYZERS-LANG-10-305A (Wave 1)
• Current: TODO
- Path: `src/StellaOps.Scanner.Analyzers.Lang.Go/TASKS.md`
1. [TODO] SCANNER-ANALYZERS-LANG-10-304C — Fallback heuristics for stripped binaries with deterministic `bin:{sha256}` labeling and quiet provenance.

View File

@@ -2,6 +2,7 @@ This file describe implementation of Stella Ops (docs/README.md). Implementation
| Sprint | Theme | Tasks File Path | Status | Type of Specialist | Task ID | Task Description |
| --- | --- | --- | --- | --- | --- | --- |
+| Sprint 7 | Contextual Truth Foundations | docs/TASKS.md | DONE (2025-10-22) | Docs Guild, Concelier WebService | DOCS-CONCELIER-07-201 | Final editorial review and publish pass for Concelier authority toggle documentation (Quickstart + operator guide). |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Cache/TASKS.md | TODO | Scanner Cache Guild | SCANNER-CACHE-10-101 | Implement layer cache store keyed by layer digest with metadata retention per architecture §3.3. |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Cache/TASKS.md | TODO | Scanner Cache Guild | SCANNER-CACHE-10-102 | Build file CAS with dedupe, TTL enforcement, and offline import/export hooks. |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Cache/TASKS.md | TODO | Scanner Cache Guild | SCANNER-CACHE-10-103 | Expose cache metrics/logging and configuration toggles for warm/cold thresholds. |
@@ -16,9 +17,9 @@ This file describe implementation of Stella Ops (docs/README.md). Implementation
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | TODO | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-301 | Java analyzer emitting `pkg:maven` with provenance. |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-21) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-302 | Node analyzer handling workspaces/symlinks emitting `pkg:npm`. |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-21) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-303 | Python analyzer reading `*.dist-info`, RECORD hashes, entry points. |
-| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DOING (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-304 | Go analyzer leveraging buildinfo for `pkg:golang` components. |
+| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-304 | Go analyzer leveraging buildinfo for `pkg:golang` components. |
-| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DOING (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-305 | .NET analyzer parsing `*.deps.json`, assembly metadata, RID variants. |
+| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-305 | .NET analyzer parsing `*.deps.json`, assembly metadata, RID variants. |
-| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | TODO | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-306 | Rust analyzer detecting crates or falling back to `bin:{sha256}`. |
+| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-306 | Rust analyzer detecting crates or falling back to `bin:{sha256}`. |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-19) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | Shared language evidence helpers + usage flag propagation. |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-19) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-308 | Determinism + fixture harness for language analyzers. |
| Sprint 10 | Scanner Analyzers & SBOM | src/StellaOps.Scanner.Analyzers.Lang/TASKS.md | DONE (2025-10-21) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-309 | Package language analyzers as restart-time plug-ins (manifest + host registration). |
@@ -71,6 +72,8 @@ This file describe implementation of Stella Ops (docs/README.md). Implementation
| Sprint 13 | UX & CLI Experience | src/StellaOps.Cli/TASKS.md | DONE (2025-10-22) | DevEx/CLI | CLI-PLUGIN-13-007 | Package non-core CLI verbs as restart-time plug-ins (manifest + loader tests). |
| Sprint 13 | UX & CLI Experience | src/StellaOps.Web/TASKS.md | DONE (2025-10-21) | UX Specialist, Angular Eng, DevEx | WEB1.DEPS-13-001 | Stabilise Angular workspace dependencies for headless CI installs (`npm install`, Chromium handling, docs). |
| Sprint 13 | Platform Reliability | ops/devops/TASKS.md | TODO | DevOps Guild, Platform Leads | DEVOPS-NUGET-13-001 | Wire up .NET 10 preview feeds/local mirrors so `dotnet restore` succeeds offline; document updated NuGet bootstrap. |
+| Sprint 13 | Platform Reliability | ops/devops/TASKS.md | TODO | DevOps Guild | DEVOPS-NUGET-13-002 | Ensure all solutions/projects prioritize `local-nuget` before public feeds and add restore-order validation. |
+| Sprint 13 | Platform Reliability | ops/devops/TASKS.md | TODO | DevOps Guild, Platform Leads | DEVOPS-NUGET-13-003 | Upgrade `Microsoft.*` dependencies pinned to 8.* to their latest .NET 10 (or 9.x) releases and refresh guidance. |
| Sprint 14 | Release & Offline Ops | ops/devops/TASKS.md | TODO | DevOps Guild | DEVOPS-REL-14-001 | Deterministic build/release pipeline with SBOM/provenance, signing, and manifest generation. |
| Sprint 14 | Release & Offline Ops | ops/offline-kit/TASKS.md | TODO | Offline Kit Guild | DEVOPS-OFFLINE-14-002 | Offline kit packaging workflow with integrity verification and documentation. |
| Sprint 14 | Release & Offline Ops | ops/deployment/TASKS.md | TODO | Deployment Guild | DEVOPS-OPS-14-003 | Deployment/update/rollback automation and channel management documentation. |

View File

@@ -56,8 +56,16 @@ runtime wiring, CLI usage) and leaves connector/internal customization for later
- `GET /jobs` + `POST /jobs/{kind}` inspect and trigger connector/export jobs
> **Security note** authentication now ships via StellaOps Authority. Keep
> `authority.allowAnonymousFallback: true` only during the staged rollout and
> disable it before **2025-12-31 UTC** so tokens become mandatory.
+Rollout checkpoints for the two Authority toggles:
+| Phase | `authority.enabled` | `authority.allowAnonymousFallback` | Goal | Observability focus |
+| ----- | ------------------- | ---------------------------------- | ---- | ------------------- |
+| **Validation (staging)** | `true` | `true` | Verify token issuance, CLI scopes, and audit log noise without breaking cron jobs. | Watch `Concelier.Authorization.Audit` for `bypass=True` events and scope gaps; confirm CLI `auth status` succeeds. |
+| **Cutover rehearsal** | `true` | `false` | Exercise production-style enforcement before the deadline; ensure only approved maintenance ranges remain in `bypassNetworks`. | Expect some HTTP 401s; verify `web.jobs.triggered` metrics flatten for unauthenticated calls and audit logs highlight missing tokens. |
+| **Enforced (steady state)** | `true` | `false` | Production baseline after the 2025-12-31 UTC cutoff. | Alert on new `bypass=True` entries and on repeated 401 bursts; correlate with Authority availability dashboards. |
### Authority companion configuration (preview)
@@ -243,10 +251,10 @@ a problem document.
---
## 6 · Authority Integration
- Concelier now authenticates callers through StellaOps Authority using OAuth 2.0
resource server flows. Populate the `authority` block in `concelier.yaml`:
```yaml
authority:
@@ -282,8 +290,12 @@ a problem document.
export CONCELIER_AUTHORITY__CLIENTSECRETFILE="/var/run/secrets/concelier/authority-client"
```
- CLI commands already pass `Authorization` headers when credentials are supplied.
Configure the CLI with matching Authority settings (`docs/09_API_CLI_REFERENCE.md`)
so that automation can obtain tokens with the same client credentials. Concelier
logs every job request with the client ID, subject (if present), scopes, and
a `bypass` flag so operators can audit cron traffic.
- **Rollout checklist.**
1. Stage the integration with fallback enabled (`allowAnonymousFallback=true`) and confirm CLI/token issuance using `stella auth status`.
2. Follow the rehearsal pattern (`allowAnonymousFallback=false`) while monitoring `Concelier.Authorization.Audit` and `web.jobs.triggered`/`web.jobs.trigger.failed` metrics.
3. Lock in enforcement, review the audit runbook (`docs/ops/concelier-authority-audit-runbook.md`), and document the bypass CIDR approvals in your change log.

View File

@@ -15,7 +15,7 @@
| DOCS-EVENTS-09-004 | DONE (2025-10-19) | Docs Guild, Scanner WebService | SCANNER-EVENTS-15-201 | Refresh scanner event docs to mirror DSSE-backed report fields, document `scanner.scan.completed`, and capture canonical sample validation. | Schemas updated for new payload shape; README references DSSE reuse and validation test; samples align with emitted events. |
| PLATFORM-EVENTS-09-401 | DONE (2025-10-21) | Platform Events Guild | DOCS-EVENTS-09-003 | Embed canonical event samples into contract/integration tests and ensure CI validates payloads against published schemas. | Notify models tests now run schema validation against `docs/events/*.json`, event schemas allow optional `attributes`, and docs capture the new validation workflow. |
| RUNTIME-GUILD-09-402 | DONE (2025-10-19) | Runtime Guild | SCANNER-POLICY-09-107 | Confirm Scanner WebService surfaces `quietedFindingCount` and progress hints to runtime consumers; document readiness checklist. | Runtime verification run captures enriched payload; checklist/doc updates merged; stakeholders acknowledge availability. |
-| DOCS-CONCELIER-07-201 | TODO | Docs Guild, Concelier WebService | FEEDWEB-DOCS-01-001 | Final editorial review and publish pass for Concelier authority toggle documentation (Quickstart + operator guide). | Review feedback resolved, publish PR merged, release notes updated with documentation pointer. |
+| DOCS-CONCELIER-07-201 | DONE (2025-10-22) | Docs Guild, Concelier WebService | FEEDWEB-DOCS-01-001 | Final editorial review and publish pass for Concelier authority toggle documentation (Quickstart + operator guide). | Review feedback resolved, publish PR merged, release notes updated with documentation pointer. |
| DOCS-RUNTIME-17-004 | TODO | Docs Guild, Runtime Guild | SCANNER-EMIT-17-701, ZASTAVA-OBS-17-005, DEVOPS-REL-17-002 | Document build-id workflows: SBOM exposure, runtime event payloads, debug-store layout, and operator guidance for symbol retrieval. | Architecture + operator docs updated with build-id sections, examples show `readelf` output + debuginfod usage, references linked from Offline Kit/Release guides. |
> Update statuses (TODO/DOING/REVIEW/DONE/BLOCKED) as progress changes. Keep guides in sync with configuration samples under `etc/`.

View File

@@ -1,14 +1,15 @@
# Concelier Authority Audit Runbook
-_Last updated: 2025-10-12_
+_Last updated: 2025-10-22_
This runbook helps operators verify and monitor the StellaOps Concelier ⇆ Authority integration. It focuses on the `/jobs*` surface, which now requires StellaOps Authority tokens, and the corresponding audit/metric signals that expose authentication and bypass activity.
## 1. Prerequisites
- Authority integration is enabled in `concelier.yaml` (or via `CONCELIER_AUTHORITY__*` environment variables) with a valid `clientId`, secret, audience, and required scopes.
- OTLP metrics/log exporters are configured (`concelier.telemetry.*`) or container stdout is shipped to your SIEM.
- Operators have access to the Concelier job trigger endpoints via CLI or REST for smoke tests.
+- The rollout table in `docs/10_CONCELIER_CLI_QUICKSTART.md` has been reviewed so stakeholders align on the staged → enforced toggle timeline.
### Configuration snippet
@@ -112,9 +113,10 @@ Correlate audit logs with the following global meter exported via `Concelier.Sou
## 4. Rollout & Verification Procedure
1. **Pre-checks**
+   - Align with the rollout phases documented in `docs/10_CONCELIER_CLI_QUICKSTART.md` (validation → rehearsal → enforced) and record the target dates in your change request.
   - Confirm `allowAnonymousFallback` is `false` in production; keep `true` only during staged validation.
   - Validate Authority issuer metadata is reachable from Concelier (`curl https://authority.internal/.well-known/openid-configuration` from the host).
2. **Smoke test with valid token**
   - Obtain a token via CLI: `stella auth login --scope concelier.jobs.trigger`.

View File

@@ -0,0 +1,12 @@
# Docs Guild Update — 2025-10-22
**Subject:** Concelier Authority toggle rollout polish
**Audience:** Docs Guild, Concelier WebService Guild, Authority Core
- Added a rollout phase table to `docs/10_CONCELIER_CLI_QUICKSTART.md`, clarifying how `authority.enabled` and `authority.allowAnonymousFallback` move from validation to enforced mode and highlighting the audit/metric signals to watch at each step.
- Extended the Authority integration checklist in the same quickstart so operators tie CLI smoke tests to audit counters before flipping enforcement.
- Refreshed `docs/ops/concelier-authority-audit-runbook.md` with the latest date stamp, prerequisites, and pre-check guidance that reference the quickstart timeline; keeps change-request templates aligned.
Next steps:
- Concelier WebService owners to link this update in the next deployment bulletin once FEEDWEB-DOCS-01-001 clears review.
- Docs Guild to verify the Offline Kit doc bundle picks up the quickstart/runbook changes after the nightly build.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@@ -15,5 +15,7 @@
| DEVOPS-LAUNCH-18-900 | TODO | DevOps Guild, Module Leads | Wave 0 completion | Collect full implementation sign-off from module owners and consolidate launch readiness checklist. | Sign-off record stored under `docs/ops/launch-readiness.md`; outstanding gaps triaged; checklist approved. |
| DEVOPS-LAUNCH-18-001 | TODO | DevOps Guild | DEVOPS-LAUNCH-18-100, DEVOPS-LAUNCH-18-900 | Production launch cutover rehearsal and runbook publication. | `docs/ops/launch-cutover.md` drafted, rehearsal executed with rollback drill, approvals captured. |
| DEVOPS-NUGET-13-001 | TODO | DevOps Guild, Platform Leads | DEVOPS-REL-14-001 | Add .NET 10 preview feeds / local mirrors so `Microsoft.Extensions.*` 10.0 preview packages restore offline; refresh restore docs. | NuGet.config maps preview feeds (or local mirrored packages), `dotnet restore` succeeds for Excititor/Concelier solutions without ad-hoc feed edits, docs updated for offline bootstrap. |
+| DEVOPS-NUGET-13-002 | TODO | DevOps Guild | DEVOPS-NUGET-13-001 | Ensure all solutions/projects prefer `local-nuget` before public sources and document restore order validation. | `NuGet.config` and solution-level configs resolve from `local-nuget` first; automated check verifies priority; docs updated for restore ordering. |
+| DEVOPS-NUGET-13-003 | TODO | DevOps Guild, Platform Leads | DEVOPS-NUGET-13-002 | Sweep `Microsoft.*` NuGet dependencies pinned to 8.* and upgrade to latest .NET 10 equivalents (or .NET 9 when 10 unavailable), updating restore guidance. | Dependency audit shows no 8.* `Microsoft.*` packages remaining; CI builds green; changelog/doc sections capture upgrade rationale. |
> Remark (2025-10-20): Repacked `Mongo2Go` local feed to require MongoDB.Driver 3.5.0 + SharpCompress 0.41.0; cache regression tests green and NU1902/NU1903 suppressed.
> Remark (2025-10-21): Compose/Helm profiles now surface `SCANNER__EVENTS__*` toggles with docs pointing at new `.env` placeholders.

View File

@@ -0,0 +1,23 @@
{
"schemaVersion": "1.0",
"id": "stellaops.analyzer.lang.rust",
"displayName": "StellaOps Rust Analyzer (preview)",
"version": "0.1.0",
"requiresRestart": true,
"entryPoint": {
"type": "dotnet",
"assembly": "StellaOps.Scanner.Analyzers.Lang.Rust.dll",
"typeName": "StellaOps.Scanner.Analyzers.Lang.Rust.RustAnalyzerPlugin"
},
"capabilities": [
"language-analyzer",
"rust",
"cargo"
],
"metadata": {
"org.stellaops.analyzer.language": "rust",
"org.stellaops.analyzer.kind": "language",
"org.stellaops.restart.required": "true",
"org.stellaops.analyzer.status": "preview"
}
}
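The manifest above is everything the restart-time plug-in host needs in order to locate and register the Rust analyzer. The host-side loader is not shown in this commit; a hypothetical deserialization sketch (types named here purely for illustration) would simply mirror the JSON keys:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical shapes for reading the analyzer plug-in manifest shown above;
// the real host-side types live elsewhere in the repository.
public sealed record PluginEntryPoint(
    [property: JsonPropertyName("type")] string Type,
    [property: JsonPropertyName("assembly")] string Assembly,
    [property: JsonPropertyName("typeName")] string TypeName);

public sealed record PluginManifest(
    [property: JsonPropertyName("schemaVersion")] string SchemaVersion,
    [property: JsonPropertyName("id")] string Id,
    [property: JsonPropertyName("displayName")] string DisplayName,
    [property: JsonPropertyName("version")] string Version,
    [property: JsonPropertyName("requiresRestart")] bool RequiresRestart,
    [property: JsonPropertyName("entryPoint")] PluginEntryPoint EntryPoint,
    [property: JsonPropertyName("capabilities")] string[] Capabilities,
    [property: JsonPropertyName("metadata")] Dictionary<string, string> Metadata);

public static class PluginManifestLoader
{
    public static PluginManifest Load(string path)
        => JsonSerializer.Deserialize<PluginManifest>(File.ReadAllText(path))
           ?? throw new InvalidOperationException($"Manifest {path} is empty or invalid.");
}
```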

View File

@@ -2,9 +2,10 @@ using System.Collections.Immutable;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using StellaOps.Excititor.Attestation.Dsse;
using StellaOps.Excititor.Attestation.Signing;
using StellaOps.Excititor.Attestation.Transparency;
+using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Attestation.Tests;
@@ -16,7 +17,8 @@ public sealed class VexAttestationClientTests
var signer = new FakeSigner();
var builder = new VexDsseBuilder(signer, NullLogger<VexDsseBuilder>.Instance);
var options = Options.Create(new VexAttestationClientOptions());
-var client = new VexAttestationClient(builder, options, NullLogger<VexAttestationClient>.Instance);
+var verifier = new FakeVerifier();
+var client = new VexAttestationClient(builder, options, NullLogger<VexAttestationClient>.Instance, verifier);
var request = new VexAttestationRequest(
ExportId: "exports/456",
@@ -40,8 +42,9 @@ public sealed class VexAttestationClientTests
var signer = new FakeSigner();
var builder = new VexDsseBuilder(signer, NullLogger<VexDsseBuilder>.Instance);
var options = Options.Create(new VexAttestationClientOptions());
var transparency = new FakeTransparencyLogClient();
-var client = new VexAttestationClient(builder, options, NullLogger<VexAttestationClient>.Instance, transparencyLogClient: transparency);
+var verifier = new FakeVerifier();
+var client = new VexAttestationClient(builder, options, NullLogger<VexAttestationClient>.Instance, verifier, transparencyLogClient: transparency);
var request = new VexAttestationRequest(
ExportId: "exports/789",
@@ -65,9 +68,9 @@ public sealed class VexAttestationClientTests
=> ValueTask.FromResult(new VexSignedPayload("signature", "key"));
}
private sealed class FakeTransparencyLogClient : ITransparencyLogClient
{
public bool SubmitCalled { get; private set; }
public ValueTask<TransparencyLogEntry> SubmitAsync(DsseEnvelope envelope, CancellationToken cancellationToken)
{
@@ -75,7 +78,13 @@ public sealed class VexAttestationClientTests
return ValueTask.FromResult(new TransparencyLogEntry(Guid.NewGuid().ToString(), "https://rekor.example/entries/123", "23", null));
}
public ValueTask<bool> VerifyAsync(string entryLocation, CancellationToken cancellationToken)
=> ValueTask.FromResult(true);
}
+private sealed class FakeVerifier : IVexAttestationVerifier
+{
+public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
+=> ValueTask.FromResult(new VexAttestationVerification(true, ImmutableDictionary<string, string>.Empty));
+}
}

View File

@@ -0,0 +1,132 @@
using System.Collections.Immutable;
using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;
using StellaOps.Excititor.Attestation.Dsse;
using StellaOps.Excititor.Attestation.Signing;
using StellaOps.Excititor.Attestation.Transparency;
using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Attestation.Tests;
public sealed class VexAttestationVerifierTests : IDisposable
{
private readonly VexAttestationMetrics _metrics = new();
[Fact]
public async Task VerifyAsync_ReturnsValid_WhenEnvelopeMatches()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("valid", verification.Diagnostics["result"]);
}
[Fact]
public async Task VerifyAsync_ReturnsInvalid_WhenDigestMismatch()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync();
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var tamperedMetadata = new VexAttestationMetadata(
metadata.PredicateType,
metadata.Rekor,
"sha256:deadbeef",
metadata.SignedAt);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, tamperedMetadata, envelope),
CancellationToken.None);
Assert.False(verification.IsValid);
Assert.Equal("invalid", verification.Diagnostics["result"]);
Assert.Equal("sha256:deadbeef", verification.Diagnostics["metadata.envelopeDigest"]);
}
[Fact]
public async Task VerifyAsync_AllowsOfflineTransparency_WhenConfigured()
{
var (request, metadata, envelope) = await CreateSignedAttestationAsync(includeRekor: true);
var transparency = new ThrowingTransparencyLogClient();
var verifier = CreateVerifier(options =>
{
options.AllowOfflineTransparency = true;
options.RequireTransparencyLog = true;
}, transparency);
var verification = await verifier.VerifyAsync(
new VexAttestationVerificationRequest(request, metadata, envelope),
CancellationToken.None);
Assert.True(verification.IsValid);
Assert.Equal("offline", verification.Diagnostics["rekor.state"]);
}
private async Task<(VexAttestationRequest Request, VexAttestationMetadata Metadata, string Envelope)> CreateSignedAttestationAsync(bool includeRekor = false)
{
var signer = new FakeSigner();
var builder = new VexDsseBuilder(signer, NullLogger<VexDsseBuilder>.Instance);
var options = Options.Create(new VexAttestationClientOptions());
var transparency = includeRekor ? new FakeTransparencyLogClient() : null;
var verifier = CreateVerifier(options => options.RequireTransparencyLog = false);
var client = new VexAttestationClient(builder, options, NullLogger<VexAttestationClient>.Instance, verifier, transparency);
var request = new VexAttestationRequest(
ExportId: "exports/unit-test",
QuerySignature: new VexQuerySignature("filters"),
Artifact: new VexContentAddress("sha256", "cafebabe"),
Format: VexExportFormat.Json,
CreatedAt: DateTimeOffset.UtcNow,
SourceProviders: ImmutableArray.Create("provider-a"),
Metadata: ImmutableDictionary<string, string>.Empty);
var response = await client.SignAsync(request, CancellationToken.None);
var envelope = response.Diagnostics["envelope"];
return (request, response.Attestation, envelope);
}
private VexAttestationVerifier CreateVerifier(Action<VexAttestationVerificationOptions>? configureOptions = null, ITransparencyLogClient? transparency = null)
{
var options = new VexAttestationVerificationOptions();
configureOptions?.Invoke(options);
return new VexAttestationVerifier(
NullLogger<VexAttestationVerifier>.Instance,
transparency,
Options.Create(options),
_metrics);
}
public void Dispose()
{
_metrics.Dispose();
}
private sealed class FakeSigner : IVexSigner
{
public ValueTask<VexSignedPayload> SignAsync(ReadOnlyMemory<byte> payload, CancellationToken cancellationToken)
=> ValueTask.FromResult(new VexSignedPayload("signature", "key"));
}
private sealed class FakeTransparencyLogClient : ITransparencyLogClient
{
public ValueTask<TransparencyLogEntry> SubmitAsync(DsseEnvelope envelope, CancellationToken cancellationToken)
=> ValueTask.FromResult(new TransparencyLogEntry(Guid.NewGuid().ToString(), "https://rekor.example/entries/123", "42", null));
public ValueTask<bool> VerifyAsync(string entryLocation, CancellationToken cancellationToken)
=> ValueTask.FromResult(true);
}
private sealed class ThrowingTransparencyLogClient : ITransparencyLogClient
{
public ValueTask<TransparencyLogEntry> SubmitAsync(DsseEnvelope envelope, CancellationToken cancellationToken)
=> throw new NotSupportedException();
public ValueTask<bool> VerifyAsync(string entryLocation, CancellationToken cancellationToken)
=> throw new HttpRequestException("rekor unavailable");
}
}

View File

@@ -14,9 +14,9 @@ using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Attestation.Dsse;
public sealed class VexDsseBuilder
{
-private const string PayloadType = "application/vnd.in-toto+json";
+internal const string PayloadType = "application/vnd.in-toto+json";
private readonly IVexSigner _signer;
private readonly ILogger<VexDsseBuilder> _logger;

View File

@@ -1,7 +1,7 @@
# EXCITITOR-ATTEST-01-003 - Verification & Observability Plan
- **Date:** 2025-10-19
-- **Status:** Draft
+- **Status:** In progress (2025-10-22)
- **Owner:** Team Excititor Attestation
- **Related tasks:** EXCITITOR-ATTEST-01-003 (Wave 0), EXCITITOR-WEB-01-003/004, EXCITITOR-WORKER-01-003
- **Prerequisites satisfied:** EXCITITOR-ATTEST-01-002 (Rekor v2 client integration)
@@ -141,9 +141,17 @@ Metrics must register via static helper using `Meter` and support offline operat
- Do we need cross-module eventing when verification fails (e.g., notify Export module) or is logging sufficient in Wave 0? (Proposed: log + metrics now, escalate in later wave.)
- Confirm whether Worker re-verification writes to Mongo or triggers Export module to re-sign artifacts automatically; placeholder: record status + timestamp only.
## 10. Acceptance Criteria
- Plan approved by Attestation + WebService + Worker leads.
- Metrics/logging names peer-reviewed to avoid collisions.
- Test backlog items entered into respective `TASKS.md` once implementation starts.
- Documentation (this plan) linked from `TASKS.md` notes for discoverability.
## 11. 2025-10-22 Progress Notes
- Implemented `IVexAttestationVerifier`/`VexAttestationVerifier` with structural validation (subject/predicate checks, digest comparison, Rekor probes) and diagnostics map.
- Added `VexAttestationVerificationOptions` (RequireTransparencyLog, AllowOfflineTransparency, MaxClockSkew) and wired configuration through WebService DI.
- Created `VexAttestationMetrics` (`excititor.attestation.verify_total`, `excititor.attestation.verify_duration_seconds`) and hooked into verification flow with component/rekor tags.
- `VexAttestationClient.VerifyAsync` now delegates to the verifier; DI registers metrics + verifier via `AddVexAttestation`.
- Added unit coverage in `VexAttestationVerifierTests` (happy path, digest mismatch, offline Rekor) and updated client/export/webservice stubs to new verification signature.

View File

@@ -1,18 +1,21 @@
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Excititor.Attestation.Dsse;
using StellaOps.Excititor.Attestation.Transparency;
+using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Attestation.Extensions;
public static class VexAttestationServiceCollectionExtensions
{
public static IServiceCollection AddVexAttestation(this IServiceCollection services)
{
services.AddSingleton<VexDsseBuilder>();
+services.AddSingleton<VexAttestationMetrics>();
+services.AddSingleton<IVexAttestationVerifier, VexAttestationVerifier>();
services.AddSingleton<IVexAttestationClient, VexAttestationClient>();
return services;
}
public static IServiceCollection AddVexRekorClient(this IServiceCollection services, Action<RekorHttpClientOptions> configure)
{

View File

@@ -4,4 +4,6 @@ If you are working on this file you need to read docs/ARCHITECTURE_EXCITITOR.md
|---|---|---|---|
|EXCITITOR-ATTEST-01-001 In-toto predicate & DSSE builder|Team Excititor Attestation|EXCITITOR-CORE-01-001|**DONE (2025-10-16)** Added deterministic in-toto predicate/statement models, DSSE envelope builder wired to signer abstraction, and attestation client producing metadata + diagnostics.|
|EXCITITOR-ATTEST-01-002 Rekor v2 client integration|Team Excititor Attestation|EXCITITOR-ATTEST-01-001|**DONE (2025-10-16)** Implemented Rekor HTTP client with retry/backoff, transparency log abstraction, DI helpers, and attestation client integration capturing Rekor metadata + diagnostics.|
-|EXCITITOR-ATTEST-01-003 Verification suite & observability|Team Excititor Attestation|EXCITITOR-ATTEST-01-002|DOING (2025-10-19) Add verification helpers for Worker/WebService, metrics/logging hooks, and negative-path regression tests. Draft plan logged in `EXCITITOR-ATTEST-01-003-plan.md` (2025-10-19).|
+|EXCITITOR-ATTEST-01-003 Verification suite & observability|Team Excititor Attestation|EXCITITOR-ATTEST-01-002|DOING (2025-10-22) Continuing implementation: build `IVexAttestationVerifier`, wire metrics/logging, and add regression tests. Draft plan in `EXCITITOR-ATTEST-01-003-plan.md` (2025-10-19) guides scope; updating with worknotes as progress lands.|
> Remark (2025-10-22): Added verifier implementation + metrics/tests; next steps include wiring into WebService/Worker flows and expanding negative-path coverage.

View File

@@ -0,0 +1,10 @@
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Attestation.Verification;
public interface IVexAttestationVerifier
{
ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken);
}

View File

@@ -0,0 +1,35 @@
using System;
using System.Diagnostics.Metrics;
namespace StellaOps.Excititor.Attestation.Verification;
public sealed class VexAttestationMetrics : IDisposable
{
public const string MeterName = "StellaOps.Excititor.Attestation";
public const string MeterVersion = "1.0";
private readonly Meter _meter;
private bool _disposed;
public VexAttestationMetrics()
{
_meter = new Meter(MeterName, MeterVersion);
VerifyTotal = _meter.CreateCounter<long>("excititor.attestation.verify_total", description: "Attestation verification attempts grouped by result/component/rekor.");
VerifyDuration = _meter.CreateHistogram<double>("excititor.attestation.verify_duration_seconds", unit: "s", description: "Attestation verification latency in seconds.");
}
public Counter<long> VerifyTotal { get; }
public Histogram<double> VerifyDuration { get; }
public void Dispose()
{
if (_disposed)
{
return;
}
_meter.Dispose();
_disposed = true;
}
}
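The recording site for these instruments lives in `VexAttestationVerifier`, which is only partially visible in this excerpt. As a hedged illustration of how the counter and histogram are meant to be fed with the `result`/`component`/`rekor` tags described in the plan notes (the helper name below is invented for this note, not repository code):

```csharp
using System;
using System.Collections.Generic;

// Illustration only: one way a verification path could record the instruments above.
public static class VexAttestationMetricsUsage
{
    public static void RecordVerification(VexAttestationMetrics metrics, string result, string component, string rekorState, TimeSpan elapsed)
    {
        var tags = new KeyValuePair<string, object?>[]
        {
            new("result", result),
            new("component", component),
            new("rekor", rekorState),
        };

        metrics.VerifyTotal.Add(1, tags);
        metrics.VerifyDuration.Record(elapsed.TotalSeconds, tags);
    }
}
```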

View File

@@ -0,0 +1,28 @@
using System;
namespace StellaOps.Excititor.Attestation.Verification;
public sealed class VexAttestationVerificationOptions
{
private TimeSpan _maxClockSkew = TimeSpan.FromMinutes(5);
/// <summary>
/// When true, verification fails if no transparency record is present.
/// </summary>
public bool RequireTransparencyLog { get; set; } = true;
/// <summary>
/// Allows verification to succeed when the transparency log cannot be reached.
/// A diagnostic entry is still emitted to signal the degraded state.
/// </summary>
public bool AllowOfflineTransparency { get; set; }
/// <summary>
/// Maximum tolerated clock skew between the attestation creation time and the verification context timestamp.
/// </summary>
public TimeSpan MaxClockSkew
{
get => _maxClockSkew;
set => _maxClockSkew = value < TimeSpan.Zero ? TimeSpan.Zero : value;
}
}
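The plan notes say these options are wired through WebService DI; that wiring is not part of this excerpt. A minimal sketch using the standard Options pattern, assuming an air-gapped profile (the extension method name is invented for this note):

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Excititor.Attestation.Extensions;
using StellaOps.Excititor.Attestation.Verification;

// Minimal sketch: register attestation services and tune verification behaviour
// for a deployment that cannot reach the transparency log.
public static class AttestationVerificationSetup
{
    public static IServiceCollection AddOfflineFriendlyAttestation(this IServiceCollection services)
    {
        services.AddVexAttestation();
        services.Configure<VexAttestationVerificationOptions>(options =>
        {
            options.RequireTransparencyLog = true;    // still expect a Rekor entry in metadata
            options.AllowOfflineTransparency = true;  // but tolerate the log being unreachable
            options.MaxClockSkew = TimeSpan.FromMinutes(10);
        });
        return services;
    }
}
```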

View File

@@ -0,0 +1,470 @@
using System;
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Excititor.Attestation.Dsse;
using StellaOps.Excititor.Attestation.Models;
using StellaOps.Excititor.Attestation.Transparency;
using StellaOps.Excititor.Core;
namespace StellaOps.Excititor.Attestation.Verification;
internal sealed class VexAttestationVerifier : IVexAttestationVerifier
{
private static readonly JsonSerializerOptions EnvelopeSerializerOptions = new()
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
};
private static readonly JsonSerializerOptions StatementSerializerOptions = new()
{
PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
DefaultIgnoreCondition = JsonIgnoreCondition.Never,
Converters = { new JsonStringEnumConverter(JsonNamingPolicy.CamelCase) },
};
private readonly ILogger<VexAttestationVerifier> _logger;
private readonly ITransparencyLogClient? _transparencyLogClient;
private readonly VexAttestationVerificationOptions _options;
private readonly VexAttestationMetrics _metrics;
public VexAttestationVerifier(
ILogger<VexAttestationVerifier> logger,
ITransparencyLogClient? transparencyLogClient,
IOptions<VexAttestationVerificationOptions> options,
VexAttestationMetrics metrics)
{
_logger = logger ?? throw new ArgumentNullException(nameof(logger));
ArgumentNullException.ThrowIfNull(options);
_transparencyLogClient = transparencyLogClient;
_options = options.Value;
_metrics = metrics ?? throw new ArgumentNullException(nameof(metrics));
}
public async ValueTask<VexAttestationVerification> VerifyAsync(
VexAttestationVerificationRequest request,
CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(request);
var stopwatch = Stopwatch.StartNew();
var diagnostics = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
var resultLabel = "valid";
var rekorState = "skipped";
var component = request.IsReverify ? "worker" : "webservice";
try
{
if (string.IsNullOrWhiteSpace(request.Envelope))
{
diagnostics["envelope.state"] = "missing";
_logger.LogWarning("Attestation envelope is missing for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!TryDeserializeEnvelope(request.Envelope, out var envelope, diagnostics))
{
_logger.LogWarning("Failed to deserialize attestation envelope for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!string.Equals(envelope.PayloadType, VexDsseBuilder.PayloadType, StringComparison.OrdinalIgnoreCase))
{
diagnostics["payload.type"] = envelope.PayloadType ?? string.Empty;
_logger.LogWarning(
"Unexpected DSSE payload type {PayloadType} for export {ExportId}",
envelope.PayloadType,
request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (envelope.Signatures is null || envelope.Signatures.Length == 0)
{
diagnostics["signature.state"] = "missing";
_logger.LogWarning("Attestation envelope for export {ExportId} does not contain signatures.", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!TryDecodePayload(envelope.PayloadBase64, out var payloadBytes, diagnostics))
{
_logger.LogWarning("Failed to decode attestation payload for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!TryDeserializeStatement(payloadBytes, out var statement, diagnostics))
{
_logger.LogWarning("Failed to deserialize DSSE statement for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidatePredicateType(statement, request, diagnostics))
{
_logger.LogWarning("Predicate type mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidateSubject(statement, request, diagnostics))
{
_logger.LogWarning("Subject mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidatePredicate(statement, request, diagnostics))
{
_logger.LogWarning("Predicate payload mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidateMetadataDigest(envelope, request.Metadata, diagnostics))
{
_logger.LogWarning("Attestation digest mismatch for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
if (!ValidateSignedAt(request.Metadata, request.Attestation.CreatedAt, diagnostics))
{
_logger.LogWarning("SignedAt validation failed for export {ExportId}", request.Attestation.ExportId);
resultLabel = "invalid";
return BuildResult(false);
}
rekorState = await VerifyTransparencyAsync(request.Metadata, diagnostics, cancellationToken).ConfigureAwait(false);
if (rekorState is "missing" or "unverified" or "client_unavailable")
{
resultLabel = "invalid";
return BuildResult(false);
}
diagnostics["signature.state"] = "present";
return BuildResult(true);
}
catch (Exception ex)
{
diagnostics["error"] = ex.GetType().Name;
diagnostics["error.message"] = ex.Message;
resultLabel = "error";
_logger.LogError(ex, "Unexpected exception verifying attestation for export {ExportId}", request.Attestation.ExportId);
return BuildResult(false);
}
finally
{
stopwatch.Stop();
var tags = new KeyValuePair<string, object?>[]
{
new("result", resultLabel),
new("component", component),
new("rekor", rekorState),
};
_metrics.VerifyTotal.Add(1, tags);
_metrics.VerifyDuration.Record(stopwatch.Elapsed.TotalSeconds, tags);
}
VexAttestationVerification BuildResult(bool isValid)
{
diagnostics["result"] = resultLabel;
diagnostics["component"] = component;
diagnostics["rekor.state"] = rekorState;
return new VexAttestationVerification(isValid, diagnostics.ToImmutable());
}
}
private static bool TryDeserializeEnvelope(
string envelopeJson,
out DsseEnvelope envelope,
ImmutableDictionary<string, string>.Builder diagnostics)
{
try
{
envelope = JsonSerializer.Deserialize<DsseEnvelope>(envelopeJson, EnvelopeSerializerOptions)
?? throw new InvalidOperationException("Envelope deserialized to null.");
return true;
}
catch (Exception ex)
{
diagnostics["envelope.error"] = ex.GetType().Name;
envelope = default!;
return false;
}
}
private static bool TryDecodePayload(
string payloadBase64,
out byte[] payloadBytes,
ImmutableDictionary<string, string>.Builder diagnostics)
{
try
{
payloadBytes = Convert.FromBase64String(payloadBase64);
return true;
}
catch (FormatException)
{
diagnostics["payload.base64"] = "invalid";
payloadBytes = Array.Empty<byte>();
return false;
}
}
private static bool TryDeserializeStatement(
byte[] payload,
out VexInTotoStatement statement,
ImmutableDictionary<string, string>.Builder diagnostics)
{
try
{
statement = JsonSerializer.Deserialize<VexInTotoStatement>(payload, StatementSerializerOptions)
?? throw new InvalidOperationException("Statement deserialized to null.");
return true;
}
catch (Exception ex)
{
diagnostics["payload.error"] = ex.GetType().Name;
statement = default!;
return false;
}
}
private static bool ValidatePredicateType(
VexInTotoStatement statement,
VexAttestationVerificationRequest request,
ImmutableDictionary<string, string>.Builder diagnostics)
{
var predicateType = statement.PredicateType ?? string.Empty;
if (!string.Equals(predicateType, request.Metadata.PredicateType, StringComparison.Ordinal))
{
diagnostics["predicate.type"] = predicateType;
return false;
}
return true;
}
private static bool ValidateSubject(
VexInTotoStatement statement,
VexAttestationVerificationRequest request,
ImmutableDictionary<string, string>.Builder diagnostics)
{
if (statement.Subject is null || statement.Subject.Count != 1)
{
diagnostics["subject.count"] = (statement.Subject?.Count ?? 0).ToString();
return false;
}
var subject = statement.Subject[0];
if (!string.Equals(subject.Name, request.Attestation.ExportId, StringComparison.Ordinal))
{
diagnostics["subject.name"] = subject.Name ?? string.Empty;
return false;
}
if (subject.Digest is null)
{
diagnostics["subject.digest"] = "missing";
return false;
}
var algorithmKey = request.Attestation.Artifact.Algorithm.ToLowerInvariant();
if (!subject.Digest.TryGetValue(algorithmKey, out var digest)
|| !string.Equals(digest, request.Attestation.Artifact.Digest, StringComparison.OrdinalIgnoreCase))
{
diagnostics["subject.digest"] = digest ?? string.Empty;
return false;
}
return true;
}
private bool ValidatePredicate(
VexInTotoStatement statement,
VexAttestationVerificationRequest request,
ImmutableDictionary<string, string>.Builder diagnostics)
{
var predicate = statement.Predicate;
if (predicate is null)
{
diagnostics["predicate.state"] = "missing";
return false;
}
if (!string.Equals(predicate.ExportId, request.Attestation.ExportId, StringComparison.Ordinal))
{
diagnostics["predicate.exportId"] = predicate.ExportId ?? string.Empty;
return false;
}
if (!string.Equals(predicate.QuerySignature, request.Attestation.QuerySignature.Value, StringComparison.Ordinal))
{
diagnostics["predicate.querySignature"] = predicate.QuerySignature ?? string.Empty;
return false;
}
if (!string.Equals(predicate.ArtifactAlgorithm, request.Attestation.Artifact.Algorithm, StringComparison.OrdinalIgnoreCase)
|| !string.Equals(predicate.ArtifactDigest, request.Attestation.Artifact.Digest, StringComparison.OrdinalIgnoreCase))
{
diagnostics["predicate.artifact"] = $"{predicate.ArtifactAlgorithm}:{predicate.ArtifactDigest}";
return false;
}
if (predicate.Format != request.Attestation.Format)
{
diagnostics["predicate.format"] = predicate.Format.ToString();
return false;
}
var createdDelta = (predicate.CreatedAt - request.Attestation.CreatedAt).Duration();
if (createdDelta > _options.MaxClockSkew)
{
diagnostics["predicate.createdAtDelta"] = createdDelta.ToString();
return false;
}
if (!SetEquals(predicate.SourceProviders, request.Attestation.SourceProviders))
{
diagnostics["predicate.sourceProviders"] = string.Join(",", predicate.SourceProviders ?? Array.Empty<string>());
return false;
}
if (request.Attestation.Metadata.Count > 0)
{
if (predicate.Metadata is null)
{
diagnostics["predicate.metadata"] = "missing";
return false;
}
foreach (var kvp in request.Attestation.Metadata)
{
if (!predicate.Metadata.TryGetValue(kvp.Key, out var actual)
|| !string.Equals(actual, kvp.Value, StringComparison.Ordinal))
{
diagnostics[$"predicate.metadata.{kvp.Key}"] = actual ?? string.Empty;
return false;
}
}
}
return true;
}
private bool ValidateMetadataDigest(
DsseEnvelope envelope,
VexAttestationMetadata metadata,
ImmutableDictionary<string, string>.Builder diagnostics)
{
if (string.IsNullOrWhiteSpace(metadata.EnvelopeDigest))
{
diagnostics["metadata.envelopeDigest"] = "missing";
return false;
}
var computed = VexDsseBuilder.ComputeEnvelopeDigest(envelope);
if (!string.Equals(computed, metadata.EnvelopeDigest, StringComparison.OrdinalIgnoreCase))
{
diagnostics["metadata.envelopeDigest"] = metadata.EnvelopeDigest;
diagnostics["metadata.envelopeDigest.computed"] = computed;
return false;
}
diagnostics["metadata.envelopeDigest"] = "match";
return true;
}
private bool ValidateSignedAt(
VexAttestationMetadata metadata,
DateTimeOffset createdAt,
ImmutableDictionary<string, string>.Builder diagnostics)
{
if (metadata.SignedAt is null)
{
diagnostics["metadata.signedAt"] = "missing";
return false;
}
var delta = (metadata.SignedAt.Value - createdAt).Duration();
if (delta > _options.MaxClockSkew)
{
diagnostics["metadata.signedAtDelta"] = delta.ToString();
return false;
}
return true;
}
private async ValueTask<string> VerifyTransparencyAsync(
VexAttestationMetadata metadata,
ImmutableDictionary<string, string>.Builder diagnostics,
CancellationToken cancellationToken)
{
if (metadata.Rekor is null)
{
if (_options.RequireTransparencyLog)
{
diagnostics["rekor.state"] = "missing";
return "missing";
}
diagnostics["rekor.state"] = "disabled";
return "disabled";
}
if (_transparencyLogClient is null)
{
diagnostics["rekor.state"] = "client_unavailable";
return _options.RequireTransparencyLog ? "client_unavailable" : "disabled";
}
try
{
var verified = await _transparencyLogClient.VerifyAsync(metadata.Rekor.Location, cancellationToken).ConfigureAwait(false);
diagnostics["rekor.state"] = verified ? "verified" : "unverified";
return verified ? "verified" : "unverified";
}
catch (Exception ex)
{
diagnostics["rekor.error"] = ex.GetType().Name;
if (_options.AllowOfflineTransparency)
{
diagnostics["rekor.state"] = "offline";
return "offline";
}
diagnostics["rekor.state"] = "unreachable";
return "unreachable";
}
}
private static bool SetEquals(IReadOnlyCollection<string>? left, ImmutableArray<string> right)
{
if (left is null)
{
return right.IsDefaultOrEmpty;
}
if (left.Count != right.Length)
{
return false;
}
var leftSet = new HashSet<string>(left, StringComparer.Ordinal);
return right.All(leftSet.Contains);
}
}
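The DI registration for this verifier is not visible in this excerpt; the WebService change below only binds `VexAttestationVerificationOptions`. A minimal sketch of how the pieces could be wired, under the assumption that the `AddVexAttestation()` extension (or equivalent) performs it:

```csharp
using Microsoft.Extensions.DependencyInjection;
using StellaOps.Excititor.Attestation;
using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Core;

// Hypothetical wiring; the actual registrations live in the AddVexAttestation() extension,
// which this excerpt does not show.
services.AddSingleton<VexAttestationMetrics>();
services.AddSingleton<IVexAttestationVerifier, VexAttestationVerifier>();
services.AddSingleton<IVexAttestationClient, VexAttestationClient>();
```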

View File

@@ -4,13 +4,14 @@ using System.Collections.Immutable;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using StellaOps.Excititor.Attestation.Dsse;
using StellaOps.Excititor.Attestation.Models;
using StellaOps.Excititor.Attestation.Signing;
using StellaOps.Excititor.Attestation.Transparency;
using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Core;

namespace StellaOps.Excititor.Attestation;
@@ -19,28 +20,31 @@ public sealed class VexAttestationClientOptions
    public IReadOnlyDictionary<string, string> DefaultMetadata { get; set; } = ImmutableDictionary<string, string>.Empty;
}

public sealed class VexAttestationClient : IVexAttestationClient
{
    private readonly VexDsseBuilder _builder;
    private readonly ILogger<VexAttestationClient> _logger;
    private readonly TimeProvider _timeProvider;
    private readonly IReadOnlyDictionary<string, string> _defaultMetadata;
    private readonly ITransparencyLogClient? _transparencyLogClient;
    private readonly IVexAttestationVerifier _verifier;

    public VexAttestationClient(
        VexDsseBuilder builder,
        IOptions<VexAttestationClientOptions> options,
        ILogger<VexAttestationClient> logger,
        IVexAttestationVerifier verifier,
        TimeProvider? timeProvider = null,
        ITransparencyLogClient? transparencyLogClient = null)
    {
        _builder = builder ?? throw new ArgumentNullException(nameof(builder));
        ArgumentNullException.ThrowIfNull(options);
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _verifier = verifier ?? throw new ArgumentNullException(nameof(verifier));
        _timeProvider = timeProvider ?? TimeProvider.System;
        _defaultMetadata = options.Value.DefaultMetadata;
        _transparencyLogClient = transparencyLogClient;
    }

    public async ValueTask<VexAttestationResponse> SignAsync(VexAttestationRequest request, CancellationToken cancellationToken)
    {
@@ -82,11 +86,10 @@ public sealed class VexAttestationClient : IVexAttestationClient
        return new VexAttestationResponse(metadata, diagnosticsBuilder);
    }

    public ValueTask<VexAttestationVerification> VerifyAsync(
        VexAttestationVerificationRequest request,
        CancellationToken cancellationToken)
        => _verifier.VerifyAsync(request, cancellationToken);

    private static IReadOnlyDictionary<string, string> MergeMetadata(
        IReadOnlyDictionary<string, string> requestMetadata,

View File

@@ -5,26 +5,32 @@ using System.Threading.Tasks;
namespace StellaOps.Excititor.Core;

public interface IVexAttestationClient
{
    ValueTask<VexAttestationResponse> SignAsync(VexAttestationRequest request, CancellationToken cancellationToken);

    ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken);
}

public sealed record VexAttestationRequest(
    string ExportId,
    VexQuerySignature QuerySignature,
    VexContentAddress Artifact,
    VexExportFormat Format,
    DateTimeOffset CreatedAt,
    ImmutableArray<string> SourceProviders,
    ImmutableDictionary<string, string> Metadata);

public sealed record VexAttestationResponse(
    VexAttestationMetadata Attestation,
    ImmutableDictionary<string, string> Diagnostics);

public sealed record VexAttestationVerificationRequest(
    VexAttestationRequest Attestation,
    VexAttestationMetadata Metadata,
    string Envelope,
    bool IsReverify = false);

public sealed record VexAttestationVerification(
    bool IsValid,
    ImmutableDictionary<string, string> Diagnostics);

View File

@@ -374,19 +374,29 @@ public static class VexCanonicalJsonSerializer
"metadata", "metadata",
} }
}, },
{ {
typeof(VexAttestationResponse), typeof(VexAttestationResponse),
new[] new[]
{ {
"attestation", "attestation",
"diagnostics", "diagnostics",
} }
}, },
{ {
typeof(VexAttestationVerification), typeof(VexAttestationVerificationRequest),
new[] new[]
{ {
"isValid", "attestation",
"metadata",
"envelope",
"isReverify",
}
},
{
typeof(VexAttestationVerification),
new[]
{
"isValid",
"diagnostics", "diagnostics",
} }
}, },

View File

@@ -290,7 +290,7 @@ public sealed class ExportEngineTests
        return ValueTask.FromResult(Response);
    }

    public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
        => ValueTask.FromResult(new VexAttestationVerification(true, ImmutableDictionary<string, string>.Empty));
}

View File

@@ -140,7 +140,7 @@ internal static class TestServiceOverrides
    {
        var envelope = new DsseEnvelope(
            Convert.ToBase64String(Encoding.UTF8.GetBytes("{\"stub\":\"payload\"}")),
            "application/vnd.in-toto+json",
            new[]
            {
                new DsseSignature("attestation-signature", "attestation-key"),
@@ -149,13 +149,18 @@ internal static class TestServiceOverrides
        var diagnostics = ImmutableDictionary<string, string>.Empty
            .Add("envelope", JsonSerializer.Serialize(envelope));

        var metadata = new VexAttestationMetadata(
            "stub",
            envelopeDigest: VexDsseBuilder.ComputeEnvelopeDigest(envelope),
            signedAt: DateTimeOffset.UtcNow);

        var response = new VexAttestationResponse(
            metadata,
            diagnostics);
        return ValueTask.FromResult(response);
    }

    public ValueTask<VexAttestationVerification> VerifyAsync(VexAttestationVerificationRequest request, CancellationToken cancellationToken)
    {
        var verification = new VexAttestationVerification(true, ImmutableDictionary<string, string>.Empty);
        return ValueTask.FromResult(verification);

View File

@@ -1,8 +1,9 @@
using System.Collections.Generic;
using System.Linq;
using System.Collections.Immutable;
using Microsoft.AspNetCore.Authentication;
using Microsoft.Extensions.Options;
using StellaOps.Excititor.Attestation.Verification;
using StellaOps.Excititor.Attestation.Extensions;
using StellaOps.Excititor.Attestation;
using StellaOps.Excititor.Attestation.Transparency;
@@ -35,8 +36,9 @@ services.AddSingleton<IVexSignatureVerifier, NoopVexSignatureVerifier>();
services.AddScoped<IVexIngestOrchestrator, VexIngestOrchestrator>();
services.AddVexExportEngine();
services.AddVexExportCacheServices();
services.AddVexAttestation();
services.Configure<VexAttestationClientOptions>(configuration.GetSection("Excititor:Attestation:Client"));
services.Configure<VexAttestationVerificationOptions>(configuration.GetSection("Excititor:Attestation:Verification"));
services.AddVexPolicy();
services.AddRedHatCsafConnector();
services.Configure<MirrorDistributionOptions>(configuration.GetSection(MirrorDistributionOptions.SectionName));

View File

@@ -0,0 +1,14 @@
namespace StellaOps.Scanner.Analyzers.Lang.DotNet;
public interface IDotNetAuthenticodeInspector
{
DotNetAuthenticodeMetadata? TryInspect(string assemblyPath, CancellationToken cancellationToken);
}
public sealed record DotNetAuthenticodeMetadata(
string? Subject,
string? Issuer,
DateTimeOffset? NotBefore,
DateTimeOffset? NotAfter,
string? Thumbprint,
string? SerialNumber);
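The commit does not include a concrete inspector, so the analyzer treats Authenticode extraction as optional. A possible sketch, assuming Windows PE assemblies and the `X509Certificate.CreateFromSignedFile` API (Windows-only; the TASKS entry instead mentions an offline cert bundle, so treat this purely as an illustration):

```csharp
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Threading;

namespace StellaOps.Scanner.Analyzers.Lang.DotNet;

// Hypothetical implementation: read the embedded Authenticode signing certificate, if any.
internal sealed class DefaultDotNetAuthenticodeInspector : IDotNetAuthenticodeInspector
{
    public DotNetAuthenticodeMetadata? TryInspect(string assemblyPath, CancellationToken cancellationToken)
    {
        cancellationToken.ThrowIfCancellationRequested();
        try
        {
            using var certificate = new X509Certificate2(X509Certificate.CreateFromSignedFile(assemblyPath));
            return new DotNetAuthenticodeMetadata(
                Subject: certificate.Subject,
                Issuer: certificate.Issuer,
                NotBefore: certificate.NotBefore,
                NotAfter: certificate.NotAfter,
                Thumbprint: certificate.Thumbprint,
                SerialNumber: certificate.SerialNumber);
        }
        catch (CryptographicException)
        {
            return null; // Unsigned file or unreadable signature.
        }
        catch (PlatformNotSupportedException)
        {
            return null; // Authenticode inspection unavailable on this platform.
        }
    }
}
```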

View File

@@ -1,4 +1,8 @@
using System.Diagnostics;
using System.Globalization;
using System.Linq;
using System.Reflection;
using System.Security.Cryptography;
using System.Text.Json;

namespace StellaOps.Scanner.Analyzers.Lang.DotNet.Internal;
@@ -26,7 +30,7 @@ internal static class DotNetDependencyCollector
            return ValueTask.FromResult<IReadOnlyList<DotNetPackage>>(Array.Empty<DotNetPackage>());
        }

        var aggregator = new DotNetPackageAggregator(context);

        foreach (var depsPath in depsFiles)
        {
@@ -65,7 +69,7 @@ internal static class DotNetDependencyCollector
            }
        }

        var packages = aggregator.Build(cancellationToken);
        return ValueTask.FromResult<IReadOnlyList<DotNetPackage>>(packages);
    }
@@ -83,8 +87,19 @@ internal static class DotNetDependencyCollector
internal sealed class DotNetPackageAggregator
{
    private readonly LanguageAnalyzerContext _context;
    private readonly IDotNetAuthenticodeInspector? _authenticodeInspector;
    private readonly Dictionary<string, DotNetPackageBuilder> _packages = new(StringComparer.Ordinal);

    public DotNetPackageAggregator(LanguageAnalyzerContext context)
    {
        _context = context ?? throw new ArgumentNullException(nameof(context));
        if (context.TryGetService<IDotNetAuthenticodeInspector>(out var inspector))
        {
            _authenticodeInspector = inspector;
        }
    }

    public void Add(DotNetDepsFile depsFile, DotNetRuntimeConfig? runtimeConfig)
    {
        ArgumentNullException.ThrowIfNull(depsFile);
@@ -101,7 +116,7 @@ internal sealed class DotNetPackageAggregator
            if (!_packages.TryGetValue(key, out var builder))
            {
                builder = new DotNetPackageBuilder(_context, _authenticodeInspector, library.Id, normalizedId, library.Version);
                _packages[key] = builder;
            }
@@ -109,7 +124,7 @@ internal sealed class DotNetPackageAggregator
        }
    }

    public IReadOnlyList<DotNetPackage> Build(CancellationToken cancellationToken)
    {
        if (_packages.Count == 0)
        {
@@ -119,7 +134,8 @@ internal sealed class DotNetPackageAggregator
        var items = new List<DotNetPackage>(_packages.Count);
        foreach (var builder in _packages.Values)
        {
            cancellationToken.ThrowIfCancellationRequested();
            items.Add(builder.Build(cancellationToken));
        }

        items.Sort(static (left, right) => string.CompareOrdinal(left.ComponentKey, right.ComponentKey));
@@ -129,6 +145,9 @@ internal sealed class DotNetPackageAggregator
internal sealed class DotNetPackageBuilder
{
    private readonly LanguageAnalyzerContext _context;
    private readonly IDotNetAuthenticodeInspector? _authenticodeInspector;
    private readonly string _originalId;
    private readonly string _normalizedId;
    private readonly string _version;
@@ -147,10 +166,15 @@ internal sealed class DotNetPackageBuilder
    private readonly SortedSet<string> _runtimeConfigFrameworks = new(StringComparer.OrdinalIgnoreCase);
    private readonly SortedSet<string> _runtimeConfigGraph = new(StringComparer.OrdinalIgnoreCase);
    private readonly Dictionary<string, AssemblyMetadataAggregate> _assemblies = new(StringComparer.OrdinalIgnoreCase);
    private readonly Dictionary<string, NativeAssetAggregate> _nativeAssets = new(StringComparer.OrdinalIgnoreCase);
    private readonly HashSet<LanguageComponentEvidence> _evidence = new(new LanguageComponentEvidenceComparer());
    private bool _usedByEntrypoint;

    public DotNetPackageBuilder(LanguageAnalyzerContext context, IDotNetAuthenticodeInspector? authenticodeInspector, string originalId, string normalizedId, string version)
    {
        _context = context ?? throw new ArgumentNullException(nameof(context));
        _authenticodeInspector = authenticodeInspector;
        _originalId = string.IsNullOrWhiteSpace(originalId) ? normalizedId : originalId.Trim();
        _normalizedId = normalizedId;
        _version = version ?? string.Empty;
@@ -193,6 +217,8 @@ internal sealed class DotNetPackageBuilder
            AddIfPresent(_runtimeIdentifiers, rid);
        }

        AddRuntimeAssets(library);

        _evidence.Add(new LanguageComponentEvidence(
            LanguageEvidenceKind.File,
            "deps.json",
@@ -202,34 +228,11 @@ internal sealed class DotNetPackageBuilder
        if (runtimeConfig is not null)
        {
            AddRuntimeConfig(runtimeConfig);
        }
    }

    public DotNetPackage Build(CancellationToken cancellationToken)
    {
        var metadata = new List<KeyValuePair<string, string?>>(32)
        {
@@ -255,6 +258,12 @@ internal sealed class DotNetPackageBuilder
        AddIndexed(metadata, "runtimeconfig.framework", _runtimeConfigFrameworks);
        AddIndexed(metadata, "runtimeconfig.graph", _runtimeConfigGraph);

        var assemblies = CollectAssemblyMetadata(cancellationToken);
        AddAssemblyMetadata(metadata, assemblies);

        var nativeAssets = CollectNativeMetadata(cancellationToken);
        AddNativeMetadata(metadata, nativeAssets);

        metadata.Sort(static (left, right) => string.CompareOrdinal(left.Key, right.Key));

        var evidence = _evidence
@@ -269,11 +278,231 @@ internal sealed class DotNetPackageBuilder
            version: _version,
            metadata: metadata,
            evidence: evidence,
            usedByEntrypoint: _usedByEntrypoint);
}
private IReadOnlyList<AssemblyMetadataResult> CollectAssemblyMetadata(CancellationToken cancellationToken)
{
if (_assemblies.Count == 0)
{
return Array.Empty<AssemblyMetadataResult>();
}
var results = new List<AssemblyMetadataResult>(_assemblies.Count);
foreach (var aggregate in _assemblies.Values.OrderBy(static aggregate => aggregate.AssetRelativePath, StringComparer.Ordinal))
{
cancellationToken.ThrowIfCancellationRequested();
results.Add(aggregate.Build(_context, _authenticodeInspector, cancellationToken));
}
return results;
}
private IReadOnlyList<NativeAssetResult> CollectNativeMetadata(CancellationToken cancellationToken)
{
if (_nativeAssets.Count == 0)
{
return Array.Empty<NativeAssetResult>();
}
var results = new List<NativeAssetResult>(_nativeAssets.Count);
foreach (var aggregate in _nativeAssets.Values.OrderBy(static aggregate => aggregate.AssetRelativePath, StringComparer.Ordinal))
{
cancellationToken.ThrowIfCancellationRequested();
results.Add(aggregate.Build(_context, cancellationToken));
}
return results;
}
private void AddAssemblyMetadata(ICollection<KeyValuePair<string, string?>> metadata, IReadOnlyList<AssemblyMetadataResult> assemblies)
{
if (assemblies.Count == 0)
{
return;
}
for (var index = 0; index < assemblies.Count; index++)
{
var record = assemblies[index];
var prefix = $"assembly[{index}]";
if (record.UsedByEntrypoint)
{
_usedByEntrypoint = true;
}
AddIfPresent(metadata, $"{prefix}.assetPath", record.AssetPath);
AddIfPresent(metadata, $"{prefix}.path", record.RelativePath);
AddIndexed(metadata, $"{prefix}.tfm", record.TargetFrameworks);
AddIndexed(metadata, $"{prefix}.rid", record.RuntimeIdentifiers);
AddIfPresent(metadata, $"{prefix}.version", record.AssemblyVersion);
AddIfPresent(metadata, $"{prefix}.fileVersion", record.FileVersion);
AddIfPresent(metadata, $"{prefix}.publicKeyToken", record.PublicKeyToken);
AddIfPresent(metadata, $"{prefix}.strongName", record.StrongName);
AddIfPresent(metadata, $"{prefix}.company", record.CompanyName);
AddIfPresent(metadata, $"{prefix}.product", record.ProductName);
AddIfPresent(metadata, $"{prefix}.productVersion", record.ProductVersion);
AddIfPresent(metadata, $"{prefix}.fileDescription", record.FileDescription);
AddIfPresent(metadata, $"{prefix}.sha256", record.Sha256);
if (record.Authenticode is { } authenticode)
{
AddIfPresent(metadata, $"{prefix}.authenticode.subject", authenticode.Subject);
AddIfPresent(metadata, $"{prefix}.authenticode.issuer", authenticode.Issuer);
AddIfPresent(metadata, $"{prefix}.authenticode.notBefore", FormatTimestamp(authenticode.NotBefore));
AddIfPresent(metadata, $"{prefix}.authenticode.notAfter", FormatTimestamp(authenticode.NotAfter));
AddIfPresent(metadata, $"{prefix}.authenticode.thumbprint", authenticode.Thumbprint);
AddIfPresent(metadata, $"{prefix}.authenticode.serialNumber", authenticode.SerialNumber);
}
if (!string.IsNullOrEmpty(record.RelativePath))
{
_evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.File,
Source: "assembly",
Locator: record.RelativePath!,
Value: record.AssetPath,
Sha256: record.Sha256));
}
}
}
private void AddNativeMetadata(ICollection<KeyValuePair<string, string?>> metadata, IReadOnlyList<NativeAssetResult> nativeAssets)
{
if (nativeAssets.Count == 0)
{
return;
}
for (var index = 0; index < nativeAssets.Count; index++)
{
var record = nativeAssets[index];
var prefix = $"native[{index}]";
if (record.UsedByEntrypoint)
{
_usedByEntrypoint = true;
}
AddIfPresent(metadata, $"{prefix}.assetPath", record.AssetPath);
AddIfPresent(metadata, $"{prefix}.path", record.RelativePath);
AddIndexed(metadata, $"{prefix}.tfm", record.TargetFrameworks);
AddIndexed(metadata, $"{prefix}.rid", record.RuntimeIdentifiers);
AddIfPresent(metadata, $"{prefix}.sha256", record.Sha256);
if (!string.IsNullOrEmpty(record.RelativePath))
{
_evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.File,
Source: "native",
Locator: record.RelativePath!,
Value: record.AssetPath,
Sha256: record.Sha256));
}
}
}
private void AddRuntimeAssets(DotNetLibrary library)
{
foreach (var asset in library.RuntimeAssets)
{
switch (asset.Kind)
{
case DotNetLibraryAssetKind.Runtime:
AddRuntimeAssemblyAsset(asset, library.PackagePath);
break;
case DotNetLibraryAssetKind.Native:
AddNativeAsset(asset, library.PackagePath);
break;
}
}
}
private void AddRuntimeAssemblyAsset(DotNetLibraryAsset asset, string? packagePath)
{
var key = NormalizePath(asset.RelativePath);
if (string.IsNullOrEmpty(key))
{
return;
}
if (!_assemblies.TryGetValue(key, out var aggregate))
{
aggregate = new AssemblyMetadataAggregate(key);
_assemblies[key] = aggregate;
}
aggregate.AddManifestData(asset, packagePath);
}
private void AddNativeAsset(DotNetLibraryAsset asset, string? packagePath)
{
var key = NormalizePath(asset.RelativePath);
if (string.IsNullOrEmpty(key))
{
return;
}
if (!_nativeAssets.TryGetValue(key, out var aggregate))
{
aggregate = new NativeAssetAggregate(key);
_nativeAssets[key] = aggregate;
}
aggregate.AddManifestData(asset, packagePath);
}
private void AddRuntimeConfig(DotNetRuntimeConfig runtimeConfig)
{
AddIfPresent(_runtimeConfigPaths, runtimeConfig.RelativePath);
foreach (var tfm in runtimeConfig.Tfms)
{
AddIfPresent(_runtimeConfigTfms, tfm);
}
foreach (var framework in runtimeConfig.Frameworks)
{
AddIfPresent(_runtimeConfigFrameworks, framework);
}
foreach (var entry in runtimeConfig.RuntimeGraph)
{
var value = BuildRuntimeGraphValue(entry.Rid, entry.Fallbacks);
AddIfPresent(_runtimeConfigGraph, value);
}
_evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.File,
"runtimeconfig.json",
runtimeConfig.RelativePath,
Value: null,
Sha256: null));
}
private static void AddIfPresent(ICollection<KeyValuePair<string, string?>> metadata, string key, string? value)
{
if (metadata is null)
{
throw new ArgumentNullException(nameof(metadata));
}
if (string.IsNullOrWhiteSpace(key) || string.IsNullOrWhiteSpace(value))
{
return;
}
metadata.Add(new KeyValuePair<string, string?>(key, value));
    }

    private static void AddIfPresent(ISet<string> set, string? value, bool normalizeLower = false)
    {
        if (set is null)
        {
            throw new ArgumentNullException(nameof(set));
        }

        if (string.IsNullOrWhiteSpace(value))
        {
            return;
@@ -322,6 +551,148 @@ internal sealed class DotNetPackageBuilder
        return path.Replace('\\', '/');
    }
private static string NormalizePath(string? path)
{
if (string.IsNullOrWhiteSpace(path))
{
return string.Empty;
}
var normalized = path.Replace('\\', '/').Trim();
return string.IsNullOrEmpty(normalized) ? string.Empty : normalized;
}
private static string ConvertToPlatformPath(string path)
=> string.IsNullOrEmpty(path) ? "." : path.Replace('/', Path.DirectorySeparatorChar);
private static string CombineRelative(string basePath, string relativePath)
{
var left = NormalizePath(basePath);
var right = NormalizePath(relativePath);
if (string.IsNullOrEmpty(left))
{
return right;
}
if (string.IsNullOrEmpty(right))
{
return left;
}
return NormalizePath($"{left}/{right}");
}
private static string? ComputeSha256(string path)
{
using var stream = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.Read);
using var sha = SHA256.Create();
var hash = sha.ComputeHash(stream);
return Convert.ToHexString(hash).ToLowerInvariant();
}
private static AssemblyName? TryGetAssemblyName(string path)
{
try
{
return AssemblyName.GetAssemblyName(path);
}
catch (FileNotFoundException)
{
return null;
}
catch (BadImageFormatException)
{
return null;
}
catch (FileLoadException)
{
return null;
}
}
private static FileVersionInfo? TryGetFileVersionInfo(string path)
{
try
{
return FileVersionInfo.GetVersionInfo(path);
}
catch (FileNotFoundException)
{
return null;
}
catch (IOException)
{
return null;
}
catch (UnauthorizedAccessException)
{
return null;
}
}
private static string? FormatPublicKeyToken(byte[]? token)
{
if (token is null || token.Length == 0)
{
return null;
}
return Convert.ToHexString(token).ToLowerInvariant();
}
private static string? BuildStrongName(AssemblyName assemblyName, string? publicKeyToken)
{
if (assemblyName is null || string.IsNullOrWhiteSpace(assemblyName.Name))
{
return null;
}
var version = assemblyName.Version?.ToString() ?? "0.0.0.0";
var culture = string.IsNullOrWhiteSpace(assemblyName.CultureName) ? "neutral" : assemblyName.CultureName;
var token = string.IsNullOrWhiteSpace(publicKeyToken) ? "null" : publicKeyToken;
return $"{assemblyName.Name}, Version={version}, Culture={culture}, PublicKeyToken={token}";
}
private static string? NormalizeMetadataValue(string? value)
{
if (string.IsNullOrWhiteSpace(value))
{
return null;
}
return value.Trim();
}
private static string? FormatTimestamp(DateTimeOffset? value)
{
if (value is null)
{
return null;
}
return value.Value.UtcDateTime.ToString("yyyy-MM-ddTHH:mm:ss.fffZ", CultureInfo.InvariantCulture);
}
private static IEnumerable<string> EnumeratePackageBases(string packagePath)
{
if (string.IsNullOrWhiteSpace(packagePath))
{
yield break;
}
var normalized = NormalizePath(packagePath);
if (string.IsNullOrEmpty(normalized))
{
yield break;
}
yield return normalized;
yield return NormalizePath($".nuget/packages/{normalized}");
yield return NormalizePath($"packages/{normalized}");
yield return NormalizePath($"usr/share/dotnet/packs/{normalized}");
}
    private static string BuildRuntimeGraphValue(string rid, IReadOnlyList<string> fallbacks)
    {
        if (string.IsNullOrWhiteSpace(rid))
@@ -346,6 +717,343 @@ internal sealed class DotNetPackageBuilder
            : $"{rid.Trim()}=>{string.Join(';', ordered)}";
    }
private sealed class AssemblyMetadataAggregate
{
private readonly string _assetRelativePath;
private readonly SortedSet<string> _tfms = new(StringComparer.OrdinalIgnoreCase);
private readonly SortedSet<string> _runtimeIdentifiers = new(StringComparer.OrdinalIgnoreCase);
private readonly SortedSet<string> _packagePaths = new(StringComparer.Ordinal);
private string? _assemblyVersion;
private string? _fileVersion;
public AssemblyMetadataAggregate(string assetRelativePath)
{
_assetRelativePath = NormalizePath(assetRelativePath);
}
public string AssetRelativePath => _assetRelativePath;
public void AddManifestData(DotNetLibraryAsset asset, string? packagePath)
{
if (!string.IsNullOrWhiteSpace(asset.TargetFramework))
{
_tfms.Add(asset.TargetFramework);
}
if (!string.IsNullOrWhiteSpace(asset.RuntimeIdentifier))
{
_runtimeIdentifiers.Add(asset.RuntimeIdentifier);
}
if (!string.IsNullOrWhiteSpace(asset.AssemblyVersion) && string.IsNullOrEmpty(_assemblyVersion))
{
_assemblyVersion = asset.AssemblyVersion;
}
if (!string.IsNullOrWhiteSpace(asset.FileVersion) && string.IsNullOrEmpty(_fileVersion))
{
_fileVersion = asset.FileVersion;
}
if (!string.IsNullOrWhiteSpace(packagePath))
{
var normalized = NormalizePath(packagePath);
if (!string.IsNullOrEmpty(normalized))
{
_packagePaths.Add(normalized);
}
}
}
public AssemblyMetadataResult Build(LanguageAnalyzerContext context, IDotNetAuthenticodeInspector? authenticodeInspector, CancellationToken cancellationToken)
{
var fileMetadata = ResolveFileMetadata(context, authenticodeInspector, cancellationToken);
var assemblyName = fileMetadata?.AssemblyName;
var versionInfo = fileMetadata?.FileVersionInfo;
var assemblyVersion = assemblyName?.Version?.ToString() ?? _assemblyVersion;
var fileVersion = !string.IsNullOrWhiteSpace(versionInfo?.FileVersion) ? versionInfo?.FileVersion : _fileVersion;
var usedByEntrypoint = fileMetadata?.UsedByEntrypoint ?? false;
string? publicKeyToken = null;
string? strongName = null;
if (assemblyName is not null)
{
publicKeyToken = FormatPublicKeyToken(assemblyName.GetPublicKeyToken());
strongName = BuildStrongName(assemblyName, publicKeyToken);
}
return new AssemblyMetadataResult(
AssetPath: _assetRelativePath,
RelativePath: fileMetadata?.RelativePath,
TargetFrameworks: _tfms.ToArray(),
RuntimeIdentifiers: _runtimeIdentifiers.ToArray(),
AssemblyVersion: assemblyVersion,
FileVersion: fileVersion,
PublicKeyToken: publicKeyToken,
StrongName: strongName,
CompanyName: NormalizeMetadataValue(versionInfo?.CompanyName),
ProductName: NormalizeMetadataValue(versionInfo?.ProductName),
ProductVersion: NormalizeMetadataValue(versionInfo?.ProductVersion),
FileDescription: NormalizeMetadataValue(versionInfo?.FileDescription),
Sha256: fileMetadata?.Sha256,
Authenticode: fileMetadata?.Authenticode,
UsedByEntrypoint: usedByEntrypoint);
}
private AssemblyFileMetadata? ResolveFileMetadata(LanguageAnalyzerContext context, IDotNetAuthenticodeInspector? authenticodeInspector, CancellationToken cancellationToken)
{
var candidates = BuildCandidateRelativePaths();
foreach (var candidate in candidates)
{
cancellationToken.ThrowIfCancellationRequested();
var absolutePath = context.ResolvePath(ConvertToPlatformPath(candidate));
if (!File.Exists(absolutePath))
{
continue;
}
try
{
var relativePath = NormalizePath(context.GetRelativePath(absolutePath));
var sha256 = ComputeSha256(absolutePath);
var assemblyName = TryGetAssemblyName(absolutePath);
var versionInfo = TryGetFileVersionInfo(absolutePath);
DotNetAuthenticodeMetadata? authenticode = null;
if (authenticodeInspector is not null)
{
try
{
authenticode = authenticodeInspector.TryInspect(absolutePath, cancellationToken);
}
catch
{
authenticode = null;
}
}
var usedByEntrypoint = context.UsageHints.IsPathUsed(absolutePath);
return new AssemblyFileMetadata(
AbsolutePath: absolutePath,
RelativePath: relativePath,
Sha256: sha256,
AssemblyName: assemblyName,
FileVersionInfo: versionInfo,
Authenticode: authenticode,
UsedByEntrypoint: usedByEntrypoint);
}
catch (IOException)
{
continue;
}
catch (UnauthorizedAccessException)
{
continue;
}
catch (BadImageFormatException)
{
continue;
}
}
return null;
}
private IEnumerable<string> BuildCandidateRelativePaths()
{
var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
if (_packagePaths.Count > 0)
{
foreach (var packagePath in _packagePaths)
{
foreach (var basePath in EnumeratePackageBases(packagePath))
{
var combined = CombineRelative(basePath, _assetRelativePath);
if (string.IsNullOrEmpty(combined))
{
continue;
}
if (seen.Add(combined))
{
yield return combined;
}
}
}
}
if (seen.Add(_assetRelativePath))
{
yield return _assetRelativePath;
}
}
}
private sealed class NativeAssetAggregate
{
private readonly string _assetRelativePath;
private readonly SortedSet<string> _tfms = new(StringComparer.OrdinalIgnoreCase);
private readonly SortedSet<string> _runtimeIdentifiers = new(StringComparer.OrdinalIgnoreCase);
private readonly SortedSet<string> _packagePaths = new(StringComparer.Ordinal);
public NativeAssetAggregate(string assetRelativePath)
{
_assetRelativePath = NormalizePath(assetRelativePath);
}
public string AssetRelativePath => _assetRelativePath;
public void AddManifestData(DotNetLibraryAsset asset, string? packagePath)
{
if (!string.IsNullOrWhiteSpace(asset.TargetFramework))
{
_tfms.Add(asset.TargetFramework);
}
if (!string.IsNullOrWhiteSpace(asset.RuntimeIdentifier))
{
_runtimeIdentifiers.Add(asset.RuntimeIdentifier);
}
if (!string.IsNullOrWhiteSpace(packagePath))
{
var normalized = NormalizePath(packagePath);
if (!string.IsNullOrEmpty(normalized))
{
_packagePaths.Add(normalized);
}
}
}
public NativeAssetResult Build(LanguageAnalyzerContext context, CancellationToken cancellationToken)
{
var fileMetadata = ResolveFileMetadata(context, cancellationToken);
return new NativeAssetResult(
AssetPath: _assetRelativePath,
RelativePath: fileMetadata?.RelativePath,
TargetFrameworks: _tfms.ToArray(),
RuntimeIdentifiers: _runtimeIdentifiers.ToArray(),
Sha256: fileMetadata?.Sha256,
UsedByEntrypoint: fileMetadata?.UsedByEntrypoint ?? false);
}
private NativeAssetFileMetadata? ResolveFileMetadata(LanguageAnalyzerContext context, CancellationToken cancellationToken)
{
var candidates = BuildCandidateRelativePaths();
foreach (var candidate in candidates)
{
cancellationToken.ThrowIfCancellationRequested();
var absolutePath = context.ResolvePath(ConvertToPlatformPath(candidate));
var usedByEntrypoint = context.UsageHints.IsPathUsed(absolutePath);
if (!File.Exists(absolutePath))
{
continue;
}
try
{
var relativePath = NormalizePath(context.GetRelativePath(absolutePath));
var sha256 = ComputeSha256(absolutePath);
return new NativeAssetFileMetadata(
AbsolutePath: absolutePath,
RelativePath: relativePath,
Sha256: sha256,
UsedByEntrypoint: usedByEntrypoint);
}
catch (IOException)
{
continue;
}
catch (UnauthorizedAccessException)
{
continue;
}
}
return null;
}
private IEnumerable<string> BuildCandidateRelativePaths()
{
var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
if (_packagePaths.Count > 0)
{
foreach (var packagePath in _packagePaths)
{
foreach (var basePath in EnumeratePackageBases(packagePath))
{
var combined = CombineRelative(basePath, _assetRelativePath);
if (string.IsNullOrEmpty(combined))
{
continue;
}
if (seen.Add(combined))
{
yield return combined;
}
}
}
}
if (seen.Add(_assetRelativePath))
{
yield return _assetRelativePath;
}
}
}
private sealed record AssemblyMetadataResult(
string AssetPath,
string? RelativePath,
IReadOnlyList<string> TargetFrameworks,
IReadOnlyList<string> RuntimeIdentifiers,
string? AssemblyVersion,
string? FileVersion,
string? PublicKeyToken,
string? StrongName,
string? CompanyName,
string? ProductName,
string? ProductVersion,
string? FileDescription,
string? Sha256,
DotNetAuthenticodeMetadata? Authenticode,
bool UsedByEntrypoint);
private sealed record NativeAssetResult(
string AssetPath,
string? RelativePath,
IReadOnlyList<string> TargetFrameworks,
IReadOnlyList<string> RuntimeIdentifiers,
string? Sha256,
bool UsedByEntrypoint);
private sealed record AssemblyFileMetadata(
string AbsolutePath,
string? RelativePath,
string? Sha256,
AssemblyName? AssemblyName,
FileVersionInfo? FileVersionInfo,
DotNetAuthenticodeMetadata? Authenticode,
bool UsedByEntrypoint);
private sealed record NativeAssetFileMetadata(
string AbsolutePath,
string? RelativePath,
string? Sha256,
bool UsedByEntrypoint);
    private sealed class LanguageComponentEvidenceComparer : IEqualityComparer<LanguageComponentEvidence>
    {
        public bool Equals(LanguageComponentEvidence? x, LanguageComponentEvidence? y)

View File

@@ -98,7 +98,7 @@ internal sealed class DotNetDepsFile
                library.AddRuntimeIdentifier(rid);
            }

            library.MergeTargetMetadata(libraryProperty.Value, tfm, rid);
        }
    }
}
@@ -126,6 +126,7 @@ internal sealed class DotNetLibrary
{
    private readonly HashSet<string> _dependencies = new(StringComparer.OrdinalIgnoreCase);
    private readonly HashSet<string> _runtimeIdentifiers = new(StringComparer.Ordinal);
    private readonly List<DotNetLibraryAsset> _runtimeAssets = new();
    private readonly HashSet<string> _targetFrameworks = new(StringComparer.Ordinal);

    private DotNetLibrary(
@@ -172,6 +173,8 @@ internal sealed class DotNetLibrary
    public IReadOnlyCollection<string> RuntimeIdentifiers => _runtimeIdentifiers;

    public IReadOnlyCollection<DotNetLibraryAsset> RuntimeAssets => _runtimeAssets;

    public static bool TryCreate(string key, JsonElement element, [NotNullWhen(true)] out DotNetLibrary? library)
    {
        library = null;
@@ -230,29 +233,67 @@ internal sealed class DotNetLibrary
        }
    }

    public void MergeTargetMetadata(JsonElement element, string? tfm, string? rid)
    {
        if (element.TryGetProperty("dependencies", out var dependenciesElement) && dependenciesElement.ValueKind is JsonValueKind.Object)
        {
            foreach (var dependencyProperty in dependenciesElement.EnumerateObject())
            {
                AddDependency(dependencyProperty.Name);
            }
        }

        MergeRuntimeAssets(element, tfm, rid);
    }

    public void MergeLibraryMetadata(JsonElement element)
    {
        if (element.TryGetProperty("dependencies", out var dependenciesElement) && dependenciesElement.ValueKind is JsonValueKind.Object)
        {
            foreach (var dependencyProperty in dependenciesElement.EnumerateObject())
            {
                AddDependency(dependencyProperty.Name);
            }
        }

        MergeRuntimeAssets(element, tfm: null, rid: null);
    }

    private void MergeRuntimeAssets(JsonElement element, string? tfm, string? rid)
    {
        AddRuntimeAssetsFromRuntime(element, tfm, rid);
        AddRuntimeAssetsFromRuntimeTargets(element, tfm, rid);
    }

    private void AddRuntimeAssetsFromRuntime(JsonElement element, string? tfm, string? rid)
    {
        if (!element.TryGetProperty("runtime", out var runtimeElement) || runtimeElement.ValueKind is not JsonValueKind.Object)
        {
            return;
        }

        foreach (var assetProperty in runtimeElement.EnumerateObject())
        {
            if (DotNetLibraryAsset.TryCreateFromRuntime(assetProperty.Name, assetProperty.Value, tfm, rid, out var asset))
            {
                _runtimeAssets.Add(asset);
            }
        }
    }

    private void AddRuntimeAssetsFromRuntimeTargets(JsonElement element, string? tfm, string? rid)
    {
        if (!element.TryGetProperty("runtimeTargets", out var runtimeTargetsElement) || runtimeTargetsElement.ValueKind is not JsonValueKind.Object)
        {
            return;
        }

        foreach (var assetProperty in runtimeTargetsElement.EnumerateObject())
        {
            if (DotNetLibraryAsset.TryCreateFromRuntimeTarget(assetProperty.Name, assetProperty.Value, tfm, rid, out var asset))
            {
                _runtimeAssets.Add(asset);
            }
        }
    }
@@ -316,3 +357,162 @@ internal sealed class DotNetLibrary
        return value.Trim();
    }
}
internal enum DotNetLibraryAssetKind
{
Runtime,
Native
}
internal sealed record DotNetLibraryAsset(
string RelativePath,
string? TargetFramework,
string? RuntimeIdentifier,
string? AssemblyVersion,
string? FileVersion,
DotNetLibraryAssetKind Kind)
{
public static bool TryCreateFromRuntime(string name, JsonElement element, string? tfm, string? rid, [NotNullWhen(true)] out DotNetLibraryAsset? asset)
{
asset = null;
if (string.IsNullOrWhiteSpace(name))
{
return false;
}
var normalizedPath = NormalizePath(name);
if (string.IsNullOrEmpty(normalizedPath))
{
return false;
}
if (!IsManagedAssembly(normalizedPath))
{
return false;
}
string? assemblyVersion = null;
string? fileVersion = null;
if (element.ValueKind == JsonValueKind.Object)
{
if (element.TryGetProperty("assemblyVersion", out var assemblyVersionElement) && assemblyVersionElement.ValueKind == JsonValueKind.String)
{
assemblyVersion = NormalizeValue(assemblyVersionElement.GetString());
}
if (element.TryGetProperty("fileVersion", out var fileVersionElement) && fileVersionElement.ValueKind == JsonValueKind.String)
{
fileVersion = NormalizeValue(fileVersionElement.GetString());
}
}
asset = new DotNetLibraryAsset(
RelativePath: normalizedPath,
TargetFramework: NormalizeValue(tfm),
RuntimeIdentifier: NormalizeValue(rid),
AssemblyVersion: assemblyVersion,
FileVersion: fileVersion,
Kind: DotNetLibraryAssetKind.Runtime);
return true;
}
public static bool TryCreateFromRuntimeTarget(string name, JsonElement element, string? tfm, string? rid, [NotNullWhen(true)] out DotNetLibraryAsset? asset)
{
asset = null;
if (string.IsNullOrWhiteSpace(name) || element.ValueKind is not JsonValueKind.Object)
{
return false;
}
var assetType = element.TryGetProperty("assetType", out var assetTypeElement) && assetTypeElement.ValueKind == JsonValueKind.String
? NormalizeValue(assetTypeElement.GetString())
: null;
var normalizedPath = NormalizePath(name);
if (string.IsNullOrEmpty(normalizedPath))
{
return false;
}
DotNetLibraryAssetKind kind;
if (assetType is null || string.Equals(assetType, "runtime", StringComparison.OrdinalIgnoreCase))
{
if (!IsManagedAssembly(normalizedPath))
{
return false;
}
kind = DotNetLibraryAssetKind.Runtime;
}
else if (string.Equals(assetType, "native", StringComparison.OrdinalIgnoreCase))
{
kind = DotNetLibraryAssetKind.Native;
}
else
{
return false;
}
string? assemblyVersion = null;
string? fileVersion = null;
if (kind == DotNetLibraryAssetKind.Runtime &&
element.TryGetProperty("assemblyVersion", out var assemblyVersionElement) &&
assemblyVersionElement.ValueKind == JsonValueKind.String)
{
assemblyVersion = NormalizeValue(assemblyVersionElement.GetString());
}
if (kind == DotNetLibraryAssetKind.Runtime &&
element.TryGetProperty("fileVersion", out var fileVersionElement) &&
fileVersionElement.ValueKind == JsonValueKind.String)
{
fileVersion = NormalizeValue(fileVersionElement.GetString());
}
string? runtimeIdentifier = rid;
if (element.TryGetProperty("rid", out var ridElement) && ridElement.ValueKind == JsonValueKind.String)
{
runtimeIdentifier = NormalizeValue(ridElement.GetString());
}
asset = new DotNetLibraryAsset(
RelativePath: normalizedPath,
TargetFramework: NormalizeValue(tfm),
RuntimeIdentifier: NormalizeValue(runtimeIdentifier),
AssemblyVersion: assemblyVersion,
FileVersion: fileVersion,
Kind: kind);
return true;
}
private static string NormalizePath(string value)
{
var normalized = NormalizeValue(value);
if (string.IsNullOrEmpty(normalized))
{
return string.Empty;
}
return normalized.Replace('\\', '/');
}
private static bool IsManagedAssembly(string path)
=> path.EndsWith(".dll", StringComparison.OrdinalIgnoreCase) ||
path.EndsWith(".exe", StringComparison.OrdinalIgnoreCase);
private static string? NormalizeValue(string? value)
{
if (string.IsNullOrWhiteSpace(value))
{
return null;
}
return value.Trim();
}
}
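As a pointer for the asset mapping above, a minimal hedged sketch of how a `runtimeTargets` entry (like the ones in the self-contained fixture later in this commit) would be classified; the JSON literal and call site are illustrative only:
using System.Text.Json;

var json = """{ "rid": "linux-x64", "assetType": "native" }""";
using var document = JsonDocument.Parse(json);
if (DotNetLibraryAsset.TryCreateFromRuntimeTarget(
        "runtimes/linux-x64/native/libstellaopsnative.so",
        document.RootElement,
        tfm: ".NETCoreApp,Version=v10.0",
        rid: null,
        out var asset))
{
    // asset.Kind == DotNetLibraryAssetKind.Native; the "rid" property overrides the
    // null rid argument, so asset.RuntimeIdentifier == "linux-x64".
}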

View File

@@ -3,8 +3,8 @@
| Seq | ID | Status | Depends on | Description | Exit Criteria |
|-----|----|--------|------------|-------------|---------------|
| 1 | SCANNER-ANALYZERS-LANG-10-305A | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-307 | Parse `*.deps.json` + `runtimeconfig.json`, build RID graph, and normalize to `pkg:nuget` components. | RID graph deterministic; fixtures confirm consistent component ordering; fallback to `bin:{sha256}` documented. |
| 2 | SCANNER-ANALYZERS-LANG-10-305B | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-305A | Extract assembly metadata (strong name, file/product info) and optional Authenticode details when offline cert bundle provided. | Signing metadata captured for signed assemblies; offline trust store documented; hash validations deterministic. |
| 3 | SCANNER-ANALYZERS-LANG-10-305C | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-305B | Handle self-contained apps and native assets; merge with EntryTrace usage hints. | Self-contained fixtures map to components with RID flags; usage hints propagate; tests cover linux/win variants. |
| 4 | SCANNER-ANALYZERS-LANG-10-307D | TODO | SCANNER-ANALYZERS-LANG-10-305C | Integrate shared helpers (license mapping, quiet provenance) and concurrency-safe caches. | Shared helpers reused; concurrency tests for parallel layer scans pass; no redundant allocations. |
| 5 | SCANNER-ANALYZERS-LANG-10-308D | TODO | SCANNER-ANALYZERS-LANG-10-307D | Determinism fixtures + benchmark harness; compare to competitor scanners for accuracy/perf. | Fixtures in `Fixtures/lang/dotnet/`; determinism CI guard; benchmark demonstrates lower duplication + faster runtime. |
| 6 | SCANNER-ANALYZERS-LANG-10-309D | TODO | SCANNER-ANALYZERS-LANG-10-308D | Package plug-in (manifest, DI registration) and update Offline Kit instructions. | Manifest copied to `plugins/scanner/analyzers/lang/`; Worker loads analyzer; Offline Kit doc updated. |

View File

@@ -0,0 +1,5 @@
MZfakebinaryheader
Go build ID: "random-go-build-id"
....gopclntab....
runtime.buildVersion=go1.22.8
padding0000000000000000000000000000000000000000000000000000000000000000

View File

@@ -0,0 +1,30 @@
[
{
"analyzerId": "golang",
"componentKey": "golang::bin::sha256:80f528c90b72a4c4cc3fa078501154e4f2a3f49faea3ec380112d61740bde4c3",
"name": "app",
"type": "bin",
"usedByEntrypoint": false,
"metadata": {
"binary.sha256": "80f528c90b72a4c4cc3fa078501154e4f2a3f49faea3ec380112d61740bde4c3",
"binaryPath": "app",
"go.version.hint": "go1.22.8",
"languageHint": "golang",
"provenance": "binary"
},
"evidence": [
{
"kind": "file",
"source": "binary",
"locator": "app",
"sha256": "80f528c90b72a4c4cc3fa078501154e4f2a3f49faea3ec380112d61740bde4c3"
},
{
"kind": "metadata",
"source": "go.heuristic",
"locator": "classification",
"value": "build-id"
}
]
}
]

View File

@@ -44,4 +44,23 @@ public sealed class GoLanguageAnalyzerTests
analyzers,
cancellationToken);
}
[Fact]
public async Task StrippedBinaryFallsBackToHeuristicBinHashAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "go", "stripped");
var goldenPath = Path.Combine(fixturePath, "expected.json");
var analyzers = new ILanguageAnalyzer[]
{
new GoLanguageAnalyzer(),
};
await LanguageAnalyzerTestHarness.AssertDeterministicAsync(
fixturePath,
goldenPath,
analyzers,
cancellationToken);
}
}

View File

@@ -21,18 +21,31 @@ public sealed class GoLanguageAnalyzer : ILanguageAnalyzer
var candidatePaths = new List<string>(GoBinaryScanner.EnumerateCandidateFiles(context.RootPath));
candidatePaths.Sort(StringComparer.Ordinal);
var fallbackBinaries = new List<GoStrippedBinaryClassification>();
foreach (var absolutePath in candidatePaths)
{
cancellationToken.ThrowIfCancellationRequested();
if (!GoBuildInfoProvider.TryGetBuildInfo(absolutePath, out var buildInfo) || buildInfo is null)
{
if (GoBinaryScanner.TryClassifyStrippedBinary(absolutePath, out var classification))
{
fallbackBinaries.Add(classification);
}
continue;
}
EmitComponents(buildInfo, context, writer);
}
foreach (var fallback in fallbackBinaries)
{
cancellationToken.ThrowIfCancellationRequested();
EmitFallbackComponent(fallback, context, writer);
}
return ValueTask.CompletedTask;
}
@@ -144,6 +157,84 @@ public sealed class GoLanguageAnalyzer : ILanguageAnalyzer
return entries;
}
private void EmitFallbackComponent(GoStrippedBinaryClassification strippedBinary, LanguageAnalyzerContext context, LanguageComponentWriter writer)
{
var relativePath = context.GetRelativePath(strippedBinary.AbsolutePath);
var normalizedRelative = string.IsNullOrEmpty(relativePath) ? "." : relativePath;
var usedByEntrypoint = context.UsageHints.IsPathUsed(strippedBinary.AbsolutePath);
var binaryHash = ComputeBinaryHash(strippedBinary.AbsolutePath);
var metadata = new List<KeyValuePair<string, string?>>
{
new("binaryPath", normalizedRelative),
new("languageHint", "golang"),
new("provenance", "binary"),
};
if (!string.IsNullOrEmpty(binaryHash))
{
metadata.Add(new KeyValuePair<string, string?>("binary.sha256", binaryHash));
}
if (!string.IsNullOrEmpty(strippedBinary.GoVersionHint))
{
metadata.Add(new KeyValuePair<string, string?>("go.version.hint", strippedBinary.GoVersionHint));
}
metadata.Sort(static (left, right) => string.CompareOrdinal(left.Key, right.Key));
var evidence = new List<LanguageComponentEvidence>
{
new(
LanguageEvidenceKind.File,
"binary",
normalizedRelative,
null,
string.IsNullOrEmpty(binaryHash) ? null : binaryHash),
};
var detectionSource = strippedBinary.Indicator switch
{
GoStrippedBinaryIndicator.BuildId => "build-id",
GoStrippedBinaryIndicator.GoRuntimeMarkers => "runtime-markers",
_ => null,
};
if (!string.IsNullOrEmpty(detectionSource))
{
evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.Metadata,
"go.heuristic",
"classification",
detectionSource,
null));
}
evidence.Sort(static (left, right) => string.CompareOrdinal(left.ComparisonKey, right.ComparisonKey));
var componentName = Path.GetFileName(strippedBinary.AbsolutePath);
if (string.IsNullOrWhiteSpace(componentName))
{
componentName = "golang-binary";
}
var componentKey = string.IsNullOrEmpty(binaryHash)
? $"golang::bin::{normalizedRelative}"
: $"golang::bin::sha256:{binaryHash}";
writer.AddFromExplicitKey(
analyzerId: Id,
componentKey: componentKey,
purl: null,
name: componentName,
version: null,
type: "bin",
metadata: metadata,
evidence: evidence,
usedByEntrypoint: usedByEntrypoint);
}
private static IEnumerable<LanguageComponentEvidence> BuildEvidence(GoBuildInfo buildInfo, GoModule module, string binaryRelativePath, LanguageAnalyzerContext context, ref string? binaryHash)
{
var evidence = new List<LanguageComponentEvidence>

View File

@@ -1,6 +1,8 @@
using System;
using System.Collections.Generic;
using System.Buffers;
using System.IO;
using System.Text;
namespace StellaOps.Scanner.Analyzers.Lang.Go.Internal;
@@ -11,6 +13,10 @@ internal static class GoBinaryScanner
0xFF, (byte)' ', (byte)'G', (byte)'o', (byte)' ', (byte)'b', (byte)'u', (byte)'i', (byte)'l', (byte)'d', (byte)'i', (byte)'n', (byte)'f', (byte)':'
};
private static readonly ReadOnlyMemory<byte> BuildIdMarker = Encoding.ASCII.GetBytes("Go build ID:");
private static readonly ReadOnlyMemory<byte> GoPclnTabMarker = Encoding.ASCII.GetBytes(".gopclntab");
private static readonly ReadOnlyMemory<byte> GoVersionPrefix = Encoding.ASCII.GetBytes("go1.");
public static IEnumerable<string> EnumerateCandidateFiles(string rootPath)
{
var enumeration = new EnumerationOptions
@@ -60,4 +66,151 @@ internal static class GoBinaryScanner
return false;
}
}
public static bool TryClassifyStrippedBinary(string filePath, out GoStrippedBinaryClassification classification)
{
classification = default;
FileInfo fileInfo;
try
{
fileInfo = new FileInfo(filePath);
if (!fileInfo.Exists)
{
return false;
}
}
catch (IOException)
{
return false;
}
catch (UnauthorizedAccessException)
{
return false;
}
catch (System.Security.SecurityException)
{
return false;
}
var length = fileInfo.Length;
if (length < 128)
{
return false;
}
const int WindowSize = 128 * 1024;
var readSize = (int)Math.Min(length, WindowSize);
var buffer = ArrayPool<byte>.Shared.Rent(readSize);
try
{
using var stream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read);
var headRead = stream.Read(buffer, 0, readSize);
if (headRead <= 0)
{
return false;
}
var headSpan = new ReadOnlySpan<byte>(buffer, 0, headRead);
var hasBuildId = headSpan.IndexOf(BuildIdMarker.Span) >= 0;
var hasPcln = headSpan.IndexOf(GoPclnTabMarker.Span) >= 0;
var goVersion = ExtractGoVersion(headSpan);
if (length > headRead)
{
var tailSize = Math.Min(readSize, (int)length);
if (tailSize > 0)
{
stream.Seek(-tailSize, SeekOrigin.End);
var tailRead = stream.Read(buffer, 0, tailSize);
if (tailRead > 0)
{
var tailSpan = new ReadOnlySpan<byte>(buffer, 0, tailRead);
hasBuildId |= tailSpan.IndexOf(BuildIdMarker.Span) >= 0;
hasPcln |= tailSpan.IndexOf(GoPclnTabMarker.Span) >= 0;
goVersion ??= ExtractGoVersion(tailSpan);
}
}
}
if (hasBuildId)
{
classification = new GoStrippedBinaryClassification(
filePath,
GoStrippedBinaryIndicator.BuildId,
goVersion);
return true;
}
if (hasPcln && !string.IsNullOrEmpty(goVersion))
{
classification = new GoStrippedBinaryClassification(
filePath,
GoStrippedBinaryIndicator.GoRuntimeMarkers,
goVersion);
return true;
}
return false;
}
finally
{
Array.Clear(buffer, 0, readSize);
ArrayPool<byte>.Shared.Return(buffer);
}
}
private static string? ExtractGoVersion(ReadOnlySpan<byte> data)
{
var prefix = GoVersionPrefix.Span;
var span = data;
while (!span.IsEmpty)
{
var index = span.IndexOf(prefix);
if (index < 0)
{
return null;
}
var absoluteIndex = data.Length - span.Length + index;
if (absoluteIndex > 0)
{
var previous = (char)data[absoluteIndex - 1];
if (char.IsLetterOrDigit(previous))
{
span = span[(index + 1)..];
continue;
}
}
var start = absoluteIndex;
var end = start + prefix.Length;
while (end < data.Length && IsVersionCharacter((char)data[end]))
{
end++;
}
if (end - start <= prefix.Length)
{
span = span[(index + 1)..];
continue;
}
var candidate = data[start..end];
return Encoding.ASCII.GetString(candidate);
}
return null;
}
private static bool IsVersionCharacter(char value)
=> (value >= '0' && value <= '9')
|| (value >= 'a' && value <= 'z')
|| (value >= 'A' && value <= 'Z')
|| value is '.' or '-' or '+' or '_';
}
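As a quick orientation for the classifier above, a minimal, hedged usage sketch (the path is hypothetical); it mirrors the call GoLanguageAnalyzer makes when TryGetBuildInfo fails:
// Sketch only: classify a candidate binary that carries no Go build-info blob.
if (GoBinaryScanner.TryClassifyStrippedBinary("/scan/root/app", out var classification))
{
    // Indicator is BuildId when the "Go build ID:" marker was found, or GoRuntimeMarkers
    // when ".gopclntab" plus a go1.x version string were both present.
    Console.WriteLine($"{classification.Indicator} {classification.GoVersionHint}");
}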

View File

@@ -0,0 +1,13 @@
namespace StellaOps.Scanner.Analyzers.Lang.Go.Internal;
internal readonly record struct GoStrippedBinaryClassification(
string AbsolutePath,
GoStrippedBinaryIndicator Indicator,
string? GoVersionHint);
internal enum GoStrippedBinaryIndicator
{
None = 0,
BuildId,
GoRuntimeMarkers,
}

View File

@@ -4,7 +4,8 @@
|-----|----|--------|------------|-------------|---------------|
| 1 | SCANNER-ANALYZERS-LANG-10-304A | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-307 | Parse Go build info blob (`runtime/debug` format) and `.note.go.buildid`; map to module/version and evidence. | Build info extracted across Go 1.18–1.23 fixtures; evidence includes VCS, module path, and build settings. |
| 2 | SCANNER-ANALYZERS-LANG-10-304B | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-304A | Implement DWARF-lite reader for VCS metadata + dirty flag; add cache to avoid re-reading identical binaries. | DWARF reader supplies commit hash for ≥95% fixtures; cache reduces duplicated IO by ≥70%. |
| 3 | SCANNER-ANALYZERS-LANG-10-304C | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-304B | Fallback heuristics for stripped binaries with deterministic `bin:{sha256}` labeling and quiet provenance. | Heuristic labels clearly separated; tests ensure no false “observed” provenance; documentation updated. |
| 4 | SCANNER-ANALYZERS-LANG-10-307G | TODO | SCANNER-ANALYZERS-LANG-10-304C | Wire shared helpers (license mapping, usage flags) and ensure concurrency-safe buffer reuse. | Analyzer reuses shared infrastructure; concurrency tests with parallel scans pass; no data races. |
| 5 | SCANNER-ANALYZERS-LANG-10-308G | TODO | SCANNER-ANALYZERS-LANG-10-307G | Determinism fixtures + benchmark harness (vs competitor). | Fixtures under `Fixtures/lang/go/`; CI determinism check; benchmark runs showing ≥20% speed advantage. |
| 6 | SCANNER-ANALYZERS-LANG-10-309G | TODO | SCANNER-ANALYZERS-LANG-10-308G | Package plug-in manifest + Offline Kit notes; ensure Worker DI registration. | Manifest copied; Worker loads analyzer; Offline Kit docs updated with Go analyzer presence. |
| 7 | SCANNER-ANALYZERS-LANG-10-304D | TODO | SCANNER-ANALYZERS-LANG-10-304C | Emit telemetry counters for stripped-binary heuristics and document metrics wiring. | New `scanner_analyzer_golang_heuristic_total` counter recorded; docs updated with offline aggregation notes. |

View File

@@ -0,0 +1,650 @@
using System.Collections.Immutable;
using System.Linq;
using System.Security.Cryptography;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustAnalyzerCollector
{
public static RustAnalyzerCollection Collect(LanguageAnalyzerContext context, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
var collector = new Collector(context);
collector.Execute(cancellationToken);
return collector.Build();
}
private sealed class Collector
{
private static readonly EnumerationOptions LockEnumeration = new()
{
MatchCasing = MatchCasing.CaseSensitive,
IgnoreInaccessible = true,
RecurseSubdirectories = true,
AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint,
};
private readonly LanguageAnalyzerContext _context;
private readonly Dictionary<RustCrateKey, RustCrateBuilder> _crates = new();
private readonly Dictionary<string, List<RustCrateBuilder>> _cratesByName = new(StringComparer.Ordinal);
private readonly Dictionary<string, RustHeuristicBuilder> _heuristics = new(StringComparer.Ordinal);
private readonly Dictionary<string, RustBinaryRecord> _binaries = new(StringComparer.Ordinal);
public Collector(LanguageAnalyzerContext context)
{
_context = context;
}
public void Execute(CancellationToken cancellationToken)
{
CollectCargoLocks(cancellationToken);
CollectFingerprints(cancellationToken);
CollectBinaries(cancellationToken);
}
public RustAnalyzerCollection Build()
{
var crateRecords = _crates.Values
.Select(static builder => builder.Build())
.OrderBy(static record => record.ComponentKey, StringComparer.Ordinal)
.ToImmutableArray();
var heuristicRecords = _heuristics.Values
.Select(static builder => builder.Build())
.OrderBy(static record => record.ComponentKey, StringComparer.Ordinal)
.ToImmutableArray();
var fallbackRecords = _binaries.Values
.Where(static record => !record.HasMatches)
.Select(BuildFallback)
.OrderBy(static record => record.ComponentKey, StringComparer.Ordinal)
.ToImmutableArray();
return new RustAnalyzerCollection(crateRecords, heuristicRecords, fallbackRecords);
}
private void CollectCargoLocks(CancellationToken cancellationToken)
{
foreach (var lockPath in Directory.EnumerateFiles(_context.RootPath, "Cargo.lock", LockEnumeration))
{
cancellationToken.ThrowIfCancellationRequested();
var packages = RustCargoLockParser.Parse(lockPath, cancellationToken);
if (packages.Count == 0)
{
continue;
}
var relativePath = NormalizeRelative(_context.GetRelativePath(lockPath));
foreach (var package in packages)
{
var builder = GetOrCreateCrate(package.Name, package.Version);
builder.ApplyCargoPackage(package, relativePath);
}
}
}
private void CollectFingerprints(CancellationToken cancellationToken)
{
var records = RustFingerprintScanner.Scan(_context.RootPath, cancellationToken);
foreach (var record in records)
{
cancellationToken.ThrowIfCancellationRequested();
var builder = GetOrCreateCrate(record.Name, record.Version);
var relative = NormalizeRelative(_context.GetRelativePath(record.AbsolutePath));
builder.ApplyFingerprint(record, relative);
}
}
private void CollectBinaries(CancellationToken cancellationToken)
{
var binaries = RustBinaryClassifier.Scan(_context.RootPath, cancellationToken);
foreach (var binary in binaries)
{
cancellationToken.ThrowIfCancellationRequested();
var relative = NormalizeRelative(_context.GetRelativePath(binary.AbsolutePath));
var usage = _context.UsageHints.IsPathUsed(binary.AbsolutePath);
var hash = binary.ComputeSha256();
if (!_binaries.TryGetValue(relative, out var record))
{
record = new RustBinaryRecord(binary.AbsolutePath, relative, usage, hash);
_binaries[relative] = record;
}
else
{
record.MergeUsage(usage);
record.EnsureHash(hash);
}
if (binary.CrateCandidates.IsDefaultOrEmpty || binary.CrateCandidates.Length == 0)
{
continue;
}
foreach (var candidate in binary.CrateCandidates)
{
if (string.IsNullOrWhiteSpace(candidate))
{
continue;
}
var crateBuilder = FindCrateByName(candidate);
if (crateBuilder is not null)
{
crateBuilder.AddBinaryEvidence(relative, record.Hash, usage);
record.MarkCrateMatch();
continue;
}
var heuristic = GetOrCreateHeuristic(candidate);
heuristic.AddBinary(relative, record.Hash, usage);
record.MarkHeuristicMatch();
}
}
}
private RustCrateBuilder GetOrCreateCrate(string name, string? version)
{
var key = new RustCrateKey(name, version);
if (_crates.TryGetValue(key, out var existing))
{
existing.EnsureVersion(version);
return existing;
}
var builder = new RustCrateBuilder(name, version);
_crates[key] = builder;
if (!_cratesByName.TryGetValue(builder.Name, out var list))
{
list = new List<RustCrateBuilder>();
_cratesByName[builder.Name] = list;
}
list.Add(builder);
return builder;
}
private RustCrateBuilder? FindCrateByName(string candidate)
{
var normalized = RustCrateBuilder.NormalizeName(candidate);
if (!_cratesByName.TryGetValue(normalized, out var builders) || builders.Count == 0)
{
return null;
}
return builders
.OrderBy(static builder => builder.Version ?? string.Empty, StringComparer.Ordinal)
.FirstOrDefault();
}
private RustHeuristicBuilder GetOrCreateHeuristic(string crateName)
{
var normalized = RustCrateBuilder.NormalizeName(crateName);
if (_heuristics.TryGetValue(normalized, out var existing))
{
return existing;
}
var builder = new RustHeuristicBuilder(normalized);
_heuristics[normalized] = builder;
return builder;
}
private RustComponentRecord BuildFallback(RustBinaryRecord record)
{
var metadata = new List<KeyValuePair<string, string?>>
{
new("binary.path", record.RelativePath),
new("provenance", "binary"),
};
if (!string.IsNullOrEmpty(record.Hash))
{
metadata.Add(new KeyValuePair<string, string?>("binary.sha256", record.Hash));
}
metadata.Sort(static (left, right) => string.CompareOrdinal(left.Key, right.Key));
var evidence = new List<LanguageComponentEvidence>
{
new(
LanguageEvidenceKind.File,
"binary",
record.RelativePath,
null,
string.IsNullOrEmpty(record.Hash) ? null : record.Hash)
};
var componentName = Path.GetFileName(record.RelativePath);
if (string.IsNullOrWhiteSpace(componentName))
{
componentName = "binary";
}
var key = string.IsNullOrEmpty(record.Hash)
? $"bin::{record.RelativePath}"
: $"bin::sha256:{record.Hash}";
return new RustComponentRecord(
Name: componentName,
Version: null,
Type: "bin",
Purl: null,
ComponentKey: key,
Metadata: metadata,
Evidence: evidence,
UsedByEntrypoint: record.UsedByEntrypoint);
}
private static string NormalizeRelative(string relativePath)
{
if (string.IsNullOrWhiteSpace(relativePath) || relativePath == ".")
{
return ".";
}
return relativePath.Replace('\\', '/');
}
}
}
internal sealed record RustAnalyzerCollection(
ImmutableArray<RustComponentRecord> Crates,
ImmutableArray<RustComponentRecord> Heuristics,
ImmutableArray<RustComponentRecord> Fallbacks);
internal sealed record RustComponentRecord(
string Name,
string? Version,
string Type,
string? Purl,
string ComponentKey,
IReadOnlyList<KeyValuePair<string, string?>> Metadata,
IReadOnlyCollection<LanguageComponentEvidence> Evidence,
bool UsedByEntrypoint);
internal sealed class RustCrateBuilder
{
private readonly SortedDictionary<string, string?> _metadata = new(StringComparer.Ordinal);
private readonly HashSet<LanguageComponentEvidence> _evidence = new(new LanguageComponentEvidenceComparer());
private readonly SortedSet<string> _binaryPaths = new(StringComparer.Ordinal);
private readonly SortedSet<string> _binaryHashes = new(StringComparer.Ordinal);
private string? _version;
private string? _source;
private string? _checksum;
private bool _usedByEntrypoint;
public RustCrateBuilder(string name, string? version)
{
Name = NormalizeName(name);
EnsureVersion(version);
}
public string Name { get; }
public string? Version => _version;
public static string NormalizeName(string value)
{
if (string.IsNullOrWhiteSpace(value))
{
return string.Empty;
}
return value.Trim();
}
public void EnsureVersion(string? version)
{
if (string.IsNullOrWhiteSpace(version))
{
return;
}
_version ??= version.Trim();
}
public void ApplyCargoPackage(RustCargoPackage package, string relativePath)
{
EnsureVersion(package.Version);
if (!string.IsNullOrWhiteSpace(package.Source))
{
_source ??= package.Source.Trim();
_metadata["source"] = _source;
}
if (!string.IsNullOrWhiteSpace(package.Checksum))
{
_checksum ??= package.Checksum.Trim();
_metadata["checksum"] = _checksum;
}
_metadata["cargo.lock.path"] = relativePath;
_evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.File,
"cargo.lock",
relativePath,
$"{package.Name} {package.Version}",
string.IsNullOrWhiteSpace(package.Checksum) ? null : package.Checksum));
}
public void ApplyFingerprint(RustFingerprintRecord record, string relativePath)
{
EnsureVersion(record.Version);
if (!string.IsNullOrWhiteSpace(record.Source))
{
_source ??= record.Source.Trim();
_metadata["source"] = _source;
}
AddMetadataIfEmpty("fingerprint.profile", record.Profile);
AddMetadataIfEmpty("fingerprint.targetKind", record.TargetKind);
_evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.File,
"cargo.fingerprint",
relativePath,
record.TargetKind ?? record.Profile ?? "fingerprint",
null));
}
public void AddBinaryEvidence(string relativePath, string? hash, bool usedByEntrypoint)
{
if (!string.IsNullOrWhiteSpace(relativePath))
{
_binaryPaths.Add(relativePath);
}
if (!string.IsNullOrWhiteSpace(hash))
{
_binaryHashes.Add(hash);
}
if (usedByEntrypoint)
{
_usedByEntrypoint = true;
}
if (!string.IsNullOrWhiteSpace(relativePath))
{
_evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.File,
"binary",
relativePath,
null,
string.IsNullOrWhiteSpace(hash) ? null : hash));
}
}
public RustComponentRecord Build()
{
if (_binaryPaths.Count > 0)
{
_metadata["binary.paths"] = string.Join(';', _binaryPaths);
}
if (_binaryHashes.Count > 0)
{
_metadata["binary.sha256"] = string.Join(';', _binaryHashes);
}
var metadata = _metadata
.Select(static pair => new KeyValuePair<string, string?>(pair.Key, pair.Value))
.OrderBy(static pair => pair.Key, StringComparer.Ordinal)
.ToList();
var evidence = _evidence
.OrderBy(static item => item.ComparisonKey, StringComparer.Ordinal)
.ToImmutableArray();
var purl = BuildPurl(Name, _version);
var componentKey = string.IsNullOrEmpty(purl)
? $"cargo::{Name}::{_version ?? "unknown"}"
: $"purl::{purl}";
return new RustComponentRecord(
Name: Name,
Version: _version,
Type: "cargo",
Purl: purl,
ComponentKey: componentKey,
Metadata: metadata,
Evidence: evidence,
UsedByEntrypoint: _usedByEntrypoint);
}
private void AddMetadataIfEmpty(string key, string? value)
{
if (string.IsNullOrWhiteSpace(key) || string.IsNullOrWhiteSpace(value))
{
return;
}
if (_metadata.ContainsKey(key))
{
return;
}
_metadata[key] = value.Trim();
}
private static string? BuildPurl(string name, string? version)
{
if (string.IsNullOrWhiteSpace(name))
{
return null;
}
var escapedName = Uri.EscapeDataString(name.Trim());
if (string.IsNullOrWhiteSpace(version))
{
return $"pkg:cargo/{escapedName}";
}
var escapedVersion = Uri.EscapeDataString(version.Trim());
return $"pkg:cargo/{escapedName}@{escapedVersion}";
}
}
internal sealed class RustHeuristicBuilder
{
private readonly HashSet<LanguageComponentEvidence> _evidence = new(new LanguageComponentEvidenceComparer());
private readonly SortedSet<string> _binaryPaths = new(StringComparer.Ordinal);
private readonly SortedSet<string> _binaryHashes = new(StringComparer.Ordinal);
private bool _usedByEntrypoint;
public RustHeuristicBuilder(string crateName)
{
CrateName = RustCrateBuilder.NormalizeName(crateName);
}
public string CrateName { get; }
public void AddBinary(string relativePath, string? hash, bool usedByEntrypoint)
{
if (!string.IsNullOrWhiteSpace(relativePath))
{
_binaryPaths.Add(relativePath);
}
if (!string.IsNullOrWhiteSpace(hash))
{
_binaryHashes.Add(hash);
}
if (usedByEntrypoint)
{
_usedByEntrypoint = true;
}
if (!string.IsNullOrWhiteSpace(relativePath))
{
_evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.Derived,
"rust.heuristic",
relativePath,
CrateName,
string.IsNullOrWhiteSpace(hash) ? null : hash));
}
}
public RustComponentRecord Build()
{
var metadata = new List<KeyValuePair<string, string?>>
{
new("crate", CrateName),
new("provenance", "heuristic"),
new("binary.paths", string.Join(';', _binaryPaths)),
};
if (_binaryHashes.Count > 0)
{
metadata.Add(new KeyValuePair<string, string?>("binary.sha256", string.Join(';', _binaryHashes)));
}
metadata.Sort(static (left, right) => string.CompareOrdinal(left.Key, right.Key));
var evidence = _evidence
.OrderBy(static item => item.ComparisonKey, StringComparer.Ordinal)
.ToImmutableArray();
var suffix = string.Join("|", _binaryPaths);
var componentKey = $"rust::heuristic::{CrateName}::{suffix}";
return new RustComponentRecord(
Name: CrateName,
Version: null,
Type: "cargo",
Purl: null,
ComponentKey: componentKey,
Metadata: metadata,
Evidence: evidence,
UsedByEntrypoint: _usedByEntrypoint);
}
}
internal sealed class RustBinaryRecord
{
private string? _hash;
public RustBinaryRecord(string absolutePath, string relativePath, bool usedByEntrypoint, string? hash)
{
AbsolutePath = absolutePath ?? throw new ArgumentNullException(nameof(absolutePath));
RelativePath = string.IsNullOrWhiteSpace(relativePath) ? "." : relativePath;
UsedByEntrypoint = usedByEntrypoint;
_hash = string.IsNullOrWhiteSpace(hash) ? null : hash;
}
public string AbsolutePath { get; }
public string RelativePath { get; }
public bool UsedByEntrypoint { get; private set; }
public bool HasMatches => HasCrateMatch || HasHeuristicMatch;
public bool HasCrateMatch { get; private set; }
public bool HasHeuristicMatch { get; private set; }
public string? Hash => _hash;
public void MarkCrateMatch() => HasCrateMatch = true;
public void MarkHeuristicMatch() => HasHeuristicMatch = true;
public void MergeUsage(bool used)
{
if (used)
{
UsedByEntrypoint = true;
}
}
public void EnsureHash(string? hash)
{
if (!string.IsNullOrWhiteSpace(hash))
{
_hash ??= hash;
}
if (_hash is null)
{
_hash = ComputeHashSafely();
}
}
private string? ComputeHashSafely()
{
try
{
using var stream = new FileStream(AbsolutePath, FileMode.Open, FileAccess.Read, FileShare.Read);
using var sha = SHA256.Create();
var hash = sha.ComputeHash(stream);
return Convert.ToHexString(hash).ToLowerInvariant();
}
catch (IOException)
{
return null;
}
catch (UnauthorizedAccessException)
{
return null;
}
}
}
internal readonly record struct RustCrateKey
{
public RustCrateKey(string name, string? version)
{
Name = RustCrateBuilder.NormalizeName(name);
Version = string.IsNullOrWhiteSpace(version) ? null : version.Trim();
}
public string Name { get; }
public string? Version { get; }
}
internal sealed class LanguageComponentEvidenceComparer : IEqualityComparer<LanguageComponentEvidence>
{
public bool Equals(LanguageComponentEvidence? x, LanguageComponentEvidence? y)
{
if (ReferenceEquals(x, y))
{
return true;
}
if (x is null || y is null)
{
return false;
}
return x.Kind == y.Kind &&
string.Equals(x.Source, y.Source, StringComparison.Ordinal) &&
string.Equals(x.Locator, y.Locator, StringComparison.Ordinal) &&
string.Equals(x.Value, y.Value, StringComparison.Ordinal) &&
string.Equals(x.Sha256, y.Sha256, StringComparison.Ordinal);
}
public int GetHashCode(LanguageComponentEvidence obj)
{
var hash = new HashCode();
hash.Add(obj.Kind);
hash.Add(obj.Source, StringComparer.Ordinal);
hash.Add(obj.Locator, StringComparer.Ordinal);
hash.Add(obj.Value, StringComparer.Ordinal);
hash.Add(obj.Sha256, StringComparer.Ordinal);
return hash.ToHashCode();
}
}

View File

@@ -0,0 +1,250 @@
using System.Buffers;
using System.Collections.Immutable;
using System.Linq;
using System.Security.Cryptography;
using System.Text;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustBinaryClassifier
{
private static readonly ReadOnlyMemory<byte> ElfMagic = new byte[] { 0x7F, (byte)'E', (byte)'L', (byte)'F' };
private static readonly ReadOnlyMemory<byte> SymbolPrefix = new byte[] { (byte)'_', (byte)'Z', (byte)'N' };
private const int ChunkSize = 64 * 1024;
private const int OverlapSize = 48;
private const long MaxBinarySize = 128L * 1024L * 1024L;
private static readonly HashSet<string> StandardCrates = new(StringComparer.Ordinal)
{
"core",
"alloc",
"std",
"panic_unwind",
"panic_abort",
};
private static readonly EnumerationOptions Enumeration = new()
{
MatchCasing = MatchCasing.CaseSensitive,
IgnoreInaccessible = true,
RecurseSubdirectories = true,
AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint,
};
public static IReadOnlyList<RustBinaryInfo> Scan(string rootPath, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(rootPath))
{
throw new ArgumentException("Root path is required", nameof(rootPath));
}
var binaries = new List<RustBinaryInfo>();
foreach (var path in Directory.EnumerateFiles(rootPath, "*", Enumeration))
{
cancellationToken.ThrowIfCancellationRequested();
if (!IsEligibleBinary(path))
{
continue;
}
var candidates = ExtractCrateNames(path, cancellationToken);
binaries.Add(new RustBinaryInfo(path, candidates));
}
return binaries;
}
private static bool IsEligibleBinary(string path)
{
try
{
var info = new FileInfo(path);
if (!info.Exists || info.Length == 0 || info.Length > MaxBinarySize)
{
return false;
}
using var stream = info.OpenRead();
Span<byte> buffer = stackalloc byte[4];
var read = stream.Read(buffer);
if (read != 4)
{
return false;
}
return buffer.SequenceEqual(ElfMagic.Span);
}
catch (IOException)
{
return false;
}
catch (UnauthorizedAccessException)
{
return false;
}
}
private static ImmutableArray<string> ExtractCrateNames(string path, CancellationToken cancellationToken)
{
var names = new HashSet<string>(StringComparer.Ordinal);
var buffer = ArrayPool<byte>.Shared.Rent(ChunkSize + OverlapSize);
var overlap = new byte[OverlapSize];
var overlapLength = 0;
try
{
using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
while (true)
{
cancellationToken.ThrowIfCancellationRequested();
// Copy previous overlap to buffer prefix.
if (overlapLength > 0)
{
Array.Copy(overlap, 0, buffer, 0, overlapLength);
}
var read = stream.Read(buffer, overlapLength, ChunkSize);
if (read <= 0)
{
break;
}
var span = new ReadOnlySpan<byte>(buffer, 0, overlapLength + read);
ScanForSymbols(span, names);
overlapLength = Math.Min(OverlapSize, span.Length);
if (overlapLength > 0)
{
span[^overlapLength..].CopyTo(overlap);
}
}
}
catch (IOException)
{
return ImmutableArray<string>.Empty;
}
catch (UnauthorizedAccessException)
{
return ImmutableArray<string>.Empty;
}
finally
{
ArrayPool<byte>.Shared.Return(buffer);
}
if (names.Count == 0)
{
return ImmutableArray<string>.Empty;
}
var ordered = names
.Where(static name => !string.IsNullOrWhiteSpace(name))
.Select(static name => name.Trim())
.Where(static name => name.Length > 1)
.Where(name => !StandardCrates.Contains(name))
.Distinct(StringComparer.Ordinal)
.OrderBy(static name => name, StringComparer.Ordinal)
.ToImmutableArray();
return ordered;
}
private static void ScanForSymbols(ReadOnlySpan<byte> span, HashSet<string> names)
{
var prefix = SymbolPrefix.Span;
var index = 0;
while (index < span.Length)
{
var slice = span[index..];
var offset = slice.IndexOf(prefix);
if (offset < 0)
{
break;
}
index += offset + prefix.Length;
if (index >= span.Length)
{
break;
}
var remaining = span[index..];
if (!TryParseCrate(remaining, out var crate, out var consumed))
{
index += 1;
continue;
}
if (!string.IsNullOrWhiteSpace(crate))
{
names.Add(crate);
}
index += Math.Max(consumed, 1);
}
}
private static bool TryParseCrate(ReadOnlySpan<byte> span, out string? crate, out int consumed)
{
crate = null;
consumed = 0;
var i = 0;
var length = 0;
while (i < span.Length && span[i] is >= (byte)'0' and <= (byte)'9')
{
length = (length * 10) + (span[i] - (byte)'0');
i++;
if (length > 256)
{
return false;
}
}
if (i == 0 || length <= 0 || i + length > span.Length)
{
return false;
}
crate = Encoding.ASCII.GetString(span.Slice(i, length));
consumed = i + length;
return true;
}
}
internal sealed record RustBinaryInfo(string AbsolutePath, ImmutableArray<string> CrateCandidates)
{
private string? _sha256;
public string ComputeSha256()
{
if (_sha256 is not null)
{
return _sha256;
}
try
{
using var stream = new FileStream(AbsolutePath, FileMode.Open, FileAccess.Read, FileShare.Read);
using var sha = SHA256.Create();
var hash = sha.ComputeHash(stream);
_sha256 = Convert.ToHexString(hash).ToLowerInvariant();
}
catch (IOException)
{
_sha256 = string.Empty;
}
catch (UnauthorizedAccessException)
{
_sha256 = string.Empty;
}
return _sha256 ?? string.Empty;
}
}
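For context on what ScanForSymbols and TryParseCrate are matching, a small standalone sketch of the legacy `_ZN` mangling; the symbol string is invented and the decoding below is a re-derivation for illustration, not a call into the analyzer's private helpers:
using System;
using System.Linq;

// Legacy Rust mangling: "_ZN" followed by length-prefixed path segments; the first
// segment is the crate name, e.g. "_ZN5serde3ser9Serializer...E" -> crate "serde".
var symbol = "_ZN5serde3ser9Serializer13serialize_u64E";
var rest = symbol["_ZN".Length..];
var digits = new string(rest.TakeWhile(char.IsDigit).ToArray()); // "5"
var crate = rest.Substring(digits.Length, int.Parse(digits));    // "serde"
Console.WriteLine(crate);
// RustBinaryClassifier then drops candidates that name standard crates (core, std, alloc, panic_*).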

View File

@@ -0,0 +1,298 @@
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustCargoLockParser
{
public static IReadOnlyList<RustCargoPackage> Parse(string path, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(path))
{
throw new ArgumentException("Lock path is required", nameof(path));
}
var info = new FileInfo(path);
if (!info.Exists)
{
return Array.Empty<RustCargoPackage>();
}
var packages = new List<RustCargoPackage>();
using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
using var reader = new StreamReader(stream);
RustCargoPackageBuilder? builder = null;
string? currentArrayKey = null;
var arrayValues = new List<string>();
while (!reader.EndOfStream)
{
cancellationToken.ThrowIfCancellationRequested();
var line = reader.ReadLine();
if (line is null)
{
break;
}
var trimmed = TrimComments(line.AsSpan());
if (trimmed.Length == 0)
{
continue;
}
if (IsPackageHeader(trimmed))
{
FlushCurrent(builder, packages);
builder = new RustCargoPackageBuilder();
currentArrayKey = null;
arrayValues.Clear();
continue;
}
if (builder is null)
{
continue;
}
if (currentArrayKey is not null)
{
if (trimmed[0] == ']')
{
builder.SetArray(currentArrayKey, arrayValues);
currentArrayKey = null;
arrayValues.Clear();
continue;
}
var value = ExtractString(trimmed);
if (!string.IsNullOrEmpty(value))
{
arrayValues.Add(value);
}
continue;
}
if (trimmed[0] == '[')
{
// Entering a new table; finish any pending package and skip section.
FlushCurrent(builder, packages);
builder = null;
continue;
}
var equalsIndex = trimmed.IndexOf('=');
if (equalsIndex < 0)
{
continue;
}
var key = trimmed[..equalsIndex].Trim();
var valuePart = trimmed[(equalsIndex + 1)..].Trim();
if (valuePart.Length == 0)
{
continue;
}
if (valuePart[0] == '[')
{
currentArrayKey = key.ToString();
arrayValues.Clear();
if (valuePart.Length > 1 && valuePart[^1] == ']')
{
var inline = valuePart[1..^1].Trim();
if (inline.Length > 0)
{
foreach (var token in SplitInlineArray(inline.ToString()))
{
var parsedValue = ExtractString(token.AsSpan());
if (!string.IsNullOrEmpty(parsedValue))
{
arrayValues.Add(parsedValue);
}
}
}
builder.SetArray(currentArrayKey, arrayValues);
currentArrayKey = null;
arrayValues.Clear();
}
continue;
}
var parsed = ExtractString(valuePart);
if (parsed is not null)
{
builder.SetField(key, parsed);
}
}
if (currentArrayKey is not null && arrayValues.Count > 0)
{
builder?.SetArray(currentArrayKey, arrayValues);
}
FlushCurrent(builder, packages);
return packages;
}
private static ReadOnlySpan<char> TrimComments(ReadOnlySpan<char> line)
{
var index = line.IndexOf('#');
if (index >= 0)
{
line = line[..index];
}
return line.Trim();
}
private static bool IsPackageHeader(ReadOnlySpan<char> value)
=> value.SequenceEqual("[[package]]".AsSpan());
private static IEnumerable<string> SplitInlineArray(string value)
{
var start = 0;
var inString = false;
for (var i = 0; i < value.Length; i++)
{
var current = value[i];
if (current == '"')
{
inString = !inString;
}
if (current == ',' && !inString)
{
var item = value.AsSpan(start, i - start).Trim();
if (item.Length > 0)
{
yield return item.ToString();
}
start = i + 1;
}
}
if (start < value.Length)
{
var item = value.AsSpan(start).Trim();
if (item.Length > 0)
{
yield return item.ToString();
}
}
}
private static string? ExtractString(ReadOnlySpan<char> value)
{
if (value.Length == 0)
{
return null;
}
if (value[0] == '"' && value[^1] == '"')
{
var inner = value[1..^1];
return inner.ToString();
}
var trimmed = value.Trim();
return trimmed.Length == 0 ? null : trimmed.ToString();
}
private static void FlushCurrent(RustCargoPackageBuilder? builder, List<RustCargoPackage> packages)
{
if (builder is null || !builder.HasData)
{
return;
}
if (builder.TryBuild(out var package))
{
packages.Add(package);
}
}
private sealed class RustCargoPackageBuilder
{
private readonly SortedSet<string> _dependencies = new(StringComparer.Ordinal);
private string? _name;
private string? _version;
private string? _source;
private string? _checksum;
public bool HasData => !string.IsNullOrWhiteSpace(_name);
public void SetField(ReadOnlySpan<char> key, string value)
{
if (key.SequenceEqual("name".AsSpan()))
{
_name ??= value.Trim();
}
else if (key.SequenceEqual("version".AsSpan()))
{
_version ??= value.Trim();
}
else if (key.SequenceEqual("source".AsSpan()))
{
_source ??= value.Trim();
}
else if (key.SequenceEqual("checksum".AsSpan()))
{
_checksum ??= value.Trim();
}
}
public void SetArray(string key, IEnumerable<string> values)
{
if (!string.Equals(key, "dependencies", StringComparison.Ordinal))
{
return;
}
foreach (var entry in values)
{
if (string.IsNullOrWhiteSpace(entry))
{
continue;
}
var normalized = entry.Trim();
if (normalized.Length > 0)
{
_dependencies.Add(normalized);
}
}
}
public bool TryBuild(out RustCargoPackage package)
{
if (string.IsNullOrWhiteSpace(_name))
{
package = null!;
return false;
}
package = new RustCargoPackage(
_name!,
_version ?? string.Empty,
_source,
_checksum,
_dependencies.ToArray());
return true;
}
}
}
internal sealed record RustCargoPackage(
string Name,
string Version,
string? Source,
string? Checksum,
IReadOnlyList<string> Dependencies);
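To make the expected input shape concrete, a hedged usage sketch of the parser above; the `[[package]]` entry and checksum are illustrative values, and the temp file exists only to keep the snippet self-contained:
using System.IO;
using System.Threading;

var lockText = """
[[package]]
name = "serde"
version = "1.0.210"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0000000000000000000000000000000000000000000000000000000000000000"
dependencies = [
 "serde_derive",
]
""";
var lockPath = Path.Combine(Path.GetTempPath(), "Cargo.lock");
File.WriteAllText(lockPath, lockText);
var packages = RustCargoLockParser.Parse(lockPath, CancellationToken.None);
// packages[0]: Name "serde", Version "1.0.210", Dependencies ["serde_derive"];
// source and checksum later flow into crate metadata via RustCrateBuilder.ApplyCargoPackage.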

View File

@@ -0,0 +1,178 @@
using System.Text.Json;
namespace StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
internal static class RustFingerprintScanner
{
private static readonly EnumerationOptions Enumeration = new()
{
MatchCasing = MatchCasing.CaseSensitive,
IgnoreInaccessible = true,
RecurseSubdirectories = true,
AttributesToSkip = FileAttributes.Device | FileAttributes.ReparsePoint,
};
private static readonly string FingerprintSegment = $"{Path.DirectorySeparatorChar}.fingerprint{Path.DirectorySeparatorChar}";
public static IReadOnlyList<RustFingerprintRecord> Scan(string rootPath, CancellationToken cancellationToken)
{
if (string.IsNullOrWhiteSpace(rootPath))
{
throw new ArgumentException("Root path is required", nameof(rootPath));
}
var results = new List<RustFingerprintRecord>();
foreach (var path in Directory.EnumerateFiles(rootPath, "*.json", Enumeration))
{
cancellationToken.ThrowIfCancellationRequested();
if (!path.Contains(FingerprintSegment, StringComparison.Ordinal))
{
continue;
}
if (TryParse(path, out var record))
{
results.Add(record);
}
}
return results;
}
private static bool TryParse(string path, out RustFingerprintRecord record)
{
record = default!;
try
{
using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
using var document = JsonDocument.Parse(stream);
var root = document.RootElement;
var pkgId = TryGetString(root, "pkgid")
?? TryGetString(root, "package_id")
?? TryGetString(root, "packageId");
var (name, version, source) = ParseIdentity(pkgId, path);
if (string.IsNullOrWhiteSpace(name))
{
return false;
}
var profile = TryGetString(root, "profile");
var targetKind = TryGetKind(root);
record = new RustFingerprintRecord(
Name: name!,
Version: version,
Source: source,
TargetKind: targetKind,
Profile: profile,
AbsolutePath: path);
return true;
}
catch (JsonException)
{
return false;
}
catch (IOException)
{
return false;
}
catch (UnauthorizedAccessException)
{
return false;
}
}
private static (string? Name, string? Version, string? Source) ParseIdentity(string? pkgId, string filePath)
{
if (!string.IsNullOrWhiteSpace(pkgId))
{
var span = pkgId.AsSpan().Trim();
var firstSpace = span.IndexOf(' ');
if (firstSpace > 0 && firstSpace < span.Length - 1)
{
var name = span[..firstSpace].ToString();
var remaining = span[(firstSpace + 1)..].Trim();
var secondSpace = remaining.IndexOf(' ');
if (secondSpace < 0)
{
return (name, remaining.ToString(), null);
}
var version = remaining[..secondSpace].ToString();
var potentialSource = remaining[(secondSpace + 1)..].Trim();
if (potentialSource.Length > 1 && potentialSource[0] == '(' && potentialSource[^1] == ')')
{
potentialSource = potentialSource[1..^1].Trim();
}
var source = potentialSource.Length == 0 ? null : potentialSource.ToString();
return (name, version, source);
}
}
var directory = Path.GetDirectoryName(filePath);
if (string.IsNullOrEmpty(directory))
{
return (null, null, null);
}
var crateDirectory = Path.GetFileName(directory);
if (string.IsNullOrWhiteSpace(crateDirectory))
{
return (null, null, null);
}
var dashIndex = crateDirectory.LastIndexOf('-');
if (dashIndex <= 0)
{
return (crateDirectory, null, null);
}
var maybeName = crateDirectory[..dashIndex];
return (maybeName, null, null);
}
private static string? TryGetKind(JsonElement root)
{
if (root.TryGetProperty("target_kind", out var array) && array.ValueKind == JsonValueKind.Array && array.GetArrayLength() > 0)
{
var first = array[0];
if (first.ValueKind == JsonValueKind.String)
{
return first.GetString();
}
}
if (root.TryGetProperty("target", out var target) && target.ValueKind == JsonValueKind.String)
{
return target.GetString();
}
return null;
}
private static string? TryGetString(JsonElement element, string propertyName)
{
if (element.TryGetProperty(propertyName, out var value) && value.ValueKind == JsonValueKind.String)
{
return value.GetString();
}
return null;
}
}
internal sealed record RustFingerprintRecord(
string Name,
string? Version,
string? Source,
string? TargetKind,
string? Profile,
string AbsolutePath);
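For clarity on ParseIdentity above, a hedged sketch of the `pkgid` shape a `.fingerprint/<crate>-<hash>/*.json` file typically carries; the concrete values and scan root are invented:
using System;
using System.Threading;

// A fingerprint JSON such as
//   { "pkgid": "serde 1.0.210 (registry+https://github.com/rust-lang/crates.io-index)" }
// is split on the first two spaces and the parentheses are stripped, giving
// Name = "serde", Version = "1.0.210", Source = "registry+https://github.com/rust-lang/crates.io-index".
// Without a pkgid, the crate directory name ("serde-<hash>") supplies a name-only fallback.
foreach (var record in RustFingerprintScanner.Scan("/scan/root", CancellationToken.None))
{
    Console.WriteLine($"{record.Name} {record.Version} {record.Source} ({record.TargetKind})");
}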

View File

@@ -1,6 +0,0 @@
namespace StellaOps.Scanner.Analyzers.Lang.Rust;
internal static class Placeholder
{
// Analyzer implementation will be added during Sprint LA5.
}

View File

@@ -7,7 +7,7 @@ public sealed class RustAnalyzerPlugin : ILanguageAnalyzerPlugin
{
public string Name => "StellaOps.Scanner.Analyzers.Lang.Rust";
public bool IsAvailable(IServiceProvider services) => services is not null;
public ILanguageAnalyzer CreateAnalyzer(IServiceProvider services)
{

View File

@@ -1,6 +1,7 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Scanner.Analyzers.Lang.Rust.Internal;
namespace StellaOps.Scanner.Analyzers.Lang.Rust;
@@ -11,5 +12,55 @@ public sealed class RustLanguageAnalyzer : ILanguageAnalyzer
public string DisplayName => "Rust Analyzer (preview)";
public ValueTask AnalyzeAsync(LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
ArgumentNullException.ThrowIfNull(writer);
cancellationToken.ThrowIfCancellationRequested();
var collection = RustAnalyzerCollector.Collect(context, cancellationToken);
EmitRecords(Id, writer, collection.Crates);
EmitRecords(Id, writer, collection.Heuristics);
EmitRecords(Id, writer, collection.Fallbacks);
return ValueTask.CompletedTask;
}
private static void EmitRecords(string analyzerId, LanguageComponentWriter writer, IReadOnlyList<RustComponentRecord> records)
{
foreach (var record in records)
{
if (record is null)
{
continue;
}
if (!string.IsNullOrEmpty(record.Purl))
{
writer.AddFromPurl(
analyzerId: analyzerId,
purl: record.Purl!,
name: record.Name,
version: record.Version,
type: record.Type,
metadata: record.Metadata,
evidence: record.Evidence,
usedByEntrypoint: record.UsedByEntrypoint);
}
else
{
writer.AddFromExplicitKey(
analyzerId: analyzerId,
componentKey: record.ComponentKey,
purl: null,
name: record.Name,
version: record.Version,
type: record.Type,
metadata: record.Metadata,
evidence: record.Evidence,
usedByEntrypoint: record.UsedByEntrypoint);
}
}
}
}

View File

@@ -2,9 +2,9 @@
| Seq | ID | Status | Depends on | Description | Exit Criteria |
|-----|----|--------|------------|-------------|---------------|
| 1 | SCANNER-ANALYZERS-LANG-10-306A | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-307 | Parse Cargo metadata (`Cargo.lock`, `.fingerprint`, `.metadata`) and map crates to components with evidence. | Fixtures confirm crate attribution ≥85% coverage; metadata normalized; evidence includes path + hash. |
| 2 | SCANNER-ANALYZERS-LANG-10-306B | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-306A | Implement heuristic classifier using ELF section names, symbol mangling, and `.comment` data for stripped binaries. | Heuristic output flagged as `heuristic`; regression tests ensure no false “observed” classifications. |
| 3 | SCANNER-ANALYZERS-LANG-10-306C | DONE (2025-10-22) | SCANNER-ANALYZERS-LANG-10-306B | Integrate binary hash fallback (`bin:{sha256}`) and tie into shared quiet provenance helpers. | Fallback path deterministic; shared helpers reused; tests verify consistent hashing. |
| 4 | SCANNER-ANALYZERS-LANG-10-307R | TODO | SCANNER-ANALYZERS-LANG-10-306C | Finalize shared helper usage (license, usage flags) and concurrency-safe caches. | Analyzer uses shared utilities; concurrency tests pass; no race conditions. |
| 5 | SCANNER-ANALYZERS-LANG-10-308R | TODO | SCANNER-ANALYZERS-LANG-10-307R | Determinism fixtures + performance benchmarks; compare against competitor heuristic coverage. | Fixtures `Fixtures/lang/rust/` committed; determinism guard; benchmark shows ≥15% better coverage vs competitor. |
| 6 | SCANNER-ANALYZERS-LANG-10-309R | TODO | SCANNER-ANALYZERS-LANG-10-308R | Package plug-in manifest + Offline Kit documentation; ensure Worker integration. | Manifest copied; Worker loads analyzer; Offline Kit doc updated. |

View File

@@ -1,4 +1,7 @@
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Scanner.Analyzers.Lang.DotNet;
using StellaOps.Scanner.Analyzers.Lang.Tests.Harness;
using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities;
@@ -25,4 +28,79 @@ public sealed class DotNetLanguageAnalyzerTests
analyzers,
cancellationToken);
}
[Fact]
public async Task SignedFixtureCapturesAssemblyMetadataAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "dotnet", "signed");
var goldenPath = Path.Combine(fixturePath, "expected.json");
var analyzers = new ILanguageAnalyzer[]
{
new DotNetLanguageAnalyzer()
};
var inspector = new StubAuthenticodeInspector();
var services = new SingleServiceProvider(inspector);
await LanguageAnalyzerTestHarness.AssertDeterministicAsync(
fixturePath,
goldenPath,
analyzers,
cancellationToken,
usageHints: null,
services: services);
}
[Fact]
public async Task SelfContainedFixtureHandlesNativeAssetsAndUsageAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "dotnet", "selfcontained");
var goldenPath = Path.Combine(fixturePath, "expected.json");
var usageHints = new LanguageUsageHints(new[]
{
Path.Combine(fixturePath, "lib", "net10.0", "StellaOps.Toolkit.dll"),
Path.Combine(fixturePath, "runtimes", "linux-x64", "native", "libstellaopsnative.so")
});
var analyzers = new ILanguageAnalyzer[]
{
new DotNetLanguageAnalyzer()
};
await LanguageAnalyzerTestHarness.AssertDeterministicAsync(
fixturePath,
goldenPath,
analyzers,
cancellationToken,
usageHints);
}
private sealed class StubAuthenticodeInspector : IDotNetAuthenticodeInspector
{
public DotNetAuthenticodeMetadata? TryInspect(string assemblyPath, CancellationToken cancellationToken)
=> new DotNetAuthenticodeMetadata(
Subject: "CN=StellaOps Test Signing",
Issuer: "CN=StellaOps Root",
NotBefore: new DateTimeOffset(2025, 1, 1, 0, 0, 0, TimeSpan.Zero),
NotAfter: new DateTimeOffset(2026, 1, 1, 0, 0, 0, TimeSpan.Zero),
Thumbprint: "AA11BB22CC33DD44EE55FF66GG77HH88II99JJ00",
SerialNumber: "0123456789ABCDEF");
}
private sealed class SingleServiceProvider : IServiceProvider
{
private readonly object _service;
public SingleServiceProvider(object service)
{
_service = service;
}
public object? GetService(Type serviceType)
=> serviceType == typeof(IDotNetAuthenticodeInspector) ? _service : null;
}
}
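`SingleServiceProvider` exists so the fixture can hand the analyzer a stub Authenticode inspector through the optional `IServiceProvider`. As a rough sketch of that optional-service pattern (not the actual `DotNetLanguageAnalyzer` wiring), a consumer could resolve the inspector and degrade gracefully when none is registered:

```csharp
using System;
using System.Threading;
using StellaOps.Scanner.Analyzers.Lang.DotNet;

// Illustrative only: shows the optional-service lookup the test relies on.
// The shipped analyzer may wire this differently.
public sealed class AuthenticodeEnricher
{
    private readonly IDotNetAuthenticodeInspector? _inspector;

    public AuthenticodeEnricher(IServiceProvider? services)
    {
        // GetService returns null when no inspector is registered, so the
        // analyzer can skip Authenticode metadata instead of throwing.
        _inspector = services?.GetService(typeof(IDotNetAuthenticodeInspector))
            as IDotNetAuthenticodeInspector;
    }

    public DotNetAuthenticodeMetadata? TryDescribe(string assemblyPath, CancellationToken cancellationToken)
        => _inspector?.TryInspect(assemblyPath, cancellationToken);
}
```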

View File

@@ -0,0 +1,85 @@
{
"runtimeTarget": {
"name": ".NETCoreApp,Version=v10.0",
"signature": null
},
"targets": {
".NETCoreApp,Version=v10.0/linux-x64": {
"MyApp/1.0.0": {
"dependencies": {
"StellaOps.Toolkit": "1.2.3",
"StellaOps.Runtime.SelfContained": "2.1.0"
},
"runtime": {
"MyApp.dll": {}
}
},
"StellaOps.Toolkit/1.2.3": {
"runtime": {
"lib/net10.0/StellaOps.Toolkit.dll": {
"assemblyVersion": "1.2.3.0",
"fileVersion": "1.2.3.0"
}
}
},
"StellaOps.Runtime.SelfContained/2.1.0": {
"runtimeTargets": {
"runtimes/linux-x64/native/libstellaopsnative.so": {
"rid": "linux-x64",
"assetType": "native"
}
}
}
},
".NETCoreApp,Version=v10.0/win-x64": {
"MyApp/1.0.0": {
"dependencies": {
"StellaOps.Toolkit": "1.2.3",
"StellaOps.Runtime.SelfContained": "2.1.0"
},
"runtime": {
"MyApp.dll": {}
}
},
"StellaOps.Toolkit/1.2.3": {
"runtime": {
"lib/net10.0/StellaOps.Toolkit.dll": {
"assemblyVersion": "1.2.3.0",
"fileVersion": "1.2.3.0"
}
}
},
"StellaOps.Runtime.SelfContained/2.1.0": {
"runtimeTargets": {
"runtimes/win-x64/native/stellaopsnative.dll": {
"rid": "win-x64",
"assetType": "native"
}
}
}
}
},
"libraries": {
"MyApp/1.0.0": {
"type": "project",
"serviceable": false,
"sha512": "",
"path": null,
"hashPath": null
},
"StellaOps.Toolkit/1.2.3": {
"type": "package",
"serviceable": true,
"sha512": "sha512-FAKE_TOOLKIT_SHA==",
"path": "stellaops.toolkit/1.2.3",
"hashPath": "stellaops.toolkit.1.2.3.nupkg.sha512"
},
"StellaOps.Runtime.SelfContained/2.1.0": {
"type": "package",
"serviceable": true,
"sha512": "sha512-FAKE_RUNTIME_SHA==",
"path": "stellaops.runtime.selfcontained/2.1.0",
"hashPath": "stellaops.runtime.selfcontained.2.1.0.nupkg.sha512"
}
}
}
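The fixture above carries two RID-specific target sections (`linux-x64` and `win-x64`) plus shared package metadata. A minimal sketch of walking such a document with `System.Text.Json` to list per-RID assets; the helper name and traversal order are illustrative assumptions, not the shipped analyzer:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Hypothetical helper: enumerates (rid, package, assetPath) triples from a deps.json
// document shaped like the fixture above.
static class DepsJsonWalker
{
    public static IEnumerable<(string Rid, string Package, string Asset)> EnumerateAssets(string depsJsonText)
    {
        using var doc = JsonDocument.Parse(depsJsonText);
        foreach (var target in doc.RootElement.GetProperty("targets").EnumerateObject())
        {
            // Target names look like ".NETCoreApp,Version=v10.0/linux-x64";
            // the segment after '/' is the runtime identifier, absent for portable targets.
            var slash = target.Name.IndexOf('/');
            var rid = slash >= 0 ? target.Name[(slash + 1)..] : string.Empty;

            foreach (var package in target.Value.EnumerateObject())
            {
                foreach (var section in new[] { "runtime", "runtimeTargets", "native" })
                {
                    if (!package.Value.TryGetProperty(section, out var assets))
                    {
                        continue;
                    }

                    foreach (var asset in assets.EnumerateObject())
                    {
                        yield return (rid, package.Name, asset.Name);
                    }
                }
            }
        }
    }
}
```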

View File

@@ -0,0 +1,15 @@
{
"runtimeOptions": {
"tfm": "net10.0",
"framework": {
"name": "Microsoft.NETCore.App",
"version": "10.0.0"
},
"includedFrameworks": [
{
"name": "Microsoft.NETCore.DotNetAppHost",
"version": "10.0.0"
}
]
}
}

View File

@@ -0,0 +1,101 @@
[
{
"analyzerId": "dotnet",
"componentKey": "purl::pkg:nuget/stellaops.runtime.selfcontained@2.1.0",
"purl": "pkg:nuget/stellaops.runtime.selfcontained@2.1.0",
"name": "StellaOps.Runtime.SelfContained",
"version": "2.1.0",
"type": "nuget",
"usedByEntrypoint": true,
"metadata": {
"deps.path[0]": "MyApp.deps.json",
"deps.rid[0]": "linux-x64",
"deps.rid[1]": "win-x64",
"deps.tfm[0]": ".NETCoreApp,Version=v10.0",
"native[0].assetPath": "runtimes/linux-x64/native/libstellaopsnative.so",
"native[0].path": "runtimes/linux-x64/native/libstellaopsnative.so",
"native[0].rid[0]": "linux-x64",
"native[0].sha256": "c22d4a6584a3bb8fad4d255d1ab9e5a80d553eec35ea8dfcc2dd750e8581d3cb",
"native[0].tfm[0]": ".NETCoreApp,Version=v10.0",
"native[1].assetPath": "runtimes/win-x64/native/stellaopsnative.dll",
"native[1].path": "runtimes/win-x64/native/stellaopsnative.dll",
"native[1].rid[0]": "win-x64",
"native[1].sha256": "29cddd69702aedc715050304bec85aad2ae017ee1f9390df5e68ebe79a8d4745",
"native[1].tfm[0]": ".NETCoreApp,Version=v10.0",
"package.hashPath[0]": "stellaops.runtime.selfcontained.2.1.0.nupkg.sha512",
"package.id": "StellaOps.Runtime.SelfContained",
"package.id.normalized": "stellaops.runtime.selfcontained",
"package.path[0]": "stellaops.runtime.selfcontained/2.1.0",
"package.serviceable": "true",
"package.sha512[0]": "sha512-FAKE_RUNTIME_SHA==",
"package.version": "2.1.0"
},
"evidence": [
{
"kind": "file",
"source": "deps.json",
"locator": "MyApp.deps.json",
"value": "StellaOps.Runtime.SelfContained/2.1.0"
},
{
"kind": "file",
"source": "native",
"locator": "runtimes/linux-x64/native/libstellaopsnative.so",
"value": "runtimes/linux-x64/native/libstellaopsnative.so",
"sha256": "c22d4a6584a3bb8fad4d255d1ab9e5a80d553eec35ea8dfcc2dd750e8581d3cb"
},
{
"kind": "file",
"source": "native",
"locator": "runtimes/win-x64/native/stellaopsnative.dll",
"value": "runtimes/win-x64/native/stellaopsnative.dll",
"sha256": "29cddd69702aedc715050304bec85aad2ae017ee1f9390df5e68ebe79a8d4745"
}
]
},
{
"analyzerId": "dotnet",
"componentKey": "purl::pkg:nuget/stellaops.toolkit@1.2.3",
"purl": "pkg:nuget/stellaops.toolkit@1.2.3",
"name": "StellaOps.Toolkit",
"version": "1.2.3",
"type": "nuget",
"usedByEntrypoint": true,
"metadata": {
"assembly[0].assetPath": "lib/net10.0/StellaOps.Toolkit.dll",
"assembly[0].fileVersion": "1.2.3.0",
"assembly[0].path": "lib/net10.0/StellaOps.Toolkit.dll",
"assembly[0].rid[0]": "linux-x64",
"assembly[0].rid[1]": "win-x64",
"assembly[0].sha256": "5b82fd11cf6c2ba6b351592587c4203f6af48b89427b954903534eac0e9f17f7",
"assembly[0].tfm[0]": ".NETCoreApp,Version=v10.0",
"assembly[0].version": "1.2.3.0",
"deps.path[0]": "MyApp.deps.json",
"deps.rid[0]": "linux-x64",
"deps.rid[1]": "win-x64",
"deps.tfm[0]": ".NETCoreApp,Version=v10.0",
"package.hashPath[0]": "stellaops.toolkit.1.2.3.nupkg.sha512",
"package.id": "StellaOps.Toolkit",
"package.id.normalized": "stellaops.toolkit",
"package.path[0]": "stellaops.toolkit/1.2.3",
"package.serviceable": "true",
"package.sha512[0]": "sha512-FAKE_TOOLKIT_SHA==",
"package.version": "1.2.3"
},
"evidence": [
{
"kind": "file",
"source": "assembly",
"locator": "lib/net10.0/StellaOps.Toolkit.dll",
"value": "lib/net10.0/StellaOps.Toolkit.dll",
"sha256": "5b82fd11cf6c2ba6b351592587c4203f6af48b89427b954903534eac0e9f17f7"
},
{
"kind": "file",
"source": "deps.json",
"locator": "MyApp.deps.json",
"value": "StellaOps.Toolkit/1.2.3"
}
]
}
]

View File

@@ -0,0 +1,42 @@
{
"runtimeTarget": {
"name": ".NETCoreApp,Version=v10.0/linux-x64"
},
"targets": {
".NETCoreApp,Version=v10.0": {
"Signed.App/1.0.0": {
"dependencies": {
"Microsoft.Extensions.Logging": "9.0.0"
}
},
"Microsoft.Extensions.Logging/9.0.0": {
"runtime": {
"lib/net9.0/Microsoft.Extensions.Logging.dll": {
"assemblyVersion": "9.0.0.0",
"fileVersion": "9.0.24.52809"
}
}
}
},
".NETCoreApp,Version=v10.0/linux-x64": {
"Microsoft.Extensions.Logging/9.0.0": {
"runtime": {
"runtimes/linux-x64/lib/net9.0/Microsoft.Extensions.Logging.dll": {}
}
}
}
},
"libraries": {
"Signed.App/1.0.0": {
"type": "project",
"serviceable": false
},
"Microsoft.Extensions.Logging/9.0.0": {
"type": "package",
"serviceable": true,
"sha512": "sha512-FAKE_LOGGING_SHA==",
"path": "microsoft.extensions.logging/9.0.0",
"hashPath": "microsoft.extensions.logging.9.0.0.nupkg.sha512"
}
}
}

View File

@@ -0,0 +1,9 @@
{
"runtimeOptions": {
"tfm": "net10.0",
"framework": {
"name": "Microsoft.NETCore.App",
"version": "10.0.0"
}
}
}

View File

@@ -0,0 +1,59 @@
[
{
"analyzerId": "dotnet",
"componentKey": "purl::pkg:nuget/microsoft.extensions.logging@9.0.0",
"purl": "pkg:nuget/microsoft.extensions.logging@9.0.0",
"name": "Microsoft.Extensions.Logging",
"version": "9.0.0",
"type": "nuget",
"usedByEntrypoint": false,
"metadata": {
"assembly[0].assetPath": "lib/net9.0/Microsoft.Extensions.Logging.dll",
"assembly[0].authenticode.issuer": "CN=StellaOps Root",
"assembly[0].authenticode.notAfter": "2026-01-01T00:00:00.000Z",
"assembly[0].authenticode.notBefore": "2025-01-01T00:00:00.000Z",
"assembly[0].authenticode.serialNumber": "0123456789ABCDEF",
"assembly[0].authenticode.subject": "CN=StellaOps Test Signing",
"assembly[0].authenticode.thumbprint": "AA11BB22CC33DD44EE55FF66GG77HH88II99JJ00",
"assembly[0].company": "Microsoft Corporation",
"assembly[0].fileDescription": "Microsoft.Extensions.Logging",
"assembly[0].fileVersion": "9.0.24.52809",
"assembly[0].path": "packages/microsoft.extensions.logging/9.0.0/lib/net9.0/Microsoft.Extensions.Logging.dll",
"assembly[0].product": "Microsoft\u00ae .NET",
"assembly[0].productVersion": "9.0.0+9d5a6a9aa463d6d10b0b0ba6d5982cc82f363dc3",
"assembly[0].publicKeyToken": "adb9793829ddae60",
"assembly[0].sha256": "faed6cb5c9ca0d6077feaeb2df251251adccf0241f7a80b91c58e014cd5ad48f",
"assembly[0].strongName": "Microsoft.Extensions.Logging, Version=9.0.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60",
"assembly[0].tfm[0]": ".NETCoreApp,Version=v10.0",
"assembly[0].version": "9.0.0.0",
"assembly[1].assetPath": "runtimes/linux-x64/lib/net9.0/Microsoft.Extensions.Logging.dll",
"assembly[1].rid[0]": "linux-x64",
"assembly[1].tfm[0]": ".NETCoreApp,Version=v10.0",
"deps.path[0]": "Signed.App.deps.json",
"deps.rid[0]": "linux-x64",
"deps.tfm[0]": ".NETCoreApp,Version=v10.0",
"package.hashPath[0]": "microsoft.extensions.logging.9.0.0.nupkg.sha512",
"package.id": "Microsoft.Extensions.Logging",
"package.id.normalized": "microsoft.extensions.logging",
"package.path[0]": "microsoft.extensions.logging/9.0.0",
"package.serviceable": "true",
"package.sha512[0]": "sha512-FAKE_LOGGING_SHA==",
"package.version": "9.0.0"
},
"evidence": [
{
"kind": "file",
"source": "assembly",
"locator": "packages/microsoft.extensions.logging/9.0.0/lib/net9.0/Microsoft.Extensions.Logging.dll",
"value": "lib/net9.0/Microsoft.Extensions.Logging.dll",
"sha256": "faed6cb5c9ca0d6077feaeb2df251251adccf0241f7a80b91c58e014cd5ad48f"
},
{
"kind": "file",
"source": "deps.json",
"locator": "Signed.App.deps.json",
"value": "Microsoft.Extensions.Logging/9.0.0"
}
]
}
]

View File

@@ -8,6 +8,16 @@
"type": "nuget", "type": "nuget",
"usedByEntrypoint": false, "usedByEntrypoint": false,
"metadata": { "metadata": {
"assembly[0].assetPath": "lib/net9.0/Microsoft.Extensions.Logging.dll",
"assembly[0].fileVersion": "9.0.24.52809",
"assembly[0].tfm[0]": ".NETCoreApp,Version=v10.0",
"assembly[0].version": "9.0.0.0",
"assembly[1].assetPath": "runtimes/linux-x64/lib/net9.0/Microsoft.Extensions.Logging.dll",
"assembly[1].rid[0]": "linux-x64",
"assembly[1].tfm[0]": ".NETCoreApp,Version=v10.0",
"assembly[2].assetPath": "runtimes/win-x86/lib/net9.0/Microsoft.Extensions.Logging.dll",
"assembly[2].rid[0]": "win-x86",
"assembly[2].tfm[0]": ".NETCoreApp,Version=v10.0",
"deps.path[0]": "Sample.App.deps.json", "deps.path[0]": "Sample.App.deps.json",
"deps.rid[0]": "linux-x64", "deps.rid[0]": "linux-x64",
"deps.rid[1]": "win-x86", "deps.rid[1]": "win-x86",
@@ -38,6 +48,10 @@
"type": "nuget", "type": "nuget",
"usedByEntrypoint": false, "usedByEntrypoint": false,
"metadata": { "metadata": {
"assembly[0].assetPath": "lib/net10.0/StellaOps.Toolkit.dll",
"assembly[0].fileVersion": "1.2.3.0",
"assembly[0].tfm[0]": ".NETCoreApp,Version=v10.0",
"assembly[0].version": "1.2.3.0",
"deps.dependency[0]": "microsoft.extensions.logging", "deps.dependency[0]": "microsoft.extensions.logging",
"deps.path[0]": "Sample.App.deps.json", "deps.path[0]": "Sample.App.deps.json",
"deps.rid[0]": "linux-x64", "deps.rid[0]": "linux-x64",

View File

@@ -0,0 +1,12 @@
[[package]]
name = "my_app"
version = "0.1.0"
dependencies = [
"serde",
]
[[package]]
name = "serde"
version = "1.0.188"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "abc123"

View File

@@ -0,0 +1,90 @@
[
{
"analyzerId": "rust",
"componentKey": "bin::sha256:22caa7413d89026b52db64c8abc254bf9e7647ab9216e79c6972a39451f8c41e",
"name": "unknown_tool",
"type": "bin",
"usedByEntrypoint": false,
"metadata": {
"binary.path": "usr/local/bin/unknown_tool",
"binary.sha256": "22caa7413d89026b52db64c8abc254bf9e7647ab9216e79c6972a39451f8c41e",
"provenance": "binary"
},
"evidence": [
{
"kind": "file",
"source": "binary",
"locator": "usr/local/bin/unknown_tool",
"sha256": "22caa7413d89026b52db64c8abc254bf9e7647ab9216e79c6972a39451f8c41e"
}
]
},
{
"analyzerId": "rust",
"componentKey": "purl::pkg:cargo/my_app@0.1.0",
"purl": "pkg:cargo/my_app@0.1.0",
"name": "my_app",
"version": "0.1.0",
"type": "cargo",
"usedByEntrypoint": true,
"metadata": {
"binary.paths": "usr/local/bin/my_app",
"binary.sha256": "a95a4f4854bf973deacbd937bd1189fc3d0eef7a4fd4f7960f37cf66162c82fd",
"cargo.lock.path": "Cargo.lock",
"fingerprint.profile": "debug",
"fingerprint.targetKind": "bin",
"source": "registry\u002Bhttps://github.com/rust-lang/crates.io-index"
},
"evidence": [
{
"kind": "file",
"source": "binary",
"locator": "usr/local/bin/my_app",
"sha256": "a95a4f4854bf973deacbd937bd1189fc3d0eef7a4fd4f7960f37cf66162c82fd"
},
{
"kind": "file",
"source": "cargo.fingerprint",
"locator": "target/debug/.fingerprint/my_app-1234567890abcdef/bin-my_app-1234567890abcdef.json",
"value": "bin"
},
{
"kind": "file",
"source": "cargo.lock",
"locator": "Cargo.lock",
"value": "my_app 0.1.0"
}
]
},
{
"analyzerId": "rust",
"componentKey": "purl::pkg:cargo/serde@1.0.188",
"purl": "pkg:cargo/serde@1.0.188",
"name": "serde",
"version": "1.0.188",
"type": "cargo",
"usedByEntrypoint": false,
"metadata": {
"cargo.lock.path": "Cargo.lock",
"checksum": "abc123",
"fingerprint.profile": "release",
"fingerprint.targetKind": "lib",
"source": "registry\u002Bhttps://github.com/rust-lang/crates.io-index"
},
"evidence": [
{
"kind": "file",
"source": "cargo.fingerprint",
"locator": "target/debug/.fingerprint/serde-abcdef1234567890/libserde-abcdef1234567890.json",
"value": "lib"
},
{
"kind": "file",
"source": "cargo.lock",
"locator": "Cargo.lock",
"value": "serde 1.0.188",
"sha256": "abc123"
}
]
}
]
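The first component above shows the fallback path: a binary with no crate provenance is keyed purely by its content hash (`bin::sha256:<hex>`). A sketch of deriving that key, using the prefix convention visible in this fixture (helper name is hypothetical):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

static class BinaryComponentKey
{
    // Mirrors the fixture convention "bin::sha256:<lower-case hex digest>".
    public static string FromFile(string binaryPath)
    {
        using var stream = File.OpenRead(binaryPath);
        using var sha = SHA256.Create();
        var digest = sha.ComputeHash(stream);
        return $"bin::sha256:{Convert.ToHexString(digest).ToLowerInvariant()}";
    }
}
```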

View File

@@ -0,0 +1,5 @@
{
"pkgid": "my_app 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"profile": "debug",
"target_kind": ["bin"]
}

View File

@@ -0,0 +1,5 @@
{
"pkgid": "serde 1.0.188 (registry+https://github.com/rust-lang/crates.io-index)",
"profile": "release",
"target_kind": ["lib"]
}
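Both fingerprint fixtures share the same minimal shape: `pkgid`, `profile`, and `target_kind`. A deserialization sketch with `System.Text.Json`; the record and helper names are illustrative, not the scanner's own types:

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

// Illustrative shape for the cargo fingerprint JSON fixtures above.
sealed record FingerprintRecord(
    [property: JsonPropertyName("pkgid")] string PackageId,
    [property: JsonPropertyName("profile")] string Profile,
    [property: JsonPropertyName("target_kind")] string[] TargetKinds);

static class FingerprintReader
{
    public static FingerprintRecord? Parse(string json)
        => JsonSerializer.Deserialize<FingerprintRecord>(json);

    // "my_app 0.1.0 (registry+https://...)" -> ("my_app", "0.1.0")
    public static (string Name, string Version)? SplitPackageId(string pkgid)
    {
        var parts = pkgid.Split(' ');
        return parts.Length >= 2 ? (parts[0], parts[1]) : null;
    }
}
```

Applied to the first fixture, `SplitPackageId` yields `("my_app", "0.1.0")`, which is how a fingerprint record can be correlated back to its `Cargo.lock` entry.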

View File

@@ -4,23 +4,23 @@ namespace StellaOps.Scanner.Analyzers.Lang.Tests.Harness;
public static class LanguageAnalyzerTestHarness
{
-public static async Task<string> RunToJsonAsync(string fixturePath, IEnumerable<ILanguageAnalyzer> analyzers, CancellationToken cancellationToken = default, LanguageUsageHints? usageHints = null)
+public static async Task<string> RunToJsonAsync(string fixturePath, IEnumerable<ILanguageAnalyzer> analyzers, CancellationToken cancellationToken = default, LanguageUsageHints? usageHints = null, IServiceProvider? services = null)
{
if (string.IsNullOrWhiteSpace(fixturePath))
{
throw new ArgumentException("Fixture path is required", nameof(fixturePath));
}
var engine = new LanguageAnalyzerEngine(analyzers ?? Array.Empty<ILanguageAnalyzer>());
-var context = new LanguageAnalyzerContext(fixturePath, TimeProvider.System, usageHints);
+var context = new LanguageAnalyzerContext(fixturePath, TimeProvider.System, usageHints, services);
var result = await engine.AnalyzeAsync(context, cancellationToken).ConfigureAwait(false);
return result.ToJson(indent: true);
}
-public static async Task AssertDeterministicAsync(string fixturePath, string goldenPath, IEnumerable<ILanguageAnalyzer> analyzers, CancellationToken cancellationToken = default, LanguageUsageHints? usageHints = null)
+public static async Task AssertDeterministicAsync(string fixturePath, string goldenPath, IEnumerable<ILanguageAnalyzer> analyzers, CancellationToken cancellationToken = default, LanguageUsageHints? usageHints = null, IServiceProvider? services = null)
{
-var actual = await RunToJsonAsync(fixturePath, analyzers, cancellationToken, usageHints).ConfigureAwait(false);
+var actual = await RunToJsonAsync(fixturePath, analyzers, cancellationToken, usageHints, services).ConfigureAwait(false);
var expected = await File.ReadAllTextAsync(goldenPath, cancellationToken).ConfigureAwait(false);
// Normalize newlines for portability.
actual = NormalizeLineEndings(actual).TrimEnd();

View File

@@ -0,0 +1,34 @@
using System.IO;
using StellaOps.Scanner.Analyzers.Lang.Rust;
using StellaOps.Scanner.Analyzers.Lang.Tests.Harness;
using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities;
namespace StellaOps.Scanner.Analyzers.Lang.Tests.Rust;
public sealed class RustLanguageAnalyzerTests
{
[Fact]
public async Task SimpleFixtureProducesDeterministicOutputAsync()
{
var cancellationToken = TestContext.Current.CancellationToken;
var fixturePath = TestPaths.ResolveFixture("lang", "rust", "simple");
var goldenPath = Path.Combine(fixturePath, "expected.json");
var usageHints = new LanguageUsageHints(new[]
{
Path.Combine(fixturePath, "usr/local/bin/my_app")
});
var analyzers = new ILanguageAnalyzer[]
{
new RustLanguageAnalyzer()
};
await LanguageAnalyzerTestHarness.AssertDeterministicAsync(
fixturePath,
goldenPath,
analyzers,
cancellationToken,
usageHints);
}
}

View File

@@ -32,6 +32,7 @@
<ItemGroup>
<ProjectReference Include="..\StellaOps.Scanner.Analyzers.Lang\StellaOps.Scanner.Analyzers.Lang.csproj" />
<ProjectReference Include="..\StellaOps.Scanner.Analyzers.Lang.DotNet\StellaOps.Scanner.Analyzers.Lang.DotNet.csproj" />
<ProjectReference Include="..\StellaOps.Scanner.Analyzers.Lang.Rust\StellaOps.Scanner.Analyzers.Lang.Rust.csproj" />
<ProjectReference Include="..\StellaOps.Scanner.Core\StellaOps.Scanner.Core.csproj" /> <ProjectReference Include="..\StellaOps.Scanner.Core\StellaOps.Scanner.Core.csproj" />
</ItemGroup> </ItemGroup>

View File

@@ -51,7 +51,7 @@ All sprints below assume prerequisites from SP10-G2 (core scaffolding + Java ana
- **Gate Artifacts:**
- Benchmarks vs competitor open-source tool (Trivy or Syft) demonstrating faster metadata extraction.
- Documentation snippet explaining VCS metadata fields for Policy team.
- **Progress (2025-10-22):** Build-info decoder shipped with DWARF-string fallback for `vcs.*` markers, plus cached metadata keyed by binary length/timestamp. Added Go test fixtures covering build-info and DWARF-only binaries with deterministic goldens; analyzer now emits `go.dwarf` evidence alongside `go.buildinfo` metadata to feed downstream provenance rules. Completed stripped-binary heuristics with deterministic `golang::bin::sha256` components and a new `stripped` fixture to guard quiet-provenance behaviour.
## Sprint LA4 — .NET Analyzer & RID Variants (Tasks 10-305, 10-307, 10-308, 10-309 subset)
- **Scope:** Parse `*.deps.json`, `runtimeconfig.json`, assembly metadata, and RID-specific assets; correlate with native dependencies.
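The progress note above mentions caching decoded build-info metadata keyed by binary length and timestamp. A rough sketch of such a cache under those assumptions (types and naming are not taken from the analyzer):

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

// Illustrative cache: skips re-decoding a binary whose size and last-write time are unchanged.
sealed class BuildInfoCache<TMetadata>
{
    private readonly ConcurrentDictionary<(string Path, long Length, DateTimeOffset LastWrite), TMetadata> _entries = new();

    public TMetadata GetOrDecode(string path, Func<string, TMetadata> decode)
    {
        var info = new FileInfo(path);
        var key = (info.FullName, info.Length, (DateTimeOffset)info.LastWriteTimeUtc);
        return _entries.GetOrAdd(key, _ => decode(path));
    }
}
```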

View File

@@ -6,8 +6,8 @@
| SCANNER-ANALYZERS-LANG-10-302 | DONE (2025-10-21) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | Node analyzer resolving workspaces/symlinks into `pkg:npm` identities. | Node analyzer handles symlinks/workspaces; outputs sorted components; determinism harness covers hoisted deps. |
| SCANNER-ANALYZERS-LANG-10-303 | DONE (2025-10-21) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | Python analyzer consuming `*.dist-info` metadata and RECORD hashes. | Analyzer binds METADATA + RECORD evidence, includes entry points, determinism fixtures stable. |
| SCANNER-ANALYZERS-LANG-10-304 | DOING (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | Go analyzer leveraging buildinfo for `pkg:golang` components. | Buildinfo parser emits module path/version + vcs metadata; binaries without buildinfo downgraded gracefully. |
-| SCANNER-ANALYZERS-LANG-10-305 | DOING (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | .NET analyzer parsing `*.deps.json`, assembly metadata, and RID variants. | Analyzer merges deps.json + assembly info; dedupes per RID; determinism verified. |
+| SCANNER-ANALYZERS-LANG-10-305 | DONE (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | .NET analyzer parsing `*.deps.json`, assembly metadata, and RID variants. | Analyzer merges deps.json + assembly info; dedupes per RID; determinism verified. |
-| SCANNER-ANALYZERS-LANG-10-306 | TODO | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | Rust analyzer detecting crate provenance or falling back to `bin:{sha256}`. | Analyzer emits `pkg:cargo` when metadata present; falls back to binary hash; fixtures cover both paths. |
+| SCANNER-ANALYZERS-LANG-10-306 | DONE (2025-10-22) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | Rust analyzer detecting crate provenance or falling back to `bin:{sha256}`. | Analyzer emits `pkg:cargo` when metadata present; falls back to binary hash; fixtures cover both paths. |
| SCANNER-ANALYZERS-LANG-10-307 | DONE (2025-10-19) | Language Analyzer Guild | SCANNER-CORE-09-501 | Shared language evidence helpers + usage flag propagation. | Shared abstractions implemented; analyzers reuse helpers; evidence includes usage hints; unit tests cover canonical ordering. |
| SCANNER-ANALYZERS-LANG-10-308 | DONE (2025-10-19) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-307 | Determinism + fixture harness for language analyzers. | Harness executes analyzers against fixtures; golden JSON stored; CI helper ensures stable hashes. |
| SCANNER-ANALYZERS-LANG-10-309 | DONE (2025-10-21) | Language Analyzer Guild | SCANNER-ANALYZERS-LANG-10-301..308 | Package language analyzers as restart-time plug-ins (manifest + host registration). | Plugin manifests authored under `plugins/scanner/analyzers/lang`; Worker loads via DI; restart required flag enforced; tests confirm manifest integrity. |