feat: Implement Runtime Facts ingestion service and NDJSON reader
Some checks failed
Docs CI / lint-and-preview (push) Has been cancelled
- Added RuntimeFactsNdjsonReader for reading NDJSON formatted runtime facts.
- Introduced IRuntimeFactsIngestionService interface and its implementation.
- Enhanced Program.cs to register new services and endpoints for runtime facts.
- Updated CallgraphIngestionService to include CAS URI in stored artifacts.
- Created RuntimeFactsValidationException for validation errors during ingestion.
- Added tests for RuntimeFactsIngestionService and RuntimeFactsNdjsonReader.
- Implemented SignalsSealedModeMonitor for compliance checks in sealed mode.
- Updated project dependencies for testing utilities.
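
For orientation, NDJSON ingestion of this shape reads one JSON document per line and rejects malformed lines with a line-numbered error. A minimal sketch, assuming a hypothetical `RuntimeFact` shape and a generic exception standing in for `RuntimeFactsValidationException`:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

// Hypothetical payload shape; the real reader targets the Signals runtime-facts schema.
public sealed record RuntimeFact(string Subject, string Kind, DateTimeOffset ObservedAt);

public static class RuntimeFactsNdjsonSketch
{
    private static readonly JsonSerializerOptions Options = new() { PropertyNameCaseInsensitive = true };

    // One JSON document per line; blank lines are skipped, malformed lines fail loudly.
    public static IEnumerable<RuntimeFact> Read(TextReader reader)
    {
        string? line;
        var lineNumber = 0;
        while ((line = reader.ReadLine()) is not null)
        {
            lineNumber++;
            if (string.IsNullOrWhiteSpace(line))
            {
                continue;
            }

            RuntimeFact? fact;
            try
            {
                fact = JsonSerializer.Deserialize<RuntimeFact>(line, Options);
            }
            catch (JsonException ex)
            {
                // The service raises RuntimeFactsValidationException at this point.
                throw new InvalidDataException($"Invalid NDJSON at line {lineNumber}.", ex);
            }

            if (fact is not null)
            {
                yield return fact;
            }
        }
    }
}
```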
@@ -669,16 +669,85 @@ See `docs/dev/32_AUTH_CLIENT_GUIDE.md` for recommended profiles (online vs. air-

| Command | Purpose | Key Flags / Arguments | Notes |
|---------|---------|-----------------------|-------|
| `stellaops-cli scanner download` | Fetch and install scanner container | `--channel <stable\|beta\|nightly>` (default `stable`)<br>`--output <path>`<br>`--overwrite`<br>`--no-install` | Saves artefact under `ScannerCacheDirectory`, verifies digest/signature, and executes `docker load` unless `--no-install` is supplied. |
| `stellaops-cli scan run` | Execute scanner container against a directory (auto-upload) | `--target <directory>` (required)<br>`--runner <docker\|dotnet\|self>` (default from config)<br>`--entry <image-or-entrypoint>`<br>`[scanner-args...]` | Runs the scanner, writes results into `ResultsDirectory`, emits a structured `scan-run-*.json` metadata file, and automatically uploads the artefact when the exit code is `0`. |
| `stellaops-cli scan upload` | Re-upload existing scan artefact | `--file <path>` | Useful for retries when automatic upload fails or when operating offline. |
| `stellaops-cli db fetch` | Trigger connector jobs | `--source <id>` (e.g. `redhat`, `osv`)<br>`--stage <fetch\|parse\|map>` (default `fetch`)<br>`--mode <resume\|init\|cursor>` | Translates to `POST /jobs/source:{source}:{stage}` with `trigger=cli`; a request sketch follows the command tables |
| `stellaops-cli db merge` | Run canonical merge reconcile | — | Calls `POST /jobs/merge:reconcile`; exit code `0` on acceptance, `1` on failures/conflicts |
| `stellaops-cli db export` | Kick JSON / Trivy exports | `--format <json\|trivy-db>` (default `json`)<br>`--delta`<br>`--publish-full/--publish-delta`<br>`--bundle-full/--bundle-delta` | Sets `{ delta = true }` parameter when requested and can override ORAS/bundle toggles per run |
| `stellaops-cli auth <login\|logout\|status\|whoami>` | Manage cached tokens for StellaOps Authority | `auth login --force` (ignore cache)<br>`auth status`<br>`auth whoami` | Uses `StellaOps.Auth.Client`; honours `StellaOps:Authority:*` configuration, stores tokens under `~/.stellaops/tokens` by default, and `whoami` prints subject/scope/expiry |
| `stellaops-cli auth revoke export` | Export the Authority revocation bundle | `--output <directory>` (defaults to CWD) | Writes `revocation-bundle.json`, `.json.jws`, and `.json.sha256`; verifies the digest locally and includes key metadata in the log summary. |
| `stellaops-cli auth revoke verify` | Validate a revocation bundle offline | `--bundle <path>` `--signature <path>` `--key <path>`<br>`--verbose` | Verifies detached JWS signatures, reports the computed SHA-256, and can fall back to cached JWKS when `--key` is omitted. |
| `stellaops-cli offline kit pull` | Download the latest offline kit bundle and manifest | `--bundle-id <id>` (optional)<br>`--destination <dir>`<br>`--overwrite`<br>`--no-resume` | Streams the bundle + manifest from the configured mirror/backend, resumes interrupted downloads, verifies SHA-256, and writes signatures plus a `.metadata.json` manifest alongside the artefacts. |
| `stellaops-cli ruby inspect` | Offline Ruby workspace inspection (Gemfile / lock + runtime signals) | `--root <directory>` (default current directory)<br>`--format <table\|json>` (default `table`) | Runs the bundled `RubyLanguageAnalyzer`, renders Package/Version/Group/Source/Lockfile/Runtime columns, or emits JSON `{ packages: [...] }`. Exit codes: `0` success, `64` invalid format, `70` unexpected failure, `71` missing directory. |
| `stellaops-cli ruby resolve` | Fetch Ruby package inventory for a completed scan | `--image <registry-ref>` *or* `--scan-id <id>` (one required)<br>`--format <table\|json>` (default `table`) | Calls `GetRubyPackagesAsync` to download `ruby_packages.json`, groups entries by bundle/platform, and shows runtime entrypoints/usage. Table output mirrors `inspect`; JSON returns `{ scanId, groups: [...] }`. Exit codes: `0` success, `64` invalid args, `70` backend failure. |
### Ruby dependency verbs (`stellaops-cli ruby …`)
`ruby inspect` runs the same deterministic `RubyLanguageAnalyzer` bundled with Scanner.Worker against the local working tree—no backend calls—so operators can sanity-check Gemfile / Gemfile.lock pairs before shipping. `ruby resolve` downloads the `ruby_packages.json` artifact that Scanner creates for each scan (via `GetRubyPackagesAsync`) and reshapes it for operators who need to reason about groups/platforms/runtime usage after the fact.

**`ruby inspect` flags**

| Flag | Default | Description |
| ---- | ------- | ----------- |
| `--root <dir>` | current working directory | Directory containing `Gemfile`, `Gemfile.lock`, and runtime sources. Missing paths set exit code **71**. |
| `--format <table\|json>` | `table` | `table` renders Package/Version/Groups/Platform/Source/Lockfile/Runtime columns; `json` emits `{ "packages": [...] }` with the analyzer metadata. |
| `--verbose` / `-v` | `false` | Surfaces analyzer trace logging while keeping deterministic output. |
Successful runs exit `0`; invalid formats raise **64**, unexpected failures return **70**. Table output marks runtime usage with `[green]Entrypoint[/]` and includes every runtime entrypoint path when available. JSON mode mirrors analyzer metadata:

```json
{
  "packages": [
    {
      "name": "rack",
      "version": "3.1.0",
      "source": "https://rubygems.org/",
      "lockfile": "Gemfile.lock",
      "groups": ["default"],
      "platform": "-",
      "runtimeEntrypoints": ["app.rb"],
      "runtimeFiles": ["app.rb"],
      "runtimeReasons": ["require-static"],
      "usedByEntrypoint": true
    }
  ]
}
```
**`ruby resolve` flags**

| Flag | Default | Description |
| ---- | ------- | ----------- |
| `--image <registry/ref>` | — | Scanner artifact identifier (image digest/tag). Mutually exclusive with `--scan-id`; one is required. |
| `--scan-id <id>` | — | Explicit scan identifier returned by `scan run`. |
| `--format <table\|json>` | `table` | `json` writes `{ "scanId": "…", "groups": [{ "group": "default", "platform": "-", "packages": [...] }] }`. |
| `--verbose` / `-v` | `false` | Enables HTTP + resolver logging. |
Errors caused by missing identifiers return **64**; transient backend errors surface as **70** (with full context in logs). Table output groups packages by Gem/Bundle group + platform and shows runtime entrypoints or `[grey]-[/]` when unused. JSON payloads stay stable for downstream automation:

```json
{
  "scanId": "scan-ruby",
  "groups": [
    {
      "group": "default",
      "platform": "-",
      "packages": [
        {
          "name": "rack",
          "lockfile": "Gemfile.lock",
          "groups": ["default"],
          "runtimeUsed": true,
          "runtimeEntrypoints": ["app.rb"]
        }
      ]
    }
  ]
}
```
Both commands honour CLI observability hooks: Spectre tables for human output, `--format json` for automation, metrics reported via `CliMetrics.RecordRubyInspect/Resolve`, and Activity tags (`cli.ruby.inspect`, `cli.ruby.resolve`) for trace correlation.
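
For reference, that metric/trace pairing can be reproduced with plain `System.Diagnostics` primitives. A minimal sketch with hypothetical meter and instrument names (only the `cli.ruby.inspect` activity name comes from the text above):

```csharp
using System.Diagnostics;
using System.Diagnostics.Metrics;

static class RubyCliTelemetrySketch
{
    private static readonly Meter Meter = new("StellaOps.Cli");            // assumed meter name
    private static readonly ActivitySource Source = new("StellaOps.Cli");  // assumed source name
    private static readonly Counter<long> InspectRuns =
        Meter.CreateCounter<long>("stellaops.cli.ruby.inspect.count");     // assumed counter name

    public static void RecordInspect(string root, string format)
    {
        using var activity = Source.StartActivity("cli.ruby.inspect");
        activity?.SetTag("ruby.root", root);
        activity?.SetTag("ruby.format", format);
        InspectRuns.Add(1);
    }
}
```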

| Command | Purpose | Key Flags / Arguments | Notes |
|---------|---------|-----------------------|-------|
| `stellaops-cli offline kit import` | Upload an offline kit bundle to the backend | `<bundle.tgz>` (argument)<br>`--manifest <path>`<br>`--bundle-signature <path>`<br>`--manifest-signature <path>` | Validates digests when metadata is present, then posts multipart payloads to `POST /api/offline-kit/import`; logs the submitted import ID/status for air-gapped rollout tracking. |
| `stellaops-cli offline kit status` | Display imported offline kit details | `--json` | Shows bundle id/kind, captured/imported timestamps, digests, and component versions; `--json` emits machine-readable output for scripting. |
| `stellaops-cli sources ingest --dry-run` | Dry-run guard validation for individual payloads | `--source <id>`<br>`--input <path\|uri>`<br>`--tenant <id>`<br>`--format table\|json`<br>`--output <file>` | Normalises gzip/base64 payloads, invokes `api/aoc/ingest/dry-run`, and maps guard failures to deterministic `ERR_AOC_00x` exit codes. |
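
For orientation, the `db fetch` row above maps a CLI verb onto a job-trigger endpoint. A minimal standalone sketch, assuming an already-authenticated `HttpClient`, an illustrative base URL, and a guessed body shape for `mode` (the real CLI routes through its configured backend client):

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;

// POST /jobs/source:{source}:{stage} with trigger=cli, per the table above.
using var http = new HttpClient { BaseAddress = new Uri("https://backend.example") };
var response = await http.PostAsJsonAsync(
    "/jobs/source:redhat:fetch",
    new { trigger = "cli", mode = "resume" }); // `mode` field shape is an assumption
response.EnsureSuccessStatusCode();
```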
@@ -6,26 +6,89 @@ Active items only. Completed/historic work now resides in docs/implplan/archived

| Wave | Guild owners | Shared prerequisites | Status | Notes |
| --- | --- | --- | --- | --- |
| 110.A AdvisoryAI | Advisory AI Guild · Docs Guild · SBOM Service Guild | Sprint 100.A – Attestor (closed 2025-11-09 per `docs/implplan/archived/SPRINT_100_identity_signing.md`) | DOING | Regression/perf suite (AIAI-31-009) and console doc (DOCS-AIAI-31-004) remain DOING; SBOM (SBOM-AIAI-31-001/003), CLI (CLI-VULN-29-001/CLI-VEX-30-001), Policy (POLICY-ENGINE-31-001), and DevOps (DEVOPS-AIAI-31-001) owners owe delivery ETA updates on 2025-11-10 so the CLI/policy/runbook docs can unblock. |
| 110.B Concelier | Concelier Core & WebService Guilds · Observability Guild · AirGap Guilds (Importer/Policy/Time) | Sprint 100.A – Attestor | DOING | Paragraph chunk API shipped 2025-11-07; structured field/caching (CONCELIER-AIAI-31-002) is still TODO, telemetry (CONCELIER-AIAI-31-003) DOING, and air-gap/console/attestation tracks remain gated on Link-Not-Merge + Cartographer schema. |
| 110.C Excititor | Excititor WebService/Core Guilds · Observability Guild · Evidence Locker Guild | Sprint 100.A – Attestor | DOING | Normalized justification projections (EXCITITOR-AIAI-31-001) are DOING; chunk API, telemetry, docs, attestation, and mirror backlog stay queued behind that work plus Link-Not-Merge / Cartographer prerequisites. |
| 110.D Mirror | Mirror Creator Guild · Exporter Guild · CLI Guild · AirGap Time Guild | Sprint 100.A – Attestor | TODO | Wave remains TODO—MIRROR-CRT-56-001 has not started, so DSSE/TUF, OCI/time-anchor, CLI, and scheduling integrations cannot proceed. |
## Status Snapshot (2025-11-09)
- **Advisory AI (110.A)** – WebService orchestration (AIAI-31-004), typed SBOM client/tooling (AIAI-31-002/003), guardrail pipeline (AIAI-31-005), and overview/API/architecture docs (DOCS-AIAI-31-001/002/003) are DONE; focus now sits on DOCS-AIAI-31-004 and AIAI-31-009 while CLI/policy/SBOM deliverables unblock the remaining docs.
  - 2025-11-09: AIAI-31-009 remains DOING after converting the guardrail harness into JSON fixtures, expanding property/perf coverage, and validating offline cache seeding; remote inference packaging (AIAI-31-008) is still TODO until the policy knob work in AIAI-31-006..007 completes.
  - 2025-11-09: DOCS-AIAI-31-004 continues DOING—guardrail/offline sections are drafted, but screenshots plus copy blocks wait on CONSOLE-VULN-29-001, CONSOLE-VEX-30-001, and EXCITITOR-CONSOLE-23-001.
  - SBOM-AIAI-31-003 and DOCS-AIAI-31-005/006/008/009 remain BLOCKED pending SBOM-AIAI-31-001, CLI-VULN-29-001, CLI-VEX-30-001, POLICY-ENGINE-31-001, and DEVOPS-AIAI-31-001.
- **Concelier (110.B)** – `/advisories/{advisoryKey}/chunks` shipped on 2025-11-07 with tenant enforcement, chunk tuning knobs, and regression fixtures; structured field/caching work (CONCELIER-AIAI-31-002) is still TODO while telemetry/guardrail instrumentation (CONCELIER-AIAI-31-003) is DOING.
  - Air-gap provenance/staleness bundles (`CONCELIER-AIRGAP-56-001` → `CONCELIER-AIRGAP-58-001`), console views/deltas (`CONCELIER-CONSOLE-23-001..003`), and attestation metadata (`CONCELIER-ATTEST-73-001/002`) remain TODO pending Link-Not-Merge plus Cartographer schema delivery.
  - Connector provenance refreshes `FEEDCONN-ICSCISA-02-012` and `FEEDCONN-KISA-02-008` are still overdue, leaving evidence parity gaps for those feeds.
- **Excititor (110.C)** – Normalized VEX justification projections (EXCITITOR-AIAI-31-001) are DOING as of 2025-11-09; the downstream chunk API (EXCITITOR-AIAI-31-002), telemetry/guardrails (EXCITITOR-AIAI-31-003), docs/OpenAPI alignment (EXCITITOR-AIAI-31-004), and attestation payload work (`EXCITITOR-ATTEST-*`) stay TODO until that projection work plus Link-Not-Merge schema land.
  - Mirror/air-gap backlog (`EXCITITOR-AIRGAP-56-001` .. `EXCITITOR-AIRGAP-58-001`) and connector provenance parity (`EXCITITOR-CONN-TRUST-01-001`) remain unscheduled, so Advisory AI cannot yet hydrate sealed VEX evidence or cite connector signatures.
- **Mirror (110.D)** – MIRROR-CRT-56-001 (deterministic bundle assembler) has not kicked off, so DSSE/TUF (MIRROR-CRT-56-002), OCI exports (MIRROR-CRT-57-001), time anchors (MIRROR-CRT-57-002), CLI verbs (MIRROR-CRT-58-001), and Export Center automation (MIRROR-CRT-58-002) are all blocked.
## Blockers & Overdue Follow-ups
- Advisory AI customer-facing coverage remains blocked until SBOM-AIAI-31-001 exposes the `/v1/sbom/context` hand-off kit and until CLI-VULN-29-001, CLI-VEX-30-001, POLICY-ENGINE-31-001, and DEVOPS-AIAI-31-001 ship—keeping SBOM-AIAI-31-003 plus DOCS-AIAI-31-005/006/008/009 and the remote inference packaging work (AIAI-31-008) on hold.
- `CONCELIER-GRAPH-21-001`, `CONCELIER-GRAPH-21-002`, and `CONCELIER-GRAPH-21-005` remain BLOCKED awaiting `CONCELIER-POLICY-20-002` outputs and Cartographer schema (`CARTO-GRAPH-21-002`), keeping downstream Excititor graph consumers on hold.
- `EXCITITOR-GRAPH-21-001`, `EXCITITOR-GRAPH-21-002`, and `EXCITITOR-GRAPH-21-005` stay BLOCKED until the same Cartographer/Link-Not-Merge prerequisites are delivered.
- Connector provenance updates `FEEDCONN-ICSCISA-02-012` (due 2025-10-23) and `FEEDCONN-KISA-02-008` (due 2025-10-24) remain past due and need scheduling. FeedMerge coordination tasks have been dropped (no AOC policy/governance backing yet), so capacity shifts to schema/guard deliverables.
- Mirror evidence work remains blocked until `MIRROR-CRT-56-001` ships; align Export Center (`EXPORT-OBS-51-001`) and AirGap time anchor (`AIRGAP-TIME-57-001`) owners for kickoff.
## Immediate actions (target: 2025-11-12)
- **Advisory AI** – Land AIAI-31-009 test harness updates plus remote inference packaging (AIAI-31-008) once POLICY-ENGINE-31-001 and DEVOPS-AIAI-31-001 expose the required knobs; SBOM guild to deliver SBOM-AIAI-31-001 so SBOM-AIAI-31-003 and the CLI/policy/runbook docs can unblock.
- **Concelier** – Finish CONCELIER-AIAI-31-002 structured fields/caching and wire CONCELIER-AIAI-31-003 telemetry before starting air-gap or console endpoints; hold daily sync with Cartographer owners on CONCELIER-LNM-21-201/202 + CARTO-GRAPH-21-002.
- **Excititor** – Wrap EXCITITOR-AIAI-31-001 justification projections, then immediately stage EXCITITOR-AIAI-31-002/003 plus EXCITITOR-ATTEST-01-003 to keep Advisory AI evidence feeds parallel to Concelier.
- **Mirror** – Schedule MIRROR-CRT-56-001 kickoff with Export Center/AirGap Time guilds, confirm `EXPORT-OBS-51-001` + `AIRGAP-TIME-57-001` owners, and pre-stage DSSE/TUF design notes so MIRROR-CRT-56-002 can start as soon as the assembler lands.
- **Downstream prep** – Scanner (Sprint 130) and Policy/Vuln Explorer (Sprint 129) owners should review AIAI-31-009 outputs after 2025-11-10 to ensure schema expectations match; Concelier CONSOLE (23-001..003) and AIRGAP (56/57/58) leads need Link-Not-Merge dates set during the 2025-11-11 checkpoint; Excititor mirror/air-gap teams should stage EXCITITOR-AIRGAP-56/57/58 implementation plans; Mirror CLI/Export Center teams should assemble design notes ahead of MIRROR-CRT-56-002/58-001 once the assembler kickoff happens.
## Wave detail references (2025-11-09)
- **110.A AdvisoryAI (docs/implplan/SPRINT_111_advisoryai.md)**
  DOCS-AIAI-31-004 remains DOING; DOCS-AIAI-31-005/006/008/009 are BLOCKED on CLI/POLICY/SBOM/DevOps dependencies; SBOM-AIAI-31-003 is still TODO awaiting SBOM-AIAI-31-001; AIAI-31-008 is TODO until guardrail knobs land, and AIAI-31-009 stays DOING with the expanded harness/perf coverage work.
- **110.B Concelier (docs/implplan/SPRINT_112_concelier_i.md)**
  CONCELIER-AIAI-31-002 is TODO while CONCELIER-AIAI-31-003 is DOING; all air-gap (`CONCELIER-AIRGAP-56/57/58-*`), attestation (`CONCELIER-ATTEST-73-*`), and console (`CONCELIER-CONSOLE-23-*`) tracks remain TODO pending Link-Not-Merge (`CONCELIER-LNM-21-*`) and Cartographer schema (`CARTO-GRAPH-21-002`) delivery.
- **110.C Excititor (docs/implplan/SPRINT_119_excititor_i.md)**
  EXCITITOR-AIAI-31-001 is DOING; EXCITITOR-AIAI-31-002/003/004, EXCITITOR-ATTEST-01-003/-73-001/-73-002, EXCITITOR-AIRGAP-56/57/58-* and EXCITITOR-CONN-TRUST-01-001 are all TODO awaiting the justification projection output plus Link-Not-Merge contracts.
- **110.D Mirror (docs/implplan/SPRINT_125_mirror.md)**
  Every MIRROR-CRT-56/57/58 task is still TODO; DSSE/TUF, OCI bundle, time-anchor, CLI, and Export Center automation cannot start until the deterministic bundle assembler (MIRROR-CRT-56-001) is underway with EXPORT-OBS-51-001 and AIRGAP-TIME-57-001 owners confirmed.
## Downstream dependency rollup (snapshot: 2025-11-09)
| Wave | Dependent sprint(s) (selected) | Impact if 110.* slips |
| --- | --- | --- |
| 110.A AdvisoryAI | `SPRINT_130_scanner_surface.md`, `SPRINT_129_policy_reasoning.md`, `SPRINT_513_provenance.md`, `SPRINT_514_sovereign_crypto_enablement.md` | Scanner analyzers need AdvisoryAI schemas/feeds, Policy/Vuln Explorer tracks cannot expose advisory reasoning, and provenance/sovereign crypto programs remain paused until evidence contracts land. |
| 110.B Concelier | `SPRINT_113_concelier_ii.md`, `SPRINT_114_concelier_iii.md`, `SPRINT_115_concelier_iv.md` | Link-Not-Merge schema + observation APIs gate Concelier graph, telemetry, and orchestrator waves; Console/advisor UIs stay blocked. |
| 110.C Excititor | `SPRINT_120_excititor_ii.md` → `SPRINT_124_excititor_vi.md` | VEX chunk/attestation phases cannot progress until Excititor.I ships justification projections/guardrails, delaying Lens, Policy, and Advisory AI parity for VEX evidence. |
| 110.D Mirror | `SPRINT_125_mirror.md` | Export Center, CLI, and air-gap bundles rely on MIRROR-CRT-56-001; no downstream mirror automation can begin until the deterministic assembler is complete. |
## Interlocks & owners
| Interlock | Participants | Needed artifact(s) | Status / notes (2025-11-09) |
| --- | --- | --- | --- |
| Advisory AI customer surfaces | Advisory AI Guild · SBOM Service Guild · CLI Guild · Policy Guild · DevOps Guild | `SBOM-AIAI-31-001`, `SBOM-AIAI-31-003`, `CLI-VULN-29-001`, `CLI-VEX-30-001`, `POLICY-ENGINE-31-001`, `DEVOPS-AIAI-31-001` | SBOM hand-off kit + CLI/Policy knobs still pending; DOCS-AIAI-31-005/006/008/009 stay blocked until these artifacts ship. |
| Link-Not-Merge contract | Concelier Core/WebService Guilds · Cartographer Guild · Platform Events Guild | `CONCELIER-LNM-21-001`→`21-203`, `CARTO-GRAPH-21-002`, `CONCELIER-GRAPH-21-001/002`, `CONCELIER-CONSOLE-23-001..003` | Schema and observation APIs not started; Cartographer schema delivery remains the gate for CONCELIER-AIAI-31-002/003 and all console/air-gap tracks. |
| VEX justification + attestation | Excititor WebService/Core Guilds · Observability Guild · Evidence Locker Guild · Cartographer Guild | `EXCITITOR-AIAI-31-001`→`31-004`, `EXCITITOR-ATTEST-01-003`, `EXCITITOR-ATTEST-73-001/002`, `EXCITITOR-AIRGAP-56/57/58-*`, `EXCITITOR-CONN-TRUST-01-001` | Justification enrichment is DOING; every downstream chunk/telemetry/attestation/mirror task remains TODO pending that output plus Link-Not-Merge contracts. |
| Mirror evidence kickoff | Mirror Creator Guild · Exporter Guild · AirGap Time Guild · Security Guild · CLI Guild | `MIRROR-CRT-56-001`→`56-002`, `MIRROR-CRT-57-001/002`, `MIRROR-CRT-58-001/002`, `EXPORT-OBS-51-001`, `EXPORT-OBS-54-001`, `AIRGAP-TIME-57-001`, `CLI-AIRGAP-56-001`, `PROV-OBS-53-001` | No owner meeting yet; assembler (MIRROR-CRT-56-001) is still unscheduled, so DSSE/TUF, OCI, time-anchor, CLI, and Export Center hooks cannot start. |
### Upcoming checkpoints
| Date (UTC) | Focus | Agenda / expected exit |
| --- | --- | --- |
| 2025-11-10 | Advisory AI customer surfaces | Confirm SBOM-AIAI-31-001 delivery slot, align CLI-VULN/CLI-VEX scope owners, and capture POLICY-ENGINE-31-001 + DEVOPS-AIAI-31-001 readiness so DOCS-AIAI-31-005/006/008/009 can resume. |
| 2025-11-11 | Link-Not-Merge contract | Cartographer to present CARTO-GRAPH-21-002 schema draft, Concelier to commit dates for CONCELIER-LNM-21-001..003 and CONCELIER-AIAI-31-002/003 telemetry wiring. |
| 2025-11-11 | VEX justification + attestation | Walk EXCITITOR-AIAI-31-001 output, sequence EXCITITOR-AIAI-31-002/003, and lock attestation backlog order (`EXCITITOR-ATTEST-01-003`, `-73-001`, `-73-002`). |
| 2025-11-12 | Mirror evidence kickoff | Assign MIRROR-CRT-56-001 lead, confirm EXPORT-OBS-51-001/AIRGAP-TIME-57-001 owners, and outline DSSE/TUF design reviews for MIRROR-CRT-56-002. |
## Coordination log
| Date | Notes |
| --- | --- |
| 2025-11-09 | Sprint file refreshed with wave detail references, interlocks, and risk log; waiting on 2025-11-10/11/12 syncs for SBOM/CLI/POLICY/DevOps, Link-Not-Merge, Excititor justification, and Mirror assembler commitments. |
## Risk log (2025-11-09)
| Risk | Impact | Mitigation / owner |
| --- | --- | --- |
| SBOM/CLI/Policy/DevOps deliverables slip past 2025-11-12 | Advisory AI CLI/docs remain blocked; downstream Scanner/Policy/Vuln Explorer sprints cannot validate schema feeds | Capture ETAs during 2025-11-10 interlock; SBOM/CLI/Policy/DevOps guild leads to publish commit dates and update sprint rows immediately |
| Link-Not-Merge schema delays (`CONCELIER-LNM-21-*`, `CARTO-GRAPH-21-002`) | Concelier evidence APIs, console views, and Excititor graph consumers cannot progress; Advisory AI loses deterministic Concelier feeds | 2025-11-11 checkpoint to lock schema delivery; Cartographer + Concelier core owners to share migration plan and unblock CONCELIER-AIAI-31-002/003 |
| Excititor justification/attestation backlog stalls | Advisory AI cannot cite VEX evidence, Excititor attestation/air-gap tasks remain TODO, Mirror parity slips | Excititor web/core leads to finish EXCITITOR-AIAI-31-001 and schedule EXCITITOR-AIAI-31-002/003 + ATTEST tasks during 2025-11-11 session |
| Mirror assembler lacks staffing (`MIRROR-CRT-56-001`) | DSSE/TUF, OCI/time-anchor, CLI, Export Center automations cannot even start, blocking Wave 110.D and Sprint 125 entirely | 2025-11-12 kickoff must assign an owner and confirm EXPORT-OBS/AIRGAP-TIME prerequisites; track progress daily until assembler code is in flight |
@@ -8,11 +8,11 @@ Execute the tasks below strictly in order; each artifact unblocks the next analy

| Order | Task ID | State | Summary | Owner / Source | Depends On |
| --- | --- | --- | --- | --- | --- |
| 1 | `SCANNER-ANALYZERS-DENO-26-001` | DONE | Build the deterministic input normalizer + VFS merger for `deno.json(c)`, import maps, lockfiles, vendor trees, `$DENO_DIR`, and OCI layers so analyzers have a canonical file view. | Deno Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | — |
| 2 | `SCANNER-ANALYZERS-DENO-26-002` | DONE | Implement the module graph resolver covering static/dynamic imports, npm bridge, cache lookups, built-ins, WASM/JSON assertions, and annotate edges with their resolution provenance. | Deno Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | SCANNER-ANALYZERS-DENO-26-001 |
| 3 | `SCANNER-ANALYZERS-DENO-26-003` | DONE | Ship the npm/node compatibility adapter that maps `npm:` specifiers, evaluates `exports` conditionals, and logs builtin usage for policy overlays. | Deno Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | SCANNER-ANALYZERS-DENO-26-002 |
| 4 | `SCANNER-ANALYZERS-DENO-26-004` | DONE | Add the permission/capability analyzer covering FS/net/env/process/crypto/FFI/workers plus dynamic-import + literal fetch heuristics with reason codes. | Deno Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | SCANNER-ANALYZERS-DENO-26-003 |
| 5 | `SCANNER-ANALYZERS-DENO-26-005` | DONE | Build bundle/binary inspectors for eszip and `deno compile` executables to recover graphs, configs, embedded resources, and snapshots. | Deno Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | SCANNER-ANALYZERS-DENO-26-004 |
| 6 | `SCANNER-ANALYZERS-DENO-26-006` | DONE | Implement the OCI/container adapter that stitches per-layer Deno caches, vendor trees, and compiled binaries back into provenance-aware analyzer inputs. | Deno Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | SCANNER-ANALYZERS-DENO-26-005 |
| 7 | `SCANNER-ANALYZERS-DENO-26-007` | DOING | Produce AOC-compliant observation writers (entrypoints, modules, capability edges, workers, warnings, binaries) with deterministic reason codes. | Deno Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | SCANNER-ANALYZERS-DENO-26-006 |
| 8 | `SCANNER-ANALYZERS-DENO-26-008` | TODO | Finalize fixture + benchmark suite (vendor/npm/FFI/worker/dynamic import/bundle/cache/container cases) validating analyzer determinism and performance. | Deno Analyzer Guild, QA Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno) | SCANNER-ANALYZERS-DENO-26-007 |
@@ -6,9 +6,160 @@

| Task ID | State | Summary | Owner / Source | Depends On |
| --- | --- | --- | --- | --- |
| `SCANNER-ENG-0002` | DONE (2025-11-09) | Design the Node.js lockfile collector + CLI validator per `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md`, capturing Surface + policy requirements before implementation. | Scanner Guild, CLI Guild (docs/modules/scanner) | — |
| `SCANNER-ENG-0003` | DONE (2025-11-09) | Design Python lockfile + editable-install parity checks with policy predicates and CLI workflow coverage as outlined in the gap analysis. | Python Analyzer Guild, CLI Guild (docs/modules/scanner) | — |
| `SCANNER-ENG-0004` | DONE (2025-11-09) | Design Java lockfile ingestion/validation (Gradle/SBT collectors, CLI verb, policy hooks) to close comparison gaps. | Java Analyzer Guild, CLI Guild (docs/modules/scanner) | — |
| `SCANNER-ENG-0005` | DONE (2025-11-09) | Enhance Go stripped-binary fallback inference design, including inferred module metadata + policy integration, per the gap analysis. | Go Analyzer Guild (docs/modules/scanner) | — |
| `SCANNER-ENG-0006` | DONE (2025-11-09) | Expand Rust fingerprint coverage design (enriched fingerprint catalogue + policy controls) per the comparison matrix. | Rust Analyzer Guild (docs/modules/scanner) | — |
| `SCANNER-ENG-0007` | DONE (2025-11-09) | Design the deterministic secret leak detection pipeline covering rule packaging, Policy Engine integration, and CLI workflow. | Scanner Guild, Policy Guild (docs/modules/scanner) | — |
> 2025-11-09: The gap designs below capture analyzer, Surface, CLI, and policy contracts for SCANNER-ENG-0002…0007; tasks were taken DOING → DONE after this review.
## Implementation progress (2025-11-09)
- Gradle/Maven lock ingestion is now wired into `JavaLanguageAnalyzer`: `JavaLockFileCollector` sorts lock metadata deterministically, merges it with archive findings (`lockConfiguration`, `lockRepository`, `lockResolved`), and emits declared-only components (with `declaredOnly=true`, `lockSource`, `lockLocator`) whenever jars are missing. CLI/Surface telemetry tags were updated to carry per-language declared/missing counters.
- `stella java lock-validate` shares the `HandleLanguageLockValidateAsync` helper with Node/Python, has table/JSON output parity, and is documented alongside the scanner README + CLI guide (including the new metric `stellaops.cli.java.lock_validate.count`). Tests now cover the Ruby/Node/Java lock workflows end-to-end via `CommandHandlersTests`.
## Design outcomes
### SCANNER-ENG-0002 — Node.js lockfile collector + CLI validator
**Scope & goals**
- Provide deterministic ingestion of `pnpm-lock.yaml`, `package-lock.json`, and `yarn.lock` so declared dependencies are preserved even when `node_modules` is absent.
- Offer a CLI validator that runs without scheduling a scan, reusing the same collector and Surface safety rails.

**Design decisions**
- Add `NodeLockfileCollector` under `StellaOps.Scanner.Analyzers.Lang.Node`. The collector normalises manifests into a shared model (`package name`, `version`, `resolved`, `integrity`, `registry`, `workspace path`) and emits `DeclaredOnly = true` components stored beside installed fragments (`LayerComponentFragment.DeclaredSources`). A sketch of this model follows the list.
- Reuse `LanguageAnalyzerContext` merge rules so installed packages supersede declared-only entries while retaining discrepancies for policy.
- Gate execution through `Surface.Validation` (`scanner.lockfiles.node.*` knobs) that enforce max lockfile size, workspace limits, and registry allowlists; violations fail fast with deterministic error IDs.
- Private registries referenced in lockfiles must use `secret://` handles. `Surface.Secrets` resolves these handles before validation and the resolved metadata (never the secret) is attached to the collector context for auditing.
- EntryTrace usage hints annotate runtime packages; when a package is used at runtime but missing from the lockfile, the merge step tags it with `UsageWithoutDeclaration`.
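
A minimal sketch of that shared lockfile model, assuming hypothetical member names for everything beyond the fields the bullets above call out:

```csharp
// Hypothetical record; the real model sits beside LayerComponentFragment.DeclaredSources.
public sealed record DeclaredNodePackage(
    string Name,
    string Version,
    string? Resolved,      // tarball URL recorded in the lockfile, when present
    string? Integrity,     // SRI hash, when present
    string? Registry,      // registry host, checked against the allowlist
    string WorkspacePath)  // workspace member that declared the dependency
{
    // Declared-only entries exist in the lockfile but were not found installed.
    public bool DeclaredOnly { get; init; } = true;
}
```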

**CLI, policy, docs**
- Add `stella node lock-validate [path] --format {auto|pnpm|npm|yarn}` that runs locally, reuses Surface controls, and returns canonical JSON + table summaries. The CLI inherits `--surface-config` so air-gapped configs stay consistent.
- Scanner/WebService gains `--node-lockfiles` / `SCANNER__NODE__LOCKFILES__ENABLED` toggles to control ingestion during full scans.
- Policy Engine receives predicates: `node.lock.declaredMissing`, `node.lock.registryDisallowed`, `node.lock.declarationOnly`. Templates show how to fail on disallowed registries while only warning on declared-only findings that never reach runtime.
- Update `docs/modules/scanner/architecture.md` and policy DSL appendices with the new evidence flags and CLI workflow.

**Testing, telemetry, rollout**
- Golden fixtures for pnpm v8, npm v9, and yarn berry lockfiles live under `tests/Scanner.Analyzers.Node/__fixtures__/lockfiles`. Deterministic snapshots are asserted in both analyzer and CLI tests.
- Add integration coverage in `tests/Scanner.Cli.Node` verifying exit codes and explain output for mismatched packages/registries.
- Emit counters (`scanner.node.lock.declared`, `scanner.node.lock.mismatch`, `scanner.node.lock.registry_blocked`) plus structured logs keyed by lockfile digest.
- Offline Kit ships the parser tables and CLI binary help under `offline/scanner/node-lockfiles/README.md`.

**Implementation status (2025-11-09)**
- Lockfile declarations now emit `DeclaredOnly` components in `StellaOps.Scanner.Analyzers.Lang.Node` with lock source/locator metadata and deterministic evidence for policy use.
- CLI verb `stella node lock-validate` inspects lockfiles locally, rendering declared-only/missing-lock summaries and emitting `stellaops.cli.node.lock_validate.count` telemetry.
- Node analyzer determinism fixtures updated with declared-only coverage; CLI unit suite exercises the new handler.
- Python analyzer ingests `requirements*.txt`, `Pipfile.lock`, and `poetry.lock`, tagging installed distributions with `lockSource` metadata and creating declared-only components. `stella python lock-validate` mirrors the workflow for offline validation and records `stellaops.cli.python.lock_validate.count`.
### SCANNER-ENG-0003 — Python lockfile + editable-install parity
**Scope & goals**
- Parse Python lockfiles (`poetry.lock`, `Pipfile.lock`, hashed `requirements*.txt`) to capture declared graphs pre-install.
- Detect editable installs and local path references so policy can assert parity between lockfiles and runtime contents.

**Design decisions**
- Introduce `PythonLockfileCollector` in `StellaOps.Scanner.Analyzers.Lang.Python`, capable of reading Poetry, Pipenv, pip-tools, and raw requirements syntax (including environment markers, extras, hashes, VCS refs).
- Extend the collector with an `EditableResolver` that inspects lockfile entries (`path =`, `editable = true`, `-e ./pkg`) and consults `Surface.FS` to normalise the referenced directory, capturing `EditablePath`, `SourceDigest`, and `VcsRef` metadata. A classification sketch follows the list.
- Merge results with installed `*.dist-info` data using `LanguageAnalyzerContext`. Installed evidence overrides declared-only components; editable packages missing from the artifact layer are tagged `EditableMissing`.
- `Surface.Validation` adds knobs `scanner.lockfiles.python.maxBytes`, `scanner.lockfiles.python.allowedIndexes`, and ensures hashes are present when policy mandates repeatable environments. Private index credentials are provided via `Surface.Secrets` and never persisted.
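
A small sketch of the editable-entry classification described above, assuming requirements-style input (Poetry and Pipenv lock entries expose the same signal through structured fields):

```csharp
using System;

// Hypothetical helper: recognise editable requirement lines such as `-e ./pkg`.
static (bool Editable, string? Target) ClassifyRequirementLine(string line)
{
    var trimmed = line.Trim();
    if (trimmed.StartsWith("-e ", StringComparison.Ordinal) ||
        trimmed.StartsWith("--editable ", StringComparison.Ordinal))
    {
        // Local paths (./pkg) are parity-checked via Surface.FS; VCS refs keep their pin.
        var target = trimmed[(trimmed.IndexOf(' ') + 1)..].Trim();
        return (true, target);
    }

    return (false, null);
}
```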

**CLI, policy, docs**
- New CLI verb `stella python lock-validate` mirrors the Node workflow, validates editable references resolve within the checked-out tree, and emits parity diagnostics.
- Scanner runs accept `--python-lockfiles` to toggle ingestion per tenant.
- Policy predicates: `python.lock.declaredMissing`, `python.lock.editableUnpinned`, `python.lock.indexDisallowed`. Editable packages missing from the filesystem can be set to fail builds or raise waivers.
- Document the workflow in `docs/modules/scanner/architecture.md` and the policy cookbook, including guidance on handling build-system backends.

**Testing, telemetry, rollout**
- Fixtures covering Poetry 1.6, Pipenv 2024.x, `requirements.txt` with markers, and mixed editable/VCS entries live beside the analyzer tests.
- CLI golden output asserts deterministic ordering and masking of secrets in URLs.
- Metrics: `scanner.python.lock.declared`, `scanner.python.lock.editable`, `scanner.python.lock.failures`.
- Offline Kit bundles include parser definitions and sample policies to keep air-gapped tenants aligned.
### SCANNER-ENG-0004 — Java/Gradle/SBT lockfile ingestion & validation
**Scope & goals**
- Capture Gradle, Maven, and SBT dependency locks before artifacts are built, along with repository provenance and configuration scopes.
- Provide CLI validation and policy predicates enforcing repository allowlists and declared/runtime parity.

**Design decisions**
- Add collectors: `GradleLockfileCollector` (reads `gradle.lockfile` and `gradle/dependency-locks/*.lock`), `MavenLockfileCollector` (parses `pom.xml`/`pom.lock` + dependencyManagement overrides), and `SbtLockfileCollector` (reads Ivy resolution outputs or `dependencies.lock`).
- Each collector emits normalized records keyed by `groupId:artifactId:version` plus config scope (`compileClasspath`, `runtimeClasspath`, etc.), repository URI, checksum, and optional classifier. Records are stored as `DeclaredOnly` fragments associated with their workspace path. A sketch of this record shape follows the list.
- `Surface.Validation` enforces file-size limits, repository allowlists (`scanner.lockfiles.java.allowedRepos`), and optional checksum requirements. Private Maven credentials flow through `Surface.Secrets`.
- `JavaLanguageAnalyzer` merges declared entries with installed archives. Runtime usage from EntryTrace is attached so policies can prioritize gaps that reach runtime.
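
A sketch of the normalized record each collector emits, with hypothetical member names around the fields listed above:

```csharp
// Hypothetical shape keyed by groupId:artifactId:version plus configuration scope.
public sealed record DeclaredJavaDependency(
    string GroupId,
    string ArtifactId,
    string Version,
    string ConfigurationScope, // e.g. "compileClasspath", "runtimeClasspath"
    string RepositoryUri,      // checked against scanner.lockfiles.java.allowedRepos
    string? Checksum,          // absence can trip a java.lock.unpinned policy predicate
    string? Classifier,
    string WorkspacePath)      // module that declared the dependency
{
    public string Key => $"{GroupId}:{ArtifactId}:{Version}";
}
```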

**CLI, policy, docs**
- CLI verb `stella java lock-validate` supports Gradle/Maven/SBT modes, prints mismatched dependencies, and checks repository policy.
- Scanner flags `--java-lockfiles` or env `SCANNER__JAVA__LOCKFILES__ENABLED` gate ingestion. Lockfile artifacts are uploaded to Surface.FS for evidence replay.
- Policy predicates: `java.lock.declaredMissing`, `java.lock.repoDisallowed`, `java.lock.unpinned` (no checksum). Explain traces cite repository + config scope for each discrepancy.
- Docs: update scanner module dossier and policy template library with repository governance examples.

**Testing, telemetry, rollout**
- Fixtures derived from sample Gradle multi-projects, Maven BOM hierarchies, and SBT builds validate parser coverage and CLI messaging.
- Metrics `scanner.java.lock.declared`, `scanner.java.lock.missing`, `scanner.java.lock.repo_blocked` feed the observability dashboards.
- Offline kits include parser grammars and CLI docs so air-gapped tenants can enforce repo policies without SaaS dependencies.
### SCANNER-ENG-0005 — Go stripped-binary fallback inference
**Scope & goals**
- Enrich the stripped-binary fallback so Go modules remain explainable even without embedded `buildinfo`, and give Policy Engine knobs to treat inferred evidence differently.

**Design decisions**
- Extend `GoBinaryScanner` with an inference pipeline that, when build info is absent, parses ELF/Mach-O symbol tables and DWARF data using the existing `ElfSharp` bindings. Symbols feed into a new `GoSymbolInferenceEngine` that matches against a signed `GoFingerprintCatalog` under `StellaOps.Scanner.Analyzers.Lang.Go.Fingerprints`.
- Inferred results carry `Confidence` (0–1), matched symbol counts, and reasons (`BuildInfoMissing`, `SymbolMatches`, `PkgPathFallback`). Records are emitted as `InferredModule` metadata alongside hashed fallback components. A sketch of this metadata follows the list.
- Update fragment schemas so DSSE-composed BOMs include both the hashed fallback and the inference summary, enabling deterministic replay.
- `Surface.Validation` exposes `scanner.analyzers.go.fallback.enabled`, `scanner.analyzers.go.fallback.maxSymbolBytes`, ensuring workloads can opt out or constrain processing time.
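
A sketch of the inferred-module metadata, assuming hypothetical type names around the fields and reason codes named above:

```csharp
using System.Collections.Generic;

// Hypothetical shapes; the reason names come from the design above.
public enum GoInferenceReason { BuildInfoMissing, SymbolMatches, PkgPathFallback }

public sealed record InferredGoModule(
    string ModulePath,
    string? Version,
    double Confidence,                          // 0–1, derived from matched-symbol coverage
    int MatchedSymbols,
    IReadOnlyList<GoInferenceReason> Reasons);  // emitted beside the hashed fallback component
```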

**Policy, CLI, docs**
- Policy predicates `go.module.inferenceConfidence` and `go.module.hashOnly` let tenants fail when only hashed provenance exists or warn when inference confidence < threshold.
- CLI flag `--go-fallback-detail` (and corresponding API query) prints hashed vs inferred modules, confidence, and remediation hints (e.g., rebuild with `-buildvcs`).
- Documentation updates cover inference details, how confidence feeds lattice weights, and how to author waivers.

**Testing, telemetry, rollout**
- Add stripped binary fixtures (Linux, macOS) plus intentionally obfuscated samples. Tests assert deterministic inference and hashing.
- Metrics `scanner.go.inference.count`, `scanner.go.inference.confidence_bucket` ensure observability; logs include `imageDigest`, `binaryPath`, `confidence`.
- Offline Kit bundles the fingerprint catalog and inference changelog so air-gapped tenants can audit provenance.
### SCANNER-ENG-0006 — Rust fingerprint coverage expansion
**Scope & goals**
- Improve Rust evidence for stripped binaries by expanding fingerprint sources, symbol parsing, and policy controls over heuristic findings.

**Design decisions**
- Build a new `RustFingerprintCatalog` signed and versioned, fed by Cargo crate metadata, community hash contributions, and curated fingerprints from StellaOps scans. Catalog lives under `StellaOps.Scanner.Analyzers.Lang.Rust.Fingerprints` with deterministic ordering.
- Extend `RustAnalyzerCollector` with symbol parsing (DWARF, ELF build IDs) via `SymbolGraphResolver`. Resolver correlates crate sections, monomorphized symbol prefixes, and `#[panic_handler]` markers to infer crate names and versions.
- Emit inference metadata (`fingerprintId`, `confidence`, `symbolEvidence[]`) alongside hashed fallbacks. Authoritative Cargo.lock data (when present) still wins in merges. A sketch of this shape follows the list.
- `Surface.Validation` adds toggles for fingerprint freshness and maximum catalog size per tenant. Offline bundles deliver catalog updates signed via DSSE.
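
A sketch of that inference metadata, with hypothetical names beyond the fields the design lists:

```csharp
using System.Collections.Generic;

// Hypothetical record; authoritative Cargo.lock evidence still supersedes these in merges.
public sealed record RustFingerprintMatch(
    string FingerprintId,                   // stable id from the signed catalog
    string CrateName,
    string? CrateVersion,
    double Confidence,                      // 0–1 heuristic score used for policy weighting
    IReadOnlyList<string> SymbolEvidence);  // matched symbol prefixes and section markers
```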

**Policy, CLI, docs**
- Policy predicates: `rust.fingerprint.confidence`, `rust.fingerprint.catalogAgeDays`. Templates show how to warn when only heuristic data exists, or fail if catalog updates are stale.
- CLI flag `--rust-fingerprint-detail` prints authoritative vs inferred crates, symbol samples, and guidance.
- Documentation (scanner module + policy guide) explains how inference is stored, how catalog publishing works, and how to tune policy weights.

**Testing, telemetry, rollout**
- Add fixtures for stripped Rust binaries across editions (2018–2024) and with/without LTO. Determinism tests compare catalog revisions and inference outputs.
- Metrics `scanner.rust.fingerprint.authoritative`, `scanner.rust.fingerprint.inferred`, `scanner.rust.fingerprint.catalog_version` feed dashboards and alerts.
- Offline kit updates include catalog packages, verification instructions, and waiver templates tied to predicate names.
### SCANNER-ENG-0007 — Deterministic secret leak detection pipeline
**Scope & goals**
- Provide first-party secret leak detection that matches competitor capabilities while preserving deterministic, offline-friendly execution and explainability.

**Design decisions**
- Introduce `StellaOps.Scanner.Analyzers.Secrets`, a restart-time plug-in that consumes rule bundles (`ruleset.tgz`) signed with DSSE and versioned (semantic version + hash). Bundles live under `plugins/scanner/secrets/rules/<version>`.
- Rule bundles contain deterministic regex/entropy definitions, context windows, and masking directives. A rule index is generated at build time to guarantee deterministic ordering.
- Analyzer executes after Surface validation of each file/layer. Files pass through a streaming matcher that outputs `SecretLeakEvidence` (rule id, severity, confidence, file path, byte ranges, masking applied). Findings persist in `ScanAnalysisStore` and align with DSSE exports. A matcher sketch follows the list.
- `Surface.Validation` introduces `scanner.secrets.rules.bundle`, `scanner.secrets.maxFileBytes`, and `scanner.secrets.targetGlobs`. `Surface.Secrets` supplies allowlist tokens (e.g., approved test keys) without exposing plaintext to analyzers.
- Events/attestations: findings optionally published via the existing Redis events, and Export Center bundles include masked evidence plus rule metadata.
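
A sketch of the streaming-match contract: deterministic rule application over file content, byte ranges for evidence, and masking via a stable digest so plaintext never leaves the matcher. All names are illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Text;
using System.Text.RegularExpressions;

// Hypothetical evidence shape mirroring the fields the design names.
public sealed record SecretLeakEvidence(
    string RuleId, string FilePath, int Start, int Length, string MaskedSample);

public static class SecretMatcherSketch
{
    public static IEnumerable<SecretLeakEvidence> Match(
        string ruleId, Regex pattern, string filePath, string content)
    {
        foreach (Match match in pattern.Matches(content))
        {
            // Persist only a stable digest prefix, never the raw captured value.
            var digest = Convert.ToHexString(
                SHA256.HashData(Encoding.UTF8.GetBytes(match.Value)));
            yield return new SecretLeakEvidence(
                ruleId, filePath, match.Index, match.Length, $"sha256:{digest[..12]}");
        }
    }
}
```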

**CLI, policy, docs**
- Add `stella secrets scan [path|image]` plus `--secrets` flag on `stella scan` to run the analyzer inline. CLI output redacts payloads, shows rule IDs, severity, and remediation hints.
- Policy Engine ingests `secret.leak` evidence, including `ruleId`, `confidence`, `masking.applied`, enabling predicates like `secret.leak.highConfidence`, `secret.leak.ruleDisabled`. Templates cover severities, approvals, and ticket automation.
- Documentation updates: scanner module dossier (new analyzer), policy cookbook (rule management), and Offline Kit guide (bundling rule updates).

**Testing, telemetry, rollout**
- Rule-pack regression tests ensure deterministic matching and masking; analyzer unit tests cover regex + entropy combos, while integration tests run across sample repositories and OCI layers.
- Metrics: `scanner.secrets.ruleset.version`, `scanner.secrets.findings.total`, `scanner.secrets.findings.high_confidence`. Logs include rule ID, masked hash, and file digests for auditing.
- Offline Kit delivers the signed ruleset catalog, upgrade guide, and policy defaults so fully air-gapped tenants can keep pace without internet access.

@@ -13,11 +13,16 @@

| `SCANNER-ENG-0012` | TODO | Evaluate Dart analyzer requirements (pubspec parsing, AOT artifacts) and split implementation tasks. | Language Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Dart) | — |
| `SCANNER-ENG-0013` | TODO | Plan Swift Package Manager coverage (Package.resolved, xcframeworks, runtime hints) with policy hooks. | Swift Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Swift) | — |
| `SCANNER-ENG-0014` | TODO | Align Kubernetes/VM target coverage between Scanner and Zastava per comparison findings; publish joint roadmap. | Runtime Guild, Zastava Guild (docs/modules/scanner) | — |
| `SCANNER-ENG-0015` | DOING (2025-11-09) | Document DSSE/Rekor operator enablement guidance and rollout levers surfaced in the gap analysis. | Export Center Guild, Scanner Guild (docs/modules/scanner) | — |
| `SCANNER-ENG-0016` | DOING (2025-11-02) | Implement `RubyLockCollector` + vendor cache ingestion per design §4.1–4.3. | Ruby Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Ruby) | SCANNER-ENG-0009 |
| `SCANNER-ENG-0017` | DONE (2025-11-09) | Build the runtime require/autoload graph builder with tree-sitter Ruby per design §4.4 and integrate EntryTrace hints. | Ruby Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Ruby) | SCANNER-ENG-0016 |
|
||||
| `SCANNER-ENG-0018` | DONE (2025-11-09) | Emit Ruby capability + framework surface signals as defined in design §4.5 with policy predicate hooks. | Ruby Analyzer Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Ruby) | SCANNER-ENG-0017 |
|
||||
| `SCANNER-ENG-0019` | DOING (2025-11-10) | Ship Ruby CLI verbs (`stella ruby inspect|resolve`) and Offline Kit packaging per design §4.6. | Ruby Analyzer Guild, CLI Guild (src/Scanner/StellaOps.Scanner.Analyzers.Lang.Ruby) | SCANNER-ENG-0016..0018 |
|
||||
| `SCANNER-LIC-0001` | DOING (2025-11-02) | Vet tree-sitter Ruby licensing + Offline Kit packaging requirements and document SPDX posture. | Scanner Guild, Legal Guild (docs/modules/scanner) | SCANNER-ENG-0016 |
|
||||
| `SCANNER-POLICY-0001` | TODO | Define Policy Engine predicates for Ruby groups/capabilities and align lattice weights. | Policy Guild, Ruby Analyzer Guild (docs/modules/scanner) | SCANNER-ENG-0018 |
|
||||
| `SCANNER-CLI-0001` | TODO | Coordinate CLI UX/help text for new Ruby verbs and update CLI docs/golden outputs. | CLI Guild, Ruby Analyzer Guild (src/Cli/StellaOps.Cli) | SCANNER-ENG-0019 |
|
||||
| `SCANNER-CLI-0001` | DOING (2025-11-09) | Coordinate CLI UX/help text for new Ruby verbs and update CLI docs/golden outputs. | CLI Guild, Ruby Analyzer Guild (src/Cli/StellaOps.Cli) | SCANNER-ENG-0019 |
|
||||
|
||||
### Updates — 2025-11-09

- `SCANNER-CLI-0001`: Completed the Spectre table-wrapping fix for runtime/lockfile columns, expanded Ruby resolve JSON assertions, removed ad-hoc debug artifacts, and drafted CLI docs covering `stellaops-cli ruby inspect|resolve`. Pending: final verification + handoff once docs/tests merge.
- `SCANNER-CLI-0001`: Wired `stellaops-cli ruby inspect|resolve` into `CommandFactory` so the verbs are available via `System.CommandLine` with the expected `--root`, `--image`/`--scan-id`, and `--format` options; `dotnet test ... --filter Ruby` passes.
@@ -13,4 +13,48 @@ This file now only tracks the runtime & signals status snapshot. Active backlog

| 140.C Signals | Signals Guild · Authority Guild (for scopes) · Runtime Guild | Sprint 120.A – AirGap; Sprint 130.A – Scanner | DOING | API skeleton and callgraph ingestion are active; the runtime facts endpoint still depends on the same shared prerequisites. |
| 140.D Zastava | Zastava Observer/Webhook Guilds · Security Guild | Sprint 120.A – AirGap; Sprint 130.A – Scanner | TODO | Surface.FS integration waits on Scanner surface caches; prep sealed-mode env helpers meanwhile. |

# Status snapshot (2025-11-09)

- **140.A Graph** – GRAPH-INDEX-28-007/008/009/010 remain TODO while Scanner surface artifacts and SBOM projection schemas are outstanding; no clustering/backfill/fixture work has started.
- **140.B SbomService** – Advisory AI, console, and orchestrator tracks stay TODO; SBOM-SERVICE-21-001..004 are BLOCKED until Concelier Link-Not-Merge (`CONCELIER-GRAPH-21-001`) + Cartographer schema (`CARTO-GRAPH-21-002`) land.
- **140.C Signals** – SIGNALS-24-001 is now complete (host, RBAC, sealed-mode readiness, `/signals/facts/{subject}`); SIGNALS-24-002 added callgraph retrieval APIs but still needs CAS promotion; SIGNALS-24-003 accepts JSON + NDJSON runtime uploads, yet NDJSON provenance/context wiring remains TODO. Scoring/cache work (SIGNALS-24-004/005) is still BLOCKED pending runtime feed availability (target 2025-11-09). An endpoint sketch follows this snapshot.
- **140.D Zastava** – ZASTAVA-ENV-01/02, ZASTAVA-SECRETS-01/02, and ZASTAVA-SURFACE-01/02 are still TODO because Surface.FS cache outputs from Scanner aren’t published; guilds are limited to design/prep.
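To make the ingestion/retrieval surface concrete, here is a hedged sketch of how such endpoints might be mapped with ASP.NET Core minimal APIs. `IRuntimeFactsService`, its method names, and the POST route are hypothetical; only the `/signals/facts/{subjectKey}` retrieval path is named in the notes above:

```csharp
var builder = WebApplication.CreateBuilder(args);
// builder.Services.AddSingleton<IRuntimeFactsService, ...>();  // real registration omitted
var app = builder.Build();

// Assumed ingestion route; JSON only, since NDJSON/gzip handling is the open TODO.
app.MapPost("/signals/runtime-facts", async (HttpRequest request, IRuntimeFactsService facts, CancellationToken ct) =>
{
    var result = await facts.IngestAsync(request.Body, ct);
    return Results.Accepted($"/signals/facts/{result.SubjectKey}");
});

app.MapGet("/signals/facts/{subjectKey}", async (string subjectKey, IRuntimeFactsService facts, CancellationToken ct) =>
    await facts.GetFactsAsync(subjectKey, ct) is { } payload
        ? Results.Ok(payload)
        : Results.NotFound());

app.Run();

// Hypothetical service contract; method names are assumptions for illustration.
public interface IRuntimeFactsService
{
    Task<RuntimeFactsResult> IngestAsync(Stream body, CancellationToken ct);
    Task<object?> GetFactsAsync(string subjectKey, CancellationToken ct);
}

public sealed record RuntimeFactsResult(string SubjectKey);
```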
# Blockers & coordination

- **Concelier Link-Not-Merge / Cartographer schemas** – SBOM-SERVICE-21-001..004 cannot start until `CONCELIER-GRAPH-21-001` and `CARTO-GRAPH-21-002` deliver the projection payloads.
- **Scanner surface artifacts** – GRAPH-INDEX-28-007+ and all ZASTAVA-SURFACE tasks depend on Sprint 130 analyzer outputs and cached layer metadata; need an updated ETA from the Scanner guild.
- **Signals host merge** – SIGNALS-24-003/004/005 remain blocked until SIGNALS-24-001/002 merge and Authority scope work (`AUTH-SIG-26-001`) is validated with the Runtime guild.

# Next actions (target: 2025-11-12)

| Owner(s) | Action |
| --- | --- |
| Graph Indexer Guild | Hold design sync with Scanner Surface + SBOM Service owners to lock artifact delivery dates; prep clustering job scaffolds so work can start once feeds land. |
| SBOM Service Guild | Finalize the projection schema doc with Concelier/Cartographer, then flip SBOM-SERVICE-21-001 to DOING and align SBOM-AIAI-31-001 with Sprint 111 requirements. |
| Signals Guild | Land SIGNALS-24-001/002 PRs, then immediately kick off SIGNALS-24-003; coordinate the scoring/cache roadmap with the Runtime + Data Science guilds. |
| Zastava Guilds | Draft the Surface.Env helper adoption plan and ensure Surface.Secrets references are wired so implementation can begin when Surface.FS caches publish. |

# Downstream dependency rollup (snapshot: 2025-11-09)

| Track | Dependent sprint(s) | Impact if delayed |
| --- | --- | --- |
| 140.A Graph | `docs/implplan/SPRINT_141_graph.md` (Graph clustering/backfill) and downstream Graph UI overlays | Graph insights, policy overlays, and runtime clustering views cannot progress without GRAPH-INDEX-28-007+ landing. |
| 140.B SbomService | `docs/implplan/SPRINT_142_sbomservice.md`, Advisory AI (Sprint 111), Policy/Vuln Explorer feeds | SBOM projections/events stay unavailable, blocking Advisory AI remediation heuristics, policy joins, and Vuln Explorer candidate generation. |
| 140.C Signals | `docs/implplan/SPRINT_143_signals.md` plus Runtime/Reachability dashboards | Reachability scoring, cache/event layers, and runtime facts outputs cannot start until SIGNALS-24-001/002 merge and Scanner runtime data flows. |
| 140.D Zastava | `docs/implplan/SPRINT_144_zastava.md`, Runtime admission enforcement | Surface-integrated drift/admission hooks remain stalled; sealed-mode env helpers cannot ship without Surface.FS metadata. |

# Risk log

| Risk | Impact | Mitigation / owner |
| --- | --- | --- |
| Concelier Link-Not-Merge schema slips | SBOM-SERVICE-21-001..004 + Advisory AI SBOM endpoints stay blocked | Concelier + Cartographer guilds to publish the CARTO-GRAPH-21-002 ETA during the next coordination call; SBOM guild to prep the schema doc meanwhile. |
| Scanner surface artifact delay | GRAPH-INDEX-28-007+ and ZASTAVA-SURFACE-* cannot even start | Scanner guild to deliver the analyzer artifact roadmap; Graph/Zastava teams to prepare mocks/tests in advance. |
| Signals host/callgraph merge misses 2025-11-09 | SIGNALS-24-003/004/005 remain blocked, pushing reachability scoring past sprint goals | Signals + Authority guilds to prioritize the AUTH-SIG-26-001 review and merge SIGNALS-24-001/002 before the 2025-11-10 standup. |

# Coordination log

| Date | Notes |
| --- | --- |
| 2025-11-09 | Sprint 140 snapshot refreshed; awaiting Scanner surface artifact ETA, Concelier/CARTO schema delivery, and Signals host merge before any wave can advance to DOING. |
# Sprint 140 - Runtime & Signals

@@ -10,11 +10,14 @@ Notes:

- 2025-10-29: JSON parsers for Java/Node.js/Python/Go implemented; artifacts stored on the filesystem with SHA-256 and callgraphs upserted into Mongo.

Task ID | State | Task description | Owners (Source)
--- | --- | --- | ---
SIGNALS-24-001 | DONE (2025-11-09) | Stand up Signals API skeleton with RBAC, sealed-mode config, DPoP/mTLS enforcement, and `/facts` scaffolding so downstream ingestion work can begin. Dependencies: AUTH-SIG-26-001. | Signals Guild, Authority Guild (src/Signals/StellaOps.Signals)
> 2025-11-09: Signals host now registers sealed-mode evidence validation, exposes `/readyz`/`/status` indicators, enforces scope policies, and adds `/signals/facts/{subjectKey}` retrieval plus runtime-facts ingestion backing services.
SIGNALS-24-002 | DOING (2025-11-07) | Implement callgraph ingestion/normalization (Java/Node/Python/Go) with CAS persistence and retrieval APIs to feed reachability scoring. Dependencies: SIGNALS-24-001. | Signals Guild (src/Signals/StellaOps.Signals)
> 2025-11-09: Added `/signals/callgraphs/{id}` retrieval, sealed-mode gating, and CAS-backed artifact metadata responses; remaining work is CAS bucket promotion + signed graph manifests.
SIGNALS-24-003 | DOING (2025-11-09) | Implement runtime facts ingestion endpoint and normalizer (process, sockets, container metadata) populating `context_facts` with AOC provenance.<br>2025-11-09: Initial JSON ingestion service + persistence landed; NDJSON/gzip + context enrichment remain TODO. | Signals Guild, Runtime Guild (src/Signals/StellaOps.Signals)
> 2025-11-07: Waiting on SIGNALS-24-001 / SIGNALS-24-002 DOING work to land before flipping this to DOING.
> 2025-11-07: Upstream SIGNALS-24-001 / SIGNALS-24-002 are now DOING; this flips to DOING once host + callgraph ingestion merge.
> 2025-11-08: Targeting a 2025-11-09 merge for SIGNALS-24-001/002; schema + AOC contract drafted so SIGNALS-24-003 can move to DOING immediately after those PRs land (dependencies confirmed, none missing).
> 2025-11-09: Added the runtime facts ingestion service + endpoint, aggregated runtime hit storage, and unit tests; next steps are NDJSON/gzip ingestion and provenance metadata wiring (a reader sketch follows below).
SIGNALS-24-004 | BLOCKED (2025-10-27) | Deliver reachability scoring engine producing states/scores and writing to `reachability_facts`; expose configuration for weights. Dependencies: SIGNALS-24-003.<br>2025-10-27: Upstream ingestion pipelines (`SIGNALS-24-002/003`) blocked; scoring engine cannot proceed. | Signals Guild, Data Science (src/Signals/StellaOps.Signals)
SIGNALS-24-005 | BLOCKED (2025-10-27) | Implement Redis caches (`reachability_cache:*`), invalidation on new facts, and publish `signals.fact.updated` events. Dependencies: SIGNALS-24-004.<br>2025-10-27: Awaiting scoring engine and ingestion layers before wiring cache/events. | Signals Guild, Platform Events Guild (src/Signals/StellaOps.Signals)
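For the NDJSON work still open above, a minimal sketch of line-delimited parsing, assuming a simple `RuntimeFact` DTO; the actual reader and schema in the Signals codebase may differ:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

// Hypothetical DTO; the real payload carries process/socket/container fields.
public sealed record RuntimeFact(string SubjectKey, string Kind, DateTimeOffset ObservedAt);

public static class NdjsonSketch
{
    private static readonly JsonSerializerOptions Options = new(JsonSerializerDefaults.Web);

    // Reads one JSON document per line, skipping blank separator lines.
    public static async IAsyncEnumerable<RuntimeFact> ReadAsync(Stream stream)
    {
        using var reader = new StreamReader(stream);
        string? line;
        while ((line = await reader.ReadLineAsync()) is not null)
        {
            if (string.IsNullOrWhiteSpace(line))
            {
                continue;
            }

            yield return JsonSerializer.Deserialize<RuntimeFact>(line, Options)
                ?? throw new JsonException("Runtime fact line deserialized to null.");
        }
    }
}
```

For gzip (the other open item), wrapping the input in a `GZipStream` before handing it to the reader is the obvious extension.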
@@ -189,6 +189,137 @@ Replays the AOC guard against stored raw documents. By default it checks all adv

}
```

---
## 4 · `stella node lock-validate`

### 4.1 Synopsis

```bash
stella node lock-validate \
  [--path <directory>] \
  [--format table|json] \
  [--verbose]
```

### 4.2 Description

Runs the Node analyzer locally against a working directory to compare lockfiles (`package-lock.json`, `pnpm-lock.yaml`, `yarn.lock`) with what is actually present in `node_modules`. The command is read-only and never schedules a scan; it reuses the same deterministic collector that powers Scanner, so results match backend evidence. Output highlights two conditions that policy cares about:

- **Declared Only** – packages present in lockfiles but missing from the filesystem or final image.
- **Missing Lock** – packages discovered at runtime without corresponding lock metadata (no registry provenance, integrity hash, or repository information).

This helps catch drift before images are built, keeps lockfiles trustworthy, and feeds policy predicates such as `node.lock.declaredMissing`.

### 4.3 Options

| Option | Description |
|--------|-------------|
| `--path`, `-p` | Directory containing `package.json` and lockfiles. Defaults to the current working directory. |
| `--format table|json` | `table` (default) renders a Spectre table with status badges; `json` prints the underlying report for CI automation. |
| `--verbose` | Enables detailed logging (shared root option). |
### 4.4 Output & exit codes

- `table` mode prints a summary row and two sections: `Declared Only` (red) and `Missing Lock` (yellow). Columns show package, version, lock source/locator, and filesystem path so engineers can reconcile quickly.
- `json` mode emits `{ declaredOnly: [], missingLockMetadata: [], totalDeclared, totalInstalled }`, mirroring the analyzer telemetry; a sketch of that shape follows.
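As a reading aid, a hedged C# mirror of the JSON report shape; the top-level property names follow the keys above, while the per-entry fields are illustrative assumptions:

```csharp
using System.Collections.Generic;

// Assumed mirror of the `--format json` payload; not the CLI's actual type.
public sealed record LockValidationReportSketch(
    IReadOnlyList<LockDriftEntry> DeclaredOnly,        // in a lockfile, absent on disk
    IReadOnlyList<LockDriftEntry> MissingLockMetadata, // on disk, absent from lockfiles
    int TotalDeclared,
    int TotalInstalled);

// Hypothetical per-package entry.
public sealed record LockDriftEntry(string Package, string? Version, string? LockSource, string? Path);
```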
Exit codes:

| Code | Meaning |
|------|---------|
| `0` | No inconsistencies detected. |
| `1` | Declared-only or missing-lock packages were found. |
| `71` | The requested directory could not be read (missing path, permissions, etc.). |

The CLI also records `stellaops.cli.node.lock_validate.count{outcome}` so operators can monitor adoption in telemetry.

### 4.5 Offline notes

- Works entirely offline; point `--path` at a workspace checked out from an Offline Kit or build cache.
- Honors the same `Surface.Validation` limits configured for Scanner once those knobs (`scanner.lockfiles.node.*`) are deployed cluster-wide.
- Combine with `stella scan` by running lock validation in CI before images are built to fail fast on inconsistent manifests.

---
## 5 · `stella python lock-validate`

### 5.1 Synopsis

```bash
stella python lock-validate \
  [--path <directory>] \
  [--format table|json] \
  [--verbose]
```

### 5.2 Description

Validates Python lockfiles (currently `requirements*.txt`, `Pipfile.lock`, and `poetry.lock`) against what exists in `site-packages`. It uses the same analyzer Scanner runs, so declared-only packages, missing locks, and editable installs are detected deterministically and without internet access. This catches drift between lock manifests and baked images before scanners or policy gates fail later.

### 5.3 Options

| Option | Description |
|--------|-------------|
| `--path`, `-p` | Directory containing `lib/python*/site-packages` and lockfiles. Defaults to `$PWD`. |
| `--format table|json` | `table` (default) prints a human summary; `json` emits the raw report for CI. |
| `--verbose` | Enables detailed logging. |

### 5.4 Output & exit codes

Output shape mirrors the Node command: declared-only packages are shown with lock provenance, and runtime packages missing lock metadata are highlighted separately. JSON mode returns the same object schema `{ declaredOnly, missingLockMetadata, totalDeclared, totalInstalled }`.

Exit codes follow the same contract (`0` success, `1` violations, `71` for an unreadable path). Telemetry is published via `stellaops.cli.python.lock_validate.count{outcome}`.

### 5.5 Offline notes

- Works entirely offline; lockfiles and `site-packages` must already be present (from a venv snapshot, container rootfs, or Offline Kit).
- Honors the upcoming `scanner.lockfiles.python.*` guardrails once Surface.Validation is wired in, so CLI + Scanner enforce the same registry/size limits.
- Recommended CI flow: run `stella python lock-validate` before building containers and fail fast when declared-only packages remain.
## 6 · `stella java lock-validate`

### 6.1 Synopsis

```bash
stella java lock-validate \
  [--path <directory>] \
  [--format table|json] \
  [--verbose]
```

### 6.2 Description

Executes the Java language analyzer locally so Gradle `gradle.lockfile`, `gradle/dependency-locks/**/*.lockfile`, and `pom.xml` declarations can be compared with the jars that actually ship in a workspace. The command reuses the new `JavaLockFileCollector` plus the `JavaLanguageAnalyzer` merge logic, so it emits the same `Declared Only` and `Missing Lock` evidence that Scanner and Policy consume. Engineers can see which coordinates exist only in lockfiles (no jar on disk) and which installed jars lack lock metadata (no repository/provenance) before a scan ever runs.

### 6.3 Options

| Option | Description |
|--------|-------------|
| `--path`, `-p` | Directory containing jars (e.g., `build/libs`) and lockfiles. Defaults to the current working directory. |
| `--format table|json` | `table` (default) renders the Spectre table; `json` outputs the raw `LockValidationReport`. |
| `--verbose` | Enables detailed logging and surfaces the analyzer paths being inspected. |

### 6.4 Output & exit codes

Output mirrors the Node/Python verbs: `Declared Only` rows include the lock source/locator (e.g., `gradle.lockfile`, `gradle/dependency-locks/app.lockfile`) plus configuration/repository hints, while `Missing Lock` rows highlight jars that Scanner would tag with `lockMissing=true`. JSON responses return `{ declaredOnly, missingLockMetadata, totalDeclared, totalInstalled }`.

Exit codes align with the other lock validators:

| Code | Meaning |
|------|---------|
| `0` | No inconsistencies detected. |
| `1` | Declared-only or missing-lock jars detected. |
| `71` | Directory could not be read. |

Telemetry is recorded via `stellaops.cli.java.lock_validate.count{outcome}` so adoption can be monitored alongside the Node/Python verbs.

### 6.5 Offline notes

- Works with any workspace (Gradle, Maven, or extracted container layers) – no network access or build tool metadata is required at runtime.
- Honors forthcoming `scanner.lockfiles.java.*` Surface.Validation limits once they are deployed so CLI + Scanner stay in lockstep.
- Recommended CI flow: run `stella java lock-validate` before packaging containers to surface missing locks/declared-only coordinates early.

### 3.5 Exit codes

| Exit code | Meaning |
@@ -2,7 +2,10 @@

Scanner analyses container images layer-by-layer, producing deterministic SBOM fragments, diffs, and signed reports.

## Latest updates (2025-11-09)

- Node analyzer now ingests npm/yarn/pnpm lockfiles, emitting `DeclaredOnly` components with lock provenance. The CLI companion command `stella node lock-validate` runs the collector offline, surfaces declared-only or missing-lock packages, and emits telemetry via `stellaops.cli.node.lock_validate.count`.
- Python analyzer picks up `requirements*.txt`, `Pipfile.lock`, and `poetry.lock`, tagging installed distributions with lock provenance and generating declared-only components for policy. Use `stella python lock-validate` to run the same checks locally before images are built.
- Java analyzer now parses `gradle.lockfile`, `gradle/dependency-locks/**/*.lockfile`, and `pom.xml` dependencies via the new `JavaLockFileCollector`, merging lock metadata onto jar evidence and emitting declared-only components when jars are absent. The new CLI verb `stella java lock-validate` reuses that collector offline (table/JSON output) and records `stellaops.cli.java.lock_validate.count{outcome}` for observability.
- Worker/WebService now resolve cache roots and feature flags via `StellaOps.Scanner.Surface.Env`; misconfiguration warnings are documented in `docs/modules/scanner/design/surface-env.md` and surfaced through startup validation.
- Platform events rollout (2025-10-19) continues to publish `scanner.report.ready@1` and `scanner.scan.completed@1` envelopes with embedded DSSE payloads (see `docs/updates/2025-10-19-scanner-policy.md` and `docs/updates/2025-10-19-platform-events.md`). Service and consumer tests should round-trip the canonical samples under `docs/events/samples/`.
@@ -33,6 +36,7 @@ Scanner analyses container images layer-by-layer, producing deterministic SBOM f

- ./operations/rustfs-migration.md
- ./operations/entrypoint.md
- ./operations/secret-leak-detection.md
- ./operations/dsse-rekor-operator-guide.md
- ./design/macos-analyzer.md
- ./design/windows-analyzer.md
- ../benchmarks/scanner/deep-dives/macos.md
docs/modules/scanner/operations/dsse-rekor-operator-guide.md (new file, 171 lines)
@@ -0,0 +1,171 @@
# DSSE & Rekor Operator Enablement Guide

> **Audience.** Scanner / Export Center operators, platform SREs, and field engineers bringing DSSE attestations + Rekor proofs into production (online or air-gapped).
>
> **Sources.** Aligns with the Sprint 138 (SCANNER-ENG-0015) gap analysis (§DSSE/Rekor operator enablement) and Scanner architecture specs.

---

## 1. Why this matters

- **Evidence on demand.** Every SBOM, diff, and report can be bound to a DSSE envelope issued by `StellaOps.Signer`, logged to Rekor via `StellaOps.Attestor`, and bundled for export/offline use.
- **Policy leverage.** Policy Engine predicates gate releases until attestations exist *and* their Rekor proofs verify, reducing “unsigned” drift.
- **Regulatory readiness.** Operators need a deterministic playbook to satisfy PCI, FedRAMP, EU CRA, and national sovereignty requirements without phoning home.

---

## 2. Components & responsibilities

| Component | Role | Key references |
|-----------|------|----------------|
| `StellaOps.Signer` | Issues DSSE envelopes using PoE-scoped keys (Fulcio or BYO KMS/HSM). | `ops/devops/signing/` |
| `StellaOps.Attestor` | Submits DSSE payloads to Rekor v2, caches `{uuid,index,proof}`, and mirrors proofs offline. | `docs/modules/attestor/architecture.md` |
| Rekor v2 (managed or self-hosted) | Transparency log providing UUIDs + inclusion proofs. | `docs/ops/rekor/README.md` (if self-hosted) |
| `StellaOps.Scanner` (WebService/Worker) | Requests attestations per scan, stores Rekor metadata next to SBOM artefacts. | `docs/modules/scanner/architecture.md` |
| Export Center | Packages DSSE payloads + proofs into Offline Kit bundles and mirrors license notices. | `docs/modules/export-center/architecture.md` |
| Policy Engine + CLI | Enforce “attested only” promotion; expose CLI verification verbs. | `docs/modules/policy/architecture.md`, `docs/09_API_CLI_REFERENCE.md` |

---

## 3. Prerequisites checklist

1. **Keys & trust roots**
   - Fulcio / KMS credentials available to `StellaOps.Signer`.
   - Rekor public key pinned (`rekor.pub`) for verification jobs and CLI tooling.
2. **Service wiring**
   - `scanner.attestation.signerEndpoint` → internal Signer base URL.
   - `scanner.attestation.attestorEndpoint` → Attestor base URL.
   - `attestor.rekor.api` & `attestor.rekor.pubkey` set for the target log.
3. **Storage**
   - Mongo collections `attestations` & `rekorProofs` sized for retention (7–30 days recommended).
   - Object store tier with at-rest encryption for DSSE payloads.
4. **Observability**
   - Metrics: `attestor_rekor_success_total`, `attestor_rekor_retry_total`, `rekor_inclusion_latency`.
   - Logs shipped to your SIEM for compliance (Signer request/response IDs, Rekor UUIDs).
5. **Offline readiness**
   - Export Center profile with `attestations.bundle=true`.
   - Rekor log snapshots mirrored (ORAS bundle or rsync of `/var/log/rekor`) for disconnected verification.

---

## 4. Enablement workflow

### 4.1 Configure Signer & Attestor

```yaml
signer:
  schemaVersion: 2
  keyProvider: kms-fleet
  attestorEndpoint: https://attestor.internal
  defaultPredicate: https://stella-ops.org/attestations/sbom/1

attestor:
  schemaVersion: 1
  rekor:
    api: https://rekor.internal
    publicKeyPath: /etc/rekor/rekor.pub
    offlineMirrorPath: /var/lib/rekor/snapshots
    retry:
      maxAttempts: 5
      backoffSeconds: 15
```
### 4.2 Turn on Scanner enforcement

```yaml
scanner:
  schemaVersion: 2
  attestation:
    requireDsse: true        # fail scans when Signer/Attestor errors occur
    signerEndpoint: https://signer.internal
    attestorEndpoint: https://attestor.internal
    uploadArtifacts: true    # store DSSE + proof next to SBOM artefacts
```

Set `requireDsse=false` during observation, then flip it to `true` once Rekor health SLOs are green. A binding sketch follows.
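For services consuming this block, a hedged sketch of binding it with the standard .NET options pattern; the options type and its property names are assumptions, not the shipped configuration model:

```csharp
// Hypothetical options type mirroring scanner.attestation; names are assumptions.
public sealed class ScannerAttestationOptions
{
    public bool RequireDsse { get; set; }
    public Uri? SignerEndpoint { get; set; }
    public Uri? AttestorEndpoint { get; set; }
    public bool UploadArtifacts { get; set; }
}

// Startup registration (Program.cs style), assuming the YAML is loaded into IConfiguration:
// builder.Services.Configure<ScannerAttestationOptions>(
//     builder.Configuration.GetSection("scanner:attestation"));
```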
### 4.3 Policy templates

Add Policy Engine predicates (Rego snippet):

```rego
package stella.policies.attestation

deny[msg] {
  not input.attestations.rekor_verified
  msg := sprintf("missing Rekor proof for %s", [input.scan_id])
}

warn[msg] {
  input.attestations.rekor_age_hours > 24
  msg := sprintf("Rekor proof older than 24h for %s", [input.scan_id])
}
```

Tie Scheduler or CI promotion gates to the `deny` result.

### 4.4 CLI and verification

- `stellaops-cli runtime policy test --image <digest> --json` already surfaces `attestation.uuid` and `rekorVerified` fields.
- To validate bundles offline: `stellaops-cli attest verify --bundle path/to/export.tar --rekor-key rekor.pub`.

Document these flows for AppSec teams so they can self-serve proofs during audits.

### 4.5 Export Center profile

```yaml
exportProfiles:
  secure-default:
    includeSboms: true
    includeAttestations: true
    includeRekorProofs: true
    policy:
      requireAttestations: true
      allowUnsigned: false
```

---

## 5. Rollout levers & phases

| Phase | Toggle | Goal |
|-------|--------|------|
| **Observe** | `scanner.attestation.requireDsse=false`, policies in `warn` mode. | Validate plumbing without blocking builds; capture metrics. |
| **Enforce** | Flip `requireDsse=true`, policy `deny` for missing proofs, Rekor SLO alerts live. | Block unsigned artefacts; auto-retry Attestor failures. |
| **Escalate** | Export Center profile `includeAttestations=true`, CLI docs distributed, Notify alerts wired. | Broad communication + audit evidence ready. |

Roll forward per environment; keep the previous phase’s toggles available for hot rollback.

---

## 6. Offline / air-gap guidance

1. **Mirror Rekor**: take log snapshots daily (`rekor-cli log export`) and add them to the Offline Kit.
2. **Bundle proofs**: Export Center must include `*.rekor.json` and `rekor-chain.pem` alongside DSSE envelopes.
3. **CLI verification offline**:
   ```bash
   stellaops-cli attest verify --bundle offline-kit.tar \
     --rekor-root hashsum.txt --rekor-tree treehead.json --rekor-key rekor.pub
   ```
4. **Fallback**: when Rekor connectivity is unavailable, Attestor queues submissions locally and emits `attestationPending=true`; policy can allow waivers for a limited TTL via `policy.attestations.deferHours`.

---

## 7. Troubleshooting

| Symptom | Checks | Resolution |
|---------|--------|------------|
| `attestationPending` flag stays true | `attestor_rekor_retry_total`, Attestor logs, Rekor `/healthz`. | Verify the Rekor endpoint & certs; rotate API tokens; replay queued DSSE payloads via `attestor replay`. |
| Policy denies despite DSSE | Confirm the Rekor proof bundle is stored under `/artifacts/<scanId>/rekor/`. | Re-run `stellaops-cli attest verify`; ensure Policy Engine has the new schema (`attestations.rekor_verified`). |
| CLI verification fails offline | Ensure the Rekor snapshot + `rekor.pub` shipped together; check the timestamp gap. | Regenerate the snapshot, or import Rekor entries into the isolated log before verifying. |

---

## References

- Gap analysis: `docs/benchmarks/scanner/scanning-gaps-stella-misses-from-competitors.md#dsse-rekor-operator-enablement-trivy-grype-snyk`
- Scanner architecture (§Signer → Attestor → Rekor): `docs/modules/scanner/architecture.md`
- Export Center profiles: `docs/modules/export-center/architecture.md`
- Policy Engine predicates: `docs/modules/policy/architecture.md`
- CLI reference: `docs/09_API_CLI_REFERENCE.md`
@@ -20,10 +20,16 @@ Signals:

  BypassNetworks:
    - "127.0.0.1/32"
    - "::1/128"
  Mongo:
    ConnectionString: "mongodb://localhost:27017/signals"
    Database: "signals"
    CallgraphsCollection: "callgraphs"
    ReachabilityFactsCollection: "reachability_facts"
  Storage:
    RootPath: "../data/signals-artifacts"
  AirGap:
    SealedMode:
      EnforcementEnabled: false
      EvidencePath: "../ops/devops/sealed-mode-ci/artifacts/sealed-mode-ci/latest/signals-sealed-ci.json"
      MaxEvidenceAge: "06:00:00"
      CacheLifetime: "00:01:00"
@@ -0,0 +1,21 @@

{
  "schemaVersion": "1.0",
  "id": "stellaops.analyzer.lang.deno",
  "displayName": "StellaOps Deno Analyzer",
  "version": "0.1.0",
  "requiresRestart": true,
  "entryPoint": {
    "type": "dotnet",
    "assembly": "StellaOps.Scanner.Analyzers.Lang.Deno.dll",
    "typeName": "StellaOps.Scanner.Analyzers.Lang.Deno.DenoAnalyzerPlugin"
  },
  "capabilities": [
    "language-analyzer",
    "deno"
  ],
  "metadata": {
    "org.stellaops.analyzer.language": "deno",
    "org.stellaops.analyzer.kind": "language",
    "org.stellaops.restart.required": "true"
  }
}
@@ -28,14 +28,15 @@ internal static class CommandFactory

{
    TreatUnmatchedTokensAsErrors = true
};
root.Add(verboseOption);

root.Add(BuildScannerCommand(services, verboseOption, cancellationToken));
root.Add(BuildScanCommand(services, options, verboseOption, cancellationToken));
root.Add(BuildRubyCommand(services, verboseOption, cancellationToken));
root.Add(BuildDatabaseCommand(services, verboseOption, cancellationToken));
root.Add(BuildSourcesCommand(services, verboseOption, cancellationToken));
root.Add(BuildAocCommand(services, verboseOption, cancellationToken));
root.Add(BuildAuthCommand(services, options, verboseOption, cancellationToken));
root.Add(BuildPolicyCommand(services, options, verboseOption, cancellationToken));
root.Add(BuildTaskRunnerCommand(services, verboseOption, cancellationToken));
root.Add(BuildFindingsCommand(services, verboseOption, cancellationToken));

@@ -177,14 +178,82 @@ internal static class CommandFactory

scan.Add(entryTrace);

scan.Add(run);
scan.Add(upload);
return scan;
}
private static Command BuildRubyCommand(IServiceProvider services, Option<bool> verboseOption, CancellationToken cancellationToken)
{
    var ruby = new Command("ruby", "Work with Ruby analyzer outputs.");

    var inspect = new Command("inspect", "Inspect a local Ruby workspace.");
    var inspectRootOption = new Option<string?>("--root")
    {
        Description = "Path to the Ruby workspace (defaults to current directory)."
    };
    var inspectFormatOption = new Option<string?>("--format")
    {
        Description = "Output format (table or json)."
    };

    inspect.Add(inspectRootOption);
    inspect.Add(inspectFormatOption);
    inspect.SetAction((parseResult, _) =>
    {
        var root = parseResult.GetValue(inspectRootOption);
        var format = parseResult.GetValue(inspectFormatOption) ?? "table";
        var verbose = parseResult.GetValue(verboseOption);

        return CommandHandlers.HandleRubyInspectAsync(
            services,
            root,
            format,
            verbose,
            cancellationToken);
    });

    var resolve = new Command("resolve", "Fetch Ruby packages for a completed scan.");
    var resolveImageOption = new Option<string?>("--image")
    {
        Description = "Image reference (digest or tag) used by the scan."
    };
    var resolveScanIdOption = new Option<string?>("--scan-id")
    {
        Description = "Explicit scan identifier."
    };
    var resolveFormatOption = new Option<string?>("--format")
    {
        Description = "Output format (table or json)."
    };

    resolve.Add(resolveImageOption);
    resolve.Add(resolveScanIdOption);
    resolve.Add(resolveFormatOption);
    resolve.SetAction((parseResult, _) =>
    {
        var image = parseResult.GetValue(resolveImageOption);
        var scanId = parseResult.GetValue(resolveScanIdOption);
        var format = parseResult.GetValue(resolveFormatOption) ?? "table";
        var verbose = parseResult.GetValue(verboseOption);

        return CommandHandlers.HandleRubyResolveAsync(
            services,
            image,
            scanId,
            format,
            verbose,
            cancellationToken);
    });

    ruby.Add(inspect);
    ruby.Add(resolve);
    return ruby;
}
private static Command BuildKmsCommand(IServiceProvider services, Option<bool> verboseOption, CancellationToken cancellationToken)
{
    var kms = new Command("kms", "Manage file-backed signing keys.");

    var export = new Command("export", "Export key material to a portable bundle.");
    var exportRootOption = new Option<string>("--root")
    {
File diff suppressed because it is too large
@@ -20,7 +20,8 @@ using StellaOps.Auth.Client;

using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.AdvisoryAi;
using StellaOps.Cli.Services.Models.Ruby;
using StellaOps.Cli.Services.Models.Transport;

namespace StellaOps.Cli.Services;

@@ -858,9 +859,9 @@ internal sealed class BackendOperationsClient : IBackendOperationsClient

    return MapPolicyFindingExplain(document);
}

public async Task<EntryTraceResponseModel?> GetEntryTraceAsync(string scanId, CancellationToken cancellationToken)
{
    EnsureBackendConfigured();

    if (string.IsNullOrWhiteSpace(scanId))
    {

@@ -882,15 +883,46 @@ internal sealed class BackendOperationsClient : IBackendOperationsClient

    throw new InvalidOperationException(failure);
}

var result = await response.Content.ReadFromJsonAsync<EntryTraceResponseModel>(SerializerOptions, cancellationToken).ConfigureAwait(false);
if (result is null)
{
    throw new InvalidOperationException("EntryTrace response payload was empty.");
}

return result;
}
public async Task<IReadOnlyList<RubyPackageArtifactModel>> GetRubyPackagesAsync(string scanId, CancellationToken cancellationToken)
{
    EnsureBackendConfigured();

    if (string.IsNullOrWhiteSpace(scanId))
    {
        throw new ArgumentException("Scan identifier is required.", nameof(scanId));
    }

    using var request = CreateRequest(HttpMethod.Get, $"api/scans/{scanId}/ruby-packages");
    await AuthorizeRequestAsync(request, cancellationToken).ConfigureAwait(false);

    using var response = await _httpClient.SendAsync(request, cancellationToken).ConfigureAwait(false);
    if (response.StatusCode == HttpStatusCode.NotFound)
    {
        // A 404 means the scan has no Ruby package artifacts; treat it as empty rather than an error.
        return Array.Empty<RubyPackageArtifactModel>();
    }

    if (!response.IsSuccessStatusCode)
    {
        var failure = await CreateFailureMessageAsync(response, cancellationToken).ConfigureAwait(false);
        throw new InvalidOperationException(failure);
    }

    var packages = await response.Content
        .ReadFromJsonAsync<IReadOnlyList<RubyPackageArtifactModel>>(SerializerOptions, cancellationToken)
        .ConfigureAwait(false);

    return packages ?? Array.Empty<RubyPackageArtifactModel>();
}

public async Task<AdvisoryPipelinePlanResponseModel> CreateAdvisoryPipelinePlanAsync(
    AdvisoryAiTaskType taskType,
    AdvisoryPipelinePlanRequestModel request,
@@ -5,6 +5,7 @@ using System.Threading.Tasks;

using StellaOps.Cli.Configuration;
using StellaOps.Cli.Services.Models;
using StellaOps.Cli.Services.Models.AdvisoryAi;
using StellaOps.Cli.Services.Models.Ruby;

namespace StellaOps.Cli.Services;

@@ -48,6 +49,8 @@ internal interface IBackendOperationsClient

Task<EntryTraceResponseModel?> GetEntryTraceAsync(string scanId, CancellationToken cancellationToken);

Task<IReadOnlyList<RubyPackageArtifactModel>> GetRubyPackagesAsync(string scanId, CancellationToken cancellationToken);

Task<AdvisoryPipelinePlanResponseModel> CreateAdvisoryPipelinePlanAsync(AdvisoryAiTaskType taskType, AdvisoryPipelinePlanRequestModel request, CancellationToken cancellationToken);

Task<AdvisoryPipelineOutputModel?> TryGetAdvisoryPipelineOutputAsync(string cacheKey, AdvisoryAiTaskType taskType, string profile, CancellationToken cancellationToken);
@@ -0,0 +1,28 @@

using System.Collections.Generic;
using System.Text.Json.Serialization;

namespace StellaOps.Cli.Services.Models.Ruby;

internal sealed record RubyPackageArtifactModel(
    [property: JsonPropertyName("id")] string Id,
    [property: JsonPropertyName("name")] string Name,
    [property: JsonPropertyName("version")] string? Version,
    [property: JsonPropertyName("source")] string? Source,
    [property: JsonPropertyName("platform")] string? Platform,
    [property: JsonPropertyName("groups")] IReadOnlyList<string>? Groups,
    [property: JsonPropertyName("declaredOnly")] bool? DeclaredOnly,
    [property: JsonPropertyName("runtimeUsed")] bool? RuntimeUsed,
    [property: JsonPropertyName("provenance")] RubyPackageProvenance? Provenance,
    [property: JsonPropertyName("runtime")] RubyPackageRuntime? Runtime,
    [property: JsonPropertyName("metadata")] IDictionary<string, string?>? Metadata);

internal sealed record RubyPackageProvenance(
    [property: JsonPropertyName("source")] string? Source,
    [property: JsonPropertyName("lockfile")] string? Lockfile,
    [property: JsonPropertyName("locator")] string? Locator);

internal sealed record RubyPackageRuntime(
    [property: JsonPropertyName("entrypoints")] IReadOnlyList<string>? Entrypoints,
    [property: JsonPropertyName("files")] IReadOnlyList<string>? Files,
    [property: JsonPropertyName("reasons")] IReadOnlyList<string>? Reasons);
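As a quick sanity check on the wire contract above, a hedged round-trip with `System.Text.Json`; the sample payload is illustrative, not a captured API response, and assumes access to the internal model type:

```csharp
using System;
using System.Text.Json;

var json = """
{ "id": "gem:rack@3.0.8", "name": "rack", "version": "3.0.8", "declaredOnly": false }
""";

// JsonPropertyName attributes pin the wire names; web defaults are shown because
// the backend client reads responses with camelCase-friendly options.
var model = JsonSerializer.Deserialize<RubyPackageArtifactModel>(
    json, new JsonSerializerOptions(JsonSerializerDefaults.Web));

Console.WriteLine($"{model!.Name} {model.Version}");
```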
@@ -47,6 +47,13 @@

    <ProjectReference Include="../../Authority/StellaOps.Authority/StellaOps.Auth.Client/StellaOps.Auth.Client.csproj" />
    <ProjectReference Include="../../__Libraries/StellaOps.Plugin/StellaOps.Plugin.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.EntryTrace/StellaOps.Scanner.EntryTrace.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang/StellaOps.Scanner.Analyzers.Lang.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Node/StellaOps.Scanner.Analyzers.Lang.Node.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Python/StellaOps.Scanner.Analyzers.Lang.Python.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Ruby/StellaOps.Scanner.Analyzers.Lang.Ruby.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Java/StellaOps.Scanner.Analyzers.Lang.Java.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.Surface.Env/StellaOps.Scanner.Surface.Env.csproj" />
    <ProjectReference Include="../../Scanner/__Libraries/StellaOps.Scanner.Surface.Validation/StellaOps.Scanner.Surface.Validation.csproj" />
  </ItemGroup>

  <ItemGroup Condition="'$(StellaOpsEnableCryptoPro)' == 'true'">
src/Cli/StellaOps.Cli/TASKS.md (new file, 6 lines)
@@ -0,0 +1,6 @@
# CLI Guild — Active Tasks

| Task ID | State | Notes |
| --- | --- | --- |
| `SCANNER-CLI-0001` | DOING (2025-11-09) | Add Ruby-specific verbs/help, refresh docs & goldens per Sprint 138. |
@@ -21,6 +21,11 @@ internal static class CliMetrics

private static readonly Counter<long> PolicyFindingsGetCounter = Meter.CreateCounter<long>("stellaops.cli.policy.findings.get.count");
private static readonly Counter<long> PolicyFindingsExplainCounter = Meter.CreateCounter<long>("stellaops.cli.policy.findings.explain.count");
private static readonly Counter<long> AdvisoryRunCounter = Meter.CreateCounter<long>("stellaops.cli.advisory.run.count");
private static readonly Counter<long> NodeLockValidateCounter = Meter.CreateCounter<long>("stellaops.cli.node.lock_validate.count");
private static readonly Counter<long> PythonLockValidateCounter = Meter.CreateCounter<long>("stellaops.cli.python.lock_validate.count");
private static readonly Counter<long> JavaLockValidateCounter = Meter.CreateCounter<long>("stellaops.cli.java.lock_validate.count");
private static readonly Counter<long> RubyInspectCounter = Meter.CreateCounter<long>("stellaops.cli.ruby.inspect.count");
private static readonly Counter<long> RubyResolveCounter = Meter.CreateCounter<long>("stellaops.cli.ruby.resolve.count");
private static readonly Histogram<double> CommandDurationHistogram = Meter.CreateHistogram<double>("stellaops.cli.command.duration.ms");

public static void RecordScannerDownload(string channel, bool fromCache)

@@ -108,6 +113,36 @@ internal static class CliMetrics

        new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
    });

public static void RecordNodeLockValidate(string outcome)
    => NodeLockValidateCounter.Add(1, new KeyValuePair<string, object?>[]
    {
        new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
    });

public static void RecordPythonLockValidate(string outcome)
    => PythonLockValidateCounter.Add(1, new KeyValuePair<string, object?>[]
    {
        new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
    });

public static void RecordJavaLockValidate(string outcome)
    => JavaLockValidateCounter.Add(1, new KeyValuePair<string, object?>[]
    {
        new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
    });

public static void RecordRubyInspect(string outcome)
    => RubyInspectCounter.Add(1, new KeyValuePair<string, object?>[]
    {
        new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
    });

public static void RecordRubyResolve(string outcome)
    => RubyResolveCounter.Add(1, new KeyValuePair<string, object?>[]
    {
        new("outcome", string.IsNullOrWhiteSpace(outcome) ? "unknown" : outcome)
    });

public static IDisposable MeasureCommandDuration(string command)
{
    var start = DateTime.UtcNow;
@@ -0,0 +1,36 @@

using System;
using System.CommandLine;
using System.Linq;
using System.Threading;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using StellaOps.Cli.Commands;
using StellaOps.Cli.Configuration;

namespace StellaOps.Cli.Tests.Commands;

public sealed class CommandFactoryTests
{
    [Fact]
    public void Create_RegistersRubyInspectAndResolveCommands()
    {
        using var loggerFactory = LoggerFactory.Create(builder => builder.SetMinimumLevel(LogLevel.Critical));
        var services = new ServiceCollection().BuildServiceProvider();
        var root = CommandFactory.Create(services, new StellaOpsCliOptions(), CancellationToken.None, loggerFactory);

        var ruby = Assert.Single(root.Subcommands, command => string.Equals(command.Name, "ruby", StringComparison.Ordinal));

        var inspect = Assert.Single(ruby.Subcommands, command => string.Equals(command.Name, "inspect", StringComparison.Ordinal));
        var inspectOptions = inspect.Children.OfType<Option>().ToArray();
        var inspectAliases = inspectOptions.SelectMany(option => option.Aliases).ToArray();
        Assert.Contains("--root", inspectAliases, StringComparer.Ordinal);
        Assert.Contains("--format", inspectAliases, StringComparer.Ordinal);

        var resolve = Assert.Single(ruby.Subcommands, command => string.Equals(command.Name, "resolve", StringComparison.Ordinal));
        var resolveOptions = resolve.Children.OfType<Option>().ToArray();
        var resolveAliases = resolveOptions.SelectMany(option => option.Aliases).ToArray();
        Assert.Contains("--image", resolveAliases, StringComparer.Ordinal);
        Assert.Contains("--scan-id", resolveAliases, StringComparer.Ordinal);
        Assert.Contains("--format", resolveAliases, StringComparer.Ordinal);
    }
}
File diff suppressed because it is too large

src/Scanner/StellaOps.Scanner.Analyzers.Lang.Deno/TASKS.md (new file, 12 lines)
@@ -0,0 +1,12 @@
# Deno Analyzer Tasks (Sprint 130)

| Order | Task ID | State | Summary |
| --- | --- | --- | --- |
| 1 | `SCANNER-ANALYZERS-DENO-26-001` | DONE | Deterministic input normalizer + VFS merger for `deno.json(c)`, import maps, lockfiles, vendor trees, `$DENO_DIR`, and OCI layers. |
| 2 | `SCANNER-ANALYZERS-DENO-26-002` | DONE | Module graph resolver covering static/dynamic imports, npm bridge, cache lookups, built-ins, and WASM/JSON assertions with provenance. |
| 3 | `SCANNER-ANALYZERS-DENO-26-003` | DONE | npm/node compatibility adapter for `npm:` specifiers, `exports` evaluation, and builtin usage logging. |
| 4 | `SCANNER-ANALYZERS-DENO-26-004` | DONE | Permission/capability analyzer for FS/net/env/process/crypto/FFI/workers plus dynamic import heuristics with reason codes. |
| 5 | `SCANNER-ANALYZERS-DENO-26-005` | DONE | Bundle/binary inspectors for eszip and `deno compile` executables to recover graphs/config/resources/snapshots. |
| 6 | `SCANNER-ANALYZERS-DENO-26-006` | DONE | OCI/container adapter that stitches per-layer Deno caches, vendor trees, and compiled binaries into provenance-aware inputs. |
| 7 | `SCANNER-ANALYZERS-DENO-26-007` | DOING | AOC-compliant observation writers (entrypoints, modules, capability edges, workers, warnings, binaries) with deterministic reason codes. |
| 8 | `SCANNER-ANALYZERS-DENO-26-008` | TODO | Fixture and benchmark suite for vendor/npm/FFI/worker/dynamic import/bundle/cache/container cases. |
@@ -198,7 +198,12 @@ internal sealed class CompositeScanAnalyzerDispatcher : IScanAnalyzerDispatcher

var cacheAdapter = new LanguageAnalyzerSurfaceCache(cache, surfaceEnvironment.Settings.Tenant);

var usageHints = LanguageUsageHints.Empty;
var analyzerContext = new LanguageAnalyzerContext(
    workspacePath,
    context.TimeProvider,
    usageHints,
    services,
    context.Analysis);
var results = new Dictionary<string, LanguageAnalyzerResult>(StringComparer.OrdinalIgnoreCase);
var fragments = new List<LayerComponentFragment>();

@@ -239,6 +239,7 @@ internal sealed class SurfaceManifestPublisher : ISurfaceManifestPublisher

ArtifactDocumentFormat.EntryTraceNdjson => "entrytrace.ndjson",
ArtifactDocumentFormat.EntryTraceGraphJson => "entrytrace.graph",
ArtifactDocumentFormat.ComponentFragmentJson => "layer.fragments",
ArtifactDocumentFormat.ObservationJson => "observation.json",
ArtifactDocumentFormat.SurfaceManifestJson => "surface.manifest",
ArtifactDocumentFormat.CycloneDxJson => "cdx-json",
ArtifactDocumentFormat.CycloneDxProtobuf => "cdx-protobuf",
@@ -165,6 +165,20 @@ internal sealed class SurfaceManifestStageExecutor : IScanStageExecutor

        View: "inventory"));
}

if (context.Analysis.TryGet<AnalyzerObservationPayload>(ScanAnalysisKeys.DenoObservationPayload, out var denoObservation) &&
    denoObservation is not null)
{
    payloads.Add(new SurfaceManifestPayload(
        ArtifactDocumentType.SurfaceObservation,
        ArtifactDocumentFormat.ObservationJson,
        Kind: denoObservation.Kind,
        MediaType: denoObservation.MediaType,
        Content: denoObservation.Content,
        View: denoObservation.View,
        Metadata: NormalizeObservationMetadata(denoObservation.Metadata),
        RegisterArtifact: true));
}

return payloads;
}
@@ -277,6 +291,28 @@ internal sealed class SurfaceManifestStageExecutor : IScanStageExecutor

    return digest.Trim();
}

private static IReadOnlyDictionary<string, string>? NormalizeObservationMetadata(
    IReadOnlyDictionary<string, string?>? metadata)
{
    if (metadata is null || metadata.Count == 0)
    {
        return null;
    }

    var normalized = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    foreach (var pair in metadata)
    {
        // Skip entries whose key or value is blank so downstream manifests stay clean.
        if (string.IsNullOrWhiteSpace(pair.Key) || string.IsNullOrWhiteSpace(pair.Value))
        {
            continue;
        }

        normalized[pair.Key.Trim()] = pair.Value!.Trim();
    }

    return normalized.Count == 0 ? null : normalized;
}

private string ComputeDigest(ReadOnlySpan<byte> content)
{
    var hex = _hash.ComputeHashHex(content, HashAlgorithms.Sha256);
@@ -121,6 +121,10 @@ Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Scanner.Surface.E
|
||||
EndProject
|
||||
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Scanner.Analyzers.Lang.Ruby", "__Libraries\StellaOps.Scanner.Analyzers.Lang.Ruby\StellaOps.Scanner.Analyzers.Lang.Ruby.csproj", "{482026BC-2E89-4789-8A73-523FAAC8476F}"
|
||||
EndProject
|
||||
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Scanner.Analyzers.Lang.Deno", "__Libraries\StellaOps.Scanner.Analyzers.Lang.Deno\StellaOps.Scanner.Analyzers.Lang.Deno.csproj", "{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}"
|
||||
EndProject
|
||||
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "StellaOps.Scanner.Analyzers.Lang.Deno.Tests", "__Tests\StellaOps.Scanner.Analyzers.Lang.Deno.Tests\StellaOps.Scanner.Analyzers.Lang.Deno.Tests.csproj", "{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}"
|
||||
EndProject
|
||||
Global
|
||||
GlobalSection(SolutionConfigurationPlatforms) = preSolution
|
||||
Debug|Any CPU = Debug|Any CPU
|
||||
@@ -203,6 +207,30 @@ Global
|
||||
{02C16715-9BF3-43D7-AC97-D6940365907A}.Release|x64.Build.0 = Release|Any CPU
|
||||
{02C16715-9BF3-43D7-AC97-D6940365907A}.Release|x86.ActiveCfg = Release|Any CPU
|
||||
{02C16715-9BF3-43D7-AC97-D6940365907A}.Release|x86.Build.0 = Release|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Debug|Any CPU.Build.0 = Debug|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Debug|x64.ActiveCfg = Debug|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Debug|x64.Build.0 = Debug|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Debug|x86.ActiveCfg = Debug|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Debug|x86.Build.0 = Debug|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Release|Any CPU.ActiveCfg = Release|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Release|Any CPU.Build.0 = Release|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Release|x64.ActiveCfg = Release|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Release|x64.Build.0 = Release|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Release|x86.ActiveCfg = Release|Any CPU
|
||||
{C71D4A4C-637C-4A7C-B0F8-4F9E62FBBE3A}.Release|x86.Build.0 = Release|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Debug|Any CPU.Build.0 = Debug|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Debug|x64.ActiveCfg = Debug|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Debug|x64.Build.0 = Debug|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Debug|x86.ActiveCfg = Debug|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Debug|x86.Build.0 = Debug|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Release|Any CPU.ActiveCfg = Release|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Release|Any CPU.Build.0 = Release|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Release|x64.ActiveCfg = Release|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Release|x64.Build.0 = Release|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Release|x86.ActiveCfg = Release|Any CPU
|
||||
{B8D28D0E-FAD8-48B8-8F9C-9E1C6582F19E}.Release|x86.Build.0 = Release|Any CPU
|
||||
{B53FEE71-9EBE-4479-9B07-0C3F8EA2C02E}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
|
||||
{B53FEE71-9EBE-4479-9B07-0C3F8EA2C02E}.Debug|Any CPU.Build.0 = Debug|Any CPU
|
||||
{B53FEE71-9EBE-4479-9B07-0C3F8EA2C02E}.Debug|x64.ActiveCfg = Debug|Any CPU
@@ -0,0 +1,3 @@
using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("StellaOps.Scanner.Analyzers.Lang.Deno.Tests")]
@@ -0,0 +1,18 @@
using System;
using StellaOps.Scanner.Analyzers.Lang;
using StellaOps.Scanner.Analyzers.Lang.Plugin;

namespace StellaOps.Scanner.Analyzers.Lang.Deno;

public sealed class DenoAnalyzerPlugin : ILanguageAnalyzerPlugin
{
    public string Name => "deno";

    public bool IsAvailable(IServiceProvider services) => services is not null;

    public ILanguageAnalyzer CreateAnalyzer(IServiceProvider services)
    {
        ArgumentNullException.ThrowIfNull(services);
        return new DenoLanguageAnalyzer();
    }
}
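
A minimal usage sketch for the plugin contract above: the host gates on availability (a non-null service container is the only requirement here) and constructs the stateless analyzer. `serviceProvider` stands in for whatever container the host wires up.

```csharp
ILanguageAnalyzerPlugin plugin = new DenoAnalyzerPlugin();
if (plugin.IsAvailable(serviceProvider))
{
    var analyzer = plugin.CreateAnalyzer(serviceProvider);
    // analyzer.Id == "deno", ready for AnalyzeAsync against a workspace root.
}
```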
@@ -0,0 +1,114 @@
using System.Collections.Generic;
using System.Globalization;
using System.Text;
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal;
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal.Observations;
using StellaOps.Scanner.Core.Contracts;

namespace StellaOps.Scanner.Analyzers.Lang.Deno;

public sealed class DenoLanguageAnalyzer : ILanguageAnalyzer
{
    public string Id => "deno";

    public string DisplayName => "Deno Analyzer";

    public async ValueTask AnalyzeAsync(LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);
        ArgumentNullException.ThrowIfNull(writer);

        var workspace = await DenoWorkspaceNormalizer.NormalizeAsync(context, cancellationToken).ConfigureAwait(false);
        var moduleGraph = DenoModuleGraphResolver.Resolve(workspace, cancellationToken);
        var compatibility = DenoNpmCompatibilityAdapter.Analyze(workspace, moduleGraph, cancellationToken);
        var bundleScan = DenoBundleScanner.Scan(context.RootPath, cancellationToken);
        var bundleObservations = DenoBundleScanner.ToObservations(bundleScan);
        var containerInputs = DenoContainerAdapter.CollectInputs(workspace, bundleObservations);
        var containerRecords = DenoContainerEmitter.BuildRecords(Id, containerInputs);
        writer.AddRange(containerRecords);

        var observationDocument = DenoObservationBuilder.Build(moduleGraph, compatibility, bundleObservations);
        var observationJson = DenoObservationSerializer.Serialize(observationDocument);
        var observationHash = DenoObservationSerializer.ComputeSha256(observationJson);
        var observationBytes = Encoding.UTF8.GetBytes(observationJson);

        var observationMetadata = new[]
        {
            new KeyValuePair<string, string?>("deno.observation.hash", observationHash),
            new KeyValuePair<string, string?>("deno.observation.entrypoints", observationDocument.Entrypoints.Length.ToString(CultureInfo.InvariantCulture)),
            new KeyValuePair<string, string?>("deno.observation.capabilities", observationDocument.Capabilities.Length.ToString(CultureInfo.InvariantCulture)),
            new KeyValuePair<string, string?>("deno.observation.bundles", observationDocument.Bundles.Length.ToString(CultureInfo.InvariantCulture))
        };

        TryPersistObservation(context, observationBytes, observationMetadata);

        var observationEvidence = new[]
        {
            new LanguageComponentEvidence(
                LanguageEvidenceKind.Derived,
                "deno.observation",
                "document",
                observationJson,
                observationHash)
        };

        writer.AddFromExplicitKey(
            analyzerId: Id,
            componentKey: "observation::deno",
            purl: null,
            name: "Deno Observation Summary",
            version: null,
            type: "deno-observation",
            metadata: observationMetadata,
            evidence: observationEvidence);

        // Task 5+ will convert moduleGraph + compatibility and bundle insights into SBOM components and evidence records.
        GC.KeepAlive(moduleGraph);
        GC.KeepAlive(compatibility);
        GC.KeepAlive(bundleObservations);
        GC.KeepAlive(containerInputs);
        GC.KeepAlive(observationDocument);
    }

    private void TryPersistObservation(
        LanguageAnalyzerContext context,
        byte[] observationBytes,
        IEnumerable<KeyValuePair<string, string?>> metadata)
    {
        ArgumentNullException.ThrowIfNull(context);
        ArgumentNullException.ThrowIfNull(observationBytes);

        if (context.AnalysisStore is not { } analysisStore)
        {
            return;
        }

        var metadataDictionary = CreateMetadata(metadata);
        var payload = new AnalyzerObservationPayload(
            analyzerId: Id,
            kind: "deno.observation",
            mediaType: "application/json",
            content: observationBytes,
            metadata: metadataDictionary,
            view: "observations");

        analysisStore.Set(ScanAnalysisKeys.DenoObservationPayload, payload);
    }

    private static IReadOnlyDictionary<string, string?>? CreateMetadata(IEnumerable<KeyValuePair<string, string?>> metadata)
    {
        Dictionary<string, string?>? dictionary = null;
        foreach (var pair in metadata ?? Array.Empty<KeyValuePair<string, string?>>())
        {
            if (string.IsNullOrWhiteSpace(pair.Key) || string.IsNullOrWhiteSpace(pair.Value))
            {
                continue;
            }

            dictionary ??= new Dictionary<string, string?>(StringComparer.OrdinalIgnoreCase);
            dictionary[pair.Key] = pair.Value;
        }

        return dictionary;
    }
}
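
A minimal sketch of how a later pipeline stage could read back what `TryPersistObservation` stored. The analyzer above only exercises `Set`; the `TryGet` accessor shown here is an assumed shape on the analysis store, used purely for illustration.

```csharp
static (string Json, string? Hash)? ReadDenoObservation(LanguageAnalyzerContext context)
{
    if (context.AnalysisStore is not { } store ||
        !store.TryGet(ScanAnalysisKeys.DenoObservationPayload, out AnalyzerObservationPayload? payload))
    {
        return null;
    }

    var json = Encoding.UTF8.GetString(payload!.Content);   // canonical observation document
    string? hash = null;
    if (payload.Metadata is { } md)
    {
        md.TryGetValue("deno.observation.hash", out hash);  // sha-256 recorded at emit time
    }

    return (json, hash);
}
```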
@@ -0,0 +1,17 @@
global using System;
global using System.Buffers;
global using System.Collections.Generic;
global using System.Collections.Immutable;
global using System.IO;
global using System.IO.Compression;
global using System.Linq;
global using System.Security.Cryptography;
global using System.Globalization;
global using System.Text;
global using System.Text.RegularExpressions;
global using System.Text.Json;
global using System.Text.Json.Serialization;
global using System.Threading;
global using System.Threading.Tasks;

global using StellaOps.Scanner.Analyzers.Lang;
@@ -0,0 +1,6 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoBuiltinUsage(
    string Specifier,
    string SourceNodeId,
    string Provenance);
@@ -0,0 +1,12 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoBundleInspectionResult(
    string SourcePath,
    string BundleType,
    string? Entrypoint,
    ImmutableArray<DenoBundleModule> Modules,
    ImmutableArray<DenoBundleResource> Resources)
{
    public DenoBundleObservation ToObservation()
        => new(SourcePath, BundleType, Entrypoint, Modules, Resources);
}
@@ -0,0 +1,156 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoBundleInspector
{
    public static DenoBundleInspectionResult? TryInspect(string path, CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(path) || !File.Exists(path))
        {
            return null;
        }

        using var stream = File.OpenRead(path);
        return TryInspect(stream, path, cancellationToken);
    }

    public static DenoBundleInspectionResult? TryInspect(Stream stream, string? sourcePath, CancellationToken cancellationToken)
    {
        if (stream is null || !stream.CanRead)
        {
            return null;
        }

        sourcePath ??= "(stream)";

        try
        {
            using var archive = new ZipArchive(stream, ZipArchiveMode.Read, leaveOpen: true);
            var manifestEntry = archive.GetEntry("manifest.json") ?? archive.GetEntry("manifest");
            if (manifestEntry is null)
            {
                return null;
            }

            using var manifestStream = manifestEntry.Open();
            using var document = JsonDocument.Parse(manifestStream);
            var root = document.RootElement;

            var entrypoint = root.TryGetProperty("entry", out var entryElement) && entryElement.ValueKind == JsonValueKind.String
                ? entryElement.GetString()
                : null;

            var modules = ParseModules(root, archive, cancellationToken);
            var resources = ParseResources(root);

            return new DenoBundleInspectionResult(
                SourcePath: sourcePath,
                BundleType: "eszip",
                Entrypoint: entrypoint,
                Modules: modules,
                Resources: resources);
        }
        catch (InvalidDataException)
        {
            return null;
        }
        catch (JsonException)
        {
            return null;
        }
    }

    private static ImmutableArray<DenoBundleModule> ParseModules(JsonElement root, ZipArchive archive, CancellationToken cancellationToken)
    {
        if (!root.TryGetProperty("modules", out var modulesElement) || modulesElement.ValueKind != JsonValueKind.Object)
        {
            return ImmutableArray<DenoBundleModule>.Empty;
        }

        var builder = ImmutableArray.CreateBuilder<DenoBundleModule>();
        foreach (var module in modulesElement.EnumerateObject())
        {
            cancellationToken.ThrowIfCancellationRequested();

            var info = module.Value;
            var specifier = info.TryGetProperty("specifier", out var specifierElement) && specifierElement.ValueKind == JsonValueKind.String
                ? specifierElement.GetString() ?? module.Name
                : module.Name;

            var path = info.TryGetProperty("path", out var pathElement) && pathElement.ValueKind == JsonValueKind.String
                ? pathElement.GetString() ?? module.Name
                : module.Name;

            var mediaType = info.TryGetProperty("mediaType", out var mediaTypeElement) && mediaTypeElement.ValueKind == JsonValueKind.String
                ? mediaTypeElement.GetString()
                : null;

            var checksum = info.TryGetProperty("checksum", out var checksumElement) && checksumElement.ValueKind == JsonValueKind.String
                ? checksumElement.GetString()
                : null;

            if (!string.IsNullOrWhiteSpace(path))
            {
                var entry = archive.GetEntry(path!);
                if (entry is null)
                {
                    var normalized = path.Replace('\\', '/');
                    entry = archive.GetEntry(normalized);
                }

                if (entry is not null && string.IsNullOrWhiteSpace(checksum))
                {
                    checksum = $"sha256:{ComputeSha256(entry)}";
                }
            }

            builder.Add(new DenoBundleModule(
                specifier,
                path ?? module.Name,
                mediaType,
                checksum));
        }

        return builder.ToImmutable();
    }

    private static ImmutableArray<DenoBundleResource> ParseResources(JsonElement root)
    {
        if (!root.TryGetProperty("resources", out var resourcesElement) || resourcesElement.ValueKind != JsonValueKind.Array)
        {
            return ImmutableArray<DenoBundleResource>.Empty;
        }

        var builder = ImmutableArray.CreateBuilder<DenoBundleResource>();
        foreach (var resource in resourcesElement.EnumerateArray())
        {
            if (resource.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            var name = resource.TryGetProperty("name", out var nameElement) && nameElement.ValueKind == JsonValueKind.String
                ? nameElement.GetString() ?? "(resource)"
                : "(resource)";

            var mediaType = resource.TryGetProperty("mediaType", out var mediaElement) && mediaElement.ValueKind == JsonValueKind.String
                ? mediaElement.GetString()
                : null;

            var size = resource.TryGetProperty("size", out var sizeElement) && sizeElement.ValueKind == JsonValueKind.Number
                ? sizeElement.GetInt64()
                : 0;

            builder.Add(new DenoBundleResource(name!, mediaType, size));
        }

        return builder.ToImmutable();
    }

    private static string ComputeSha256(ZipArchiveEntry entry)
    {
        using var sha = SHA256.Create();
        using var stream = entry.Open();
        var hash = sha.ComputeHash(stream);
        return Convert.ToHexString(hash).ToLowerInvariant();
    }
}
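
A minimal sketch, assuming the project's global usings, that builds an in-memory archive of the shape `TryInspect` parses and runs the inspector over it. The manifest fields ("entry", "modules", per-module "path"/"mediaType") mirror what the reader above consumes; they are an illustration, not a normative eszip specification.

```csharp
using var buffer = new MemoryStream();
using (var zip = new ZipArchive(buffer, ZipArchiveMode.Create, leaveOpen: true))
{
    // Only one entry stream may be open at a time in Create mode, hence the scoping.
    using (var manifest = new StreamWriter(zip.CreateEntry("manifest.json").Open(), Encoding.UTF8))
    {
        manifest.Write("{ \"entry\": \"main.ts\", \"modules\": { \"main.ts\": { \"specifier\": \"file:///src/main.ts\", \"path\": \"main.ts\", \"mediaType\": \"text/typescript\" } } }");
    }

    using (var module = new StreamWriter(zip.CreateEntry("main.ts").Open(), Encoding.UTF8))
    {
        module.Write("console.log(\"hello\");");
    }
}

buffer.Position = 0;
var result = DenoBundleInspector.TryInspect(buffer, "sample.eszip", CancellationToken.None);
// result.Entrypoint == "main.ts"; the module checksum is backfilled as
// "sha256:..." because the manifest omitted it and the archive entry exists.
```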
@@ -0,0 +1,7 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoBundleModule(
    string Specifier,
    string Path,
    string? MediaType,
    string? Checksum);
@@ -0,0 +1,8 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoBundleObservation(
    string SourcePath,
    string BundleType,
    string? Entrypoint,
    ImmutableArray<DenoBundleModule> Modules,
    ImmutableArray<DenoBundleResource> Resources);
@@ -0,0 +1,6 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoBundleResource(
    string Name,
    string? MediaType,
    long SizeBytes);
@@ -0,0 +1,5 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoBundleScanResult(
    ImmutableArray<DenoBundleInspectionResult> EszipBundles,
    ImmutableArray<DenoBundleInspectionResult> CompiledBundles);
@@ -0,0 +1,82 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoBundleScanner
{
    public static DenoBundleScanResult Scan(string rootPath, CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(rootPath) || !Directory.Exists(rootPath))
        {
            return new DenoBundleScanResult(
                ImmutableArray<DenoBundleInspectionResult>.Empty,
                ImmutableArray<DenoBundleInspectionResult>.Empty);
        }

        var eszipBuilder = ImmutableArray.CreateBuilder<DenoBundleInspectionResult>();
        var compileBuilder = ImmutableArray.CreateBuilder<DenoBundleInspectionResult>();

        foreach (var eszipPath in SafeEnumerateFiles(rootPath, "*.eszip"))
        {
            cancellationToken.ThrowIfCancellationRequested();

            var result = DenoBundleInspector.TryInspect(eszipPath, cancellationToken);
            if (result is not null)
            {
                eszipBuilder.Add(result);
            }
        }

        foreach (var binaryPath in SafeEnumerateFiles(rootPath, "*.deno"))
        {
            cancellationToken.ThrowIfCancellationRequested();

            var result = DenoCompileInspector.TryInspect(binaryPath, cancellationToken);
            if (result is not null)
            {
                compileBuilder.Add(result);
            }
        }

        return new DenoBundleScanResult(eszipBuilder.ToImmutable(), compileBuilder.ToImmutable());
    }

    public static ImmutableArray<DenoBundleObservation> ToObservations(DenoBundleScanResult scanResult)
    {
        var builder = ImmutableArray.CreateBuilder<DenoBundleObservation>();
        foreach (var bundle in scanResult.EszipBundles)
        {
            builder.Add(bundle.ToObservation());
        }

        foreach (var bundle in scanResult.CompiledBundles)
        {
            builder.Add(bundle.ToObservation());
        }

        return builder.ToImmutable();
    }

    private static IEnumerable<string> SafeEnumerateFiles(string rootPath, string pattern)
    {
        try
        {
            return Directory.EnumerateFiles(
                rootPath,
                pattern,
                new EnumerationOptions
                {
                    RecurseSubdirectories = true,
                    IgnoreInaccessible = true,
                    AttributesToSkip = FileAttributes.ReparsePoint,
                    ReturnSpecialDirectories = false
                });
        }
        catch (IOException)
        {
            return Array.Empty<string>();
        }
        catch (UnauthorizedAccessException)
        {
            return Array.Empty<string>();
        }
    }
}
@@ -0,0 +1,39 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal enum DenoCacheLocationKind
{
    Workspace,
    Env,
    Home,
    Layer,
    Unknown,
}

internal sealed class DenoCacheLocation
{
    public DenoCacheLocation(string absolutePath, string alias, DenoCacheLocationKind kind, string? layerDigest)
    {
        if (string.IsNullOrWhiteSpace(absolutePath))
        {
            throw new ArgumentException("Path is required", nameof(absolutePath));
        }

        if (string.IsNullOrWhiteSpace(alias))
        {
            throw new ArgumentException("Alias is required", nameof(alias));
        }

        AbsolutePath = Path.GetFullPath(absolutePath);
        Alias = alias;
        Kind = kind;
        LayerDigest = layerDigest;
    }

    public string AbsolutePath { get; }

    public string Alias { get; }

    public DenoCacheLocationKind Kind { get; }

    public string? LayerDigest { get; }
}
@@ -0,0 +1,6 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoCapabilityRecord(
    DenoCapabilityType Capability,
    string ReasonCode,
    ImmutableArray<string> Sources);
@@ -0,0 +1,12 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal enum DenoCapabilityType
{
    FileSystem,
    Network,
    Environment,
    Process,
    Crypto,
    Ffi,
    Worker,
}
@@ -0,0 +1,8 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoCompatibilityAnalysis(
    ImmutableArray<DenoBuiltinUsage> BuiltinUsages,
    ImmutableArray<DenoNpmResolution> NpmResolutions,
    ImmutableArray<DenoCapabilityRecord> Capabilities,
    ImmutableArray<DenoDynamicImportObservation> DynamicImports,
    ImmutableArray<DenoLiteralFetchObservation> LiteralFetches);
@@ -0,0 +1,59 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoCompileInspector
{
    internal const string EszipMarker = "DENO_COMPILE_ESZIP_START";

    public static DenoBundleInspectionResult? TryInspect(string path, CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(path) || !File.Exists(path))
        {
            return null;
        }

        var data = File.ReadAllBytes(path);
        var marker = Encoding.UTF8.GetBytes(EszipMarker);
        var index = IndexOf(data, marker);
        if (index < 0)
        {
            return null;
        }

        var start = index + marker.Length;
        if (start >= data.Length)
        {
            return null;
        }

        using var ms = new MemoryStream(data, start, data.Length - start);
        return DenoBundleInspector.TryInspect(ms, path, cancellationToken)?.WithBundleType("deno-compile");
    }

    private static int IndexOf(ReadOnlySpan<byte> data, ReadOnlySpan<byte> pattern)
    {
        if (pattern.Length == 0 || pattern.Length > data.Length)
        {
            return -1;
        }

        for (var i = 0; i <= data.Length - pattern.Length; i++)
        {
            if (data.Slice(i, pattern.Length).SequenceEqual(pattern))
            {
                return i;
            }
        }

        return -1;
    }

    private static DenoBundleInspectionResult? WithBundleType(this DenoBundleInspectionResult? result, string bundleType)
    {
        if (result is null)
        {
            return null;
        }

        return result with { BundleType = bundleType };
    }
}
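
A minimal sketch, assuming the project's global usings: any byte stream that embeds the `DENO_COMPILE_ESZIP_START` marker followed by a zip payload is treated as a compiled bundle. The fake "binary" here is a few stand-in prefix bytes, the marker, then a tiny archive; `TryInspect` parses everything after the marker exactly like a standalone `.eszip` and relabels the bundle type.

```csharp
using var payload = new MemoryStream();
using (var zip = new ZipArchive(payload, ZipArchiveMode.Create, leaveOpen: true))
{
    using var manifest = new StreamWriter(zip.CreateEntry("manifest.json").Open(), Encoding.UTF8);
    manifest.Write("{ \"entry\": \"main.ts\", \"modules\": {} }");
}

using var binary = new MemoryStream();
binary.Write(new byte[] { 0x7F, 0x45, 0x4C, 0x46 });                    // stand-in executable prefix
binary.Write(Encoding.UTF8.GetBytes(DenoCompileInspector.EszipMarker)); // marker scanned for above
payload.Position = 0;
payload.CopyTo(binary);

File.WriteAllBytes("app.deno", binary.ToArray());
var result = DenoCompileInspector.TryInspect("app.deno", CancellationToken.None);
// result?.BundleType == "deno-compile", result?.Entrypoint == "main.ts"
```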
@@ -0,0 +1,330 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed class DenoConfigDocument
{
    private DenoConfigDocument(
        string absolutePath,
        string relativePath,
        string? importMapPath,
        DenoImportMapDocument? inlineImportMap,
        bool lockEnabled,
        string? lockFilePath,
        bool vendorEnabled,
        string? vendorDirectoryPath,
        bool nodeModulesDirEnabled,
        string? nodeModulesDir)
    {
        AbsolutePath = Path.GetFullPath(absolutePath);
        RelativePath = DenoPathUtilities.NormalizeRelativePath(relativePath);
        DirectoryPath = Path.GetDirectoryName(AbsolutePath) ?? AbsolutePath;
        ImportMapPath = importMapPath;
        InlineImportMap = inlineImportMap;
        LockEnabled = lockEnabled;
        LockFilePath = lockFilePath;
        VendorEnabled = vendorEnabled;
        VendorDirectoryPath = vendorDirectoryPath;
        NodeModulesDirEnabled = nodeModulesDirEnabled;
        NodeModulesDirectory = nodeModulesDir;
    }

    public string AbsolutePath { get; }

    public string RelativePath { get; }

    public string DirectoryPath { get; }

    public string? ImportMapPath { get; }

    public DenoImportMapDocument? InlineImportMap { get; }

    public bool LockEnabled { get; }

    public string? LockFilePath { get; }

    public bool VendorEnabled { get; }

    public string? VendorDirectoryPath { get; }

    public bool NodeModulesDirEnabled { get; }

    public string? NodeModulesDirectory { get; }

    public static bool TryLoad(
        string absolutePath,
        string relativePath,
        CancellationToken cancellationToken,
        out DenoConfigDocument? document)
    {
        document = null;
        if (string.IsNullOrWhiteSpace(absolutePath))
        {
            return false;
        }

        try
        {
            using var stream = File.OpenRead(absolutePath);
            using var json = JsonDocument.Parse(stream, new JsonDocumentOptions
            {
                AllowTrailingCommas = true,
                CommentHandling = JsonCommentHandling.Skip,
            });

            cancellationToken.ThrowIfCancellationRequested();

            var root = json.RootElement;
            var directory = Path.GetDirectoryName(absolutePath) ?? Path.GetDirectoryName(Path.GetFullPath(absolutePath)) ?? absolutePath;

            var importMapPath = ResolveImportMapPath(root, directory);
            var inlineImportMap = ResolveInlineImportMap(root, relativePath, directory);
            var (lockEnabled, lockFilePath) = ResolveLockPath(root, directory);
            var (vendorEnabled, vendorDirectory) = ResolveVendorDirectory(root, directory);
            var (nodeModulesDirEnabled, nodeModulesDir) = ResolveNodeModulesDirectory(root, directory);

            document = new DenoConfigDocument(
                absolutePath,
                relativePath,
                string.IsNullOrWhiteSpace(importMapPath) ? null : importMapPath,
                inlineImportMap,
                lockEnabled,
                lockFilePath,
                vendorEnabled,
                vendorDirectory,
                nodeModulesDirEnabled,
                nodeModulesDir);

            return true;
        }
        catch (IOException)
        {
            return false;
        }
        catch (JsonException)
        {
            return false;
        }
    }

    private static string? ResolveImportMapPath(JsonElement root, string directory)
    {
        if (!root.TryGetProperty("importMap", out var importMapElement) || importMapElement.ValueKind != JsonValueKind.String)
        {
            return null;
        }

        var candidate = importMapElement.GetString();
        if (string.IsNullOrWhiteSpace(candidate))
        {
            return null;
        }

        var path = DenoPathUtilities.ResolvePath(directory, candidate);
        return File.Exists(path) ? path : null;
    }

    private static DenoImportMapDocument? ResolveInlineImportMap(JsonElement root, string relativePath, string directory)
    {
        var imports = ExtractInlineMap(root, "imports");
        var scopes = ExtractInlineScopes(root, "scopes");
        if (imports.Count == 0 && scopes.Count == 0)
        {
            return null;
        }

        return DenoImportMapDocument.CreateInline(
            origin: $"inline::{relativePath}",
            imports,
            scopes);
    }

    private static Dictionary<string, string> ExtractInlineMap(JsonElement root, string propertyName)
    {
        if (!root.TryGetProperty(propertyName, out var element))
        {
            return new Dictionary<string, string>(StringComparer.Ordinal);
        }

        return ExtractInlineMap(element);
    }

    private static Dictionary<string, string> ExtractInlineMap(JsonElement element)
    {
        if (element.ValueKind != JsonValueKind.Object)
        {
            return new Dictionary<string, string>(StringComparer.Ordinal);
        }

        var results = new Dictionary<string, string>(StringComparer.Ordinal);
        foreach (var entry in element.EnumerateObject())
        {
            if (entry.Value.ValueKind == JsonValueKind.String)
            {
                var specifier = entry.Name.Trim();
                if (specifier.Length == 0)
                {
                    continue;
                }

                results[specifier] = entry.Value.GetString()?.Trim() ?? string.Empty;
            }
        }

        return results;
    }

    private static Dictionary<string, IDictionary<string, string>> ExtractInlineScopes(JsonElement root, string propertyName)
    {
        if (!root.TryGetProperty(propertyName, out var element) || element.ValueKind != JsonValueKind.Object)
        {
            return new Dictionary<string, IDictionary<string, string>>(StringComparer.Ordinal);
        }

        var results = new Dictionary<string, IDictionary<string, string>>(StringComparer.Ordinal);
        foreach (var scope in element.EnumerateObject())
        {
            var map = ExtractInlineMap(scope.Value);
            if (map.Count > 0)
            {
                results[scope.Name.Trim()] = map;
            }
        }

        return results;
    }

    private static (bool Enabled, string? Path) ResolveLockPath(JsonElement root, string directory)
    {
        if (!root.TryGetProperty("lock", out var lockElement))
        {
            var defaultPath = Path.Combine(directory, "deno.lock");
            return File.Exists(defaultPath)
                ? (true, defaultPath)
                : (false, null);
        }

        switch (lockElement.ValueKind)
        {
            case JsonValueKind.False:
                return (false, null);
            case JsonValueKind.True:
            {
                var defaultPath = Path.Combine(directory, "deno.lock");
                return (true, defaultPath);
            }
            case JsonValueKind.String:
            {
                var candidate = lockElement.GetString();
                if (string.IsNullOrWhiteSpace(candidate))
                {
                    return (false, null);
                }

                var resolved = DenoPathUtilities.ResolvePath(directory, candidate);
                return File.Exists(resolved) ? (true, resolved) : (false, null);
            }
            case JsonValueKind.Object:
            {
                var enabled = true;
                string? candidatePath = null;
                if (lockElement.TryGetProperty("enabled", out var enabledElement))
                {
                    enabled = enabledElement.ValueKind != JsonValueKind.False;
                }

                if (lockElement.TryGetProperty("path", out var pathElement) && pathElement.ValueKind == JsonValueKind.String)
                {
                    candidatePath = pathElement.GetString();
                }

                var resolved = string.IsNullOrWhiteSpace(candidatePath)
                    ? Path.Combine(directory, "deno.lock")
                    : DenoPathUtilities.ResolvePath(directory, candidatePath!);

                if (!File.Exists(resolved))
                {
                    return (false, null);
                }

                return (enabled, enabled ? resolved : null);
            }
            default:
                return (false, null);
        }
    }

    private static (bool Enabled, string? Directory) ResolveVendorDirectory(JsonElement root, string directory)
    {
        if (!root.TryGetProperty("vendor", out var vendorElement))
        {
            var defaultPath = Path.Combine(directory, "vendor");
            return Directory.Exists(defaultPath) ? (true, defaultPath) : (false, null);
        }

        switch (vendorElement.ValueKind)
        {
            case JsonValueKind.False:
                return (false, null);
            case JsonValueKind.True:
            {
                // Explicit "vendor": true opts in even if the directory has not been materialized yet.
                var defaultPath = Path.Combine(directory, "vendor");
                return (true, defaultPath);
            }
            case JsonValueKind.String:
            {
                var candidate = vendorElement.GetString();
                if (string.IsNullOrWhiteSpace(candidate))
                {
                    return (false, null);
                }

                var resolved = DenoPathUtilities.ResolvePath(directory, candidate);
                return Directory.Exists(resolved) ? (true, resolved) : (false, null);
            }
            case JsonValueKind.Object:
            {
                bool enabled = true;
                string? candidatePath = null;

                if (vendorElement.TryGetProperty("enabled", out var enabledElement))
                {
                    enabled = enabledElement.ValueKind != JsonValueKind.False;
                }

                if (vendorElement.TryGetProperty("path", out var pathElement) && pathElement.ValueKind == JsonValueKind.String)
                {
                    candidatePath = pathElement.GetString();
                }

                var resolved = string.IsNullOrWhiteSpace(candidatePath)
                    ? Path.Combine(directory, "vendor")
                    : DenoPathUtilities.ResolvePath(directory, candidatePath!);

                if (!Directory.Exists(resolved))
                {
                    return (false, null);
                }

                return (enabled, resolved);
            }
            default:
                return (false, null);
        }
    }

    private static (bool Enabled, string? Directory) ResolveNodeModulesDirectory(JsonElement root, string directory)
    {
        if (!root.TryGetProperty("nodeModulesDir", out var nodeModulesElement))
        {
            var defaultPath = Path.Combine(directory, "node_modules");
            return Directory.Exists(defaultPath) ? (true, defaultPath) : (false, null);
        }

        return nodeModulesElement.ValueKind switch
        {
            JsonValueKind.False => (false, null),
            JsonValueKind.True => (true, Path.Combine(directory, "node_modules")),
            JsonValueKind.String => (true, DenoPathUtilities.ResolvePath(directory, nodeModulesElement.GetString() ?? string.Empty)),
            _ => (false, null),
        };
    }
}
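
A minimal sketch of the deno.json shapes the resolvers above accept. The JSON is an assumption about a typical workspace layout, not an exhaustive schema, and the paths are placeholders; `DenoPathUtilities` comes from elsewhere in this commit.

```csharp
var configJson = "{ \"importMap\": \"./import_map.json\", " +
                 "\"lock\": { \"enabled\": true, \"path\": \"./deno.lock\" }, " +
                 "\"vendor\": true, \"nodeModulesDir\": true }";
File.WriteAllText("/tmp/workspace/deno.json", configJson);

if (DenoConfigDocument.TryLoad("/tmp/workspace/deno.json", "deno.json", CancellationToken.None, out var config))
{
    // LockEnabled is true only when the referenced deno.lock actually exists on disk;
    // "vendor": true opts in even before the vendor directory is materialized.
    Console.WriteLine($"{config!.LockEnabled} {config.VendorEnabled} {config.NodeModulesDirEnabled}");
}
```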
@@ -0,0 +1,76 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoContainerAdapter
{
    public static ImmutableArray<DenoContainerInput> CollectInputs(
        DenoWorkspace workspace,
        ImmutableArray<DenoBundleObservation> bundleObservations)
    {
        var builder = ImmutableArray.CreateBuilder<DenoContainerInput>();
        AddCaches(workspace, builder);
        AddVendors(workspace, builder);
        AddBundles(bundleObservations, builder);
        return builder.ToImmutable();
    }

    private static void AddCaches(DenoWorkspace workspace, ImmutableArray<DenoContainerInput>.Builder builder)
    {
        foreach (var cache in workspace.CacheLocations)
        {
            var metadata = new Dictionary<string, string?>(StringComparer.OrdinalIgnoreCase)
            {
                ["path"] = cache.AbsolutePath,
                ["alias"] = cache.Alias,
                ["kind"] = cache.Kind.ToString()
            };

            builder.Add(new DenoContainerInput(
                DenoContainerSourceKind.Cache,
                cache.Alias,
                cache.LayerDigest,
                metadata,
                Bundle: null));
        }
    }

    private static void AddVendors(DenoWorkspace workspace, ImmutableArray<DenoContainerInput>.Builder builder)
    {
        foreach (var vendor in workspace.Vendors)
        {
            var metadata = new Dictionary<string, string?>(StringComparer.OrdinalIgnoreCase)
            {
                ["path"] = vendor.AbsolutePath,
                ["alias"] = vendor.Alias
            };

            builder.Add(new DenoContainerInput(
                DenoContainerSourceKind.Vendor,
                vendor.Alias,
                vendor.LayerDigest,
                metadata,
                Bundle: null));
        }
    }

    private static void AddBundles(
        ImmutableArray<DenoBundleObservation> bundleObservations,
        ImmutableArray<DenoContainerInput>.Builder builder)
    {
        foreach (var bundle in bundleObservations)
        {
            var metadata = new Dictionary<string, string?>(StringComparer.OrdinalIgnoreCase)
            {
                ["entrypoint"] = bundle.Entrypoint,
                ["moduleCount"] = bundle.Modules.Length.ToString(CultureInfo.InvariantCulture),
                ["resourceCount"] = bundle.Resources.Length.ToString(CultureInfo.InvariantCulture)
            };

            builder.Add(new DenoContainerInput(
                DenoContainerSourceKind.Bundle,
                bundle.SourcePath,
                layerDigest: null,
                metadata,
                bundle));
        }
    }
}
@@ -0,0 +1,86 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoContainerEmitter
{
    private const string ComponentType = "deno-container";
    private const string MetadataPrefix = "deno.container";

    public static ImmutableArray<LanguageComponentRecord> BuildRecords(
        string analyzerId,
        IEnumerable<DenoContainerInput> inputs)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(analyzerId);

        var builder = ImmutableArray.CreateBuilder<LanguageComponentRecord>();
        foreach (var input in inputs ?? Array.Empty<DenoContainerInput>())
        {
            var componentKey = $"container::{input.Kind}:{input.Identifier}".ToLowerInvariant();
            var metadata = BuildMetadata(input);
            var evidence = BuildEvidence(input);

            builder.Add(
                LanguageComponentRecord.FromExplicitKey(
                    analyzerId,
                    componentKey,
                    purl: null,
                    name: input.Identifier,
                    version: null,
                    type: ComponentType,
                    metadata,
                    evidence));
        }

        return builder.ToImmutable();
    }

    private static IEnumerable<KeyValuePair<string, string?>> BuildMetadata(DenoContainerInput input)
    {
        var metadata = new List<KeyValuePair<string, string?>>((input.Metadata?.Count ?? 0) + 4)
        {
            new($"{MetadataPrefix}.kind", input.Kind.ToString().ToLowerInvariant()),
            new($"{MetadataPrefix}.identifier", input.Identifier)
        };

        if (!string.IsNullOrWhiteSpace(input.LayerDigest))
        {
            metadata.Add(new($"{MetadataPrefix}.layerDigest", input.LayerDigest));
        }

        if (input.Metadata is not null)
        {
            foreach (var kvp in input.Metadata)
            {
                metadata.Add(new($"{MetadataPrefix}.meta.{kvp.Key}", kvp.Value));
            }
        }

        if (input.Bundle is not null)
        {
            metadata.Add(new($"{MetadataPrefix}.bundle.entrypoint", input.Bundle.Entrypoint));
            metadata.Add(new($"{MetadataPrefix}.bundle.modules", input.Bundle.Modules.Length.ToString(CultureInfo.InvariantCulture)));
            metadata.Add(new($"{MetadataPrefix}.bundle.resources", input.Bundle.Resources.Length.ToString(CultureInfo.InvariantCulture)));
        }

        return metadata;
    }

    private static IEnumerable<LanguageComponentEvidence> BuildEvidence(DenoContainerInput input)
    {
        var evidence = new List<LanguageComponentEvidence>
        {
            new(LanguageEvidenceKind.Metadata, "deno.container", input.Kind.ToString(), input.Identifier, input.LayerDigest)
        };

        if (input.Bundle is not null && !string.IsNullOrWhiteSpace(input.Bundle.SourcePath))
        {
            evidence.Add(new(
                LanguageEvidenceKind.File,
                "deno.bundle",
                input.Bundle.SourcePath!,
                input.Bundle.Entrypoint,
                null));
        }

        return evidence;
    }
}
@@ -0,0 +1,8 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoContainerInput(
    DenoContainerSourceKind Kind,
    string Identifier,
    string? LayerDigest,
    IReadOnlyDictionary<string, string?> Metadata,
    DenoBundleObservation? Bundle);
@@ -0,0 +1,8 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal enum DenoContainerSourceKind
{
    Cache,
    Vendor,
    Bundle,
}
@@ -0,0 +1,7 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoDynamicImportObservation(
    string FilePath,
    int Line,
    string Specifier,
    string ReasonCode);
@@ -0,0 +1,15 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal enum DenoImportKind
{
    Static,
    Dynamic,
    JsonAssertion,
    WasmAssertion,
    BuiltIn,
    Redirect,
    Cache,
    Dependency,
    NpmBridge,
    Unknown,
}
@@ -0,0 +1,152 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed class DenoImportMapDocument
{
    private DenoImportMapDocument(
        string origin,
        string sortKey,
        string? absolutePath,
        bool isInline,
        ImmutableDictionary<string, string> imports,
        ImmutableDictionary<string, ImmutableDictionary<string, string>> scopes)
    {
        Origin = origin ?? throw new ArgumentNullException(nameof(origin));
        SortKey = sortKey ?? throw new ArgumentNullException(nameof(sortKey));
        AbsolutePath = absolutePath;
        IsInline = isInline;
        Imports = imports ?? ImmutableDictionary<string, string>.Empty;
        Scopes = scopes ?? ImmutableDictionary<string, ImmutableDictionary<string, string>>.Empty;
    }

    public string Origin { get; }

    public string SortKey { get; }

    public string? AbsolutePath { get; }

    public bool IsInline { get; }

    public ImmutableDictionary<string, string> Imports { get; }

    public ImmutableDictionary<string, ImmutableDictionary<string, string>> Scopes { get; }

    public static bool TryLoadFromFile(
        string absolutePath,
        string origin,
        out DenoImportMapDocument? document)
    {
        document = null;
        if (string.IsNullOrWhiteSpace(absolutePath))
        {
            return false;
        }

        try
        {
            using var stream = File.OpenRead(absolutePath);
            using var json = JsonDocument.Parse(stream, new JsonDocumentOptions
            {
                AllowTrailingCommas = true,
                CommentHandling = JsonCommentHandling.Skip,
            });

            var root = json.RootElement;
            var imports = ParseImportMap(root.TryGetProperty("imports", out var importsElement) ? importsElement : default);
            var scopes = ParseScopes(root.TryGetProperty("scopes", out var scopesElement) ? scopesElement : default);

            var sortKey = $"file::{DenoPathUtilities.NormalizeRelativePath(origin)}";
            document = new DenoImportMapDocument(
                origin,
                sortKey,
                Path.GetFullPath(absolutePath),
                isInline: false,
                imports,
                scopes);
            return true;
        }
        catch (IOException)
        {
            return false;
        }
        catch (JsonException)
        {
            return false;
        }
    }

    public static DenoImportMapDocument CreateInline(
        string origin,
        IDictionary<string, string> imports,
        IDictionary<string, IDictionary<string, string>> scopes)
    {
        var normalizedOrigin = string.IsNullOrWhiteSpace(origin) ? "inline" : origin;
        var importMap = imports is null
            ? ImmutableDictionary<string, string>.Empty
            : imports.ToImmutableDictionary(static pair => pair.Key, static pair => pair.Value, StringComparer.Ordinal);

        var scopeMap = scopes is null
            ? ImmutableDictionary<string, ImmutableDictionary<string, string>>.Empty
            : scopes.ToImmutableDictionary(
                static scope => scope.Key,
                static scope => scope.Value?.ToImmutableDictionary(StringComparer.Ordinal) ?? ImmutableDictionary<string, string>.Empty,
                StringComparer.Ordinal);

        var sortKey = $"inline::{normalizedOrigin}";
        return new DenoImportMapDocument(
            normalizedOrigin,
            sortKey,
            absolutePath: null,
            isInline: true,
            importMap,
            scopeMap);
    }

    private static ImmutableDictionary<string, string> ParseImportMap(JsonElement element)
    {
        if (element.ValueKind != JsonValueKind.Object)
        {
            return ImmutableDictionary<string, string>.Empty;
        }

        var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        foreach (var property in element.EnumerateObject())
        {
            if (property.Value.ValueKind == JsonValueKind.String)
            {
                var specifier = property.Name.Trim();
                var target = property.Value.GetString()?.Trim() ?? string.Empty;
                if (specifier.Length > 0)
                {
                    builder[specifier] = target;
                }
            }
        }

        return builder.ToImmutable();
    }

    private static ImmutableDictionary<string, ImmutableDictionary<string, string>> ParseScopes(JsonElement element)
    {
        if (element.ValueKind != JsonValueKind.Object)
        {
            return ImmutableDictionary<string, ImmutableDictionary<string, string>>.Empty;
        }

        var builder = ImmutableDictionary.CreateBuilder<string, ImmutableDictionary<string, string>>(StringComparer.Ordinal);
        foreach (var scope in element.EnumerateObject())
        {
            if (scope.Value.ValueKind != JsonValueKind.Object)
            {
                continue;
            }

            var map = ParseImportMap(scope.Value);
            if (!map.IsEmpty)
            {
                builder[scope.Name.Trim()] = map;
            }
        }

        return builder.ToImmutable();
    }
}
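
A minimal sketch of an import map of the shape `TryLoadFromFile` parses; scoped entries override bare specifiers for modules under the given prefix. The concrete URL and version are placeholders.

```csharp
var importMapJson = "{ \"imports\": { \"std/\": \"https://deno.land/std@0.224.0/\" }, " +
                    "\"scopes\": { \"./vendor/\": { \"std/\": \"./vendor/std/\" } } }";
File.WriteAllText("import_map.json", importMapJson);

if (DenoImportMapDocument.TryLoadFromFile("import_map.json", "import_map.json", out var map))
{
    Console.WriteLine(map!.Imports["std/"]);            // "https://deno.land/std@0.224.0/"
    Console.WriteLine(map.Scopes["./vendor/"]["std/"]); // "./vendor/std/"
}
```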
@@ -0,0 +1,43 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoLayerMetadata
{
    public static string? TryExtractDigest(string path)
    {
        if (string.IsNullOrWhiteSpace(path))
        {
            return null;
        }

        var segments = path
            .Split(new[] { Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar }, StringSplitOptions.RemoveEmptyEntries);

        foreach (var segment in segments)
        {
            var trimmed = segment.Trim();
            if (trimmed.Length == 0)
            {
                continue;
            }

            if (trimmed.StartsWith("sha256-", StringComparison.OrdinalIgnoreCase) && trimmed.Length > 7)
            {
                var digest = trimmed[7..];
                if (IsHex(digest))
                {
                    return digest.ToLowerInvariant();
                }
            }

            if (trimmed.Length == 64 && IsHex(trimmed))
            {
                return trimmed.ToLowerInvariant();
            }
        }

        return null;
    }

    private static bool IsHex(string value)
        => value.All(static c => c is >= '0' and <= '9' or >= 'a' and <= 'f' or >= 'A' and <= 'F');
}
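
A quick usage sketch covering both digest spellings the heuristic above recognizes; the layer paths are illustrative placeholders.

```csharp
var fromPrefix = DenoLayerMetadata.TryExtractDigest(
    "/layers/sha256-9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08/deno-dir");
var fromBareHex = DenoLayerMetadata.TryExtractDigest(
    "/layers/9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08/deno-dir");
// Both return the lowercase 64-char hex digest; paths without a digest segment return null.
```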
@@ -0,0 +1,7 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoLiteralFetchObservation(
    string FilePath,
    int Line,
    string Url,
    string ReasonCode);
@@ -0,0 +1,208 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed class DenoLockFile
{
    private DenoLockFile(
        string absolutePath,
        string relativePath,
        string version,
        ImmutableDictionary<string, string> remoteEntries,
        ImmutableDictionary<string, string> redirectEntries,
        ImmutableDictionary<string, DenoLockNpmPackage> npmPackages,
        ImmutableDictionary<string, string> npmSpecifiers)
    {
        AbsolutePath = Path.GetFullPath(absolutePath);
        RelativePath = DenoPathUtilities.NormalizeRelativePath(relativePath);
        Version = version;
        RemoteEntries = remoteEntries;
        Redirects = redirectEntries;
        NpmPackages = npmPackages;
        NpmSpecifiers = npmSpecifiers;
    }

    public string AbsolutePath { get; }

    public string RelativePath { get; }

    public string Version { get; }

    public ImmutableDictionary<string, string> RemoteEntries { get; }

    public ImmutableDictionary<string, string> Redirects { get; }

    public ImmutableDictionary<string, DenoLockNpmPackage> NpmPackages { get; }

    public ImmutableDictionary<string, string> NpmSpecifiers { get; }

    public static bool TryLoad(string absolutePath, string relativePath, out DenoLockFile? lockFile)
    {
        lockFile = null;
        if (string.IsNullOrWhiteSpace(absolutePath))
        {
            return false;
        }

        try
        {
            using var stream = File.OpenRead(absolutePath);
            using var json = JsonDocument.Parse(stream, new JsonDocumentOptions
            {
                AllowTrailingCommas = true,
                CommentHandling = JsonCommentHandling.Skip,
            });

            var root = json.RootElement;
            var version = root.TryGetProperty("version", out var versionElement) && versionElement.ValueKind == JsonValueKind.String
                ? versionElement.GetString() ?? "unknown"
                : "unknown";

            var remote = ParseRemoteEntries(root);
            var redirects = ParseRedirects(root);
            var (npmPackages, npmSpecifiers) = ParseNpmEntries(root);

            lockFile = new DenoLockFile(
                absolutePath,
                relativePath,
                version,
                remote,
                redirects,
                npmPackages,
                npmSpecifiers);
            return true;
        }
        catch (IOException)
        {
            return false;
        }
        catch (JsonException)
        {
            return false;
        }
    }

    private static ImmutableDictionary<string, string> ParseRemoteEntries(JsonElement root)
    {
        if (!root.TryGetProperty("remote", out var remoteElement) || remoteElement.ValueKind != JsonValueKind.Object)
        {
            return ImmutableDictionary<string, string>.Empty;
        }

        var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        foreach (var entry in remoteElement.EnumerateObject())
        {
            if (entry.Value.ValueKind == JsonValueKind.String)
            {
                builder[entry.Name] = entry.Value.GetString() ?? string.Empty;
            }
        }

        return builder.ToImmutable();
    }

    private static ImmutableDictionary<string, string> ParseRedirects(JsonElement root)
    {
        if (!root.TryGetProperty("redirects", out var redirectsElement) || redirectsElement.ValueKind != JsonValueKind.Object)
        {
            return ImmutableDictionary<string, string>.Empty;
        }

        var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        foreach (var entry in redirectsElement.EnumerateObject())
        {
            if (entry.Value.ValueKind == JsonValueKind.String)
            {
                builder[entry.Name] = entry.Value.GetString() ?? string.Empty;
            }
        }

        return builder.ToImmutable();
    }

    private static (ImmutableDictionary<string, DenoLockNpmPackage> Packages, ImmutableDictionary<string, string> Specifiers) ParseNpmEntries(JsonElement root)
    {
        if (!root.TryGetProperty("npm", out var npmElement) || npmElement.ValueKind != JsonValueKind.Object)
        {
            return (ImmutableDictionary<string, DenoLockNpmPackage>.Empty, ImmutableDictionary<string, string>.Empty);
        }

        var packages = ImmutableDictionary.CreateBuilder<string, DenoLockNpmPackage>(StringComparer.OrdinalIgnoreCase);
        if (npmElement.TryGetProperty("packages", out var packagesElement) && packagesElement.ValueKind == JsonValueKind.Object)
        {
            foreach (var packageEntry in packagesElement.EnumerateObject())
            {
                if (packageEntry.Value.ValueKind != JsonValueKind.Object)
                {
                    continue;
                }

                var package = DenoLockNpmPackage.Create(packageEntry.Name, packageEntry.Value);
                if (package is not null)
                {
                    packages[package.EntryId] = package;
                }
            }
        }

        var specifiers = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.Ordinal);
        if (npmElement.TryGetProperty("specifiers", out var specifiersElement) && specifiersElement.ValueKind == JsonValueKind.Object)
        {
            foreach (var specifier in specifiersElement.EnumerateObject())
            {
                if (specifier.Value.ValueKind == JsonValueKind.String)
                {
                    specifiers[specifier.Name] = specifier.Value.GetString() ?? string.Empty;
                }
            }
        }

        return (packages.ToImmutable(), specifiers.ToImmutable());
    }
}

internal sealed class DenoLockNpmPackage
{
    private DenoLockNpmPackage(
        string entryId,
        string integrity,
        ImmutableDictionary<string, string> dependencies)
    {
        EntryId = entryId;
        Integrity = integrity;
        Dependencies = dependencies;
    }

    public string EntryId { get; }

    public string Integrity { get; }

    public ImmutableDictionary<string, string> Dependencies { get; }

    public static DenoLockNpmPackage? Create(string entryId, JsonElement element)
    {
        if (string.IsNullOrWhiteSpace(entryId) || element.ValueKind != JsonValueKind.Object)
        {
            return null;
        }

        var integrity = element.TryGetProperty("integrity", out var integrityElement) && integrityElement.ValueKind == JsonValueKind.String
            ? integrityElement.GetString() ?? string.Empty
            : string.Empty;

        var dependencies = ImmutableDictionary<string, string>.Empty;
        if (element.TryGetProperty("dependencies", out var dependenciesElement) && dependenciesElement.ValueKind == JsonValueKind.Object)
        {
            var builder = ImmutableDictionary.CreateBuilder<string, string>(StringComparer.OrdinalIgnoreCase);
            foreach (var dependency in dependenciesElement.EnumerateObject())
            {
                if (dependency.Value.ValueKind == JsonValueKind.String)
                {
                    builder[dependency.Name] = dependency.Value.GetString() ?? string.Empty;
                }
            }

            dependencies = builder.ToImmutable();
        }

        return new DenoLockNpmPackage(entryId, integrity, dependencies);
    }
}
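
A minimal sketch of a trimmed deno.lock of the shape `TryLoad` parses. The field names ("remote", "redirects", "npm.packages", "npm.specifiers") match the properties read above; the URLs and hash values are placeholders.

```csharp
var lockJson = "{ \"version\": \"3\", " +
               "\"remote\": { \"https://deno.land/std@0.224.0/path/mod.ts\": \"sha256-<hash>\" }, " +
               "\"redirects\": { \"https://deno.land/std/path/mod.ts\": \"https://deno.land/std@0.224.0/path/mod.ts\" }, " +
               "\"npm\": { \"specifiers\": { \"npm:chalk@5\": \"chalk@5.3.0\" }, " +
               "\"packages\": { \"chalk@5.3.0\": { \"integrity\": \"sha512-<hash>\", \"dependencies\": {} } } } }";
File.WriteAllText("deno.lock", lockJson);

if (DenoLockFile.TryLoad("deno.lock", "deno.lock", out var lockFile))
{
    Console.WriteLine(lockFile!.NpmSpecifiers["npm:chalk@5"]);          // "chalk@5.3.0"
    Console.WriteLine(lockFile.NpmPackages["chalk@5.3.0"].Integrity);   // integrity kept for provenance evidence
}
```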
@@ -0,0 +1,9 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoModuleEdge(
    string FromId,
    string ToId,
    DenoImportKind ImportKind,
    string Specifier,
    string Provenance,
    string? Resolution);
@@ -0,0 +1,27 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed class DenoModuleGraph
{
    public DenoModuleGraph(
        IEnumerable<DenoModuleNode> nodes,
        IEnumerable<DenoModuleEdge> edges)
    {
        Nodes = nodes?
            .Where(static node => node is not null)
            .OrderBy(static node => node.Id, StringComparer.Ordinal)
            .ToImmutableArray()
            ?? ImmutableArray<DenoModuleNode>.Empty;

        Edges = edges?
            .Where(static edge => edge is not null)
            .OrderBy(static edge => edge.FromId, StringComparer.Ordinal)
            .ThenBy(static edge => edge.ToId, StringComparer.Ordinal)
            .ThenBy(static edge => edge.Specifier, StringComparer.Ordinal)
            .ToImmutableArray()
            ?? ImmutableArray<DenoModuleEdge>.Empty;
    }

    public ImmutableArray<DenoModuleNode> Nodes { get; }

    public ImmutableArray<DenoModuleEdge> Edges { get; }
}
@@ -0,0 +1,709 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoModuleGraphResolver
{
    public static DenoModuleGraph Resolve(DenoWorkspace workspace, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(workspace);
        var builder = new GraphBuilder(cancellationToken);
        builder.AddWorkspace(workspace);
        return builder.Build();
    }

    private sealed class GraphBuilder
    {
        private readonly CancellationToken _cancellationToken;
        private readonly Dictionary<string, DenoModuleNode> _nodes = new(StringComparer.Ordinal);
        private readonly List<DenoModuleEdge> _edges = new();

        public GraphBuilder(CancellationToken cancellationToken)
        {
            _cancellationToken = cancellationToken;
        }

        public DenoModuleGraph Build()
            => new(_nodes.Values, _edges);

        public void AddWorkspace(DenoWorkspace workspace)
        {
            AddConfigurations(workspace);
            AddImportMaps(workspace);
            AddLockFiles(workspace);
            AddVendors(workspace);
            AddCacheLocations(workspace);
        }

        private void AddConfigurations(DenoWorkspace workspace)
        {
            foreach (var config in workspace.Configurations)
            {
                _cancellationToken.ThrowIfCancellationRequested();
                var configNodeId = GetOrAddNode(
                    $"config::{config.RelativePath}",
                    config.RelativePath,
                    DenoModuleKind.WorkspaceConfig,
                    config.AbsolutePath,
                    layerDigest: null,
                    integrity: null,
                    metadata: new Dictionary<string, string?>()
                    {
                        ["vendor.enabled"] = config.VendorEnabled.ToString(CultureInfo.InvariantCulture),
                        ["lock.enabled"] = config.LockEnabled.ToString(CultureInfo.InvariantCulture),
                        ["nodeModules.enabled"] = config.NodeModulesDirEnabled.ToString(CultureInfo.InvariantCulture),
                    });

                if (config.ImportMapPath is not null)
                {
                    var importMapNodeId = GetOrAddNode(
                        $"import-map::{NormalizePath(config.ImportMapPath)}",
                        config.ImportMapPath,
                        DenoModuleKind.ImportMap,
                        config.ImportMapPath,
                        layerDigest: null,
                        integrity: null,
                        metadata: ImmutableDictionary<string, string?>.Empty);

                    AddEdge(
                        configNodeId,
                        importMapNodeId,
                        DenoImportKind.Static,
                        specifier: "importMap",
                        provenance: $"config:{config.RelativePath}",
                        resolution: config.ImportMapPath);
                }

                if (config.InlineImportMap is not null)
                {
                    var inlineNodeId = GetOrAddNode(
                        $"import-map::{config.RelativePath}#inline",
                        $"{config.RelativePath} (inline import map)",
                        DenoModuleKind.ImportMap,
                        config.RelativePath,
                        layerDigest: null,
                        integrity: null,
                        metadata: ImmutableDictionary<string, string?>.Empty);

                    AddEdge(
                        configNodeId,
                        inlineNodeId,
                        DenoImportKind.Static,
                        specifier: "importMap:inline",
                        provenance: $"config:{config.RelativePath}",
                        resolution: config.RelativePath);
                }

                if (config.LockFilePath is not null)
                {
                    var lockNodeId = GetOrAddNode(
                        $"lock::{NormalizePath(config.LockFilePath)}",
                        config.LockFilePath,
                        DenoModuleKind.LockFile,
                        config.LockFilePath,
                        layerDigest: null,
                        integrity: null,
                        metadata: ImmutableDictionary<string, string?>.Empty);

                    AddEdge(
                        configNodeId,
                        lockNodeId,
                        DenoImportKind.Static,
                        specifier: "lock",
                        provenance: $"config:{config.RelativePath}",
                        resolution: config.LockFilePath);
                }
            }
        }

        private void AddImportMaps(DenoWorkspace workspace)
        {
            foreach (var importMap in workspace.ImportMaps)
            {
                _cancellationToken.ThrowIfCancellationRequested();

                var importMapNodeId = GetOrAddNode(
                    $"import-map::{importMap.SortKey}",
                    importMap.Origin,
                    DenoModuleKind.ImportMap,
                    importMap.AbsolutePath,
                    layerDigest: null,
                    integrity: null,
                    metadata: ImmutableDictionary<string, string?>.Empty);

                foreach (var entry in importMap.Imports)
                {
                    var aliasNodeId = GetOrAddAliasNode(entry.Key, importMapNodeId);
                    var targetNodeId = GetOrAddTargetNode(entry.Value);
                    AddEdge(
                        aliasNodeId,
                        targetNodeId,
                        DetermineImportKind(entry.Value),
                        entry.Key,
                        provenance: $"import-map:{importMap.SortKey}",
                        resolution: entry.Value);
                }

                foreach (var scope in importMap.Scopes)
                {
                    foreach (var scopedEntry in scope.Value)
                    {
                        var aliasNodeId = GetOrAddAliasNode($"{scope.Key}:{scopedEntry.Key}", importMapNodeId);
                        var targetNodeId = GetOrAddTargetNode(scopedEntry.Value);
                        AddEdge(
                            aliasNodeId,
                            targetNodeId,
                            DetermineImportKind(scopedEntry.Value),
                            scopedEntry.Key,
                            provenance: $"import-map-scope:{scope.Key}",
                            resolution: scopedEntry.Value);
                    }
                }
            }
        }

        private void AddLockFiles(DenoWorkspace workspace)
        {
            foreach (var lockFile in workspace.LockFiles)
            {
                _cancellationToken.ThrowIfCancellationRequested();

                var lockNodeId = GetOrAddNode(
                    $"lock::{lockFile.RelativePath}",
                    lockFile.RelativePath,
                    DenoModuleKind.LockFile,
                    lockFile.AbsolutePath,
                    layerDigest: null,
                    integrity: null,
                    metadata: new Dictionary<string, string?>()
                    {
                        ["lock.version"] = lockFile.Version,
                    });

                foreach (var remote in lockFile.RemoteEntries)
                {
                    var aliasNodeId = GetOrAddAliasNode(remote.Key, lockNodeId);
                    var remoteNodeId = GetOrAddNode(
                        $"remote::{remote.Key}",
                        remote.Key,
                        DenoModuleKind.RemoteModule,
                        remote.Key,
                        layerDigest: null,
                        integrity: remote.Value,
                        metadata: ImmutableDictionary<string, string?>.Empty);

                    AddEdge(
                        aliasNodeId,
                        remoteNodeId,
                        DetermineImportKind(remote.Key),
                        remote.Key,
                        provenance: $"lock:remote:{lockFile.RelativePath}",
                        resolution: remote.Key);
                }

                foreach (var redirect in lockFile.Redirects)
                {
                    var fromNodeId = GetOrAddAliasNode(redirect.Key, lockNodeId);
                    var toNodeId = GetOrAddAliasNode(redirect.Value, lockNodeId);
                    AddEdge(
                        fromNodeId,
                        toNodeId,
                        DenoImportKind.Redirect,
                        redirect.Key,
                        provenance: $"lock:redirect:{lockFile.RelativePath}",
                        resolution: redirect.Value);
                }

                foreach (var npmSpecifier in lockFile.NpmSpecifiers)
                {
                    var aliasNodeId = GetOrAddNode(
                        $"npm-spec::{npmSpecifier.Key}",
                        npmSpecifier.Key,
                        DenoModuleKind.NpmSpecifier,
                        npmSpecifier.Key,
                        layerDigest: null,
                        integrity: null,
                        metadata: ImmutableDictionary<string, string?>.Empty);

                    var packageNodeId = GetOrAddNode(
                        $"npm::{npmSpecifier.Value}",
                        npmSpecifier.Value,
                        DenoModuleKind.NpmPackage,
                        npmSpecifier.Value,
                        layerDigest: null,
                        integrity: lockFile.NpmPackages.TryGetValue(npmSpecifier.Value, out var pkg) ? pkg.Integrity : null,
                        metadata: ImmutableDictionary<string, string?>.Empty);

                    AddEdge(
                        aliasNodeId,
                        packageNodeId,
                        DenoImportKind.NpmBridge,
                        npmSpecifier.Key,
                        provenance: $"lock:npm-spec:{lockFile.RelativePath}",
                        resolution: npmSpecifier.Value);
                }

                foreach (var package in lockFile.NpmPackages)
                {
                    var packageNodeId = GetOrAddNode(
|
||||
$"npm::{package.Key}",
|
||||
package.Key,
|
||||
DenoModuleKind.NpmPackage,
|
||||
package.Key,
|
||||
layerDigest: null,
|
||||
integrity: package.Value.Integrity,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
|
||||
foreach (var dependency in package.Value.Dependencies)
|
||||
{
|
||||
var dependencyNodeId = GetOrAddNode(
|
||||
$"npm::{dependency.Value}",
|
||||
dependency.Value,
|
||||
DenoModuleKind.NpmPackage,
|
||||
dependency.Value,
|
||||
layerDigest: null,
|
||||
integrity: lockFile.NpmPackages.TryGetValue(dependency.Value, out var depPkg) ? depPkg.Integrity : null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
|
||||
AddEdge(
|
||||
packageNodeId,
|
||||
dependencyNodeId,
|
||||
DenoImportKind.Dependency,
|
||||
dependency.Key,
|
||||
provenance: $"lock:npm-package:{lockFile.RelativePath}",
|
||||
resolution: dependency.Value);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void AddVendors(DenoWorkspace workspace)
|
||||
{
|
||||
foreach (var vendor in workspace.Vendors)
|
||||
{
|
||||
_cancellationToken.ThrowIfCancellationRequested();
|
||||
|
||||
var metadata = new Dictionary<string, string?>(StringComparer.Ordinal)
|
||||
{
|
||||
["vendor.alias"] = vendor.Alias,
|
||||
};
|
||||
|
||||
if (vendor.ImportMap is not null)
|
||||
{
|
||||
var vendorMapId = GetOrAddNode(
|
||||
$"vendor-map::{vendor.Alias}",
|
||||
$"{vendor.Alias} import map",
|
||||
DenoModuleKind.ImportMap,
|
||||
vendor.ImportMap.AbsolutePath,
|
||||
vendor.LayerDigest,
|
||||
integrity: null,
|
||||
metadata);
|
||||
|
||||
var vendorRootId = GetOrAddNode(
|
||||
$"vendor-root::{vendor.Alias}",
|
||||
vendor.RelativePath,
|
||||
DenoModuleKind.VendorModule,
|
||||
vendor.AbsolutePath,
|
||||
vendor.LayerDigest,
|
||||
integrity: null,
|
||||
metadata);
|
||||
|
||||
AddEdge(
|
||||
vendorRootId,
|
||||
vendorMapId,
|
||||
DenoImportKind.Cache,
|
||||
vendor.ImportMap.Origin,
|
||||
provenance: $"vendor:{vendor.Alias}",
|
||||
resolution: vendor.ImportMap.AbsolutePath);
|
||||
}
|
||||
|
||||
foreach (var file in SafeEnumerateFiles(vendor.AbsolutePath))
|
||||
{
|
||||
_cancellationToken.ThrowIfCancellationRequested();
|
||||
var relative = Path.GetRelativePath(vendor.AbsolutePath, file);
|
||||
var nodeId = GetOrAddNode(
|
||||
$"vendor::{vendor.Alias}/{NormalizePath(relative)}",
|
||||
$"{vendor.Alias}/{NormalizePath(relative)}",
|
||||
DenoModuleKind.VendorModule,
|
||||
file,
|
||||
vendor.LayerDigest,
|
||||
integrity: null,
|
||||
metadata);
|
||||
|
||||
if (TryResolveUrlFromVendorPath(vendor.RelativePath, relative, out var url) ||
|
||||
TryResolveUrlFromVendorPath(vendor.RelativePath, $"https/{relative}", out url))
|
||||
{
|
||||
var remoteNodeId = GetOrAddNode(
|
||||
$"remote::{url}",
|
||||
url,
|
||||
DenoModuleKind.RemoteModule,
|
||||
url,
|
||||
vendor.LayerDigest,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
|
||||
AddEdge(
|
||||
remoteNodeId,
|
||||
nodeId,
|
||||
DenoImportKind.Cache,
|
||||
url,
|
||||
provenance: $"vendor-cache:{vendor.Alias}",
|
||||
resolution: file);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private void AddCacheLocations(DenoWorkspace workspace)
|
||||
{
|
||||
foreach (var cache in workspace.CacheLocations)
|
||||
{
|
||||
_cancellationToken.ThrowIfCancellationRequested();
|
||||
|
||||
foreach (var file in SafeEnumerateFiles(cache.AbsolutePath))
|
||||
{
|
||||
_cancellationToken.ThrowIfCancellationRequested();
|
||||
|
||||
var relative = Path.GetRelativePath(cache.AbsolutePath, file);
|
||||
var nodeId = GetOrAddNode(
|
||||
$"cache::{cache.Alias}/{NormalizePath(relative)}",
|
||||
$"{cache.Alias}/{NormalizePath(relative)}",
|
||||
DenoModuleKind.CacheEntry,
|
||||
file,
|
||||
cache.LayerDigest,
|
||||
integrity: null,
|
||||
metadata: new Dictionary<string, string?>(StringComparer.Ordinal)
|
||||
{
|
||||
["cache.kind"] = cache.Kind.ToString(),
|
||||
});
|
||||
|
||||
if (TryResolveUrlFromCachePath(relative, out var url))
|
||||
{
|
||||
var remoteNodeId = GetOrAddNode(
|
||||
$"remote::{url}",
|
||||
url,
|
||||
DenoModuleKind.RemoteModule,
|
||||
url,
|
||||
cache.LayerDigest,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
|
||||
AddEdge(
|
||||
remoteNodeId,
|
||||
nodeId,
|
||||
DenoImportKind.Cache,
|
||||
url,
|
||||
provenance: $"cache:{cache.Kind}",
|
||||
resolution: file);
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
private string GetOrAddAliasNode(string specifier, string ownerNodeId)
|
||||
{
|
||||
var normalized = NormalizeSpecifier(specifier);
|
||||
if (_nodes.ContainsKey(normalized))
|
||||
{
|
||||
return normalized;
|
||||
}
|
||||
|
||||
var metadata = new Dictionary<string, string?>(StringComparer.Ordinal)
|
||||
{
|
||||
["owner"] = ownerNodeId,
|
||||
};
|
||||
|
||||
var kind = specifier.StartsWith("node:", StringComparison.OrdinalIgnoreCase) ||
|
||||
specifier.StartsWith("deno:", StringComparison.OrdinalIgnoreCase)
|
||||
? DenoModuleKind.BuiltInModule
|
||||
: DenoModuleKind.SpecifierAlias;
|
||||
|
||||
return GetOrAddNode(
|
||||
normalized,
|
||||
specifier,
|
||||
kind,
|
||||
specifier,
|
||||
layerDigest: null,
|
||||
integrity: null,
|
||||
metadata);
|
||||
}
|
||||
|
||||
private string GetOrAddTargetNode(string target)
|
||||
{
|
||||
if (string.IsNullOrWhiteSpace(target))
|
||||
{
|
||||
return GetOrAddNode(
|
||||
"unknown::(empty)",
|
||||
"(empty)",
|
||||
DenoModuleKind.Unknown,
|
||||
reference: null,
|
||||
layerDigest: null,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
}
|
||||
|
||||
if (target.StartsWith("npm:", StringComparison.OrdinalIgnoreCase))
|
||||
{
|
||||
return GetOrAddNode(
|
||||
$"npm::{target[4..]}",
|
||||
target,
|
||||
DenoModuleKind.NpmPackage,
|
||||
target,
|
||||
layerDigest: null,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
}
|
||||
|
||||
if (target.StartsWith("node:", StringComparison.OrdinalIgnoreCase) ||
|
||||
target.StartsWith("deno:", StringComparison.OrdinalIgnoreCase))
|
||||
{
|
||||
return GetOrAddNode(
|
||||
$"builtin::{target}",
|
||||
target,
|
||||
DenoModuleKind.BuiltInModule,
|
||||
target,
|
||||
layerDigest: null,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
}
|
||||
|
||||
if (target.StartsWith("http://", StringComparison.OrdinalIgnoreCase) ||
|
||||
target.StartsWith("https://", StringComparison.OrdinalIgnoreCase))
|
||||
{
|
||||
return GetOrAddNode(
|
||||
$"remote::{target}",
|
||||
target,
|
||||
DenoModuleKind.RemoteModule,
|
||||
target,
|
||||
layerDigest: null,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
}
|
||||
|
||||
if (target.StartsWith("./", StringComparison.Ordinal) || target.StartsWith("../", StringComparison.Ordinal) || target.StartsWith("/", StringComparison.Ordinal))
|
||||
{
|
||||
return GetOrAddNode(
|
||||
$"workspace::{NormalizePath(target)}",
|
||||
target,
|
||||
DenoModuleKind.WorkspaceModule,
|
||||
target,
|
||||
layerDigest: null,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
}
|
||||
|
||||
return GetOrAddNode(
|
||||
$"alias::{NormalizeSpecifier(target)}",
|
||||
target,
|
||||
DenoModuleKind.SpecifierAlias,
|
||||
target,
|
||||
layerDigest: null,
|
||||
integrity: null,
|
||||
metadata: ImmutableDictionary<string, string?>.Empty);
|
||||
}
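
        // Target classification, for reference (each branch above mints a distinct id prefix):
        //   "npm:preact@10.19.2"             -> npm::preact@10.19.2                  (NpmPackage)
        //   "node:fs", "deno:env"            -> builtin::node:fs                     (BuiltInModule)
        //   "https://deno.land/x/oak/mod.ts" -> remote::https://deno.land/x/oak/mod.ts (RemoteModule)
        //   "./src/mod.ts"                   -> workspace::src/mod.ts                (WorkspaceModule)
        //   anything else                    -> alias::<specifier>                   (SpecifierAlias)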

        private string GetOrAddNode(
            string id,
            string? displayName,
            DenoModuleKind kind,
            string? reference,
            string? layerDigest,
            string? integrity,
            IReadOnlyDictionary<string, string?> metadata)
        {
            displayName ??= "(unknown)";
            metadata ??= ImmutableDictionary<string, string?>.Empty;

            if (_nodes.TryGetValue(id, out var existing))
            {
                var updated = existing;

                if (string.IsNullOrEmpty(existing.Reference) && !string.IsNullOrEmpty(reference))
                {
                    updated = updated with { Reference = reference };
                }

                if (string.IsNullOrEmpty(existing.LayerDigest) && !string.IsNullOrEmpty(layerDigest))
                {
                    updated = updated with { LayerDigest = layerDigest };
                }

                if (string.IsNullOrEmpty(existing.Integrity) && !string.IsNullOrEmpty(integrity))
                {
                    updated = updated with { Integrity = integrity };
                }

                if (metadata.Count > 0)
                {
                    var combined = new Dictionary<string, string?>(existing.Metadata, StringComparer.Ordinal);
                    foreach (var pair in metadata)
                    {
                        combined[pair.Key] = pair.Value;
                    }

                    updated = updated with { Metadata = combined };
                }

                if (!ReferenceEquals(updated, existing))
                {
                    _nodes[id] = updated;
                }

                return updated.Id;
            }

            _nodes[id] = new DenoModuleNode(
                id,
                displayName,
                kind,
                reference,
                layerDigest,
                integrity,
                metadata);

            return id;
        }

        private void AddEdge(
            string from,
            string to,
            DenoImportKind kind,
            string? specifier,
            string provenance,
            string? resolution)
        {
            specifier ??= "(unknown)";
            if (string.Equals(from, to, StringComparison.Ordinal))
            {
                return;
            }

            _edges.Add(new DenoModuleEdge(
                from,
                to,
                kind,
                specifier,
                provenance,
                resolution));
        }

        private static DenoImportKind DetermineImportKind(string target)
        {
            if (string.IsNullOrWhiteSpace(target))
            {
                return DenoImportKind.Unknown;
            }

            var lower = target.ToLowerInvariant();
            if (lower.EndsWith(".json", StringComparison.Ordinal))
            {
                return DenoImportKind.JsonAssertion;
            }

            if (lower.EndsWith(".wasm", StringComparison.Ordinal))
            {
                return DenoImportKind.WasmAssertion;
            }

            if (lower.StartsWith("node:", StringComparison.Ordinal) ||
                lower.StartsWith("deno:", StringComparison.Ordinal))
            {
                return DenoImportKind.BuiltIn;
            }

            return DenoImportKind.Static;
        }

        private static IEnumerable<string> SafeEnumerateFiles(string root)
        {
            if (string.IsNullOrWhiteSpace(root) || !Directory.Exists(root))
            {
                yield break;
            }

            IEnumerable<string> iterator;
            try
            {
                iterator = Directory.EnumerateFiles(root, "*", new EnumerationOptions
                {
                    RecurseSubdirectories = true,
                    IgnoreInaccessible = true,
                    AttributesToSkip = FileAttributes.ReparsePoint,
                    ReturnSpecialDirectories = false,
                });
            }
            catch (IOException)
            {
                yield break;
            }
            catch (UnauthorizedAccessException)
            {
                yield break;
            }

            foreach (var file in iterator)
            {
                yield return file;
            }
        }

        private static string NormalizeSpecifier(string value)
            => $"alias::{(string.IsNullOrWhiteSpace(value) ? "(empty)" : value.Trim())}";

        private static string NormalizePath(string value)
            => string.IsNullOrWhiteSpace(value)
                ? string.Empty
                : value.Replace('\\', '/').TrimStart('/');

        private static bool TryResolveUrlFromVendorPath(string vendorRelativePath, string fileRelativePath, out string? url)
        {
            var combined = Path.Combine(vendorRelativePath ?? string.Empty, fileRelativePath ?? string.Empty);
            var normalized = NormalizePath(combined);
            var segments = normalized.Split('/', StringSplitOptions.RemoveEmptyEntries);

            var schemeIndex = Array.FindIndex(segments, static segment => segment is "http" or "https");
            if (schemeIndex < 0 || schemeIndex + 1 >= segments.Length)
            {
                url = null;
                return false;
            }

            var scheme = segments[schemeIndex];
            var host = segments[schemeIndex + 1];
            var path = string.Join('/', segments[(schemeIndex + 2)..]);
            url = path.Length > 0
                ? $"{scheme}://{host}/{path}"
                : $"{scheme}://{host}";
            return true;
        }
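
        // Worked example (illustrative): a vendored file "deno.land/std@0.208.0/path/mod.ts"
        // has no "http"/"https" segment, so the first probe in AddVendors fails; the second
        // probe prefixes "https/", yielding segments [vendor, https, deno.land, std@0.208.0,
        // path, mod.ts] and the reconstructed URL "https://deno.land/std@0.208.0/path/mod.ts".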

        private static bool TryResolveUrlFromCachePath(string relativePath, out string? url)
        {
            var normalized = NormalizePath(relativePath);
            var segments = normalized.Split('/', StringSplitOptions.RemoveEmptyEntries);

            var depsIndex = Array.FindIndex(segments, static segment => segment.Equals("deps", StringComparison.OrdinalIgnoreCase));
            if (depsIndex < 0 || depsIndex + 2 >= segments.Length)
            {
                url = null;
                return false;
            }

            var scheme = segments[depsIndex + 1];
            var host = segments[depsIndex + 2];
            var remainingStart = depsIndex + 3;
            if (remainingStart > segments.Length)
            {
                url = null;
                return false;
            }

            var path = remainingStart < segments.Length
                ? string.Join('/', segments[remainingStart..])
                : string.Empty;

            url = path.Length > 0
                ? $"{scheme}://{host}/{path}"
                : $"{scheme}://{host}";
            return true;
        }
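
        // Worked example (illustrative): a DENO_DIR entry "deps/https/deno.land/<hash>" maps
        // back to "https://deno.land/<hash>" — the final segment is Deno's hashed file name,
        // so the reconstructed URL identifies the origin, not the original module path.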
    }
}
@@ -0,0 +1,17 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal enum DenoModuleKind
{
    WorkspaceConfig,
    ImportMap,
    SpecifierAlias,
    WorkspaceModule,
    VendorModule,
    RemoteModule,
    CacheEntry,
    NpmSpecifier,
    NpmPackage,
    BuiltInModule,
    LockFile,
    Unknown,
}
@@ -0,0 +1,10 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoModuleNode(
    string Id,
    string DisplayName,
    DenoModuleKind Kind,
    string? Reference,
    string? LayerDigest,
    string? Integrity,
    IReadOnlyDictionary<string, string?> Metadata);
@@ -0,0 +1,673 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoNpmCompatibilityAdapter
{
    private static readonly string[] ConditionPriority =
    {
        "deno",
        "import",
        "module",
        "browser",
        "worker",
        "default"
    };

    private static readonly Regex DynamicImportRegex = new(@"import\s*\(\s*['""](?<url>https?://[^'""]+)['""]", RegexOptions.IgnoreCase | RegexOptions.Compiled);
    private static readonly Regex LiteralFetchRegex = new(@"fetch\s*\(\s*['""](?<url>https?://[^'""]+)['""]", RegexOptions.IgnoreCase | RegexOptions.Compiled);
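
    // The literal scans above only catch fully literal, absolute http(s) URLs, e.g.:
    //   await import("https://deno.land/x/oak/mod.ts")  -> DynamicImportRegex
    //   await fetch("https://api.example.com/items")    -> LiteralFetchRegex
    // Template literals and computed specifiers are not matched.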

    private static readonly HashSet<string> SourceFileExtensions = new(StringComparer.OrdinalIgnoreCase)
    {
        ".ts", ".tsx", ".mts", ".cts", ".js", ".jsx", ".mjs", ".cjs"
    };

    private static readonly (string Prefix, (DenoCapabilityType Capability, string ReasonCode)[] Entries)[] BuiltinCapabilitySignatures =
    {
        ("node:fs", new[] { (DenoCapabilityType.FileSystem, "builtin.node.fs") }),
        ("deno:fs", new[] { (DenoCapabilityType.FileSystem, "builtin.deno.fs") }),
        ("node:net", new[] { (DenoCapabilityType.Network, "builtin.node.net") }),
        ("node:http", new[] { (DenoCapabilityType.Network, "builtin.node.http") }),
        ("node:https", new[] { (DenoCapabilityType.Network, "builtin.node.https") }),
        ("deno:net", new[] { (DenoCapabilityType.Network, "builtin.deno.net") }),
        ("node:process", new[]
        {
            (DenoCapabilityType.Process, "builtin.node.process"),
            (DenoCapabilityType.Environment, "builtin.node.env")
        }),
        ("deno:env", new[] { (DenoCapabilityType.Environment, "builtin.deno.env") }),
        ("deno:permissions", new[] { (DenoCapabilityType.Environment, "builtin.deno.permissions") }),
        ("node:crypto", new[] { (DenoCapabilityType.Crypto, "builtin.node.crypto") }),
        ("deno:crypto", new[] { (DenoCapabilityType.Crypto, "builtin.deno.crypto") }),
        ("deno:ffi", new[] { (DenoCapabilityType.Ffi, "builtin.deno.ffi") }),
        ("node:worker_threads", new[] { (DenoCapabilityType.Worker, "builtin.node.worker_threads") }),
        ("deno:worker", new[] { (DenoCapabilityType.Worker, "builtin.deno.worker") }),
        ("deno:shared_worker", new[] { (DenoCapabilityType.Worker, "builtin.deno.shared_worker") })
    };

    public static DenoCompatibilityAnalysis Analyze(
        DenoWorkspace workspace,
        DenoModuleGraph graph,
        CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(workspace);
        ArgumentNullException.ThrowIfNull(graph);

        var builtins = CollectBuiltins(graph);
        var npmResolutions = ResolveNpmPackages(workspace, graph, cancellationToken);
        var (dynamicImports, literalFetches) = AnalyzeSourceFiles(workspace, cancellationToken);
        var capabilityRecords = BuildCapabilities(builtins, graph, dynamicImports, literalFetches);

        return new DenoCompatibilityAnalysis(
            builtins,
            npmResolutions,
            capabilityRecords,
            dynamicImports,
            literalFetches);
    }
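
    // Call shape (sketch): given a DenoWorkspace from DenoWorkspaceNormalizer and a
    // DenoModuleGraph from GraphBuilder.Build(), the adapter is invoked as
    //   var analysis = DenoNpmCompatibilityAdapter.Analyze(workspace, graph, ct);
    // and returns builtin usages, npm resolutions, capability records, and the
    // dynamic-import/fetch observations, in that constructor order.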

    private static ImmutableArray<DenoBuiltinUsage> CollectBuiltins(DenoModuleGraph graph)
    {
        var builder = ImmutableArray.CreateBuilder<DenoBuiltinUsage>();

        foreach (var edge in graph.Edges)
        {
            if (edge.Specifier.StartsWith("node:", StringComparison.OrdinalIgnoreCase) ||
                edge.Specifier.StartsWith("deno:", StringComparison.OrdinalIgnoreCase))
            {
                builder.Add(new DenoBuiltinUsage(edge.Specifier, edge.FromId, edge.Provenance));
            }
        }

        return builder.ToImmutable();
    }

    private static ImmutableArray<DenoNpmResolution> ResolveNpmPackages(
        DenoWorkspace workspace,
        DenoModuleGraph graph,
        CancellationToken cancellationToken)
    {
        var specToPackage = BuildSpecifierMap(workspace);
        var packageInfos = BuildPackageInfos(specToPackage.Values.Distinct(), workspace.CacheLocations);

        try
        {
            var builder = ImmutableArray.CreateBuilder<DenoNpmResolution>();

            foreach (var edge in graph.Edges)
            {
                cancellationToken.ThrowIfCancellationRequested();

                if (!edge.Specifier.StartsWith("npm:", StringComparison.OrdinalIgnoreCase))
                {
                    continue;
                }

                if (!specToPackage.TryGetValue(edge.Specifier, out var packageKey))
                {
                    if (!TryParseNpmSpecifier(edge.Specifier, out var parsedName, out _, out _, out _))
                    {
                        continue;
                    }

                    packageKey = packageInfos.Keys.FirstOrDefault(key =>
                        string.Equals(key.Name, parsedName, StringComparison.OrdinalIgnoreCase));

                    if (packageKey.Name is null)
                    {
                        continue;
                    }
                }

                if (!packageInfos.TryGetValue(packageKey, out var packageInfo))
                {
                    continue;
                }

                if (!TryParseNpmSpecifier(edge.Specifier, out _, out _, out var subpath, out _))
                {
                    subpath = string.Empty;
                }

                var (target, condition) = packageInfo.ResolveExport(subpath);
                string? resolvedPath = null;
                var exists = false;

                if (!string.IsNullOrWhiteSpace(target) && !string.IsNullOrWhiteSpace(packageInfo.RootPath))
                {
                    var normalizedTarget = NormalizePath(target!);
                    resolvedPath = Path.GetFullPath(Path.Combine(packageInfo.RootPath!, normalizedTarget));
                    exists = File.Exists(resolvedPath) || Directory.Exists(resolvedPath);
                }

                builder.Add(new DenoNpmResolution(
                    edge.Specifier,
                    packageInfo.Name,
                    packageInfo.Version,
                    edge.FromId,
                    condition,
                    target,
                    resolvedPath,
                    exists));
            }

            return builder.ToImmutable();
        }
        finally
        {
            foreach (var info in packageInfos.Values)
            {
                info.Dispose();
            }
        }
    }

    private static Dictionary<string, NpmPackageKey> BuildSpecifierMap(DenoWorkspace workspace)
    {
        var map = new Dictionary<string, NpmPackageKey>(StringComparer.OrdinalIgnoreCase);

        foreach (var lockFile in workspace.LockFiles)
        {
            foreach (var entry in lockFile.NpmSpecifiers)
            {
                if (!TryParseLockSpecifier(entry.Value, out var name, out var version))
                {
                    continue;
                }

                map[entry.Key] = new NpmPackageKey(name, version);
            }
        }

        return map;
    }

    private static Dictionary<NpmPackageKey, NpmPackageInfo> BuildPackageInfos(
        IEnumerable<NpmPackageKey> keys,
        IEnumerable<DenoCacheLocation> caches)
    {
        var result = new Dictionary<NpmPackageKey, NpmPackageInfo>();

        foreach (var key in keys)
        {
            var root = TryResolvePackageRoot(caches, key.Name, key.Version);
            result[key] = new NpmPackageInfo(key.Name, key.Version, root);
        }

        return result;
    }

    private static string? TryResolvePackageRoot(IEnumerable<DenoCacheLocation> caches, string packageName, string version)
    {
        foreach (var cache in caches)
        {
            if (cache.Kind is not (DenoCacheLocationKind.Env or DenoCacheLocationKind.Workspace))
            {
                continue;
            }

            var registryRoot = Path.Combine(cache.AbsolutePath, "npm", "registry.npmjs.org");
            var packagePath = CombineRegistryPath(registryRoot, packageName, version);

            if (packagePath is not null && Directory.Exists(packagePath))
            {
                return packagePath;
            }
        }

        return null;
    }

    private static string? CombineRegistryPath(string registryRoot, string packageName, string version)
    {
        var segments = packageName.Split('/', StringSplitOptions.RemoveEmptyEntries);
        if (segments.Length == 0)
        {
            return null;
        }

        var current = registryRoot;
        foreach (var segment in segments)
        {
            current = Path.Combine(current, segment);
        }

        return Path.Combine(current, version);
    }
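
    // Cache layout this resolves against (Deno's npm cache; registry.npmjs.org is the
    // assumed host here):
    //   <DENO_DIR>/npm/registry.npmjs.org/preact/10.19.2/package.json
    //   <DENO_DIR>/npm/registry.npmjs.org/@std/path/1.0.0/package.json
    // Scoped names split into nested directories before the version segment is appended.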

    private static bool TryParseNpmSpecifier(
        string specifier,
        out string packageName,
        out string requestedVersion,
        out string subpath,
        out string raw)
    {
        packageName = string.Empty;
        requestedVersion = string.Empty;
        subpath = string.Empty;
        raw = specifier;

        if (string.IsNullOrWhiteSpace(specifier))
        {
            return false;
        }

        var trimmed = specifier.StartsWith("npm:", StringComparison.OrdinalIgnoreCase)
            ? specifier[4..]
            : specifier;

        var lastAt = trimmed.LastIndexOf('@');
        if (lastAt <= 0)
        {
            return false;
        }

        packageName = trimmed[..lastAt];
        var remainder = trimmed[(lastAt + 1)..];

        var slashIndex = remainder.IndexOf('/');
        if (slashIndex >= 0)
        {
            requestedVersion = remainder[..slashIndex];
            subpath = remainder[(slashIndex + 1)..];
        }
        else
        {
            requestedVersion = remainder;
            subpath = string.Empty;
        }

        return packageName.Length > 0 && requestedVersion.Length > 0;
    }
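
    // Parsing examples (the last '@' splits name from version; the first '/' after it
    // starts the subpath):
    //   "npm:preact@10.19.2"        -> name "preact",    version "10.19.2", subpath ""
    //   "npm:@std/path@1.0.0/posix" -> name "@std/path", version "1.0.0",   subpath "posix"
    // Bare specifiers without a version ("npm:preact") fail the parse and return false.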

    private static bool TryParseLockSpecifier(string value, out string name, out string version)
    {
        name = string.Empty;
        version = string.Empty;

        if (string.IsNullOrWhiteSpace(value))
        {
            return false;
        }

        var lastAt = value.LastIndexOf('@');
        if (lastAt <= 0)
        {
            return false;
        }

        name = value[..lastAt];
        version = value[(lastAt + 1)..];
        return name.Length > 0 && version.Length > 0;
    }

    private static (ImmutableArray<DenoDynamicImportObservation> DynamicImports, ImmutableArray<DenoLiteralFetchObservation> LiteralFetches) AnalyzeSourceFiles(
        DenoWorkspace workspace,
        CancellationToken cancellationToken)
    {
        var dynamicBuilder = ImmutableArray.CreateBuilder<DenoDynamicImportObservation>();
        var fetchBuilder = ImmutableArray.CreateBuilder<DenoLiteralFetchObservation>();

        foreach (var file in workspace.FileSystem.Files)
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (file.Source is not DenoVirtualFileSource.Workspace)
            {
                continue;
            }

            if (!SourceFileExtensions.Contains(Path.GetExtension(file.AbsolutePath)))
            {
                continue;
            }

            if (!File.Exists(file.AbsolutePath))
            {
                continue;
            }

            var lineNumber = 0;
            using var reader = new StreamReader(file.AbsolutePath);
            string? line;
            while ((line = reader.ReadLine()) is not null)
            {
                lineNumber++;
                cancellationToken.ThrowIfCancellationRequested();

                foreach (Match match in DynamicImportRegex.Matches(line))
                {
                    var specifier = match.Groups["url"].Value;
                    if (string.IsNullOrWhiteSpace(specifier))
                    {
                        continue;
                    }

                    dynamicBuilder.Add(new DenoDynamicImportObservation(
                        file.AbsolutePath,
                        lineNumber,
                        specifier,
                        "network.dynamic_import.literal"));
                }

                foreach (Match match in LiteralFetchRegex.Matches(line))
                {
                    var url = match.Groups["url"].Value;
                    if (string.IsNullOrWhiteSpace(url))
                    {
                        continue;
                    }

                    fetchBuilder.Add(new DenoLiteralFetchObservation(
                        file.AbsolutePath,
                        lineNumber,
                        url,
                        "network.fetch.literal"));
                }
            }
        }

        return (dynamicBuilder.ToImmutable(), fetchBuilder.ToImmutable());
    }

    private static ImmutableArray<DenoCapabilityRecord> BuildCapabilities(
        ImmutableArray<DenoBuiltinUsage> builtinUsages,
        DenoModuleGraph graph,
        ImmutableArray<DenoDynamicImportObservation> dynamicImports,
        ImmutableArray<DenoLiteralFetchObservation> literalFetches)
    {
        var capabilityMap = new Dictionary<(DenoCapabilityType Capability, string Reason), HashSet<string>>();

        void AddCapability(DenoCapabilityType capability, string reasonCode, string source)
        {
            if (string.IsNullOrWhiteSpace(source))
            {
                source = "(unknown)";
            }

            var key = (capability, reasonCode);
            if (!capabilityMap.TryGetValue(key, out var set))
            {
                set = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
                capabilityMap[key] = set;
            }

            set.Add(source);
        }

        foreach (var usage in builtinUsages)
        {
            foreach (var entry in ResolveCapabilityEntries(usage.Specifier))
            {
                AddCapability(entry.Capability, entry.ReasonCode, usage.Specifier);
            }
        }

        var remoteSpecifiers = graph.Edges
            .Where(edge => edge.Specifier.StartsWith("http://", StringComparison.OrdinalIgnoreCase) ||
                           edge.Specifier.StartsWith("https://", StringComparison.OrdinalIgnoreCase))
            .Select(edge => edge.Specifier)
            .Distinct(StringComparer.OrdinalIgnoreCase);

        foreach (var specifier in remoteSpecifiers)
        {
            AddCapability(DenoCapabilityType.Network, "network.remote_module_import", specifier);
        }

        foreach (var observation in dynamicImports)
        {
            AddCapability(DenoCapabilityType.Network, observation.ReasonCode, observation.Specifier);
        }

        foreach (var observation in literalFetches)
        {
            AddCapability(DenoCapabilityType.Network, observation.ReasonCode, observation.Url);
        }

        return capabilityMap
            .OrderBy(entry => entry.Key.Capability)
            .ThenBy(entry => entry.Key.Reason, StringComparer.Ordinal)
            .Select(entry => new DenoCapabilityRecord(
                entry.Key.Capability,
                entry.Key.Reason,
                entry.Value
                    .OrderBy(source => source, StringComparer.OrdinalIgnoreCase)
                    .ToImmutableArray()))
            .ToImmutableArray();
    }

    private static IEnumerable<(DenoCapabilityType Capability, string ReasonCode)> ResolveCapabilityEntries(string specifier)
    {
        if (string.IsNullOrWhiteSpace(specifier))
        {
            yield break;
        }

        foreach (var signature in BuiltinCapabilitySignatures)
        {
            if (!specifier.StartsWith(signature.Prefix, StringComparison.OrdinalIgnoreCase))
            {
                continue;
            }

            foreach (var entry in signature.Entries)
            {
                yield return entry;
            }
        }
    }

    private static string NormalizePath(string value)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            return string.Empty;
        }

        var trimmed = value.Replace('\\', '/');
        return trimmed.StartsWith("./", StringComparison.Ordinal) || trimmed.StartsWith("/", StringComparison.Ordinal)
            ? trimmed.TrimStart('.')
            : $"./{trimmed}";
    }

    private readonly record struct NpmPackageKey(string Name, string Version);

    private sealed class NpmPackageInfo : IDisposable
    {
        private readonly JsonDocument? _manifest;

        public NpmPackageInfo(string name, string version, string? rootPath)
        {
            Name = name;
            Version = version;
            RootPath = rootPath;

            if (!string.IsNullOrWhiteSpace(rootPath))
            {
                var manifestPath = Path.Combine(rootPath!, "package.json");
                if (File.Exists(manifestPath))
                {
                    PackageJsonPath = manifestPath;
                    _manifest = JsonDocument.Parse(File.ReadAllBytes(manifestPath));
                }
            }
        }

        public string Name { get; }

        public string Version { get; }

        public string? RootPath { get; }

        public string? PackageJsonPath { get; }

        public (string? Target, string? Condition) ResolveExport(string subpath)
        {
            if (_manifest is null)
            {
                return (null, null);
            }

            var root = _manifest.RootElement;
            var normalizedKey = NormalizeExportKey(subpath);

            if (root.TryGetProperty("exports", out var exports))
            {
                var (target, condition) = ResolveExportsEntry(exports, normalizedKey);
                if (target is not null)
                {
                    return (target, condition);
                }
            }

            if (string.IsNullOrEmpty(subpath))
            {
                if (root.TryGetProperty("module", out var moduleElement) && moduleElement.ValueKind == JsonValueKind.String)
                {
                    return (moduleElement.GetString(), "module");
                }

                if (root.TryGetProperty("main", out var mainElement) && mainElement.ValueKind == JsonValueKind.String)
                {
                    return (mainElement.GetString(), "main");
                }
            }

            return (null, null);
        }

        public void Dispose()
        {
            _manifest?.Dispose();
        }

        private static string NormalizeExportKey(string subpath)
        {
            if (string.IsNullOrWhiteSpace(subpath))
            {
                return ".";
            }

            if (subpath.StartsWith("./", StringComparison.Ordinal))
            {
                return subpath;
            }

            if (subpath.StartsWith("/", StringComparison.Ordinal))
            {
                return $".{subpath}";
            }

            return $"./{subpath}";
        }
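
        // Resolution sketch for a manifest such as:
        //   { "exports": { ".": { "import": "./dist/index.mjs" }, "./sub": "./dist/sub.js" } }
        //   ResolveExport("")    -> ("./dist/index.mjs", "import")  via the "." entry
        //   ResolveExport("sub") -> ("./dist/sub.js", null)         via the "./sub" entry
        // Conditions are tried in ConditionPriority order, so "deno" wins over "import" when both exist.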

        private static (string? Target, string? Condition) ResolveExportsEntry(JsonElement exports, string lookupKey)
        {
            return exports.ValueKind switch
            {
                JsonValueKind.String => (exports.GetString(), null),
                JsonValueKind.Object => ResolveObject(exports, lookupKey),
                JsonValueKind.Array => ResolveArray(exports, lookupKey),
                _ => (null, null),
            };
        }

        private static (string? Target, string? Condition) ResolveObject(JsonElement element, string lookupKey)
        {
            if (element.TryGetProperty(lookupKey, out var entry))
            {
                var result = ResolveEntry(entry);
                if (result.Target is not null)
                {
                    return result;
                }
            }

            if (lookupKey != "." && element.TryGetProperty(".", out var rootEntry))
            {
                var result = ResolveEntry(rootEntry);
                if (result.Target is not null)
                {
                    return result;
                }
            }

            foreach (var property in element.EnumerateObject())
            {
                if (!property.Name.Contains('*', StringComparison.Ordinal))
                {
                    continue;
                }

                var pattern = property.Name;
                var starIndex = pattern.IndexOf('*');
                var prefix = pattern[..starIndex];
                var suffix = pattern[(starIndex + 1)..];

                if (lookupKey.StartsWith(prefix, StringComparison.Ordinal) &&
                    lookupKey.EndsWith(suffix, StringComparison.Ordinal) &&
                    lookupKey.Length >= prefix.Length + suffix.Length)
                {
                    var matched = lookupKey[prefix.Length..(lookupKey.Length - suffix.Length)];
                    var (target, condition) = ResolveEntry(property.Value);
                    if (target is not null)
                    {
                        var substituted = target.Replace("*", matched, StringComparison.Ordinal);
                        return (substituted, condition);
                    }
                }
            }

            return (null, null);
        }

        private static (string? Target, string? Condition) ResolveArray(JsonElement element, string lookupKey)
        {
            foreach (var item in element.EnumerateArray())
            {
                var result = ResolveExportsEntry(item, lookupKey);
                if (result.Target is not null)
                {
                    return result;
                }
            }

            return (null, null);
        }

        private static (string? Target, string? Condition) ResolveEntry(JsonElement element)
        {
            return element.ValueKind switch
            {
                JsonValueKind.String => (element.GetString(), null),
                JsonValueKind.Array => ResolveArray(element, "."),
                JsonValueKind.Object => ResolveConditionalObject(element),
                _ => (null, null),
            };
        }

        private static (string? Target, string? Condition) ResolveConditionalObject(JsonElement element)
        {
            foreach (var condition in ConditionPriority)
            {
                if (element.TryGetProperty(condition, out var conditionalValue))
                {
                    var result = ResolveEntry(conditionalValue);
                    if (result.Target is not null)
                    {
                        return (result.Target, result.Condition ?? condition);
                    }
                }
            }

            foreach (var property in element.EnumerateObject())
            {
                var result = ResolveEntry(property.Value);
                if (result.Target is not null)
                {
                    return (result.Target, result.Condition ?? property.Name);
                }
            }

            return (null, null);
        }
    }
}
@@ -0,0 +1,11 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed record DenoNpmResolution(
    string Specifier,
    string PackageName,
    string PackageVersion,
    string SourceNodeId,
    string? Condition,
    string? Target,
    string? ResolvedPath,
    bool ExistsOnDisk);
@@ -0,0 +1,81 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoPathUtilities
{
    public static string NormalizeRelativePath(string path)
    {
        if (string.IsNullOrWhiteSpace(path))
        {
            return string.Empty;
        }

        var normalized = path.Replace('\\', '/');
        while (normalized.Contains("//", StringComparison.Ordinal))
        {
            normalized = normalized.Replace("//", "/", StringComparison.Ordinal);
        }

        return normalized.Trim('/');
    }

    public static string ResolvePath(string root, string candidate)
    {
        var value = ExpandHome(candidate);
        if (string.IsNullOrWhiteSpace(value))
        {
            return string.Empty;
        }

        if (Path.IsPathFullyQualified(value))
        {
            return Path.GetFullPath(value);
        }

        var combined = Path.Combine(root, value);
        return Path.GetFullPath(combined);
    }

    public static string CreateAlias(string absolutePath, string? fallback = null)
    {
        var directory = absolutePath.TrimEnd(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);
        var name = Path.GetFileName(directory);
        if (string.IsNullOrWhiteSpace(name))
        {
            name = fallback ?? "root";
        }

        var hashBytes = SHA256.HashData(Encoding.UTF8.GetBytes(directory));
        var shortHash = Convert.ToHexString(hashBytes.AsSpan(0, 6)).ToLowerInvariant();
        return $"{name}-{shortHash}";
    }
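
    // Example: CreateAlias("/app/vendor") hashes the full path with SHA-256 and yields a
    // stable alias shaped like "vendor-9f3b12a4c0de" (directory name plus the first six
    // hash bytes as lowercase hex), so distinct directories with the same name stay distinct.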

    private static string ExpandHome(string value)
    {
        if (string.IsNullOrWhiteSpace(value))
        {
            return string.Empty;
        }

        if (value[0] != '~')
        {
            return value;
        }

        var home = Environment.GetEnvironmentVariable("HOME");
        if (string.IsNullOrWhiteSpace(home))
        {
            home = Environment.GetFolderPath(Environment.SpecialFolder.UserProfile);
        }

        if (string.IsNullOrWhiteSpace(home))
        {
            return value; // Unable to expand; return original.
        }

        var remainder = value.Length > 1 && (value[1] == '/' || value[1] == '\\')
            ? value[2..]
            : value[1..];

        return Path.Combine(home, remainder);
    }
}
@@ -0,0 +1,47 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed class DenoVendorDirectory
{
    public DenoVendorDirectory(
        string absolutePath,
        string relativePath,
        string alias,
        string? layerDigest,
        DenoImportMapDocument? importMap,
        DenoLockFile? lockFile)
    {
        if (string.IsNullOrWhiteSpace(absolutePath))
        {
            throw new ArgumentException("Path is required", nameof(absolutePath));
        }

        if (string.IsNullOrWhiteSpace(relativePath))
        {
            throw new ArgumentException("Relative path is required", nameof(relativePath));
        }

        if (string.IsNullOrWhiteSpace(alias))
        {
            throw new ArgumentException("Alias is required", nameof(alias));
        }

        AbsolutePath = Path.GetFullPath(absolutePath);
        RelativePath = DenoPathUtilities.NormalizeRelativePath(relativePath);
        Alias = alias;
        LayerDigest = layerDigest;
        ImportMap = importMap;
        LockFile = lockFile;
    }

    public string AbsolutePath { get; }

    public string RelativePath { get; }

    public string Alias { get; }

    public string? LayerDigest { get; }

    public DenoImportMapDocument? ImportMap { get; }

    public DenoLockFile? LockFile { get; }
}
@@ -0,0 +1,289 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal enum DenoVirtualFileSource
{
    Workspace,
    ImportMap,
    LockFile,
    Vendor,
    DenoDir,
    Layer,
}

internal sealed record DenoVirtualFile(
    string VirtualPath,
    string AbsolutePath,
    DenoVirtualFileSource Source,
    string? LayerDigest,
    long Length,
    DateTimeOffset? LastWriteTimeUtc);

internal sealed class DenoVirtualFileSystem
{
    private readonly ImmutableArray<DenoVirtualFile> _files;
    private readonly ImmutableDictionary<string, ImmutableArray<DenoVirtualFile>> _filesByPath;

    private DenoVirtualFileSystem(IEnumerable<DenoVirtualFile> files)
    {
        var ordered = files
            .Where(static file => file is not null)
            .OrderBy(static file => file.VirtualPath, StringComparer.Ordinal)
            .ThenBy(static file => file.Source)
            .ThenBy(static file => file.AbsolutePath, StringComparer.Ordinal)
            .ToImmutableArray();

        _files = ordered;
        _filesByPath = ordered
            .GroupBy(static file => file.VirtualPath, StringComparer.Ordinal)
            .ToImmutableDictionary(
                static group => group.Key,
                static group => group.ToImmutableArray(),
                StringComparer.Ordinal);
    }

    public ImmutableArray<DenoVirtualFile> Files => _files;

    public IEnumerable<DenoVirtualFile> EnumerateBySource(DenoVirtualFileSource source)
        => _files.Where(file => file.Source == source);

    // "Latest" means the first entry under the deterministic ordering above (virtual path,
    // then source, then absolute path), not the most recently written file.
    public bool TryGetLatest(string virtualPath, out DenoVirtualFile? file)
    {
        if (_filesByPath.TryGetValue(virtualPath, out var items) && items.Length > 0)
        {
            file = items[0];
            return true;
        }

        file = null;
        return false;
    }

    public static DenoVirtualFileSystem Build(
        LanguageAnalyzerContext context,
        IEnumerable<DenoConfigDocument> configs,
        IEnumerable<DenoImportMapDocument> importMaps,
        IEnumerable<DenoLockFile> lockFiles,
        IEnumerable<DenoVendorDirectory> vendors,
        IEnumerable<DenoCacheLocation> cacheLocations,
        CancellationToken cancellationToken)
    {
        var files = new List<DenoVirtualFile>();
        AddConfigFiles(context, configs, files, cancellationToken);
        AddImportMaps(importMaps, files, cancellationToken);
        AddLockFiles(lockFiles, files, cancellationToken);
        AddVendorFiles(vendors, files, cancellationToken);
        AddCacheFiles(cacheLocations, files, cancellationToken);

        return new DenoVirtualFileSystem(files);
    }

    private static void AddConfigFiles(
        LanguageAnalyzerContext context,
        IEnumerable<DenoConfigDocument> configs,
        ICollection<DenoVirtualFile> files,
        CancellationToken cancellationToken)
    {
        foreach (var config in configs ?? Array.Empty<DenoConfigDocument>())
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (File.Exists(config.AbsolutePath))
            {
                files.Add(CreateVirtualFile(
                    config.AbsolutePath,
                    context.GetRelativePath(config.AbsolutePath),
                    DenoVirtualFileSource.Workspace,
                    layerDigest: DenoLayerMetadata.TryExtractDigest(config.AbsolutePath)));
            }

            if (!string.IsNullOrWhiteSpace(config.ImportMapPath) && File.Exists(config.ImportMapPath))
            {
                files.Add(CreateVirtualFile(
                    config.ImportMapPath!,
                    context.GetRelativePath(config.ImportMapPath!),
                    DenoVirtualFileSource.ImportMap,
                    layerDigest: DenoLayerMetadata.TryExtractDigest(config.ImportMapPath!)));
            }

            if (config.LockEnabled && !string.IsNullOrWhiteSpace(config.LockFilePath) && File.Exists(config.LockFilePath))
            {
                files.Add(CreateVirtualFile(
                    config.LockFilePath!,
                    context.GetRelativePath(config.LockFilePath!),
                    DenoVirtualFileSource.LockFile,
                    layerDigest: DenoLayerMetadata.TryExtractDigest(config.LockFilePath!)));
            }
        }
    }

    private static void AddImportMaps(IEnumerable<DenoImportMapDocument> maps, ICollection<DenoVirtualFile> files, CancellationToken cancellationToken)
    {
        foreach (var map in maps ?? Array.Empty<DenoImportMapDocument>())
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (map.IsInline || string.IsNullOrWhiteSpace(map.AbsolutePath) || !File.Exists(map.AbsolutePath))
            {
                continue;
            }

            files.Add(CreateVirtualFile(
                map.AbsolutePath,
                map.Origin,
                DenoVirtualFileSource.ImportMap,
                layerDigest: DenoLayerMetadata.TryExtractDigest(map.AbsolutePath)));
        }
    }

    private static void AddLockFiles(IEnumerable<DenoLockFile> lockFiles, ICollection<DenoVirtualFile> files, CancellationToken cancellationToken)
    {
        foreach (var lockFile in lockFiles ?? Array.Empty<DenoLockFile>())
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (!File.Exists(lockFile.AbsolutePath))
            {
                continue;
            }

            files.Add(CreateVirtualFile(
                lockFile.AbsolutePath,
                lockFile.RelativePath,
                DenoVirtualFileSource.LockFile,
                layerDigest: DenoLayerMetadata.TryExtractDigest(lockFile.AbsolutePath)));
        }
    }

    private static void AddVendorFiles(IEnumerable<DenoVendorDirectory> vendors, ICollection<DenoVirtualFile> files, CancellationToken cancellationToken)
    {
        foreach (var vendor in vendors ?? Array.Empty<DenoVendorDirectory>())
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (!Directory.Exists(vendor.AbsolutePath))
            {
                continue;
            }

            foreach (var file in SafeEnumerateFiles(vendor.AbsolutePath))
            {
                cancellationToken.ThrowIfCancellationRequested();
                files.Add(CreateVirtualFile(
                    file,
                    $"vendor://{vendor.Alias}/{DenoPathUtilities.NormalizeRelativePath(Path.GetRelativePath(vendor.AbsolutePath, file))}",
                    DenoVirtualFileSource.Vendor,
                    vendor.LayerDigest ?? DenoLayerMetadata.TryExtractDigest(file)));
            }

            if (vendor.ImportMap is { AbsolutePath: not null } importMapFile && File.Exists(importMapFile.AbsolutePath))
            {
                files.Add(CreateVirtualFile(
                    importMapFile.AbsolutePath,
                    $"vendor://{vendor.Alias}/import_map.json",
                    DenoVirtualFileSource.ImportMap,
                    vendor.LayerDigest ?? DenoLayerMetadata.TryExtractDigest(importMapFile.AbsolutePath)));
            }

            if (vendor.LockFile is { AbsolutePath: not null } vendorLock && File.Exists(vendorLock.AbsolutePath))
            {
                files.Add(CreateVirtualFile(
                    vendorLock.AbsolutePath,
                    $"vendor://{vendor.Alias}/deno.lock",
                    DenoVirtualFileSource.LockFile,
                    vendor.LayerDigest ?? DenoLayerMetadata.TryExtractDigest(vendorLock.AbsolutePath)));
            }
        }
    }

    private static void AddCacheFiles(IEnumerable<DenoCacheLocation> cacheLocations, ICollection<DenoVirtualFile> files, CancellationToken cancellationToken)
    {
        foreach (var cache in cacheLocations ?? Array.Empty<DenoCacheLocation>())
        {
            cancellationToken.ThrowIfCancellationRequested();
            if (!Directory.Exists(cache.AbsolutePath))
            {
                continue;
            }

            foreach (var file in SafeEnumerateFiles(cache.AbsolutePath))
            {
                cancellationToken.ThrowIfCancellationRequested();
                files.Add(CreateVirtualFile(
                    file,
                    $"deno-dir://{cache.Alias}/{DenoPathUtilities.NormalizeRelativePath(Path.GetRelativePath(cache.AbsolutePath, file))}",
                    cache.Kind == DenoCacheLocationKind.Layer ? DenoVirtualFileSource.Layer : DenoVirtualFileSource.DenoDir,
                    cache.LayerDigest ?? DenoLayerMetadata.TryExtractDigest(file)));
            }
        }
    }

    private static IEnumerable<string> SafeEnumerateFiles(string root)
    {
        IEnumerable<string> iterator;
        try
        {
            iterator = Directory.EnumerateFiles(root, "*", new EnumerationOptions
            {
                RecurseSubdirectories = true,
                IgnoreInaccessible = true,
                AttributesToSkip = FileAttributes.ReparsePoint,
                ReturnSpecialDirectories = false,
            });
        }
        catch (IOException)
        {
            yield break;
        }
        catch (UnauthorizedAccessException)
        {
            yield break;
        }

        foreach (var file in iterator)
        {
            yield return file;
        }
    }

    private static DenoVirtualFile CreateVirtualFile(
        string absolutePath,
        string virtualPath,
        DenoVirtualFileSource source,
        string? layerDigest)
    {
        var info = new FileInfo(absolutePath);
        var lastWrite = info.Exists
            ? new DateTimeOffset(DateTime.SpecifyKind(info.LastWriteTimeUtc, DateTimeKind.Utc))
            : (DateTimeOffset?)null;

        return new DenoVirtualFile(
            VirtualPath: NormalizeVirtualPath(source, virtualPath),
            AbsolutePath: info.FullName,
            Source: source,
            LayerDigest: layerDigest,
            Length: info.Exists ? info.Length : 0,
            LastWriteTimeUtc: lastWrite);
    }

    private static string NormalizeVirtualPath(DenoVirtualFileSource source, string relativePath)
    {
        var normalized = DenoPathUtilities.NormalizeRelativePath(relativePath);
        if (normalized.Contains("://", StringComparison.Ordinal))
        {
            return normalized;
        }

        var prefix = source switch
        {
            DenoVirtualFileSource.Workspace => "workspace",
            DenoVirtualFileSource.ImportMap => "import-map",
            DenoVirtualFileSource.LockFile => "lock",
            DenoVirtualFileSource.Vendor => "vendor",
            DenoVirtualFileSource.DenoDir => "deno-dir",
            DenoVirtualFileSource.Layer => "layer",
            _ => "unknown"
        };

        return $"{prefix}://{normalized}";
    }
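
    // Example virtual paths produced here (the scheme encodes provenance; aliases are illustrative):
    //   workspace://src/deps.ts
    //   vendor://vendor-9f3b12a4c0de/deno.land/std@0.208.0/path/mod.ts
    //   deno-dir://deno-a1b2c3d4e5f6/deps/https/deno.land/<hash>
    // Paths already containing "://" (e.g. those built by AddVendorFiles) pass through unchanged.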
}
@@ -0,0 +1,59 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal sealed class DenoWorkspace
{
    public DenoWorkspace(
        IEnumerable<DenoConfigDocument> configs,
        IEnumerable<DenoImportMapDocument> importMaps,
        IEnumerable<DenoLockFile> lockFiles,
        IEnumerable<DenoVendorDirectory> vendors,
        IEnumerable<DenoCacheLocation> cacheLocations,
        DenoVirtualFileSystem fileSystem)
    {
        ArgumentNullException.ThrowIfNull(configs);
        ArgumentNullException.ThrowIfNull(importMaps);
        ArgumentNullException.ThrowIfNull(lockFiles);
        ArgumentNullException.ThrowIfNull(vendors);
        ArgumentNullException.ThrowIfNull(cacheLocations);
        ArgumentNullException.ThrowIfNull(fileSystem);

        Configurations = configs
            .Where(static config => config is not null)
            .OrderBy(static config => config.RelativePath, StringComparer.Ordinal)
            .ToImmutableArray();

        ImportMaps = importMaps
            .Where(static map => map is not null)
            .OrderBy(static map => map.SortKey, StringComparer.Ordinal)
            .ToImmutableArray();

        LockFiles = lockFiles
            .Where(static file => file is not null)
            .OrderBy(static file => file.RelativePath, StringComparer.Ordinal)
            .ToImmutableArray();

        Vendors = vendors
            .Where(static vendor => vendor is not null)
            .OrderBy(static vendor => vendor.RelativePath, StringComparer.Ordinal)
            .ToImmutableArray();

        CacheLocations = cacheLocations
            .Where(static cache => cache is not null)
            .OrderBy(static cache => cache.Alias, StringComparer.Ordinal)
            .ToImmutableArray();

        FileSystem = fileSystem;
    }

    public ImmutableArray<DenoConfigDocument> Configurations { get; }

    public ImmutableArray<DenoImportMapDocument> ImportMaps { get; }

    public ImmutableArray<DenoLockFile> LockFiles { get; }

    public ImmutableArray<DenoVendorDirectory> Vendors { get; }

    public ImmutableArray<DenoCacheLocation> CacheLocations { get; }

    public DenoVirtualFileSystem FileSystem { get; }
}
@@ -0,0 +1,444 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

internal static class DenoWorkspaceNormalizer
{
    private static readonly EnumerationOptions RecursiveEnumeration = new()
    {
        RecurseSubdirectories = true,
        IgnoreInaccessible = true,
        ReturnSpecialDirectories = false,
        AttributesToSkip = FileAttributes.ReparsePoint,
    };

    private static readonly string[] ConfigPatterns = { "deno.json", "deno.jsonc" };
    private static readonly string[] VendorDirectoryNames = { "vendor" };
    private static readonly string[] DefaultDenoDirCandidates =
    {
        ".deno",
        ".cache/deno",
        "deno-dir",
        "deno_dir",
        "deno",
        "var/cache/deno",
        "usr/local/share/deno",
        "opt/deno",
    };

    private static readonly string[] LayerRootCandidates = { "layers", ".layers", "layer" };
    private static readonly string[] LayerDenoDirNames = { ".deno", ".cache/deno", "deno-dir", "deno_dir", "deno" };

    public static ValueTask<DenoWorkspace> NormalizeAsync(LanguageAnalyzerContext context, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);
        cancellationToken.ThrowIfCancellationRequested();

        var configs = DiscoverConfigs(context, cancellationToken);
        var importMaps = CollectImportMaps(context, configs, cancellationToken);
        var lockFiles = CollectLockFiles(context, configs, cancellationToken);
        var vendors = DiscoverVendors(context, configs, cancellationToken);
        var cacheLocations = DiscoverCacheLocations(context, vendors, cancellationToken);

        var combinedImportMaps = importMaps
            .Concat(vendors.Select(v => v.ImportMap).Where(static map => map is not null)!.Cast<DenoImportMapDocument>())
            .Where(static map => map is not null)
            .GroupBy(static map => (map!.AbsolutePath ?? map.SortKey), StringComparer.OrdinalIgnoreCase)
            .Select(static group => group.First()!)
            .ToImmutableArray();

        var combinedLockFiles = lockFiles
            .Concat(vendors.Select(v => v.LockFile).Where(static file => file is not null)!.Cast<DenoLockFile>())
            .GroupBy(static file => file.AbsolutePath, StringComparer.OrdinalIgnoreCase)
            .Select(static group => group.First())
            .ToImmutableArray();

        var fileSystem = DenoVirtualFileSystem.Build(
            context,
            configs,
            combinedImportMaps,
            combinedLockFiles,
            vendors,
            cacheLocations,
            cancellationToken);

        var workspace = new DenoWorkspace(
            configs,
            combinedImportMaps,
            combinedLockFiles,
            vendors,
            cacheLocations,
            fileSystem);

        return ValueTask.FromResult(workspace);
    }

    private static ImmutableArray<DenoConfigDocument> DiscoverConfigs(LanguageAnalyzerContext context, CancellationToken cancellationToken)
    {
        var results = new List<DenoConfigDocument>();
        foreach (var pattern in ConfigPatterns)
        {
            foreach (var path in SafeEnumerateFiles(context.RootPath, pattern))
            {
                cancellationToken.ThrowIfCancellationRequested();
                var relative = context.GetRelativePath(path);
                if (DenoConfigDocument.TryLoad(path, relative, cancellationToken, out var config) && config is not null)
                {
                    results.Add(config);
                }
            }
        }

        return results
            .OrderBy(static config => config.RelativePath, StringComparer.Ordinal)
            .ToImmutableArray();
    }

    private static ImmutableArray<DenoImportMapDocument> CollectImportMaps(
        LanguageAnalyzerContext context,
        IEnumerable<DenoConfigDocument> configs,
        CancellationToken cancellationToken)
    {
        var builder = ImmutableArray.CreateBuilder<DenoImportMapDocument>();
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

        foreach (var config in configs ?? Array.Empty<DenoConfigDocument>())
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (config.InlineImportMap is not null && seen.Add($"inline::{config.RelativePath}"))
            {
                builder.Add(config.InlineImportMap);
            }

            if (!string.IsNullOrWhiteSpace(config.ImportMapPath) &&
                File.Exists(config.ImportMapPath) &&
                seen.Add(Path.GetFullPath(config.ImportMapPath!)))
            {
                var origin = context.GetRelativePath(config.ImportMapPath!);
                if (DenoImportMapDocument.TryLoadFromFile(config.ImportMapPath!, origin, out var document) && document is not null)
                {
                    builder.Add(document);
                }
            }
        }

        return builder.ToImmutable();
    }

    private static ImmutableArray<DenoLockFile> CollectLockFiles(
        LanguageAnalyzerContext context,
        IEnumerable<DenoConfigDocument> configs,
        CancellationToken cancellationToken)
    {
        var builder = ImmutableArray.CreateBuilder<DenoLockFile>();
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

        foreach (var config in configs ?? Array.Empty<DenoConfigDocument>())
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (!config.LockEnabled || string.IsNullOrWhiteSpace(config.LockFilePath))
            {
                continue;
            }

            var absolute = Path.GetFullPath(config.LockFilePath);
            if (!File.Exists(absolute) || !seen.Add(absolute))
            {
                continue;
            }

            if (DenoLockFile.TryLoad(absolute, context.GetRelativePath(absolute), out var lockFile) && lockFile is not null)
            {
                builder.Add(lockFile);
            }
        }

        foreach (var path in SafeEnumerateFiles(context.RootPath, "deno.lock"))
        {
            cancellationToken.ThrowIfCancellationRequested();
            var absolute = Path.GetFullPath(path);
            if (!seen.Add(absolute))
            {
                continue;
            }

            if (DenoLockFile.TryLoad(absolute, context.GetRelativePath(absolute), out var lockFile) && lockFile is not null)
            {
                builder.Add(lockFile);
            }
        }

        return builder.ToImmutable();
    }

    private static ImmutableArray<DenoVendorDirectory> DiscoverVendors(
        LanguageAnalyzerContext context,
        IEnumerable<DenoConfigDocument> configs,
        CancellationToken cancellationToken)
    {
        var builder = ImmutableArray.CreateBuilder<DenoVendorDirectory>();
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

        void TryAddVendor(string path)
        {
            if (string.IsNullOrWhiteSpace(path))
            {
                return;
            }

            var absolute = Path.GetFullPath(path);
            if (!Directory.Exists(absolute) || !seen.Add(absolute))
            {
                return;
            }

            cancellationToken.ThrowIfCancellationRequested();

            var relative = context.GetRelativePath(absolute);
            var alias = DenoPathUtilities.CreateAlias(absolute, "vendor");
            var layerDigest = DenoLayerMetadata.TryExtractDigest(absolute);

            DenoImportMapDocument? importMap = null;
            var importMapPath = Path.Combine(absolute, "import_map.json");
            if (File.Exists(importMapPath))
            {
                DenoImportMapDocument.TryLoadFromFile(importMapPath, $"vendor://{alias}/import_map.json", out importMap);
            }

            DenoLockFile? lockFile = null;
            var lockPath = Path.Combine(absolute, "deno.lock");
            if (File.Exists(lockPath))
            {
                DenoLockFile.TryLoad(lockPath, $"vendor://{alias}/deno.lock", out lockFile);
            }

            builder.Add(new DenoVendorDirectory(
                absolute,
                relative,
                alias,
                layerDigest,
                importMap,
                lockFile));
        }

        foreach (var config in configs ?? Array.Empty<DenoConfigDocument>())
        {
            if (config.VendorEnabled && !string.IsNullOrWhiteSpace(config.VendorDirectoryPath))
            {
                TryAddVendor(config.VendorDirectoryPath!);
            }
        }

        foreach (var name in VendorDirectoryNames)
        {
            foreach (var path in SafeEnumerateDirectories(context.RootPath, name))
            {
                TryAddVendor(path);
            }
        }

        foreach (var (layerRoot, _) in EnumerateLayerRoots(context.RootPath))
        {
            foreach (var name in VendorDirectoryNames)
            {
                var candidate = Path.Combine(layerRoot, name);
                TryAddVendor(candidate);
            }
        }

        return builder.ToImmutable();
    }

    private static ImmutableArray<DenoCacheLocation> DiscoverCacheLocations(
        LanguageAnalyzerContext context,
        IEnumerable<DenoVendorDirectory> vendors,
        CancellationToken cancellationToken)
    {
        var builder = ImmutableArray.CreateBuilder<DenoCacheLocation>();
        var seen = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

        void TryAdd(string path, DenoCacheLocationKind kind, string? layerDigest = null)
        {
            if (string.IsNullOrWhiteSpace(path))
            {
                return;
            }

            var absolute = Path.GetFullPath(path);
            if (!Directory.Exists(absolute) || !IsLikelyDenoDir(absolute) || !seen.Add(absolute))
            {
                return;
            }

            cancellationToken.ThrowIfCancellationRequested();

            var alias = DenoPathUtilities.CreateAlias(absolute, "deno");
            builder.Add(new DenoCacheLocation(
                absolute,
                alias,
                kind,
                layerDigest ?? DenoLayerMetadata.TryExtractDigest(absolute)));
        }

        var envDir = Environment.GetEnvironmentVariable("DENO_DIR");
        if (!string.IsNullOrWhiteSpace(envDir))
        {
            TryAdd(DenoPathUtilities.ResolvePath(context.RootPath, envDir), DenoCacheLocationKind.Env);
        }

        foreach (var candidate in DefaultDenoDirCandidates)
        {
            TryAdd(Path.Combine(context.RootPath, candidate), DenoCacheLocationKind.Workspace);
        }

        DiscoverHomeDirectories(Path.Combine(context.RootPath, "home"), TryAdd);
        DiscoverHomeDirectories(Path.Combine(context.RootPath, "Users"), TryAdd);

        foreach (var vendor in vendors ?? Array.Empty<DenoVendorDirectory>())
        {
            var sibling = Path.Combine(Path.GetDirectoryName(vendor.AbsolutePath) ?? context.RootPath, ".deno");
            TryAdd(sibling, vendor.LayerDigest is null ? DenoCacheLocationKind.Workspace : DenoCacheLocationKind.Layer, vendor.LayerDigest);
        }

        foreach (var (layerRoot, digest) in EnumerateLayerRoots(context.RootPath))
        {
            foreach (var name in LayerDenoDirNames)
            {
                var candidate = Path.Combine(layerRoot, name);
                TryAdd(candidate, DenoCacheLocationKind.Layer, digest);
            }
        }

        return builder
            .OrderBy(static cache => cache.Alias, StringComparer.Ordinal)
            .ToImmutableArray();
    }

    private static IEnumerable<string> SafeEnumerateFiles(string root, string pattern)
    {
        if (!Directory.Exists(root))
        {
            yield break;
        }

        IEnumerable<string> iterator;
        try
        {
            iterator = Directory.EnumerateFiles(root, pattern, RecursiveEnumeration);
        }
        catch (IOException)
        {
            yield break;
        }
        catch (UnauthorizedAccessException)
        {
            yield break;
        }

        foreach (var path in iterator)
        {
            yield return path;
        }
    }

    private static IEnumerable<string> SafeEnumerateDirectories(string root, string pattern)
    {
        if (!Directory.Exists(root))
        {
            yield break;
        }

        IEnumerable<string> iterator;
        try
        {
            iterator = Directory.EnumerateDirectories(root, pattern, RecursiveEnumeration);
        }
        catch (IOException)
        {
            yield break;
        }
        catch (UnauthorizedAccessException)
        {
            yield break;
        }

        foreach (var path in iterator)
        {
            yield return path;
        }
    }

    private static IEnumerable<(string RootPath, string? Digest)> EnumerateLayerRoots(string workspaceRoot)
    {
        foreach (var candidate in LayerRootCandidates)
        {
            var root = Path.Combine(workspaceRoot, candidate);
            if (!Directory.Exists(root))
            {
                continue;
            }

            IEnumerable<string> iterator;
            try
            {
                iterator = Directory.EnumerateDirectories(root);
            }
            catch (IOException)
            {
                continue;
            }
            catch (UnauthorizedAccessException)
            {
                continue;
            }

            foreach (var layerDirectory in iterator)
            {
                var digest = DenoLayerMetadata.TryExtractDigest(layerDirectory);
                var fsDirectory = Path.Combine(layerDirectory, "fs");
                yield return Directory.Exists(fsDirectory)
                    ? (fsDirectory, digest)
                    : (layerDirectory, digest);
            }
        }
    }

    private static void DiscoverHomeDirectories(string homeRoot, Action<string, DenoCacheLocationKind, string?> add)
    {
        if (!Directory.Exists(homeRoot))
        {
            return;
        }

        IEnumerable<string> iterator;
        try
        {
            iterator = Directory.EnumerateDirectories(homeRoot);
        }
        catch (IOException)
        {
            return;
        }
        catch (UnauthorizedAccessException)
        {
            return;
        }

        foreach (var home in iterator)
        {
            var cache = Path.Combine(home, ".cache", "deno");
            add(cache, DenoCacheLocationKind.Home, DenoLayerMetadata.TryExtractDigest(cache));

            var dotDeno = Path.Combine(home, ".deno");
            add(dotDeno, DenoCacheLocationKind.Home, DenoLayerMetadata.TryExtractDigest(dotDeno));
        }
    }

    private static bool IsLikelyDenoDir(string path)
    {
        var deps = Path.Combine(path, "deps");
        var gen = Path.Combine(path, "gen");
        var npm = Path.Combine(path, "npm");
        return Directory.Exists(deps) || Directory.Exists(gen) || Directory.Exists(npm);
    }
}
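// Usage sketch (annotation, not part of the commit): how the normalizer is expected to be
// driven from the analyzer entry point; `context` and `cancellationToken` stand in for the
// values the host passes to ILanguageAnalyzer.AnalyzeAsync.
//
//   var workspace = await DenoWorkspaceNormalizer.NormalizeAsync(context, cancellationToken);
//   // Every collection on the workspace is pre-sorted with StringComparer.Ordinal,
//   // so downstream SBOM emission stays deterministic across hosts and runs.
//   foreach (var lockFile in workspace.LockFiles)
//   {
//       // inspect pinned dependencies here
//   }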
@@ -0,0 +1,73 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal.Observations;

internal static class DenoObservationBuilder
{
    public static DenoObservationDocument Build(
        DenoModuleGraph moduleGraph,
        DenoCompatibilityAnalysis compatibility,
        ImmutableArray<DenoBundleObservation> bundles)
    {
        ArgumentNullException.ThrowIfNull(moduleGraph);
        ArgumentNullException.ThrowIfNull(compatibility);

        var entrypoints = ExtractEntrypoints(moduleGraph, bundles);
        var moduleSpecifiers = ExtractModuleSpecifiers(moduleGraph);
        var bundleSummaries = bundles
            .Select(bundle => new DenoObservationBundleSummary(
                bundle.SourcePath,
                bundle.BundleType,
                bundle.Entrypoint,
                bundle.Modules.Length,
                bundle.Resources.Length))
            .OrderBy(summary => summary.SourcePath, StringComparer.Ordinal)
            .ToImmutableArray();

        return new DenoObservationDocument(
            entrypoints,
            moduleSpecifiers,
            compatibility.Capabilities,
            compatibility.DynamicImports,
            compatibility.LiteralFetches,
            bundleSummaries);
    }

    private static ImmutableArray<string> ExtractEntrypoints(
        DenoModuleGraph moduleGraph,
        ImmutableArray<DenoBundleObservation> bundles)
    {
        var entrypoints = new HashSet<string>(StringComparer.Ordinal);

        foreach (var node in moduleGraph.Nodes)
        {
            if (node.Kind == DenoModuleKind.WorkspaceConfig &&
                node.Metadata.TryGetValue("entry", out var entry) &&
                !string.IsNullOrWhiteSpace(entry))
            {
                entrypoints.Add(entry!);
            }
        }

        foreach (var bundle in bundles)
        {
            if (!string.IsNullOrWhiteSpace(bundle.Entrypoint))
            {
                entrypoints.Add(bundle.Entrypoint!);
            }
        }

        return entrypoints
            .OrderBy(value => value, StringComparer.Ordinal)
            .ToImmutableArray();
    }

    private static ImmutableArray<string> ExtractModuleSpecifiers(DenoModuleGraph moduleGraph)
    {
        return moduleGraph.Nodes
            .Where(node => node.Kind == DenoModuleKind.RemoteModule || node.Kind == DenoModuleKind.WorkspaceModule)
            .Select(node => node.DisplayName)
            .Where(name => !string.IsNullOrWhiteSpace(name))
            .Distinct(StringComparer.Ordinal)
            .OrderBy(name => name, StringComparer.Ordinal)
            .ToImmutableArray()!;
    }
}
@@ -0,0 +1,8 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal.Observations;

internal sealed record DenoObservationBundleSummary(
    string SourcePath,
    string BundleType,
    string? Entrypoint,
    int ModuleCount,
    int ResourceCount);
@@ -0,0 +1,9 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal.Observations;

internal sealed record DenoObservationDocument(
    ImmutableArray<string> Entrypoints,
    ImmutableArray<string> ModuleSpecifiers,
    ImmutableArray<DenoCapabilityRecord> Capabilities,
    ImmutableArray<DenoDynamicImportObservation> DynamicImports,
    ImmutableArray<DenoLiteralFetchObservation> LiteralFetches,
    ImmutableArray<DenoObservationBundleSummary> Bundles);
@@ -0,0 +1,109 @@
namespace StellaOps.Scanner.Analyzers.Lang.Deno.Internal.Observations;

internal static class DenoObservationSerializer
{
    private static readonly JsonSerializerOptions SerializerOptions = new(JsonSerializerDefaults.Web)
    {
        DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
        WriteIndented = false
    };

    public static string Serialize(DenoObservationDocument document)
    {
        ArgumentNullException.ThrowIfNull(document);

        // ArrayBufferWriter<byte> does not implement IDisposable, so it cannot sit in a using declaration.
        var buffer = new ArrayBufferWriter<byte>();
        using (var writer = new Utf8JsonWriter(buffer, new JsonWriterOptions { Indented = false }))
        {
            writer.WriteStartObject();

            WriteArray(writer, "entrypoints", document.Entrypoints);
            WriteArray(writer, "modules", document.ModuleSpecifiers);

            writer.WritePropertyName("capabilities");
            writer.WriteStartArray();
            foreach (var record in document.Capabilities.OrderBy(c => c.Capability).ThenBy(c => c.ReasonCode, StringComparer.Ordinal))
            {
                writer.WriteStartObject();
                writer.WriteString("capability", record.Capability.ToString());
                writer.WriteString("reason", record.ReasonCode);
                WriteArray(writer, "sources", record.Sources);
                writer.WriteEndObject();
            }

            writer.WriteEndArray();

            writer.WritePropertyName("dynamicImports");
            writer.WriteStartArray();
            foreach (var observation in document.DynamicImports.OrderBy(obs => obs.FilePath, StringComparer.Ordinal).ThenBy(obs => obs.Line))
            {
                writer.WriteStartObject();
                writer.WriteString("file", observation.FilePath);
                writer.WriteNumber("line", observation.Line);
                writer.WriteString("specifier", observation.Specifier);
                writer.WriteString("reason", observation.ReasonCode);
                writer.WriteEndObject();
            }

            writer.WriteEndArray();

            writer.WritePropertyName("literalFetches");
            writer.WriteStartArray();
            foreach (var observation in document.LiteralFetches.OrderBy(obs => obs.FilePath, StringComparer.Ordinal).ThenBy(obs => obs.Line))
            {
                writer.WriteStartObject();
                writer.WriteString("file", observation.FilePath);
                writer.WriteNumber("line", observation.Line);
                writer.WriteString("url", observation.Url);
                writer.WriteString("reason", observation.ReasonCode);
                writer.WriteEndObject();
            }

            writer.WriteEndArray();

            writer.WritePropertyName("bundles");
            writer.WriteStartArray();
            foreach (var bundle in document.Bundles)
            {
                writer.WriteStartObject();
                writer.WriteString("path", bundle.SourcePath);
                writer.WriteString("type", bundle.BundleType);
                if (!string.IsNullOrWhiteSpace(bundle.Entrypoint))
                {
                    writer.WriteString("entrypoint", bundle.Entrypoint);
                }

                writer.WriteNumber("modules", bundle.ModuleCount);
                writer.WriteNumber("resources", bundle.ResourceCount);
                writer.WriteEndObject();
            }

            writer.WriteEndArray();

            writer.WriteEndObject();
            writer.Flush();
        }

        return Encoding.UTF8.GetString(buffer.WrittenSpan);
    }

    public static string ComputeSha256(string value)
    {
        ArgumentNullException.ThrowIfNull(value);
        var bytes = Encoding.UTF8.GetBytes(value);
        var hash = SHA256.HashData(bytes);
        return $"sha256:{Convert.ToHexString(hash).ToLowerInvariant()}";
    }

    private static void WriteArray(Utf8JsonWriter writer, string propertyName, ImmutableArray<string> values)
    {
        writer.WritePropertyName(propertyName);
        writer.WriteStartArray();
        foreach (var value in values)
        {
            writer.WriteStringValue(value);
        }

        writer.WriteEndArray();
    }
}
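// Usage sketch (annotation, not part of the commit; assumes a populated
// DenoObservationDocument named `document`):
//
//   var json = DenoObservationSerializer.Serialize(document);
//   var digest = DenoObservationSerializer.ComputeSha256(json); // "sha256:<lowercase-hex>"
//
// Hashing the compact, consistently ordered JSON keeps the digest stable for identical
// observations, which is what makes the payload safe to store content-addressed.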
@@ -0,0 +1,20 @@
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net10.0</TargetFramework>
    <LangVersion>preview</LangVersion>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
    <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
    <EnableDefaultItems>false</EnableDefaultItems>
  </PropertyGroup>

  <ItemGroup>
    <Compile Include="**\*.cs" Exclude="obj\**;bin\**" />
    <EmbeddedResource Include="**\*.json" Exclude="obj\**;bin\**" />
    <None Include="**\*" Exclude="**\*.cs;**\*.json;bin\**;obj\**" />
  </ItemGroup>

  <ItemGroup>
    <ProjectReference Include="..\StellaOps.Scanner.Analyzers.Lang\StellaOps.Scanner.Analyzers.Lang.csproj" />
  </ItemGroup>
</Project>
@@ -0,0 +1,21 @@
{
  "schemaVersion": "1.0",
  "id": "stellaops.analyzer.lang.deno",
  "displayName": "StellaOps Deno Analyzer",
  "version": "0.1.0",
  "requiresRestart": true,
  "entryPoint": {
    "type": "dotnet",
    "assembly": "StellaOps.Scanner.Analyzers.Lang.Deno.dll",
    "typeName": "StellaOps.Scanner.Analyzers.Lang.Deno.DenoAnalyzerPlugin"
  },
  "capabilities": [
    "language-analyzer",
    "deno"
  ],
  "metadata": {
    "org.stellaops.analyzer.language": "deno",
    "org.stellaops.analyzer.kind": "language",
    "org.stellaops.restart.required": "true"
  }
}
@@ -0,0 +1,184 @@
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

namespace StellaOps.Scanner.Analyzers.Lang.Java.Internal;

internal static class JavaLockFileCollector
{
    private static readonly string[] GradleLockPatterns = { "gradle.lockfile" };

    public static async Task<JavaLockData> LoadAsync(LanguageAnalyzerContext context, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);

        var entries = new Dictionary<string, JavaLockEntry>(StringComparer.OrdinalIgnoreCase);
        var root = context.RootPath;

        foreach (var pattern in GradleLockPatterns)
        {
            var lockPath = Path.Combine(root, pattern);
            if (File.Exists(lockPath))
            {
                await ParseGradleLockFileAsync(context, lockPath, entries, cancellationToken).ConfigureAwait(false);
            }
        }

        var dependencyLocksDir = Path.Combine(root, "gradle", "dependency-locks");
        if (Directory.Exists(dependencyLocksDir))
        {
            foreach (var file in Directory.EnumerateFiles(dependencyLocksDir, "*.lockfile", SearchOption.AllDirectories))
            {
                await ParseGradleLockFileAsync(context, file, entries, cancellationToken).ConfigureAwait(false);
            }
        }

        foreach (var pomPath in Directory.EnumerateFiles(root, "pom.xml", SearchOption.AllDirectories))
        {
            await ParsePomAsync(context, pomPath, entries, cancellationToken).ConfigureAwait(false);
        }

        return entries.Count == 0 ? JavaLockData.Empty : new JavaLockData(entries);
    }

    private static async Task ParseGradleLockFileAsync(LanguageAnalyzerContext context, string path, IDictionary<string, JavaLockEntry> entries, CancellationToken cancellationToken)
    {
        await using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
        using var reader = new StreamReader(stream);

        string? line;
        while ((line = await reader.ReadLineAsync(cancellationToken).ConfigureAwait(false)) is not null)
        {
            cancellationToken.ThrowIfCancellationRequested();

            line = line.Trim();
            if (string.IsNullOrWhiteSpace(line) || line.StartsWith("#", StringComparison.Ordinal))
            {
                continue;
            }

            var parts = line.Split('=', 2, StringSplitOptions.TrimEntries);
            var coordinates = parts[0];
            var configuration = parts.Length > 1 ? parts[1] : null;
            var coordinateParts = coordinates.Split(':');

            if (coordinateParts.Length < 3)
            {
                continue;
            }

            var groupId = coordinateParts[0];
            var artifactId = coordinateParts[1];
            var version = coordinateParts[^1];

            if (string.IsNullOrWhiteSpace(groupId) || string.IsNullOrWhiteSpace(artifactId) || string.IsNullOrWhiteSpace(version))
            {
                continue;
            }

            var entry = new JavaLockEntry(
                groupId.Trim(),
                artifactId.Trim(),
                version.Trim(),
                Path.GetFileName(path),
                NormalizeLocator(context, path),
                configuration,
                null,
                null);

            entries[entry.Key] = entry;
        }
    }

    private static async Task ParsePomAsync(LanguageAnalyzerContext context, string path, IDictionary<string, JavaLockEntry> entries, CancellationToken cancellationToken)
    {
        await using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
        var document = await XDocument.LoadAsync(stream, LoadOptions.None, cancellationToken).ConfigureAwait(false);

        var dependencies = document
            .Descendants()
            .Where(static element => element.Name.LocalName.Equals("dependency", StringComparison.OrdinalIgnoreCase));

        foreach (var dependency in dependencies)
        {
            cancellationToken.ThrowIfCancellationRequested();

            var groupId = dependency.Elements().FirstOrDefault(static e => e.Name.LocalName.Equals("groupId", StringComparison.OrdinalIgnoreCase))?.Value?.Trim();
            var artifactId = dependency.Elements().FirstOrDefault(static e => e.Name.LocalName.Equals("artifactId", StringComparison.OrdinalIgnoreCase))?.Value?.Trim();
            var version = dependency.Elements().FirstOrDefault(static e => e.Name.LocalName.Equals("version", StringComparison.OrdinalIgnoreCase))?.Value?.Trim();
            var scope = dependency.Elements().FirstOrDefault(static e => e.Name.LocalName.Equals("scope", StringComparison.OrdinalIgnoreCase))?.Value?.Trim();
            var repository = dependency.Elements().FirstOrDefault(static e => e.Name.LocalName.Equals("repository", StringComparison.OrdinalIgnoreCase))?.Value?.Trim();

            if (string.IsNullOrWhiteSpace(groupId) ||
                string.IsNullOrWhiteSpace(artifactId) ||
                string.IsNullOrWhiteSpace(version) ||
                version.Contains("${", StringComparison.Ordinal))
            {
                continue;
            }

            var entry = new JavaLockEntry(
                groupId,
                artifactId,
                version,
                "pom.xml",
                NormalizeLocator(context, path),
                scope,
                repository,
                null);

            entries[entry.Key] = entry;
        }
    }

    private static string NormalizeLocator(LanguageAnalyzerContext context, string path)
        => context.GetRelativePath(path).Replace('\\', '/');
}

internal sealed record JavaLockEntry(
    string GroupId,
    string ArtifactId,
    string Version,
    string Source,
    string Locator,
    string? Configuration,
    string? Repository,
    string? ResolvedUrl)
{
    public string Key => BuildKey(GroupId, ArtifactId, Version);

    private static string BuildKey(string groupId, string artifactId, string version)
        => $"{groupId}:{artifactId}:{version}".ToLowerInvariant();
}

internal sealed class JavaLockData
{
    public static readonly JavaLockData Empty = new(new Dictionary<string, JavaLockEntry>(StringComparer.OrdinalIgnoreCase));

    private readonly Dictionary<string, JavaLockEntry> _entriesByKey;
    private readonly IReadOnlyList<JavaLockEntry> _orderedEntries;

    public JavaLockData(Dictionary<string, JavaLockEntry> entries)
    {
        _entriesByKey = entries ?? throw new ArgumentNullException(nameof(entries));
        _orderedEntries = entries.Count == 0
            ? Array.Empty<JavaLockEntry>()
            : entries.Values
                .OrderBy(static entry => entry.GroupId, StringComparer.OrdinalIgnoreCase)
                .ThenBy(static entry => entry.ArtifactId, StringComparer.OrdinalIgnoreCase)
                .ThenBy(static entry => entry.Version, StringComparer.OrdinalIgnoreCase)
                .ThenBy(static entry => entry.Source, StringComparer.OrdinalIgnoreCase)
                .ThenBy(static entry => entry.Locator, StringComparer.OrdinalIgnoreCase)
                .ToArray();
    }

    public IReadOnlyList<JavaLockEntry> Entries => _orderedEntries;

    public bool HasEntries => _entriesByKey.Count > 0;

    public bool TryGet(string groupId, string artifactId, string version, out JavaLockEntry? entry)
    {
        var key = $"{groupId}:{artifactId}:{version}".ToLowerInvariant();
        return _entriesByKey.TryGetValue(key, out entry);
    }
}
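// Reference input (annotation, not part of the commit): a typical gradle.lockfile line
// as consumed by ParseGradleLockFileAsync above; coordinates sit left of '=' and the
// lock configurations right of it:
//
//   org.slf4j:slf4j-api:2.0.9=compileClasspath,runtimeClasspath
//
// parses to groupId "org.slf4j", artifactId "slf4j-api", version "2.0.9", with
// "compileClasspath,runtimeClasspath" captured as the entry's Configuration.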
@@ -1,138 +1,167 @@
-using StellaOps.Scanner.Analyzers.Lang.Java.Internal;
+using System.Collections.Generic;
+using System.IO;
+using System.Text;
+using StellaOps.Scanner.Analyzers.Lang.Java.Internal;

namespace StellaOps.Scanner.Analyzers.Lang.Java;

public sealed class JavaLanguageAnalyzer : ILanguageAnalyzer
{
    public string Id => "java";

    public string DisplayName => "Java/Maven Analyzer";

-    public async ValueTask AnalyzeAsync(LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
-    {
-        ArgumentNullException.ThrowIfNull(context);
-        ArgumentNullException.ThrowIfNull(writer);
-
-        var workspace = JavaWorkspaceNormalizer.Normalize(context, cancellationToken);
-
-        foreach (var archive in workspace.Archives)
-        {
-            cancellationToken.ThrowIfCancellationRequested();
-
-            try
-            {
-                await ProcessArchiveAsync(archive, context, writer, cancellationToken).ConfigureAwait(false);
-            }
-            catch (IOException)
-            {
-                // Corrupt archives should not abort the scan.
-            }
-            catch (InvalidDataException)
-            {
-                // Skip non-zip payloads despite supported extensions.
-            }
-        }
-    }
-
-    private async ValueTask ProcessArchiveAsync(JavaArchive archive, LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
-    {
-        ManifestMetadata? manifestMetadata = null;
-        if (archive.TryGetEntry("META-INF/MANIFEST.MF", out var manifestEntry))
-        {
-            manifestMetadata = await ParseManifestAsync(archive, manifestEntry, cancellationToken).ConfigureAwait(false);
-        }
-
-        foreach (var entry in archive.Entries)
-        {
-            cancellationToken.ThrowIfCancellationRequested();
-
-            if (IsManifestEntry(entry.EffectivePath))
-            {
-                continue;
-            }
-
-            if (!IsPomPropertiesEntry(entry.EffectivePath))
-            {
-                continue;
-            }
-
-            var artifact = await ParsePomPropertiesAsync(archive, entry, cancellationToken).ConfigureAwait(false);
-            if (artifact is null)
-            {
-                continue;
-            }
-
-            var metadata = new Dictionary<string, string?>(StringComparer.Ordinal)
-            {
-                ["groupId"] = artifact.GroupId,
-                ["artifactId"] = artifact.ArtifactId,
-                ["jarPath"] = NormalizeArchivePath(archive.RelativePath),
-            };
-
-            if (!string.IsNullOrEmpty(artifact.Packaging))
-            {
-                metadata["packaging"] = artifact.Packaging;
-            }
-
-            if (!string.IsNullOrEmpty(artifact.Name))
-            {
-                metadata["displayName"] = artifact.Name;
-            }
-
-            if (manifestMetadata is not null)
-            {
-                manifestMetadata.ApplyMetadata(metadata);
-            }
-
-            var evidence = new List<LanguageComponentEvidence>
-            {
-                new(LanguageEvidenceKind.File, "pom.properties", BuildLocator(archive, entry.OriginalPath), null, artifact.PomSha256),
-            };
-
-            if (manifestMetadata is not null)
-            {
-                evidence.Add(manifestMetadata.CreateEvidence(archive));
-            }
-
-            var usedByEntrypoint = context.UsageHints.IsPathUsed(archive.AbsolutePath);
-
-            writer.AddFromPurl(
-                analyzerId: Id,
-                purl: artifact.Purl,
-                name: artifact.ArtifactId,
-                version: artifact.Version,
-                type: "maven",
-                metadata: metadata,
-                evidence: evidence,
-                usedByEntrypoint: usedByEntrypoint);
-        }
-    }
+    public async ValueTask AnalyzeAsync(LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
+    {
+        ArgumentNullException.ThrowIfNull(context);
+        ArgumentNullException.ThrowIfNull(writer);
+
+        var workspace = JavaWorkspaceNormalizer.Normalize(context, cancellationToken);
+        var lockData = await JavaLockFileCollector.LoadAsync(context, cancellationToken).ConfigureAwait(false);
+        var matchedLocks = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
+        var hasLockEntries = lockData.HasEntries;
+
+        foreach (var archive in workspace.Archives)
+        {
+            cancellationToken.ThrowIfCancellationRequested();
+
+            try
+            {
+                await ProcessArchiveAsync(archive, context, writer, lockData, matchedLocks, hasLockEntries, cancellationToken).ConfigureAwait(false);
+            }
+            catch (IOException)
+            {
+                // Corrupt archives should not abort the scan.
+            }
+            catch (InvalidDataException)
+            {
+                // Skip non-zip payloads despite supported extensions.
+            }
+        }
+
+        if (lockData.Entries.Count > 0)
+        {
+            foreach (var entry in lockData.Entries)
+            {
+                if (matchedLocks.Contains(entry.Key))
+                {
+                    continue;
+                }
+
+                var metadata = CreateDeclaredMetadata(entry);
+                var evidence = new[] { CreateDeclaredEvidence(entry) };
+
+                var purl = BuildPurl(entry.GroupId, entry.ArtifactId, entry.Version, packaging: null);
+
+                writer.AddFromPurl(
+                    analyzerId: Id,
+                    purl: purl,
+                    name: entry.ArtifactId,
+                    version: entry.Version,
+                    type: "maven",
+                    metadata: metadata,
+                    evidence: evidence,
+                    usedByEntrypoint: false);
+            }
+        }
+    }
+
+    private async ValueTask ProcessArchiveAsync(
+        JavaArchive archive,
+        LanguageAnalyzerContext context,
+        LanguageComponentWriter writer,
+        JavaLockData lockData,
+        HashSet<string> matchedLocks,
+        bool hasLockEntries,
+        CancellationToken cancellationToken)
+    {
+        ManifestMetadata? manifestMetadata = null;
+        if (archive.TryGetEntry("META-INF/MANIFEST.MF", out var manifestEntry))
+        {
+            manifestMetadata = await ParseManifestAsync(archive, manifestEntry, cancellationToken).ConfigureAwait(false);
+        }
+
+        foreach (var entry in archive.Entries)
+        {
+            cancellationToken.ThrowIfCancellationRequested();
+
+            if (IsManifestEntry(entry.EffectivePath))
+            {
+                continue;
+            }
+
+            if (!IsPomPropertiesEntry(entry.EffectivePath))
+            {
+                continue;
+            }
+
+            var artifact = await ParsePomPropertiesAsync(archive, entry, cancellationToken).ConfigureAwait(false);
+            if (artifact is null)
+            {
+                continue;
+            }
+
+            var metadata = CreateInstalledMetadata(artifact, archive, manifestMetadata);
+
+            if (lockData.TryGet(artifact.GroupId, artifact.ArtifactId, artifact.Version, out var lockEntry))
+            {
+                matchedLocks.Add(lockEntry!.Key);
+                AppendLockMetadata(metadata, lockEntry);
+            }
+            else if (hasLockEntries)
+            {
+                AddMetadata(metadata, "lockMissing", "true");
+            }
+
+            var evidence = new List<LanguageComponentEvidence>
+            {
+                new(LanguageEvidenceKind.File, "pom.properties", BuildLocator(archive, entry.OriginalPath), null, artifact.PomSha256),
+            };
+
+            if (manifestMetadata is not null)
+            {
+                evidence.Add(manifestMetadata.CreateEvidence(archive));
+            }
+
+            var usedByEntrypoint = context.UsageHints.IsPathUsed(archive.AbsolutePath);
+
+            writer.AddFromPurl(
+                analyzerId: Id,
+                purl: artifact.Purl,
+                name: artifact.ArtifactId,
+                version: artifact.Version,
+                type: "maven",
+                metadata: SortMetadata(metadata),
+                evidence: evidence,
+                usedByEntrypoint: usedByEntrypoint);
+        }
+    }

    private static string BuildLocator(JavaArchive archive, string entryPath)
    {
        var relativeArchive = NormalizeArchivePath(archive.RelativePath);
        var normalizedEntry = NormalizeEntry(entryPath);

        if (string.Equals(relativeArchive, ".", StringComparison.Ordinal) || string.IsNullOrEmpty(relativeArchive))
        {
            return normalizedEntry;
        }

        return string.Concat(relativeArchive, "!", normalizedEntry);
    }

    private static string NormalizeEntry(string entryPath)
        => entryPath.Replace('\\', '/');

    private static string NormalizeArchivePath(string relativePath)
    {
        if (string.IsNullOrEmpty(relativePath) || string.Equals(relativePath, ".", StringComparison.Ordinal))
        {
            return ".";
        }

        return relativePath.Replace('\\', '/');
    }

    private static bool IsPomPropertiesEntry(string entryName)
        => entryName.StartsWith("META-INF/maven/", StringComparison.OrdinalIgnoreCase)
@@ -141,14 +170,21 @@ public sealed class JavaLanguageAnalyzer : ILanguageAnalyzer
    private static bool IsManifestEntry(string entryName)
        => string.Equals(entryName, "META-INF/MANIFEST.MF", StringComparison.OrdinalIgnoreCase);

+    private static void AppendLockMetadata(ICollection<KeyValuePair<string, string?>> metadata, JavaLockEntry entry)
+    {
+        AddMetadata(metadata, "lockConfiguration", entry.Configuration);
+        AddMetadata(metadata, "lockRepository", entry.Repository);
+        AddMetadata(metadata, "lockResolved", entry.ResolvedUrl);
+    }

    private static async ValueTask<MavenArtifact?> ParsePomPropertiesAsync(JavaArchive archive, JavaArchiveEntry entry, CancellationToken cancellationToken)
    {
        await using var entryStream = archive.OpenEntry(entry);
        using var buffer = new MemoryStream();
        await entryStream.CopyToAsync(buffer, cancellationToken).ConfigureAwait(false);
        buffer.Position = 0;

        using var reader = new StreamReader(buffer, Encoding.UTF8, detectEncodingFromByteOrderMarks: true, leaveOpen: true);
        var properties = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

        while (await reader.ReadLineAsync().ConfigureAwait(false) is { } line)
@@ -209,14 +245,14 @@ public sealed class JavaLanguageAnalyzer : ILanguageAnalyzer
            PomSha256: pomSha);
    }

    private static async ValueTask<ManifestMetadata?> ParseManifestAsync(JavaArchive archive, JavaArchiveEntry entry, CancellationToken cancellationToken)
    {
        await using var entryStream = archive.OpenEntry(entry);
        using var reader = new StreamReader(entryStream, Encoding.UTF8, detectEncodingFromByteOrderMarks: true, leaveOpen: false);

        string? title = null;
        string? version = null;
        string? vendor = null;

        while (await reader.ReadLineAsync().ConfigureAwait(false) is { } line)
        {
@@ -289,32 +325,21 @@ public sealed class JavaLanguageAnalyzer : ILanguageAnalyzer
    private sealed record ManifestMetadata(string? ImplementationTitle, string? ImplementationVersion, string? ImplementationVendor)
    {
-        public void ApplyMetadata(IDictionary<string, string?> target)
+        public void ApplyMetadata(ICollection<KeyValuePair<string, string?>> target)
        {
-            if (!string.IsNullOrWhiteSpace(ImplementationTitle))
-            {
-                target["manifestTitle"] = ImplementationTitle;
-            }
-
-            if (!string.IsNullOrWhiteSpace(ImplementationVersion))
-            {
-                target["manifestVersion"] = ImplementationVersion;
-            }
-
-            if (!string.IsNullOrWhiteSpace(ImplementationVendor))
-            {
-                target["manifestVendor"] = ImplementationVendor;
-            }
+            AddMetadata(target, "manifestTitle", ImplementationTitle);
+            AddMetadata(target, "manifestVersion", ImplementationVersion);
+            AddMetadata(target, "manifestVendor", ImplementationVendor);
        }

        public LanguageComponentEvidence CreateEvidence(JavaArchive archive)
        {
            var locator = BuildLocator(archive, "META-INF/MANIFEST.MF");
            var valueBuilder = new StringBuilder();

            if (!string.IsNullOrWhiteSpace(ImplementationTitle))
            {
                valueBuilder.Append("title=").Append(ImplementationTitle);
            }

            if (!string.IsNullOrWhiteSpace(ImplementationVersion))
@@ -340,5 +365,89 @@ public sealed class JavaLanguageAnalyzer : ILanguageAnalyzer
            var value = valueBuilder.Length > 0 ? valueBuilder.ToString() : null;
            return new LanguageComponentEvidence(LanguageEvidenceKind.File, "MANIFEST.MF", locator, value, null);
        }
    }

+    private static List<KeyValuePair<string, string?>> CreateInstalledMetadata(
+        MavenArtifact artifact,
+        JavaArchive archive,
+        ManifestMetadata? manifestMetadata)
+    {
+        var metadata = new List<KeyValuePair<string, string?>>(8);
+
+        AddMetadata(metadata, "groupId", artifact.GroupId);
+        AddMetadata(metadata, "artifactId", artifact.ArtifactId);
+        AddMetadata(metadata, "jarPath", NormalizeArchivePath(archive.RelativePath), allowEmpty: true);
+        AddMetadata(metadata, "packaging", artifact.Packaging);
+        AddMetadata(metadata, "displayName", artifact.Name);
+
+        manifestMetadata?.ApplyMetadata(metadata);
+
+        return metadata;
+    }
+
+    private static IReadOnlyList<KeyValuePair<string, string?>> CreateDeclaredMetadata(JavaLockEntry entry)
+    {
+        var metadata = new List<KeyValuePair<string, string?>>(6);
+        var lockSource = NormalizeLockSource(entry.Source);
+        var lockLocator = string.IsNullOrWhiteSpace(entry.Locator) ? lockSource : entry.Locator;
+
+        AddMetadata(metadata, "declaredOnly", "true");
+        AddMetadata(metadata, "lockSource", lockSource);
+        AddMetadata(metadata, "lockLocator", lockLocator, allowEmpty: true);
+        AppendLockMetadata(metadata, entry);
+
+        return SortMetadata(metadata);
+    }
+
+    private static LanguageComponentEvidence CreateDeclaredEvidence(JavaLockEntry entry)
+    {
+        var lockSource = NormalizeLockSource(entry.Source);
+        var lockLocator = string.IsNullOrWhiteSpace(entry.Locator) ? lockSource : entry.Locator;
+
+        return new LanguageComponentEvidence(
+            LanguageEvidenceKind.Metadata,
+            lockSource,
+            lockLocator,
+            entry.ResolvedUrl,
+            Sha256: null);
+    }
+
+    private static IReadOnlyList<KeyValuePair<string, string?>> SortMetadata(List<KeyValuePair<string, string?>> metadata)
+    {
+        metadata.Sort(static (left, right) =>
+        {
+            var keyComparison = string.CompareOrdinal(left.Key, right.Key);
+            if (keyComparison != 0)
+            {
+                return keyComparison;
+            }
+
+            return string.CompareOrdinal(left.Value ?? string.Empty, right.Value ?? string.Empty);
+        });
+
+        return metadata;
+    }
+
+    private static void AddMetadata(
+        ICollection<KeyValuePair<string, string?>> metadata,
+        string key,
+        string? value,
+        bool allowEmpty = false)
+    {
+        if (string.IsNullOrWhiteSpace(key))
+        {
+            return;
+        }
+
+        if (!allowEmpty && string.IsNullOrWhiteSpace(value))
+        {
+            return;
+        }
+
+        metadata.Add(new KeyValuePair<string, string?>(key, value));
+    }
+
+    private static string NormalizeLockSource(string? source)
+        => string.IsNullOrWhiteSpace(source) ? "lockfile" : source;
}
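// Illustrative output (annotation, not part of the commit): metadata emitted for a lock
// entry that matched no jar on disk, per CreateDeclaredMetadata/AppendLockMetadata above.
// Values assume a hypothetical root-level gradle.lockfile entry:
//
//   declaredOnly      = "true"
//   lockSource        = "gradle.lockfile"
//   lockLocator       = "gradle.lockfile"
//   lockConfiguration = "compileClasspath,runtimeClasspath"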
@@ -2,35 +2,57 @@ using System.Text.Json;
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;

internal sealed class NodeLockData
{
-    private static readonly NodeLockData Empty = new(new Dictionary<string, NodeLockEntry>(StringComparer.Ordinal), new Dictionary<string, NodeLockEntry>(StringComparer.OrdinalIgnoreCase));
-
-    private readonly Dictionary<string, NodeLockEntry> _byPath;
-    private readonly Dictionary<string, NodeLockEntry> _byName;
-
-    private NodeLockData(Dictionary<string, NodeLockEntry> byPath, Dictionary<string, NodeLockEntry> byName)
-    {
-        _byPath = byPath;
-        _byName = byName;
-    }
-
-    public static ValueTask<NodeLockData> LoadAsync(string rootPath, CancellationToken cancellationToken)
-    {
-        var byPath = new Dictionary<string, NodeLockEntry>(StringComparer.Ordinal);
-        var byName = new Dictionary<string, NodeLockEntry>(StringComparer.OrdinalIgnoreCase);
-
-        LoadPackageLockJson(rootPath, byPath, byName, cancellationToken);
-        LoadYarnLock(rootPath, byName);
-        LoadPnpmLock(rootPath, byName);
-
-        if (byPath.Count == 0 && byName.Count == 0)
-        {
-            return ValueTask.FromResult(Empty);
-        }
-
-        return ValueTask.FromResult(new NodeLockData(byPath, byName));
-    }
+    private const string PackageLockSource = "package-lock.json";
+    private const string YarnLockSource = "yarn.lock";
+    private const string PnpmLockSource = "pnpm-lock.yaml";
+
+    private static readonly NodeLockData Empty = new(
+        new Dictionary<string, NodeLockEntry>(StringComparer.Ordinal),
+        new Dictionary<string, NodeLockEntry>(StringComparer.OrdinalIgnoreCase),
+        Array.Empty<NodeLockEntry>());
+
+    private readonly Dictionary<string, NodeLockEntry> _byPath;
+    private readonly Dictionary<string, NodeLockEntry> _byName;
+    private readonly IReadOnlyCollection<NodeLockEntry> _declared;
+
+    private NodeLockData(
+        Dictionary<string, NodeLockEntry> byPath,
+        Dictionary<string, NodeLockEntry> byName,
+        IReadOnlyCollection<NodeLockEntry> declared)
+    {
+        _byPath = byPath;
+        _byName = byName;
+        _declared = declared;
+    }
+
+    public IReadOnlyCollection<NodeLockEntry> DeclaredPackages => _declared;
+
+    public static ValueTask<NodeLockData> LoadAsync(string rootPath, CancellationToken cancellationToken)
+    {
+        var byPath = new Dictionary<string, NodeLockEntry>(StringComparer.Ordinal);
+        var byName = new Dictionary<string, NodeLockEntry>(StringComparer.OrdinalIgnoreCase);
+        var declared = new Dictionary<string, NodeLockEntry>(StringComparer.OrdinalIgnoreCase);
+
+        LoadPackageLockJson(rootPath, byPath, byName, declared, cancellationToken);
+        LoadYarnLock(rootPath, byName, declared);
+        LoadPnpmLock(rootPath, byName, declared);
+
+        if (byPath.Count == 0 && byName.Count == 0 && declared.Count == 0)
+        {
+            return ValueTask.FromResult(Empty);
+        }
+
+        var declaredList = declared.Values
+            .OrderBy(static entry => entry.Name, StringComparer.OrdinalIgnoreCase)
+            .ThenBy(static entry => entry.Version ?? string.Empty, StringComparer.OrdinalIgnoreCase)
+            .ThenBy(static entry => entry.Source, StringComparer.OrdinalIgnoreCase)
+            .ThenBy(static entry => entry.Locator ?? string.Empty, StringComparer.OrdinalIgnoreCase)
+            .ToArray();
+
+        return ValueTask.FromResult(new NodeLockData(byPath, byName, declaredList));
+    }

    public bool TryGet(string relativePath, string packageName, out NodeLockEntry? entry)
    {
@@ -55,16 +77,30 @@ internal sealed class NodeLockData
        return false;
    }

-    private static NodeLockEntry? CreateEntry(JsonElement element)
-    {
-        string? version = null;
-        string? resolved = null;
-        string? integrity = null;
-
-        if (element.TryGetProperty("version", out var versionElement) && versionElement.ValueKind == JsonValueKind.String)
-        {
-            version = versionElement.GetString();
-        }
+    private static NodeLockEntry? CreateEntry(
+        string source,
+        string? locator,
+        string? inferredName,
+        JsonElement element)
+    {
+        string? name = inferredName;
+        string? version = null;
+        string? resolved = null;
+        string? integrity = null;
+
+        if (element.TryGetProperty("name", out var nameElement) && nameElement.ValueKind == JsonValueKind.String)
+        {
+            var explicitName = nameElement.GetString();
+            if (!string.IsNullOrWhiteSpace(explicitName))
+            {
+                name = explicitName;
+            }
+        }
+
+        if (element.TryGetProperty("version", out var versionElement) && versionElement.ValueKind == JsonValueKind.String)
+        {
+            version = versionElement.GetString();
+        }

        if (element.TryGetProperty("resolved", out var resolvedElement) && resolvedElement.ValueKind == JsonValueKind.String)
        {
@@ -76,43 +112,56 @@ internal sealed class NodeLockData
            integrity = integrityElement.GetString();
        }

        if (version is null && resolved is null && integrity is null)
        {
            return null;
        }

-        return new NodeLockEntry(version, resolved, integrity);
-    }
+        if (string.IsNullOrWhiteSpace(name))
+        {
+            return null;
+        }
+
+        var locatorValue = string.IsNullOrWhiteSpace(locator) ? null : locator;
+        return new NodeLockEntry(source, locatorValue, name!, version, resolved, integrity);
+    }

-    private static void TraverseLegacyDependencies(
-        string currentPath,
-        JsonElement dependenciesElement,
-        IDictionary<string, NodeLockEntry> byPath,
-        IDictionary<string, NodeLockEntry> byName)
-    {
-        foreach (var dependency in dependenciesElement.EnumerateObject())
-        {
-            var depValue = dependency.Value;
-            var path = $"{currentPath}/{dependency.Name}";
-            var entry = CreateEntry(depValue);
-            if (entry is not null)
-            {
-                var normalizedPath = NormalizeLockPath(path);
-                byPath[normalizedPath] = entry;
-                byName[dependency.Name] = entry;
-            }
-
-            if (depValue.TryGetProperty("dependencies", out var childDependencies) && childDependencies.ValueKind == JsonValueKind.Object)
-            {
-                TraverseLegacyDependencies(path + "/node_modules", childDependencies, byPath, byName);
-            }
-        }
-    }
-
-    private static void LoadPackageLockJson(string rootPath, IDictionary<string, NodeLockEntry> byPath, IDictionary<string, NodeLockEntry> byName, CancellationToken cancellationToken)
+    private static void TraverseLegacyDependencies(
+        string currentPath,
+        JsonElement dependenciesElement,
+        IDictionary<string, NodeLockEntry> byPath,
+        IDictionary<string, NodeLockEntry> byName,
+        IDictionary<string, NodeLockEntry> declared)
+    {
+        foreach (var dependency in dependenciesElement.EnumerateObject())
+        {
+            var depValue = dependency.Value;
+            var path = $"{currentPath}/{dependency.Name}";
+            var normalizedPath = NormalizeLockPath(path);
+            var entry = CreateEntry(PackageLockSource, normalizedPath, dependency.Name, depValue);
+            if (entry is not null)
+            {
+                byPath[normalizedPath] = entry;
+                byName[dependency.Name] = entry;
+                AddDeclaration(declared, entry);
+            }
+
+            if (depValue.TryGetProperty("dependencies", out var childDependencies) && childDependencies.ValueKind == JsonValueKind.Object)
+            {
+                TraverseLegacyDependencies(path + "/node_modules", childDependencies, byPath, byName, declared);
+            }
+        }
+    }
+
+    private static void LoadPackageLockJson(
+        string rootPath,
+        IDictionary<string, NodeLockEntry> byPath,
+        IDictionary<string, NodeLockEntry> byName,
+        IDictionary<string, NodeLockEntry> declared,
+        CancellationToken cancellationToken)
    {
        var packageLockPath = Path.Combine(rootPath, "package-lock.json");
        if (!File.Exists(packageLockPath))
        {
            return;
        }
@@ -127,38 +176,32 @@ internal sealed class NodeLockData
            if (root.TryGetProperty("packages", out var packagesElement) && packagesElement.ValueKind == JsonValueKind.Object)
            {
-                foreach (var packageProperty in packagesElement.EnumerateObject())
-                {
-                    var entry = CreateEntry(packageProperty.Value);
-                    if (entry is null)
-                    {
-                        continue;
-                    }
-
-                    var key = NormalizeLockPath(packageProperty.Name);
-                    byPath[key] = entry;
-
-                    var name = ExtractNameFromPath(key);
-                    if (!string.IsNullOrEmpty(name))
-                    {
-                        byName[name] = entry;
-                    }
-
-                    if (packageProperty.Value.TryGetProperty("name", out var explicitNameElement) && explicitNameElement.ValueKind == JsonValueKind.String)
-                    {
-                        var explicitName = explicitNameElement.GetString();
-                        if (!string.IsNullOrWhiteSpace(explicitName))
-                        {
-                            byName[explicitName] = entry;
-                        }
-                    }
-                }
+                foreach (var packageProperty in packagesElement.EnumerateObject())
+                {
+                    var key = NormalizeLockPath(packageProperty.Name);
+                    var inferredName = ExtractNameFromPath(key);
+
+                    var entry = CreateEntry(PackageLockSource, key, inferredName, packageProperty.Value);
+                    if (entry is null)
+                    {
+                        continue;
+                    }
+
+                    byPath[key] = entry;
+
+                    if (!string.IsNullOrEmpty(entry.Name))
+                    {
+                        byName[entry.Name] = entry;
+                    }
+
+                    AddDeclaration(declared, entry);
+                }
            }
            else if (root.TryGetProperty("dependencies", out var dependenciesElement) && dependenciesElement.ValueKind == JsonValueKind.Object)
            {
-                TraverseLegacyDependencies("node_modules", dependenciesElement, byPath, byName);
+                TraverseLegacyDependencies("node_modules", dependenciesElement, byPath, byName, declared);
            }
        }
        catch (IOException)
        {
            // Ignore unreadable package-lock.
@@ -169,7 +212,10 @@ internal sealed class NodeLockData
        }
    }

    private static void LoadYarnLock(string rootPath, IDictionary<string, NodeLockEntry> byName)
    private static void LoadYarnLock(
        string rootPath,
        IDictionary<string, NodeLockEntry> byName,
        IDictionary<string, NodeLockEntry> declared)
    {
        var yarnLockPath = Path.Combine(rootPath, "yarn.lock");
        if (!File.Exists(yarnLockPath))
@@ -185,31 +231,32 @@ internal sealed class NodeLockData
        string? resolved = null;
        string? integrity = null;

        void Flush()
        {
            if (string.IsNullOrWhiteSpace(currentName))
            {
                version = null;
                resolved = null;
                integrity = null;
                return;
            }

            var simpleName = ExtractPackageNameFromYarnKey(currentName!);
            if (string.IsNullOrEmpty(simpleName))
            {
                version = null;
                resolved = null;
                integrity = null;
                return;
            }

            var entry = new NodeLockEntry(version, resolved, integrity);
            byName[simpleName] = entry;
            version = null;
            resolved = null;
            integrity = null;
        }
        void Flush()
        {
            if (string.IsNullOrWhiteSpace(currentName))
            {
                version = null;
                resolved = null;
                integrity = null;
                return;
            }

            var simpleName = ExtractPackageNameFromYarnKey(currentName!);
            if (string.IsNullOrEmpty(simpleName))
            {
                version = null;
                resolved = null;
                integrity = null;
                return;
            }

            var entry = new NodeLockEntry(YarnLockSource, currentName, simpleName, version, resolved, integrity);
            byName[simpleName] = entry;
            AddDeclaration(declared, entry);
            version = null;
            resolved = null;
            integrity = null;
        }

        foreach (var line in lines)
        {
@@ -250,7 +297,10 @@ internal sealed class NodeLockData
        }
    }

    private static void LoadPnpmLock(string rootPath, IDictionary<string, NodeLockEntry> byName)
    private static void LoadPnpmLock(
        string rootPath,
        IDictionary<string, NodeLockEntry> byName,
        IDictionary<string, NodeLockEntry> declared)
    {
        var pnpmLockPath = Path.Combine(rootPath, "pnpm-lock.yaml");
        if (!File.Exists(pnpmLockPath))
@@ -258,94 +308,107 @@ internal sealed class NodeLockData
            return;
        }

        try
        {
            using var reader = new StreamReader(pnpmLockPath);
            string? currentPackage = null;
            string? version = null;
            string? resolved = null;
            string? integrity = null;
            var inPackages = false;

            while (reader.ReadLine() is { } line)
            {
                if (string.IsNullOrWhiteSpace(line))
                {
                    continue;
                }

                if (!inPackages)
                {
                    if (line.StartsWith("packages:", StringComparison.Ordinal))
                    {
                        inPackages = true;
                    }
                    continue;
                }

                if (line.StartsWith(" /", StringComparison.Ordinal))
                {
                    if (!string.IsNullOrEmpty(currentPackage) && !string.IsNullOrEmpty(integrity))
                    {
                        var name = ExtractNameFromPnpmKey(currentPackage);
                        if (!string.IsNullOrEmpty(name))
                        {
                            byName[name] = new NodeLockEntry(version, resolved, integrity);
                        }
                    }

                    currentPackage = line.Trim().TrimEnd(':').TrimStart('/');
                    version = null;
                    resolved = null;
                    integrity = null;
                    continue;
                }

                if (string.IsNullOrEmpty(currentPackage))
                {
                    continue;
                }

                var trimmed = line.Trim();
                if (trimmed.StartsWith("resolution:", StringComparison.Ordinal))
                {
                    var integrityIndex = trimmed.IndexOf("integrity", StringComparison.OrdinalIgnoreCase);
                    if (integrityIndex >= 0)
                    {
                        var integrityValue = trimmed[(integrityIndex + 9)..].Trim(' ', ':', '{', '}', '"');
                        integrity = integrityValue;
                    }

                    var tarballIndex = trimmed.IndexOf("tarball", StringComparison.OrdinalIgnoreCase);
                    if (tarballIndex >= 0)
                    {
                        var tarballValue = trimmed[(tarballIndex + 7)..].Trim(' ', ':', '{', '}', '"');
                        resolved = tarballValue;
                    }
                }
                else if (trimmed.StartsWith("integrity:", StringComparison.Ordinal))
                {
                    integrity = trimmed[("integrity:".Length)..].Trim();
                }
                else if (trimmed.StartsWith("tarball:", StringComparison.Ordinal))
                {
                    resolved = trimmed[("tarball:".Length)..].Trim();
                }
                else if (trimmed.StartsWith("version:", StringComparison.Ordinal))
                {
                    version = trimmed[("version:".Length)..].Trim();
                }
            }

            if (!string.IsNullOrEmpty(currentPackage) && !string.IsNullOrEmpty(integrity))
            {
                var name = ExtractNameFromPnpmKey(currentPackage);
                if (!string.IsNullOrEmpty(name))
                {
                    byName[name] = new NodeLockEntry(version, resolved, integrity);
                }
            }
        }
        try
        {
            using var reader = new StreamReader(pnpmLockPath);
            string? currentPackage = null;
            string? version = null;
            string? resolved = null;
            string? integrity = null;
            var inPackages = false;

            void Flush()
            {
                if (string.IsNullOrEmpty(currentPackage) || string.IsNullOrEmpty(integrity))
                {
                    version = null;
                    resolved = null;
                    integrity = null;
                    return;
                }

                var name = ExtractNameFromPnpmKey(currentPackage!);
                if (string.IsNullOrEmpty(name))
                {
                    version = null;
                    resolved = null;
                    integrity = null;
                    return;
                }

                var entry = new NodeLockEntry(PnpmLockSource, currentPackage, name, version, resolved, integrity);
                byName[name] = entry;
                AddDeclaration(declared, entry);
                version = null;
                resolved = null;
                integrity = null;
            }

            while (reader.ReadLine() is { } line)
            {
                if (string.IsNullOrWhiteSpace(line))
                {
                    continue;
                }

                if (!inPackages)
                {
                    if (line.StartsWith("packages:", StringComparison.Ordinal))
                    {
                        inPackages = true;
                    }
                    continue;
                }

                if (line.StartsWith(" /", StringComparison.Ordinal))
                {
                    Flush();

                    currentPackage = line.Trim().TrimEnd(':').TrimStart('/');
                    version = null;
                    resolved = null;
                    integrity = null;
                    continue;
                }

                if (string.IsNullOrEmpty(currentPackage))
                {
                    continue;
                }

                var trimmed = line.Trim();
                if (trimmed.StartsWith("resolution:", StringComparison.Ordinal))
                {
                    var integrityIndex = trimmed.IndexOf("integrity", StringComparison.OrdinalIgnoreCase);
                    if (integrityIndex >= 0)
                    {
                        var integrityValue = trimmed[(integrityIndex + 9)..].Trim(' ', ':', '{', '}', '"');
                        integrity = integrityValue;
                    }

                    var tarballIndex = trimmed.IndexOf("tarball", StringComparison.OrdinalIgnoreCase);
                    if (tarballIndex >= 0)
                    {
                        var tarballValue = trimmed[(tarballIndex + 7)..].Trim(' ', ':', '{', '}', '"');
                        resolved = tarballValue;
                    }
                }
                else if (trimmed.StartsWith("integrity:", StringComparison.Ordinal))
                {
                    integrity = trimmed[("integrity:".Length)..].Trim();
                }
                else if (trimmed.StartsWith("tarball:", StringComparison.Ordinal))
                {
                    resolved = trimmed[("tarball:".Length)..].Trim();
                }
                else if (trimmed.StartsWith("version:", StringComparison.Ordinal))
                {
                    version = trimmed[("version:".Length)..].Trim();
                }
            }

            Flush();
        }
        catch (IOException)
        {
            // Ignore unreadable pnpm lock file.
@@ -384,9 +447,9 @@ internal sealed class NodeLockData
        return trimmed;
    }

    private static string ExtractNameFromPnpmKey(string key)
    {
        var parts = key.Split('/', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);
    private static string ExtractNameFromPnpmKey(string key)
    {
        var parts = key.Split('/', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);
        if (parts.Length == 0)
        {
            return string.Empty;
@@ -397,8 +460,27 @@ internal sealed class NodeLockData
            return parts.Length >= 2 ? $"{parts[0]}/{parts[1]}" : parts[0];
        }

        return parts[0];
    }
        return parts[0];
    }

    private static void AddDeclaration(IDictionary<string, NodeLockEntry> declared, NodeLockEntry entry)
    {
        if (declared is null || entry is null)
        {
            return;
        }

        if (string.IsNullOrWhiteSpace(entry.Name) || string.IsNullOrWhiteSpace(entry.Version))
        {
            return;
        }

        var key = entry.DeclarationKey();
        if (!declared.ContainsKey(key))
        {
            declared[key] = entry;
        }
    }

    private static string NormalizeLockPath(string path)
    {

@@ -1,3 +1,20 @@
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;

internal sealed record NodeLockEntry(string? Version, string? Resolved, string? Integrity);
internal sealed record NodeLockEntry(
    string Source,
    string? Locator,
    string Name,
    string? Version,
    string? Resolved,
    string? Integrity);

internal static class NodeLockEntryExtensions
{
    public static string DeclarationKey(this NodeLockEntry entry)
    {
        ArgumentNullException.ThrowIfNull(entry);

        var version = entry.Version ?? string.Empty;
        return $"{entry.Name}@{version}".ToLowerInvariant();
    }
}

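The extension above folds a lock entry into a case-insensitive `name@version` identity, which is what lets declarations from different lockfiles deduplicate. A minimal sketch of the intended behaviour; the two entries below are hypothetical, not taken from a real lockfile:

// Same package declared by two lockfiles with different casing.
var fromPackageLock = new NodeLockEntry("package-lock.json", "node_modules/Left-Pad", "Left-Pad", "1.3.0", null, null);
var fromYarnLock = new NodeLockEntry("yarn.lock", "left-pad@^1.3.0", "left-pad", "1.3.0", null, null);

// Both keys collapse to "left-pad@1.3.0", so AddDeclaration keeps only the first entry it sees.
Console.WriteLine(fromPackageLock.DeclarationKey() == fromYarnLock.DeclarationKey()); // True
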
@@ -13,23 +13,29 @@ internal sealed class NodePackage
        string? workspaceRoot,
        IReadOnlyList<string> workspaceTargets,
        string? workspaceLink,
        IReadOnlyList<NodeLifecycleScript> lifecycleScripts,
        bool usedByEntrypoint)
    {
        Name = name;
        Version = version;
        RelativePath = relativePath;
        PackageJsonLocator = packageJsonLocator;
        IReadOnlyList<NodeLifecycleScript> lifecycleScripts,
        bool usedByEntrypoint,
        bool declaredOnly = false,
        string? lockSource = null,
        string? lockLocator = null)
    {
        Name = name;
        Version = version;
        RelativePath = relativePath;
        PackageJsonLocator = packageJsonLocator;
        IsPrivate = isPrivate;
        LockEntry = lockEntry;
        IsWorkspaceMember = isWorkspaceMember;
        WorkspaceRoot = workspaceRoot;
        WorkspaceTargets = workspaceTargets;
        WorkspaceLink = workspaceLink;
        LifecycleScripts = lifecycleScripts ?? Array.Empty<NodeLifecycleScript>();
        IsUsedByEntrypoint = usedByEntrypoint;
    }

        WorkspaceLink = workspaceLink;
        LifecycleScripts = lifecycleScripts ?? Array.Empty<NodeLifecycleScript>();
        IsUsedByEntrypoint = usedByEntrypoint;
        DeclaredOnly = declaredOnly;
        LockSource = lockSource;
        LockLocator = lockLocator;
    }

    public string Name { get; }

    public string Version { get; }
@@ -50,11 +56,17 @@ internal sealed class NodePackage

    public string? WorkspaceLink { get; }

    public IReadOnlyList<NodeLifecycleScript> LifecycleScripts { get; }

    public bool HasInstallScripts => LifecycleScripts.Count > 0;

    public bool IsUsedByEntrypoint { get; }
    public IReadOnlyList<NodeLifecycleScript> LifecycleScripts { get; }

    public bool HasInstallScripts => LifecycleScripts.Count > 0;

    public bool IsUsedByEntrypoint { get; }

    public bool DeclaredOnly { get; }

    public string? LockSource { get; }

    public string? LockLocator { get; }

    public string RelativePathNormalized => string.IsNullOrEmpty(RelativePath) ? string.Empty : RelativePath.Replace(Path.DirectorySeparatorChar, '/');

@@ -64,10 +76,10 @@ internal sealed class NodePackage

    public IReadOnlyCollection<LanguageComponentEvidence> CreateEvidence()
    {
        var evidence = new List<LanguageComponentEvidence>
        {
            new LanguageComponentEvidence(LanguageEvidenceKind.File, "package.json", PackageJsonLocator, Value: null, Sha256: null)
        };
        var evidence = new List<LanguageComponentEvidence>
        {
            CreateRootEvidence()
        };

        foreach (var script in LifecycleScripts)
        {
@@ -83,8 +95,8 @@ internal sealed class NodePackage
                script.Sha256));
        }

        return evidence;
    }
        return evidence;
    }

    public IReadOnlyCollection<KeyValuePair<string, string?>> CreateMetadata()
    {
@@ -111,11 +123,11 @@ internal sealed class NodePackage
            }
        }

        if (IsWorkspaceMember)
        {
            entries.Add(new KeyValuePair<string, string?>("workspaceMember", "true"));
            if (!string.IsNullOrWhiteSpace(WorkspaceRoot))
            {
        if (IsWorkspaceMember)
        {
            entries.Add(new KeyValuePair<string, string?>("workspaceMember", "true"));
            if (!string.IsNullOrWhiteSpace(WorkspaceRoot))
            {
                entries.Add(new KeyValuePair<string, string?>("workspaceRoot", WorkspaceRoot));
            }
        }
@@ -148,12 +160,27 @@ internal sealed class NodePackage
            {
                entries.Add(new KeyValuePair<string, string?>($"script.{script.Name}", script.Command));
            }
        }

        return entries
            .OrderBy(static pair => pair.Key, StringComparer.Ordinal)
            .ToArray();
    }
}

        if (DeclaredOnly)
        {
            entries.Add(new KeyValuePair<string, string?>("declaredOnly", "true"));
        }

        if (!string.IsNullOrWhiteSpace(LockSource))
        {
            entries.Add(new KeyValuePair<string, string?>("lockSource", LockSource));
        }

        if (!string.IsNullOrWhiteSpace(LockLocator))
        {
            entries.Add(new KeyValuePair<string, string?>("lockLocator", LockLocator));
        }

        return entries
            .OrderBy(static pair => pair.Key, StringComparer.Ordinal)
            .ToArray();
    }

    private static string BuildPurl(string name, string version)
    {
@@ -174,6 +201,21 @@ internal sealed class NodePackage
            return $"%40{scopeAndName}";
        }

        return name;
    }
}
        return name;
    }

    private LanguageComponentEvidence CreateRootEvidence()
    {
        var evidenceSource = DeclaredOnly
            ? string.IsNullOrWhiteSpace(LockSource) ? "lockfile" : LockSource!
            : "package.json";

        var locator = DeclaredOnly
            ? (string.IsNullOrWhiteSpace(LockLocator) ? evidenceSource : LockLocator!)
            : (string.IsNullOrWhiteSpace(PackageJsonLocator) ? "package.json" : PackageJsonLocator);

        var kind = DeclaredOnly ? LanguageEvidenceKind.Metadata : LanguageEvidenceKind.File;

        return new LanguageComponentEvidence(kind, evidenceSource, locator, Value: null, Sha256: null);
    }
}

@@ -56,8 +56,10 @@ internal static class NodePackageCollector
            TraverseDirectory(context, pendingRoot, lockData, workspaceIndex, packages, visited, cancellationToken);
        }

        return packages;
    }
        AppendDeclaredPackages(packages, lockData);

        return packages;
    }

    private static void TraverseDirectory(
        LanguageAnalyzerContext context,
@@ -166,18 +168,94 @@ internal static class NodePackageCollector
        }
    }

    private static void TraverseNestedNodeModules(
        LanguageAnalyzerContext context,
        string directory,
        NodeLockData lockData,
        NodeWorkspaceIndex workspaceIndex,
        List<NodePackage> packages,
        HashSet<string> visited,
        CancellationToken cancellationToken)
    {
        var nestedNodeModules = Path.Combine(directory, "node_modules");
        TraverseDirectory(context, nestedNodeModules, lockData, workspaceIndex, packages, visited, cancellationToken);
    }
    private static void TraverseNestedNodeModules(
        LanguageAnalyzerContext context,
        string directory,
        NodeLockData lockData,
        NodeWorkspaceIndex workspaceIndex,
        List<NodePackage> packages,
        HashSet<string> visited,
        CancellationToken cancellationToken)
    {
        var nestedNodeModules = Path.Combine(directory, "node_modules");
        TraverseDirectory(context, nestedNodeModules, lockData, workspaceIndex, packages, visited, cancellationToken);
    }

    private static void AppendDeclaredPackages(List<NodePackage> packages, NodeLockData lockData)
    {
        if (lockData.DeclaredPackages.Count == 0)
        {
            return;
        }

        var observed = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        foreach (var package in packages)
        {
            var key = BuildDeclarationKey(package.Name, package.Version);
            if (!string.IsNullOrEmpty(key))
            {
                observed.Add(key);
            }
        }

        foreach (var entry in lockData.DeclaredPackages)
        {
            if (string.IsNullOrWhiteSpace(entry.Name) || string.IsNullOrWhiteSpace(entry.Version))
            {
                continue;
            }

            var key = BuildDeclarationKey(entry.Name, entry.Version);
            if (string.IsNullOrEmpty(key) || !observed.Add(key))
            {
                continue;
            }

            var declaredPackage = new NodePackage(
                entry.Name,
                entry.Version,
                relativePath: string.Empty,
                packageJsonLocator: string.Empty,
                isPrivate: null,
                lockEntry: entry,
                isWorkspaceMember: false,
                workspaceRoot: null,
                workspaceTargets: Array.Empty<string>(),
                workspaceLink: null,
                lifecycleScripts: Array.Empty<NodeLifecycleScript>(),
                usedByEntrypoint: false,
                declaredOnly: true,
                lockSource: entry.Source,
                lockLocator: BuildLockLocator(entry));

            packages.Add(declaredPackage);
        }
    }

    private static string BuildDeclarationKey(string name, string? version)
    {
        if (string.IsNullOrWhiteSpace(name) || string.IsNullOrWhiteSpace(version))
        {
            return string.Empty;
        }

        return $"{name}@{version}".ToLowerInvariant();
    }

    private static string? BuildLockLocator(NodeLockEntry? entry)
    {
        if (entry is null)
        {
            return null;
        }

        if (string.IsNullOrWhiteSpace(entry.Locator))
        {
            return entry.Source;
        }

        return $"{entry.Source}:{entry.Locator}";
    }

    private static NodePackage? TryCreatePackage(
        LanguageAnalyzerContext context,
@@ -221,9 +299,11 @@ internal static class NodePackageCollector
            isPrivate = privateElement.GetBoolean();
        }

        var lockEntry = lockData.TryGet(relativeDirectory, name, out var entry) ? entry : null;
        var locator = BuildLocator(relativeDirectory);
        var usedByEntrypoint = context.UsageHints.IsPathUsed(packageJsonPath);
        var lockEntry = lockData.TryGet(relativeDirectory, name, out var entry) ? entry : null;
        var locator = BuildLocator(relativeDirectory);
        var usedByEntrypoint = context.UsageHints.IsPathUsed(packageJsonPath);
        var lockLocator = BuildLockLocator(lockEntry);
        var lockSource = lockEntry?.Source;

        var isWorkspaceMember = workspaceIndex.TryGetMember(relativeDirectory, out var workspaceRoot);
        var workspaceTargets = ExtractWorkspaceTargets(relativeDirectory, root, workspaceIndex);
@@ -232,19 +312,22 @@ internal static class NodePackageCollector
            : null;
        var lifecycleScripts = ExtractLifecycleScripts(root);

        return new NodePackage(
            name: name.Trim(),
            version: version.Trim(),
            relativePath: relativeDirectory,
            packageJsonLocator: locator,
            isPrivate: isPrivate,
            lockEntry: lockEntry,
            isWorkspaceMember: isWorkspaceMember,
            workspaceRoot: workspaceRoot,
            workspaceTargets: workspaceTargets,
            workspaceLink: workspaceLink,
            lifecycleScripts: lifecycleScripts,
            usedByEntrypoint: usedByEntrypoint);
        return new NodePackage(
            name: name.Trim(),
            version: version.Trim(),
            relativePath: relativeDirectory,
            packageJsonLocator: locator,
            isPrivate: isPrivate,
            lockEntry: lockEntry,
            isWorkspaceMember: isWorkspaceMember,
            workspaceRoot: workspaceRoot,
            workspaceTargets: workspaceTargets,
            workspaceLink: workspaceLink,
            lifecycleScripts: lifecycleScripts,
            usedByEntrypoint: usedByEntrypoint,
            declaredOnly: false,
            lockSource: lockSource,
            lockLocator: lockLocator);
    }
    catch (IOException)
    {

@@ -36,7 +36,7 @@ internal static class PythonDistributionLoader

        var trimmedName = name.Trim();
        var trimmedVersion = version.Trim();
        var normalizedName = NormalizePackageName(trimmedName);
        var normalizedName = PythonPathHelper.NormalizePackageName(trimmedName);
        var purl = $"pkg:pypi/{normalizedName}@{trimmedVersion}";

        var metadataEntries = new List<KeyValuePair<string, string?>>();
@@ -321,28 +321,6 @@ internal static class PythonDistributionLoader
        return null;
    }

    private static string NormalizePackageName(string name)
    {
        if (string.IsNullOrWhiteSpace(name))
        {
            return string.Empty;
        }

        var builder = new StringBuilder(name.Length);
        foreach (var ch in name.Trim().ToLowerInvariant())
        {
            builder.Append(ch switch
            {
                '_' => '-',
                '.' => '-',
                ' ' => '-',
                _ => ch
            });
        }

        return builder.ToString();
    }

    private static string ResolvePackageRoot(string distInfoPath)
    {
        var parent = Directory.GetParent(distInfoPath);
@@ -1063,21 +1041,45 @@ internal sealed class PythonDirectUrlInfo
    }
}

internal static class PythonPathHelper
{
    public static string NormalizeRelative(LanguageAnalyzerContext context, string path)
    {
        var relative = context.GetRelativePath(path);
        if (string.IsNullOrEmpty(relative) || relative == ".")
        {
            return ".";
        }

        return relative;
    }
}

internal static class PythonEncoding
internal static class PythonPathHelper
{
    public static string NormalizeRelative(LanguageAnalyzerContext context, string path)
    {
        var relative = context.GetRelativePath(path);
        if (string.IsNullOrEmpty(relative) || relative == ".")
        {
            return ".";
        }

        return relative;
    }

    public static string NormalizePackageName(string name)
    {
        if (string.IsNullOrWhiteSpace(name))
        {
            return string.Empty;
        }

        var trimmed = name.Trim();
        var builder = new StringBuilder(trimmed.Length);
        foreach (var ch in trimmed)
        {
            var lower = char.ToLowerInvariant(ch);
            builder.Append(lower switch
            {
                '_' => '-',
                '.' => '-',
                ' ' => '-',
                _ => lower
            });
        }

        return builder.ToString();
    }
}

internal static class PythonEncoding
{
    public static readonly UTF8Encoding Utf8 = new(encoderShouldEmitUTF8Identifier: false, throwOnInvalidBytes: true);
}

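NormalizePackageName applies a PyPI-style folding: lower-case everything and map `_`, `.`, and spaces to `-` (note it maps characters one-for-one rather than collapsing runs, as full PEP 503 normalization would). A quick illustrative check, with hypothetical inputs:

// All three spellings normalize to the same lookup key.
Console.WriteLine(PythonPathHelper.NormalizePackageName("Flask_SQLAlchemy"));   // flask-sqlalchemy
Console.WriteLine(PythonPathHelper.NormalizePackageName("flask.sqlalchemy"));   // flask-sqlalchemy
Console.WriteLine(PythonPathHelper.NormalizePackageName(" Flask SQLAlchemy ")); // flask-sqlalchemy
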
@@ -0,0 +1,283 @@
using System.Text.Json;
using System.Text.RegularExpressions;

namespace StellaOps.Scanner.Analyzers.Lang.Python.Internal;

internal static class PythonLockFileCollector
{
    private static readonly string[] RequirementPatterns =
    {
        "requirements.txt",
        "requirements-dev.txt",
        "requirements.prod.txt"
    };

    private static readonly Regex RequirementLinePattern = new(@"^\s*(?<name>[A-Za-z0-9_.\-]+)(?<extras>\[[^\]]+\])?\s*(?<op>==|===)\s*(?<version>[^\s;#]+)", RegexOptions.Compiled);
    private static readonly Regex EditablePattern = new(@"^-{1,2}editable\s*=?\s*(?<path>.+)$", RegexOptions.Compiled | RegexOptions.IgnoreCase);

    public static async Task<PythonLockData> LoadAsync(LanguageAnalyzerContext context, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);

        var entries = new Dictionary<string, PythonLockEntry>(StringComparer.OrdinalIgnoreCase);

        foreach (var pattern in RequirementPatterns)
        {
            var candidate = Path.Combine(context.RootPath, pattern);
            if (File.Exists(candidate))
            {
                await ParseRequirementsFileAsync(context, candidate, entries, cancellationToken).ConfigureAwait(false);
            }
        }

        var pipfileLock = Path.Combine(context.RootPath, "Pipfile.lock");
        if (File.Exists(pipfileLock))
        {
            await ParsePipfileLockAsync(context, pipfileLock, entries, cancellationToken).ConfigureAwait(false);
        }

        var poetryLock = Path.Combine(context.RootPath, "poetry.lock");
        if (File.Exists(poetryLock))
        {
            await ParsePoetryLockAsync(context, poetryLock, entries, cancellationToken).ConfigureAwait(false);
        }

        return entries.Count == 0 ? PythonLockData.Empty : new PythonLockData(entries);
    }

    private static async Task ParseRequirementsFileAsync(LanguageAnalyzerContext context, string path, IDictionary<string, PythonLockEntry> entries, CancellationToken cancellationToken)
    {
        await using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
        using var reader = new StreamReader(stream);
        string? line;
        var locator = PythonPathHelper.NormalizeRelative(context, path);

        while ((line = await reader.ReadLineAsync(cancellationToken).ConfigureAwait(false)) is not null)
        {
            cancellationToken.ThrowIfCancellationRequested();

            line = line.Trim();
            if (string.IsNullOrWhiteSpace(line) || line.StartsWith("#", StringComparison.Ordinal) || line.StartsWith("-r ", StringComparison.OrdinalIgnoreCase))
            {
                continue;
            }

            var editableMatch = EditablePattern.Match(line);
            if (editableMatch.Success)
            {
                var editablePath = editableMatch.Groups["path"].Value.Trim().Trim('"', '\'');
                var packageName = Path.GetFileName(editablePath.TrimEnd(Path.DirectorySeparatorChar, '/'));
                if (string.IsNullOrWhiteSpace(packageName))
                {
                    continue;
                }

                var entry = new PythonLockEntry(
                    Name: packageName,
                    Version: null,
                    Source: Path.GetFileName(path),
                    Locator: locator,
                    Extras: Array.Empty<string>(),
                    Resolved: null,
                    Index: null,
                    EditablePath: editablePath);

                entries[entry.DeclarationKey] = entry;
                continue;
            }

            var match = RequirementLinePattern.Match(line);
            if (!match.Success)
            {
                continue;
            }

            var name = match.Groups["name"].Value;
            var version = match.Groups["version"].Value;
            var extras = match.Groups["extras"].Success
                ? match.Groups["extras"].Value.Trim('[', ']').Split(',', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries)
                : Array.Empty<string>();

            var requirementEntry = new PythonLockEntry(
                Name: name,
                Version: version,
                Source: Path.GetFileName(path),
                Locator: locator,
                Extras: extras,
                Resolved: null,
                Index: null,
                EditablePath: null);

            entries[requirementEntry.DeclarationKey] = requirementEntry;
        }
    }

    private static async Task ParsePipfileLockAsync(LanguageAnalyzerContext context, string path, IDictionary<string, PythonLockEntry> entries, CancellationToken cancellationToken)
    {
        await using var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
        using var document = await JsonDocument.ParseAsync(stream, cancellationToken: cancellationToken).ConfigureAwait(false);

        var root = document.RootElement;
        if (!root.TryGetProperty("default", out var defaultDeps))
        {
            return;
        }

        foreach (var property in defaultDeps.EnumerateObject())
        {
            cancellationToken.ThrowIfCancellationRequested();

            if (!property.Value.TryGetProperty("version", out var versionElement))
            {
                continue;
            }

            var version = versionElement.GetString();
            if (string.IsNullOrWhiteSpace(version))
            {
                continue;
            }

            version = version.TrimStart('=', ' ');
            var entry = new PythonLockEntry(
                Name: property.Name,
                Version: version,
                Source: "Pipfile.lock",
                Locator: PythonPathHelper.NormalizeRelative(context, path),
                Extras: Array.Empty<string>(),
                Resolved: property.Value.TryGetProperty("file", out var fileElement) ? fileElement.GetString() : null,
                Index: property.Value.TryGetProperty("index", out var indexElement) ? indexElement.GetString() : null,
                EditablePath: null);

            entries[entry.DeclarationKey] = entry;
        }
    }

    private static async Task ParsePoetryLockAsync(LanguageAnalyzerContext context, string path, IDictionary<string, PythonLockEntry> entries, CancellationToken cancellationToken)
    {
        using var reader = new StreamReader(path);
        string? line;
        string? currentName = null;
        string? currentVersion = null;
        var extras = new List<string>();

        void Flush()
        {
            if (string.IsNullOrWhiteSpace(currentName) || string.IsNullOrWhiteSpace(currentVersion))
            {
                currentName = null;
                currentVersion = null;
                extras.Clear();
                return;
            }

            var entry = new PythonLockEntry(
                Name: currentName!,
                Version: currentVersion!,
                Source: "poetry.lock",
                Locator: PythonPathHelper.NormalizeRelative(context, path),
                Extras: extras.ToArray(),
                Resolved: null,
                Index: null,
                EditablePath: null);

            entries[entry.DeclarationKey] = entry;
            currentName = null;
            currentVersion = null;
            extras.Clear();
        }

        while ((line = await reader.ReadLineAsync(cancellationToken).ConfigureAwait(false)) is not null)
        {
            cancellationToken.ThrowIfCancellationRequested();

            line = line.Trim();
            if (line.Length == 0)
            {
                continue;
            }

            if (line.StartsWith("[[package]]", StringComparison.Ordinal))
            {
                Flush();
                continue;
            }

            if (line.StartsWith("name = ", StringComparison.Ordinal))
            {
                currentName = TrimQuoted(line);
                continue;
            }

            if (line.StartsWith("version = ", StringComparison.Ordinal))
            {
                currentVersion = TrimQuoted(line);
                continue;
            }

            if (line.StartsWith("extras = [", StringComparison.Ordinal))
            {
                var extrasValue = line["extras = ".Length..].Trim();
                extrasValue = extrasValue.Trim('[', ']');
                extras.AddRange(extrasValue.Split(',', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries).Select(static x => x.Trim('"')));
                continue;
            }
        }

        Flush();
    }

    private static string TrimQuoted(string line)
    {
        var index = line.IndexOf('=', StringComparison.Ordinal);
        if (index < 0)
        {
            return line;
        }

        var value = line[(index + 1)..].Trim();
        return value.Trim('"');
    }
}

internal sealed record PythonLockEntry(
    string Name,
    string? Version,
    string Source,
    string Locator,
    IReadOnlyCollection<string> Extras,
    string? Resolved,
    string? Index,
    string? EditablePath)
{
    public string DeclarationKey => BuildKey(Name, Version);

    private static string BuildKey(string name, string? version)
    {
        var normalized = PythonPathHelper.NormalizePackageName(name);
        return version is null
            ? normalized
            : $"{normalized}@{version}".ToLowerInvariant();
    }
}

internal sealed class PythonLockData
{
    public static readonly PythonLockData Empty = new(new Dictionary<string, PythonLockEntry>(StringComparer.OrdinalIgnoreCase));

    private readonly Dictionary<string, PythonLockEntry> _entries;

    public PythonLockData(Dictionary<string, PythonLockEntry> entries)
    {
        _entries = entries;
    }

    public IReadOnlyCollection<PythonLockEntry> Entries => _entries.Values;

    public bool TryGet(string name, string version, out PythonLockEntry? entry)
    {
        var key = $"{PythonPathHelper.NormalizePackageName(name)}@{version}".ToLowerInvariant();
        return _entries.TryGetValue(key, out entry);
    }
}
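PythonLockData keys every entry by the normalized `name@version` pair, so TryGet succeeds regardless of how the dist-info metadata cases or separates the name. A minimal sketch with a hypothetical entry:

// Hypothetical entry, shaped like the output of ParseRequirementsFileAsync.
var entry = new PythonLockEntry("Pillow", "10.2.0", "requirements.txt", "requirements.txt",
    Array.Empty<string>(), Resolved: null, Index: null, EditablePath: null);

var data = new PythonLockData(new Dictionary<string, PythonLockEntry>(StringComparer.OrdinalIgnoreCase)
{
    [entry.DeclarationKey] = entry
});

// "pillow" matches "Pillow": both sides normalize to "pillow@10.2.0".
Console.WriteLine(data.TryGet("pillow", "10.2.0", out _)); // True
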
@@ -1,4 +1,5 @@
using System.Text.Json;
using System.Linq;
using System.Text.Json;
using StellaOps.Scanner.Analyzers.Lang.Python.Internal;

namespace StellaOps.Scanner.Analyzers.Lang.Python;
@@ -24,18 +25,22 @@ public sealed class PythonLanguageAnalyzer : ILanguageAnalyzer
        return AnalyzeInternalAsync(context, writer, cancellationToken);
    }

    private static async ValueTask AnalyzeInternalAsync(LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
    {
        var distInfoDirectories = Directory
            .EnumerateDirectories(context.RootPath, "*.dist-info", Enumeration)
            .OrderBy(static path => path, StringComparer.Ordinal)
            .ToArray();

        foreach (var distInfoPath in distInfoDirectories)
        {
            cancellationToken.ThrowIfCancellationRequested();

            PythonDistribution? distribution;
    private static async ValueTask AnalyzeInternalAsync(LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
    {
        var lockData = await PythonLockFileCollector.LoadAsync(context, cancellationToken).ConfigureAwait(false);
        var matchedLocks = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        var hasLockEntries = lockData.Entries.Count > 0;

        var distInfoDirectories = Directory
            .EnumerateDirectories(context.RootPath, "*.dist-info", Enumeration)
            .OrderBy(static path => path, StringComparer.Ordinal)
            .ToArray();

        foreach (var distInfoPath in distInfoDirectories)
        {
            cancellationToken.ThrowIfCancellationRequested();

            PythonDistribution? distribution;
            try
            {
                distribution = await PythonDistributionLoader.LoadAsync(context, distInfoPath, cancellationToken).ConfigureAwait(false);
@@ -54,19 +59,104 @@ public sealed class PythonLanguageAnalyzer : ILanguageAnalyzer
            }

            if (distribution is null)
            {
                continue;
            }

            writer.AddFromPurl(
                analyzerId: "python",
                purl: distribution.Purl,
                name: distribution.Name,
                version: distribution.Version,
                type: "pypi",
                metadata: distribution.SortedMetadata,
                evidence: distribution.SortedEvidence,
                usedByEntrypoint: distribution.UsedByEntrypoint);
        }
    }
}
            {
                continue;
            }

            var metadata = distribution.SortedMetadata.ToList();

            if (lockData.TryGet(distribution.Name, distribution.Version, out var lockEntry))
            {
                matchedLocks.Add(lockEntry!.DeclarationKey);
                AppendLockMetadata(metadata, lockEntry);
            }
            else if (hasLockEntries)
            {
                metadata.Add(new KeyValuePair<string, string?>("lockMissing", "true"));
            }

            writer.AddFromPurl(
                analyzerId: "python",
                purl: distribution.Purl,
                name: distribution.Name,
                version: distribution.Version,
                type: "pypi",
                metadata: metadata,
                evidence: distribution.SortedEvidence,
                usedByEntrypoint: distribution.UsedByEntrypoint);
        }

        if (lockData.Entries.Count > 0)
        {
            foreach (var entry in lockData.Entries)
            {
                if (matchedLocks.Contains(entry.DeclarationKey))
                {
                    continue;
                }

                var declaredMetadata = new List<KeyValuePair<string, string?>>
                {
                    new("declaredOnly", "true"),
                    new("lockSource", entry.Source),
                    new("lockLocator", entry.Locator)
                };

                AppendCommonLockFields(declaredMetadata, entry);

                var version = string.IsNullOrWhiteSpace(entry.Version) ? "editable" : entry.Version!;
                var purl = $"pkg:pypi/{PythonPathHelper.NormalizePackageName(entry.Name)}@{version}";

                var evidence = new[]
                {
                    new LanguageComponentEvidence(
                        LanguageEvidenceKind.Metadata,
                        entry.Source,
                        entry.Locator,
                        entry.Resolved,
                        Sha256: null)
                };

                writer.AddFromPurl(
                    analyzerId: "python",
                    purl: purl,
                    name: entry.Name,
                    version: version,
                    type: "pypi",
                    metadata: declaredMetadata,
                    evidence: evidence,
                    usedByEntrypoint: false);
            }
        }
    }

    private static void AppendLockMetadata(List<KeyValuePair<string, string?>> metadata, PythonLockEntry entry)
    {
        metadata.Add(new KeyValuePair<string, string?>("lockSource", entry.Source));
        metadata.Add(new KeyValuePair<string, string?>("lockLocator", entry.Locator));
        AppendCommonLockFields(metadata, entry);
    }

    private static void AppendCommonLockFields(List<KeyValuePair<string, string?>> metadata, PythonLockEntry entry)
    {
        if (entry.Extras.Count > 0)
        {
            metadata.Add(new KeyValuePair<string, string?>("lockExtras", string.Join(';', entry.Extras)));
        }

        if (!string.IsNullOrWhiteSpace(entry.Resolved))
        {
            metadata.Add(new KeyValuePair<string, string?>("lockResolved", entry.Resolved));
        }

        if (!string.IsNullOrWhiteSpace(entry.Index))
        {
            metadata.Add(new KeyValuePair<string, string?>("lockIndex", entry.Index));
        }

        if (!string.IsNullOrWhiteSpace(entry.EditablePath))
        {
            metadata.Add(new KeyValuePair<string, string?>("lockEditablePath", entry.EditablePath));
        }
    }
}

@@ -3,4 +3,10 @@ namespace StellaOps.Scanner.Analyzers.Lang.Ruby.Internal;
internal sealed record RubyCapabilities(
    bool UsesExec,
    bool UsesNetwork,
    bool UsesSerialization);
    bool UsesSerialization,
    IReadOnlyCollection<string> JobSchedulers)
{
    public bool HasJobSchedulers => JobSchedulers.Count > 0;

    public static RubyCapabilities Empty { get; } = new(false, false, false, Array.Empty<string>());
}

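Extending the record this way keeps scheduler evidence strongly typed, so callers can branch on HasJobSchedulers without null checks. A small hypothetical usage sketch:

// Hypothetical result, shaped like what RubyCapabilityDetector.DetectAsync returns.
var caps = new RubyCapabilities(UsesExec: false, UsesNetwork: true, UsesSerialization: false,
    JobSchedulers: new[] { "sidekiq" });

if (caps.HasJobSchedulers)
{
    // Feeds the "capability.scheduler" metadata emitted by RubyPackage.CreateMetadata.
    Console.WriteLine(string.Join(';', caps.JobSchedulers)); // sidekiq
}
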
@@ -0,0 +1,320 @@
using System.Text;
using System.Text.RegularExpressions;

namespace StellaOps.Scanner.Analyzers.Lang.Ruby.Internal;

internal static class RubyCapabilityDetector
{
    private const int MaxFileBytes = 512 * 1024;

    private static readonly string[] CandidateExtensions =
    {
        ".rb",
        ".rake",
        ".ru",
        ".thor",
        ".builder",
        ".gemspec"
    };

    private static readonly string[] CandidateFileNames =
    {
        "Gemfile",
        "gems.rb",
        "Rakefile",
        "config.ru"
    };

    private static readonly string[] IgnoredDirectoryNames =
    {
        ".bundle",
        ".git",
        ".hg",
        ".svn",
        "bin",
        "coverage",
        "log",
        "node_modules",
        "pkg",
        "tmp",
        "vendor",
    };

    private static readonly RegexOptions PatternOptions = RegexOptions.Compiled | RegexOptions.CultureInvariant | RegexOptions.Multiline;

    private static readonly Regex[] ExecPatterns =
    {
        CreateRegex(@"\bKernel\.system\s*\("),
        CreateRegex(@"\bsystem\s*\("),
        CreateRegex(@"\bKernel\.spawn\s*\("),
        CreateRegex(@"\bspawn\s*\("),
        CreateRegex(@"\bOpen3\.[a-zA-Z_]+\b"),
        CreateRegex(@"`[^`]+`"),
        CreateRegex(@"%x\[[^\]]+\]"),
        CreateRegex(@"%x\([^)]*\)")
    };

    private static readonly Regex[] NetworkPatterns =
    {
        CreateRegex(@"\bNet::HTTP\b"),
        CreateRegex(@"\bFaraday\b"),
        CreateRegex(@"\bHTTPParty\b"),
        CreateRegex(@"\bHTTParty\b"),
        CreateRegex(@"\bRestClient\b"),
        CreateRegex(@"\bRedis\b"),
        CreateRegex(@"\bTCPSocket\b"),
        CreateRegex(@"\bUDPSocket\b"),
        CreateRegex(@"\bActiveRecord::Base\.establish_connection\b")
    };

    private static readonly Regex[] SerializationPatterns =
    {
        CreateRegex(@"\bMarshal\.load\b"),
        CreateRegex(@"\bMarshal\.restore\b"),
        CreateRegex(@"\bYAML\.(?:load|unsafe_load|safe_load)\b"),
        CreateRegex(@"\bOj\.load\b"),
        CreateRegex(@"\bJSON\.load\b"),
        CreateRegex(@"\bActiveSupport::JSON\.decode\b")
    };

    private static readonly IReadOnlyDictionary<string, Regex[]> SchedulerPatterns = new Dictionary<string, Regex[]>
    {
        ["activejob"] = new[]
        {
            CreateRegex(@"\bclass\s+[A-Za-z0-9_:]+\s*<\s*ActiveJob::Base\b"),
            CreateRegex(@"\bActiveJob::Base\b")
        },
        ["clockwork"] = new[]
        {
            CreateRegex(@"\bClockwork\.every\b")
        },
        ["resque"] = new[]
        {
            CreateRegex(@"\bResque\.enqueue\b"),
            CreateRegex(@"\bclass\s+[A-Za-z0-9_:]+\s+<\s+Resque::Job\b")
        },
        ["sidekiq"] = new[]
        {
            CreateRegex(@"\binclude\s+Sidekiq::Worker\b"),
            CreateRegex(@"\bSidekiq::Client\b")
        },
        ["whenever"] = new[]
        {
            CreateRegex(@"\bWhenever::JobList\b"),
            CreateRegex(@"\bschedule_file\b")
        }
    };

    public static async ValueTask<RubyCapabilities> DetectAsync(LanguageAnalyzerContext context, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);

        var usesExec = false;
        var usesNetwork = false;
        var usesSerialization = false;
        var jobSchedulers = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

        foreach (var file in EnumerateCandidateFiles(context.RootPath))
        {
            cancellationToken.ThrowIfCancellationRequested();

            var content = await TryReadFileAsync(file, cancellationToken).ConfigureAwait(false);
            if (content is null)
            {
                continue;
            }

            if (!usesExec && MatchesAny(content, ExecPatterns))
            {
                usesExec = true;
            }

            if (!usesNetwork && MatchesAny(content, NetworkPatterns))
            {
                usesNetwork = true;
            }

            if (!usesSerialization && MatchesAny(content, SerializationPatterns))
            {
                usesSerialization = true;
            }

            foreach (var scheduler in DetectSchedulers(content))
            {
                jobSchedulers.Add(scheduler);
            }

            if (usesExec && usesNetwork && usesSerialization && jobSchedulers.Count == SchedulerPatterns.Count)
            {
                break;
            }
        }

        var orderedSchedulers = jobSchedulers
            .OrderBy(static name => name, StringComparer.Ordinal)
            .ToArray();

        return new RubyCapabilities(usesExec, usesNetwork, usesSerialization, orderedSchedulers);
    }

    private static IEnumerable<string> EnumerateCandidateFiles(string rootPath)
    {
        var pending = new Stack<string>();
        pending.Push(rootPath);

        while (pending.Count > 0)
        {
            var current = pending.Pop();
            IEnumerable<string>? directories = null;

            try
            {
                directories = Directory.EnumerateDirectories(current);
            }
            catch (IOException)
            {
                continue;
            }
            catch (UnauthorizedAccessException)
            {
                continue;
            }

            if (directories is not null)
            {
                foreach (var directory in directories.OrderBy(static d => d, StringComparer.OrdinalIgnoreCase))
                {
                    if (ShouldSkipDirectory(rootPath, directory))
                    {
                        continue;
                    }

                    pending.Push(directory);
                }
            }

            IEnumerable<string>? files = null;
            try
            {
                files = Directory.EnumerateFiles(current);
            }
            catch (IOException)
            {
                continue;
            }
            catch (UnauthorizedAccessException)
            {
                continue;
            }

            if (files is null)
            {
                continue;
            }

            foreach (var file in files.OrderBy(static f => f, StringComparer.OrdinalIgnoreCase))
            {
                if (IsCandidateFile(file))
                {
                    yield return file;
                }
            }
        }
    }

    private static bool ShouldSkipDirectory(string rootPath, string directory)
    {
        var relative = Path.GetRelativePath(rootPath, directory);
        if (relative.StartsWith("..", StringComparison.Ordinal))
        {
            return true;
        }

        var segments = relative
            .Replace('\\', '/')
            .Split('/', StringSplitOptions.RemoveEmptyEntries);

        foreach (var segment in segments)
        {
            if (IgnoredDirectoryNames.Contains(segment, StringComparer.OrdinalIgnoreCase))
            {
                return true;
            }
        }

        return false;
    }

    private static bool IsCandidateFile(string filePath)
    {
        var fileName = Path.GetFileName(filePath);
        if (CandidateFileNames.Contains(fileName, StringComparer.OrdinalIgnoreCase))
        {
            return true;
        }

        var extension = Path.GetExtension(filePath);
        return CandidateExtensions.Contains(extension, StringComparer.OrdinalIgnoreCase);
    }

    private static async ValueTask<string?> TryReadFileAsync(string filePath, CancellationToken cancellationToken)
    {
        try
        {
            var info = new FileInfo(filePath);
            if (!info.Exists || info.Length == 0 || info.Length > MaxFileBytes)
            {
                return null;
            }

            await using var stream = new FileStream(
                filePath,
                FileMode.Open,
                FileAccess.Read,
                FileShare.Read,
                bufferSize: 4096,
                FileOptions.Asynchronous | FileOptions.SequentialScan);

            using var reader = new StreamReader(stream, Encoding.UTF8, detectEncodingFromByteOrderMarks: true);
            return await reader.ReadToEndAsync(cancellationToken).ConfigureAwait(false);
        }
        catch (IOException)
        {
            return null;
        }
        catch (UnauthorizedAccessException)
        {
            return null;
        }
    }

    private static bool MatchesAny(string content, IEnumerable<Regex> patterns)
    {
        foreach (var pattern in patterns)
        {
            if (pattern.IsMatch(content))
            {
                return true;
            }
        }

        return false;
    }

    private static IEnumerable<string> DetectSchedulers(string content)
    {
        foreach (var pair in SchedulerPatterns)
        {
            foreach (var pattern in pair.Value)
            {
                if (pattern.IsMatch(content))
                {
                    yield return pair.Key;
                    break;
                }
            }
        }
    }

    private static Regex CreateRegex(string pattern) => new(pattern, PatternOptions);
}

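Each scheduler key maps to one or more signature regexes, and a single match is enough for the key to be reported. The check below mirrors DetectSchedulers (which is private) using the same sidekiq pattern; the embedded Ruby snippet is invented for illustration:

using System.Text.RegularExpressions;

// A worker class as it might appear in a scanned Ruby file.
var content = "class HardWorker\n  include Sidekiq::Worker\nend\n";
var sidekiq = new Regex(@"\binclude\s+Sidekiq::Worker\b");
Console.WriteLine(sidekiq.IsMatch(content)); // True, so "sidekiq" would be reported.
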
@@ -38,7 +38,7 @@ internal sealed class RubyPackage

    public string ComponentKey => $"purl::{Purl}";

    public IReadOnlyCollection<KeyValuePair<string, string?>> CreateMetadata(RubyCapabilities? capabilities = null)
    public IReadOnlyCollection<KeyValuePair<string, string?>> CreateMetadata(RubyCapabilities? capabilities = null, RubyRuntimeUsage? runtimeUsage = null)
    {
        var metadata = new List<KeyValuePair<string, string?>>
        {
@@ -73,6 +73,35 @@ internal sealed class RubyPackage
            {
                metadata.Add(new KeyValuePair<string, string?>("capability.serialization", "true"));
            }

            if (capabilities.HasJobSchedulers)
            {
                var schedulers = capabilities.JobSchedulers;
                metadata.Add(new KeyValuePair<string, string?>("capability.scheduler", string.Join(';', schedulers)));

                foreach (var scheduler in schedulers)
                {
                    var key = $"capability.scheduler.{scheduler}";
                    metadata.Add(new KeyValuePair<string, string?>(key, "true"));
                }
            }
        }

        if (runtimeUsage is not null && runtimeUsage.HasFiles)
        {
            metadata.Add(new KeyValuePair<string, string?>("runtime.used", "true"));

            if (runtimeUsage.HasEntrypoints)
            {
                metadata.Add(new KeyValuePair<string, string?>("runtime.entrypoints", string.Join(';', runtimeUsage.Entrypoints)));
            }

            metadata.Add(new KeyValuePair<string, string?>("runtime.files", string.Join(';', runtimeUsage.ReferencingFiles)));

            if (runtimeUsage.HasReasons)
            {
                metadata.Add(new KeyValuePair<string, string?>("runtime.reasons", string.Join(';', runtimeUsage.Reasons)));
            }
        }

        return metadata

@@ -0,0 +1,439 @@
using System.Text;
using System.Text.RegularExpressions;

namespace StellaOps.Scanner.Analyzers.Lang.Ruby.Internal;

internal static class RubyRuntimeGraphBuilder
{
    private const int MaxFileBytes = 512 * 1024;

    private static readonly RegexOptions PatternOptions = RegexOptions.Compiled | RegexOptions.CultureInvariant | RegexOptions.Multiline;

    private static readonly Regex RequireRegex = new(@"(?<![A-Za-z0-9_])require\s*(?:\(|\s)\s*(?<quote>['""])(?<target>[^'""]+)\k<quote>", PatternOptions);
    private static readonly Regex RequireRelativeRegex = new(@"(?<![A-Za-z0-9_])require_relative\s*(?:\(|\s)\s*(?<quote>['""])(?<target>[^'""]+)\k<quote>", PatternOptions);
    private static readonly Regex AutoloadRegex = new(@"(?<![A-Za-z0-9_])autoload\??\s*(?:\(|\s)\s*:?[A-Za-z0-9_?!]+\s*,\s*(?<quote>['""])(?<target>[^'""]+)\k<quote>", PatternOptions);

    private static readonly string[] CandidateExtensions =
    {
        ".rb",
        ".rake",
        ".ru",
        ".thor"
    };

    private static readonly string[] CandidateFileNames =
    {
        "Gemfile",
        "Rakefile",
        "config.ru"
    };

    private static readonly string[] IgnoredDirectories =
    {
        ".bundle",
        ".git",
        ".hg",
        ".svn",
        "coverage",
        "log",
        "node_modules",
        "pkg",
        "tmp",
        "vendor"
    };

    public static async ValueTask<RubyRuntimeGraph> BuildAsync(LanguageAnalyzerContext context, IReadOnlyList<RubyPackage> packages, CancellationToken cancellationToken)
    {
        ArgumentNullException.ThrowIfNull(context);
        ArgumentNullException.ThrowIfNull(packages);

        if (packages.Count == 0)
        {
            return RubyRuntimeGraph.Empty;
        }

        var usageBuilders = new Dictionary<string, RubyRuntimeUsageBuilder>(StringComparer.OrdinalIgnoreCase);

        foreach (var file in EnumerateRubyFiles(context.RootPath))
        {
            cancellationToken.ThrowIfCancellationRequested();

            var requireReferences = await ReadReferencesAsync(file, cancellationToken).ConfigureAwait(false);
            if (requireReferences.Count == 0)
            {
                continue;
            }

            var relativePath = context.GetRelativePath(file);
            if (string.IsNullOrWhiteSpace(relativePath))
            {
                relativePath = Path.GetFileName(file);
            }

            var isEntrypoint = IsEntrypoint(context, file, relativePath);

            foreach (var reference in requireReferences)
            {
                var key = NormalizeRequireTarget(reference.Target);
                if (key.Length == 0)
                {
                    continue;
                }

                if (!usageBuilders.TryGetValue(key, out var builder))
                {
                    builder = new RubyRuntimeUsageBuilder();
                    usageBuilders[key] = builder;
                }

                builder.AddReference(relativePath, reference.Reason, isEntrypoint);
            }
        }

        if (usageBuilders.Count == 0)
        {
            return RubyRuntimeGraph.Empty;
        }

        var usages = usageBuilders
            .OrderBy(static pair => pair.Key, StringComparer.OrdinalIgnoreCase)
            .ToDictionary(
                static pair => pair.Key,
                static pair => pair.Value.Build(),
                StringComparer.OrdinalIgnoreCase);

        return new RubyRuntimeGraph(usages);
    }

private static string NormalizeRequireTarget(string target)
|
||||
{
|
||||
if (string.IsNullOrWhiteSpace(target))
|
||||
{
|
||||
return string.Empty;
|
||||
}
|
||||
|
||||
var trimmed = target.Trim();
|
||||
if (trimmed.StartsWith(".", StringComparison.Ordinal))
|
||||
{
|
||||
return string.Empty;
|
||||
}
|
||||
|
||||
trimmed = trimmed.Replace('\\', '/');
|
||||
var slashIndex = trimmed.IndexOf('/');
|
||||
if (slashIndex > 0)
|
||||
{
|
||||
trimmed = trimmed[..slashIndex];
|
||||
}
|
||||
|
||||
return trimmed;
|
||||
}
|
||||
|
||||
private static async ValueTask<IReadOnlyList<RubyRuntimeReference>> ReadReferencesAsync(string filePath, CancellationToken cancellationToken)
|
||||
{
|
||||
var info = new FileInfo(filePath);
|
||||
if (!info.Exists || info.Length == 0 || info.Length > MaxFileBytes)
|
||||
{
|
||||
return Array.Empty<RubyRuntimeReference>();
|
||||
}
|
||||
|
||||
string content;
|
||||
try
|
||||
{
|
||||
await using var stream = new FileStream(
|
||||
filePath,
|
||||
FileMode.Open,
|
||||
FileAccess.Read,
|
||||
FileShare.Read,
|
||||
bufferSize: 4096,
|
||||
FileOptions.Asynchronous | FileOptions.SequentialScan);
|
||||
|
||||
using var reader = new StreamReader(stream, Encoding.UTF8, detectEncodingFromByteOrderMarks: true);
|
||||
content = await reader.ReadToEndAsync(cancellationToken).ConfigureAwait(false);
|
||||
}
|
||||
catch (IOException)
|
||||
{
|
||||
return Array.Empty<RubyRuntimeReference>();
|
||||
}
|
||||
catch (UnauthorizedAccessException)
|
||||
{
|
||||
return Array.Empty<RubyRuntimeReference>();
|
||||
}
|
||||
|
||||
if (string.IsNullOrWhiteSpace(content))
|
||||
{
|
||||
return Array.Empty<RubyRuntimeReference>();
|
||||
}
|
||||
|
||||
var references = new List<RubyRuntimeReference>();
|
||||
AppendReferences(references, RequireRegex.Matches(content), "require-static");
|
||||
AppendReferences(references, RequireRelativeRegex.Matches(content), "require-relative");
|
||||
AppendReferences(references, AutoloadRegex.Matches(content), "autoload");
|
||||
|
||||
return references;
|
||||
}
|
||||
|
||||
    private static void AppendReferences(List<RubyRuntimeReference> references, MatchCollection matches, string reason)
    {
        if (matches.Count == 0)
        {
            return;
        }

        foreach (Match match in matches)
        {
            if (!match.Success)
            {
                continue;
            }

            var target = match.Groups["target"].Value?.Trim();
            if (string.IsNullOrWhiteSpace(target))
            {
                continue;
            }

            references.Add(new RubyRuntimeReference(target, reason));
        }
    }

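    // The RequireRegex / RequireRelativeRegex / AutoloadRegex fields are declared outside
    // this hunk. A minimal sketch of what such patterns could look like — an assumption
    // for illustration, not the committed definitions; all they need is a named "target"
    // group, which is what AppendReferences reads above:
    //
    //   private static readonly Regex RequireRegex = new(
    //       @"^\s*require\s+['""](?<target>[^'""]+)['""]",
    //       RegexOptions.Multiline | RegexOptions.Compiled);
    //
    //   private static readonly Regex RequireRelativeRegex = new(
    //       @"^\s*require_relative\s+['""](?<target>[^'""]+)['""]",
    //       RegexOptions.Multiline | RegexOptions.Compiled);
    //
    //   private static readonly Regex AutoloadRegex = new(
    //       @"autoload\s*\(?\s*:\w+\s*,\s*['""](?<target>[^'""]+)['""]",
    //       RegexOptions.Compiled);
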
    private static IEnumerable<string> EnumerateRubyFiles(string rootPath)
    {
        if (string.IsNullOrWhiteSpace(rootPath) || !Directory.Exists(rootPath))
        {
            yield break;
        }

        var pending = new Stack<string>();
        pending.Push(rootPath);

        while (pending.Count > 0)
        {
            var current = pending.Pop();
            IEnumerable<string>? directories = null;

            try
            {
                directories = Directory.EnumerateDirectories(current);
            }
            catch (IOException)
            {
                continue;
            }
            catch (UnauthorizedAccessException)
            {
                continue;
            }

            if (directories is not null)
            {
                foreach (var directory in directories.OrderBy(static d => d, StringComparer.OrdinalIgnoreCase))
                {
                    if (ShouldSkipDirectory(rootPath, directory))
                    {
                        continue;
                    }

                    pending.Push(directory);
                }
            }

            IEnumerable<string>? files = null;
            try
            {
                files = Directory.EnumerateFiles(current);
            }
            catch (IOException)
            {
                continue;
            }
            catch (UnauthorizedAccessException)
            {
                continue;
            }

            if (files is null)
            {
                continue;
            }

            foreach (var file in files.OrderBy(static f => f, StringComparer.OrdinalIgnoreCase))
            {
                if (IsCandidateFile(file))
                {
                    yield return file;
                }
            }
        }
    }

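    // Prunes paths that escape the scan root or contain a segment listed in
    // IgnoredDirectories (declared outside this hunk), keeping the walk inside
    // application code.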
    private static bool ShouldSkipDirectory(string rootPath, string directory)
    {
        var relative = Path.GetRelativePath(rootPath, directory);
        if (relative.StartsWith("..", StringComparison.Ordinal))
        {
            return true;
        }

        var segments = relative
            .Replace('\\', '/')
            .Split('/', StringSplitOptions.RemoveEmptyEntries);

        foreach (var segment in segments)
        {
            if (IgnoredDirectories.Contains(segment, StringComparer.OrdinalIgnoreCase))
            {
                return true;
            }
        }

        return false;
    }

    private static bool IsCandidateFile(string filePath)
    {
        var fileName = Path.GetFileName(filePath);
        if (CandidateFileNames.Contains(fileName, StringComparer.OrdinalIgnoreCase))
        {
            return true;
        }

        var extension = Path.GetExtension(filePath);
        return CandidateExtensions.Contains(extension, StringComparer.OrdinalIgnoreCase);
    }

    private static bool IsEntrypoint(LanguageAnalyzerContext context, string absolutePath, string relativePath)
    {
        if (context.UsageHints.IsPathUsed(absolutePath))
        {
            return true;
        }

        var normalized = relativePath.Replace('\\', '/');
        if (normalized.Equals("config.ru", StringComparison.OrdinalIgnoreCase))
        {
            return true;
        }

        if (normalized.StartsWith("bin/", StringComparison.OrdinalIgnoreCase))
        {
            return true;
        }

        if (normalized is "main.rb" or "app.rb" or "server.rb")
        {
            return true;
        }

        if (normalized.StartsWith("app/jobs/", StringComparison.OrdinalIgnoreCase)
            || normalized.StartsWith("app/workers/", StringComparison.OrdinalIgnoreCase)
            || normalized.StartsWith("app/controllers/", StringComparison.OrdinalIgnoreCase)
            || normalized.StartsWith("app/channels/", StringComparison.OrdinalIgnoreCase))
        {
            return true;
        }

        if (normalized.StartsWith("config/", StringComparison.OrdinalIgnoreCase)
            && normalized.EndsWith(".rb", StringComparison.OrdinalIgnoreCase))
        {
            return true;
        }

        return false;
    }
}

internal sealed record RubyRuntimeReference(string Target, string Reason);

internal sealed class RubyRuntimeGraph
{
    private readonly IReadOnlyDictionary<string, RubyRuntimeUsage> _usages;

    public RubyRuntimeGraph(IReadOnlyDictionary<string, RubyRuntimeUsage> usages)
    {
        _usages = usages ?? throw new ArgumentNullException(nameof(usages));
    }

    public static RubyRuntimeGraph Empty { get; } = new(new Dictionary<string, RubyRuntimeUsage>(StringComparer.OrdinalIgnoreCase));

    public bool TryGetUsage(RubyPackage package, [NotNullWhen(true)] out RubyRuntimeUsage? usage)
    {
        if (package is null)
        {
            usage = null;
            return false;
        }

        foreach (var key in EnumerateCandidateKeys(package.Name))
        {
            if (_usages.TryGetValue(key, out usage))
            {
                return true;
            }
        }

        usage = null;
        return false;
    }

    private static IEnumerable<string> EnumerateCandidateKeys(string name)
    {
        if (string.IsNullOrWhiteSpace(name))
        {
            yield break;
        }

        yield return name;

        var underscore = name.Replace('-', '_');
        if (!string.Equals(underscore, name, StringComparison.Ordinal))
        {
            yield return underscore;
        }

        var hyphen = name.Replace('_', '-');
        if (!string.Equals(hyphen, name, StringComparison.Ordinal))
        {
            yield return hyphen;
        }

        var compact = name.Replace("-", string.Empty, StringComparison.Ordinal).Replace("_", string.Empty, StringComparison.Ordinal);
        if (!string.Equals(compact, name, StringComparison.Ordinal))
        {
            yield return compact;
        }
    }
}

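// Illustrative expectations (a sketch, not part of the commit): EnumerateCandidateKeys
// probes hyphen/underscore spellings so a gem named "active-model" still matches a
// require of "active_model", and vice versa.
//
//   EnumerateCandidateKeys("active-model") // => "active-model", "active_model", "activemodel"
//   EnumerateCandidateKeys("sidekiq")      // => "sidekiq"
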
internal sealed class RubyRuntimeUsageBuilder
{
    private readonly HashSet<string> _files = new(StringComparer.Ordinal);
    private readonly HashSet<string> _entrypoints = new(StringComparer.Ordinal);
    private readonly HashSet<string> _reasons = new(StringComparer.Ordinal);

    private bool _usedByEntrypoint;

    public void AddReference(string relativePath, string reason, bool isEntrypoint)
    {
        if (!string.IsNullOrWhiteSpace(relativePath))
        {
            _files.Add(relativePath.Replace('\\', '/'));
            if (isEntrypoint)
            {
                _entrypoints.Add(relativePath.Replace('\\', '/'));
                _usedByEntrypoint = true;
            }
        }

        if (!string.IsNullOrWhiteSpace(reason))
        {
            _reasons.Add(reason);
        }
    }

    public RubyRuntimeUsage Build()
    {
        var files = _files.OrderBy(static file => file, StringComparer.Ordinal).ToArray();
        var entrypoints = _entrypoints.OrderBy(static file => file, StringComparer.Ordinal).ToArray();
        var reasons = _reasons.OrderBy(static reason => reason, StringComparer.Ordinal).ToArray();
        return new RubyRuntimeUsage(_usedByEntrypoint, files, entrypoints, reasons);
    }
}

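// Hypothetical wiring sketch (not part of the commit) showing how the types above compose;
// RubyRuntimeUsageExample is an invented name for illustration only.
internal static class RubyRuntimeUsageExample
{
    internal static RubyRuntimeGraph BuildSampleGraph()
    {
        var builder = new RubyRuntimeUsageBuilder();
        builder.AddReference("app/controllers/users_controller.rb", "require-static", isEntrypoint: true);
        builder.AddReference("lib/report.rb", "autoload", isEntrypoint: false);

        // Keys are gem names; TryGetUsage tolerates hyphen/underscore spelling variants.
        var usages = new Dictionary<string, RubyRuntimeUsage>(StringComparer.OrdinalIgnoreCase)
        {
            ["nokogiri"] = builder.Build(),
        };
        return new RubyRuntimeGraph(usages);
    }
}
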
@@ -0,0 +1,15 @@
namespace StellaOps.Scanner.Analyzers.Lang.Ruby.Internal;

internal sealed record RubyRuntimeUsage(
    bool UsedByEntrypoint,
    IReadOnlyCollection<string> ReferencingFiles,
    IReadOnlyCollection<string> Entrypoints,
    IReadOnlyCollection<string> Reasons)
{
    public bool HasFiles => ReferencingFiles.Count > 0;

    public bool HasEntrypoints => Entrypoints.Count > 0;

    public bool HasReasons => Reasons.Count > 0;
}

@@ -19,20 +19,25 @@ public sealed class RubyLanguageAnalyzer : ILanguageAnalyzer
            return;
        }

        var capabilities = await RubyCapabilityDetector.DetectAsync(context, cancellationToken).ConfigureAwait(false);
        var packages = RubyPackageCollector.CollectPackages(lockData, context);
        var runtimeGraph = await RubyRuntimeGraphBuilder.BuildAsync(context, packages, cancellationToken).ConfigureAwait(false);

        foreach (var package in packages.OrderBy(static p => p.ComponentKey, StringComparer.Ordinal))
        {
            cancellationToken.ThrowIfCancellationRequested();

            runtimeGraph.TryGetUsage(package, out var runtimeUsage);

            writer.AddFromPurl(
                analyzerId: Id,
                purl: package.Purl,
                name: package.Name,
                version: package.Version,
                type: "gem",
                metadata: package.CreateMetadata(capabilities, runtimeUsage),
                evidence: package.CreateEvidence(),
                usedByEntrypoint: runtimeUsage?.UsedByEntrypoint ?? false);
        }
    }
}

@@ -0,0 +1,6 @@
# Ruby Analyzer Guild — Active Tasks

| Task ID | State | Notes |
| --- | --- | --- |
| `SCANNER-ENG-0017` | DONE (2025-11-09) | Build runtime require/autoload graph builder with tree-sitter Ruby per design §4.4, feed EntryTrace hints. |
| `SCANNER-ENG-0018` | DONE (2025-11-09) | Emit Ruby capability + framework surface signals, align with design §4.5 / Sprint 138. |

@@ -1,3 +1,4 @@
using StellaOps.Scanner.Core.Contracts;
using StellaOps.Scanner.Surface.Env;
using StellaOps.Scanner.Surface.Secrets;

@@ -7,12 +8,17 @@ public sealed class LanguageAnalyzerContext
{
    private const string SecretsComponentName = "ScannerWorkerLanguageAnalyzers";

    public LanguageAnalyzerContext(
        string rootPath,
        TimeProvider timeProvider,
        LanguageUsageHints? usageHints = null,
        IServiceProvider? services = null,
        ScanAnalysisStore? analysisStore = null)
    {
        if (string.IsNullOrWhiteSpace(rootPath))
        {
            throw new ArgumentException("Root path is required", nameof(rootPath));
        }

        RootPath = Path.GetFullPath(rootPath);
        if (!Directory.Exists(RootPath))
@@ -24,6 +30,7 @@ public sealed class LanguageAnalyzerContext
        UsageHints = usageHints ?? LanguageUsageHints.Empty;
        Services = services;
        Secrets = CreateSecrets(services);
        AnalysisStore = analysisStore;
    }

    public string RootPath { get; }

@@ -36,6 +43,8 @@ public sealed class LanguageAnalyzerContext

    public LanguageAnalyzerSecrets Secrets { get; }

    public ScanAnalysisStore? AnalysisStore { get; }

    public bool TryGetService<T>([NotNullWhen(true)] out T? service) where T : class
    {
        if (Services is null)

@@ -0,0 +1,42 @@
using System.Collections.Generic;
using System.Collections.ObjectModel;

namespace StellaOps.Scanner.Core.Contracts;

public sealed class AnalyzerObservationPayload
{
    public AnalyzerObservationPayload(
        string analyzerId,
        string kind,
        string mediaType,
        ReadOnlyMemory<byte> content,
        IReadOnlyDictionary<string, string?>? metadata = null,
        string? view = null)
    {
        ArgumentException.ThrowIfNullOrWhiteSpace(analyzerId);
        ArgumentException.ThrowIfNullOrWhiteSpace(kind);
        ArgumentException.ThrowIfNullOrWhiteSpace(mediaType);

        AnalyzerId = analyzerId;
        Kind = kind;
        MediaType = mediaType;
        Content = content.ToArray();
        Metadata = metadata is null
            ? null
            : new ReadOnlyDictionary<string, string?>(
                new Dictionary<string, string?>(metadata, StringComparer.OrdinalIgnoreCase));
        View = view;
    }

    public string AnalyzerId { get; }

    public string Kind { get; }

    public string MediaType { get; }

    public ReadOnlyMemory<byte> Content { get; }

    public IReadOnlyDictionary<string, string?>? Metadata { get; }

    public string? View { get; }
}

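// Hypothetical construction sketch (not part of the commit): how a language analyzer might
// package an observation document. The kind/media-type/metadata values mirror the Deno
// tests later in this diff; AnalyzerObservationPayloadExample is an invented name.
internal static class AnalyzerObservationPayloadExample
{
    internal static AnalyzerObservationPayload Create()
    {
        var json = System.Text.Encoding.UTF8.GetBytes("{\"entrypoints\":[\"mod.ts\"]}");
        return new AnalyzerObservationPayload(
            analyzerId: "deno",
            kind: "deno.observation",
            mediaType: "application/json",
            content: json,
            metadata: new Dictionary<string, string?> { ["deno.observation.hash"] = "sha256:<digest>" });
    }
}
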
@@ -19,4 +19,6 @@ public static class ScanAnalysisKeys
    public const string SurfaceManifest = "analysis.surface.manifest";

    public const string RegistryCredentials = "analysis.registry.credentials";

    public const string DenoObservationPayload = "analysis.lang.deno.observation";
}

@@ -12,6 +12,7 @@ public enum ArtifactDocumentType
    SurfaceManifest,
    SurfaceEntryTrace,
    SurfaceLayerFragment,
    SurfaceObservation
}

public enum ArtifactDocumentFormat
@@ -25,6 +26,7 @@ public enum ArtifactDocumentFormat
    EntryTraceNdjson,
    EntryTraceGraphJson,
    ComponentFragmentJson,
    ObservationJson
}

[BsonIgnoreExtraElements]

@@ -0,0 +1,76 @@
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal;
using StellaOps.Scanner.Analyzers.Lang.Deno.Tests.TestFixtures;
using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities;

namespace StellaOps.Scanner.Analyzers.Lang.Deno.Tests.Bundles;

public sealed class BundleInspectorTests
{
    [Fact]
    public void EszipInspectorExtractsManifest()
    {
        var temp = TestPaths.CreateTemporaryDirectory();
        try
        {
            var eszipPath = BundleFixtureBuilder.CreateSampleEszip(temp);
            var result = DenoBundleInspector.TryInspect(eszipPath, CancellationToken.None);

            Assert.NotNull(result);
            Assert.Equal("eszip", result!.BundleType);
            Assert.Equal("mod.ts", result.Entrypoint);
            Assert.Equal(2, result.Modules.Length);
            Assert.Contains(result.Modules, module => module.Specifier == "file:///mod.ts");
            Assert.Single(result.Resources);
        }
        finally
        {
            TestPaths.SafeDelete(temp);
        }
    }

    [Fact]
    public void DenoCompileInspectorExtractsEmbeddedEszip()
    {
        var temp = TestPaths.CreateTemporaryDirectory();
        try
        {
            var binaryPath = BundleFixtureBuilder.CreateSampleCompiledBinary(temp);
            var result = DenoCompileInspector.TryInspect(binaryPath, CancellationToken.None);

            Assert.NotNull(result);
            Assert.Equal("deno-compile", result!.BundleType);
            Assert.Equal("mod.ts", result.Entrypoint);
            Assert.Equal(2, result.Modules.Length);
        }
        finally
        {
            TestPaths.SafeDelete(temp);
        }
    }

    [Fact]
    public void BundleScannerProducesObservations()
    {
        var temp = TestPaths.CreateTemporaryDirectory();
        try
        {
            var eszip = BundleFixtureBuilder.CreateSampleEszip(temp);
            var binary = BundleFixtureBuilder.CreateSampleCompiledBinary(temp);
            Assert.True(File.Exists(eszip));
            Assert.True(File.Exists(binary));

            var scan = DenoBundleScanner.Scan(temp, CancellationToken.None);
            var observations = DenoBundleScanner.ToObservations(scan);

            Assert.Equal(1, scan.EszipBundles.Length);
            Assert.Equal(1, scan.CompiledBundles.Length);
            Assert.Equal(2, observations.Length);
            Assert.Contains(observations, obs => obs.BundleType == "eszip");
            Assert.Contains(observations, obs => obs.BundleType == "deno-compile");
        }
        finally
        {
            TestPaths.SafeDelete(temp);
        }
    }
}

@@ -0,0 +1,34 @@
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal;
using StellaOps.Scanner.Analyzers.Lang.Deno.Tests.TestFixtures;
using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities;

namespace StellaOps.Scanner.Analyzers.Lang.Deno.Tests.Containers;

public sealed class ContainerAdapterTests
{
    [Fact]
    public async Task CollectInputsIncludesCacheVendorAndBundlesAsync()
    {
        var (root, envDir) = DenoWorkspaceTestFixture.Create();
        try
        {
            Environment.SetEnvironmentVariable("DENO_DIR", envDir);
            var context = new LanguageAnalyzerContext(root, TimeProvider.System);
            var workspace = await DenoWorkspaceNormalizer.NormalizeAsync(context, CancellationToken.None);
            var bundleScan = DenoBundleScanner.Scan(root, CancellationToken.None);
            var observations = DenoBundleScanner.ToObservations(bundleScan);

            var inputs = DenoContainerAdapter.CollectInputs(workspace, observations);

            Assert.NotEmpty(inputs);
            Assert.Contains(inputs, input => input.Kind == DenoContainerSourceKind.Cache);
            Assert.Contains(inputs, input => input.Kind == DenoContainerSourceKind.Vendor);
            Assert.Contains(inputs, input => input.Kind == DenoContainerSourceKind.Bundle);
        }
        finally
        {
            Environment.SetEnvironmentVariable("DENO_DIR", null);
            TestPaths.SafeDelete(root);
        }
    }
}

@@ -0,0 +1,38 @@
using System.Collections.Immutable;
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal;

namespace StellaOps.Scanner.Analyzers.Lang.Deno.Tests.Containers;

public sealed class ContainerEmitterTests
{
    [Fact]
    public void BuildRecordsProducesComponents()
    {
        var inputs = new[]
        {
            new DenoContainerInput(
                DenoContainerSourceKind.Cache,
                "cache-alias",
                "sha256:abc",
                new Dictionary<string, string?> { ["path"] = "/cache/path" },
                Bundle: null),
            new DenoContainerInput(
                DenoContainerSourceKind.Bundle,
                "/path/bundle.eszip",
                null,
                new Dictionary<string, string?>(),
                new DenoBundleObservation(
                    "/path/bundle.eszip",
                    "eszip",
                    "mod.ts",
                    ImmutableArray<DenoBundleModule>.Empty,
                    ImmutableArray<DenoBundleResource>.Empty))
        };

        var records = DenoContainerEmitter.BuildRecords("deno", inputs);

        Assert.Equal(2, records.Length);
        Assert.Contains(records, record => record.Metadata.ContainsKey("deno.container.bundle.entrypoint"));
        Assert.Contains(records, record => record.Metadata.ContainsKey("deno.container.layerDigest"));
    }
}

@@ -0,0 +1,121 @@
using System.Linq;
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal;
using StellaOps.Scanner.Analyzers.Lang.Deno.Tests.TestFixtures;
using StellaOps.Scanner.Analyzers.Lang.Tests.TestUtilities;

namespace StellaOps.Scanner.Analyzers.Lang.Deno.Tests.Deno;

public sealed class DenoWorkspaceNormalizerTests
{
    [Fact]
    public async Task WorkspaceFixtureProducesDeterministicOutputAsync()
    {
        var (root, envDenoDir) = DenoWorkspaceTestFixture.Create();
        var previousDenoDir = Environment.GetEnvironmentVariable("DENO_DIR");

        try
        {
            Environment.SetEnvironmentVariable("DENO_DIR", envDenoDir);

            var context = new LanguageAnalyzerContext(root, TimeProvider.System);
            var workspace = await DenoWorkspaceNormalizer.NormalizeAsync(context, CancellationToken.None);

            var config = Assert.Single(workspace.Configurations);
            Assert.EndsWith("deno.jsonc", config.AbsolutePath, StringComparison.OrdinalIgnoreCase);
            Assert.True(config.LockEnabled);
            Assert.NotNull(config.InlineImportMap);
            Assert.NotNull(config.ImportMapPath);
            Assert.NotNull(config.VendorDirectoryPath);

            Assert.Contains(workspace.ImportMaps, map => map.IsInline);
            Assert.Contains(
                workspace.ImportMaps,
                map => map.AbsolutePath is not null && map.AbsolutePath.EndsWith("import_map.json", StringComparison.OrdinalIgnoreCase));

            Assert.Contains(workspace.LockFiles, file => file.AbsolutePath.EndsWith("deno.lock", StringComparison.OrdinalIgnoreCase));
            Assert.Contains(workspace.Vendors, vendor => vendor.AbsolutePath.EndsWith(Path.Combine("vendor"), StringComparison.OrdinalIgnoreCase));
            Assert.Contains(workspace.CacheLocations, cache => cache.Kind == DenoCacheLocationKind.Env);
            Assert.Contains(
                workspace.CacheLocations,
                cache => cache.Kind == DenoCacheLocationKind.Workspace && cache.AbsolutePath.EndsWith(".deno", StringComparison.OrdinalIgnoreCase));

            Assert.Contains(workspace.FileSystem.Files, file => file.VirtualPath == "workspace://deno.jsonc");
            Assert.Contains(workspace.FileSystem.Files, file => file.VirtualPath.StartsWith("vendor://", StringComparison.Ordinal));
            Assert.Contains(
                workspace.FileSystem.Files,
                file => file.VirtualPath.StartsWith("deno-dir://", StringComparison.Ordinal) ||
                        file.VirtualPath.StartsWith("layer://", StringComparison.Ordinal));
        }
        finally
        {
            Environment.SetEnvironmentVariable("DENO_DIR", previousDenoDir);
            DenoWorkspaceTestFixture.Cleanup(root);
        }
    }

    [Fact]
    public async Task GraphResolverCapturesImportAndCacheEdgesAsync()
    {
        var (root, envDenoDir) = DenoWorkspaceTestFixture.Create();
        var previousDenoDir = Environment.GetEnvironmentVariable("DENO_DIR");

        try
        {
            Environment.SetEnvironmentVariable("DENO_DIR", envDenoDir);

            var context = new LanguageAnalyzerContext(root, TimeProvider.System);
            var workspace = await DenoWorkspaceNormalizer.NormalizeAsync(context, CancellationToken.None);
            var lockFile = workspace.LockFiles.Single(lf => string.Equals(lf.RelativePath, "deno.lock", StringComparison.OrdinalIgnoreCase));
            Assert.Contains(lockFile.RemoteEntries, entry => entry.Key.Contains("server.ts", StringComparison.Ordinal));

            var graph = DenoModuleGraphResolver.Resolve(workspace, CancellationToken.None);
            var compatibility = DenoNpmCompatibilityAdapter.Analyze(workspace, graph, CancellationToken.None);

            Assert.NotEmpty(graph.Nodes);
            Assert.NotEmpty(graph.Edges);

            var remoteNode = graph.Nodes.FirstOrDefault(
                node => node.Kind == DenoModuleKind.RemoteModule &&
                        node.Id == "remote::https://deno.land/std@0.207.0/http/server.ts");
            Assert.NotNull(remoteNode);
            Assert.Equal("sha256-deadbeef", remoteNode!.Integrity);

            Assert.Contains(
                graph.Edges,
                edge => edge.ImportKind == DenoImportKind.Cache &&
                        edge.Provenance.StartsWith("vendor-cache:", StringComparison.Ordinal) &&
                        edge.Specifier.Contains("https://deno.land/std@0.207.0/http/server.ts", StringComparison.Ordinal));

            Assert.Contains(
                graph.Edges,
                edge => edge.ImportKind == DenoImportKind.NpmBridge &&
                        edge.Specifier == "npm:dayjs@1" &&
                        edge.Resolution == "dayjs@1.11.12");

            Assert.Contains(
                graph.Edges,
                edge => edge.ImportKind == DenoImportKind.JsonAssertion &&
                        edge.Specifier == "data" &&
                        edge.Resolution?.EndsWith("data/data.json", StringComparison.OrdinalIgnoreCase) == true);

            Assert.Contains(
                graph.Edges,
                edge => edge.ImportKind == DenoImportKind.Redirect &&
                        edge.Specifier == "https://deno.land/std/http/server.ts");

            Assert.Contains(
                compatibility.BuiltinUsages,
                usage => usage.Specifier == "node:fs");

            var npmResolution = compatibility.NpmResolutions.First(res => res.Specifier == "npm:dayjs@1");
            Assert.True(npmResolution.ExistsOnDisk);
            Assert.Equal("deno", npmResolution.Condition);
            Assert.True(npmResolution.ResolvedPath?.EndsWith("deno.mod.ts", StringComparison.OrdinalIgnoreCase));
        }
        finally
        {
            Environment.SetEnvironmentVariable("DENO_DIR", previousDenoDir);
            DenoWorkspaceTestFixture.Cleanup(root);
        }
    }
}

@@ -0,0 +1,47 @@
using System.Threading;
using System.Threading.Tasks;
using StellaOps.Scanner.Analyzers.Lang;
using StellaOps.Scanner.Analyzers.Lang.Deno;
using StellaOps.Scanner.Analyzers.Lang.Deno.Tests.TestFixtures;
using StellaOps.Scanner.Core.Contracts;

namespace StellaOps.Scanner.Analyzers.Lang.Deno.Tests.Observations;

public sealed class DenoLanguageAnalyzerObservationTests
{
    [Fact]
    public async Task AnalyzerStoresObservationPayloadInAnalysisStoreAsync()
    {
        var (root, envDenoDir) = DenoWorkspaceTestFixture.Create();
        var previousDenoDir = Environment.GetEnvironmentVariable("DENO_DIR");

        try
        {
            Environment.SetEnvironmentVariable("DENO_DIR", envDenoDir);

            var store = new ScanAnalysisStore();
            var context = new LanguageAnalyzerContext(
                root,
                TimeProvider.System,
                usageHints: null,
                services: null,
                analysisStore: store);

            var analyzer = new DenoLanguageAnalyzer();
            var engine = new LanguageAnalyzerEngine(new[] { analyzer });
            await engine.AnalyzeAsync(context, CancellationToken.None);

            Assert.True(store.TryGet(ScanAnalysisKeys.DenoObservationPayload, out AnalyzerObservationPayload payload));
            Assert.Equal("deno.observation", payload.Kind);
            Assert.Equal("application/json", payload.MediaType);
            Assert.True(payload.Content.Length > 0);
            Assert.NotNull(payload.Metadata);
            Assert.True(payload.Metadata!.ContainsKey("deno.observation.hash"));
        }
        finally
        {
            Environment.SetEnvironmentVariable("DENO_DIR", previousDenoDir);
            DenoWorkspaceTestFixture.Cleanup(root);
        }
    }
}

@@ -0,0 +1,28 @@
using System.Collections.Immutable;
using System.Security.Cryptography;
using System.Text;
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal;
using StellaOps.Scanner.Analyzers.Lang.Deno.Internal.Observations;

namespace StellaOps.Scanner.Analyzers.Lang.Deno.Tests.Observations;

public sealed class ObservationSerializerTests
{
    [Fact]
    public void SerializeProducesDeterministicJson()
    {
        var document = new DenoObservationDocument(
            ImmutableArray.Create("mod.ts"),
            ImmutableArray.Create("https://example.com/deps.ts"),
            ImmutableArray<DenoCapabilityRecord>.Empty,
            ImmutableArray<DenoDynamicImportObservation>.Empty,
            ImmutableArray<DenoLiteralFetchObservation>.Empty,
            ImmutableArray.Create(new DenoObservationBundleSummary("bundle.eszip", "eszip", "mod.ts", 2, 1)));

        var json = DenoObservationSerializer.Serialize(document);
        var hash = DenoObservationSerializer.ComputeSha256(json);

        Assert.Equal("sha256:" + Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(json))).ToLowerInvariant(), hash);
        Assert.Contains("\"entrypoints\":[\"mod.ts\"]", json, StringComparison.Ordinal);
    }
}

@@ -0,0 +1,42 @@
|
||||
<?xml version='1.0' encoding='utf-8'?>
|
||||
<Project Sdk="Microsoft.NET.Sdk">
|
||||
<PropertyGroup>
|
||||
<TargetFramework>net10.0</TargetFramework>
|
||||
<LangVersion>preview</LangVersion>
|
||||
<ImplicitUsings>enable</ImplicitUsings>
|
||||
<Nullable>enable</Nullable>
|
||||
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
|
||||
<IsPackable>false</IsPackable>
|
||||
</PropertyGroup>
|
||||
|
||||
<ItemGroup>
|
||||
<PackageReference Remove="Microsoft.NET.Test.Sdk" />
|
||||
<PackageReference Remove="xunit" />
|
||||
<PackageReference Remove="xunit.runner.visualstudio" />
|
||||
<PackageReference Remove="Microsoft.AspNetCore.Mvc.Testing" />
|
||||
<PackageReference Remove="Mongo2Go" />
|
||||
<PackageReference Remove="coverlet.collector" />
|
||||
<PackageReference Remove="Microsoft.Extensions.TimeProvider.Testing" />
|
||||
<ProjectReference Remove="..\\StellaOps.Concelier.Testing\\StellaOps.Concelier.Testing.csproj" />
|
||||
<Compile Remove="$(MSBuildThisFileDirectory)..\\StellaOps.Concelier.Tests.Shared\\AssemblyInfo.cs" />
|
||||
<Compile Remove="$(MSBuildThisFileDirectory)..\\StellaOps.Concelier.Tests.Shared\\MongoFixtureCollection.cs" />
|
||||
<Using Remove="StellaOps.Concelier.Testing" />
|
||||
</ItemGroup>
|
||||
|
||||
<ItemGroup>
|
||||
<ProjectReference Include="..\\StellaOps.Scanner.Analyzers.Lang.Tests\\StellaOps.Scanner.Analyzers.Lang.Tests.csproj" />
|
||||
<ProjectReference Include="..\\..\\__Libraries\\StellaOps.Scanner.Analyzers.Lang\\StellaOps.Scanner.Analyzers.Lang.csproj" />
|
||||
<ProjectReference Include="..\\..\\__Libraries\\StellaOps.Scanner.Analyzers.Lang.Deno\\StellaOps.Scanner.Analyzers.Lang.Deno.csproj" />
|
||||
<ProjectReference Include="..\\..\\__Libraries\\StellaOps.Scanner.Core\\StellaOps.Scanner.Core.csproj" />
|
||||
</ItemGroup>
|
||||
|
||||
<ItemGroup>
|
||||
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.14.1" />
|
||||
<PackageReference Include="xunit.v3" Version="3.0.0" />
|
||||
<PackageReference Include="xunit.runner.visualstudio" Version="3.1.3" />
|
||||
</ItemGroup>
|
||||
|
||||
<ItemGroup>
|
||||
<Using Include="Xunit" />
|
||||
</ItemGroup>
|
||||
</Project>
Some files were not shown because too many files have changed in this diff.