feat: Initialize Zastava Webhook service with TLS and Authority authentication

- Added Program.cs to set up the web application with Serilog for logging, health check endpoints, and a placeholder admission endpoint.
- Configured Kestrel server to use TLS 1.3 and handle client certificates appropriately.
- Created StellaOps.Zastava.Webhook.csproj with necessary dependencies including Serilog and Polly.
- Documented tasks in TASKS.md for the Zastava Webhook project, outlining current work and exit criteria for each task.
Branch: master
Date: 2025-10-19 18:36:22 +03:00
Commit: d099a90f9b (parent 2062da7a8b)
966 changed files with 91038 additions and 1850 deletions
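
The Program.cs and Kestrel configuration referenced in the commit message are not among the file excerpts below, which cover the Node.js language analyzer. As a non-authoritative sketch, assuming the ASP.NET Core Web SDK's implicit usings and a Serilog console sink, and with the endpoint paths invented for illustration, the wiring described above could look roughly like this:

```csharp
// Illustrative bootstrap only; the actual Program.cs from this commit is not shown in this excerpt.
using System.Security.Authentication;
using Microsoft.AspNetCore.Server.Kestrel.Https;
using Serilog;

var builder = WebApplication.CreateBuilder(args);

// Structured logging through Serilog (sink choice is an assumption).
builder.Host.UseSerilog((context, loggerConfiguration) => loggerConfiguration.WriteTo.Console());

// Restrict Kestrel to TLS 1.3 and accept client certificates for Authority authentication.
builder.WebHost.ConfigureKestrel(kestrel =>
{
    kestrel.ConfigureHttpsDefaults(https =>
    {
        https.SslProtocols = SslProtocols.Tls13;
        https.ClientCertificateMode = ClientCertificateMode.AllowCertificate;
    });
});

builder.Services.AddHealthChecks();

var app = builder.Build();

app.MapHealthChecks("/healthz");

// Placeholder admission endpoint until the real admission review pipeline lands.
app.MapPost("/admission", () => Results.Ok(new { allowed = true }));

app.Run();
```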


@@ -0,0 +1,39 @@
# StellaOps.Scanner.Analyzers.Lang.Node — Agent Charter
## Role
Deliver the Node.js / npm / Yarn / PNPM analyzer plug-in that resolves workspace graphs, symlinks, and script metadata for Scanner Workers.
## Scope
- Deterministic filesystem walker for `node_modules`, PNPM store, Yarn Plug'n'Play, and workspace roots.
- Component identity normalization to `pkg:npm` with provenance evidence (manifest path, integrity hashes, lockfile references).
- Workspace + symlink attribution, script metadata (postinstall, lifecycle), and policy hints for risky scripts.
- Plug-in manifest authoring, DI bootstrap, and benchmark harness integration.
## Out of Scope
- OS package detection, native library linkage, or vulnerability joins.
- Language analyzers for other ecosystems (Python, Go, .NET, Rust).
- CLI/UI surfacing of analyzer diagnostics (handed to UI guild post-gate).
## Expectations
- Deterministic output across Yarn/NPM/PNPM variations; normalized casing and path separators.
- Performance targets: 10k-module fixture <1.8s, <220MB RSS on 4vCPU runner.
- Offline-first; no network dependency to resolve registries.
- Emit structured metrics + logs (`analyzer=node`) compatible with Scanner telemetry model.
- Update `TASKS.md`, `SPRINTS_LANG_IMPLEMENTATION_PLAN.md`, and corresponding fixtures as progress occurs.
## Dependencies
- Shared language analyzer core (`StellaOps.Scanner.Analyzers.Lang`).
- Worker dispatcher for plug-in discovery.
- EntryTrace usage hints (for script usage classification).
## Testing & Artifacts
- Determinism golden fixtures under `Fixtures/lang/node/`.
- Benchmark CSV + flamegraph stored in `bench/Scanner.Analyzers/`.
- Plug-in manifest + cosign workflow added to Offline Kit instructions once analyzer is production-ready.
## Telemetry & Policy Hints
- Metrics: `scanner_analyzer_node_scripts_total{script}` increments for each install lifecycle script discovered.
- Metadata keys:
- `policyHint.installLifecycle` lists lifecycle scripts (`preinstall;install;postinstall`) observed for a package.
- `script.<name>` stores the canonical command string for each lifecycle script.
- Evidence: lifecycle script entries emit `LanguageEvidenceKind.Metadata` pointing to `package.json#scripts.<name>` with SHA-256 hashes for determinism.
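
As an illustration only (no such helper ships in this commit), a policy-side consumer could split the hint back into individual script names; the metadata dictionary shape below is an assumption about how the key/value pairs are surfaced:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical consumer-side helper for the policyHint.installLifecycle metadata key.
internal static class InstallLifecycleHint
{
    public static IReadOnlyList<string> Read(IReadOnlyDictionary<string, string?> metadata)
    {
        if (!metadata.TryGetValue("policyHint.installLifecycle", out var hint) || string.IsNullOrWhiteSpace(hint))
        {
            return Array.Empty<string>();
        }

        // "preinstall;install;postinstall" -> ["preinstall", "install", "postinstall"]
        return hint.Split(';', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);
    }
}
```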


@@ -0,0 +1,9 @@
global using System;
global using System.Collections.Generic;
global using System.IO;
global using System.Linq;
global using System.Text.Json;
global using System.Threading;
global using System.Threading.Tasks;
global using StellaOps.Scanner.Analyzers.Lang;


@@ -0,0 +1,31 @@
using System.Collections.Generic;
using System.Diagnostics.Metrics;
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;
internal static class NodeAnalyzerMetrics
{
private static readonly Meter Meter = new("StellaOps.Scanner.Analyzers.Lang.Node", "1.0.0");
private static readonly Counter<long> LifecycleScriptsCounter = Meter.CreateCounter<long>(
"scanner_analyzer_node_scripts_total",
unit: "scripts",
description: "Counts Node.js install lifecycle scripts discovered by the language analyzer.");
public static void RecordLifecycleScript(string scriptName)
{
var normalized = Normalize(scriptName);
LifecycleScriptsCounter.Add(
1,
new KeyValuePair<string, object?>("script", normalized));
}
private static string Normalize(string? scriptName)
{
if (string.IsNullOrWhiteSpace(scriptName))
{
return "unknown";
}
return scriptName.Trim().ToLowerInvariant();
}
}


@@ -0,0 +1,37 @@
using System.Diagnostics.CodeAnalysis;
using System.Security.Cryptography;
using System.Text;
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;
internal sealed record NodeLifecycleScript
{
public NodeLifecycleScript(string name, string command)
{
ArgumentException.ThrowIfNullOrWhiteSpace(name);
ArgumentException.ThrowIfNullOrWhiteSpace(command);
Name = name.Trim();
Command = command.Trim();
Sha256 = ComputeSha256(Command);
}
public string Name { get; }
public string Command { get; }
public string Sha256 { get; }
[SuppressMessage("Security", "CA5350:Do Not Use Weak Cryptographic Algorithms", Justification = "SHA256 is required for deterministic evidence hashing.")]
private static string ComputeSha256(string value)
{
if (string.IsNullOrEmpty(value))
{
return string.Empty;
}
var bytes = Encoding.UTF8.GetBytes(value);
var hash = SHA256.HashData(bytes);
return Convert.ToHexString(hash).ToLowerInvariant();
}
}


@@ -0,0 +1,446 @@
using System.Text.Json;
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;
internal sealed class NodeLockData
{
private static readonly NodeLockData Empty = new(new Dictionary<string, NodeLockEntry>(StringComparer.Ordinal), new Dictionary<string, NodeLockEntry>(StringComparer.OrdinalIgnoreCase));
private readonly Dictionary<string, NodeLockEntry> _byPath;
private readonly Dictionary<string, NodeLockEntry> _byName;
private NodeLockData(Dictionary<string, NodeLockEntry> byPath, Dictionary<string, NodeLockEntry> byName)
{
_byPath = byPath;
_byName = byName;
}
public static ValueTask<NodeLockData> LoadAsync(string rootPath, CancellationToken cancellationToken)
{
var byPath = new Dictionary<string, NodeLockEntry>(StringComparer.Ordinal);
var byName = new Dictionary<string, NodeLockEntry>(StringComparer.OrdinalIgnoreCase);
LoadPackageLockJson(rootPath, byPath, byName, cancellationToken);
LoadYarnLock(rootPath, byName);
LoadPnpmLock(rootPath, byName);
if (byPath.Count == 0 && byName.Count == 0)
{
return ValueTask.FromResult(Empty);
}
return ValueTask.FromResult(new NodeLockData(byPath, byName));
}
public bool TryGet(string relativePath, string packageName, out NodeLockEntry? entry)
{
var normalizedPath = NormalizeLockPath(relativePath);
if (_byPath.TryGetValue(normalizedPath, out var byPathEntry))
{
entry = byPathEntry;
return true;
}
if (!string.IsNullOrEmpty(packageName) && _byName.TryGetValue(packageName, out var byNameEntry))
{
entry = byNameEntry;
return true;
}
entry = null;
return false;
}
private static NodeLockEntry? CreateEntry(JsonElement element)
{
string? version = null;
string? resolved = null;
string? integrity = null;
if (element.TryGetProperty("version", out var versionElement) && versionElement.ValueKind == JsonValueKind.String)
{
version = versionElement.GetString();
}
if (element.TryGetProperty("resolved", out var resolvedElement) && resolvedElement.ValueKind == JsonValueKind.String)
{
resolved = resolvedElement.GetString();
}
if (element.TryGetProperty("integrity", out var integrityElement) && integrityElement.ValueKind == JsonValueKind.String)
{
integrity = integrityElement.GetString();
}
if (version is null && resolved is null && integrity is null)
{
return null;
}
return new NodeLockEntry(version, resolved, integrity);
}
private static void TraverseLegacyDependencies(
string currentPath,
JsonElement dependenciesElement,
IDictionary<string, NodeLockEntry> byPath,
IDictionary<string, NodeLockEntry> byName)
{
foreach (var dependency in dependenciesElement.EnumerateObject())
{
var depValue = dependency.Value;
var path = $"{currentPath}/{dependency.Name}";
var entry = CreateEntry(depValue);
if (entry is not null)
{
var normalizedPath = NormalizeLockPath(path);
byPath[normalizedPath] = entry;
byName[dependency.Name] = entry;
}
if (depValue.TryGetProperty("dependencies", out var childDependencies) && childDependencies.ValueKind == JsonValueKind.Object)
{
TraverseLegacyDependencies(path + "/node_modules", childDependencies, byPath, byName);
}
}
}
private static void LoadPackageLockJson(string rootPath, IDictionary<string, NodeLockEntry> byPath, IDictionary<string, NodeLockEntry> byName, CancellationToken cancellationToken)
{
var packageLockPath = Path.Combine(rootPath, "package-lock.json");
if (!File.Exists(packageLockPath))
{
return;
}
try
{
using var stream = File.OpenRead(packageLockPath);
using var document = JsonDocument.Parse(stream);
cancellationToken.ThrowIfCancellationRequested();
var root = document.RootElement;
if (root.TryGetProperty("packages", out var packagesElement) && packagesElement.ValueKind == JsonValueKind.Object)
{
foreach (var packageProperty in packagesElement.EnumerateObject())
{
var entry = CreateEntry(packageProperty.Value);
if (entry is null)
{
continue;
}
var key = NormalizeLockPath(packageProperty.Name);
byPath[key] = entry;
var name = ExtractNameFromPath(key);
if (!string.IsNullOrEmpty(name))
{
byName[name] = entry;
}
if (packageProperty.Value.TryGetProperty("name", out var explicitNameElement) && explicitNameElement.ValueKind == JsonValueKind.String)
{
var explicitName = explicitNameElement.GetString();
if (!string.IsNullOrWhiteSpace(explicitName))
{
byName[explicitName] = entry;
}
}
}
}
else if (root.TryGetProperty("dependencies", out var dependenciesElement) && dependenciesElement.ValueKind == JsonValueKind.Object)
{
TraverseLegacyDependencies("node_modules", dependenciesElement, byPath, byName);
}
}
catch (IOException)
{
// Ignore unreadable package-lock.
}
catch (JsonException)
{
// Ignore malformed package-lock.
}
}
private static void LoadYarnLock(string rootPath, IDictionary<string, NodeLockEntry> byName)
{
var yarnLockPath = Path.Combine(rootPath, "yarn.lock");
if (!File.Exists(yarnLockPath))
{
return;
}
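// Classic (v1) yarn.lock entries look like:
//   "@babel/core@^7.0.0":
//     version "7.0.0"
//     resolved "https://registry.yarnpkg.com/..."
//     integrity sha512-...
// Keys start at column zero and end with ':'; the indented fields carry quoted values.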
try
{
var lines = File.ReadAllLines(yarnLockPath);
string? currentName = null;
string? version = null;
string? resolved = null;
string? integrity = null;
void Flush()
{
if (string.IsNullOrWhiteSpace(currentName))
{
version = null;
resolved = null;
integrity = null;
return;
}
var simpleName = ExtractPackageNameFromYarnKey(currentName!);
if (string.IsNullOrEmpty(simpleName))
{
version = null;
resolved = null;
integrity = null;
return;
}
var entry = new NodeLockEntry(version, resolved, integrity);
byName[simpleName] = entry;
version = null;
resolved = null;
integrity = null;
}
foreach (var line in lines)
{
var trimmed = line.Trim();
if (string.IsNullOrEmpty(trimmed))
{
Flush();
currentName = null;
continue;
}
if (!char.IsWhiteSpace(line, 0) && trimmed.EndsWith(':'))
{
Flush();
currentName = trimmed.TrimEnd(':').Trim('"');
continue;
}
if (trimmed.StartsWith("version", StringComparison.OrdinalIgnoreCase))
{
version = ExtractQuotedValue(trimmed);
}
else if (trimmed.StartsWith("resolved", StringComparison.OrdinalIgnoreCase))
{
resolved = ExtractQuotedValue(trimmed);
}
else if (trimmed.StartsWith("integrity", StringComparison.OrdinalIgnoreCase))
{
integrity = ExtractQuotedValue(trimmed);
}
}
Flush();
}
catch (IOException)
{
// Ignore unreadable yarn.lock
}
}
private static void LoadPnpmLock(string rootPath, IDictionary<string, NodeLockEntry> byName)
{
var pnpmLockPath = Path.Combine(rootPath, "pnpm-lock.yaml");
if (!File.Exists(pnpmLockPath))
{
return;
}
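// pnpm-lock.yaml (lockfileVersion 5.x) entries under "packages:" look like:
//   /lodash/4.17.21:
//     resolution: {integrity: sha512-...}
// Keys are indented two spaces and start with '/'; integrity, tarball, and version
// fields are read from the lines that follow each key.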
try
{
using var reader = new StreamReader(pnpmLockPath);
string? currentPackage = null;
string? version = null;
string? resolved = null;
string? integrity = null;
var inPackages = false;
while (reader.ReadLine() is { } line)
{
if (string.IsNullOrWhiteSpace(line))
{
continue;
}
if (!inPackages)
{
if (line.StartsWith("packages:", StringComparison.Ordinal))
{
inPackages = true;
}
continue;
}
if (line.StartsWith(" /", StringComparison.Ordinal))
{
if (!string.IsNullOrEmpty(currentPackage) && !string.IsNullOrEmpty(integrity))
{
var name = ExtractNameFromPnpmKey(currentPackage);
if (!string.IsNullOrEmpty(name))
{
byName[name] = new NodeLockEntry(version, resolved, integrity);
}
}
currentPackage = line.Trim().TrimEnd(':').TrimStart('/');
version = null;
resolved = null;
integrity = null;
continue;
}
if (string.IsNullOrEmpty(currentPackage))
{
continue;
}
var trimmed = line.Trim();
if (trimmed.StartsWith("resolution:", StringComparison.Ordinal))
{
var integrityIndex = trimmed.IndexOf("integrity", StringComparison.OrdinalIgnoreCase);
if (integrityIndex >= 0)
{
var integrityValue = trimmed[(integrityIndex + 9)..].Trim(' ', ':', '{', '}', '"');
integrity = integrityValue;
}
var tarballIndex = trimmed.IndexOf("tarball", StringComparison.OrdinalIgnoreCase);
if (tarballIndex >= 0)
{
var tarballValue = trimmed[(tarballIndex + 7)..].Trim(' ', ':', '{', '}', '"');
resolved = tarballValue;
}
}
else if (trimmed.StartsWith("integrity:", StringComparison.Ordinal))
{
integrity = trimmed[("integrity:".Length)..].Trim();
}
else if (trimmed.StartsWith("tarball:", StringComparison.Ordinal))
{
resolved = trimmed[("tarball:".Length)..].Trim();
}
else if (trimmed.StartsWith("version:", StringComparison.Ordinal))
{
version = trimmed[("version:".Length)..].Trim();
}
}
if (!string.IsNullOrEmpty(currentPackage) && !string.IsNullOrEmpty(integrity))
{
var name = ExtractNameFromPnpmKey(currentPackage);
if (!string.IsNullOrEmpty(name))
{
byName[name] = new NodeLockEntry(version, resolved, integrity);
}
}
}
catch (IOException)
{
// Ignore unreadable pnpm lock file.
}
}
private static string? ExtractQuotedValue(string line)
{
var quoteStart = line.IndexOf('"');
if (quoteStart < 0)
{
return null;
}
var quoteEnd = line.LastIndexOf('"');
if (quoteEnd <= quoteStart)
{
return null;
}
return line.Substring(quoteStart + 1, quoteEnd - quoteStart - 1);
}
private static string ExtractPackageNameFromYarnKey(string key)
{
var commaIndex = key.IndexOf(',');
var trimmed = commaIndex > 0 ? key[..commaIndex] : key;
trimmed = trimmed.Trim('"');
var atIndex = trimmed.IndexOf('@', 1);
if (atIndex > 0)
{
return trimmed[..atIndex];
}
return trimmed;
}
private static string ExtractNameFromPnpmKey(string key)
{
var parts = key.Split('/', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);
if (parts.Length == 0)
{
return string.Empty;
}
if (parts[0].StartsWith('@'))
{
return parts.Length >= 2 ? $"{parts[0]}/{parts[1]}" : parts[0];
}
return parts[0];
}
private static string NormalizeLockPath(string path)
{
if (string.IsNullOrWhiteSpace(path))
{
return string.Empty;
}
var normalized = path.Replace('\\', '/');
normalized = normalized.TrimStart('.', '/');
return normalized;
}
private static string ExtractNameFromPath(string normalizedPath)
{
if (string.IsNullOrEmpty(normalizedPath))
{
return string.Empty;
}
var segments = normalizedPath.Split('/', StringSplitOptions.RemoveEmptyEntries | StringSplitOptions.TrimEntries);
if (segments.Length == 0)
{
return string.Empty;
}
// Use the deepest node_modules segment so nested entries such as
// "node_modules/a/node_modules/b" resolve to "b" rather than "a".
var nodeModulesIndex = Array.LastIndexOf(segments, "node_modules");
if (nodeModulesIndex >= 0 && nodeModulesIndex + 1 < segments.Length)
{
if (segments[nodeModulesIndex + 1].StartsWith('@') && nodeModulesIndex + 2 < segments.Length)
{
return $"{segments[nodeModulesIndex + 1]}/{segments[nodeModulesIndex + 2]}";
}
return segments[nodeModulesIndex + 1];
}
var last = segments[^1];
if (segments.Length >= 2 && segments[^2].StartsWith('@'))
{
return $"{segments[^2]}/{last}";
}
return last;
}
}


@@ -0,0 +1,3 @@
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;
internal sealed record NodeLockEntry(string? Version, string? Resolved, string? Integrity);


@@ -0,0 +1,179 @@
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;
internal sealed class NodePackage
{
public NodePackage(
string name,
string version,
string relativePath,
string packageJsonLocator,
bool? isPrivate,
NodeLockEntry? lockEntry,
bool isWorkspaceMember,
string? workspaceRoot,
IReadOnlyList<string> workspaceTargets,
string? workspaceLink,
IReadOnlyList<NodeLifecycleScript> lifecycleScripts,
bool usedByEntrypoint)
{
Name = name;
Version = version;
RelativePath = relativePath;
PackageJsonLocator = packageJsonLocator;
IsPrivate = isPrivate;
LockEntry = lockEntry;
IsWorkspaceMember = isWorkspaceMember;
WorkspaceRoot = workspaceRoot;
WorkspaceTargets = workspaceTargets;
WorkspaceLink = workspaceLink;
LifecycleScripts = lifecycleScripts ?? Array.Empty<NodeLifecycleScript>();
IsUsedByEntrypoint = usedByEntrypoint;
}
public string Name { get; }
public string Version { get; }
public string RelativePath { get; }
public string PackageJsonLocator { get; }
public bool? IsPrivate { get; }
public NodeLockEntry? LockEntry { get; }
public bool IsWorkspaceMember { get; }
public string? WorkspaceRoot { get; }
public IReadOnlyList<string> WorkspaceTargets { get; }
public string? WorkspaceLink { get; }
public IReadOnlyList<NodeLifecycleScript> LifecycleScripts { get; }
public bool HasInstallScripts => LifecycleScripts.Count > 0;
public bool IsUsedByEntrypoint { get; }
public string RelativePathNormalized => string.IsNullOrEmpty(RelativePath) ? string.Empty : RelativePath.Replace(Path.DirectorySeparatorChar, '/');
public string ComponentKey => $"purl::{Purl}";
public string Purl => BuildPurl(Name, Version);
public IReadOnlyCollection<LanguageComponentEvidence> CreateEvidence()
{
var evidence = new List<LanguageComponentEvidence>
{
new LanguageComponentEvidence(LanguageEvidenceKind.File, "package.json", PackageJsonLocator, Value: null, Sha256: null)
};
foreach (var script in LifecycleScripts)
{
var locator = string.IsNullOrEmpty(PackageJsonLocator)
? $"package.json#scripts.{script.Name}"
: $"{PackageJsonLocator}#scripts.{script.Name}";
evidence.Add(new LanguageComponentEvidence(
LanguageEvidenceKind.Metadata,
"package.json:scripts",
locator,
script.Command,
script.Sha256));
}
return evidence;
}
public IReadOnlyCollection<KeyValuePair<string, string?>> CreateMetadata()
{
var entries = new List<KeyValuePair<string, string?>>(8)
{
new("path", string.IsNullOrEmpty(RelativePathNormalized) ? "." : RelativePathNormalized)
};
if (IsPrivate is bool isPrivate)
{
entries.Add(new KeyValuePair<string, string?>("private", isPrivate ? "true" : "false"));
}
if (LockEntry is not null)
{
if (!string.IsNullOrWhiteSpace(LockEntry.Resolved))
{
entries.Add(new KeyValuePair<string, string?>("resolved", LockEntry.Resolved));
}
if (!string.IsNullOrWhiteSpace(LockEntry.Integrity))
{
entries.Add(new KeyValuePair<string, string?>("integrity", LockEntry.Integrity));
}
}
if (IsWorkspaceMember)
{
entries.Add(new KeyValuePair<string, string?>("workspaceMember", "true"));
if (!string.IsNullOrWhiteSpace(WorkspaceRoot))
{
entries.Add(new KeyValuePair<string, string?>("workspaceRoot", WorkspaceRoot));
}
}
if (!string.IsNullOrWhiteSpace(WorkspaceLink))
{
entries.Add(new KeyValuePair<string, string?>("workspaceLink", WorkspaceLink));
}
if (WorkspaceTargets.Count > 0)
{
entries.Add(new KeyValuePair<string, string?>("workspaceTargets", string.Join(';', WorkspaceTargets)));
}
if (HasInstallScripts)
{
entries.Add(new KeyValuePair<string, string?>("installScripts", "true"));
var lifecycleNames = LifecycleScripts
.Select(static script => script.Name)
.Distinct(StringComparer.OrdinalIgnoreCase)
.OrderBy(static name => name, StringComparer.OrdinalIgnoreCase)
.ToArray();
if (lifecycleNames.Length > 0)
{
entries.Add(new KeyValuePair<string, string?>("policyHint.installLifecycle", string.Join(';', lifecycleNames)));
}
foreach (var script in LifecycleScripts.OrderBy(static script => script.Name, StringComparer.OrdinalIgnoreCase))
{
entries.Add(new KeyValuePair<string, string?>($"script.{script.Name}", script.Command));
}
}
return entries
.OrderBy(static pair => pair.Key, StringComparer.Ordinal)
.ToArray();
}
private static string BuildPurl(string name, string version)
{
var normalizedName = NormalizeName(name);
return $"pkg:npm/{normalizedName}@{version}";
}
private static string NormalizeName(string name)
{
if (string.IsNullOrWhiteSpace(name))
{
return name;
}
if (name[0] == '@')
{
var scopeAndName = name[1..];
return $"%40{scopeAndName}";
}
return name;
}
}


@@ -0,0 +1,378 @@
using System.Text.Json;
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;
internal static class NodePackageCollector
{
private static readonly string[] IgnoredDirectories =
{
".bin",
".cache",
".store",
"__pycache__"
};
public static IReadOnlyList<NodePackage> CollectPackages(LanguageAnalyzerContext context, NodeLockData lockData, CancellationToken cancellationToken)
{
var packages = new List<NodePackage>();
var visited = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
var pendingNodeModuleRoots = new List<string>();
var rootPackageJson = Path.Combine(context.RootPath, "package.json");
var workspaceIndex = NodeWorkspaceIndex.Create(context.RootPath);
if (File.Exists(rootPackageJson))
{
var rootPackage = TryCreatePackage(context, rootPackageJson, string.Empty, lockData, workspaceIndex, cancellationToken);
if (rootPackage is not null)
{
packages.Add(rootPackage);
visited.Add(rootPackage.RelativePathNormalized);
}
}
foreach (var workspaceRelative in workspaceIndex.GetMembers())
{
var workspaceAbsolute = Path.Combine(context.RootPath, workspaceRelative.Replace('/', Path.DirectorySeparatorChar));
if (!Directory.Exists(workspaceAbsolute))
{
continue;
}
ProcessPackageDirectory(context, workspaceAbsolute, lockData, workspaceIndex, includeNestedNodeModules: false, packages, visited, cancellationToken);
var workspaceNodeModules = Path.Combine(workspaceAbsolute, "node_modules");
if (Directory.Exists(workspaceNodeModules))
{
pendingNodeModuleRoots.Add(workspaceNodeModules);
}
}
var nodeModules = Path.Combine(context.RootPath, "node_modules");
TraverseDirectory(context, nodeModules, lockData, workspaceIndex, packages, visited, cancellationToken);
foreach (var pendingRoot in pendingNodeModuleRoots.OrderBy(static path => path, StringComparer.Ordinal))
{
TraverseDirectory(context, pendingRoot, lockData, workspaceIndex, packages, visited, cancellationToken);
}
return packages;
}
private static void TraverseDirectory(
LanguageAnalyzerContext context,
string directory,
NodeLockData lockData,
NodeWorkspaceIndex workspaceIndex,
List<NodePackage> packages,
HashSet<string> visited,
CancellationToken cancellationToken)
{
if (!Directory.Exists(directory))
{
return;
}
foreach (var child in Directory.EnumerateDirectories(directory))
{
cancellationToken.ThrowIfCancellationRequested();
var name = Path.GetFileName(child);
if (string.IsNullOrEmpty(name))
{
continue;
}
if (ShouldSkipDirectory(name))
{
continue;
}
if (string.Equals(name, ".pnpm", StringComparison.OrdinalIgnoreCase))
{
TraversePnpmStore(context, child, lockData, workspaceIndex, packages, visited, cancellationToken);
continue;
}
if (name.StartsWith('@'))
{
foreach (var scoped in Directory.EnumerateDirectories(child))
{
ProcessPackageDirectory(context, scoped, lockData, workspaceIndex, includeNestedNodeModules: true, packages, visited, cancellationToken);
}
continue;
}
ProcessPackageDirectory(context, child, lockData, workspaceIndex, includeNestedNodeModules: true, packages, visited, cancellationToken);
}
}
private static void TraversePnpmStore(
LanguageAnalyzerContext context,
string pnpmDirectory,
NodeLockData lockData,
NodeWorkspaceIndex workspaceIndex,
List<NodePackage> packages,
HashSet<string> visited,
CancellationToken cancellationToken)
{
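// pnpm virtual store layout: node_modules/.pnpm/<name>@<version>/node_modules/<name>.
// Each store entry nests a regular node_modules directory, so it is walked with the
// same traversal used for ordinary node_modules trees.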
foreach (var storeEntry in Directory.EnumerateDirectories(pnpmDirectory))
{
cancellationToken.ThrowIfCancellationRequested();
var nestedNodeModules = Path.Combine(storeEntry, "node_modules");
if (Directory.Exists(nestedNodeModules))
{
TraverseDirectory(context, nestedNodeModules, lockData, workspaceIndex, packages, visited, cancellationToken);
}
}
}
private static void ProcessPackageDirectory(
LanguageAnalyzerContext context,
string directory,
NodeLockData lockData,
NodeWorkspaceIndex workspaceIndex,
bool includeNestedNodeModules,
List<NodePackage> packages,
HashSet<string> visited,
CancellationToken cancellationToken)
{
var packageJsonPath = Path.Combine(directory, "package.json");
var relativeDirectory = NormalizeRelativeDirectory(context, directory);
if (!visited.Add(relativeDirectory))
{
// Already processed this path.
if (includeNestedNodeModules)
{
TraverseNestedNodeModules(context, directory, lockData, workspaceIndex, packages, visited, cancellationToken);
}
return;
}
if (File.Exists(packageJsonPath))
{
var package = TryCreatePackage(context, packageJsonPath, relativeDirectory, lockData, workspaceIndex, cancellationToken);
if (package is not null)
{
packages.Add(package);
}
}
if (includeNestedNodeModules)
{
TraverseNestedNodeModules(context, directory, lockData, workspaceIndex, packages, visited, cancellationToken);
}
}
private static void TraverseNestedNodeModules(
LanguageAnalyzerContext context,
string directory,
NodeLockData lockData,
NodeWorkspaceIndex workspaceIndex,
List<NodePackage> packages,
HashSet<string> visited,
CancellationToken cancellationToken)
{
var nestedNodeModules = Path.Combine(directory, "node_modules");
TraverseDirectory(context, nestedNodeModules, lockData, workspaceIndex, packages, visited, cancellationToken);
}
private static NodePackage? TryCreatePackage(
LanguageAnalyzerContext context,
string packageJsonPath,
string relativeDirectory,
NodeLockData lockData,
NodeWorkspaceIndex workspaceIndex,
CancellationToken cancellationToken)
{
try
{
using var stream = File.OpenRead(packageJsonPath);
using var document = JsonDocument.Parse(stream);
var root = document.RootElement;
if (!root.TryGetProperty("name", out var nameElement))
{
return null;
}
var name = nameElement.GetString();
if (string.IsNullOrWhiteSpace(name))
{
return null;
}
if (!root.TryGetProperty("version", out var versionElement))
{
return null;
}
var version = versionElement.GetString();
if (string.IsNullOrWhiteSpace(version))
{
return null;
}
bool? isPrivate = null;
if (root.TryGetProperty("private", out var privateElement) && privateElement.ValueKind is JsonValueKind.True or JsonValueKind.False)
{
isPrivate = privateElement.GetBoolean();
}
var lockEntry = lockData.TryGet(relativeDirectory, name, out var entry) ? entry : null;
var locator = BuildLocator(relativeDirectory);
var usedByEntrypoint = context.UsageHints.IsPathUsed(packageJsonPath);
var isWorkspaceMember = workspaceIndex.TryGetMember(relativeDirectory, out var workspaceRoot);
var workspaceTargets = ExtractWorkspaceTargets(relativeDirectory, root, workspaceIndex);
// For a non-workspace copy under node_modules, link back to the workspace member with the same package name.
var workspaceLink = !isWorkspaceMember && workspaceIndex.TryGetWorkspacePathByName(name, out var workspacePathByName)
? workspacePathByName
: null;
var lifecycleScripts = ExtractLifecycleScripts(root);
return new NodePackage(
name: name.Trim(),
version: version.Trim(),
relativePath: relativeDirectory,
packageJsonLocator: locator,
isPrivate: isPrivate,
lockEntry: lockEntry,
isWorkspaceMember: isWorkspaceMember,
workspaceRoot: workspaceRoot,
workspaceTargets: workspaceTargets,
workspaceLink: workspaceLink,
lifecycleScripts: lifecycleScripts,
usedByEntrypoint: usedByEntrypoint);
}
catch (IOException)
{
return null;
}
catch (JsonException)
{
return null;
}
}
private static string NormalizeRelativeDirectory(LanguageAnalyzerContext context, string directory)
{
var relative = context.GetRelativePath(directory);
if (string.IsNullOrEmpty(relative) || relative == ".")
{
return string.Empty;
}
return relative.Replace(Path.DirectorySeparatorChar, '/');
}
private static string BuildLocator(string relativeDirectory)
{
if (string.IsNullOrEmpty(relativeDirectory))
{
return "package.json";
}
return relativeDirectory + "/package.json";
}
private static bool ShouldSkipDirectory(string name)
{
if (name.Length == 0)
{
return true;
}
if (name[0] == '.')
{
return !string.Equals(name, ".pnpm", StringComparison.OrdinalIgnoreCase);
}
return IgnoredDirectories.Any(ignored => string.Equals(name, ignored, StringComparison.OrdinalIgnoreCase));
}
private static IReadOnlyList<string> ExtractWorkspaceTargets(string relativeDirectory, JsonElement root, NodeWorkspaceIndex workspaceIndex)
{
var dependencies = workspaceIndex.ResolveWorkspaceTargets(relativeDirectory, TryGetProperty(root, "dependencies"));
var devDependencies = workspaceIndex.ResolveWorkspaceTargets(relativeDirectory, TryGetProperty(root, "devDependencies"));
var peerDependencies = workspaceIndex.ResolveWorkspaceTargets(relativeDirectory, TryGetProperty(root, "peerDependencies"));
if (dependencies.Count == 0 && devDependencies.Count == 0 && peerDependencies.Count == 0)
{
return Array.Empty<string>();
}
var combined = new HashSet<string>(StringComparer.Ordinal);
foreach (var item in dependencies)
{
combined.Add(item);
}
foreach (var item in devDependencies)
{
combined.Add(item);
}
foreach (var item in peerDependencies)
{
combined.Add(item);
}
return combined.OrderBy(static x => x, StringComparer.Ordinal).ToArray();
}
private static JsonElement? TryGetProperty(JsonElement element, string propertyName)
=> element.TryGetProperty(propertyName, out var property) ? property : null;
private static IReadOnlyList<NodeLifecycleScript> ExtractLifecycleScripts(JsonElement root)
{
if (!root.TryGetProperty("scripts", out var scriptsElement) || scriptsElement.ValueKind != JsonValueKind.Object)
{
return Array.Empty<NodeLifecycleScript>();
}
var lifecycleScripts = new Dictionary<string, NodeLifecycleScript>(StringComparer.OrdinalIgnoreCase);
foreach (var script in scriptsElement.EnumerateObject())
{
if (!IsLifecycleScriptName(script.Name))
{
continue;
}
if (script.Value.ValueKind != JsonValueKind.String)
{
continue;
}
var command = script.Value.GetString();
if (string.IsNullOrWhiteSpace(command))
{
continue;
}
var canonicalName = script.Name.Trim().ToLowerInvariant();
var lifecycleScript = new NodeLifecycleScript(canonicalName, command);
if (!lifecycleScripts.ContainsKey(canonicalName))
{
NodeAnalyzerMetrics.RecordLifecycleScript(canonicalName);
}
lifecycleScripts[canonicalName] = lifecycleScript;
}
if (lifecycleScripts.Count == 0)
{
return Array.Empty<NodeLifecycleScript>();
}
return lifecycleScripts.Values
.OrderBy(static script => script.Name, StringComparer.Ordinal)
.ToArray();
}
private static bool IsLifecycleScriptName(string name)
=> name.Equals("preinstall", StringComparison.OrdinalIgnoreCase)
|| name.Equals("install", StringComparison.OrdinalIgnoreCase)
|| name.Equals("postinstall", StringComparison.OrdinalIgnoreCase);
}


@@ -0,0 +1,278 @@
using System.Text.Json;
namespace StellaOps.Scanner.Analyzers.Lang.Node.Internal;
internal sealed class NodeWorkspaceIndex
{
private readonly string _rootPath;
private readonly HashSet<string> _workspacePaths;
private readonly Dictionary<string, string> _workspaceByName;
private NodeWorkspaceIndex(string rootPath, HashSet<string> workspacePaths, Dictionary<string, string> workspaceByName)
{
_rootPath = rootPath;
_workspacePaths = workspacePaths;
_workspaceByName = workspaceByName;
}
public static NodeWorkspaceIndex Create(string rootPath)
{
var normalizedRoot = Path.GetFullPath(rootPath);
var workspacePaths = new HashSet<string>(StringComparer.Ordinal);
var workspaceByName = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
var packageJsonPath = Path.Combine(normalizedRoot, "package.json");
if (!File.Exists(packageJsonPath))
{
return new NodeWorkspaceIndex(normalizedRoot, workspacePaths, workspaceByName);
}
try
{
using var stream = File.OpenRead(packageJsonPath);
using var document = JsonDocument.Parse(stream);
var root = document.RootElement;
if (!root.TryGetProperty("workspaces", out var workspacesElement))
{
return new NodeWorkspaceIndex(normalizedRoot, workspacePaths, workspaceByName);
}
var patterns = ExtractPatterns(workspacesElement);
foreach (var pattern in patterns)
{
foreach (var workspacePath in ExpandPattern(normalizedRoot, pattern))
{
if (string.IsNullOrWhiteSpace(workspacePath))
{
continue;
}
workspacePaths.Add(workspacePath);
var packagePath = Path.Combine(normalizedRoot, workspacePath.Replace('/', Path.DirectorySeparatorChar), "package.json");
if (!File.Exists(packagePath))
{
continue;
}
try
{
using var workspaceStream = File.OpenRead(packagePath);
using var workspaceDoc = JsonDocument.Parse(workspaceStream);
if (workspaceDoc.RootElement.TryGetProperty("name", out var nameElement))
{
var name = nameElement.GetString();
if (!string.IsNullOrWhiteSpace(name))
{
workspaceByName[name] = workspacePath!;
}
}
}
catch (IOException)
{
// Ignore unreadable workspace package definitions.
}
catch (JsonException)
{
// Ignore malformed workspace package definitions.
}
}
}
}
catch (IOException)
{
// If the root package.json is unreadable we treat as no workspaces.
}
catch (JsonException)
{
// Malformed root package.json: treat as no workspaces.
}
return new NodeWorkspaceIndex(normalizedRoot, workspacePaths, workspaceByName);
}
public IEnumerable<string> GetMembers()
=> _workspacePaths.OrderBy(static path => path, StringComparer.Ordinal);
public bool TryGetMember(string relativePath, out string normalizedPath)
{
if (string.IsNullOrEmpty(relativePath))
{
normalizedPath = string.Empty;
return false;
}
var normalized = NormalizeRelative(relativePath);
if (_workspacePaths.Contains(normalized))
{
normalizedPath = normalized;
return true;
}
normalizedPath = string.Empty;
return false;
}
public bool TryGetWorkspacePathByName(string packageName, out string? relativePath)
=> _workspaceByName.TryGetValue(packageName, out relativePath);
public IReadOnlyList<string> ResolveWorkspaceTargets(string relativeDirectory, JsonElement? dependencies)
{
if (dependencies is null || dependencies.Value.ValueKind != JsonValueKind.Object)
{
return Array.Empty<string>();
}
var result = new HashSet<string>(StringComparer.Ordinal);
foreach (var property in dependencies.Value.EnumerateObject())
{
var value = property.Value;
if (value.ValueKind != JsonValueKind.String)
{
continue;
}
var targetSpec = value.GetString();
if (string.IsNullOrWhiteSpace(targetSpec))
{
continue;
}
const string workspacePrefix = "workspace:";
if (!targetSpec.StartsWith(workspacePrefix, StringComparison.OrdinalIgnoreCase))
{
continue;
}
var descriptor = targetSpec[workspacePrefix.Length..].Trim();
if (string.IsNullOrEmpty(descriptor) || descriptor is "*" or "^" or "~")
{
if (_workspaceByName.TryGetValue(property.Name, out var workspaceByName))
{
result.Add(workspaceByName);
}
continue;
}
if (TryResolveWorkspaceTarget(relativeDirectory, descriptor, out var resolved))
{
result.Add(resolved);
}
}
if (result.Count == 0)
{
return Array.Empty<string>();
}
return result.OrderBy(static x => x, StringComparer.Ordinal).ToArray();
}
public bool TryResolveWorkspaceTarget(string relativeDirectory, string descriptor, out string normalized)
{
normalized = string.Empty;
var baseDirectory = string.IsNullOrEmpty(relativeDirectory) ? string.Empty : relativeDirectory;
var baseAbsolute = Path.GetFullPath(Path.Combine(_rootPath, baseDirectory));
var candidate = Path.GetFullPath(Path.Combine(baseAbsolute, descriptor.Replace('/', Path.DirectorySeparatorChar)));
if (!IsUnderRoot(_rootPath, candidate))
{
return false;
}
var relative = NormalizeRelative(Path.GetRelativePath(_rootPath, candidate));
if (_workspacePaths.Contains(relative))
{
normalized = relative;
return true;
}
return false;
}
private static IEnumerable<string> ExtractPatterns(JsonElement workspacesElement)
{
if (workspacesElement.ValueKind == JsonValueKind.Array)
{
foreach (var item in workspacesElement.EnumerateArray())
{
if (item.ValueKind == JsonValueKind.String)
{
var value = item.GetString();
if (!string.IsNullOrWhiteSpace(value))
{
yield return value.Trim();
}
}
}
}
else if (workspacesElement.ValueKind == JsonValueKind.Object)
{
if (workspacesElement.TryGetProperty("packages", out var packagesElement) && packagesElement.ValueKind == JsonValueKind.Array)
{
foreach (var pattern in ExtractPatterns(packagesElement))
{
yield return pattern;
}
}
}
}
private static IEnumerable<string> ExpandPattern(string rootPath, string pattern)
{
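// Supports exact workspace paths and single-level "dir/*" patterns; deeper glob
// syntax such as "packages/**" is not expanded here.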
var cleanedPattern = pattern.Replace('\\', '/').Trim();
if (cleanedPattern.EndsWith("/*", StringComparison.Ordinal))
{
var baseSegment = cleanedPattern[..^2];
var baseAbsolute = CombineAndNormalize(rootPath, baseSegment);
if (baseAbsolute is null || !Directory.Exists(baseAbsolute))
{
yield break;
}
foreach (var directory in Directory.EnumerateDirectories(baseAbsolute))
{
var normalized = NormalizeRelative(Path.GetRelativePath(rootPath, directory));
yield return normalized;
}
}
else
{
var absolute = CombineAndNormalize(rootPath, cleanedPattern);
if (absolute is null || !Directory.Exists(absolute))
{
yield break;
}
var normalized = NormalizeRelative(Path.GetRelativePath(rootPath, absolute));
yield return normalized;
}
}
private static string? CombineAndNormalize(string rootPath, string relative)
{
var candidate = Path.GetFullPath(Path.Combine(rootPath, relative.Replace('/', Path.DirectorySeparatorChar)));
return IsUnderRoot(rootPath, candidate) ? candidate : null;
}
private static string NormalizeRelative(string relativePath)
{
if (string.IsNullOrEmpty(relativePath) || relativePath == ".")
{
return string.Empty;
}
var normalized = relativePath.Replace('\\', '/');
normalized = normalized.TrimStart('.', '/');
return normalized;
}
private static bool IsUnderRoot(string rootPath, string absolutePath)
{
var comparison = OperatingSystem.IsWindows() ? StringComparison.OrdinalIgnoreCase : StringComparison.Ordinal;
if (string.Equals(absolutePath, rootPath, comparison))
{
return true;
}
// Require a trailing separator boundary so sibling paths that merely share the root's prefix are rejected.
var rootWithSeparator = Path.EndsInDirectorySeparator(rootPath) ? rootPath : rootPath + Path.DirectorySeparatorChar;
return absolutePath.StartsWith(rootWithSeparator, comparison);
}
}


@@ -0,0 +1,37 @@
using StellaOps.Scanner.Analyzers.Lang.Node.Internal;
namespace StellaOps.Scanner.Analyzers.Lang.Node;
public sealed class NodeLanguageAnalyzer : ILanguageAnalyzer
{
public string Id => "node";
public string DisplayName => "Node.js Analyzer";
public async ValueTask AnalyzeAsync(LanguageAnalyzerContext context, LanguageComponentWriter writer, CancellationToken cancellationToken)
{
ArgumentNullException.ThrowIfNull(context);
ArgumentNullException.ThrowIfNull(writer);
var lockData = await NodeLockData.LoadAsync(context.RootPath, cancellationToken).ConfigureAwait(false);
var packages = NodePackageCollector.CollectPackages(context, lockData, cancellationToken);
foreach (var package in packages.OrderBy(static p => p.ComponentKey, StringComparer.Ordinal))
{
cancellationToken.ThrowIfCancellationRequested();
var metadata = package.CreateMetadata();
var evidence = package.CreateEvidence();
writer.AddFromPurl(
analyzerId: Id,
purl: package.Purl,
name: package.Name,
version: package.Version,
type: "npm",
metadata: metadata,
evidence: evidence,
usedByEntrypoint: package.IsUsedByEntrypoint);
}
}
}


@@ -0,0 +1,6 @@
namespace StellaOps.Scanner.Analyzers.Lang.Node;
internal static class Placeholder
{
// Placeholder retained from project scaffolding; NodeLanguageAnalyzer now carries the analyzer implementation.
}


@@ -0,0 +1,20 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<LangVersion>preview</LangVersion>
<Nullable>enable</Nullable>
<ImplicitUsings>enable</ImplicitUsings>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
<EnableDefaultItems>false</EnableDefaultItems>
</PropertyGroup>
<ItemGroup>
<Compile Include="**\*.cs" Exclude="obj\**;bin\**" />
<EmbeddedResource Include="**\*.json" Exclude="obj\**;bin\**" />
<None Include="**\*" Exclude="**\*.cs;**\*.json;bin\**;obj\**" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\StellaOps.Scanner.Analyzers.Lang\StellaOps.Scanner.Analyzers.Lang.csproj" />
</ItemGroup>
</Project>


@@ -0,0 +1,10 @@
# Node Analyzer Task Flow
| Seq | ID | Status | Depends on | Description | Exit Criteria |
|-----|----|--------|------------|-------------|---------------|
| 1 | SCANNER-ANALYZERS-LANG-10-302A | DONE (2025-10-19) | SCANNER-ANALYZERS-LANG-10-307 | Build deterministic module graph walker covering npm, Yarn, and PNPM; capture package.json provenance and integrity metadata. | Walker indexes >100k modules in <1.5s (hot cache); golden fixtures verify deterministic ordering and path normalization. |
| 2 | SCANNER-ANALYZERS-LANG-10-302B | DONE (2025-10-19) | SCANNER-ANALYZERS-LANG-10-302A | Resolve workspaces/symlinks and attribute components to originating package with usage hints; guard against directory traversal. | Workspace attribution accurate on multi-workspace fixture; symlink resolver proves canonical path; security tests ensure no traversal. |
| 3 | SCANNER-ANALYZERS-LANG-10-302C | DONE (2025-10-19) | SCANNER-ANALYZERS-LANG-10-302B | Surface script metadata (postinstall/preinstall) and policy hints; emit telemetry counters and evidence records. | Analyzer output includes script metadata + evidence; metrics `scanner_analyzer_node_scripts_total` recorded; policy hints documented. |
| 4 | SCANNER-ANALYZERS-LANG-10-307N | TODO | SCANNER-ANALYZERS-LANG-10-302C | Integrate shared helpers for license/licence evidence, canonical JSON serialization, and usage flag propagation. | Reuse shared helpers without duplication; unit tests confirm stable metadata merge; no analyzer-specific serializer drift. |
| 5 | SCANNER-ANALYZERS-LANG-10-308N | TODO | SCANNER-ANALYZERS-LANG-10-307N | Author determinism harness + fixtures for Node analyzer (snapshot assertion sketched below); add benchmark suite. | Fixtures committed under `Fixtures/lang/node/`; determinism CI job compares JSON snapshots; benchmark CSV published. |
| 6 | SCANNER-ANALYZERS-LANG-10-309N | TODO | SCANNER-ANALYZERS-LANG-10-308N | Package Node analyzer as restart-time plug-in (manifest, DI registration, Offline Kit notes). | Manifest copied to `plugins/scanner/analyzers/lang/`; Worker loads analyzer after restart; Offline Kit docs updated. |
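
For SCANNER-ANALYZERS-LANG-10-308N, a minimal snapshot assertion could look like the sketch below; the `UPDATE_FIXTURES` switch, the helper name, and the fixture layout are assumptions rather than existing harness APIs:

```csharp
using System;
using System.IO;
using Xunit;

// Hypothetical golden-fixture assertion for the determinism harness.
internal static class GoldenFixture
{
    public static void AssertMatches(string fixturePath, string actualCanonicalJson)
    {
        // Opt-in regeneration keeps fixture updates deliberate and reviewable.
        if (Environment.GetEnvironmentVariable("UPDATE_FIXTURES") == "1")
        {
            File.WriteAllText(fixturePath, actualCanonicalJson);
            return;
        }

        var expected = File.ReadAllText(fixturePath).ReplaceLineEndings("\n");
        Assert.Equal(expected, actualCanonicalJson.ReplaceLineEndings("\n"));
    }
}
```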
| 6 | SCANNER-ANALYZERS-LANG-10-309N | TODO | SCANNER-ANALYZERS-LANG-10-308N | Package Node analyzer as restart-time plug-in (manifest, DI registration, Offline Kit notes). | Manifest copied to `plugins/scanner/analyzers/lang/`; Worker loads analyzer after restart; Offline Kit docs updated. |