Blueprint for a 2026-Ready Scanner (27 Nov 2025)

Here are some key developments in the software supply-chain and vulnerability-scoring world that you'll want on your radar.


1. CVSS v4.0: traceable scoring with richer context

  • CVSS v4.0 was officially released by FIRST (Forum of Incident Response and Security Teams) on November 1, 2023. (first.org)
  • The specification now clearly divides metrics into four groups: Base, Threat, Environmental, and Supplemental. (first.org)
  • The National Vulnerability Database (NVD) has added support for CVSS v4.0 — meaning newer vulnerability records can carry v4-style scores, vector strings and search filters. (NVD)
  • What's new and tangible: better granularity, an explicit “Attack Requirements” metric, and richer metadata to better reflect real-world contextual risk. (Seemplicity)
  • Why this matters: it enables more traceable evidence of how a score was derived (which metrics were used, in what context), supporting auditing, prioritisation and transparency.

Takeaway for your world: If you're leveraging vulnerability scanning, SBOM enrichment or compliance workflows (given your interest in SBOM/VEX/provenance), then moving to or supporting CVSS v4.0 ensures you have stronger traceability and richer scoring context that maps into policy, audit and remediation workflows.


2. CycloneDX v1.7: SBOM/VEX/provenance with cryptographic & IP transparency

  • Version 1.7 of the SBOM standard from the OWASP Foundation (CycloneDX) launched on October 21, 2025. (CycloneDX)
  • Key enhancements: Cryptography Bill of Materials (CBOM) support (listing algorithm families, elliptic curves, etc.) and structured citations (who provided component info, how, when) to improve provenance. (CycloneDX)
  • Provenance use cases: the spec enables declaring supplier/author/publisher metadata, component origin and external references. (CycloneDX)
  • Broadening scope: CycloneDX now covers not just SBOMs (software) but also hardware BOMs (HBOM), machine-learning BOMs and cryptographic BOMs (CBOM), and supports VEX/attestation use cases. (openssf.org)
  • Why this matters: for your StellaOps architecture (with its strong emphasis on provenance, deterministic scans and trust frameworks), CycloneDX v1.7 provides native standard support for deeper audit-ready evidence, cryptographic-algorithm visibility (which matters for crypto-sovereign readiness) and formal attestations/citations in the BOM.

Takeaway: Aligning your SBOM/VEX/provenance stack (e.g., scanner.webservice) to output CycloneDX v1.7-compliant artifacts means you jump ahead in terms of traceability, auditability and future-proofing (crypto and IP).


3. SLSA v1.2 Release Candidate 2: supply-chain build provenance standard

  • On November 10, 2025, the Open Source Security Foundation (via the SLSA community) announced RC2 of SLSA v1.2, open for public comment until November 24, 2025. (SLSA)
  • What's new: introduction of a Source Track (in addition to the Build Track) to capture source-control provenance, distributed provenance and artifact attestations. (SLSA)
  • The specification clarifies provenance/attestation formats and how builds should be produced, distributed and verified. (SLSA)
  • Why this matters: SLSA gives you a standard framework for “I can trace this binary back to the code, the build system, the signer, the provenance chain,” which aligns directly with your strategic moats around deterministic replayable scans, a proof-of-integrity graph and attestations.

Takeaway: If you integrate SLSA v1.2 (once finalised) into StellaOps, you gain an industry-recognised standard for build provenance and attestation, complementing your SBOM/VEX and CVSS codebases.


Why I'm sharing this with you

Given your interest in cryptographic-sovereign readiness, deterministic scanning, provenance and audit-grade supply-chain tooling (your StellaOps moat list), this trifecta (CVSS v4.0 + CycloneDX v1.7 + SLSA v1.2) represents the major standards you need to converge on. Each addresses a different layer: vulnerability scoring, component provenance, and build/trust-chain assurance. Aligning all three gives you a strong governance and tooling stack.

If you like, I can pull together a detailed gap-analysis table (your current architecture versus what these standards demand) and propose roadmap steps for StellaOps to adopt them.

Cool, let's turn all that standards talk into something your engineers can actually build against.

Below is a concrete implementation plan, broken into three standard-specific workstreams plus one cross-cutting track, each with phases, tasks and clear acceptance criteria:

  • A — CVSS v4.0 integration (scoring & evidence)
  • B — CycloneDX 1.7 SBOM/CBOM + provenance
  • C — SLSA 1.2 (build + source provenance)
  • X — Cross-cutting (APIs, UX, docs, rollout)

I'll assume you have:

  • A scanner / ingestion pipeline,
  • A central data model (DB or graph),
  • An API + UI layer (StellaOps console or similar),
  • CI/CD on GitHub/GitLab/whatever.

A. CVSS v4.0 integration

Goal: Your platform can ingest, calculate, store and expose CVSS v4.0 scores and vectors alongside (or instead of) v3.x, using the official FIRST spec and NVD data. (FIRST)

A1. Foundations & decisions

Tasks

  1. Pick a canonical CVSS v4.0 library or implementation

    • Evaluate existing OSS libraries for your main language(s), or plan an internal one based directly on FIRST's spec (Base, Threat, Environmental, Supplemental groups).

    • Decide:

      • Supported metric groups (Base only vs. Base+Threat+Environmental+Supplemental).
      • Which groups your UI will expose/edit vs. read-only from upstream feeds.
  2. Versioning strategy

    • Decide how to represent CVSS v3.0/v3.1/v4.0 in your DB:

      • vulnerability_scores table with version, vector, base_score, environmental_score, temporal_score, severity_band.
    • Define precedence rules: if both v3.1 and v4.0 scores exist, decide which one drives your “headline” severity (see the sketch at the end of this subsection).

Acceptance criteria

  • Tech design doc reviewed & approved.
  • Decision on library vs. custom implementation recorded.
  • DB schema migration plan ready.
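
To make the precedence decision testable, it helps to capture it as a small, pure function that the headline-severity logic calls. A minimal sketch in Python, assuming a prefer-v4.0-then-newest-v3.x policy; the record shape and helper names are illustrative, not an existing StellaOps API:

    # Precedence sketch (A1): prefer CVSS v4.0 when present, otherwise fall
    # back to the newest v3.x record. Record shape and names are illustrative.
    from dataclasses import dataclass
    from typing import Optional, Sequence

    @dataclass
    class CvssScore:
        version: str      # "2.0", "3.0", "3.1", "4.0"
        source: str       # "NVD", "scanner", "manual"
        vector: str
        base_score: float
        severity: str

    def headline_score(scores: Sequence[CvssScore]) -> Optional[CvssScore]:
        """Pick the record that drives the headline severity."""
        v4 = [s for s in scores if s.version == "4.0"]
        if v4:
            # Assumption: when several v4 sources exist, trust NVD first,
            # then the highest base score.
            return sorted(v4, key=lambda s: (s.source != "NVD", -s.base_score))[0]
        v3 = [s for s in scores if s.version.startswith("3.")]
        if v3:
            return sorted(v3, key=lambda s: (s.version, s.base_score), reverse=True)[0]
        return scores[0] if scores else None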

A2. Data model & storage

Tasks

  1. DB schema changes

    • Add a cvss_scores table or expand the existing vulnerability table, e.g.:

      cvss_scores
        id (PK)
        vuln_id (FK)
        source (enum: NVD, scanner, manual)
        version (enum: 2.0, 3.0, 3.1, 4.0)
        vector (string)
        base_score (float)
        threat_score (float, nullable)        // v4.0 Threat group (replaces Temporal)
        temporal_score (float, nullable)      // v3.x Temporal group
        environmental_score (float, nullable)
        severity (enum: NONE/LOW/MEDIUM/HIGH/CRITICAL)
        metrics_json (JSONB)  // raw metrics for traceability
        created_at / updated_at
      
  2. Traceable evidence

    • Store:

      • Raw CVSS vector string (e.g. CVSS:4.0/AV:N/...(etc)).
      • Parsed metrics as JSON for audit (show “why” a score is what it is).
    • Optional: add calculated_by + calculated_at for your internal scoring runs.

Acceptance criteria

  • Migrations applied in dev.
  • Read/write repository functions implemented and unit-tested.

A3. Ingestion & calculation

Tasks

  1. NVD / external feeds

    • Update your NVD ingestion to read CVSS v4.0 when present in JSON metrics fields. (NVD)
    • Map NVD → internal cvss_scores model.
  2. Local CVSS v4.0 calculator service (see the sketch at the end of this subsection)

    • Implement a service (or module) that:

      • Accepts metric values (Base/Threat/Environmental/Supplemental).

      • Produces:

        • Canonical vector.
        • Base/Threat/Environmental scores.
        • Severity band.
    • Make this callable by:

      • Scanner engine (calculating scores for private vulns).
      • UI (recalculate button).
      • API (for automated clients).

Acceptance criteria

  • Given a set of reference vectors from FIRST, your calculator returns exact expected scores.
  • NVD ingestion for a sample of CVEs produces v4 scores in your DB.
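
To pin down the A3 contract, here is a minimal sketch of the calculator module's surface: canonical-vector assembly in the published v4.0 base-metric order, plus a severity-banding helper using the spec's qualitative ratings. The numeric scoring itself is delegated to whichever implementation A1 selects; score_v4 below is a placeholder, not a real library call.

    # Calculator surface sketch (A3). Vector assembly follows the CVSS v4.0
    # base-metric order; numeric scoring is delegated to the implementation
    # chosen in A1 (`score_v4` is a placeholder, not a real library call).
    from typing import Dict

    BASE_ORDER = ["AV", "AC", "AT", "PR", "UI", "VC", "VI", "VA", "SC", "SI", "SA"]

    def canonical_vector(metrics: Dict[str, str]) -> str:
        """Build a canonical CVSS:4.0 vector string from base-metric values."""
        missing = [m for m in BASE_ORDER if m not in metrics]
        if missing:
            raise ValueError(f"missing base metrics: {missing}")
        # Threat/Environmental/Supplemental metrics would be appended here,
        # in their own canonical order, when supplied.
        return "CVSS:4.0/" + "/".join(f"{m}:{metrics[m]}" for m in BASE_ORDER)

    def severity_band(score: float) -> str:
        """Qualitative ratings as defined in the CVSS specification."""
        if score == 0.0:
            return "NONE"
        if score < 4.0:
            return "LOW"
        if score < 7.0:
            return "MEDIUM"
        if score < 9.0:
            return "HIGH"
        return "CRITICAL"

    def score_v4(vector: str) -> float:
        """Placeholder: call the library or internal scorer selected in A1."""
        raise NotImplementedError("plug in the CVSS v4.0 scorer chosen in A1")

    def calculate(metrics: Dict[str, str]) -> dict:
        vector = canonical_vector(metrics)
        base = score_v4(vector)
        return {
            "vector": vector,
            "base_score": base,
            "severity": severity_band(base),
            "metrics": metrics,   # persisted as metrics_json for traceability
        }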

A4. UI & API

Tasks

  1. API

    • Extend vulnerability API payload with:

      {
        "id": "CVE-2024-XXXX",
        "cvss": [
          {
            "version": "4.0",
            "source": "NVD",
            "vector": "CVSS:4.0/AV:N/...",
            "base_score": 8.3,
            "severity": "HIGH",
            "metrics": { "...": "..." }
          }
        ]
      }
      
    • Add filters: cvss.version, cvss.min_score, cvss.severity.

  2. UI

    • On vulnerability detail:

      • Show v3.x and v4.0 side-by-side.
      • Expandable panel with metric breakdown and “explain my score” text.
    • On list views:

      • Support sorting & filtering by v4.0 base score & severity.

Acceptance criteria

  • Frontend can render v4.0 vectors and scores.
  • QA can filter vulnerabilities using v4 metrics via API and UI.
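
As a usage illustration of the filters above, an automated client might query the API like this; the endpoint path, parameter names, auth header and the "items" envelope mirror the draft above and are assumptions, not a shipped StellaOps API:

    # Illustrative client call for the v4 filters drafted above. Endpoint path,
    # parameter names, auth header and the "items" envelope are assumptions.
    import requests

    resp = requests.get(
        "https://stellaops.example.com/api/v1/vulnerabilities",
        params={"cvss.version": "4.0", "cvss.min_score": 7.0, "cvss.severity": "HIGH"},
        headers={"Authorization": "Bearer <token>"},
        timeout=30,
    )
    resp.raise_for_status()
    for vuln in resp.json()["items"]:
        v4 = [c for c in vuln["cvss"] if c["version"] == "4.0"]
        print(vuln["id"], [c["vector"] for c in v4])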

A5. Migration & rollout

Tasks

  1. Backfill

    • For all stored vulnerabilities where metrics exist:

      • If v4 not present but inputs available, compute v4.
    • Store both historical (v3.x) and new v4 for comparison.

  2. Feature flag / rollout

    • Introduce feature flag cvss_v4_enabled per tenant or environment.
    • Run A/B comparison internally before enabling for all users.

Acceptance criteria

  • Backfill job runs successfully on staging data.
  • Rollout plan + rollback strategy documented.
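
A minimal sketch of the A5 backfill job, gated by the cvss_v4_enabled flag; repo and feature_flags (and their methods) are illustrative interfaces, and calculate is the A3 calculator sketched earlier:

    # Backfill sketch (A5): compute v4 where inputs exist, keep v3.x alongside.
    # `repo` and `feature_flags` are illustrative interfaces; `calculate` is the
    # A3 calculator sketched earlier.

    def backfill_cvss_v4(repo, feature_flags, tenant_id: str) -> dict:
        if not feature_flags.enabled("cvss_v4_enabled", tenant_id):
            return {"skipped_tenant": tenant_id}
        computed = skipped = 0
        for vuln in repo.iter_vulnerabilities(tenant_id):
            has_v4 = any(s.version == "4.0" for s in vuln.scores)
            inputs = repo.get_v4_metric_inputs(vuln.id)   # None if not derivable
            if has_v4 or inputs is None:
                skipped += 1
                continue
            result = calculate(inputs)                    # vector, score, severity
            repo.add_score(vuln.id, version="4.0", source="scanner", **result)
            computed += 1
        return {"computed": computed, "skipped": skipped}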

B. CycloneDX 1.7 SBOM/CBOM + provenance

CycloneDX 1.7 is now the current spec; it adds things like a Cryptography BOM (CBOM) and structured citations/provenance to strengthen trust and traceability. (CycloneDX)

B1. Decide scope & generators

Tasks

  1. Select BOM formats & languages

    • JSON as your primary format (application/vnd.cyclonedx+json). (CycloneDX)

    • Components you'll cover:

      • Application BOMs (packages, containers).
      • Optional: infrastructure (IaC, images).
      • Optional: CBOM for crypto usage.
  2. Choose or implement generators

    • For each ecosystem (e.g., Maven, NPM, PyPI, containers), choose:

      • Existing tools (cyclonedx-maven-plugin, cyclonedx-npm, etc).
      • Or central generator using lockfiles/manifests.

Acceptance criteria

  • Matrix of ecosystems → generator tool finalized.
  • POC shows valid CycloneDX 1.7 JSON BOM for one representative project.

B2. Schema alignment & validation

Tasks

  1. Model updates

    • Extend your internal SBOM model to include:

      • spec_version: "1.7"
      • bomFormat: "CycloneDX"
      • serialNumber (UUID/URI).
      • metadata.tools (how BOM was produced).
      • properties, licenses, crypto (for CBOM).
    • For provenance:

      • metadata.authors, metadata.manufacture, metadata.supplier.
      • components[x].evidence and components[x].properties for evidence & citations. (CycloneDX)
  2. Validation pipeline

    • Integrate the official CycloneDX JSON schema validation step into:

      • CI (for projects generating BOMs).
      • Your ingestion path (reject/flag invalid BOMs).

Acceptance criteria

  • Any BOM produced must pass CycloneDX 1.7 JSON schema validation in CI.
  • Ingestion rejects malformed BOMs with clear error messages.
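
A minimal sketch of the shared validation step for CI and ingestion, assuming the official CycloneDX 1.7 JSON schema (and any schemas it references) has been vendored locally; confirm the exact schema filename against the CycloneDX specification repository:

    # BOM validation sketch (B2): structural checks plus the official JSON schema.
    # Assumes the CycloneDX 1.7 schema is vendored locally (filename assumed to
    # follow the project's bom-<version>.schema.json convention); any schemas it
    # $refs (e.g. the SPDX license list schema) must be vendored and resolvable too.
    import json
    import sys

    import jsonschema   # pip install jsonschema

    def validate_bom(path: str, schema_path: str = "schemas/bom-1.7.schema.json") -> list:
        with open(path) as f:
            bom = json.load(f)
        errors = []
        if bom.get("bomFormat") != "CycloneDX":
            errors.append("bomFormat must be 'CycloneDX'")
        if bom.get("specVersion") != "1.7":
            errors.append(f"specVersion is {bom.get('specVersion')!r}, expected '1.7'")
        with open(schema_path) as f:
            schema = json.load(f)
        try:
            jsonschema.validate(instance=bom, schema=schema)
        except jsonschema.ValidationError as exc:
            errors.append(exc.message)
        return errors

    if __name__ == "__main__":
        problems = validate_bom(sys.argv[1])
        if problems:
            print("\n".join(problems))
            sys.exit(1)   # fail the CI step, or flag/reject on ingestion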

B3. Provenance & citations in BOMs

Tasks

  1. Define provenance policy

    • Minimal set for every BOM:

      • Author (CI system / team).
      • Build pipeline ID, commit, repo URL.
      • Build time.
    • Extended:

      • externalReferences for:

        • Build logs.
        • SLSA attestations.
        • Security reports (e.g., scanner runs).
  2. Implement metadata injection

    • In your CI templates:

      • Capture build info (commit SHA, pipeline ID, creator, environment).
      • Add it into CycloneDX metadata and properties.
    • For evidence:

      • Use components[x].evidence to reference where a component was detected (e.g., file paths, manifest lines).

Acceptance criteria

  • For any BOM, engineers can trace:

    • WHO built it.
    • WHEN it was built.
    • WHICH repo/commit/pipeline it came from.
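
A minimal sketch of the CI-side metadata injection, run as a step after the BOM generator. The environment variable names are GitHub Actions'; swap in your CI's equivalents. The stellaops:* property names are illustrative, not part of the CycloneDX spec:

    # Metadata-injection sketch (B3), intended to run as a CI step after the BOM
    # generator. Environment variable names are GitHub Actions'; adjust per CI.
    # The stellaops:* property names are illustrative.
    import json
    import os
    from datetime import datetime, timezone

    def inject_provenance(bom_path: str) -> None:
        with open(bom_path) as f:
            bom = json.load(f)
        meta = bom.setdefault("metadata", {})
        meta["timestamp"] = datetime.now(timezone.utc).isoformat()
        meta.setdefault("authors", []).append({"name": "StellaOps CI"})
        props = meta.setdefault("properties", [])
        props += [
            {"name": "stellaops:build:commit", "value": os.environ.get("GITHUB_SHA", "")},
            {"name": "stellaops:build:pipeline", "value": os.environ.get("GITHUB_RUN_ID", "")},
            {"name": "stellaops:build:repo", "value": os.environ.get("GITHUB_REPOSITORY", "")},
        ]
        with open(bom_path, "w") as f:
            json.dump(bom, f, indent=2)

    if __name__ == "__main__":
        inject_provenance("bom.json")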

B4. CBOM (Cryptography BOM) support (optional but powerful)

Tasks

  1. Crypto inventory

    • Scanner enhancement:

      • Detect crypto libraries & primitives used (e.g., OpenSSL, bcrypt, TLS versions).
      • Map them into CycloneDX CBOM structures in crypto sections (per spec).
  2. Policy hooks

    • Define policy checks:

      • “Disallow SHA-1,”
      • “Warn on RSA < 2048 bits,”
      • “Flag non-FIPS-approved algorithms.”

Acceptance criteria

  • From a BOM, you can list all cryptographic algorithms and libraries used in an application.
  • At least one simple crypto policy implemented (e.g., SHA-1 usage alert).
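
A minimal sketch of one such policy check over a CBOM-bearing document; it keys off the cryptographic-asset component type, and the exact CBOM field layout should be confirmed against the 1.7 specification before relying on it:

    # Crypto policy sketch (B4): flag disallowed algorithms from a CBOM-bearing
    # CycloneDX document. Keys off the cryptographic-asset component type;
    # confirm exact property names against the 1.7 specification.
    import json

    DISALLOWED = {"sha-1", "sha1", "md5"}   # illustrative policy list

    def crypto_findings(bom_path: str) -> list:
        with open(bom_path) as f:
            bom = json.load(f)
        findings = []
        for comp in bom.get("components", []):
            if comp.get("type") != "cryptographic-asset":
                continue
            if comp.get("name", "").lower() in DISALLOWED:
                findings.append(f"disallowed algorithm in use: {comp.get('name')}")
        return findings

    if __name__ == "__main__":
        for finding in crypto_findings("bom.json"):
            print("POLICY:", finding)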

B5. Ingestion, correlation & UI

Tasks

  1. Ingestion service

    • API endpoint: POST /sboms accepting CycloneDX 1.7 JSON.

    • Store:

      • Raw BOM (for evidence).

      • Normalized component graph (packages, relationships).

      • Link BOM to:

        • Repo/project.
        • Build (from SLSA provenance).
        • Deployed asset.
  2. Correlation

    • Join SBOM components with:

      • Vulnerability data (CVE/CWE/CPE/PURL).
      • Crypto policy results.
    • Maintain “asset → BOM → components → vulnerabilities” graph.

  3. UI

    • For any service/image:

      • Show latest BOM metadata (CycloneDX version, timestamp).
      • Component list with vulnerability badges.
      • Crypto tab (if CBOM enabled).
      • Provenance tab (author, build pipeline, SLSA attestation links).

Acceptance criteria

  • Given an SBOM upload, the UI shows:

    • Components.
    • Associated vulnerabilities.
    • Provenance metadata.
  • API consumers can fetch SBOM + correlated risk in a single call.
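
A minimal sketch of the POST /sboms ingestion endpoint (FastAPI used purely for illustration); the in-memory store stands in for the real evidence store, and the normalisation/correlation step would run asynchronously in practice:

    # Ingestion sketch (B5): accept CycloneDX 1.7 JSON, keep the raw document as
    # evidence, then hand off to normalisation/correlation. FastAPI is used only
    # for illustration; the in-memory dict stands in for the real evidence store.
    from uuid import uuid4
    from fastapi import FastAPI, HTTPException, Request

    app = FastAPI()
    RAW_BOMS: dict[str, dict] = {}   # stand-in for the real evidence store

    @app.post("/sboms", status_code=202)
    async def ingest_sbom(request: Request):
        bom = await request.json()
        if bom.get("bomFormat") != "CycloneDX" or bom.get("specVersion") != "1.7":
            raise HTTPException(status_code=400, detail="expected a CycloneDX 1.7 BOM")
        # Full schema validation (the B2 helper) would run here; invalid BOMs are
        # rejected with a descriptive error rather than silently dropped.
        bom_id = str(uuid4())
        RAW_BOMS[bom_id] = bom        # raw BOM kept verbatim as evidence
        # Next step (async in practice): normalise components, link to
        # repo/build/deployed asset, correlate with vulnerability and crypto data.
        return {"id": bom_id, "status": "accepted"}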


C. SLSA 1.2 build + source provenance

SLSA 1.2 (currently at RC2) introduces a Source Track in addition to the Build Track, defining levels and attestation formats for both source-control and build provenance. (SLSA)

C1. Target SLSA levels & scope

Tasks

  1. Choose target levels

    • For each critical product:

      • Pick Build Track level (e.g., target L2 now, L3 later).
      • Pick Source Track level (e.g., L1 for all, L2 for sensitive repos).
  2. Repo inventory

    • Classify repos by risk:

      • Critical (agents, scanners, control-plane).
      • Important (integrations).
      • Low-risk (internal tools).
    • Map target SLSA levels accordingly.

Acceptance criteria

  • For every repo, there is an explicit target SLSA Build + Source level.
  • Gap analysis doc exists (current vs target).

C2. Build provenance in CI/CD

Tasks

  1. Attestation generation

    • For each CI pipeline:

      • Use SLSA-compatible builders or tooling (e.g., slsa-github-generator, slsa-framework actions, Tekton Chains, etc.) to produce build provenance attestations in SLSA 1.2 format.
    • Attestation content includes:

      • Builder identity.
      • Build inputs (commit, repo, config).
      • Build parameters.
      • Produced artifacts (digest, image tags).
  2. Signing & storage

    • Sign attestations (Sigstore/cosign or equivalent).

    • Store:

      • In an OCI registry (as artifacts).
      • Or in a dedicated provenance store.
    • Expose pointer to attestation in:

      • BOM (externalReferences).
      • Your StellaOps metadata.

Acceptance criteria

  • For any built artifact (image/binary), you can retrieve a SLSA attestation proving:

    • What source it came from.
    • Which builder ran.
    • What steps were executed.
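
For reference, this is the rough shape of a build-provenance attestation as produced by SLSA-compatible builders: an in-toto statement carrying the slsa.dev/provenance/v1 predicate. Values are placeholders, and field names should be re-checked against the finalised SLSA 1.2 text before implementation:

    # Shape of a SLSA build-provenance attestation (an in-toto statement with
    # the slsa.dev/provenance/v1 predicate). Values are placeholders; re-check
    # field names against the finalised SLSA 1.2 specification.
    ATTESTATION = {
        "_type": "https://in-toto.io/Statement/v1",
        "subject": [
            {"name": "registry.example.com/scanner", "digest": {"sha256": "<image-digest>"}}
        ],
        "predicateType": "https://slsa.dev/provenance/v1",
        "predicate": {
            "buildDefinition": {
                "buildType": "<builder-specific build type URI>",
                "externalParameters": {"repository": "<repo URL>", "ref": "<git ref>"},
                "resolvedDependencies": [
                    {"uri": "git+<repo URL>", "digest": {"gitCommit": "<commit SHA>"}}
                ],
            },
            "runDetails": {
                "builder": {"id": "<trusted builder identity>"},
                "metadata": {"invocationId": "<pipeline/run ID>"},
            },
        },
    }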

C3. Source Track controls

Tasks

  1. Source provenance

    • Implement controls to support SLSA Source Track:

      • Enforce protected branches.
      • Require code review (e.g., 2 reviewers) for main branches.
      • Require signed commits for critical repos.
    • Log:

      • Author, reviewers, branch, PR ID, merge SHA.
  2. Source attestation

    • For each release:

      • Generate source attestations capturing:

        • Repo URL and commit.
        • Review status.
        • Policy compliance (review count, checks passing).
    • Link these to build attestations (Source → Build provenance chain).

Acceptance criteria

  • For a release, you can prove:

    • Which reviews happened.
    • Which branch strategy was followed.
    • That policies were met at merge time.

C4. Verification & policy in StellaOps

Tasks

  1. Verifier service

    • Implement a service that:

      • Fetches SLSA attestations (source + build).

      • Verifies signatures and integrity.

      • Evaluates them against policies:

        • “Artifact must have SLSA Build L2 attestation from trusted builders.”
        • “Critical services must have Source L2 attestation (review, branch protections).”
  2. Runtime & deployment gates

    • Integrate verification into:

      • Admission controller (Kubernetes or deployment gate).
      • CI release stage (block promotion if SLSA requirements not met).
  3. UI

    • On artifact/service detail page:

      • Surface SLSA level achieved (per track).
      • Status (pass/fail).
      • Drill-down view of attestation evidence (who built, when, from where).

Acceptance criteria

  • A deployment can be blocked (in a test env) when SLSA requirements are not satisfied.
  • Operators can visually see SLSA status for an artifact/service.
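
A minimal sketch of the C4 policy evaluation, operating on an already signature-verified, decoded attestation; the policy fields and helper names are illustrative:

    # Policy evaluation sketch (C4), applied to an already signature-verified,
    # decoded build attestation. Policy fields and names are illustrative.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class SlsaPolicy:
        trusted_builders: List[str]
        required_build_level: int = 2
        require_source_attestation: bool = False

    def evaluate(attestation: dict, achieved_build_level: int,
                 has_source_attestation: bool, policy: SlsaPolicy) -> List[str]:
        """Return a list of violations; an empty list means the gate passes."""
        violations = []
        builder = (attestation.get("predicate", {})
                              .get("runDetails", {})
                              .get("builder", {})
                              .get("id", ""))
        if builder not in policy.trusted_builders:
            violations.append(f"untrusted builder: {builder or '<missing>'}")
        if achieved_build_level < policy.required_build_level:
            violations.append(f"SLSA Build L{policy.required_build_level} required, "
                              f"L{achieved_build_level} achieved")
        if policy.require_source_attestation and not has_source_attestation:
            violations.append("missing SLSA Source attestation")
        return violations

An admission controller or CI release stage would then block promotion whenever evaluate(...) returns a non-empty list.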

X. Cross-cutting: APIs, UX, docs, rollout

X1. Unified data model & APIs

Tasks

  1. Graph relationships

    • Model the relationship:

      • Source repo → SLSA Source attestation → Build attestation → Artifact → SBOM (CycloneDX 1.7) → Components → Vulnerabilities (CVSS v4).
  2. Graph queries

    • Build API endpoints for:

      • “Given a CVE, show all affected artifacts and their SLSA + BOM evidence.”
      • “Given an artifact, show its full provenance chain and risk posture.”

Acceptance criteria

  • At least two end-to-end queries work:

    • CVE → impacted assets with scores + provenance.
    • Artifact → SBOM + vulnerabilities + SLSA + crypto posture.
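
A minimal sketch of the first end-to-end query, assuming the relationship model above is exposed through a repository layer; graph and its methods are illustrative names, not an existing StellaOps API:

    # Graph-query sketch (X1): "given a CVE, show affected artifacts with their
    # SLSA and BOM evidence." `graph` is an illustrative repository over the
    # relationship model above, not an existing StellaOps API.

    def cve_impact(graph, cve_id: str) -> dict:
        result = {"cve": cve_id, "artifacts": []}
        for component in graph.components_affected_by(cve_id):
            for bom in graph.boms_containing(component.id):
                artifact = graph.artifact_for_bom(bom.id)
                result["artifacts"].append({
                    "artifact": artifact.digest,
                    "component": component.purl,
                    "cvss_v4": graph.headline_score(cve_id, version="4.0"),
                    "sbom": {"serialNumber": bom.serial_number,
                             "specVersion": bom.spec_version},
                    "slsa": graph.attestation_refs(artifact.digest),  # source + build
                })
        return result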

X2. Observability & auditing

Tasks

  1. Audit logs

    • Log:

      • BOM uploads and generators.
      • SLSA attestation creation/verification.
      • CVSS recalculations (who/what triggered them).
  2. Metrics

    • Track:

      • % of builds with valid SLSA attestations.
      • % artifacts with CycloneDX 1.7 BOMs.
      • % vulns with v4 scores.
    • Expose dashboards (Prometheus/Grafana or similar).

Acceptance criteria

  • Dashboards exist showing coverage for:

    • CVSS v4 adoption.
    • CycloneDX 1.7 coverage.
    • SLSA coverage.

X3. Documentation & developer experience

Tasks

  1. Developer playbooks

    • Short, repo-friendly docs:

      • “How to enable CycloneDX BOM generation in this repo.”
      • “How to ensure your service reaches SLSA Build L2.”
      • “How to interpret CVSS v4 in StellaOps.”
  2. Templates

    • CI templates:

      • bom-enabled-pipeline.yaml
      • slsa-enabled-pipeline.yaml
    • Code snippets:

      • API examples for pushing SBOMs.
      • API examples for querying risk posture.

Acceptance criteria

  • A new project can:

    • Copy a CI template.
    • Produce a validated CycloneDX 1.7 BOM.
    • Generate SLSA attestations.
    • Show up correctly in StellaOps with CVSS v4 scoring.

If you'd like, as a next step I can:

  • Turn this into a Jira-ready epic + stories breakdown, or
  • Draft concrete API schemas (OpenAPI/JSON) for SBOM ingestion, CVSS scoring, and SLSA attestation verification.