Align AOC tasks for Excititor and Concelier

# Deployment & Operations — Agent Charter

## Mission
Maintain deployment/upgrade/rollback workflows (Helm/Compose) per `docs/modules/devops/ARCHITECTURE.md`, including environment-specific configs.

# DevOps & Release — Agent Charter

## Mission
Execute the deterministic build/release pipeline per `docs/modules/devops/ARCHITECTURE.md`:

- Reproducible builds with SBOM/provenance, cosign signing, and transparency logging.
- Channel manifests (LTS/Stable/Edge) with digests and Helm/Compose profiles.
- Performance guard jobs that enforce the performance budgets.

## Expectations

- Coordinate with the Scanner/Scheduler/Notify teams on artifact availability.
- Maintain CI reliability; update `TASKS.md` as task states change.

# DevOps Release Automation

The **release** workflow builds and signs the StellaOps service containers,
generates SBOM and provenance attestations, and emits a canonical
`release.yaml`. The logic lives under `ops/devops/release/` and is invoked
by the `.gitea/workflows/release.yml` pipeline.

## Local dry run

```bash
./ops/devops/release/build_release.py \
  --version 2025.10.0-edge \
  --channel edge \
  --dry-run
```

Outputs land under `out/release/`. Use `--no-push` to run full builds without
pushing to the registry.

After the build completes, run the verifier to validate the recorded hashes and
artefact presence:

```bash
python ops/devops/release/verify_release.py --release-dir out/release
```

## Python analyzer smoke & signing

`dotnet run --project src/Tools/LanguageAnalyzerSmoke` exercises the Python language
analyzer plug-in against the golden fixtures (cold/warm timings, determinism). The
release workflow runs this harness automatically and then produces Cosign
signatures and SHA-256 sidecars for `StellaOps.Scanner.Analyzers.Lang.Python.dll`
and its `manifest.json`. Keep `COSIGN_KEY_REF`/`COSIGN_IDENTITY_TOKEN` populated so
the step can sign the artefacts; the generated `.sig`/`.sha256` files ship with the
Offline Kit bundle.

## Required tooling

- Docker 25+ with Buildx
- .NET 10 preview SDK (builds container stages and the SBOM generator)
- Node.js 20 (Angular UI build)
- Helm 3.16+
- Cosign 2.2+

Supply signing material via environment variables:

- `COSIGN_KEY_REF` – e.g. `file:./keys/cosign.key` or `azurekms://…`
- `COSIGN_PASSWORD` – password protecting the above key

The workflow defaults to multi-arch images (`linux/amd64,linux/arm64`), SBOMs in
CycloneDX format, and SLSA provenance (`https://slsa.dev/provenance/v1`).

## Debug store extraction

`build_release.py` exports stripped debug artefacts for every ELF discovered in the
published images. The files land under `out/release/debug/.build-id/<aa>/<rest>.debug`,
with metadata captured in `debug/debug-manifest.json` (and a `.sha256` sidecar). Use
`jq` to inspect the manifest or `readelf -n` to spot-check a build-id. Offline Kit
packaging should reuse the `debug/` directory as-is.
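The `.build-id` layout follows the usual GNU debug-store convention: the first two hex characters of the build-id become a directory and the remainder the file stem. A quick sketch (the build-id value here is made up):

```shell
# Derive the debug-store path for a given GNU build-id: first two hex chars
# form the directory, the rest the filename (sample id is hypothetical).
build_id="8d7aa1d9d0f4e5c3b2a1908f7e6d5c4b3a291807"
echo "out/release/debug/.build-id/${build_id:0:2}/${build_id:2}.debug"
# → out/release/debug/.build-id/8d/7aa1d9d0f4e5c3b2a1908f7e6d5c4b3a291807.debug
```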

## UI auth smoke (Playwright)

As part of **DEVOPS-UI-13-006** the pipelines execute the UI auth smoke
tests (`npm run test:e2e`) after building the Angular bundle. See
`docs/modules/ui/operations/auth-smoke.md` for the job design, environment stubs, and
offline-runner considerations.

## NuGet preview bootstrap

.NET 10 preview packages (Microsoft.Extensions.*, JwtBearer 10.0 RC, Sqlite 9 RC)
ship from the public `dotnet-public` Azure DevOps feed. We mirror them into
`./local-nuget` so restores succeed inside the Offline Kit.

1. Run `./ops/devops/sync-preview-nuget.sh` whenever you update the manifest.
2. The script understands the optional `SourceBase` column (V3 flat container)
   and writes packages alongside their SHA-256 checksums.
3. `NuGet.config` registers the mirror (`local`), dotnet-public, and nuget.org, in that order.
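The intended source ordering can be pictured as a `NuGet.config` sketch; the source keys and the dotnet-public feed URL below are assumptions for illustration, not copied from the repo:

```xml
<!-- Sketch only: the local mirror is listed first so restores prefer it. -->
<configuration>
  <packageSources>
    <clear />
    <add key="local" value="./local-nuget" />
    <add key="dotnet-public" value="https://pkgs.dev.azure.com/dnceng/public/_packaging/dotnet-public/nuget/v3/index.json" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```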

Use `python3 ops/devops/validate_restore_sources.py` to prove the repo still
prefers the local mirror and that `Directory.Build.props` enforces the same order.
The validator runs automatically in the `build-test-deploy` and `release`
workflows, so CI fails fast when a feed-priority regression slips in.

Detailed operator instructions live in `docs/modules/devops/runbooks/nuget-preview-bootstrap.md`.

## Telemetry collector tooling (DEVOPS-OBS-50-001)

- `ops/devops/telemetry/generate_dev_tls.sh` – generates a development CA and
  client/server certificates for the OpenTelemetry collector overlay (mutual TLS).
- `ops/devops/telemetry/smoke_otel_collector.py` – sends OTLP traces/metrics/logs
  over TLS and validates that the collector increments its receiver counters.
- `ops/devops/telemetry/package_offline_bundle.py` – re-packages collector assets for the Offline Kit.
- `deploy/compose/docker-compose.telemetry-storage.yaml` – Prometheus/Tempo/Loki stack for staging validation.

Combine these helpers with `deploy/compose/docker-compose.telemetry.yaml` to run
a secured collector locally before rolling out the Helm-based deployment.
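For orientation, the dev-TLS flow that `generate_dev_tls.sh` automates boils down to a throwaway CA plus a server certificate signed by it; this is a minimal `openssl` sketch with assumed names and lifetimes, not the script's actual output:

```shell
# Minimal dev-TLS sketch (CN values and 30-day lifetimes are assumptions):
# create a self-signed CA, then sign a collector server certificate with it.
cd "$(mktemp -d)"
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=dev-telemetry-ca" \
  -days 30 -keyout ca.key -out ca.crt
openssl req -newkey rsa:2048 -nodes -subj "/CN=otel-collector" \
  -keyout server.key -out server.csr
openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -days 30 -out server.crt
openssl verify -CAfile ca.crt server.crt   # prints "server.crt: OK"
```

The real script additionally issues client certificates so the collector can require mutual TLS.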
#!/usr/bin/env python3
"""Package telemetry collector assets for offline/air-gapped installs.

Outputs a tarball containing the collector configuration, Compose overlay,
Helm defaults, and operator README. A SHA-256 checksum sidecar is emitted, and
optional Cosign signing can be enabled with --sign.
"""
from __future__ import annotations

import argparse
import hashlib
import os
import subprocess
import sys
import tarfile
from pathlib import Path
from typing import Iterable

REPO_ROOT = Path(__file__).resolve().parents[3]
DEFAULT_OUTPUT = REPO_ROOT / "out" / "telemetry" / "telemetry-offline-bundle.tar.gz"
BUNDLE_CONTENTS: tuple[Path, ...] = (
    Path("deploy/telemetry/README.md"),
    Path("deploy/telemetry/otel-collector-config.yaml"),
    Path("deploy/telemetry/storage/README.md"),
    Path("deploy/telemetry/storage/prometheus.yaml"),
    Path("deploy/telemetry/storage/tempo.yaml"),
    Path("deploy/telemetry/storage/loki.yaml"),
    Path("deploy/telemetry/storage/tenants/tempo-overrides.yaml"),
    Path("deploy/telemetry/storage/tenants/loki-overrides.yaml"),
    Path("deploy/helm/stellaops/files/otel-collector-config.yaml"),
    Path("deploy/helm/stellaops/values.yaml"),
    Path("deploy/helm/stellaops/templates/otel-collector.yaml"),
    Path("deploy/compose/docker-compose.telemetry.yaml"),
    Path("deploy/compose/docker-compose.telemetry-storage.yaml"),
    Path("docs/modules/telemetry/operations/collector.md"),
    Path("docs/modules/telemetry/operations/storage.md"),
)


def compute_sha256(path: Path) -> str:
    sha = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            sha.update(chunk)
    return sha.hexdigest()


def validate_files(paths: Iterable[Path]) -> None:
    missing = [str(p) for p in paths if not (REPO_ROOT / p).exists()]
    if missing:
        raise FileNotFoundError(f"Missing bundle artefacts: {', '.join(missing)}")


def create_bundle(output_path: Path) -> Path:
    output_path.parent.mkdir(parents=True, exist_ok=True)
    with tarfile.open(output_path, "w:gz") as tar:
        for rel_path in BUNDLE_CONTENTS:
            abs_path = REPO_ROOT / rel_path
            tar.add(abs_path, arcname=str(rel_path))
    return output_path


def write_checksum(bundle_path: Path) -> Path:
    digest = compute_sha256(bundle_path)
    sha_path = bundle_path.with_suffix(bundle_path.suffix + ".sha256")
    # Two spaces between digest and filename so `sha256sum -c` accepts the sidecar.
    sha_path.write_text(f"{digest}  {bundle_path.name}\n", encoding="utf-8")
    return sha_path


def cosign_sign(bundle_path: Path, key_ref: str | None, identity_token: str | None) -> None:
    cmd = ["cosign", "sign-blob", "--yes", str(bundle_path)]
    if key_ref:
        cmd.extend(["--key", key_ref])
    env = os.environ.copy()
    if identity_token:
        env["COSIGN_IDENTITY_TOKEN"] = identity_token
    try:
        subprocess.run(cmd, check=True, env=env)
    except FileNotFoundError as exc:
        raise RuntimeError("cosign not found on PATH; install cosign or omit --sign") from exc
    except subprocess.CalledProcessError as exc:
        raise RuntimeError(f"cosign sign-blob failed: {exc}") from exc


def parse_args(argv: list[str] | None = None) -> argparse.Namespace:
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument(
        "--output",
        type=Path,
        default=DEFAULT_OUTPUT,
        help=f"Output bundle path (default: {DEFAULT_OUTPUT})",
    )
    parser.add_argument(
        "--sign",
        action="store_true",
        help="Sign the bundle using cosign (requires cosign on PATH)",
    )
    parser.add_argument(
        "--cosign-key",
        type=str,
        default=os.environ.get("COSIGN_KEY_REF"),
        help="Cosign key reference (file:..., azurekms://..., etc.)",
    )
    parser.add_argument(
        "--identity-token",
        type=str,
        default=os.environ.get("COSIGN_IDENTITY_TOKEN"),
        help="OIDC identity token for keyless signing",
    )
    return parser.parse_args(argv)


def main(argv: list[str] | None = None) -> int:
    args = parse_args(argv)
    validate_files(BUNDLE_CONTENTS)

    bundle_path = args.output.resolve()
    print(f"[*] Creating telemetry bundle at {bundle_path}")
    create_bundle(bundle_path)
    sha_path = write_checksum(bundle_path)
    print(f"[✓] SHA-256 written to {sha_path}")

    if args.sign:
        print("[*] Signing bundle with cosign")
        cosign_sign(bundle_path, args.cosign_key, args.identity_token)
        sig_path = bundle_path.with_suffix(bundle_path.suffix + ".sig")
        if sig_path.exists():
            print(f"[✓] Cosign signature written to {sig_path}")
        else:
            print("[!] Cosign completed but signature file not found (ensure cosign version >= 2.2)")

    return 0


if __name__ == "__main__":
    sys.exit(main())
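The checksum sidecar is meant to be consumed with standard tooling. A self-contained sketch of the round trip, using a sample file rather than the real bundle; note that GNU `sha256sum -c` requires two spaces between the digest and the filename:

```shell
# Write a sample "bundle", emit a sidecar in the two-space format
# `sha256sum -c` understands, then verify it as an operator would.
cd "$(mktemp -d)"
printf 'sample bundle contents\n' > telemetry-offline-bundle.tar.gz
digest="$(sha256sum telemetry-offline-bundle.tar.gz | cut -d' ' -f1)"
printf '%s  %s\n' "$digest" telemetry-offline-bundle.tar.gz \
  > telemetry-offline-bundle.tar.gz.sha256
sha256sum -c telemetry-offline-bundle.tar.gz.sha256
# prints "telemetry-offline-bundle.tar.gz: OK"
```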

# Licensing & Registry Access — Agent Charter

## Mission
Implement the licensing token service and registry access workflows described in `docs/modules/devops/ARCHITECTURE.md`.

# Offline Kit — Agent Charter

## Mission
Package the Offline Update Kit per `docs/modules/devops/ARCHITECTURE.md` and `docs/24_OFFLINE_KIT.md` with deterministic digests and import tooling.

#!/usr/bin/env bash
set -euo pipefail

repo_root="$(git -C "${BASH_SOURCE%/*}/.." rev-parse --show-toplevel 2>/dev/null || pwd)"
project_path="${repo_root}/src/Scanner/__Libraries/StellaOps.Scanner.Analyzers.Lang.Python/StellaOps.Scanner.Analyzers.Lang.Python.csproj"
output_dir="${repo_root}/out/analyzers/python"
plugin_dir="${repo_root}/plugins/scanner/analyzers/lang/StellaOps.Scanner.Analyzers.Lang.Python"

# Convert POSIX paths to Windows form when running under WSL so a
# Windows-hosted dotnet CLI can resolve them; pass paths through otherwise.
to_win_path() {
  if command -v wslpath >/dev/null 2>&1; then
    wslpath -w "$1"
  else
    printf '%s\n' "$1"
  fi
}

rm -rf "${output_dir}"
project_path_win="$(to_win_path "$project_path")"
output_dir_win="$(to_win_path "$output_dir")"

dotnet publish "$project_path_win" \
  --configuration Release \
  --output "$output_dir_win" \
  --self-contained false

# Stage the freshly published analyzer (and its symbols, if present) into the
# plug-in directory the smoke harness loads from.
mkdir -p "${plugin_dir}"
cp "${output_dir}/StellaOps.Scanner.Analyzers.Lang.Python.dll" "${plugin_dir}/"
if [[ -f "${output_dir}/StellaOps.Scanner.Analyzers.Lang.Python.pdb" ]]; then
  cp "${output_dir}/StellaOps.Scanner.Analyzers.Lang.Python.pdb" "${plugin_dir}/"
fi

repo_root_win="$(to_win_path "$repo_root")"
exec dotnet run \
  --project "${repo_root_win}/src/Tools/LanguageAnalyzerSmoke/LanguageAnalyzerSmoke.csproj" \
  --configuration Release \
  -- --repo-root "${repo_root_win}"