ops/devops/README.md (new file, 41 lines)
@@ -0,0 +1,41 @@
# DevOps Release Automation

The **release** workflow builds and signs the StellaOps service containers,
generates SBOM + provenance attestations, and emits a canonical
`release.yaml`. The logic lives under `ops/devops/release/` and is invoked
by the new `.gitea/workflows/release.yml` pipeline.

## Local dry run

```bash
./ops/devops/release/build_release.py \
  --version 2025.10.0-edge \
  --channel edge \
  --dry-run
```

Outputs land under `out/release/`. Use `--no-push` to run full builds without
pushing to the registry.

## Required tooling

- Docker 25+ with Buildx
- .NET 10 preview SDK (builds container stages and the SBOM generator)
- Node.js 20 (Angular UI build)
- Helm 3.16+
- Cosign 2.2+

Supply signing material via environment variables:

- `COSIGN_KEY_REF` – e.g. `file:./keys/cosign.key` or `azurekms://…`
- `COSIGN_PASSWORD` – password protecting the above key

The workflow defaults to multi-arch (`linux/amd64,linux/arm64`), SBOM in
CycloneDX, and SLSA provenance (`https://slsa.dev/provenance/v1`).

## UI auth smoke (Playwright)

As part of **DEVOPS-UI-13-006** the pipelines will execute the UI auth smoke
tests (`npm run test:e2e`) after building the Angular bundle. See
`docs/ops/ui-auth-smoke.md` for the job design, environment stubs, and
offline runner considerations.
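The signing inputs above can also be overridden on the command line (`--cosign-key`, `--cosign-password`); the flag wins over the environment variable. A minimal sketch of that precedence rule — `resolve_key_ref` is a hypothetical helper, not part of the script:

```python
import os

def resolve_key_ref(cli_value=None):
    # Hypothetical helper mirroring how build_release.py treats
    # --cosign-key vs COSIGN_KEY_REF: the CLI flag takes precedence.
    return cli_value or os.environ.get("COSIGN_KEY_REF")

os.environ["COSIGN_KEY_REF"] = "file:./keys/cosign.key"
from_env = resolve_key_ref()                               # falls back to the env var
from_cli = resolve_key_ref("azurekms://example-vault/key") # flag wins
```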
@@ -7,7 +7,7 @@
 | DEVOPS-SCANNER-09-205 | DONE (2025-10-21) | DevOps Guild, Notify Guild | DEVOPS-SCANNER-09-204 | Add Notify smoke stage that tails the Redis stream and asserts `scanner.report.ready`/`scanner.scan.completed` reach Notify WebService in staging. | CI job reads Redis stream during scanner smoke deploy, confirms Notify ingestion via API, alerts on failure. |
 | DEVOPS-PERF-10-001 | DONE | DevOps Guild | BENCH-SCANNER-10-001 | Add perf smoke job (SBOM compose <5 s target) to CI. | CI job runs sample build verifying <5 s; alerts configured. |
 | DEVOPS-PERF-10-002 | DONE (2025-10-23) | DevOps Guild | BENCH-SCANNER-10-002 | Publish analyzer bench metrics to Grafana/perf workbook and alarm on ≥20 % regressions. | CI exports JSON for dashboards; Grafana panel wired; Ops on-call doc updated with alert hook. |
-| DEVOPS-REL-14-001 | TODO | DevOps Guild | SIGNER-API-11-101, ATTESTOR-API-11-201 | Deterministic build/release pipeline with SBOM/provenance, signing, manifest generation. | CI pipeline produces signed images + SBOM/attestations, manifests published with verified hashes, docs updated. |
+| DEVOPS-REL-14-001 | DOING (2025-10-23) | DevOps Guild | SIGNER-API-11-101, ATTESTOR-API-11-201 | Deterministic build/release pipeline with SBOM/provenance, signing, manifest generation. | CI pipeline produces signed images + SBOM/attestations, manifests published with verified hashes, docs updated. |
 | DEVOPS-REL-14-004 | TODO | DevOps Guild, Scanner Guild | DEVOPS-REL-14-001, SCANNER-ANALYZERS-LANG-10-309P | Extend release/offline smoke jobs to exercise the Python analyzer plug-in (warm/cold scans, determinism, signature checks). | Release/Offline pipelines run Python analyzer smoke suite; alerts hooked; docs updated with new coverage matrix. |
 | DEVOPS-REL-17-002 | TODO | DevOps Guild | DEVOPS-REL-14-001, SCANNER-EMIT-17-701 | Persist stripped-debug artifacts organised by GNU build-id and bundle them into release/offline kits with checksum manifests. | CI job writes `.debug` files under `artifacts/debug/.build-id/`, manifest + checksums published, offline kit includes cache, smoke job proves symbol lookup via build-id. |
 | DEVOPS-MIRROR-08-001 | DONE (2025-10-19) | DevOps Guild | DEVOPS-REL-14-001 | Stand up managed mirror profiles for `*.stella-ops.org` (Concelier/Excititor), including Helm/Compose overlays, multi-tenant secrets, CDN caching, and sync documentation. | Infra overlays committed, CI smoke deploy hits mirror endpoints, runbooks published for downstream sync and quota management. |
@@ -15,8 +15,9 @@
 | DEVOPS-LAUNCH-18-100 | TODO | DevOps Guild | - | Finalise production environment footprint (clusters, secrets, network overlays) for full-platform go-live. | IaC/compose overlays committed, secrets placeholders documented, dry-run deploy succeeds in staging. |
 | DEVOPS-LAUNCH-18-900 | TODO | DevOps Guild, Module Leads | Wave 0 completion | Collect “full implementation” sign-off from module owners and consolidate launch readiness checklist. | Sign-off record stored under `docs/ops/launch-readiness.md`; outstanding gaps triaged; checklist approved. |
 | DEVOPS-LAUNCH-18-001 | TODO | DevOps Guild | DEVOPS-LAUNCH-18-100, DEVOPS-LAUNCH-18-900 | Production launch cutover rehearsal and runbook publication. | `docs/ops/launch-cutover.md` drafted, rehearsal executed with rollback drill, approvals captured. |
-| DEVOPS-NUGET-13-001 | TODO | DevOps Guild, Platform Leads | DEVOPS-REL-14-001 | Add .NET 10 preview feeds / local mirrors so `Microsoft.Extensions.*` 10.0 preview packages restore offline; refresh restore docs. | NuGet.config maps preview feeds (or local mirrored packages), `dotnet restore` succeeds for Excititor/Concelier solutions without ad-hoc feed edits, docs updated for offline bootstrap. |
+| DEVOPS-NUGET-13-001 | DOING (2025-10-24) | DevOps Guild, Platform Leads | DEVOPS-REL-14-001 | Add .NET 10 preview feeds / local mirrors so `Microsoft.Extensions.*` 10.0 preview packages restore offline; refresh restore docs. | NuGet.config maps preview feeds (or local mirrored packages), `dotnet restore` succeeds for Excititor/Concelier solutions without ad-hoc feed edits, docs updated for offline bootstrap. |
 | DEVOPS-NUGET-13-002 | TODO | DevOps Guild | DEVOPS-NUGET-13-001 | Ensure all solutions/projects prefer `local-nuget` before public sources and document restore order validation. | `NuGet.config` and solution-level configs resolve from `local-nuget` first; automated check verifies priority; docs updated for restore ordering. |
 | DEVOPS-NUGET-13-003 | TODO | DevOps Guild, Platform Leads | DEVOPS-NUGET-13-002 | Sweep `Microsoft.*` NuGet dependencies pinned to 8.* and upgrade to latest .NET 10 equivalents (or .NET 9 when 10 unavailable), updating restore guidance. | Dependency audit shows no 8.* `Microsoft.*` packages remaining; CI builds green; changelog/doc sections capture upgrade rationale. |
 | DEVOPS-UI-13-006 | TODO | DevOps Guild, UI Guild | UI-AUTH-13-001 | Add Playwright-based UI auth smoke job to CI/offline pipelines, wiring sample `/config.json` provisioning and reporting. | CI + Offline Kit run Playwright auth smoke (headless Chromium) post-build; job reuses stub config artifact, exports junit + trace on failure, docs updated under `docs/ops/ui-auth-smoke.md`. |
 > Remark (2025-10-20): Repacked `Mongo2Go` local feed to require MongoDB.Driver 3.5.0 + SharpCompress 0.41.0; cache regression tests green and NU1902/NU1903 suppressed.
+> Remark (2025-10-21): Compose/Helm profiles now surface `SCANNER__EVENTS__*` toggles with docs pointing at new `.env` placeholders.
ops/devops/nuget-preview-packages.csv (new file, 16 lines)
@@ -0,0 +1,16 @@
# Package,Version,SHA256
Microsoft.Extensions.Caching.Memory,10.0.0-preview.7.25380.108,8721fd1420fea6e828963c8343cd83605902b663385e8c9060098374139f9b2f
Microsoft.Extensions.Configuration,10.0.0-preview.7.25380.108,5a17ba4ba47f920a04ae51d80560833da82a0926d1e462af0d11c16b5da969f4
Microsoft.Extensions.Configuration.Binder,10.0.0-preview.7.25380.108,5a3af17729241e205fe8fbb1d458470e9603935ab2eb67cbbb06ce51265ff68f
Microsoft.Extensions.DependencyInjection.Abstractions,10.0.0-preview.7.25380.108,1e9cd330d7833a3a850a7a42bbe0c729906c60bf1c359ad30a8622b50da4399b
Microsoft.Extensions.Hosting,10.0.0-preview.7.25380.108,3123bb019bbc0182cf7ac27f30018ca620929f8027e137bd5bdfb952037c7d29
Microsoft.Extensions.Hosting.Abstractions,10.0.0-preview.7.25380.108,b57625436c9eb53e3aa27445b680bb93285d0d2c91007bbc221b0c378ab016a3
Microsoft.Extensions.Http,10.0.0-preview.7.25380.108,daec142b7c7bd09ec1f2a86bfc3d7fe009825f5b653d310bc9e959c0a98a0f19
Microsoft.Extensions.Logging.Abstractions,10.0.0-preview.7.25380.108,87a495fa0b7054e134a5cf44ec8b071fe2bc3ddfb27e9aefc6375701dca2a33a
Microsoft.Extensions.Options,10.0.0-preview.7.25380.108,c0657c2be3b7b894024586cf6e46a2ebc0e710db64d2645c4655b893b8487d8a
Microsoft.Extensions.DependencyInjection.Abstractions,9.0.0,0a7715c24299e42b081b63b4f8e33da97b985e1de9e941b2b9e4c748b0d52fe7
Microsoft.Extensions.Logging.Abstractions,9.0.0,8814ecf6dc2359715e111b78084ae42087282595358eb775456088f15e63eca5
Microsoft.Extensions.Options,9.0.0,0d3e5eb80418fc8b41e4b3c8f16229e839ddd254af0513f7e6f1643970baf1c9
Microsoft.Extensions.Options.ConfigurationExtensions,9.0.0,af5677b04552223787d942a3f8a323f3a85aafaf20ff3c9b4aaa128c44817280
Microsoft.Data.Sqlite,9.0.0-rc.1.24451.1,770b637317e1e924f1b13587b31af0787c8c668b1d9f53f2fccae8ee8704e167
Microsoft.AspNetCore.Authentication.JwtBearer,10.0.0-rc.1.25451.107,05f168c2db7ba79230e3fd77e84f6912bc73721c6656494df0b227867a6c2d3c
ops/devops/release/build_release.py (new file, 630 lines)
@@ -0,0 +1,630 @@
#!/usr/bin/env python3
"""Deterministic release pipeline helper for StellaOps.

This script builds service containers, generates SBOM and provenance artefacts,
signs them with cosign, and writes a channel-specific release manifest.

The workflow expects external tooling to be available on PATH:
- docker (with buildx)
- cosign
- helm
- npm / node (for the UI build)
- dotnet SDK (for BuildX plugin publication)
"""
from __future__ import annotations

import argparse
import datetime as dt
import hashlib
import json
import os
import pathlib
import re
import shlex
import subprocess
import sys
import tempfile
from collections import OrderedDict
from typing import Any, Dict, Iterable, List, Mapping, MutableMapping, Optional, Sequence

REPO_ROOT = pathlib.Path(__file__).resolve().parents[3]
DEFAULT_CONFIG = REPO_ROOT / "ops/devops/release/components.json"


class CommandError(RuntimeError):
    pass


def run(cmd: Sequence[str], *, cwd: Optional[pathlib.Path] = None, env: Optional[Mapping[str, str]] = None, capture: bool = True) -> str:
    """Run a subprocess command, returning stdout (text)."""
    process_env = os.environ.copy()
    if env:
        process_env.update(env)
    result = subprocess.run(
        list(cmd),
        cwd=str(cwd) if cwd else None,
        env=process_env,
        check=False,
        capture_output=capture,
        text=True,
    )
    if process_env.get("STELLAOPS_RELEASE_DEBUG"):
        sys.stderr.write(f"[debug] {' '.join(shlex.quote(c) for c in cmd)}\n")
        if capture:
            sys.stderr.write(result.stdout)
            sys.stderr.write(result.stderr)
    if result.returncode != 0:
        stdout = result.stdout if capture else ""
        stderr = result.stderr if capture else ""
        raise CommandError(f"Command failed ({result.returncode}): {' '.join(cmd)}\nSTDOUT:\n{stdout}\nSTDERR:\n{stderr}")
    return result.stdout if capture else ""
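The `run` helper wraps `subprocess.run` so that every external tool failure surfaces as a `CommandError` instead of a silent non-zero exit. A condensed, standalone sketch of that error contract (names match the script; the body is simplified):

```python
import subprocess
import sys

class CommandError(RuntimeError):
    """Raised when an external tool exits non-zero."""

def run(cmd, capture=True):
    # Capture text output and raise on failure, like the script's helper.
    result = subprocess.run(list(cmd), capture_output=capture, text=True, check=False)
    if result.returncode != 0:
        raise CommandError(f"Command failed ({result.returncode}): {' '.join(cmd)}")
    return result.stdout if capture else ""

out = run([sys.executable, "-c", "print('ok')"])  # succeeds, returns stdout
try:
    run([sys.executable, "-c", "import sys; sys.exit(3)"])
    failed = False
except CommandError:
    failed = True  # non-zero exit is turned into an exception
```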
def load_json_config(path: pathlib.Path) -> Dict[str, Any]:
    with path.open("r", encoding="utf-8") as handle:
        return json.load(handle)


def ensure_directory(path: pathlib.Path) -> pathlib.Path:
    path.mkdir(parents=True, exist_ok=True)
    return path


def compute_sha256(path: pathlib.Path) -> str:
    sha = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            sha.update(chunk)
    return sha.hexdigest()


def format_scalar(value: Any) -> str:
    if isinstance(value, bool):
        return "true" if value else "false"
    if value is None:
        return "null"
    if isinstance(value, (int, float)):
        return str(value)
    text = str(value)
    if text == "":
        return '""'
    if re.search(r"[\s:#\-\[\]\{\}]", text):
        return json.dumps(text, ensure_ascii=False)
    return text


def _yaml_lines(value: Any, indent: int = 0) -> List[str]:
    pad = " " * indent
    if isinstance(value, Mapping):
        lines: List[str] = []
        for key, val in value.items():
            if isinstance(val, (Mapping, list)):
                lines.append(f"{pad}{key}:")
                lines.extend(_yaml_lines(val, indent + 1))
            else:
                lines.append(f"{pad}{key}: {format_scalar(val)}")
        if not lines:
            lines.append(f"{pad}{{}}")
        return lines
    if isinstance(value, list):
        lines = []
        if not value:
            lines.append(f"{pad}[]")
            return lines
        for item in value:
            if isinstance(item, (Mapping, list)):
                lines.append(f"{pad}-")
                lines.extend(_yaml_lines(item, indent + 1))
            else:
                lines.append(f"{pad}- {format_scalar(item)}")
        return lines
    return [f"{pad}{format_scalar(value)}"]


def dump_yaml(data: Mapping[str, Any]) -> str:
    lines: List[str] = _yaml_lines(data)
    return "\n".join(lines) + "\n"
def utc_now_iso() -> str:
    return dt.datetime.now(tz=dt.timezone.utc).replace(microsecond=0).isoformat().replace("+00:00", "Z")


def sanitize_calendar(version: str, explicit: Optional[str]) -> str:
    if explicit:
        return explicit
    # Expect version like 2025.10.0-edge or 2.4.1
    parts = re.findall(r"\d+", version)
    if len(parts) >= 2:
        return f"{parts[0]}.{parts[1]}"
    return dt.datetime.now(tz=dt.timezone.utc).strftime("%Y.%m")
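`sanitize_calendar` derives the `YYYY.MM` calendar tag from the first two numeric groups of the version string when no explicit value is given. The extraction step in isolation (`derive_calendar` is an illustrative name; the script's function also has an explicit-value and current-date fallback):

```python
import re

def derive_calendar(version):
    # Take the first two numeric groups: "2025.10.0-edge" -> "2025.10".
    parts = re.findall(r"\d+", version)
    if len(parts) >= 2:
        return f"{parts[0]}.{parts[1]}"
    return None

cal = derive_calendar("2025.10.0-edge")
cal_semver = derive_calendar("2.4.1")
```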
class ReleaseBuilder:
    def __init__(
        self,
        *,
        repo_root: pathlib.Path,
        config: Mapping[str, Any],
        version: str,
        channel: str,
        calendar: str,
        release_date: str,
        git_sha: str,
        output_dir: pathlib.Path,
        push: bool,
        dry_run: bool,
        registry_override: Optional[str] = None,
        platforms_override: Optional[Sequence[str]] = None,
        skip_signing: bool = False,
        cosign_key_ref: Optional[str] = None,
        cosign_password: Optional[str] = None,
        cosign_identity_token: Optional[str] = None,
        tlog_upload: bool = True,
    ) -> None:
        self.repo_root = repo_root
        self.config = config
        self.version = version
        self.channel = channel
        self.calendar = calendar
        self.release_date = release_date
        self.git_sha = git_sha
        self.output_dir = ensure_directory(output_dir)
        self.push = push
        self.dry_run = dry_run
        self.registry = registry_override or config.get("registry")
        if not self.registry:
            raise ValueError("Config missing 'registry'")
        platforms = list(platforms_override) if platforms_override else config.get("platforms")
        if not platforms:
            platforms = ["linux/amd64", "linux/arm64"]
        self.platforms = list(platforms)
        self.source_date_epoch = str(int(dt.datetime.fromisoformat(release_date.replace("Z", "+00:00")).timestamp()))
        self.artifacts_dir = ensure_directory(self.output_dir / "artifacts")
        self.sboms_dir = ensure_directory(self.artifacts_dir / "sboms")
        self.provenance_dir = ensure_directory(self.artifacts_dir / "provenance")
        self.signature_dir = ensure_directory(self.artifacts_dir / "signatures")
        self.metadata_dir = ensure_directory(self.artifacts_dir / "metadata")
        self.temp_dir = pathlib.Path(tempfile.mkdtemp(prefix="stellaops-release-"))
        self.skip_signing = skip_signing
        self.tlog_upload = tlog_upload
        self.cosign_key_ref = cosign_key_ref or os.environ.get("COSIGN_KEY_REF")
        self.cosign_identity_token = cosign_identity_token or os.environ.get("COSIGN_IDENTITY_TOKEN")
        password = cosign_password if cosign_password is not None else os.environ.get("COSIGN_PASSWORD", "")
        self.cosign_env = {
            "COSIGN_PASSWORD": password,
            "COSIGN_EXPERIMENTAL": "1",
            "COSIGN_ALLOW_HTTP_REGISTRY": os.environ.get("COSIGN_ALLOW_HTTP_REGISTRY", "1"),
            "COSIGN_DOCKER_MEDIA_TYPES": os.environ.get("COSIGN_DOCKER_MEDIA_TYPES", "1"),
        }
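The constructor derives `SOURCE_DATE_EPOCH` from the release date so container layers get a reproducible timestamp. The conversion in isolation — note the `Z`-suffix replacement, since `datetime.fromisoformat` on Python < 3.11 does not accept a trailing `Z`:

```python
import datetime as dt

release_date = "2025-10-24T00:00:00Z"
# Swap the "Z" for an explicit offset so fromisoformat parses it as aware UTC.
epoch = int(dt.datetime.fromisoformat(release_date.replace("Z", "+00:00")).timestamp())
# Round-trip back to an ISO string to confirm the epoch is exact.
round_trip = dt.datetime.fromtimestamp(epoch, tz=dt.timezone.utc).isoformat()
```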
    # ----------------
    # Build steps
    # ----------------
    def run(self) -> Dict[str, Any]:
        components_result = []
        if self.dry_run:
            print("⚠️ Dry-run enabled; commands will be skipped")
        self._prime_buildx_plugin()
        for component in self.config.get("components", []):
            result = self._build_component(component)
            components_result.append(result)
        helm_meta = self._package_helm()
        compose_meta = self._digest_compose_files()
        manifest = self._compose_manifest(components_result, helm_meta, compose_meta)
        return manifest

    def _prime_buildx_plugin(self) -> None:
        plugin_cfg = self.config.get("buildxPlugin")
        if not plugin_cfg:
            return
        project = plugin_cfg.get("project")
        if not project:
            return
        out_dir = ensure_directory(self.temp_dir / "buildx")
        if not self.dry_run:
            run([
                "dotnet",
                "publish",
                project,
                "-c",
                "Release",
                "-o",
                str(out_dir),
            ])
            cas_dir = ensure_directory(self.temp_dir / "cas")
            run([
                "dotnet",
                str(out_dir / "StellaOps.Scanner.Sbomer.BuildXPlugin.dll"),
                "handshake",
                "--manifest",
                str(out_dir),
                "--cas",
                str(cas_dir),
            ])
    def _component_tags(self, repo: str) -> List[str]:
        base = f"{self.registry}/{repo}"
        tags = [f"{base}:{self.version}"]
        if self.channel:
            tags.append(f"{base}:{self.channel}")
        return tags

    def _component_ref(self, repo: str, digest: str) -> str:
        return f"{self.registry}/{repo}@{digest}"
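Each component image gets a version tag plus an optional channel tag, while the manifest pins the image by digest. A standalone sketch of the same shape (module-level functions instead of methods):

```python
def component_tags(registry, repo, version, channel):
    # Version tag always; channel tag (e.g. ":edge") only when a channel is set.
    base = f"{registry}/{repo}"
    tags = [f"{base}:{version}"]
    if channel:
        tags.append(f"{base}:{channel}")
    return tags

def component_ref(registry, repo, digest):
    # Digest-pinned reference used in release.yaml.
    return f"{registry}/{repo}@{digest}"

tags = component_tags("registry.stella-ops.org/stellaops", "authority", "2025.10.0-edge", "edge")
ref = component_ref("registry.stella-ops.org/stellaops", "authority", "sha256:abc123")
```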
    def _build_component(self, component: Mapping[str, Any]) -> Mapping[str, Any]:
        name = component["name"]
        repo = component.get("repository", name)
        kind = component.get("kind", "dotnet-service")
        dockerfile = component.get("dockerfile")
        if not dockerfile:
            raise ValueError(f"Component {name} missing dockerfile")
        context = component.get("context", ".")
        iid_file = self.temp_dir / f"{name}.iid"
        metadata_file = self.metadata_dir / f"{name}.metadata.json"

        build_args = {
            "VERSION": self.version,
            "CHANNEL": self.channel,
            "GIT_SHA": self.git_sha,
            "SOURCE_DATE_EPOCH": self.source_date_epoch,
        }
        docker_cfg = self.config.get("docker", {})
        if kind == "dotnet-service":
            build_args.update({
                "PROJECT": component["project"],
                "ENTRYPOINT_DLL": component["entrypoint"],
                "SDK_IMAGE": docker_cfg.get("sdkImage", "mcr.microsoft.com/dotnet/nightly/sdk:10.0"),
                "RUNTIME_IMAGE": docker_cfg.get("runtimeImage", "gcr.io/distroless/dotnet/aspnet:latest"),
            })
        elif kind == "angular-ui":
            build_args.update({
                "NODE_IMAGE": docker_cfg.get("nodeImage", "node:20.14.0-bookworm"),
                "NGINX_IMAGE": docker_cfg.get("nginxImage", "nginx:1.27-alpine"),
            })
        else:
            raise ValueError(f"Unsupported component kind {kind}")

        tags = self._component_tags(repo)
        build_cmd = [
            "docker",
            "buildx",
            "build",
            "--file",
            dockerfile,
            "--metadata-file",
            str(metadata_file),
            "--iidfile",
            str(iid_file),
            "--progress",
            "plain",
            "--platform",
            ",".join(self.platforms),
        ]
        for key, value in build_args.items():
            build_cmd.extend(["--build-arg", f"{key}={value}"])
        for tag in tags:
            build_cmd.extend(["--tag", tag])
        build_cmd.extend([
            "--attest",
            "type=sbom",
            "--attest",
            "type=provenance,mode=max",
        ])
        if self.push:
            build_cmd.append("--push")
        else:
            build_cmd.append("--load")
        build_cmd.append(context)

        if not self.dry_run:
            run(build_cmd, cwd=self.repo_root)

        digest = iid_file.read_text(encoding="utf-8").strip() if iid_file.exists() else ""
        image_ref = self._component_ref(repo, digest) if digest else ""

        bundle_info = self._sign_image(name, image_ref, tags)
        sbom_info = self._generate_sbom(name, image_ref)
        provenance_info = self._attach_provenance(name, image_ref)

        component_entry = OrderedDict()
        component_entry["name"] = name
        if digest:
            component_entry["image"] = image_ref
        component_entry["tags"] = tags
        if sbom_info:
            component_entry["sbom"] = sbom_info
        if provenance_info:
            component_entry["provenance"] = provenance_info
        if bundle_info:
            component_entry["signature"] = bundle_info
        if metadata_file.exists():
            component_entry["metadata"] = str(metadata_file.relative_to(self.output_dir.parent)) if metadata_file.is_relative_to(self.output_dir.parent) else str(metadata_file)
        return component_entry
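`_build_component` assembles one long `docker buildx build` invocation: build args, tags, SBOM/provenance attestations, then `--push` or `--load`. A simplified sketch of just that argument assembly (omitting `--file`, `--platform`, and the context argument):

```python
def buildx_flags(build_args, tags, push):
    # Mirrors the flag-assembly order in _build_component, flags only.
    cmd = ["docker", "buildx", "build"]
    for key, value in build_args.items():
        cmd.extend(["--build-arg", f"{key}={value}"])
    for tag in tags:
        cmd.extend(["--tag", tag])
    cmd.extend(["--attest", "type=sbom", "--attest", "type=provenance,mode=max"])
    # Without --push, buildx loads the image into the local daemon instead.
    cmd.append("--push" if push else "--load")
    return cmd

cmd = buildx_flags({"VERSION": "2025.10.0-edge"}, ["registry.example/authority:edge"], push=False)
```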
    def _sign_image(self, name: str, image_ref: str, tags: Sequence[str]) -> Optional[Mapping[str, Any]]:
        if self.skip_signing:
            return None
        if not image_ref:
            return None
        if not (self.cosign_key_ref or self.cosign_identity_token):
            raise ValueError("Signing requested but no cosign key or identity token provided. Use --skip-signing to bypass.")
        signature_path = self.signature_dir / f"{name}.signature"
        cmd = ["cosign", "sign", "--yes"]
        if self.cosign_key_ref:
            cmd.extend(["--key", self.cosign_key_ref])
        if self.cosign_identity_token:
            cmd.extend(["--identity-token", self.cosign_identity_token])
        if not self.tlog_upload:
            cmd.append("--tlog-upload=false")
        cmd.append("--allow-http-registry")
        cmd.append(image_ref)
        if self.dry_run:
            return None
        run(cmd, env=self.cosign_env)
        signature_data = run([
            "cosign",
            "download",
            "signature",
            "--allow-http-registry",
            image_ref,
        ])
        signature_path.write_text(signature_data, encoding="utf-8")
        signature_ref = run([
            "cosign",
            "triangulate",
            "--allow-http-registry",
            image_ref,
        ]).strip()
        return OrderedDict(
            (
                ("signature", OrderedDict((
                    ("path", str(signature_path.relative_to(self.output_dir.parent)) if signature_path.is_relative_to(self.output_dir.parent) else str(signature_path)),
                    ("ref", signature_ref),
                    ("tlogUploaded", self.tlog_upload),
                ))),
            )
        )
    def _generate_sbom(self, name: str, image_ref: str) -> Optional[Mapping[str, Any]]:
        if not image_ref or self.dry_run:
            return None
        sbom_path = self.sboms_dir / f"{name}.cyclonedx.json"
        run([
            "docker",
            "sbom",
            image_ref,
            "--format",
            "cyclonedx-json",
            "--output",
            str(sbom_path),
        ])
        entry = OrderedDict((
            ("path", str(sbom_path.relative_to(self.output_dir.parent)) if sbom_path.is_relative_to(self.output_dir.parent) else str(sbom_path)),
            ("sha256", compute_sha256(sbom_path)),
        ))
        if self.skip_signing:
            return entry
        attach_cmd = [
            "cosign",
            "attach",
            "sbom",
            "--sbom",
            str(sbom_path),
            "--type",
            "cyclonedx",
        ]
        if self.cosign_key_ref:
            attach_cmd.extend(["--key", self.cosign_key_ref])
        attach_cmd.append("--allow-http-registry")
        attach_cmd.append(image_ref)
        run(attach_cmd, env=self.cosign_env)
        reference = run(["cosign", "triangulate", "--type", "sbom", "--allow-http-registry", image_ref]).strip()
        entry["ref"] = reference
        return entry
    def _attach_provenance(self, name: str, image_ref: str) -> Optional[Mapping[str, Any]]:
        if not image_ref or self.dry_run:
            return None
        predicate = OrderedDict()
        predicate["buildDefinition"] = OrderedDict(
            (
                ("buildType", "https://git.stella-ops.org/stellaops/release"),
                ("externalParameters", OrderedDict((
                    ("component", name),
                    ("version", self.version),
                    ("channel", self.channel),
                ))),
            )
        )
        predicate["runDetails"] = OrderedDict(
            (
                ("builder", OrderedDict((("id", "https://github.com/actions"),))),
                ("metadata", OrderedDict((("finishedOn", self.release_date),))),
            )
        )
        predicate_path = self.provenance_dir / f"{name}.provenance.json"
        with predicate_path.open("w", encoding="utf-8") as handle:
            json.dump(predicate, handle, indent=2, sort_keys=True)
            handle.write("\n")
        entry = OrderedDict((
            ("path", str(predicate_path.relative_to(self.output_dir.parent)) if predicate_path.is_relative_to(self.output_dir.parent) else str(predicate_path)),
            ("sha256", compute_sha256(predicate_path)),
        ))
        if self.skip_signing:
            return entry
        cmd = [
            "cosign",
            "attest",
            "--predicate",
            str(predicate_path),
            "--type",
            "https://slsa.dev/provenance/v1",
        ]
        if self.cosign_key_ref:
            cmd.extend(["--key", self.cosign_key_ref])
        if not self.tlog_upload:
            cmd.append("--tlog-upload=false")
        cmd.append("--allow-http-registry")
        cmd.append(image_ref)
        run(cmd, env=self.cosign_env)
        ref = run([
            "cosign",
            "triangulate",
            "--type",
            "https://slsa.dev/provenance/v1",
            "--allow-http-registry",
            image_ref,
        ]).strip()
        entry["ref"] = ref
        return entry
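The provenance predicate is serialised with `indent=2, sort_keys=True` before it is hashed and attested, so the same inputs always yield the same digest. A standalone sketch of that determinism (the predicate values here are illustrative samples):

```python
import hashlib
import json

# Sample predicate mirroring the shape built in _attach_provenance.
predicate = {
    "buildDefinition": {
        "buildType": "https://git.stella-ops.org/stellaops/release",
        "externalParameters": {"component": "authority", "version": "2025.10.0-edge", "channel": "edge"},
    },
    "runDetails": {
        "builder": {"id": "https://github.com/actions"},
        "metadata": {"finishedOn": "2025-10-24T00:00:00Z"},
    },
}
# Serialise exactly as the script does; sorted keys make the digest stable.
payload = json.dumps(predicate, indent=2, sort_keys=True) + "\n"
digest = hashlib.sha256(payload.encode("utf-8")).hexdigest()
same_digest = hashlib.sha256((json.dumps(predicate, indent=2, sort_keys=True) + "\n").encode("utf-8")).hexdigest()
```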
    # ----------------
    # Helm + compose
    # ----------------
    def _package_helm(self) -> Optional[Mapping[str, Any]]:
        helm_cfg = self.config.get("helm")
        if not helm_cfg:
            return None
        chart_path = helm_cfg.get("chartPath")
        if not chart_path:
            return None
        chart_dir = self.repo_root / chart_path
        output_dir = ensure_directory(self.output_dir / "helm")
        archive_path = output_dir / f"stellaops-{self.version}.tgz"
        if not self.dry_run:
            cmd = [
                "helm",
                "package",
                str(chart_dir),
                "--destination",
                str(output_dir),
                "--version",
                self.version,
                "--app-version",
                self.version,
            ]
            run(cmd)
            packaged = next(output_dir.glob("*.tgz"), None)
            if packaged and packaged != archive_path:
                packaged.rename(archive_path)
        digest = compute_sha256(archive_path) if archive_path.exists() else None
        if archive_path.exists() and archive_path.is_relative_to(self.output_dir):
            manifest_path = str(archive_path.relative_to(self.output_dir))
        elif archive_path.exists() and archive_path.is_relative_to(self.output_dir.parent):
            manifest_path = str(archive_path.relative_to(self.output_dir.parent))
        else:
            manifest_path = f"helm/{archive_path.name}"
        return OrderedDict((
            ("name", "stellaops"),
            ("version", self.version),
            ("path", manifest_path),
            ("sha256", digest),
        ))

    def _digest_compose_files(self) -> List[Mapping[str, Any]]:
        compose_cfg = self.config.get("compose", {})
        files = compose_cfg.get("files", [])
        entries: List[Mapping[str, Any]] = []
        for rel_path in files:
            src = self.repo_root / rel_path
            if not src.exists():
                continue
            digest = compute_sha256(src)
            entries.append(OrderedDict((
                ("name", pathlib.Path(rel_path).name),
                ("path", rel_path),
                ("sha256", digest),
            )))
        return entries
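`_digest_compose_files` hashes each configured compose file and silently skips paths that do not exist in the checkout. The same skip-missing behaviour in a self-contained sketch:

```python
import hashlib
import pathlib
import tempfile

def digest_files(root, rel_paths):
    # Hash each file under root; missing files are skipped, not errors.
    entries = []
    for rel in rel_paths:
        src = root / rel
        if not src.exists():
            continue
        entries.append({
            "name": pathlib.Path(rel).name,
            "path": rel,
            "sha256": hashlib.sha256(src.read_bytes()).hexdigest(),
        })
    return entries

with tempfile.TemporaryDirectory() as tmp:
    root = pathlib.Path(tmp)
    (root / "docker-compose.prod.yaml").write_text("services: {}\n", encoding="utf-8")
    entries = digest_files(root, ["docker-compose.prod.yaml", "missing.yaml"])
```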
    # ----------------
    # Manifest assembly
    # ----------------
    def _compose_manifest(
        self,
        components: List[Mapping[str, Any]],
        helm_meta: Optional[Mapping[str, Any]],
        compose_meta: List[Mapping[str, Any]],
    ) -> Dict[str, Any]:
        manifest = OrderedDict()
        manifest["release"] = OrderedDict((
            ("version", self.version),
            ("channel", self.channel),
            ("date", self.release_date),
            ("calendar", self.calendar),
        ))
        manifest["components"] = components
        if helm_meta:
            manifest["charts"] = [helm_meta]
        if compose_meta:
            manifest["compose"] = compose_meta
        return manifest
def parse_args(argv: Optional[Sequence[str]] = None) -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="Build StellaOps release artefacts deterministically")
    parser.add_argument("--config", type=pathlib.Path, default=DEFAULT_CONFIG, help="Path to release config JSON")
    parser.add_argument("--version", required=True, help="Release version string (e.g. 2025.10.0-edge)")
    parser.add_argument("--channel", required=True, help="Release channel (edge|stable|lts)")
    parser.add_argument("--calendar", help="Calendar tag (YYYY.MM); defaults derived from version")
    parser.add_argument("--git-sha", default=os.environ.get("GIT_COMMIT", "unknown"), help="Git revision to embed")
    parser.add_argument("--output", type=pathlib.Path, default=REPO_ROOT / "out/release", help="Output directory for artefacts")
    parser.add_argument("--no-push", action="store_true", help="Do not push images (use docker load)")
    parser.add_argument("--dry-run", action="store_true", help="Print steps without executing commands")
    parser.add_argument("--registry", help="Override registry root (e.g. localhost:5000/stellaops)")
    parser.add_argument("--platform", dest="platforms", action="append", metavar="PLATFORM", help="Override build platforms (repeatable)")
    parser.add_argument("--skip-signing", action="store_true", help="Skip cosign signing/attestation steps")
    parser.add_argument("--cosign-key", dest="cosign_key", help="Override COSIGN_KEY_REF value")
    parser.add_argument("--cosign-password", dest="cosign_password", help="Password for cosign key")
    parser.add_argument("--cosign-identity-token", dest="cosign_identity_token", help="Identity token for keyless cosign flows")
    parser.add_argument("--no-transparency", action="store_true", help="Disable Rekor transparency log upload during signing")
    return parser.parse_args(argv)
def write_manifest(manifest: Mapping[str, Any], output_dir: pathlib.Path) -> pathlib.Path:
    # Copy manifest to avoid mutating input when computing checksum
    base_manifest = OrderedDict(manifest)
    yaml_without_checksum = dump_yaml(base_manifest)
    digest = hashlib.sha256(yaml_without_checksum.encode("utf-8")).hexdigest()
    manifest_with_checksum = OrderedDict(base_manifest)
    manifest_with_checksum["checksums"] = OrderedDict((("sha256", digest),))
    final_yaml = dump_yaml(manifest_with_checksum)
    output_path = output_dir / "release.yaml"
    with output_path.open("w", encoding="utf-8") as handle:
        handle.write(final_yaml)
    return output_path
def main(argv: Optional[Sequence[str]] = None) -> int:
|
||||
args = parse_args(argv)
|
||||
config = load_json_config(args.config)
|
||||
release_date = utc_now_iso()
|
||||
calendar = sanitize_calendar(args.version, args.calendar)
|
||||
builder = ReleaseBuilder(
|
||||
repo_root=REPO_ROOT,
|
||||
config=config,
|
||||
version=args.version,
|
||||
channel=args.channel,
|
||||
calendar=calendar,
|
||||
release_date=release_date,
|
||||
git_sha=args.git_sha,
|
||||
output_dir=args.output,
|
||||
push=not args.no_push,
|
||||
dry_run=args.dry_run,
|
||||
registry_override=args.registry,
|
||||
platforms_override=args.platforms,
|
||||
skip_signing=args.skip_signing,
|
||||
cosign_key_ref=args.cosign_key,
|
||||
cosign_password=args.cosign_password,
|
||||
cosign_identity_token=args.cosign_identity_token,
|
||||
tlog_upload=not args.no_transparency,
|
||||
)
|
||||
manifest = builder.run()
|
||||
manifest_path = write_manifest(manifest, builder.output_dir)
|
||||
print(f"✅ Release manifest written to {manifest_path}")
|
||||
return 0
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
raise SystemExit(main())
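Downstream consumers can verify the `write_manifest` checksum scheme by recomputing the digest over the manifest with its `checksums` block removed. A minimal sketch — `dump_yaml_stub` is a stand-in for the script's real `dump_yaml` (defined elsewhere in this file), so only the checksum mechanics are illustrated:

```python
import hashlib
from collections import OrderedDict

def dump_yaml_stub(data):
    # Stand-in serializer: flat string values only, deterministic ordering.
    return "".join(f"{key}: {value}\n" for key, value in data.items())

def verify_release_manifest(manifest):
    # Recompute the digest over the manifest *without* its checksums block,
    # mirroring how write_manifest computed it before appending "checksums".
    body = OrderedDict((k, v) for k, v in manifest.items() if k != "checksums")
    digest = hashlib.sha256(dump_yaml_stub(body).encode("utf-8")).hexdigest()
    return digest == manifest.get("checksums", {}).get("sha256")

# Build a manifest the same way write_manifest does, then verify it round-trips.
base = OrderedDict((("version", "2025.10.0-edge"), ("channel", "edge")))
digest = hashlib.sha256(dump_yaml_stub(base).encode("utf-8")).hexdigest()
base["checksums"] = OrderedDict((("sha256", digest),))
print(verify_release_manifest(base))  # → True
```

A real verifier must serialize with the same `dump_yaml` implementation as the release script, otherwise digests will not match byte-for-byte.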
97
ops/devops/release/components.json
Normal file
@@ -0,0 +1,97 @@
{
  "registry": "registry.stella-ops.org/stellaops",
  "platforms": ["linux/amd64", "linux/arm64"],
  "defaultChannel": "edge",
  "docker": {
    "sdkImage": "mcr.microsoft.com/dotnet/nightly/sdk:10.0",
    "runtimeImage": "mcr.microsoft.com/dotnet/nightly/aspnet:10.0",
    "nodeImage": "node:20.14.0-bookworm",
    "nginxImage": "nginx:1.27-alpine"
  },
  "components": [
    {
      "name": "authority",
      "repository": "authority",
      "kind": "dotnet-service",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
      "project": "src/StellaOps.Authority/StellaOps.Authority/StellaOps.Authority.csproj",
      "entrypoint": "StellaOps.Authority.dll"
    },
    {
      "name": "signer",
      "repository": "signer",
      "kind": "dotnet-service",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
      "project": "src/StellaOps.Signer/StellaOps.Signer.WebService/StellaOps.Signer.WebService.csproj",
      "entrypoint": "StellaOps.Signer.WebService.dll"
    },
    {
      "name": "attestor",
      "repository": "attestor",
      "kind": "dotnet-service",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
      "project": "src/StellaOps.Attestor/StellaOps.Attestor.WebService/StellaOps.Attestor.WebService.csproj",
      "entrypoint": "StellaOps.Attestor.WebService.dll"
    },
    {
      "name": "scanner-web",
      "repository": "scanner-web",
      "kind": "dotnet-service",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
      "project": "src/StellaOps.Scanner.WebService/StellaOps.Scanner.WebService.csproj",
      "entrypoint": "StellaOps.Scanner.WebService.dll"
    },
    {
      "name": "scanner-worker",
      "repository": "scanner-worker",
      "kind": "dotnet-service",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
      "project": "src/StellaOps.Scanner.Worker/StellaOps.Scanner.Worker.csproj",
      "entrypoint": "StellaOps.Scanner.Worker.dll"
    },
    {
      "name": "concelier",
      "repository": "concelier",
      "kind": "dotnet-service",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
      "project": "src/StellaOps.Concelier.WebService/StellaOps.Concelier.WebService.csproj",
      "entrypoint": "StellaOps.Concelier.WebService.dll"
    },
    {
      "name": "excititor",
      "repository": "excititor",
      "kind": "dotnet-service",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
      "project": "src/StellaOps.Excititor.WebService/StellaOps.Excititor.WebService.csproj",
      "entrypoint": "StellaOps.Excititor.WebService.dll"
    },
    {
      "name": "web-ui",
      "repository": "web-ui",
      "kind": "angular-ui",
      "context": ".",
      "dockerfile": "ops/devops/release/docker/Dockerfile.angular-ui"
    }
  ],
  "helm": {
    "chartPath": "deploy/helm/stellaops",
    "outputDir": "out/release/helm"
  },
  "compose": {
    "files": [
      "deploy/compose/docker-compose.dev.yaml",
      "deploy/compose/docker-compose.stage.yaml",
      "deploy/compose/docker-compose.airgap.yaml"
    ]
  },
  "buildxPlugin": {
    "project": "src/StellaOps.Scanner.Sbomer.BuildXPlugin/StellaOps.Scanner.Sbomer.BuildXPlugin.csproj"
  }
}
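`build_release.py` consumes this file, so a malformed component entry surfaces late, mid-build. A quick pre-flight sanity check can be sketched as follows — the required-key rules are inferred from the fields used above (`dotnet-service` entries need `project` and `entrypoint`), not from an official schema:

```python
import json

# Minimal fragment of components.json (subset of the file above).
doc = json.loads("""
{
  "components": [
    {"name": "authority", "repository": "authority", "kind": "dotnet-service",
     "context": ".",
     "dockerfile": "ops/devops/release/docker/Dockerfile.dotnet-service",
     "project": "src/StellaOps.Authority/StellaOps.Authority/StellaOps.Authority.csproj",
     "entrypoint": "StellaOps.Authority.dll"},
    {"name": "web-ui", "repository": "web-ui", "kind": "angular-ui",
     "context": ".",
     "dockerfile": "ops/devops/release/docker/Dockerfile.angular-ui"}
  ]
}
""")

def validate(components):
    # Collect (component-name, missing-keys) pairs instead of failing fast,
    # so one pass reports every broken entry.
    errors = []
    for comp in components:
        missing = [k for k in ("name", "repository", "kind", "dockerfile") if k not in comp]
        if comp.get("kind") == "dotnet-service":
            missing += [k for k in ("project", "entrypoint") if k not in comp]
        if missing:
            errors.append((comp.get("name", "?"), missing))
    return errors

print(validate(doc["components"]))  # → []
```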
31
ops/devops/release/docker/Dockerfile.angular-ui
Normal file
@@ -0,0 +1,31 @@
# syntax=docker/dockerfile:1.7-labs

ARG NODE_IMAGE=node:20.14.0-bookworm
ARG NGINX_IMAGE=nginx:1.27-alpine
ARG VERSION=0.0.0
ARG CHANNEL=dev
ARG GIT_SHA=0000000
ARG SOURCE_DATE_EPOCH=0

FROM ${NODE_IMAGE} AS build
WORKDIR /workspace
ENV CI=1 \
    SOURCE_DATE_EPOCH=${SOURCE_DATE_EPOCH}
COPY src/StellaOps.Web/package.json src/StellaOps.Web/package-lock.json ./
RUN npm ci --prefer-offline --no-audit --no-fund
COPY src/StellaOps.Web/ ./
RUN npm run build -- --configuration=production

FROM ${NGINX_IMAGE} AS runtime
ARG VERSION
ARG CHANNEL
ARG GIT_SHA
WORKDIR /usr/share/nginx/html
RUN rm -rf ./*
COPY --from=build /workspace/dist/stellaops-web/ /usr/share/nginx/html/
COPY ops/devops/release/docker/nginx-default.conf /etc/nginx/conf.d/default.conf
LABEL org.opencontainers.image.version="${VERSION}" \
      org.opencontainers.image.revision="${GIT_SHA}" \
      org.opencontainers.image.source="https://git.stella-ops.org/stella-ops/feedser" \
      org.stellaops.release.channel="${CHANNEL}"
EXPOSE 8080
52
ops/devops/release/docker/Dockerfile.dotnet-service
Normal file
@@ -0,0 +1,52 @@
# syntax=docker/dockerfile:1.7-labs

ARG SDK_IMAGE=mcr.microsoft.com/dotnet/nightly/sdk:10.0
# Default matches components.json; the runtime stage below runs a RUN step and a
# shell entrypoint, so a distroless runtime image (no shell) would break the build.
ARG RUNTIME_IMAGE=mcr.microsoft.com/dotnet/nightly/aspnet:10.0

ARG PROJECT
ARG ENTRYPOINT_DLL
ARG VERSION=0.0.0
ARG CHANNEL=dev
ARG GIT_SHA=0000000
ARG SOURCE_DATE_EPOCH=0

FROM ${SDK_IMAGE} AS build
ARG PROJECT
ARG GIT_SHA
ARG SOURCE_DATE_EPOCH
WORKDIR /src
ENV DOTNET_CLI_TELEMETRY_OPTOUT=1 \
    DOTNET_SKIP_FIRST_TIME_EXPERIENCE=1 \
    NUGET_XMLDOC_MODE=skip \
    SOURCE_DATE_EPOCH=${SOURCE_DATE_EPOCH}
COPY . .
RUN --mount=type=cache,target=/root/.nuget/packages \
    dotnet restore "${PROJECT}"
RUN --mount=type=cache,target=/root/.nuget/packages \
    dotnet publish "${PROJECT}" \
      -c Release \
      -o /app/publish \
      /p:UseAppHost=false \
      /p:ContinuousIntegrationBuild=true \
      /p:SourceRevisionId=${GIT_SHA} \
      /p:Deterministic=true \
      /p:TreatWarningsAsErrors=true

FROM ${RUNTIME_IMAGE} AS runtime
WORKDIR /app
ARG ENTRYPOINT_DLL
ARG VERSION
ARG CHANNEL
ARG GIT_SHA
ENV DOTNET_EnableDiagnostics=0 \
    ASPNETCORE_URLS=http://0.0.0.0:8080
COPY --from=build /app/publish/ ./
RUN set -eu; \
    printf '#!/usr/bin/env sh\nset -e\nexec dotnet %s "$@"\n' "${ENTRYPOINT_DLL}" > /entrypoint.sh; \
    chmod +x /entrypoint.sh
EXPOSE 8080
LABEL org.opencontainers.image.version="${VERSION}" \
      org.opencontainers.image.revision="${GIT_SHA}" \
      org.opencontainers.image.source="https://git.stella-ops.org/stella-ops/feedser" \
      org.stellaops.release.channel="${CHANNEL}"
ENTRYPOINT ["/entrypoint.sh"]
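The `RUN printf … > /entrypoint.sh` step bakes a tiny launcher into the image at build time. Replicated outside Docker (with an illustrative `ENTRYPOINT_DLL` value), it shows exactly what ends up at `/entrypoint.sh`:

```shell
# Reproduce the entrypoint generation from the runtime stage.
# The DLL name is illustrative; the pipeline supplies it per component.
ENTRYPOINT_DLL="StellaOps.Authority.dll"
printf '#!/usr/bin/env sh\nset -e\nexec dotnet %s "$@"\n' "${ENTRYPOINT_DLL}" > /tmp/entrypoint.sh
chmod +x /tmp/entrypoint.sh
cat /tmp/entrypoint.sh
```

The `"$@"` in the format string is literal (single-quoted), so argument forwarding is resolved when the container starts, not when the script is generated.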
22
ops/devops/release/docker/nginx-default.conf
Normal file
@@ -0,0 +1,22 @@
server {
    listen 8080;
    listen [::]:8080;
    server_name _;

    root /usr/share/nginx/html;
    index index.html;

    location / {
        try_files $uri $uri/ /index.html;
    }

    location ~* \.(?:js|css|svg|png|jpg|jpeg|gif|ico|woff2?)$ {
        add_header Cache-Control "public, max-age=2592000";
    }

    location = /healthz {
        access_log off;
        add_header Content-Type text/plain;
        return 200 'ok';
    }
}
60
ops/devops/sync-preview-nuget.sh
Normal file
@@ -0,0 +1,60 @@
#!/usr/bin/env bash

# Sync preview NuGet packages into the local offline feed.
# Reads package metadata from ops/devops/nuget-preview-packages.csv
# and ensures ./local-nuget holds the expected artefacts (with SHA-256 verification).

set -euo pipefail

repo_root="$(git -C "${BASH_SOURCE%/*}/.." rev-parse --show-toplevel 2>/dev/null || pwd)"
manifest="${repo_root}/ops/devops/nuget-preview-packages.csv"
dest="${repo_root}/local-nuget"

if [[ ! -f "$manifest" ]]; then
  echo "Manifest not found: $manifest" >&2
  exit 1
fi

mkdir -p "$dest"

fetch_package() {
  local package="$1"
  local version="$2"
  local expected_sha="$3"
  local target="$dest/${package}.${version}.nupkg"
  local url="https://www.nuget.org/api/v2/package/${package}/${version}"

  echo "[sync-nuget] Fetching ${package} ${version}"
  local tmp
  tmp="$(mktemp)"
  trap 'rm -f "$tmp"' RETURN
  curl -fsSL --retry 3 --retry-delay 1 "$url" -o "$tmp"
  local actual_sha
  actual_sha="$(sha256sum "$tmp" | awk '{print $1}')"
  if [[ "$actual_sha" != "$expected_sha" ]]; then
    echo "Checksum mismatch for ${package} ${version}" >&2
    echo "  expected: $expected_sha" >&2
    echo "  actual:   $actual_sha" >&2
    exit 1
  fi
  mv "$tmp" "$target"
  trap - RETURN
}

# The `|| [[ -n "$package" ]]` keeps the final row even when the manifest
# lacks a trailing newline; comment lines and blanks are skipped.
while IFS=',' read -r package version sha || [[ -n "$package" ]]; do
  [[ -z "$package" || "$package" == \#* ]] && continue

  local_path="$dest/${package}.${version}.nupkg"
  if [[ -f "$local_path" ]]; then
    current_sha="$(sha256sum "$local_path" | awk '{print $1}')"
    if [[ "$current_sha" == "$sha" ]]; then
      echo "[sync-nuget] OK ${package} ${version}"
      continue
    fi
    echo "[sync-nuget] SHA mismatch for ${package} ${version}, refreshing"
  else
    echo "[sync-nuget] Missing ${package} ${version}"
  fi

  fetch_package "$package" "$version" "$sha"
done < "$manifest"
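The script reads manifest rows as `package,version,sha256`. A new row can be generated from a downloaded archive like this — the package name, version, and path are illustrative, since the actual CSV contents are not shown here:

```shell
# Emit a manifest row for a package archive (illustrative names/path).
pkg="StellaOps.Example"
ver="1.0.0-preview"
file="/tmp/${pkg}.${ver}.nupkg"
printf 'example-archive-bytes' > "$file"   # stand-in for the real .nupkg bytes
sha="$(sha256sum "$file" | awk '{print $1}')"
echo "${pkg},${ver},${sha}"
```

Appending that output line to `ops/devops/nuget-preview-packages.csv` is enough for the sync loop above to fetch and verify the package.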