save progress
.gitea/README.md (new file, 279 lines)
@@ -0,0 +1,279 @@

# StellaOps CI/CD Infrastructure

Comprehensive CI/CD infrastructure for the StellaOps platform using Gitea Actions.

## Quick Reference

| Resource | Location |
|----------|----------|
| Workflows | `.gitea/workflows/` (96 workflows) |
| Scripts | `.gitea/scripts/` |
| Documentation | `.gitea/docs/` |
| DevOps Configs | `devops/` |
| Release Manifests | `devops/releases/` |

## Workflow Categories

### Core Build & Test

| Workflow | File | Description |
|----------|------|-------------|
| Build Test Deploy | `build-test-deploy.yml` | Main CI pipeline for all modules |
| Test Matrix | `test-matrix.yml` | Unified test execution with TRX reporting |
| Test Lanes | `test-lanes.yml` | Parallel test lane execution |
| Integration Tests | `integration-tests-gate.yml` | Integration test quality gate |

### Release Pipelines

| Workflow | File | Description |
|----------|------|-------------|
| Suite Release | `release-suite.yml` | Full platform release (YYYY.MM versioning) |
| Service Release | `service-release.yml` | Per-service release pipeline |
| Module Publish | `module-publish.yml` | NuGet and container publishing |
| Release Validation | `release-validation.yml` | Post-release verification |
| Promote | `promote.yml` | Environment promotion (dev/stage/prod) |

### CLI & SDK

| Workflow | File | Description |
|----------|------|-------------|
| CLI Build | `cli-build.yml` | Multi-platform CLI builds |
| CLI Chaos Parity | `cli-chaos-parity.yml` | CLI behavioral consistency tests |
| SDK Generator | `sdk-generator.yml` | Client SDK generation |
| SDK Publish | `sdk-publish.yml` | SDK package publishing |

### Security & Compliance

| Workflow | File | Description |
|----------|------|-------------|
| Artifact Signing | `artifact-signing.yml` | Cosign artifact signing |
| Dependency Security | `dependency-security-scan.yml` | Vulnerability scanning |
| License Audit | `license-audit.yml` | OSS license compliance |
| License Gate | `dependency-license-gate.yml` | PR license compliance gate |
| Crypto Compliance | `crypto-compliance.yml` | Cryptographic compliance checks |
| Provenance Check | `provenance-check.yml` | Supply chain provenance |

### Attestation & Evidence

| Workflow | File | Description |
|----------|------|-------------|
| Attestation Bundle | `attestation-bundle.yml` | in-toto attestation bundling |
| Evidence Locker | `evidence-locker.yml` | Evidence artifact storage |
| VEX Proof Bundles | `vex-proof-bundles.yml` | VEX proof generation |
| Signals Evidence | `signals-evidence-locker.yml` | Signal evidence collection |
| Signals DSSE Sign | `signals-dsse-sign.yml` | DSSE envelope signing |

### Scanner & Analysis

| Workflow | File | Description |
|----------|------|-------------|
| Scanner Analyzers | `scanner-analyzers.yml` | Language analyzer CI |
| Scanner Determinism | `scanner-determinism.yml` | Output reproducibility tests |
| Reachability Bench | `reachability-bench.yaml` | Reachability analysis benchmarks |
| Reachability Corpus | `reachability-corpus-ci.yml` | Corpus maintenance |
| EPSS Ingest Perf | `epss-ingest-perf.yml` | EPSS ingestion performance |

### Determinism & Reproducibility

| Workflow | File | Description |
|----------|------|-------------|
| Determinism Gate | `determinism-gate.yml` | Build determinism quality gate |
| Cross-Platform Det. | `cross-platform-determinism.yml` | Cross-OS reproducibility |
| Bench Determinism | `bench-determinism.yml` | Benchmark determinism |
| E2E Reproducibility | `e2e-reproducibility.yml` | End-to-end reproducibility |

### Module-Specific

| Workflow | File | Description |
|----------|------|-------------|
| Advisory AI Release | `advisory-ai-release.yml` | AI module release |
| AOC Guard | `aoc-guard.yml` | AOC policy enforcement |
| Authority Key Rotation | `authority-key-rotation.yml` | Key rotation automation |
| Concelier Tests | `concelier-attestation-tests.yml` | Concelier attestation tests |
| Findings Ledger | `findings-ledger-ci.yml` | Findings ledger CI |
| Policy Lint | `policy-lint.yml` | Policy DSL validation |
| Router Chaos | `router-chaos.yml` | Router chaos testing |
| Signals CI | `signals-ci.yml` | Signals module CI |

### Infrastructure & Ops

| Workflow | File | Description |
|----------|------|-------------|
| Containers Multiarch | `containers-multiarch.yml` | Multi-architecture builds |
| Docker Regional | `docker-regional-builds.yml` | Regional Docker builds |
| Helm Validation | (via scripts) | Helm chart validation |
| Console Runner | `console-runner-image.yml` | Runner image builds |
| Obs SLO | `obs-slo.yml` | Observability SLO checks |
| Obs Stream | `obs-stream.yml` | Telemetry streaming |

### Documentation & API

| Workflow | File | Description |
|----------|------|-------------|
| Docs | `docs.yml` | Documentation site build |
| OAS CI | `oas-ci.yml` | OpenAPI spec validation |
| API Governance | `api-governance.yml` | API governance checks |
| Schema Validation | `schema-validation.yml` | JSON schema validation |

### Dependency Management

| Workflow | File | Description |
|----------|------|-------------|
| Renovate | `renovate.yml` | Automated dependency updates |
| License Gate | `dependency-license-gate.yml` | License compliance gate |
| Security Scan | `dependency-security-scan.yml` | Vulnerability scanning |

## Script Categories

### Build Scripts (`scripts/build/`)

| Script | Purpose |
|--------|---------|
| `build-cli.sh` | Build CLI for specific runtime |
| `build-multiarch.sh` | Multi-architecture container builds |
| `build-airgap-bundle.sh` | Air-gap deployment bundle |

### Test Scripts (`scripts/test/`)

| Script | Purpose |
|--------|---------|
| `determinism-run.sh` | Determinism verification |
| `run-fixtures-check.sh` | Test fixture validation |

### Validation Scripts (`scripts/validate/`)

| Script | Purpose |
|--------|---------|
| `validate-compose.sh` | Docker Compose validation |
| `validate-helm.sh` | Helm chart validation |
| `validate-licenses.sh` | License compliance |
| `validate-migrations.sh` | Database migration validation |
| `validate-sbom.sh` | SBOM validation |
| `validate-spdx.sh` | SPDX format validation |
| `validate-vex.sh` | VEX document validation |
| `validate-workflows.sh` | Workflow YAML validation |
| `verify-binaries.sh` | Binary integrity verification |

### Signing Scripts (`scripts/sign/`)

| Script | Purpose |
|--------|---------|
| `sign-authority-gaps.sh` | Sign authority gap attestations |
| `sign-policy.sh` | Sign policy artifacts |
| `sign-signals.sh` | Sign signals data |

### Release Scripts (`scripts/release/`)

| Script | Purpose |
|--------|---------|
| `build_release.py` | Suite release orchestration |
| `verify_release.py` | Release verification |
| `bump-service-version.py` | Service version management |
| `read-service-version.sh` | Read current version |
| `generate-docker-tag.sh` | Generate Docker tags |
| `generate_changelog.py` | AI-assisted changelog |
| `generate_suite_docs.py` | Release documentation |
| `generate_compose.py` | Docker Compose generation |
| `collect_versions.py` | Version collection |
| `check_cli_parity.py` | CLI version parity |

### Evidence Scripts (`scripts/evidence/`)

| Script | Purpose |
|--------|---------|
| `upload-all-evidence.sh` | Upload all evidence bundles |
| `signals-upload-evidence.sh` | Upload signals evidence |
| `zastava-upload-evidence.sh` | Upload Zastava evidence |

### Metrics Scripts (`scripts/metrics/`)

| Script | Purpose |
|--------|---------|
| `compute-reachability-metrics.sh` | Reachability analysis metrics |
| `compute-ttfs-metrics.sh` | Time-to-first-scan metrics |
| `enforce-performance-slos.sh` | SLO enforcement |

### Utility Scripts (`scripts/util/`)

| Script | Purpose |
|--------|---------|
| `cleanup-runner-space.sh` | Runner disk cleanup |
| `dotnet-filter.sh` | .NET project filtering |
| `enable-openssl11-shim.sh` | OpenSSL 1.1 compatibility |

## Environment Variables

### Required Secrets

| Secret | Purpose | Workflows |
|--------|---------|-----------|
| `GITEA_TOKEN` | API access, commits | All |
| `RENOVATE_TOKEN` | Dependency bot access | `renovate.yml` |
| `COSIGN_PRIVATE_KEY_B64` | Artifact signing | Release pipelines |
| `AI_API_KEY` | Changelog generation | `release-suite.yml` |
| `REGISTRY_USERNAME` | Container registry | Build/deploy |
| `REGISTRY_PASSWORD` | Container registry | Build/deploy |
| `SSH_PRIVATE_KEY` | Deployment access | Deploy pipelines |
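
A job can fail fast when one of these secrets is absent. A minimal sketch (the variable list mirrors the table above; trim it to what each workflow actually needs):

```bash
#!/usr/bin/env bash
# Fail fast when a required secret is not present in the job environment.
set -euo pipefail

required=(GITEA_TOKEN COSIGN_PRIVATE_KEY_B64 REGISTRY_USERNAME REGISTRY_PASSWORD)
missing=0
for name in "${required[@]}"; do
  if [ -z "${!name:-}" ]; then
    echo "[ERROR] required secret not set: ${name}" >&2
    missing=1
  fi
done
exit "${missing}"
```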

### Common Variables

| Variable | Default | Purpose |
|----------|---------|---------|
| `DOTNET_VERSION` | `10.0.100` | .NET SDK version |
| `NODE_VERSION` | `20` | Node.js version |
| `RENOVATE_VERSION` | `37.100.0` | Renovate version |
| `REGISTRY_HOST` | `git.stella-ops.org` | Container registry |

## Versioning Strategy

### Suite Releases (Platform)

- Format: `YYYY.MM` with codenames (Ubuntu-style)
- Example: `2026.04 Nova`
- Triggered by: Tag `suite-YYYY.MM`
- Documentation: `docs/releases/YYYY.MM/`

### Service Releases (Individual)

- Format: SemVer `MAJOR.MINOR.PATCH`
- Docker tag: `{version}+{YYYYMMDDHHmmss}`
- Example: `1.2.3+20250128143022`
- Triggered by: Tag `service-{name}-v{version}`
- Version source: `src/Directory.Versions.props`

### Module Releases

- Format: SemVer `MAJOR.MINOR.PATCH`
- Triggered by: Tag `module-{name}-v{version}`

## Documentation

| Document | Description |
|----------|-------------|
| [Architecture](docs/architecture.md) | Workflow architecture and dependencies |
| [Scripts Inventory](docs/scripts.md) | Complete script documentation |
| [Troubleshooting](docs/troubleshooting.md) | Common issues and solutions |
| [Development Guide](docs/development.md) | Creating new workflows |
| [Runners](docs/runners.md) | Self-hosted runner setup |
| [Dependency Management](docs/dependency-management.md) | Renovate guide |

## Related Documentation

- [Main Architecture](../docs/07_HIGH_LEVEL_ARCHITECTURE.md)
- [DevOps README](../devops/README.md)
- [Release Versioning](../docs/releases/VERSIONING.md)
- [Offline Operations](../docs/24_OFFLINE_KIT.md)

## Contributing

1. Read `AGENTS.md` before making changes
2. Follow workflow naming conventions
3. Pin tool versions where possible
4. Keep workflows deterministic and offline-friendly
5. Update documentation when adding/modifying workflows
6. Test locally with `act` when possible (example below)
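
For item 6, a typical local dry run looks like the following. This is a sketch: the workflow file is a placeholder, and `act` targets GitHub Actions, so treat results against Gitea Actions as an approximation:

```bash
# List the jobs a workflow would run, without executing them.
act -W .gitea/workflows/test-matrix.yml --list

# Run a single workflow locally against the pull_request event.
act pull_request -W .gitea/workflows/test-matrix.yml
```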

## Support

- Issues: https://git.stella-ops.org/stella-ops.org/issues
- Documentation: `docs/`

.gitea/config/path-filters.yml (new file, 533 lines)
@@ -0,0 +1,533 @@

# =============================================================================
# CENTRALIZED PATH FILTER DEFINITIONS
# =============================================================================
# This file documents the path filters used across all CI/CD workflows.
# Each workflow should reference these patterns for consistency.
#
# Last updated: 2025-12-28
# =============================================================================
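
# Example consumer (a sketch, not part of the original file): a bash job step
# can compare changed paths against the filters below. BASE_SHA/HEAD_SHA and
# the grep pattern mirror the 'infrastructure' list and are illustrative only.
#
#   changed="$(git diff --name-only "${BASE_SHA}...${HEAD_SHA}")"
#   if grep -qE '(^|/)Directory\.(Build|Packages)\.props$|^nuget\.config$|^StellaOps\.sln$' <<<"${changed}"; then
#     echo "infrastructure change detected -> run full CI"
#   fi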

# -----------------------------------------------------------------------------
# INFRASTRUCTURE FILES - Changes trigger FULL CI
# -----------------------------------------------------------------------------
infrastructure:
  - 'Directory.Build.props'
  - 'Directory.Build.rsp'
  - 'Directory.Packages.props'
  - 'src/Directory.Build.props'
  - 'src/Directory.Packages.props'
  - 'nuget.config'
  - 'StellaOps.sln'

# -----------------------------------------------------------------------------
# DOCUMENTATION - Should NOT trigger builds (paths-ignore)
# -----------------------------------------------------------------------------
docs_ignore:
  - 'docs/**'
  - '*.md'
  - '!CLAUDE.md'   # Exception: Agent instructions SHOULD trigger
  - '!AGENTS.md'   # Exception: Module guidance SHOULD trigger
  - 'etc/**'
  - 'LICENSE'
  - '.gitignore'
  - '.editorconfig'

# -----------------------------------------------------------------------------
# SHARED LIBRARIES - Trigger cascading tests
# -----------------------------------------------------------------------------
shared_libraries:
  # Cryptography - CRITICAL, affects all security modules
  cryptography:
    paths:
      - 'src/__Libraries/StellaOps.Cryptography*/**'
      - 'src/Cryptography/**'
    cascades_to:
      - scanner
      - attestor
      - authority
      - evidence_locker
      - signer
      - airgap

  # Evidence & Provenance - Affects attestation chain
  evidence:
    paths:
      - 'src/__Libraries/StellaOps.Evidence*/**'
      - 'src/__Libraries/StellaOps.Provenance/**'
    cascades_to:
      - scanner
      - attestor
      - evidence_locker
      - export_center
      - sbom_service

  # Infrastructure - Affects all database-backed modules
  infrastructure:
    paths:
      - 'src/__Libraries/StellaOps.Infrastructure*/**'
      - 'src/__Libraries/StellaOps.DependencyInjection/**'
    cascades_to:
      - all_integration_tests

  # Replay & Determinism - Affects reproducibility tests
  replay:
    paths:
      - 'src/__Libraries/StellaOps.Replay*/**'
      - 'src/__Libraries/StellaOps.Testing.Determinism/**'
    cascades_to:
      - scanner
      - determinism_tests
      - replay

  # Verdict & Policy Primitives
  verdict:
    paths:
      - 'src/__Libraries/StellaOps.Verdict/**'
      - 'src/__Libraries/StellaOps.DeltaVerdict/**'
    cascades_to:
      - policy
      - risk_engine
      - reach_graph

  # Plugin Framework
  plugin:
    paths:
      - 'src/__Libraries/StellaOps.Plugin/**'
    cascades_to:
      - authority
      - scanner
      - concelier

  # Configuration
  configuration:
    paths:
      - 'src/__Libraries/StellaOps.Configuration/**'
    cascades_to:
      - all_modules

# -----------------------------------------------------------------------------
# MODULE PATHS - Each module with its source and test paths
# -----------------------------------------------------------------------------
modules:
  # Scanning & Analysis
  scanner:
    source:
      - 'src/Scanner/**'
      - 'src/BinaryIndex/**'
    tests:
      - 'src/Scanner/__Tests/**'
      - 'src/BinaryIndex/__Tests/**'
    workflows:
      - 'scanner-*.yml'
      - 'scanner-analyzers*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Evidence*/**'
      - 'src/__Libraries/StellaOps.Cryptography*/**'
      - 'src/__Libraries/StellaOps.Replay*/**'
      - 'src/__Libraries/StellaOps.Provenance/**'

  binary_index:
    source:
      - 'src/BinaryIndex/**'
    tests:
      - 'src/BinaryIndex/__Tests/**'

  # Data Ingestion
  concelier:
    source:
      - 'src/Concelier/**'
    tests:
      - 'src/Concelier/__Tests/**'
    workflows:
      - 'concelier-*.yml'
      - 'connector-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Plugin/**'

  excititor:
    source:
      - 'src/Excititor/**'
    tests:
      - 'src/Excititor/__Tests/**'
    workflows:
      - 'vex-*.yml'
      - 'export-*.yml'

  vexlens:
    source:
      - 'src/VexLens/**'
    tests:
      - 'src/VexLens/__Tests/**'

  vexhub:
    source:
      - 'src/VexHub/**'
    tests:
      - 'src/VexHub/__Tests/**'

  # Core Platform
  authority:
    source:
      - 'src/Authority/**'
    tests:
      - 'src/Authority/__Tests/**'
    workflows:
      - 'authority-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Cryptography*/**'
      - 'src/__Libraries/StellaOps.Plugin/**'

  gateway:
    source:
      - 'src/Gateway/**'
    tests:
      - 'src/Gateway/__Tests/**'

  router:
    source:
      - 'src/Router/**'
    tests:
      - 'src/Router/__Tests/**'
    workflows:
      - 'router-*.yml'

  # Artifacts & Evidence
  attestor:
    source:
      - 'src/Attestor/**'
    tests:
      - 'src/Attestor/__Tests/**'
    workflows:
      - 'attestation-*.yml'
      - 'attestor-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Cryptography*/**'
      - 'src/__Libraries/StellaOps.Evidence*/**'
      - 'src/__Libraries/StellaOps.Provenance/**'

  sbom_service:
    source:
      - 'src/SbomService/**'
    tests:
      - 'src/SbomService/__Tests/**'
    dependencies:
      - 'src/__Libraries/StellaOps.Evidence*/**'

  evidence_locker:
    source:
      - 'src/EvidenceLocker/**'
    tests:
      - 'src/EvidenceLocker/__Tests/**'
    workflows:
      - 'evidence-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Evidence*/**'
      - 'src/__Libraries/StellaOps.Cryptography*/**'

  export_center:
    source:
      - 'src/ExportCenter/**'
    tests:
      - 'src/ExportCenter/__Tests/**'
    workflows:
      - 'export-*.yml'

  findings:
    source:
      - 'src/Findings/**'
    tests:
      - 'src/Findings/__Tests/**'
    workflows:
      - 'findings-*.yml'
      - 'ledger-*.yml'

  provenance:
    source:
      - 'src/Provenance/**'
    tests:
      - 'src/Provenance/__Tests/**'
    workflows:
      - 'provenance-*.yml'

  signer:
    source:
      - 'src/Signer/**'
    tests:
      - 'src/Signer/__Tests/**'
    dependencies:
      - 'src/__Libraries/StellaOps.Cryptography*/**'

  # Policy & Risk
  policy:
    source:
      - 'src/Policy/**'
    tests:
      - 'src/Policy/__Tests/**'
    workflows:
      - 'policy-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Verdict/**'

  risk_engine:
    source:
      - 'src/RiskEngine/**'
    tests:
      - 'src/RiskEngine/__Tests/**'
    dependencies:
      - 'src/__Libraries/StellaOps.Verdict/**'

  reach_graph:
    source:
      - 'src/ReachGraph/**'
    tests:
      - 'src/ReachGraph/__Tests/**'
    workflows:
      - 'reachability-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.ReachGraph*/**'

  # Operations
  notify:
    source:
      - 'src/Notify/**'
      - 'src/Notifier/**'
    tests:
      - 'src/Notify/__Tests/**'
    workflows:
      - 'notify-*.yml'

  orchestrator:
    source:
      - 'src/Orchestrator/**'
    tests:
      - 'src/Orchestrator/__Tests/**'

  scheduler:
    source:
      - 'src/Scheduler/**'
    tests:
      - 'src/Scheduler/__Tests/**'

  task_runner:
    source:
      - 'src/TaskRunner/**'
    tests:
      - 'src/TaskRunner/__Tests/**'

  packs_registry:
    source:
      - 'src/PacksRegistry/**'
    tests:
      - 'src/PacksRegistry/__Tests/**'
    workflows:
      - 'packs-*.yml'

  replay:
    source:
      - 'src/Replay/**'
    tests:
      - 'src/Replay/__Tests/**'
    workflows:
      - 'replay-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Replay*/**'

  # Infrastructure
  cryptography:
    source:
      - 'src/Cryptography/**'
    tests:
      - 'src/__Libraries/__Tests/StellaOps.Cryptography*/**'
    workflows:
      - 'crypto-*.yml'

  telemetry:
    source:
      - 'src/Telemetry/**'
    tests:
      - 'src/Telemetry/__Tests/**'

  signals:
    source:
      - 'src/Signals/**'
    tests:
      - 'src/Signals/__Tests/**'
    workflows:
      - 'signals-*.yml'

  airgap:
    source:
      - 'src/AirGap/**'
    tests:
      - 'src/AirGap/__Tests/**'
    workflows:
      - 'airgap-*.yml'
      - 'offline-*.yml'
    dependencies:
      - 'src/__Libraries/StellaOps.Cryptography*/**'

  aoc:
    source:
      - 'src/Aoc/**'
    tests:
      - 'src/Aoc/__Tests/**'
    workflows:
      - 'aoc-*.yml'

  # Integration
  cli:
    source:
      - 'src/Cli/**'
    tests:
      - 'src/Cli/__Tests/**'
    workflows:
      - 'cli-*.yml'

  web:
    source:
      - 'src/Web/**'
    tests:
      - 'src/Web/**/*.spec.ts'
    workflows:
      - 'lighthouse-*.yml'

  issuer_directory:
    source:
      - 'src/IssuerDirectory/**'
    tests:
      - 'src/IssuerDirectory/__Tests/**'

  mirror:
    source:
      - 'src/Mirror/**'
    tests:
      - 'src/Mirror/__Tests/**'
    workflows:
      - 'mirror-*.yml'

  advisory_ai:
    source:
      - 'src/AdvisoryAI/**'
    tests:
      - 'src/AdvisoryAI/__Tests/**'
    workflows:
      - 'advisory-*.yml'

  symbols:
    source:
      - 'src/Symbols/**'
    tests:
      - 'src/Symbols/__Tests/**'
    workflows:
      - 'symbols-*.yml'

  graph:
    source:
      - 'src/Graph/**'
    tests:
      - 'src/Graph/__Tests/**'
    workflows:
      - 'graph-*.yml'

# -----------------------------------------------------------------------------
# DEVOPS & CI/CD - Changes affecting infrastructure
# -----------------------------------------------------------------------------
devops:
  docker:
    - 'devops/docker/**'
    - '**/Dockerfile'
  compose:
    - 'devops/compose/**'
  helm:
    - 'devops/helm/**'
  database:
    - 'devops/database/**'
  scripts:
    - '.gitea/scripts/**'
  workflows:
    - '.gitea/workflows/**'

# -----------------------------------------------------------------------------
# TEST INFRASTRUCTURE
# -----------------------------------------------------------------------------
test_infrastructure:
  global_tests:
    - 'src/__Tests/**'
  shared_libraries:
    - 'src/__Tests/__Libraries/**'
  datasets:
    - 'src/__Tests/__Datasets/**'
  benchmarks:
    - 'src/__Tests/__Benchmarks/**'

# -----------------------------------------------------------------------------
# TRIGGER CATEGORY DEFINITIONS
# -----------------------------------------------------------------------------
# Reference for which workflows belong to each trigger category

categories:
  # Category A: PR-Gating (MUST PASS for merge)
  pr_gating:
    trigger: 'pull_request + push to main'
    workflows:
      - build-test-deploy.yml
      - test-matrix.yml
      - determinism-gate.yml
      - policy-lint.yml
      - sast-scan.yml
      - secrets-scan.yml
      - dependency-license-gate.yml

  # Category B: Main-Branch Only (Post-merge verification)
  main_only:
    trigger: 'push to main only'
    workflows:
      - container-scan.yml
      - integration-tests-gate.yml
      - api-governance.yml
      - aoc-guard.yml
      - provenance-check.yml
      - manifest-integrity.yml

  # Category C: Module-Specific (Selective by path)
  module_specific:
    trigger: 'PR + main with path filters'
    patterns:
      - 'scanner-*.yml'
      - 'concelier-*.yml'
      - 'authority-*.yml'
      - 'attestor-*.yml'
      - 'policy-*.yml'
      - 'evidence-*.yml'
      - 'export-*.yml'
      - 'notify-*.yml'
      - 'router-*.yml'
      - 'crypto-*.yml'

  # Category D: Release/Deploy (Tag or Manual only)
  release:
    trigger: 'tags or workflow_dispatch only'
    workflows:
      - release-suite.yml
      - module-publish.yml
      - service-release.yml
      - cli-build.yml
      - containers-multiarch.yml
      - rollback.yml
      - promote.yml
    tag_patterns:
      suite: 'suite-*'
      module: 'module-*-v*'
      service: 'service-*-v*'
      cli: 'cli-v*'
      bundle: 'v*.*.*'

  # Category E: Scheduled (Nightly/Weekly)
  scheduled:
    workflows:
      - nightly-regression.yml        # Daily 2:00 UTC
      - dependency-security-scan.yml  # Weekly Sun 2:00 UTC
      - container-scan.yml            # Daily 4:00 UTC (also main-only)
      - sast-scan.yml                 # Weekly Mon 3:30 UTC
      - renovate.yml                  # Daily 3:00, 15:00 UTC
      - benchmark-vs-competitors.yml  # Weekly Sat 1:00 UTC

.gitea/docs/architecture.md (new file, 432 lines)
@@ -0,0 +1,432 @@

# CI/CD Architecture

> **Extended Documentation:** See [docs/cicd/](../../docs/cicd/) for comprehensive CI/CD guides.

## Overview

StellaOps CI/CD infrastructure is built on Gitea Actions with a modular, layered architecture designed for:

- **Determinism**: Reproducible builds and tests across environments
- **Offline-first**: Support for air-gapped deployments
- **Security**: Cryptographic signing and attestation at every stage
- **Scalability**: Parallel execution with intelligent caching

## Quick Links

| Document | Purpose |
|----------|---------|
| [CI/CD Overview](../../docs/cicd/README.md) | High-level architecture and getting started |
| [Workflow Triggers](../../docs/cicd/workflow-triggers.md) | Complete trigger matrix and dependency chains |
| [Release Pipelines](../../docs/cicd/release-pipelines.md) | Suite, module, and bundle release flows |
| [Security Scanning](../../docs/cicd/security-scanning.md) | SAST, secrets, container, and dependency scanning |
| [Troubleshooting](./troubleshooting.md) | Common issues and solutions |
| [Script Reference](./scripts.md) | CI/CD script documentation |

## Workflow Trigger Summary

### Trigger Matrix (100 Workflows)

| Trigger Type | Count | Examples |
|--------------|-------|----------|
| PR + Main Push | 15 | `test-matrix.yml`, `build-test-deploy.yml` |
| Tag-Based | 3 | `release-suite.yml`, `release.yml`, `module-publish.yml` |
| Scheduled | 8 | `nightly-regression.yml`, `renovate.yml` |
| Manual Only | 25+ | `rollback.yml`, `cli-build.yml` |
| Module-Specific | 50+ | Scanner, Concelier, Authority workflows |

### Tag Patterns

| Pattern | Workflow | Example |
|---------|----------|---------|
| `suite-*` | Suite release | `suite-2026.04` |
| `v*` | Bundle release | `v2025.12.1` |
| `module-*-v*` | Module publish | `module-authority-v1.2.3` |

### Schedule Overview

| Time (UTC) | Workflow | Purpose |
|------------|----------|---------|
| 2:00 AM Daily | `nightly-regression.yml` | Full regression |
| 3:00 AM/PM Daily | `renovate.yml` | Dependency updates |
| 3:30 AM Monday | `sast-scan.yml` | Weekly security scan |
| 5:00 AM Daily | `test-matrix.yml` | Extended tests |

> **Full Details:** See [Workflow Triggers](../../docs/cicd/workflow-triggers.md)

## Pipeline Architecture

### Release Pipeline Flow

```mermaid
graph TD
    subgraph "Trigger Layer"
        TAG[Git Tag] --> PARSE[Parse Tag]
        DISPATCH[Manual Dispatch] --> PARSE
        SCHEDULE[Scheduled] --> PARSE
    end

    subgraph "Validation Layer"
        PARSE --> VALIDATE[Validate Inputs]
        VALIDATE --> RESOLVE[Resolve Versions]
    end

    subgraph "Build Layer"
        RESOLVE --> BUILD[Build Modules]
        BUILD --> TEST[Run Tests]
        TEST --> DETERMINISM[Determinism Check]
    end

    subgraph "Artifact Layer"
        DETERMINISM --> CONTAINER[Build Container]
        CONTAINER --> SBOM[Generate SBOM]
        SBOM --> SIGN[Sign Artifacts]
    end

    subgraph "Release Layer"
        SIGN --> MANIFEST[Update Manifest]
        MANIFEST --> CHANGELOG[Generate Changelog]
        CHANGELOG --> DOCS[Generate Docs]
        DOCS --> PUBLISH[Publish Release]
    end

    subgraph "Post-Release"
        PUBLISH --> VERIFY[Verify Release]
        VERIFY --> NOTIFY[Notify Stakeholders]
    end
```

### Service Release Pipeline

```mermaid
graph LR
    subgraph "Trigger"
        A[service-{name}-v{semver}] --> B[Parse Service & Version]
    end

    subgraph "Build"
        B --> C[Read Directory.Versions.props]
        C --> D[Bump Version]
        D --> E[Build Service]
        E --> F[Run Tests]
    end

    subgraph "Package"
        F --> G[Build Container]
        G --> H[Generate Docker Tag]
        H --> I[Push to Registry]
    end

    subgraph "Attestation"
        I --> J[Generate SBOM]
        J --> K[Sign with Cosign]
        K --> L[Create Attestation]
    end

    subgraph "Finalize"
        L --> M[Update Manifest]
        M --> N[Commit Changes]
    end
```

### Test Matrix Execution

```mermaid
graph TD
    subgraph "Matrix Strategy"
        TRIGGER[PR/Push] --> FILTER[Path Filter]
        FILTER --> MATRIX[Generate Matrix]
    end

    subgraph "Parallel Execution"
        MATRIX --> UNIT[Unit Tests]
        MATRIX --> INT[Integration Tests]
        MATRIX --> DET[Determinism Tests]
    end

    subgraph "Test Types"
        UNIT --> UNIT_FAST[Fast Unit]
        UNIT --> UNIT_SLOW[Slow Unit]
        INT --> INT_PG[PostgreSQL]
        INT --> INT_VALKEY[Valkey]
        DET --> DET_SCANNER[Scanner]
        DET --> DET_BUILD[Build Output]
    end

    subgraph "Reporting"
        UNIT_FAST --> TRX[TRX Reports]
        UNIT_SLOW --> TRX
        INT_PG --> TRX
        INT_VALKEY --> TRX
        DET_SCANNER --> TRX
        DET_BUILD --> TRX
        TRX --> SUMMARY[Job Summary]
    end
```

## Workflow Dependencies

### Core Dependencies

```mermaid
graph TD
    BTD[build-test-deploy.yml] --> TM[test-matrix.yml]
    BTD --> DG[determinism-gate.yml]

    TM --> TL[test-lanes.yml]
    TM --> ITG[integration-tests-gate.yml]

    RS[release-suite.yml] --> BTD
    RS --> MP[module-publish.yml]
    RS --> AS[artifact-signing.yml]

    SR[service-release.yml] --> BTD
    SR --> AS

    MP --> AS
    MP --> AB[attestation-bundle.yml]
```

### Security Chain

```mermaid
graph LR
    BUILD[Build] --> SBOM[SBOM Generation]
    SBOM --> SIGN[Cosign Signing]
    SIGN --> ATTEST[Attestation]
    ATTEST --> VERIFY[Verification]
    VERIFY --> PUBLISH[Publish]
```

## Execution Stages

### Stage 1: Validation

| Step | Purpose | Tools |
|------|---------|-------|
| Parse trigger | Extract tag/input parameters | bash |
| Validate config | Check required files exist | bash |
| Resolve versions | Read from `Directory.Versions.props` | Python |
| Check permissions | Verify secrets available | Gitea Actions |

### Stage 2: Build

| Step | Purpose | Tools |
|------|---------|-------|
| Restore packages | NuGet/npm dependencies | `dotnet restore`, `npm ci` |
| Build solution | Compile all projects | `dotnet build` |
| Run analyzers | Code analysis | .NET analyzers |

### Stage 3: Test

| Step | Purpose | Tools |
|------|---------|-------|
| Unit tests | Component testing | xUnit |
| Integration tests | Service integration | Testcontainers |
| Determinism tests | Output reproducibility | Custom scripts |

### Stage 4: Package

| Step | Purpose | Tools |
|------|---------|-------|
| Build container | Docker image | `docker build` |
| Generate SBOM | Software bill of materials | Syft |
| Sign artifacts | Cryptographic signing | Cosign |
| Create attestation | in-toto/DSSE envelope | Custom tools |

### Stage 5: Publish

| Step | Purpose | Tools |
|------|---------|-------|
| Push container | Registry upload | `docker push` |
| Upload attestation | Rekor transparency | Cosign |
| Update manifest | Version tracking | Python |
| Generate docs | Release documentation | Python |

## Concurrency Control

### Strategy

```yaml
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true
```

### Workflow Groups

| Group | Behavior | Workflows |
|-------|----------|-----------|
| Build | Cancel in-progress | `build-test-deploy.yml` |
| Release | No cancel (sequential) | `release-suite.yml` |
| Deploy | Environment-locked | `promote.yml` |
| Scheduled | Allow concurrent | `renovate.yml` |

## Caching Strategy

### Cache Layers

```mermaid
graph TD
    subgraph "Package Cache"
        NUGET[NuGet Cache<br>~/.nuget/packages]
        NPM[npm Cache<br>~/.npm]
    end

    subgraph "Build Cache"
        OBJ[Object Files<br>**/obj]
        BIN[Binaries<br>**/bin]
    end

    subgraph "Test Cache"
        TC[Testcontainers<br>Images]
        FIX[Test Fixtures]
    end

    subgraph "Keys"
        K1[runner.os-nuget-hash] --> NUGET
        K2[runner.os-npm-hash] --> NPM
        K3[runner.os-dotnet-hash] --> OBJ
        K3 --> BIN
    end
```

### Cache Configuration

| Cache | Key Pattern | Restore Keys |
|-------|-------------|--------------|
| NuGet | `${{ runner.os }}-nuget-${{ hashFiles('**/*.csproj') }}` | `${{ runner.os }}-nuget-` |
| npm | `${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}` | `${{ runner.os }}-npm-` |
| .NET Build | `${{ runner.os }}-dotnet-${{ github.sha }}` | `${{ runner.os }}-dotnet-` |
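
The `hashFiles(...)` expression reduces the matched files to a single digest, so the cache key changes whenever any project file changes. A rough local equivalent (a sketch only; Actions uses its own canonical hashing, so the digests will not match CI):

```bash
# Approximate a hashFiles('**/*.csproj')-style cache key locally.
key="$(uname -s)-nuget-$(git ls-files '*.csproj' -z \
  | sort -z \
  | xargs -0 sha256sum \
  | sha256sum \
  | cut -d' ' -f1)"
echo "cache key: ${key}"
```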

## Runner Requirements

### Self-Hosted Runners

| Label | Purpose | Requirements |
|-------|---------|--------------|
| `ubuntu-latest` | General builds | 4 CPU, 16GB RAM, 100GB disk |
| `linux-arm64` | ARM builds | ARM64 host |
| `windows-latest` | Windows builds | Windows Server 2022 |
| `macos-latest` | macOS builds | macOS 13+ |

### Docker-in-Docker

Required for:

- Testcontainers integration tests
- Multi-architecture builds
- Container scanning

### Network Requirements

| Endpoint | Purpose | Required |
|----------|---------|----------|
| `git.stella-ops.org` | Source, Registry | Always |
| `nuget.org` | NuGet packages | Online mode |
| `registry.npmjs.org` | npm packages | Online mode |
| `ghcr.io` | GitHub Container Registry | Optional |

## Artifact Flow

### Build Artifacts

```
artifacts/
├── binaries/
│   ├── StellaOps.Cli-linux-x64
│   ├── StellaOps.Cli-linux-arm64
│   ├── StellaOps.Cli-win-x64
│   └── StellaOps.Cli-osx-arm64
├── containers/
│   ├── scanner:1.2.3+20250128143022
│   └── authority:1.0.0+20250128143022
├── sbom/
│   ├── scanner.cyclonedx.json
│   └── authority.cyclonedx.json
└── attestations/
    ├── scanner.intoto.jsonl
    └── authority.intoto.jsonl
```

### Release Artifacts

```
docs/releases/2026.04/
├── README.md
├── CHANGELOG.md
├── services.md
├── docker-compose.yml
├── docker-compose.airgap.yml
├── upgrade-guide.md
├── checksums.txt
└── manifest.yaml
```

## Error Handling

### Retry Strategy

| Step Type | Retries | Backoff |
|-----------|---------|---------|
| Network calls | 3 | Exponential |
| Docker push | 3 | Linear (30s) |
| Tests | 0 | N/A |
| Signing | 2 | Linear (10s) |
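
A generic retry helper matching these policies might look as follows (a sketch; the real workflows implement retries inline per step, and the example endpoint and image name are placeholders):

```bash
# retry <max-attempts> <base-delay-seconds> <exponential|linear> <command...>
retry() {
  local max="$1" delay="$2" mode="$3"; shift 3
  local attempt=1
  until "$@"; do
    if [ "${attempt}" -ge "${max}" ]; then
      echo "[ERROR] failed after ${attempt} attempts: $*" >&2
      return 1
    fi
    local wait="${delay}"
    [ "${mode}" = "exponential" ] && wait=$(( delay * 2 ** (attempt - 1) ))
    echo "[INFO] attempt ${attempt} failed; retrying in ${wait}s" >&2
    sleep "${wait}"
    attempt=$(( attempt + 1 ))
  done
}

# Examples matching the table above (placeholder targets):
retry 3 5 exponential curl -fsS https://git.stella-ops.org/
retry 3 30 linear docker push git.stella-ops.org/stellaops/scanner:latest
```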

### Failure Actions

| Failure Type | Action |
|--------------|--------|
| Build failure | Fail fast, notify |
| Test failure | Continue, report |
| Signing failure | Fail, alert security |
| Deploy failure | Rollback, notify |

## Security Architecture

### Secret Management

```mermaid
graph TD
    subgraph "Gitea Secrets"
        GS[Organization Secrets]
        RS[Repository Secrets]
        ES[Environment Secrets]
    end

    subgraph "Usage"
        GS --> BUILD[Build Workflows]
        RS --> SIGN[Signing Workflows]
        ES --> DEPLOY[Deploy Workflows]
    end

    subgraph "Rotation"
        ROTATE[Key Rotation] --> RS
        ROTATE --> ES
    end
```

### Signing Chain

1. **Build outputs**: SHA-256 checksums
2. **Container images**: Cosign keyless/keyed signing
3. **SBOMs**: in-toto attestation
4. **Releases**: GPG-signed tags
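
Steps 2 and 3 typically reduce to a pair of cosign invocations. A minimal keyed sketch (the image reference and key paths are placeholders):

```bash
IMAGE="git.stella-ops.org/stellaops/scanner:1.2.3"

# Step 2: sign the container image with a private key.
cosign sign --key cosign.key "${IMAGE}"

# Step 3: attach the CycloneDX SBOM as an in-toto attestation.
cosign attest --key cosign.key --type cyclonedx \
  --predicate scanner.cyclonedx.json "${IMAGE}"

# Consumers verify both the signature and the attestation.
cosign verify --key cosign.pub "${IMAGE}"
cosign verify-attestation --key cosign.pub --type cyclonedx "${IMAGE}"
```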

## Monitoring & Observability

### Workflow Metrics

| Metric | Source | Dashboard |
|--------|--------|-----------|
| Build duration | Gitea Actions | Grafana |
| Test pass rate | TRX reports | Grafana |
| Cache hit rate | Actions cache | Prometheus |
| Artifact size | Upload artifact | Prometheus |

### Alerts

| Alert | Condition | Action |
|-------|-----------|--------|
| Build time > 30m | Duration threshold | Investigate |
| Test failures > 5% | Rate threshold | Review |
| Cache miss streak | 3 consecutive | Clear cache |
| Security scan critical | Any critical CVE | Block merge |

.gitea/docs/scripts.md (new file, 736 lines)
@@ -0,0 +1,736 @@

# CI/CD Scripts Inventory

Complete documentation of all scripts in `.gitea/scripts/`.

## Directory Structure

```
.gitea/scripts/
├── build/      # Build orchestration
├── evidence/   # Evidence bundle management
├── metrics/    # Performance metrics
├── release/    # Release automation
├── sign/       # Artifact signing
├── test/       # Test execution
├── util/       # Utilities
└── validate/   # Validation scripts
```

## Exit Code Conventions

| Code | Meaning |
|------|---------|
| 0 | Success |
| 1 | General error |
| 2 | Missing configuration/key |
| 3 | Missing required file |
| 69 | Tool not found (EX_UNAVAILABLE) |
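
In script form, the convention typically reduces to a pair of guard helpers like the following (a sketch; individual scripts vary, and the example file path is illustrative):

```bash
require_tool() {  # exit 69 (EX_UNAVAILABLE) when a dependency is absent
  command -v "$1" >/dev/null 2>&1 || { echo "[ERROR] $1 not found" >&2; exit 69; }
}

require_file() {  # exit 3 when a required input file is missing
  [ -f "$1" ] || { echo "[ERROR] missing required file: $1" >&2; exit 3; }
}

require_tool cosign
require_file devops/releases/manifest.yaml  # illustrative path
```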

---

## Build Scripts (`scripts/build/`)

### build-cli.sh

Multi-platform CLI build with SBOM generation and signing.

**Usage:**
```bash
RIDS=linux-x64,win-x64,osx-arm64 ./build-cli.sh
```

**Environment Variables:**

| Variable | Default | Description |
|----------|---------|-------------|
| `RIDS` | `linux-x64,win-x64,osx-arm64` | Comma-separated runtime identifiers |
| `CONFIG` | `Release` | Build configuration |
| `SBOM_TOOL` | `syft` | SBOM generator (`syft` or `none`) |
| `SIGN` | `false` | Enable artifact signing |
| `COSIGN_KEY` | - | Path to Cosign key file |

**Output:**
```
out/cli/
├── linux-x64/
│   ├── publish/
│   ├── stella-cli-linux-x64.tar.gz
│   ├── stella-cli-linux-x64.tar.gz.sha256
│   └── stella-cli-linux-x64.tar.gz.sbom.json
├── win-x64/
│   ├── publish/
│   ├── stella-cli-win-x64.zip
│   └── ...
└── manifest.json
```

**Features:**

- Builds self-contained single-file executables
- Includes CLI plugins (Aoc, Symbols)
- Generates SHA-256 checksums
- Optional SBOM generation via Syft
- Optional Cosign signing

---

### build-multiarch.sh

Multi-architecture Docker image builds using buildx.

**Usage:**
```bash
IMAGE=scanner PLATFORMS=linux/amd64,linux/arm64 ./build-multiarch.sh
```

**Environment Variables:**

| Variable | Default | Description |
|----------|---------|-------------|
| `IMAGE` | - | Image name (required) |
| `PLATFORMS` | `linux/amd64,linux/arm64` | Target platforms |
| `REGISTRY` | `git.stella-ops.org` | Container registry |
| `TAG` | `latest` | Image tag |
| `PUSH` | `false` | Push to registry |

---

### build-airgap-bundle.sh

Build offline/air-gapped deployment bundle.

**Usage:**
```bash
VERSION=2026.04 ./build-airgap-bundle.sh
```

**Output:**
```
out/airgap/
├── images.tar          # All container images
├── helm-charts.tar.gz  # Helm charts
├── compose.tar.gz      # Docker Compose files
├── checksums.txt
└── manifest.json
```

---

## Test Scripts (`scripts/test/`)

### determinism-run.sh

Run determinism verification tests.

**Usage:**
```bash
./determinism-run.sh
```

**Purpose:**

- Executes tests filtered by `Determinism` category
- Collects TRX test results
- Generates summary and artifacts archive

**Output:**
```
out/scanner-determinism/
├── determinism.trx
├── summary.txt
└── determinism-artifacts.tgz
```

---

### run-fixtures-check.sh

Validate test fixtures against expected schemas.

**Usage:**
```bash
./run-fixtures-check.sh [--update]
```

**Options:**

- `--update`: Update golden fixtures if mismatched

---

## Validation Scripts (`scripts/validate/`)

### validate-sbom.sh

Validate CycloneDX SBOM files.

**Usage:**
```bash
./validate-sbom.sh <sbom-file>
./validate-sbom.sh --all
./validate-sbom.sh --schema custom.json sample.json
```

**Options:**

| Option | Description |
|--------|-------------|
| `--all` | Validate all fixtures in `src/__Tests/__Benchmarks/golden-corpus/` |
| `--schema <path>` | Custom schema file |

**Dependencies:**

- `sbom-utility` (auto-installed if missing)

**Exit Codes:**

- `0`: All validations passed
- `1`: Validation failed

---

### validate-spdx.sh

Validate SPDX SBOM files.

**Usage:**
```bash
./validate-spdx.sh <spdx-file>
```

---

### validate-vex.sh

Validate VEX documents (OpenVEX, CSAF).

**Usage:**
```bash
./validate-vex.sh <vex-file>
```

---

### validate-helm.sh

Validate Helm charts.

**Usage:**
```bash
./validate-helm.sh [chart-path]
```

**Default Path:** `devops/helm/stellaops`

**Checks:**

- `helm lint`
- Template rendering
- Schema validation

---

### validate-compose.sh

Validate Docker Compose files.

**Usage:**
```bash
./validate-compose.sh [profile]
```

**Profiles:**

- `dev` - Development
- `stage` - Staging
- `prod` - Production
- `airgap` - Air-gapped

---

### validate-licenses.sh

Check dependency licenses for compliance.

**Usage:**
```bash
./validate-licenses.sh
```

**Checks:**

- NuGet packages via `dotnet-delice`
- npm packages via `license-checker`
- Reports blocked licenses (GPL-2.0-only, SSPL, etc.)

---

### validate-migrations.sh

Validate database migrations.

**Usage:**
```bash
./validate-migrations.sh
```

**Checks:**

- Migration naming conventions
- Forward/rollback pairs
- Idempotency

---

### validate-workflows.sh

Validate Gitea Actions workflow YAML files.

**Usage:**
```bash
./validate-workflows.sh
```

**Checks:**

- YAML syntax
- Required fields
- Action version pinning

---

### verify-binaries.sh

Verify binary integrity.

**Usage:**
```bash
./verify-binaries.sh <binary-path> [checksum-file]
```

---

## Signing Scripts (`scripts/sign/`)

### sign-signals.sh

Sign Signals artifacts with Cosign.

**Usage:**
```bash
./sign-signals.sh
```

**Environment Variables:**

| Variable | Description |
|----------|-------------|
| `COSIGN_KEY_FILE` | Path to signing key |
| `COSIGN_PRIVATE_KEY_B64` | Base64-encoded private key |
| `COSIGN_PASSWORD` | Key password |
| `COSIGN_ALLOW_DEV_KEY` | Allow development key (`1`) |
| `OUT_DIR` | Output directory |

**Key Resolution Order** (sketched below):

1. `COSIGN_KEY_FILE` environment variable
2. `COSIGN_PRIVATE_KEY_B64` environment variable (decoded)
3. `tools/cosign/cosign.key`
4. `tools/cosign/cosign.dev.key` (if `COSIGN_ALLOW_DEV_KEY=1`)
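
A sketch of that resolution logic (exit code 2 follows the missing-key convention above; the real script may differ in detail):

```bash
resolve_cosign_key() {
  if [ -n "${COSIGN_KEY_FILE:-}" ]; then
    echo "${COSIGN_KEY_FILE}"
  elif [ -n "${COSIGN_PRIVATE_KEY_B64:-}" ]; then
    local tmp; tmp="$(mktemp)"
    printf '%s' "${COSIGN_PRIVATE_KEY_B64}" | base64 -d > "${tmp}"
    echo "${tmp}"
  elif [ -f tools/cosign/cosign.key ]; then
    echo "tools/cosign/cosign.key"
  elif [ "${COSIGN_ALLOW_DEV_KEY:-0}" = "1" ] && [ -f tools/cosign/cosign.dev.key ]; then
    echo "tools/cosign/cosign.dev.key"
  else
    echo "[ERROR] no Cosign key available" >&2
    return 2
  fi
}
```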

**Signed Artifacts:**

- `confidence_decay_config.yaml`
- `unknowns_scoring_manifest.json`
- `heuristics.catalog.json`

**Output:**
```
evidence-locker/signals/{date}/
├── confidence_decay_config.sigstore.json
├── unknowns_scoring_manifest.sigstore.json
├── heuristics_catalog.sigstore.json
└── SHA256SUMS
```

---

### sign-policy.sh

Sign policy artifacts.

**Usage:**
```bash
./sign-policy.sh <policy-file>
```

---

### sign-authority-gaps.sh

Sign authority gap attestations.

**Usage:**
```bash
./sign-authority-gaps.sh
```

---

## Release Scripts (`scripts/release/`)

### build_release.py

Main release pipeline orchestration.

**Usage:**
```bash
python build_release.py --channel stable --version 2026.04
```

**Arguments:**

| Argument | Description |
|----------|-------------|
| `--channel` | Release channel (`stable`, `beta`, `nightly`) |
| `--version` | Version string |
| `--config` | Component config file |
| `--dry-run` | Don't push artifacts |

**Dependencies:**

- docker (with buildx)
- cosign
- helm
- npm/node
- dotnet SDK

---

### verify_release.py

Post-release verification.

**Usage:**
```bash
python verify_release.py --version 2026.04
```

---

### bump-service-version.py

Manage service versions in `Directory.Versions.props`.

**Usage:**
```bash
# Bump version
python bump-service-version.py --service scanner --bump minor

# Set explicit version
python bump-service-version.py --service scanner --version 2.0.0

# List versions
python bump-service-version.py --list
```

**Arguments:**

| Argument | Description |
|----------|-------------|
| `--service` | Service name (e.g., `scanner`, `authority`) |
| `--bump` | Bump type (`major`, `minor`, `patch`) |
| `--version` | Explicit version to set |
| `--list` | List all service versions |
| `--dry-run` | Don't write changes |

---

### read-service-version.sh

Read current service version.

**Usage:**
```bash
./read-service-version.sh scanner
```

**Output:**
```
1.2.3
```
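
How the lookup could work, assuming `Directory.Versions.props` stores versions as MSBuild properties named after the service, e.g. `<ScannerVersion>1.2.3</ScannerVersion>` (that naming is an assumption; check the actual props file):

```bash
# Hypothetical lookup: extract <ScannerVersion>...</ScannerVersion>
# from src/Directory.Versions.props for the service "scanner".
service="${1:?usage: read-service-version.sh <service>}"
prop="$(tr '[:lower:]' '[:upper:]' <<<"${service:0:1}")${service:1}Version"
sed -n "s:.*<${prop}>\(.*\)</${prop}>.*:\1:p" src/Directory.Versions.props
```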

---

### generate-docker-tag.sh

Generate Docker tag with datetime suffix.

**Usage:**
```bash
./generate-docker-tag.sh 1.2.3
```

**Output:**
```
1.2.3+20250128143022
```
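
The core of such a script is one line; a sketch (UTC keeps tags comparable across runners):

```bash
# Emit <version>+<UTC timestamp>, matching the documented tag format.
version="${1:?usage: generate-docker-tag.sh <semver>}"
echo "${version}+$(date -u +%Y%m%d%H%M%S)"
```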
|
||||
---
|
||||
|
||||
### generate_changelog.py
|
||||
|
||||
AI-assisted changelog generation.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
python generate_changelog.py --version 2026.04 --codename Nova
|
||||
```
|
||||
|
||||
**Environment Variables:**
|
||||
|
||||
| Variable | Description |
|
||||
|----------|-------------|
|
||||
| `AI_API_KEY` | AI service API key |
|
||||
| `AI_API_URL` | AI service endpoint (optional) |
|
||||
|
||||
**Features:**
|
||||
- Parses git commits since last release
|
||||
- Categorizes by type (Breaking, Security, Features, Fixes)
|
||||
- Groups by module
|
||||
- AI-assisted summary generation
|
||||
- Fallback to rule-based generation
|
||||
|
||||
---
|
||||
|
||||
### generate_suite_docs.py
|
||||
|
||||
Generate suite release documentation.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
python generate_suite_docs.py --version 2026.04 --codename Nova
|
||||
```
|
||||
|
||||
**Output:**
|
||||
```
|
||||
docs/releases/2026.04/
|
||||
├── README.md
|
||||
├── CHANGELOG.md
|
||||
├── services.md
|
||||
├── upgrade-guide.md
|
||||
├── checksums.txt
|
||||
└── manifest.yaml
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### generate_compose.py
|
||||
|
||||
Generate pinned Docker Compose files.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
python generate_compose.py --version 2026.04
|
||||
```
|
||||
|
||||
**Output:**
|
||||
- `docker-compose.yml` - Standard deployment
|
||||
- `docker-compose.airgap.yml` - Air-gapped deployment
|
||||
|
||||
---
|
||||
|
||||
### collect_versions.py
|
||||
|
||||
Collect service versions from `Directory.Versions.props`.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
python collect_versions.py --format json
|
||||
python collect_versions.py --format yaml
|
||||
python collect_versions.py --format markdown
|
||||
python collect_versions.py --format env
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### check_cli_parity.py
|
||||
|
||||
Verify CLI version parity across platforms.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
python check_cli_parity.py
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Evidence Scripts (`scripts/evidence/`)
|
||||
|
||||
### upload-all-evidence.sh
|
||||
|
||||
Upload all evidence bundles to Evidence Locker.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./upload-all-evidence.sh
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### signals-upload-evidence.sh
|
||||
|
||||
Upload Signals evidence.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./signals-upload-evidence.sh
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### zastava-upload-evidence.sh
|
||||
|
||||
Upload Zastava evidence.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./zastava-upload-evidence.sh
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Metrics Scripts (`scripts/metrics/`)
|
||||
|
||||
### compute-reachability-metrics.sh
|
||||
|
||||
Compute reachability analysis metrics.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./compute-reachability-metrics.sh
|
||||
```
|
||||
|
||||
**Output Metrics:**
|
||||
- Total functions analyzed
|
||||
- Reachable functions
|
||||
- Coverage percentage
|
||||
- Analysis duration
|
||||
|
||||
---
|
||||
|
||||
### compute-ttfs-metrics.sh
|
||||
|
||||
Compute Time-to-First-Scan metrics.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./compute-ttfs-metrics.sh
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### enforce-performance-slos.sh
|
||||
|
||||
Enforce performance SLOs.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./enforce-performance-slos.sh
|
||||
```
|
||||
|
||||
**Checked SLOs:**
|
||||
- Build time < 30 minutes
|
||||
- Test coverage > 80%
|
||||
- TTFS < 60 seconds
|
||||
|
||||
---
|
||||
|
||||
## Utility Scripts (`scripts/util/`)
|
||||
|
||||
### cleanup-runner-space.sh
|
||||
|
||||
Clean up runner disk space.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./cleanup-runner-space.sh
|
||||
```
|
||||
|
||||
**Actions:**
|
||||
- Remove Docker build cache
|
||||
- Clean NuGet cache
|
||||
- Remove old test results
|
||||
- Prune unused images
|
||||
|
||||
---
|
||||
|
||||
### dotnet-filter.sh
|
||||
|
||||
Filter .NET projects for selective builds.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./dotnet-filter.sh --changed
|
||||
./dotnet-filter.sh --module Scanner
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
### enable-openssl11-shim.sh
|
||||
|
||||
Enable OpenSSL 1.1 compatibility shim.
|
||||
|
||||
**Usage:**
|
||||
```bash
|
||||
./enable-openssl11-shim.sh
|
||||
```
|
||||
|
||||
**Purpose:**
|
||||
Required for certain cryptographic operations on newer Linux distributions that have removed OpenSSL 1.1.

---

## Script Development Guidelines

### Required Elements

1. **Shebang:**
```bash
#!/usr/bin/env bash
```

2. **Strict Mode:**
```bash
set -euo pipefail
```

3. **Sprint Reference:**
```bash
# DEVOPS-XXX-YY-ZZZ: Description
# Sprint: SPRINT_XXXX_XXXX_XXXX - Topic
```

4. **Usage Documentation:**
```bash
# Usage:
#   ./script.sh <required-arg> [optional-arg]
```

### Best Practices

1. **Use environment variables with defaults:**
```bash
CONFIG="${CONFIG:-Release}"
```

2. **Validate required tools:**
```bash
if ! command -v dotnet >/dev/null 2>&1; then
  echo "dotnet CLI not found" >&2
  exit 69
fi
```

3. **Use absolute paths:**
```bash
ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
```

4. **Handle cleanup:**
```bash
trap 'rm -f "$TMP_FILE"' EXIT
```

5. **Use logging functions:**
```bash
log_info() { echo "[INFO] $*"; }
log_error() { echo "[ERROR] $*" >&2; }
```
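
Putting the required elements and best practices together, a new script might start from a skeleton like this (the task ID, sprint reference, and usage line are placeholders):

```bash
#!/usr/bin/env bash
# DEVOPS-XXX-YY-ZZZ: Placeholder description
# Sprint: SPRINT_XXXX_XXXX_XXXX - Placeholder topic
#
# Usage:
#   ./example.sh <required-arg> [optional-arg]
set -euo pipefail

ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
CONFIG="${CONFIG:-Release}"
TMP_FILE="$(mktemp)"
trap 'rm -f "$TMP_FILE"' EXIT

log_info()  { echo "[INFO] $*"; }
log_error() { echo "[ERROR] $*" >&2; }

if ! command -v dotnet >/dev/null 2>&1; then
  log_error "dotnet CLI not found"
  exit 69
fi

log_info "Running with CONFIG=${CONFIG} from ${ROOT}"
```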
624
.gitea/docs/troubleshooting.md
Normal file
@@ -0,0 +1,624 @@
# CI/CD Troubleshooting Guide

Common issues and solutions for StellaOps CI/CD infrastructure.

## Quick Diagnostics

### Check Workflow Status

```bash
# View recent workflow runs
gh run list --limit 10

# View logs for a specific run
gh run view <run-id> --log

# Re-run a failed workflow
gh run rerun <run-id>
```

### Verify Local Environment

```bash
# Check .NET SDK
dotnet --list-sdks

# Check Docker
docker version
docker buildx version

# Check Node.js
node --version
npm --version

# Check required tools
which cosign syft helm
```

---

## Build Failures

### NuGet Restore Failures

**Symptom:** `error NU1301: Unable to load the service index`

**Causes:**
1. Network connectivity issues
2. NuGet source unavailable
3. Invalid credentials

**Solutions:**

```bash
# Clear the NuGet cache
dotnet nuget locals all --clear

# Check NuGet sources
dotnet nuget list source

# Restore with verbose logging
dotnet restore src/StellaOps.sln -v detailed
```

**In CI:**
```yaml
- name: Restore with retry
  run: |
    for i in {1..3}; do
      dotnet restore src/StellaOps.sln && break
      echo "Retry $i..."
      sleep 30
    done
```

---

### SDK Version Mismatch

**Symptom:** `error MSB4236: The SDK 'Microsoft.NET.Sdk' specified could not be found`

**Solutions:**

1. Check `global.json`:
```bash
cat global.json
```

2. Install the correct SDK:
```yaml
# CI environment
- uses: actions/setup-dotnet@v4
  with:
    dotnet-version: '10.0.100'
    include-prerelease: true
```

3. Remove the SDK version pin:
```bash
# Remove global.json so the installed SDK is used
rm global.json
```

---

### Docker Build Failures

**Symptom:** `failed to solve: rpc error: code = Unknown`

**Causes:**
1. Disk space exhausted
2. Layer cache corruption
3. Network timeout

**Solutions:**

```bash
# Clean the Docker system
docker system prune -af
docker builder prune -af

# Build without cache
docker build --no-cache -t myimage .

# Recreate the buildx builder on the host network (helps with network timeouts)
docker buildx create --driver-opt network=host --use
```

---

### Multi-arch Build Failures

**Symptom:** `exec format error` or QEMU issues

**Solutions:**

```bash
# Install QEMU for cross-platform builds
docker run --rm --privileged multiarch/qemu-user-static --reset -p yes

# Create a new buildx builder
docker buildx create --name multiarch --driver docker-container --use
docker buildx inspect --bootstrap

# Build for specific platforms
docker buildx build --platform linux/amd64 -t myimage .
```

---

## Test Failures

### Testcontainers Issues

**Symptom:** `Could not find a running Docker daemon`

**Solutions:**

1. Ensure Docker is running:
```bash
docker info
```

2. Set the Testcontainers host:
```bash
export TESTCONTAINERS_HOST_OVERRIDE=host.docker.internal
# or, on Linux
export TESTCONTAINERS_HOST_OVERRIDE=$(hostname -I | awk '{print $1}')
```

3. Keep the Ryuk cleanup container enabled:
```bash
export TESTCONTAINERS_RYUK_DISABLED=false
```

4. CI configuration:
```yaml
services:
  dind:
    image: docker:dind
    privileged: true
```

---

### PostgreSQL Test Failures

**Symptom:** `FATAL: role "postgres" does not exist`

**Solutions:**

1. Check the connection string:
```bash
export STELLAOPS_TEST_POSTGRES_CONNECTION="Host=localhost;Database=test;Username=postgres;Password=postgres"
```

2. Use Testcontainers PostgreSQL:
```csharp
var container = new PostgreSqlBuilder()
    .WithDatabase("test")
    .WithUsername("postgres")
    .WithPassword("postgres")
    .Build();
```

3. Wait for PostgreSQL readiness:
```bash
until pg_isready -h localhost -p 5432; do
  sleep 1
done
```

---

### Test Timeouts

**Symptom:** `Test exceeded timeout`

**Solutions:**

1. Increase the hang timeout:
```bash
dotnet test --blame-hang-timeout 10m
```

2. Limit build parallelism:
```bash
dotnet test -maxcpucount:2
```

3. Identify slow tests:
```bash
dotnet test --logger "console;verbosity=detailed" --logger "trx"
```

---

### Determinism Test Failures

**Symptom:** `Output mismatch: expected SHA256 differs`

**Solutions:**

1. Check for non-deterministic sources:
   - Timestamps
   - Random GUIDs
   - Floating-point operations
   - Dictionary ordering

2. Run the determinism comparison:
```bash
.gitea/scripts/test/determinism-run.sh
diff out/scanner-determinism/run1.json out/scanner-determinism/run2.json
```

3. Update golden fixtures:
```bash
.gitea/scripts/test/run-fixtures-check.sh --update
```
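
When the diff is dominated by known-volatile fields, masking them first can isolate real regressions (a sketch assuming JSON output and `jq` on the runner; the field names are illustrative):

```bash
# Strip volatile fields before comparing the two runs.
jq 'del(.generatedAt, .scanId)' out/scanner-determinism/run1.json > run1.norm.json
jq 'del(.generatedAt, .scanId)' out/scanner-determinism/run2.json > run2.norm.json
diff run1.norm.json run2.norm.json
```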

---

## Deployment Failures

### SSH Connection Issues

**Symptom:** `ssh: connect to host X.X.X.X port 22: Connection refused`

**Solutions:**

1. Verify the SSH key:
```bash
ssh-keygen -lf ~/.ssh/id_rsa.pub
```

2. Test the connection:
```bash
ssh -vvv user@host
```

3. Add the host to known_hosts:
```yaml
- name: Setup SSH
  run: |
    mkdir -p ~/.ssh
    ssh-keyscan -H ${{ secrets.DEPLOY_HOST }} >> ~/.ssh/known_hosts
```

---

### Registry Push Failures

**Symptom:** `unauthorized: authentication required`

**Solutions:**

1. Log in to the registry:
```bash
docker login git.stella-ops.org -u $REGISTRY_USERNAME -p $REGISTRY_PASSWORD
```

2. Check token permissions:
   - `write:packages` scope required
   - Token not expired

3. Use the login action:
```yaml
- name: Login to Registry
  uses: docker/login-action@v3
  with:
    registry: git.stella-ops.org
    username: ${{ secrets.REGISTRY_USERNAME }}
    password: ${{ secrets.REGISTRY_PASSWORD }}
```

---

### Helm Deployment Failures

**Symptom:** `Error: UPGRADE FAILED: cannot patch`

**Solutions:**

1. Check for resource conflicts:
```bash
kubectl get events -n stellaops --sort-by='.lastTimestamp'
```

2. Force the upgrade:
```bash
helm upgrade --install --force stellaops ./devops/helm/stellaops
```

3. Clean up a stuck release:
```bash
helm history stellaops
helm rollback stellaops <revision>
# or
kubectl delete secret -l name=stellaops,owner=helm
```

---

## Workflow Issues

### Workflow Not Triggering

**Symptom:** A push or PR does not trigger the workflow

**Causes:**
1. Path filter not matching
2. Branch protection rules
3. YAML syntax error

**Solutions:**

1. Check path filters:
```yaml
on:
  push:
    paths:
      - 'src/**'  # Check that changed files match
```

2. Validate the YAML:
```bash
.gitea/scripts/validate/validate-workflows.sh
```

3. Check branch rules:
   - Verify workflow permissions
   - Check protected branch settings

---

### Concurrency Issues

**Symptom:** Duplicate runs or stuck workflows

**Solutions:**

1. Add concurrency control:
```yaml
concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true
```

2. Cancel stale runs manually:
```bash
gh run cancel <run-id>
```

---

### Artifact Upload/Download Failures

**Symptom:** `Unable to find any artifacts`

**Solutions:**

1. Check that artifact names match:
```yaml
# Upload
- uses: actions/upload-artifact@v4
  with:
    name: my-artifact  # Must match

# Download
- uses: actions/download-artifact@v4
  with:
    name: my-artifact  # Must match
```

2. Check the retention period:
```yaml
- uses: actions/upload-artifact@v4
  with:
    retention-days: 90  # Default is 90
```

3. Verify job dependencies:
```yaml
download-job:
  needs: [upload-job]  # Must complete first
```

---

## Runner Issues

### Disk Space Exhausted

**Symptom:** `No space left on device`

**Solutions:**

1. Run the cleanup script:
```bash
.gitea/scripts/util/cleanup-runner-space.sh
```

2. Add a cleanup step to the workflow:
```yaml
- name: Free disk space
  run: |
    docker system prune -af
    rm -rf /tmp/*
    df -h
```

3. Use a larger runner:
```yaml
runs-on: ubuntu-latest-4xlarge
```

---

### Out of Memory

**Symptom:** `Killed` or `OOMKilled`

**Solutions:**

1. Limit parallel jobs:
```yaml
strategy:
  max-parallel: 2
```

2. Limit the .NET heap:
```bash
export DOTNET_GCHeapHardLimit=0x40000000  # 1 GB
```

3. Add swap space:
```yaml
- name: Create swap
  run: |
    sudo fallocate -l 4G /swapfile
    sudo chmod 600 /swapfile
    sudo mkswap /swapfile
    sudo swapon /swapfile
```

---

### Runner Not Picking Up Jobs

**Symptom:** Jobs stuck in the `queued` state

**Solutions:**

1. Check the runner status:
```bash
# Self-hosted runner
./run.sh --check
```

2. Verify that labels match:
```yaml
runs-on: [self-hosted, linux, x64]  # All labels must match
```

3. Restart the runner service:
```bash
sudo systemctl restart actions.runner.*.service
```

---

## Signing & Attestation Issues

### Cosign Signing Failures

**Symptom:** `error opening key: no such file`

**Solutions:**

1. Check the key configuration:
```bash
# Decode the key from a base64 secret
echo "$COSIGN_PRIVATE_KEY_B64" | base64 -d > cosign.key

# Verify the key
cosign public-key --key cosign.key
```

2. Set the key password:
```bash
export COSIGN_PASSWORD="${{ secrets.COSIGN_PASSWORD }}"
```

3. Use keyless signing:
```yaml
- name: Sign with keyless
  env:
    COSIGN_EXPERIMENTAL: 1
  run: cosign sign --yes $IMAGE
```

---

### SBOM Generation Failures

**Symptom:** `syft: command not found`

**Solutions:**

1. Install Syft:
```bash
curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | sh -s -- -b /usr/local/bin
```

2. Use the packaged action:
```yaml
- name: Generate SBOM
  uses: anchore/sbom-action@v0
  with:
    image: ${{ env.IMAGE }}
```

---

## Debugging Tips

### Enable Debug Logging

```yaml
env:
  ACTIONS_STEP_DEBUG: true
  ACTIONS_RUNNER_DEBUG: true
```

### SSH into Runner

```yaml
- name: Debug SSH
  uses: mxschmitt/action-tmate@v3
  if: failure()
```

### Collect Diagnostic Info

```yaml
- name: Diagnostics
  if: failure()
  run: |
    echo "=== Environment ==="
    env | sort
    echo "=== Disk ==="
    df -h
    echo "=== Memory ==="
    free -m
    echo "=== Docker ==="
    docker info
    docker ps -a
```

### View Workflow Logs

```bash
# Stream logs for a run
gh run watch <run-id>

# Download a "logs" artifact, if the workflow uploads one
gh run download <run-id> --name logs
```

---

## Getting Help

1. **Check existing issues:** Search repository issues
2. **Review workflow history:** Look for similar failures
3. **Consult documentation:** `docs/` and `.gitea/docs/`
4. **Contact DevOps:** Create an issue with the label `ci-cd`

### Information to Include

- Workflow name and run ID
- Error message and stack trace
- Steps to reproduce
- Environment details (OS, SDK versions)
- Recent changes to affected code
350
.gitea/scripts/release/bump-service-version.py
Normal file
@@ -0,0 +1,350 @@
#!/usr/bin/env python3
"""
bump-service-version.py - Bump service version in centralized version storage

Sprint: CI/CD Enhancement - Per-Service Auto-Versioning
This script manages service versions stored in src/Directory.Versions.props
and devops/releases/service-versions.json.

Usage:
    python bump-service-version.py <service> <bump-type> [options]
    python bump-service-version.py authority patch
    python bump-service-version.py scanner minor --dry-run
    python bump-service-version.py cli major --commit

Arguments:
    service     Service name (authority, attestor, concelier, scanner, etc.)
    bump-type   Version bump type: major, minor, patch, or explicit version (e.g., 2.0.0)

Options:
    --dry-run         Show what would be changed without modifying files
    --commit          Commit changes to git after updating
    --no-manifest     Skip updating service-versions.json manifest
    --git-sha SHA     Git SHA to record in manifest (defaults to HEAD)
    --docker-tag TAG  Docker tag to record in manifest
"""

import argparse
import json
import os
import re
import subprocess
import sys
from datetime import datetime, timezone
from pathlib import Path
from typing import Optional, Tuple

# Repository paths
SCRIPT_DIR = Path(__file__).parent
REPO_ROOT = SCRIPT_DIR.parent.parent.parent
VERSIONS_FILE = REPO_ROOT / "src" / "Directory.Versions.props"
MANIFEST_FILE = REPO_ROOT / "devops" / "releases" / "service-versions.json"

# Service name mapping (lowercase key -> property suffix)
SERVICE_MAP = {
    "authority": "Authority",
    "attestor": "Attestor",
    "concelier": "Concelier",
    "scanner": "Scanner",
    "policy": "Policy",
    "signer": "Signer",
    "excititor": "Excititor",
    "gateway": "Gateway",
    "scheduler": "Scheduler",
    "cli": "Cli",
    "orchestrator": "Orchestrator",
    "notify": "Notify",
    "sbomservice": "SbomService",
    "vexhub": "VexHub",
    "evidencelocker": "EvidenceLocker",
}


def parse_version(version_str: str) -> Tuple[int, int, int]:
    """Parse semantic version string into tuple."""
    match = re.match(r"^(\d+)\.(\d+)\.(\d+)$", version_str)
    if not match:
        raise ValueError(f"Invalid version format: {version_str}")
    return int(match.group(1)), int(match.group(2)), int(match.group(3))


def format_version(major: int, minor: int, patch: int) -> str:
    """Format version tuple as string."""
    return f"{major}.{minor}.{patch}"


def bump_version(current: str, bump_type: str) -> str:
    """Bump version according to bump type."""
    # Check if bump_type is an explicit version
    if re.match(r"^\d+\.\d+\.\d+$", bump_type):
        return bump_type

    major, minor, patch = parse_version(current)

    if bump_type == "major":
        return format_version(major + 1, 0, 0)
    elif bump_type == "minor":
        return format_version(major, minor + 1, 0)
    elif bump_type == "patch":
        return format_version(major, minor, patch + 1)
    else:
        raise ValueError(f"Invalid bump type: {bump_type}")


def read_version_from_props(service_key: str) -> Optional[str]:
    """Read current version from Directory.Versions.props."""
    if not VERSIONS_FILE.exists():
        return None

    property_name = f"StellaOps{SERVICE_MAP[service_key]}Version"
    pattern = rf"<{property_name}>(\d+\.\d+\.\d+)</{property_name}>"

    content = VERSIONS_FILE.read_text(encoding="utf-8")
    match = re.search(pattern, content)
    return match.group(1) if match else None


def update_version_in_props(service_key: str, new_version: str, dry_run: bool = False) -> bool:
    """Update version in Directory.Versions.props."""
    if not VERSIONS_FILE.exists():
        print(f"Error: {VERSIONS_FILE} not found", file=sys.stderr)
        return False

    property_name = f"StellaOps{SERVICE_MAP[service_key]}Version"
    pattern = rf"(<{property_name}>)\d+\.\d+\.\d+(</{property_name}>)"
    replacement = rf"\g<1>{new_version}\g<2>"

    content = VERSIONS_FILE.read_text(encoding="utf-8")
    new_content, count = re.subn(pattern, replacement, content)

    if count == 0:
        print(f"Error: Property {property_name} not found in {VERSIONS_FILE}", file=sys.stderr)
        return False

    if dry_run:
        print(f"[DRY-RUN] Would update {VERSIONS_FILE}")
        print(f"[DRY-RUN] {property_name}: {new_version}")
    else:
        VERSIONS_FILE.write_text(new_content, encoding="utf-8")
        print(f"Updated {VERSIONS_FILE}")
        print(f" {property_name}: {new_version}")

    return True


def update_manifest(
    service_key: str,
    new_version: str,
    git_sha: Optional[str] = None,
    docker_tag: Optional[str] = None,
    dry_run: bool = False,
) -> bool:
    """Update service-versions.json manifest."""
    if not MANIFEST_FILE.exists():
        print(f"Warning: {MANIFEST_FILE} not found, skipping manifest update", file=sys.stderr)
        return True

    try:
        manifest = json.loads(MANIFEST_FILE.read_text(encoding="utf-8"))
    except json.JSONDecodeError as e:
        print(f"Error parsing {MANIFEST_FILE}: {e}", file=sys.stderr)
        return False

    if service_key not in manifest.get("services", {}):
        print(f"Warning: Service '{service_key}' not found in manifest", file=sys.stderr)
        return True

    # Update service entry
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    service = manifest["services"][service_key]
    service["version"] = new_version
    service["releasedAt"] = now

    if git_sha:
        service["gitSha"] = git_sha
    if docker_tag:
        service["dockerTag"] = docker_tag

    # Update manifest timestamp
    manifest["lastUpdated"] = now

    if dry_run:
        print(f"[DRY-RUN] Would update {MANIFEST_FILE}")
        print(f"[DRY-RUN] {service_key}.version: {new_version}")
        if docker_tag:
            print(f"[DRY-RUN] {service_key}.dockerTag: {docker_tag}")
    else:
        MANIFEST_FILE.write_text(
            json.dumps(manifest, indent=2, ensure_ascii=False) + "\n",
            encoding="utf-8",
        )
        print(f"Updated {MANIFEST_FILE}")

    return True


def get_git_sha() -> Optional[str]:
    """Get current git HEAD SHA."""
    try:
        result = subprocess.run(
            ["git", "rev-parse", "HEAD"],
            capture_output=True,
            text=True,
            cwd=REPO_ROOT,
            check=True,
        )
        return result.stdout.strip()[:12]  # Short SHA
    except subprocess.CalledProcessError:
        return None


def commit_changes(service_key: str, old_version: str, new_version: str) -> bool:
    """Commit version changes to git."""
    try:
        # Stage the files
        subprocess.run(
            ["git", "add", str(VERSIONS_FILE), str(MANIFEST_FILE)],
            cwd=REPO_ROOT,
            check=True,
        )

        # Create commit
        commit_msg = f"""chore({service_key}): bump version {old_version} -> {new_version}

Automated version bump via bump-service-version.py

Co-Authored-By: github-actions[bot] <github-actions[bot]@users.noreply.github.com>"""

        subprocess.run(
            ["git", "commit", "-m", commit_msg],
            cwd=REPO_ROOT,
            check=True,
        )
        print(f"Committed version bump: {old_version} -> {new_version}")
        return True
    except subprocess.CalledProcessError as e:
        print(f"Error committing changes: {e}", file=sys.stderr)
        return False


def generate_docker_tag(version: str) -> str:
    """Generate Docker tag with datetime suffix: {version}+{YYYYMMDDHHmmss}."""
    timestamp = datetime.now(timezone.utc).strftime("%Y%m%d%H%M%S")
    return f"{version}+{timestamp}"


def main():
    parser = argparse.ArgumentParser(
        description="Bump service version in centralized version storage",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog="""
Examples:
  %(prog)s authority patch            # Bump authority from 1.0.0 to 1.0.1
  %(prog)s scanner minor --dry-run    # Preview bumping scanner minor version
  %(prog)s cli 2.0.0 --commit         # Set CLI to 2.0.0 and commit
  %(prog)s gateway patch --docker-tag # Bump and generate docker tag
""",
    )

    parser.add_argument(
        "service",
        choices=list(SERVICE_MAP.keys()),
        help="Service name to bump",
    )
    parser.add_argument(
        "bump_type",
        help="Bump type: major, minor, patch, or explicit version (e.g., 2.0.0)",
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Show what would be changed without modifying files",
    )
    parser.add_argument(
        "--commit",
        action="store_true",
        help="Commit changes to git after updating",
    )
    parser.add_argument(
        "--no-manifest",
        action="store_true",
        help="Skip updating service-versions.json manifest",
    )
    parser.add_argument(
        "--git-sha",
        help="Git SHA to record in manifest (defaults to HEAD)",
    )
    parser.add_argument(
        "--docker-tag",
        nargs="?",
        const="auto",
        help="Docker tag to record in manifest (use 'auto' to generate)",
    )
    parser.add_argument(
        "--output-version",
        action="store_true",
        help="Output only the new version (for CI scripts)",
    )
    parser.add_argument(
        "--output-docker-tag",
        action="store_true",
        help="Output only the docker tag (for CI scripts)",
    )

    args = parser.parse_args()

    # Read current version
    current_version = read_version_from_props(args.service)
    if not current_version:
        print(f"Error: Could not read current version for {args.service}", file=sys.stderr)
        sys.exit(1)

    # Calculate new version
    try:
        new_version = bump_version(current_version, args.bump_type)
    except ValueError as e:
        print(f"Error: {e}", file=sys.stderr)
        sys.exit(1)

    # Generate docker tag if requested
    docker_tag = None
    if args.docker_tag:
        docker_tag = generate_docker_tag(new_version) if args.docker_tag == "auto" else args.docker_tag

    # Output mode for CI scripts
    if args.output_version:
        print(new_version)
        sys.exit(0)
    if args.output_docker_tag:
        print(docker_tag or generate_docker_tag(new_version))
        sys.exit(0)

    # Print summary
    print(f"Service: {args.service}")
    print(f"Current version: {current_version}")
    print(f"New version: {new_version}")
    if docker_tag:
        print(f"Docker tag: {docker_tag}")
    print()

    # Update version in props file
    if not update_version_in_props(args.service, new_version, args.dry_run):
        sys.exit(1)

    # Update manifest if not skipped
    if not args.no_manifest:
        git_sha = args.git_sha or get_git_sha()
        if not update_manifest(args.service, new_version, git_sha, docker_tag, args.dry_run):
            sys.exit(1)

    # Commit if requested
    if args.commit and not args.dry_run:
        if not commit_changes(args.service, current_version, new_version):
            sys.exit(1)

    print()
    print(f"Successfully bumped {args.service}: {current_version} -> {new_version}")


if __name__ == "__main__":
    main()
259
.gitea/scripts/release/collect_versions.py
Normal file
@@ -0,0 +1,259 @@
#!/usr/bin/env python3
"""
collect_versions.py - Collect service versions for suite release

Sprint: CI/CD Enhancement - Suite Release Pipeline
Gathers all service versions from Directory.Versions.props and service-versions.json.

Usage:
    python collect_versions.py [options]
    python collect_versions.py --format json
    python collect_versions.py --format yaml --output versions.yaml

Options:
    --format FMT           Output format: json, yaml, markdown, env (default: json)
    --output FILE          Output file (defaults to stdout)
    --include-unreleased   Include services with no Docker tag
    --registry URL         Container registry URL
"""

import argparse
import json
import os
import re
import sys
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from pathlib import Path
from typing import Dict, List, Optional

# Repository paths
SCRIPT_DIR = Path(__file__).parent
REPO_ROOT = SCRIPT_DIR.parent.parent.parent
VERSIONS_FILE = REPO_ROOT / "src" / "Directory.Versions.props"
MANIFEST_FILE = REPO_ROOT / "devops" / "releases" / "service-versions.json"

# Default registry
DEFAULT_REGISTRY = "git.stella-ops.org/stella-ops.org"


@dataclass
class ServiceVersion:
    name: str
    version: str
    docker_tag: Optional[str] = None
    released_at: Optional[str] = None
    git_sha: Optional[str] = None
    image: Optional[str] = None


def read_versions_from_props() -> Dict[str, str]:
    """Read versions from Directory.Versions.props."""
    if not VERSIONS_FILE.exists():
        print(f"Warning: {VERSIONS_FILE} not found", file=sys.stderr)
        return {}

    content = VERSIONS_FILE.read_text(encoding="utf-8")
    versions = {}

    # Pattern: <StellaOps{Service}Version>X.Y.Z</StellaOps{Service}Version>
    pattern = r"<StellaOps(\w+)Version>(\d+\.\d+\.\d+)</StellaOps\1Version>"

    for match in re.finditer(pattern, content):
        service_name = match.group(1)
        version = match.group(2)
        versions[service_name.lower()] = version

    return versions


def read_manifest() -> Dict[str, dict]:
    """Read service metadata from manifest file."""
    if not MANIFEST_FILE.exists():
        print(f"Warning: {MANIFEST_FILE} not found", file=sys.stderr)
        return {}

    try:
        manifest = json.loads(MANIFEST_FILE.read_text(encoding="utf-8"))
        return manifest.get("services", {})
    except json.JSONDecodeError as e:
        print(f"Warning: Failed to parse {MANIFEST_FILE}: {e}", file=sys.stderr)
        return {}


def collect_all_versions(
    registry: str = DEFAULT_REGISTRY,
    include_unreleased: bool = False,
) -> List[ServiceVersion]:
    """Collect all service versions."""
    props_versions = read_versions_from_props()
    manifest_services = read_manifest()

    services = []

    # Merge data from both sources
    all_service_keys = set(props_versions.keys()) | set(manifest_services.keys())

    for key in sorted(all_service_keys):
        version = props_versions.get(key, "0.0.0")
        manifest = manifest_services.get(key, {})

        docker_tag = manifest.get("dockerTag")
        released_at = manifest.get("releasedAt")
        git_sha = manifest.get("gitSha")

        # Skip unreleased if not requested
        if not include_unreleased and not docker_tag:
            continue

        # Build image reference
        if docker_tag:
            image = f"{registry}/{key}:{docker_tag}"
        else:
            image = f"{registry}/{key}:{version}"

        service = ServiceVersion(
            name=manifest.get("name", key.title()),
            version=version,
            docker_tag=docker_tag,
            released_at=released_at,
            git_sha=git_sha,
            image=image,
        )

        services.append(service)

    return services


def format_json(services: List[ServiceVersion]) -> str:
    """Format as JSON."""
    data = {
        "generatedAt": datetime.now(timezone.utc).isoformat(),
        "services": [asdict(s) for s in services],
    }
    return json.dumps(data, indent=2, ensure_ascii=False)


def format_yaml(services: List[ServiceVersion]) -> str:
    """Format as YAML."""
    lines = [
        "# Service Versions",
        f"# Generated: {datetime.now(timezone.utc).isoformat()}",
        "",
        "services:",
    ]

    for s in services:
        lines.extend([
            f"  {s.name.lower()}:",
            f"    name: {s.name}",
            f"    version: \"{s.version}\"",
        ])
        if s.docker_tag:
            lines.append(f"    dockerTag: \"{s.docker_tag}\"")
        if s.image:
            lines.append(f"    image: \"{s.image}\"")
        if s.released_at:
            lines.append(f"    releasedAt: \"{s.released_at}\"")
        if s.git_sha:
            lines.append(f"    gitSha: \"{s.git_sha}\"")

    return "\n".join(lines)


def format_markdown(services: List[ServiceVersion]) -> str:
    """Format as Markdown table."""
    lines = [
        "# Service Versions",
        "",
        f"Generated: {datetime.now(timezone.utc).strftime('%Y-%m-%d %H:%M:%S UTC')}",
        "",
        "| Service | Version | Docker Tag | Released |",
        "|---------|---------|------------|----------|",
    ]

    for s in services:
        released = s.released_at[:10] if s.released_at else "-"
        docker_tag = f"`{s.docker_tag}`" if s.docker_tag else "-"
        lines.append(f"| {s.name} | {s.version} | {docker_tag} | {released} |")

    return "\n".join(lines)


def format_env(services: List[ServiceVersion]) -> str:
    """Format as environment variables."""
    lines = [
        "# Service Versions as Environment Variables",
        f"# Generated: {datetime.now(timezone.utc).isoformat()}",
        "",
    ]

    for s in services:
        name_upper = s.name.upper().replace(" ", "_")
        lines.append(f"STELLAOPS_{name_upper}_VERSION={s.version}")
        if s.docker_tag:
            lines.append(f"STELLAOPS_{name_upper}_DOCKER_TAG={s.docker_tag}")
        if s.image:
            lines.append(f"STELLAOPS_{name_upper}_IMAGE={s.image}")

    return "\n".join(lines)


def main():
    parser = argparse.ArgumentParser(
        description="Collect service versions for suite release",
    )

    parser.add_argument(
        "--format",
        choices=["json", "yaml", "markdown", "env"],
        default="json",
        help="Output format",
    )
    parser.add_argument("--output", "-o", help="Output file")
    parser.add_argument(
        "--include-unreleased",
        action="store_true",
        help="Include services without Docker tags",
    )
    parser.add_argument(
        "--registry",
        default=DEFAULT_REGISTRY,
        help="Container registry URL",
    )

    args = parser.parse_args()

    # Collect versions
    services = collect_all_versions(
        registry=args.registry,
        include_unreleased=args.include_unreleased,
    )

    if not services:
        print("No services found", file=sys.stderr)
        if not args.include_unreleased:
            print("Hint: Use --include-unreleased to show all services", file=sys.stderr)
        sys.exit(0)

    # Format output
    formatters = {
        "json": format_json,
        "yaml": format_yaml,
        "markdown": format_markdown,
        "env": format_env,
    }

    output = formatters[args.format](services)

    # Write output
    if args.output:
        Path(args.output).write_text(output, encoding="utf-8")
        print(f"Versions written to: {args.output}", file=sys.stderr)
    else:
        print(output)


if __name__ == "__main__":
    main()
130
.gitea/scripts/release/generate-docker-tag.sh
Normal file
@@ -0,0 +1,130 @@
#!/bin/bash
# generate-docker-tag.sh - Generate Docker tag with datetime suffix
#
# Sprint: CI/CD Enhancement - Per-Service Auto-Versioning
# Generates Docker tags in format: {semver}+{YYYYMMDDHHmmss}
#
# Usage:
#   ./generate-docker-tag.sh <service>
#   ./generate-docker-tag.sh --version <version>
#   ./generate-docker-tag.sh authority
#   ./generate-docker-tag.sh --version 1.2.3
#
# Output:
#   Prints the Docker tag to stdout (e.g., "1.2.3+20250128143022")
#   Exit code 0 on success, 1 on error

set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"

usage() {
    cat << EOF
Usage: $(basename "$0") <service|--version VERSION>

Generate Docker tag with datetime suffix.

Format:  {semver}+{YYYYMMDDHHmmss}
Example: 1.2.3+20250128143022

Arguments:
  service            Service name to read version from
  --version VERSION  Use explicit version instead of reading from file

Options:
  --timestamp TS     Use explicit timestamp (YYYYMMDDHHmmss format)
  --output-parts     Output version and timestamp separately (JSON)
  --help, -h         Show this help message

Examples:
  $(basename "$0") authority        # 1.0.0+20250128143022
  $(basename "$0") --version 2.0.0  # 2.0.0+20250128143022
  $(basename "$0") scanner --timestamp 20250101120000
  $(basename "$0") --version 1.0.0 --output-parts
EOF
}

# Generate timestamp in UTC
generate_timestamp() {
    date -u +"%Y%m%d%H%M%S"
}

main() {
    local version=""
    local timestamp=""
    local output_parts=false
    local service=""

    while [[ $# -gt 0 ]]; do
        case "$1" in
            --help|-h)
                usage
                exit 0
                ;;
            --version)
                version="$2"
                shift 2
                ;;
            --timestamp)
                timestamp="$2"
                shift 2
                ;;
            --output-parts)
                output_parts=true
                shift
                ;;
            -*)
                echo "Error: Unknown option: $1" >&2
                usage
                exit 1
                ;;
            *)
                service="$1"
                shift
                ;;
        esac
    done

    # Get version from service if not explicitly provided
    if [[ -z "$version" ]]; then
        if [[ -z "$service" ]]; then
            echo "Error: Either service name or --version must be provided" >&2
            usage
            exit 1
        fi

        # Read version using read-service-version.sh
        if [[ ! -x "${SCRIPT_DIR}/read-service-version.sh" ]]; then
            echo "Error: read-service-version.sh not found or not executable" >&2
            exit 1
        fi

        version=$("${SCRIPT_DIR}/read-service-version.sh" "$service")
    fi

    # Validate version format
    if ! [[ "$version" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
        echo "Error: Invalid version format: $version (expected: X.Y.Z)" >&2
        exit 1
    fi

    # Generate timestamp if not provided
    if [[ -z "$timestamp" ]]; then
        timestamp=$(generate_timestamp)
    fi

    # Validate timestamp format
    if ! [[ "$timestamp" =~ ^[0-9]{14}$ ]]; then
        echo "Error: Invalid timestamp format: $timestamp (expected: YYYYMMDDHHmmss)" >&2
        exit 1
    fi

    # Output
    if [[ "$output_parts" == "true" ]]; then
        echo "{\"version\":\"$version\",\"timestamp\":\"$timestamp\",\"tag\":\"${version}+${timestamp}\"}"
    else
        echo "${version}+${timestamp}"
    fi
}

main "$@"
448
.gitea/scripts/release/generate_changelog.py
Normal file
@@ -0,0 +1,448 @@
#!/usr/bin/env python3
"""
generate_changelog.py - AI-assisted changelog generation for suite releases

Sprint: CI/CD Enhancement - Suite Release Pipeline
Generates changelogs from git commit history with optional AI enhancement.

Usage:
    python generate_changelog.py <version> [options]
    python generate_changelog.py 2026.04 --codename Nova
    python generate_changelog.py 2026.04 --from-tag suite-2025.10 --ai

Arguments:
    version     Suite version (YYYY.MM format)

Options:
    --codename NAME   Release codename
    --from-tag TAG    Previous release tag (defaults to latest suite-* tag)
    --to-ref REF      End reference (defaults to HEAD)
    --ai              Use AI to enhance changelog descriptions
    --output FILE     Output file (defaults to stdout)
    --format FMT      Output format: markdown, json (default: markdown)
"""

import argparse
import json
import os
import re
import subprocess
import sys
from dataclasses import dataclass, field
from datetime import datetime, timezone
from pathlib import Path
from typing import Dict, List, Optional, Tuple
from collections import defaultdict

# Repository paths
SCRIPT_DIR = Path(__file__).parent
REPO_ROOT = SCRIPT_DIR.parent.parent.parent

# Module patterns for categorization
MODULE_PATTERNS = {
    "Authority": r"src/Authority/",
    "Attestor": r"src/Attestor/",
    "Concelier": r"src/Concelier/",
    "Scanner": r"src/Scanner/",
    "Policy": r"src/Policy/",
    "Signer": r"src/Signer/",
    "Excititor": r"src/Excititor/",
    "Gateway": r"src/Gateway/",
    "Scheduler": r"src/Scheduler/",
    "CLI": r"src/Cli/",
    "Orchestrator": r"src/Orchestrator/",
    "Notify": r"src/Notify/",
    "Infrastructure": r"(devops/|\.gitea/|docs/)",
    "Core": r"src/__Libraries/",
}

# Commit type patterns (conventional commits)
COMMIT_TYPE_PATTERNS = {
    "breaking": r"^(feat|fix|refactor)(\(.+\))?!:|BREAKING CHANGE:",
    "security": r"^(security|fix)(\(.+\))?:|CVE-|vulnerability|exploit",
    "feature": r"^feat(\(.+\))?:",
    "fix": r"^fix(\(.+\))?:",
    "performance": r"^perf(\(.+\))?:|performance|optimize",
    "refactor": r"^refactor(\(.+\))?:",
    "docs": r"^docs(\(.+\))?:",
    "test": r"^test(\(.+\))?:",
    "chore": r"^chore(\(.+\))?:|^ci(\(.+\))?:|^build(\(.+\))?:",
}


@dataclass
class Commit:
    sha: str
    short_sha: str
    message: str
    body: str
    author: str
    date: str
    files: List[str] = field(default_factory=list)
    type: str = "other"
    module: str = "Other"
    scope: str = ""


@dataclass
class ChangelogEntry:
    description: str
    commits: List[Commit]
    module: str
    type: str


def run_git(args: List[str], cwd: Path = REPO_ROOT) -> str:
    """Run git command and return output."""
    result = subprocess.run(
        ["git"] + args,
        capture_output=True,
        text=True,
        cwd=cwd,
    )
    if result.returncode != 0:
        raise RuntimeError(f"Git command failed: {result.stderr}")
    return result.stdout.strip()


def get_latest_suite_tag() -> Optional[str]:
    """Get the most recent suite-* tag."""
    try:
        output = run_git(["tag", "-l", "suite-*", "--sort=-creatordate"])
        tags = output.split("\n")
        return tags[0] if tags and tags[0] else None
    except RuntimeError:
        return None


def get_commits_between(from_ref: str, to_ref: str = "HEAD") -> List[Commit]:
    """Get commits between two refs."""
    # Format: sha|short_sha|subject|body|author|date
    format_str = "%H|%h|%s|%b|%an|%aI"
    separator = "---COMMIT_SEPARATOR---"
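    # The separator leads each record so that splitting on it yields
    # "<info line>\n\n<changed files>" per commit. Caveat: commits with
    # multi-line bodies break the single-line info assumption and are
    # skipped by the length check below.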
    try:
        output = run_git([
            "log",
            f"{from_ref}..{to_ref}",
            f"--format={separator}{format_str}",
            "--name-only",
        ])
    except RuntimeError:
        # If from_ref doesn't exist, get all commits up to to_ref
        output = run_git([
            "log",
            to_ref,
            "-100",  # Limit to last 100 commits
            f"--format={separator}{format_str}",
            "--name-only",
        ])

    commits = []
    entries = output.split(separator)

    for entry in entries:
        entry = entry.strip()
        if not entry:
            continue

        lines = entry.split("\n")
        if not lines:
            continue

        # Parse commit info
        parts = lines[0].split("|")
        if len(parts) < 6:
            continue

        # Get changed files (remaining lines after commit info)
        files = [f.strip() for f in lines[1:] if f.strip()]

        commit = Commit(
            sha=parts[0],
            short_sha=parts[1],
            message=parts[2],
            body=parts[3] if len(parts) > 3 else "",
            author=parts[4] if len(parts) > 4 else "",
            date=parts[5] if len(parts) > 5 else "",
            files=files,
        )

        # Categorize commit
        commit.type = categorize_commit_type(commit.message)
        commit.module = categorize_commit_module(commit.files, commit.message)
        commit.scope = extract_scope(commit.message)

        commits.append(commit)

    return commits


def categorize_commit_type(message: str) -> str:
    """Categorize commit by type based on message."""
    for commit_type, pattern in COMMIT_TYPE_PATTERNS.items():
        if re.search(pattern, message, re.IGNORECASE):
            return commit_type

    return "other"


def categorize_commit_module(files: List[str], message: str) -> str:
    """Categorize commit by module based on changed files."""
    module_counts: Dict[str, int] = defaultdict(int)

    for file in files:
        for module, pattern in MODULE_PATTERNS.items():
            if re.search(pattern, file):
                module_counts[module] += 1
                break

    if module_counts:
        return max(module_counts, key=module_counts.get)

    # Try to extract from message scope
    scope_match = re.match(r"^\w+\((\w+)\):", message)
    if scope_match:
        scope = scope_match.group(1).lower()
        for module in MODULE_PATTERNS:
            if module.lower() == scope:
                return module

    return "Other"


def extract_scope(message: str) -> str:
    """Extract scope from conventional commit message."""
    match = re.match(r"^\w+\(([^)]+)\):", message)
    return match.group(1) if match else ""


def group_commits_by_type_and_module(
    commits: List[Commit],
) -> Dict[str, Dict[str, List[Commit]]]:
    """Group commits by type and module."""
    grouped: Dict[str, Dict[str, List[Commit]]] = defaultdict(lambda: defaultdict(list))

    for commit in commits:
        grouped[commit.type][commit.module].append(commit)

    return grouped


def generate_markdown_changelog(
    version: str,
    codename: str,
    commits: List[Commit],
    ai_enhanced: bool = False,
) -> str:
    """Generate markdown changelog."""
    grouped = group_commits_by_type_and_module(commits)

    lines = [
        f"# Changelog - StellaOps {version} \"{codename}\"",
        "",
        f"Release Date: {datetime.now(timezone.utc).strftime('%Y-%m-%d')}",
        "",
    ]

    # Order of sections
    section_order = [
        ("breaking", "Breaking Changes"),
        ("security", "Security"),
        ("feature", "Features"),
        ("fix", "Bug Fixes"),
        ("performance", "Performance"),
        ("refactor", "Refactoring"),
        ("docs", "Documentation"),
        ("other", "Other Changes"),
    ]

    for type_key, section_title in section_order:
        if type_key not in grouped:
            continue

        modules = grouped[type_key]
        if not modules:
            continue

        lines.append(f"## {section_title}")
        lines.append("")

        # Sort modules alphabetically
        for module in sorted(modules.keys()):
            commits_in_module = modules[module]
            if not commits_in_module:
                continue

            lines.append(f"### {module}")
            lines.append("")

            for commit in commits_in_module:
                # Clean up message
                msg = commit.message
                # Remove conventional commit prefix for display
                msg = re.sub(r"^\w+(\([^)]+\))?[!]?:\s*", "", msg)

                if ai_enhanced:
                    # Placeholder for AI-enhanced description
                    lines.append(f"- {msg} ([{commit.short_sha}])")
                else:
                    lines.append(f"- {msg} (`{commit.short_sha}`)")

            lines.append("")

    # Add statistics
    lines.extend([
        "---",
        "",
        "## Statistics",
        "",
        f"- **Total Commits:** {len(commits)}",
        f"- **Contributors:** {len(set(c.author for c in commits))}",
        f"- **Files Changed:** {len(set(f for c in commits for f in c.files))}",
        "",
    ])

    return "\n".join(lines)


def generate_json_changelog(
    version: str,
    codename: str,
    commits: List[Commit],
) -> str:
    """Generate JSON changelog."""
    grouped = group_commits_by_type_and_module(commits)

    changelog = {
        "version": version,
        "codename": codename,
        "date": datetime.now(timezone.utc).isoformat(),
        "statistics": {
            "totalCommits": len(commits),
            "contributors": len(set(c.author for c in commits)),
            "filesChanged": len(set(f for c in commits for f in c.files)),
        },
        "sections": {},
    }

    for type_key, modules in grouped.items():
        if not modules:
            continue

        changelog["sections"][type_key] = {}

        for module, module_commits in modules.items():
            changelog["sections"][type_key][module] = [
                {
                    "sha": c.short_sha,
                    "message": c.message,
                    "author": c.author,
                    "date": c.date,
                }
                for c in module_commits
            ]

    return json.dumps(changelog, indent=2, ensure_ascii=False)


def enhance_with_ai(changelog: str, api_key: Optional[str] = None) -> str:
    """Enhance changelog using AI (if available)."""
    if not api_key:
        api_key = os.environ.get("AI_API_KEY")

    if not api_key:
        print("Warning: No AI API key provided, skipping AI enhancement", file=sys.stderr)
        return changelog

    # This is a placeholder for AI integration
    # In production, this would call Claude API or similar
    prompt = f"""
You are a technical writer creating release notes for a security platform.
Improve the following changelog by:
1. Making descriptions more user-friendly
2. Highlighting important changes
3. Adding context where helpful
4. Keeping it concise

Original changelog:
{changelog}

Generate improved changelog in the same markdown format.
"""

    # For now, return the original changelog
    # TODO: Implement actual AI API call
    print("Note: AI enhancement is a placeholder, returning original changelog", file=sys.stderr)
    return changelog


def main():
    parser = argparse.ArgumentParser(
        description="Generate changelog from git history",
        formatter_class=argparse.RawDescriptionHelpFormatter,
    )

    parser.add_argument("version", help="Suite version (YYYY.MM format)")
    parser.add_argument("--codename", default="", help="Release codename")
    parser.add_argument("--from-tag", help="Previous release tag")
    parser.add_argument("--to-ref", default="HEAD", help="End reference")
    parser.add_argument("--ai", action="store_true", help="Use AI enhancement")
    parser.add_argument("--output", "-o", help="Output file")
    parser.add_argument(
        "--format",
        choices=["markdown", "json"],
        default="markdown",
        help="Output format",
    )

    args = parser.parse_args()

    # Validate version format
    if not re.match(r"^\d{4}\.(04|10)$", args.version):
        print(f"Warning: Non-standard version format: {args.version}", file=sys.stderr)

    # Determine from tag
    from_tag = args.from_tag
    if not from_tag:
        from_tag = get_latest_suite_tag()
        if from_tag:
            print(f"Using previous tag: {from_tag}", file=sys.stderr)
        else:
            print("No previous suite tag found, using last 100 commits", file=sys.stderr)
            from_tag = "HEAD~100"

    # Get commits
    print(f"Collecting commits from {from_tag} to {args.to_ref}...", file=sys.stderr)
    commits = get_commits_between(from_tag, args.to_ref)
    print(f"Found {len(commits)} commits", file=sys.stderr)

    if not commits:
        print("No commits found in range", file=sys.stderr)
        sys.exit(0)

    # Generate changelog
    codename = args.codename or "TBD"

    if args.format == "json":
        output = generate_json_changelog(args.version, codename, commits)
    else:
        output = generate_markdown_changelog(
            args.version, codename, commits, ai_enhanced=args.ai
        )

    if args.ai:
        output = enhance_with_ai(output)

    # Output
    if args.output:
        Path(args.output).write_text(output, encoding="utf-8")
        print(f"Changelog written to: {args.output}", file=sys.stderr)
    else:
        print(output)


if __name__ == "__main__":
    main()
373
.gitea/scripts/release/generate_compose.py
Normal file
@@ -0,0 +1,373 @@
#!/usr/bin/env python3
"""
generate_compose.py - Generate pinned Docker Compose files for suite releases

Sprint: CI/CD Enhancement - Suite Release Pipeline
Creates docker-compose.yml files with pinned image versions for releases.

Usage:
    python generate_compose.py <version> <codename> [options]
    python generate_compose.py 2026.04 Nova --output docker-compose.yml
    python generate_compose.py 2026.04 Nova --airgap --output docker-compose.airgap.yml

Arguments:
    version     Suite version (YYYY.MM format)
    codename    Release codename

Options:
    --output FILE    Output file (default: stdout)
    --airgap         Generate air-gap variant
    --registry URL   Container registry URL
    --include-deps   Include infrastructure dependencies (postgres, valkey)
"""

import argparse
import json
import sys
from datetime import datetime, timezone
from pathlib import Path
from typing import Dict, List, Optional

# Repository paths
SCRIPT_DIR = Path(__file__).parent
REPO_ROOT = SCRIPT_DIR.parent.parent.parent
MANIFEST_FILE = REPO_ROOT / "devops" / "releases" / "service-versions.json"

# Default registry
DEFAULT_REGISTRY = "git.stella-ops.org/stella-ops.org"

# Service definitions with port mappings and dependencies
SERVICE_DEFINITIONS = {
    "authority": {
        "ports": ["8080:8080"],
        "depends_on": ["postgres"],
        "environment": {
            "AUTHORITY_DB_CONNECTION": "Host=postgres;Database=authority;Username=stellaops;Password=${POSTGRES_PASSWORD}",
        },
        "healthcheck": {
            "test": ["CMD", "curl", "-f", "http://localhost:8080/health"],
            "interval": "30s",
            "timeout": "10s",
            "retries": 3,
        },
    },
    "attestor": {
        "ports": ["8081:8080"],
        "depends_on": ["postgres", "authority"],
        "environment": {
            "ATTESTOR_DB_CONNECTION": "Host=postgres;Database=attestor;Username=stellaops;Password=${POSTGRES_PASSWORD}",
            "ATTESTOR_AUTHORITY_URL": "http://authority:8080",
        },
    },
    "concelier": {
        "ports": ["8082:8080"],
        "depends_on": ["postgres", "valkey"],
        "environment": {
            "CONCELIER_DB_CONNECTION": "Host=postgres;Database=concelier;Username=stellaops;Password=${POSTGRES_PASSWORD}",
            "CONCELIER_CACHE_URL": "valkey:6379",
        },
    },
    "scanner": {
        "ports": ["8083:8080"],
        "depends_on": ["postgres", "concelier"],
        "environment": {
            "SCANNER_DB_CONNECTION": "Host=postgres;Database=scanner;Username=stellaops;Password=${POSTGRES_PASSWORD}",
            "SCANNER_CONCELIER_URL": "http://concelier:8080",
        },
        "volumes": ["/var/run/docker.sock:/var/run/docker.sock:ro"],
    },
    "policy": {
        "ports": ["8084:8080"],
        "depends_on": ["postgres"],
        "environment": {
            "POLICY_DB_CONNECTION": "Host=postgres;Database=policy;Username=stellaops;Password=${POSTGRES_PASSWORD}",
        },
    },
    "signer": {
        "ports": ["8085:8080"],
        "depends_on": ["authority"],
        "environment": {
            "SIGNER_AUTHORITY_URL": "http://authority:8080",
        },
    },
    "excititor": {
        "ports": ["8086:8080"],
        "depends_on": ["postgres", "concelier"],
        "environment": {
            "EXCITITOR_DB_CONNECTION": "Host=postgres;Database=excititor;Username=stellaops;Password=${POSTGRES_PASSWORD}",
        },
    },
    "gateway": {
        "ports": ["8000:8080"],
        "depends_on": ["authority"],
        "environment": {
            "GATEWAY_AUTHORITY_URL": "http://authority:8080",
        },
    },
    "scheduler": {
        "ports": ["8087:8080"],
        "depends_on": ["postgres", "valkey"],
        "environment": {
            "SCHEDULER_DB_CONNECTION": "Host=postgres;Database=scheduler;Username=stellaops;Password=${POSTGRES_PASSWORD}",
            "SCHEDULER_QUEUE_URL": "valkey:6379",
        },
    },
}

# Infrastructure services
INFRASTRUCTURE_SERVICES = {
    "postgres": {
        "image": "postgres:16-alpine",
        "environment": {
            "POSTGRES_USER": "stellaops",
            "POSTGRES_PASSWORD": "${POSTGRES_PASSWORD:-stellaops}",
            "POSTGRES_DB": "stellaops",
        },
        "volumes": ["postgres_data:/var/lib/postgresql/data"],
        "healthcheck": {
            "test": ["CMD-SHELL", "pg_isready -U stellaops"],
            "interval": "10s",
            "timeout": "5s",
            "retries": 5,
        },
    },
    "valkey": {
        "image": "valkey/valkey:8-alpine",
        "volumes": ["valkey_data:/data"],
        "healthcheck": {
            "test": ["CMD", "valkey-cli", "ping"],
            "interval": "10s",
            "timeout": "5s",
            "retries": 5,
        },
    },
}


def read_service_versions() -> Dict[str, dict]:
    """Read service versions from manifest."""
    if not MANIFEST_FILE.exists():
        return {}

    try:
        manifest = json.loads(MANIFEST_FILE.read_text(encoding="utf-8"))
        return manifest.get("services", {})
    except json.JSONDecodeError:
        return {}


def generate_compose(
    version: str,
    codename: str,
    registry: str,
    services: Dict[str, dict],
    airgap: bool = False,
    include_deps: bool = True,
) -> str:
    """Generate Docker Compose YAML."""
    now = datetime.now(timezone.utc)

    lines = [
        "# Docker Compose for StellaOps Suite",
        f"# Version: {version} \"{codename}\"",
        f"# Generated: {now.isoformat()}",
        "#",
        "# Usage:",
        "#   docker compose up -d",
        "#   docker compose logs -f",
        "#   docker compose down",
        "#",
        "# Environment variables:",
        "#   POSTGRES_PASSWORD - PostgreSQL password (default: stellaops)",
        "#",
        "",
        "services:",
    ]

    # Add infrastructure services if requested
    if include_deps:
        for name, config in INFRASTRUCTURE_SERVICES.items():
            lines.extend(generate_service_block(name, config, indent=2))

    # Add StellaOps services
    for svc_name, svc_def in SERVICE_DEFINITIONS.items():
        # Get version info from manifest
        manifest_info = services.get(svc_name, {})
        docker_tag = manifest_info.get("dockerTag") or manifest_info.get("version", version)

        # Build image reference
        if airgap:
            image = f"localhost:5000/{svc_name}:{docker_tag}"
        else:
            image = f"{registry}/{svc_name}:{docker_tag}"

        # Build service config
        config = {
            "image": image,
            "restart": "unless-stopped",
            **svc_def,
        }

        # Add release labels
        config["labels"] = {
            "com.stellaops.release.version": version,
            "com.stellaops.release.codename": codename,
            "com.stellaops.service.name": svc_name,
            "com.stellaops.service.version": manifest_info.get("version", "1.0.0"),
|
||||
}
|
||||
|
||||
lines.extend(generate_service_block(svc_name, config, indent=2))
|
||||
|
||||
# Add volumes
|
||||
lines.extend([
|
||||
"",
|
||||
"volumes:",
|
||||
])
|
||||
|
||||
if include_deps:
|
||||
lines.extend([
|
||||
" postgres_data:",
|
||||
" driver: local",
|
||||
" valkey_data:",
|
||||
" driver: local",
|
||||
])
|
||||
|
||||
# Add networks
|
||||
lines.extend([
|
||||
"",
|
||||
"networks:",
|
||||
" default:",
|
||||
" name: stellaops",
|
||||
" driver: bridge",
|
||||
])
|
||||
|
||||
return "\n".join(lines)
|
||||
|
||||
|
||||
def generate_service_block(name: str, config: dict, indent: int = 2) -> List[str]:
|
||||
"""Generate YAML block for a service."""
|
||||
prefix = " " * indent
|
||||
lines = [
|
||||
"",
|
||||
f"{prefix}{name}:",
|
||||
]
|
||||
|
||||
inner_prefix = " " * (indent + 2)
|
||||
|
||||
# Image
|
||||
if "image" in config:
|
||||
lines.append(f"{inner_prefix}image: {config['image']}")
|
||||
|
||||
# Container name
|
||||
lines.append(f"{inner_prefix}container_name: stellaops-{name}")
|
||||
|
||||
# Restart policy
|
||||
if "restart" in config:
|
||||
lines.append(f"{inner_prefix}restart: {config['restart']}")
|
||||
|
||||
# Ports
|
||||
if "ports" in config:
|
||||
lines.append(f"{inner_prefix}ports:")
|
||||
for port in config["ports"]:
|
||||
lines.append(f"{inner_prefix} - \"{port}\"")
|
||||
|
||||
# Volumes
|
||||
if "volumes" in config:
|
||||
lines.append(f"{inner_prefix}volumes:")
|
||||
for vol in config["volumes"]:
|
||||
lines.append(f"{inner_prefix} - {vol}")
|
||||
|
||||
# Environment
|
||||
if "environment" in config:
|
||||
lines.append(f"{inner_prefix}environment:")
|
||||
for key, value in config["environment"].items():
|
||||
lines.append(f"{inner_prefix} {key}: \"{value}\"")
|
||||
|
||||
# Depends on
|
||||
if "depends_on" in config:
|
||||
lines.append(f"{inner_prefix}depends_on:")
|
||||
for dep in config["depends_on"]:
|
||||
lines.append(f"{inner_prefix} {dep}:")
|
||||
lines.append(f"{inner_prefix} condition: service_healthy")
|
||||
|
||||
# Health check
|
||||
if "healthcheck" in config:
|
||||
hc = config["healthcheck"]
|
||||
lines.append(f"{inner_prefix}healthcheck:")
|
||||
if "test" in hc:
|
||||
test = hc["test"]
|
||||
if isinstance(test, list):
|
||||
lines.append(f"{inner_prefix} test: {json.dumps(test)}")
|
||||
else:
|
||||
lines.append(f"{inner_prefix} test: \"{test}\"")
|
||||
for key in ["interval", "timeout", "retries", "start_period"]:
|
||||
if key in hc:
|
||||
lines.append(f"{inner_prefix} {key}: {hc[key]}")
|
||||
|
||||
# Labels
|
||||
if "labels" in config:
|
||||
lines.append(f"{inner_prefix}labels:")
|
||||
for key, value in config["labels"].items():
|
||||
lines.append(f"{inner_prefix} {key}: \"{value}\"")
|
||||
|
||||
return lines
|
||||
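
# Example (illustrative, not executed): for a hypothetical config such as
#   {"image": "registry.example/foo:1.0", "ports": ["8080:8080"]}
# generate_service_block("foo", config) yields lines equivalent to:
#
#   foo:
#     image: registry.example/foo:1.0
#     container_name: stellaops-foo
#     ports:
#       - "8080:8080"
#
# Real service names and images come from SERVICE_DEFINITIONS and the
# release manifest above; "foo" is only a placeholder.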


def main():
    parser = argparse.ArgumentParser(
        description="Generate pinned Docker Compose files for suite releases",
    )

    parser.add_argument("version", help="Suite version (YYYY.MM format)")
    parser.add_argument("codename", help="Release codename")
    parser.add_argument("--output", "-o", help="Output file")
    parser.add_argument(
        "--airgap",
        action="store_true",
        help="Generate air-gap variant (localhost:5000 registry)",
    )
    parser.add_argument(
        "--registry",
        default=DEFAULT_REGISTRY,
        help="Container registry URL",
    )
    parser.add_argument(
        "--include-deps",
        action="store_true",
        default=True,
        help="Include infrastructure dependencies",
    )
    parser.add_argument(
        "--no-deps",
        action="store_true",
        help="Exclude infrastructure dependencies",
    )

    args = parser.parse_args()

    # Read service versions
    services = read_service_versions()
    if not services:
        print("Warning: No service versions found in manifest", file=sys.stderr)

    # Generate compose file
    include_deps = args.include_deps and not args.no_deps
    compose = generate_compose(
        version=args.version,
        codename=args.codename,
        registry=args.registry,
        services=services,
        airgap=args.airgap,
        include_deps=include_deps,
    )

    # Output
    if args.output:
        Path(args.output).write_text(compose, encoding="utf-8")
        print(f"Docker Compose written to: {args.output}", file=sys.stderr)
    else:
        print(compose)


if __name__ == "__main__":
    main()
477
.gitea/scripts/release/generate_suite_docs.py
Normal file
@@ -0,0 +1,477 @@
#!/usr/bin/env python3
"""
generate_suite_docs.py - Generate suite release documentation

Sprint: CI/CD Enhancement - Suite Release Pipeline
Creates the docs/releases/YYYY.MM/ documentation structure.

Usage:
    python generate_suite_docs.py <version> <codename> [options]
    python generate_suite_docs.py 2026.04 Nova --channel lts
    python generate_suite_docs.py 2026.10 Orion --changelog CHANGELOG.md

Arguments:
    version     Suite version (YYYY.MM format)
    codename    Release codename

Options:
    --channel CH         Release channel: edge, stable, lts
    --changelog FILE     Pre-generated changelog file
    --output-dir DIR     Output directory (default: docs/releases/YYYY.MM)
    --registry URL       Container registry URL
    --previous VERSION   Previous version for upgrade guide
"""

import argparse
import json
import subprocess
import sys
from datetime import datetime, timezone
from pathlib import Path
from typing import Dict, Optional

# Repository paths
SCRIPT_DIR = Path(__file__).parent
REPO_ROOT = SCRIPT_DIR.parent.parent.parent
VERSIONS_FILE = REPO_ROOT / "src" / "Directory.Versions.props"
MANIFEST_FILE = REPO_ROOT / "devops" / "releases" / "service-versions.json"

# Default registry
DEFAULT_REGISTRY = "git.stella-ops.org/stella-ops.org"

# Support timeline
SUPPORT_TIMELINE = {
    "edge": "3 months",
    "stable": "9 months",
    "lts": "5 years",
}


def get_git_sha() -> str:
    """Get current git HEAD SHA (short form)."""
    try:
        result = subprocess.run(
            ["git", "rev-parse", "HEAD"],
            capture_output=True,
            text=True,
            cwd=REPO_ROOT,
            check=True,
        )
        return result.stdout.strip()[:12]
    except subprocess.CalledProcessError:
        return "unknown"


def read_service_versions() -> Dict[str, dict]:
    """Read service versions from manifest."""
    if not MANIFEST_FILE.exists():
        return {}

    try:
        manifest = json.loads(MANIFEST_FILE.read_text(encoding="utf-8"))
        return manifest.get("services", {})
    except json.JSONDecodeError:
        return {}


def generate_readme(
    version: str,
    codename: str,
    channel: str,
    registry: str,
    services: Dict[str, dict],
) -> str:
    """Generate README.md for the release."""
    now = datetime.now(timezone.utc)
    support_period = SUPPORT_TIMELINE.get(channel, "unknown")

    lines = [
        f"# StellaOps {version} \"{codename}\"",
        "",
        f"**Release Date:** {now.strftime('%B %d, %Y')}",
        f"**Channel:** {channel.upper()}",
        f"**Support Period:** {support_period}",
        "",
        "## Overview",
        "",
        f"StellaOps {version} \"{codename}\" is a {'Long-Term Support (LTS)' if channel == 'lts' else channel} release",
        "of the StellaOps container security platform.",
        "",
        "## Quick Start",
        "",
        "### Docker Compose",
        "",
        "```bash",
        f"curl -O https://git.stella-ops.org/stella-ops.org/releases/{version}/docker-compose.yml",
        "docker compose up -d",
        "```",
        "",
        "### Helm",
        "",
        "```bash",
        "helm repo add stellaops https://charts.stella-ops.org",
        f"helm install stellaops stellaops/stellaops --version {version}",
        "```",
        "",
        "## Included Services",
        "",
        "| Service | Version | Image |",
        "|---------|---------|-------|",
    ]

    for key, svc in sorted(services.items()):
        name = svc.get("name", key.title())
        ver = svc.get("version", "1.0.0")
        tag = svc.get("dockerTag", ver)
        image = f"`{registry}/{key}:{tag}`"
        lines.append(f"| {name} | {ver} | {image} |")

    lines.extend([
        "",
        "## Documentation",
        "",
        "- [CHANGELOG.md](./CHANGELOG.md) - Detailed list of changes",
        "- [services.md](./services.md) - Service version details",
        "- [upgrade-guide.md](./upgrade-guide.md) - Upgrade instructions",
        "- [docker-compose.yml](./docker-compose.yml) - Docker Compose configuration",
        "",
        "## Support",
        "",
        f"This release is supported until **{calculate_eol(now, channel)}**.",
        "",
        "For issues and feature requests, please visit:",
        "https://git.stella-ops.org/stella-ops.org/git.stella-ops.org/issues",
        "",
        "---",
        "",
        f"Generated: {now.isoformat()}",
        f"Git SHA: {get_git_sha()}",
    ])

    return "\n".join(lines)


def calculate_eol(release_date: datetime, channel: str) -> str:
    """Calculate end-of-life date based on channel."""
    try:
        # Import inside the try block so a missing dateutil triggers the
        # ImportError fallback below instead of crashing the script.
        from dateutil.relativedelta import relativedelta

        periods = {
            "edge": relativedelta(months=3),
            "stable": relativedelta(months=9),
            "lts": relativedelta(years=5),
        }
        eol = release_date + periods.get(channel, relativedelta(months=9))
        return eol.strftime("%B %Y")
    except ImportError:
        # Fallback without dateutil
        return f"See {channel} support policy"
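
# Worked example (illustrative): with release_date in April 2026 and
# channel = "lts", relativedelta(years=5) puts the EOL in April 2031, so
# calculate_eol(...) returns "April 2031"; without python-dateutil the
# function degrades to the generic support-policy message instead.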


def generate_services_doc(
    version: str,
    codename: str,
    registry: str,
    services: Dict[str, dict],
) -> str:
    """Generate services.md with detailed service information."""
    lines = [
        f"# Services - StellaOps {version} \"{codename}\"",
        "",
        "This document lists all services included in this release with their versions,",
        "Docker images, and configuration details.",
        "",
        "## Service Matrix",
        "",
        "| Service | Version | Docker Tag | Released | Git SHA |",
        "|---------|---------|------------|----------|---------|",
    ]

    for key, svc in sorted(services.items()):
        name = svc.get("name", key.title())
        ver = svc.get("version", "1.0.0")
        tag = svc.get("dockerTag") or "-"
        released = svc.get("releasedAt", "-")
        if released != "-":
            released = released[:10]
        sha = svc.get("gitSha") or "-"
        lines.append(f"| {name} | {ver} | `{tag}` | {released} | `{sha}` |")

    lines.extend([
        "",
        "## Container Images",
        "",
        "All images are available from the StellaOps registry:",
        "",
        "```",
        f"Registry: {registry}",
        "```",
        "",
        "### Pull Commands",
        "",
        "```bash",
    ])

    for key, svc in sorted(services.items()):
        tag = svc.get("dockerTag") or svc.get("version", "latest")
        lines.append(f"docker pull {registry}/{key}:{tag}")

    lines.extend([
        "```",
        "",
        "## Service Descriptions",
        "",
    ])

    service_descriptions = {
        "authority": "Authentication and authorization service with OAuth/OIDC support",
        "attestor": "in-toto/DSSE attestation generation and verification",
        "concelier": "Vulnerability advisory ingestion and merge engine",
        "scanner": "Container scanning with SBOM generation",
        "policy": "Policy engine with K4 lattice logic",
        "signer": "Cryptographic signing operations",
        "excititor": "VEX document ingestion and export",
        "gateway": "API gateway with routing and transport abstraction",
        "scheduler": "Job scheduling and queue management",
        "cli": "Command-line interface",
        "orchestrator": "Workflow orchestration and task coordination",
        "notify": "Notification delivery (Email, Slack, Teams, Webhooks)",
    }

    for key, svc in sorted(services.items()):
        name = svc.get("name", key.title())
        desc = service_descriptions.get(key, "StellaOps service")
        lines.extend([
            f"### {name}",
            "",
            desc,
            "",
            f"- **Version:** {svc.get('version', '1.0.0')}",
            f"- **Image:** `{registry}/{key}:{svc.get('dockerTag', 'latest')}`",
            "",
        ])

    return "\n".join(lines)


def generate_upgrade_guide(
    version: str,
    codename: str,
    previous_version: Optional[str],
) -> str:
    """Generate upgrade-guide.md."""
    lines = [
        f"# Upgrade Guide - StellaOps {version} \"{codename}\"",
        "",
    ]

    if previous_version:
        lines.extend([
            f"This guide covers upgrading from StellaOps {previous_version} to {version}.",
            "",
        ])
    else:
        lines.extend([
            "This guide covers upgrading to this release from a previous version.",
            "",
        ])

    lines.extend([
        "## Before You Begin",
        "",
        "1. **Backup your data** - Ensure all databases and configuration are backed up",
        "2. **Review changelog** - Check [CHANGELOG.md](./CHANGELOG.md) for breaking changes",
        "3. **Check compatibility** - Verify your environment meets the requirements",
        "",
        "## Upgrade Steps",
        "",
        "### Docker Compose",
        "",
        "```bash",
        "# Pull new images",
        "docker compose pull",
        "",
        "# Stop services",
        "docker compose down",
        "",
        "# Start with new version",
        "docker compose up -d",
        "",
        "# Verify health",
        "docker compose ps",
        "```",
        "",
        "### Helm",
        "",
        "```bash",
        "# Update repository",
        "helm repo update stellaops",
        "",
        "# Upgrade release",
        f"helm upgrade stellaops stellaops/stellaops --version {version}",
        "",
        "# Verify status",
        "helm status stellaops",
        "```",
        "",
        "## Database Migrations",
        "",
        "Database migrations are applied automatically on service startup.",
        "For manual migration control, set `AUTO_MIGRATE=false` and run:",
        "",
        "```bash",
        "stellaops-cli db migrate",
        "```",
        "",
        "## Configuration Changes",
        "",
        "Review the following configuration changes:",
        "",
        "| Setting | Previous | New | Notes |",
        "|---------|----------|-----|-------|",
        "| (No breaking changes) | - | - | - |",
        "",
        "## Rollback Procedure",
        "",
        "If issues occur, rollback to the previous version:",
        "",
        "### Docker Compose",
        "",
        "```bash",
        "# Edit docker-compose.yml to use previous image tags",
        "docker compose down",
        "docker compose up -d",
        "```",
        "",
        "### Helm",
        "",
        "```bash",
        "helm rollback stellaops",
        "```",
        "",
        "## Support",
        "",
        "For upgrade assistance, contact support or open an issue at:",
        "https://git.stella-ops.org/stella-ops.org/git.stella-ops.org/issues",
    ])

    return "\n".join(lines)


def generate_manifest_yaml(
    version: str,
    codename: str,
    channel: str,
    services: Dict[str, dict],
) -> str:
    """Generate manifest.yaml for the release."""
    now = datetime.now(timezone.utc)

    lines = [
        "apiVersion: stellaops.org/v1",
        "kind: SuiteRelease",
        "metadata:",
        f"  version: \"{version}\"",
        f"  codename: \"{codename}\"",
        f"  channel: \"{channel}\"",
        f"  date: \"{now.isoformat()}\"",
        f"  gitSha: \"{get_git_sha()}\"",
        "spec:",
        "  services:",
    ]

    for key, svc in sorted(services.items()):
        lines.append(f"    {key}:")
        lines.append(f"      version: \"{svc.get('version', '1.0.0')}\"")
        if svc.get("dockerTag"):
            lines.append(f"      dockerTag: \"{svc['dockerTag']}\"")
        if svc.get("gitSha"):
            lines.append(f"      gitSha: \"{svc['gitSha']}\"")

    return "\n".join(lines)


def main():
    parser = argparse.ArgumentParser(
        description="Generate suite release documentation",
    )

    parser.add_argument("version", help="Suite version (YYYY.MM format)")
    parser.add_argument("codename", help="Release codename")
    parser.add_argument(
        "--channel",
        choices=["edge", "stable", "lts"],
        default="stable",
        help="Release channel",
    )
    parser.add_argument("--changelog", help="Pre-generated changelog file")
    parser.add_argument("--output-dir", help="Output directory")
    parser.add_argument(
        "--registry",
        default=DEFAULT_REGISTRY,
        help="Container registry URL",
    )
    parser.add_argument("--previous", help="Previous version for upgrade guide")

    args = parser.parse_args()

    # Determine output directory
    if args.output_dir:
        output_dir = Path(args.output_dir)
    else:
        output_dir = REPO_ROOT / "docs" / "releases" / args.version

    output_dir.mkdir(parents=True, exist_ok=True)
    print(f"Output directory: {output_dir}", file=sys.stderr)

    # Read service versions
    services = read_service_versions()
    if not services:
        print("Warning: No service versions found in manifest", file=sys.stderr)

    # Generate README.md
    readme = generate_readme(
        args.version, args.codename, args.channel, args.registry, services
    )
    (output_dir / "README.md").write_text(readme, encoding="utf-8")
    print("Generated: README.md", file=sys.stderr)

    # Copy or generate CHANGELOG.md
    if args.changelog and Path(args.changelog).exists():
        changelog = Path(args.changelog).read_text(encoding="utf-8")
    else:
        # Generate basic changelog
        changelog = f"# Changelog - StellaOps {args.version} \"{args.codename}\"\n\n"
        changelog += "See git history for detailed changes.\n"
    (output_dir / "CHANGELOG.md").write_text(changelog, encoding="utf-8")
    print("Generated: CHANGELOG.md", file=sys.stderr)

    # Generate services.md
    services_doc = generate_services_doc(
        args.version, args.codename, args.registry, services
    )
    (output_dir / "services.md").write_text(services_doc, encoding="utf-8")
    print("Generated: services.md", file=sys.stderr)

    # Generate upgrade-guide.md
    upgrade_guide = generate_upgrade_guide(
        args.version, args.codename, args.previous
    )
    (output_dir / "upgrade-guide.md").write_text(upgrade_guide, encoding="utf-8")
    print("Generated: upgrade-guide.md", file=sys.stderr)

    # Generate manifest.yaml
    manifest = generate_manifest_yaml(
        args.version, args.codename, args.channel, services
    )
    (output_dir / "manifest.yaml").write_text(manifest, encoding="utf-8")
    print("Generated: manifest.yaml", file=sys.stderr)

    print(f"\nSuite documentation generated in: {output_dir}", file=sys.stderr)


if __name__ == "__main__":
    main()
131
.gitea/scripts/release/read-service-version.sh
Normal file
@@ -0,0 +1,131 @@
#!/bin/bash
# read-service-version.sh - Read service version from centralized storage
#
# Sprint: CI/CD Enhancement - Per-Service Auto-Versioning
# This script reads service versions from src/Directory.Versions.props
#
# Usage:
#   ./read-service-version.sh <service>
#   ./read-service-version.sh authority
#   ./read-service-version.sh --all
#
# Output:
#   Prints the version string to stdout (e.g., "1.2.3")
#   Exit code 0 on success, 1 on error

set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "${SCRIPT_DIR}/../../.." && pwd)"
VERSIONS_FILE="${REPO_ROOT}/src/Directory.Versions.props"

# Service name to property suffix mapping
declare -A SERVICE_MAP=(
  ["authority"]="Authority"
  ["attestor"]="Attestor"
  ["concelier"]="Concelier"
  ["scanner"]="Scanner"
  ["policy"]="Policy"
  ["signer"]="Signer"
  ["excititor"]="Excititor"
  ["gateway"]="Gateway"
  ["scheduler"]="Scheduler"
  ["cli"]="Cli"
  ["orchestrator"]="Orchestrator"
  ["notify"]="Notify"
  ["sbomservice"]="SbomService"
  ["vexhub"]="VexHub"
  ["evidencelocker"]="EvidenceLocker"
)

usage() {
  cat << EOF
Usage: $(basename "$0") <service|--all>

Read service version from centralized version storage.

Arguments:
  service    Service name (authority, attestor, concelier, scanner, etc.)
  --all      Print all service versions in JSON format

Services:
  ${!SERVICE_MAP[*]}

Examples:
  $(basename "$0") authority   # Output: 1.0.0
  $(basename "$0") scanner     # Output: 1.2.3
  $(basename "$0") --all       # Output: {"authority":"1.0.0",...}
EOF
}

read_version() {
  local service="$1"
  local property_suffix="${SERVICE_MAP[$service]:-}"

  if [[ -z "$property_suffix" ]]; then
    echo "Error: Unknown service '$service'" >&2
    echo "Valid services: ${!SERVICE_MAP[*]}" >&2
    return 1
  fi

  if [[ ! -f "$VERSIONS_FILE" ]]; then
    echo "Error: Versions file not found: $VERSIONS_FILE" >&2
    return 1
  fi

  local property_name="StellaOps${property_suffix}Version"
  local version

  version=$(grep -oP "<${property_name}>\K[0-9]+\.[0-9]+\.[0-9]+" "$VERSIONS_FILE" || true)

  if [[ -z "$version" ]]; then
    echo "Error: Property '$property_name' not found in $VERSIONS_FILE" >&2
    return 1
  fi

  echo "$version"
}
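
# Example (illustrative): given a Directory.Versions.props entry such as
#   <StellaOpsScannerVersion>1.2.3</StellaOpsScannerVersion>
# the grep -oP pattern above uses \K to drop the opening tag from the
# match, so `read_version scanner` prints "1.2.3" (value hypothetical).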

read_all_versions() {
  if [[ ! -f "$VERSIONS_FILE" ]]; then
    echo "Error: Versions file not found: $VERSIONS_FILE" >&2
    return 1
  fi

  echo -n "{"
  local first=true
  for service in "${!SERVICE_MAP[@]}"; do
    local version
    version=$(read_version "$service" 2>/dev/null || echo "")
    if [[ -n "$version" ]]; then
      if [[ "$first" != "true" ]]; then
        echo -n ","
      fi
      echo -n "\"$service\":\"$version\""
      first=false
    fi
  done
  echo "}"
}

main() {
  if [[ $# -eq 0 ]]; then
    usage
    exit 1
  fi

  case "$1" in
    --help|-h)
      usage
      exit 0
      ;;
    --all)
      read_all_versions
      ;;
    *)
      read_version "$1"
      ;;
  esac
}

main "$@"
226
.gitea/scripts/release/rollback.sh
Normal file
@@ -0,0 +1,226 @@
#!/usr/bin/env bash
set -euo pipefail

# Rollback Script
# Sprint: CI/CD Enhancement - Deployment Safety
#
# Purpose: Execute rollback to a previous version
# Usage:
#   ./rollback.sh --environment <env> --version <ver> --services <json> --reason <text>
#
# Exit codes:
#   0 - Rollback successful
#   1 - General error
#   2 - Invalid arguments
#   3 - Deployment failed
#   4 - Health check failed

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "${SCRIPT_DIR}/../../.." && pwd)"

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'

log_info() {
  echo -e "${GREEN}[INFO]${NC} $*"
}

log_warn() {
  echo -e "${YELLOW}[WARN]${NC} $*"
}

log_error() {
  echo -e "${RED}[ERROR]${NC} $*" >&2
}

log_step() {
  echo -e "${BLUE}[STEP]${NC} $*"
}

usage() {
  cat << EOF
Usage: $(basename "$0") [OPTIONS]

Execute rollback to a previous version.

Options:
  --environment <env>   Target environment (staging|production)
  --version <version>   Target version to rollback to
  --services <json>     JSON array of services to rollback
  --reason <text>       Reason for rollback
  --dry-run             Show what would be done without executing
  --help, -h            Show this help message

Examples:
  $(basename "$0") --environment staging --version 1.2.3 --services '["scanner"]' --reason "Bug fix"
  $(basename "$0") --environment production --version 1.2.0 --services '["authority","scanner"]' --reason "Hotfix rollback"

Exit codes:
  0  Rollback successful
  1  General error
  2  Invalid arguments
  3  Deployment failed
  4  Health check failed
EOF
}

# Default values
ENVIRONMENT=""
VERSION=""
SERVICES=""
REASON=""
DRY_RUN=false

# Parse arguments
while [[ $# -gt 0 ]]; do
  case "$1" in
    --environment)
      ENVIRONMENT="$2"
      shift 2
      ;;
    --version)
      VERSION="$2"
      shift 2
      ;;
    --services)
      SERVICES="$2"
      shift 2
      ;;
    --reason)
      REASON="$2"
      shift 2
      ;;
    --dry-run)
      DRY_RUN=true
      shift
      ;;
    --help|-h)
      usage
      exit 0
      ;;
    *)
      log_error "Unknown option: $1"
      usage
      exit 2
      ;;
  esac
done

# Validate required arguments
if [[ -z "$ENVIRONMENT" ]] || [[ -z "$VERSION" ]] || [[ -z "$SERVICES" ]]; then
  log_error "Missing required arguments"
  usage
  exit 2
fi

# Validate environment
if [[ "$ENVIRONMENT" != "staging" ]] && [[ "$ENVIRONMENT" != "production" ]]; then
  log_error "Invalid environment: $ENVIRONMENT (must be staging or production)"
  exit 2
fi

# Validate services JSON
if ! echo "$SERVICES" | jq empty 2>/dev/null; then
  log_error "Invalid services JSON: $SERVICES"
  exit 2
fi
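
# Example (illustrative): a valid --services value is a JSON array of
# service names, e.g. '["scanner","authority"]'; `jq empty` above only
# verifies that the string parses as JSON, not that the names exist.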

log_info "Starting rollback process"
log_info "  Environment: $ENVIRONMENT"
log_info "  Version: $VERSION"
log_info "  Services: $SERVICES"
log_info "  Reason: $REASON"
log_info "  Dry run: $DRY_RUN"

# Record start time
START_TIME=$(date +%s)

# Rollback each service
FAILED_SERVICES=()
SUCCESSFUL_SERVICES=()

# Read the service list via process substitution (not a pipeline) so the
# FAILED_SERVICES/SUCCESSFUL_SERVICES updates survive the loop instead of
# being lost in a subshell.
while IFS= read -r service; do
  log_step "Rolling back $service to $VERSION..."

  if [[ "$DRY_RUN" == "true" ]]; then
    log_info "  [DRY RUN] Would rollback $service"
    continue
  fi

  # Determine deployment method
  HELM_RELEASE="stellaops-${service}"
  NAMESPACE="stellaops-${ENVIRONMENT}"

  # Check if Helm release exists
  if helm status "$HELM_RELEASE" -n "$NAMESPACE" >/dev/null 2>&1; then
    log_info "  Using Helm rollback for $service"

    # Get revision for target version
    REVISION=$(helm history "$HELM_RELEASE" -n "$NAMESPACE" --output json | \
      jq -r --arg ver "$VERSION" '.[] | select(.app_version == $ver) | .revision' | tail -1)

    if [[ -n "$REVISION" ]]; then
      if helm rollback "$HELM_RELEASE" "$REVISION" -n "$NAMESPACE" --wait --timeout 5m; then
        log_info "  Successfully rolled back $service to revision $REVISION"
        SUCCESSFUL_SERVICES+=("$service")
      else
        log_error "  Failed to rollback $service"
        FAILED_SERVICES+=("$service")
      fi
    else
      log_warn "  No Helm revision found for version $VERSION"
      log_info "  Attempting deployment with specific version..."

      # Try to deploy specific version
      IMAGE_TAG="${VERSION}"
      VALUES_FILE="${REPO_ROOT}/devops/helm/values-${ENVIRONMENT}.yaml"

      if helm upgrade "$HELM_RELEASE" "${REPO_ROOT}/devops/helm/stellaops" \
        -n "$NAMESPACE" \
        --set "services.${service}.image.tag=${IMAGE_TAG}" \
        -f "$VALUES_FILE" \
        --wait --timeout 5m 2>/dev/null; then
        log_info "  Deployed $service with version $VERSION"
        SUCCESSFUL_SERVICES+=("$service")
      else
        log_error "  Failed to deploy $service with version $VERSION"
        FAILED_SERVICES+=("$service")
      fi
    fi
  else
    log_warn "  No Helm release found for $service"
    log_info "  Attempting kubectl rollout undo..."

    DEPLOYMENT="stellaops-${service}"

    if kubectl rollout undo deployment/"$DEPLOYMENT" -n "$NAMESPACE" 2>/dev/null; then
      log_info "  Rolled back deployment $DEPLOYMENT"
      SUCCESSFUL_SERVICES+=("$service")
    else
      log_error "  Failed to rollback deployment $DEPLOYMENT"
      FAILED_SERVICES+=("$service")
    fi
  fi
done < <(echo "$SERVICES" | jq -r '.[]')

# Calculate duration
END_TIME=$(date +%s)
DURATION=$((END_TIME - START_TIME))

# Summary
echo ""
log_info "Rollback completed in ${DURATION}s"
log_info "  Successful: ${#SUCCESSFUL_SERVICES[@]}"
log_info "  Failed: ${#FAILED_SERVICES[@]}"

if [[ ${#FAILED_SERVICES[@]} -gt 0 ]]; then
  log_error "Failed services: ${FAILED_SERVICES[*]}"
  exit 3
fi

log_info "Rollback successful"
exit 0
299
.gitea/scripts/test/run-test-category.sh
Normal file
@@ -0,0 +1,299 @@
#!/usr/bin/env bash
# Test Category Runner
# Sprint: CI/CD Enhancement - Script Consolidation
#
# Purpose: Run tests for a specific category across all test projects
# Usage: ./run-test-category.sh <category> [options]
#
# Options:
#   --fail-on-empty     Fail if no tests are found for the category
#   --collect-coverage  Collect code coverage data
#   --verbose           Show detailed output
#
# Exit Codes:
#   0 - Success (all tests passed or no tests found)
#   1 - One or more tests failed
#   2 - Invalid usage

set -euo pipefail

# Source shared libraries if available
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/../../.." && pwd)"

if [[ -f "$REPO_ROOT/devops/scripts/lib/logging.sh" ]]; then
  source "$REPO_ROOT/devops/scripts/lib/logging.sh"
else
  # Minimal logging fallback. log_debug uses if/fi so it returns 0 when
  # DEBUG is unset; an `[[ ... ]] &&` one-liner would return 1 and kill
  # the script under set -e.
  log_info() { echo "[INFO] $*"; }
  log_error() { echo "[ERROR] $*" >&2; }
  log_debug() { if [[ -n "${DEBUG:-}" ]]; then echo "[DEBUG] $*"; fi; }
  log_step() { echo "==> $*"; }
fi

if [[ -f "$REPO_ROOT/devops/scripts/lib/exit-codes.sh" ]]; then
  source "$REPO_ROOT/devops/scripts/lib/exit-codes.sh"
fi

# =============================================================================
# Constants
# =============================================================================

readonly FIND_PATTERN='\( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \)'
readonly EXCLUDE_PATHS='! -path "*/node_modules/*" ! -path "*/.git/*" ! -path "*/bin/*" ! -path "*/obj/*"'
readonly EXCLUDE_FILES='! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj"'

# =============================================================================
# Functions
# =============================================================================

usage() {
  cat <<EOF
Usage: $(basename "$0") <category> [options]

Run tests for a specific test category across all test projects.

Arguments:
  category            Test category (Unit, Architecture, Contract, Integration,
                      Security, Golden, Performance, Benchmark, AirGap, Chaos,
                      Determinism, Resilience, Observability)

Options:
  --fail-on-empty     Exit with error if no tests found for the category
  --collect-coverage  Collect XPlat Code Coverage data
  --verbose           Show detailed test output
  --results-dir DIR   Custom results directory (default: ./TestResults/<category>)
  --help              Show this help message

Environment Variables:
  DOTNET_VERSION      .NET SDK version (default: uses installed version)
  TZ                  Timezone (should be UTC for determinism)

Examples:
  $(basename "$0") Unit
  $(basename "$0") Integration --collect-coverage
  $(basename "$0") Performance --results-dir ./perf-results
EOF
}

find_test_projects() {
  local search_dir="${1:-src}"

  # Use eval to properly expand the find pattern
  eval "find '$search_dir' $FIND_PATTERN -type f $EXCLUDE_PATHS $EXCLUDE_FILES" | sort
}
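
# Example (illustrative): with the pattern above, a hypothetical project
#   src/Scanner/StellaOps.Scanner.Tests/StellaOps.Scanner.Tests.csproj
# is matched, while anything under bin/, obj/, .git/, or node_modules/
# and the TestKit/Testing helper projects are excluded.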

sanitize_project_name() {
  local proj="$1"
  # Replace slashes with underscores, remove .csproj extension
  echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj$||'
}
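
# Example (illustrative): sanitize_project_name "src/Foo/Foo.Tests.csproj"
# prints "src_Foo_Foo.Tests" - slashes become underscores and the
# .csproj extension is dropped ("Foo" is a placeholder name).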

run_tests() {
  local category="$1"
  local results_dir="$2"
  local collect_coverage="$3"
  local verbose="$4"
  local fail_on_empty="$5"

  local passed=0
  local failed=0
  local no_tests=0

  mkdir -p "$results_dir"

  local projects
  projects=$(find_test_projects "$REPO_ROOT/src")

  if [[ -z "$projects" ]]; then
    log_error "No test projects found"
    return 1
  fi

  local project_count
  project_count=$(echo "$projects" | grep -c '\.csproj' || echo "0")
  log_info "Found $project_count test projects"

  local category_lower
  category_lower=$(echo "$category" | tr '[:upper:]' '[:lower:]')

  while IFS= read -r proj; do
    [[ -z "$proj" ]] && continue

    local proj_name
    proj_name=$(sanitize_project_name "$proj")
    local trx_name="${proj_name}-${category_lower}.trx"

    # GitHub Actions grouping
    if [[ -n "${GITHUB_ACTIONS:-}" ]]; then
      echo "::group::Testing $proj ($category)"
    else
      log_step "Testing $proj ($category)"
    fi

    # Build dotnet test command
    local cmd="dotnet test \"$proj\""
    cmd+=" --filter \"Category=$category\""
    cmd+=" --configuration Release"
    cmd+=" --logger \"trx;LogFileName=$trx_name\""
    cmd+=" --results-directory \"$results_dir\""

    if [[ "$collect_coverage" == "true" ]]; then
      cmd+=" --collect:\"XPlat Code Coverage\""
    fi

    if [[ "$verbose" == "true" ]]; then
      cmd+=" --verbosity normal"
    else
      cmd+=" --verbosity minimal"
    fi

    # Execute tests
    local exit_code=0
    eval "$cmd" 2>&1 || exit_code=$?

    # Counters use var=$((var + 1)); the ((var++)) form returns a nonzero
    # status when the pre-increment value is 0, which aborts the script
    # under set -e.
    if [[ $exit_code -eq 0 ]]; then
      # Check if TRX was created (tests actually ran)
      if [[ -f "$results_dir/$trx_name" ]]; then
        passed=$((passed + 1))
        log_info "PASS: $proj"
      else
        no_tests=$((no_tests + 1))
        log_debug "SKIP: $proj (no $category tests)"
      fi
    else
      # Check if failure was due to no tests matching the filter
      if [[ -f "$results_dir/$trx_name" ]]; then
        failed=$((failed + 1))
        log_error "FAIL: $proj"
      else
        no_tests=$((no_tests + 1))
        log_debug "SKIP: $proj (no $category tests or build error)"
      fi
    fi

    # Close GitHub Actions group
    if [[ -n "${GITHUB_ACTIONS:-}" ]]; then
      echo "::endgroup::"
    fi

  done <<< "$projects"

  # Generate summary
  log_info ""
  log_info "=========================================="
  log_info "$category Test Summary"
  log_info "=========================================="
  log_info "Passed:   $passed"
  log_info "Failed:   $failed"
  log_info "No Tests: $no_tests"
  log_info "Total:    $project_count"
  log_info "=========================================="

  # GitHub Actions summary
  if [[ -n "${GITHUB_ACTIONS:-}" ]]; then
    {
      echo "## $category Test Summary"
      echo ""
      echo "| Metric | Count |"
      echo "|--------|-------|"
      echo "| Passed | $passed |"
      echo "| Failed | $failed |"
      echo "| No Tests | $no_tests |"
      echo "| Total Projects | $project_count |"
    } >> "$GITHUB_STEP_SUMMARY"
  fi

  # Determine exit code
  if [[ $failed -gt 0 ]]; then
    return 1
  fi

  if [[ "$fail_on_empty" == "true" ]] && [[ $passed -eq 0 ]]; then
    log_error "No tests found for category: $category"
    return 1
  fi

  return 0
}

# =============================================================================
# Main
# =============================================================================

main() {
  local category=""
  local results_dir=""
  local collect_coverage="false"
  local verbose="false"
  local fail_on_empty="false"

  # Parse arguments
  while [[ $# -gt 0 ]]; do
    case "$1" in
      --help|-h)
        usage
        exit 0
        ;;
      --fail-on-empty)
        fail_on_empty="true"
        shift
        ;;
      --collect-coverage)
        collect_coverage="true"
        shift
        ;;
      --verbose|-v)
        verbose="true"
        shift
        ;;
      --results-dir)
        results_dir="$2"
        shift 2
        ;;
      -*)
        log_error "Unknown option: $1"
        usage
        exit 2
        ;;
      *)
        if [[ -z "$category" ]]; then
          category="$1"
        else
          log_error "Unexpected argument: $1"
          usage
          exit 2
        fi
        shift
        ;;
    esac
  done

  # Validate category
  if [[ -z "$category" ]]; then
    log_error "Category is required"
    usage
    exit 2
  fi

  # Validate category name
  local valid_categories="Unit Architecture Contract Integration Security Golden Performance Benchmark AirGap Chaos Determinism Resilience Observability"
  if ! echo "$valid_categories" | grep -qw "$category"; then
    log_error "Invalid category: $category"
    log_error "Valid categories: $valid_categories"
    exit 2
  fi

  # Set default results directory
  if [[ -z "$results_dir" ]]; then
    results_dir="./TestResults/$category"
  fi

  log_info "Running $category tests..."
  log_info "Results directory: $results_dir"

  run_tests "$category" "$results_dir" "$collect_coverage" "$verbose" "$fail_on_empty"
}

main "$@"
260
.gitea/scripts/validate/validate-migrations.sh
Normal file
@@ -0,0 +1,260 @@
#!/usr/bin/env bash
# Migration Validation Script
# Validates migration naming conventions, detects duplicates, and checks for issues.
#
# Usage:
#   ./validate-migrations.sh [--strict] [--fix-scanner]
#
# Options:
#   --strict        Exit with error on any warning
#   --fix-scanner   Generate rename commands for Scanner duplicates

set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
REPO_ROOT="$(cd "$SCRIPT_DIR/../../.." && pwd)"

STRICT_MODE=false
FIX_SCANNER=false
EXIT_CODE=0

# Parse arguments
for arg in "$@"; do
  case $arg in
    --strict)
      STRICT_MODE=true
      shift
      ;;
    --fix-scanner)
      FIX_SCANNER=true
      shift
      ;;
  esac
done

echo "=== Migration Validation ==="
echo "Repository: $REPO_ROOT"
echo ""

# Colors for output
RED='\033[0;31m'
YELLOW='\033[1;33m'
GREEN='\033[0;32m'
NC='\033[0m' # No Color

# Track issues
ERRORS=()
WARNINGS=()

# Function to check for duplicates in a directory
check_duplicates() {
  local dir="$1"
  local module="$2"

  if [ ! -d "$dir" ]; then
    return
  fi

  # Extract numeric prefixes and find duplicates
  local duplicates
  duplicates=$(find "$dir" -maxdepth 1 -name "*.sql" -printf "%f\n" 2>/dev/null | \
    sed -E 's/^([0-9]+)_.*/\1/' | \
    sort | uniq -d)

  if [ -n "$duplicates" ]; then
    for prefix in $duplicates; do
      local files
      files=$(find "$dir" -maxdepth 1 -name "${prefix}_*.sql" -printf "%f\n" | tr '\n' ',' | sed 's/,$//')
      ERRORS+=("[$module] Duplicate prefix $prefix: $files")
    done
  fi
}
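
# Example (illustrative): if a directory contains both
#   009_call_graph_tables.sql and 009_smart_diff_tables_search_path.sql,
# the sed/sort/uniq -d pipeline reports prefix "009" and both file names
# are listed in the resulting error entry.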

# Function to check naming convention
check_naming() {
  local dir="$1"
  local module="$2"

  if [ ! -d "$dir" ]; then
    return
  fi

  # Read via process substitution (not a pipeline) so WARNINGS+= updates
  # survive the loop instead of being lost in a subshell.
  while IFS= read -r file; do
    # Check standard pattern: NNN_description.sql
    if [[ "$file" =~ ^[0-9]{3}_[a-z0-9_]+\.sql$ ]]; then
      continue # Valid standard
    fi
    # Check seed pattern: SNNN_description.sql
    if [[ "$file" =~ ^S[0-9]{3}_[a-z0-9_]+\.sql$ ]]; then
      continue # Valid seed
    fi
    # Check data migration pattern: DMNNN_description.sql
    if [[ "$file" =~ ^DM[0-9]{3}_[a-z0-9_]+\.sql$ ]]; then
      continue # Valid data migration
    fi
    # Check for Flyway-style
    if [[ "$file" =~ ^V[0-9]+.*\.sql$ ]]; then
      WARNINGS+=("[$module] Flyway-style naming: $file (consider NNN_description.sql)")
      continue
    fi
    # Check for EF Core timestamp style
    if [[ "$file" =~ ^[0-9]{14,}_.*\.sql$ ]]; then
      WARNINGS+=("[$module] EF Core timestamp naming: $file (consider NNN_description.sql)")
      continue
    fi
    # Check for 4-digit prefix
    if [[ "$file" =~ ^[0-9]{4}_.*\.sql$ ]]; then
      WARNINGS+=("[$module] 4-digit prefix: $file (standard is 3-digit NNN_description.sql)")
      continue
    fi
    # Non-standard
    WARNINGS+=("[$module] Non-standard naming: $file")
  done < <(find "$dir" -maxdepth 1 -name "*.sql" -printf "%f\n" 2>/dev/null)
}

# Function to check for dangerous operations in startup migrations
check_dangerous_ops() {
  local dir="$1"
  local module="$2"

  if [ ! -d "$dir" ]; then
    return
  fi

  while IFS= read -r file; do
    local filepath="$dir/$file"
    local prefix
    prefix=$(echo "$file" | sed -E 's/^([0-9]+)_.*/\1/')

    # Only check startup migrations (001-099)
    if [[ "$prefix" =~ ^0[0-9]{2}$ ]] && [ "$prefix" -lt 100 ]; then
      # Check for DROP TABLE without IF EXISTS (the negative lookahead
      # requires grep -P; it is not valid ERE syntax)
      if grep -qiP "DROP\s+TABLE\s+(?!IF\s+EXISTS)" "$filepath" 2>/dev/null; then
        ERRORS+=("[$module] $file: DROP TABLE without IF EXISTS in startup migration")
      fi

      # Check for DROP COLUMN (breaking change in startup)
      if grep -qiE "ALTER\s+TABLE.*DROP\s+COLUMN" "$filepath" 2>/dev/null; then
        ERRORS+=("[$module] $file: DROP COLUMN in startup migration (should be release migration 100+)")
      fi

      # Check for TRUNCATE
      if grep -qiE "^\s*TRUNCATE" "$filepath" 2>/dev/null; then
        ERRORS+=("[$module] $file: TRUNCATE in startup migration")
      fi
    fi
  done < <(find "$dir" -maxdepth 1 -name "*.sql" -printf "%f\n" 2>/dev/null)
}

# Scan all module migration directories
echo "Scanning migration directories..."
echo ""

# Define module migration paths
declare -A MIGRATION_PATHS
MIGRATION_PATHS=(
  ["Authority"]="src/Authority/__Libraries/StellaOps.Authority.Storage.Postgres/Migrations"
  ["Concelier"]="src/Concelier/__Libraries/StellaOps.Concelier.Storage.Postgres/Migrations"
  ["Excititor"]="src/Excititor/__Libraries/StellaOps.Excititor.Storage.Postgres/Migrations"
  ["Policy"]="src/Policy/__Libraries/StellaOps.Policy.Storage.Postgres/Migrations"
  ["Scheduler"]="src/Scheduler/__Libraries/StellaOps.Scheduler.Storage.Postgres/Migrations"
  ["Notify"]="src/Notify/__Libraries/StellaOps.Notify.Storage.Postgres/Migrations"
  ["Scanner"]="src/Scanner/__Libraries/StellaOps.Scanner.Storage/Postgres/Migrations"
  ["Scanner.Triage"]="src/Scanner/__Libraries/StellaOps.Scanner.Triage/Migrations"
  ["Attestor"]="src/Attestor/__Libraries/StellaOps.Attestor.Persistence/Migrations"
  ["Signer"]="src/Signer/__Libraries/StellaOps.Signer.KeyManagement/Migrations"
  ["Signals"]="src/Signals/StellaOps.Signals.Storage.Postgres/Migrations"
  ["EvidenceLocker"]="src/EvidenceLocker/StellaOps.EvidenceLocker/StellaOps.EvidenceLocker.Infrastructure/Db/Migrations"
  ["ExportCenter"]="src/ExportCenter/StellaOps.ExportCenter/StellaOps.ExportCenter.Infrastructure/Db/Migrations"
  ["IssuerDirectory"]="src/IssuerDirectory/StellaOps.IssuerDirectory/StellaOps.IssuerDirectory.Storage.Postgres/Migrations"
  ["Orchestrator"]="src/Orchestrator/StellaOps.Orchestrator/StellaOps.Orchestrator.Infrastructure/migrations"
  ["TimelineIndexer"]="src/TimelineIndexer/StellaOps.TimelineIndexer/StellaOps.TimelineIndexer.Infrastructure/Db/Migrations"
  ["BinaryIndex"]="src/BinaryIndex/__Libraries/StellaOps.BinaryIndex.Persistence/Migrations"
  ["Unknowns"]="src/Unknowns/__Libraries/StellaOps.Unknowns.Storage.Postgres/Migrations"
  ["VexHub"]="src/VexHub/__Libraries/StellaOps.VexHub.Storage.Postgres/Migrations"
)

for module in "${!MIGRATION_PATHS[@]}"; do
  path="$REPO_ROOT/${MIGRATION_PATHS[$module]}"
  if [ -d "$path" ]; then
    echo "Checking: $module"
    check_duplicates "$path" "$module"
    check_naming "$path" "$module"
    check_dangerous_ops "$path" "$module"
  fi
done

echo ""

# Report errors
if [ ${#ERRORS[@]} -gt 0 ]; then
  echo -e "${RED}=== ERRORS (${#ERRORS[@]}) ===${NC}"
  for error in "${ERRORS[@]}"; do
    echo -e "${RED}  ✗ $error${NC}"
  done
  EXIT_CODE=1
  echo ""
fi

# Report warnings
if [ ${#WARNINGS[@]} -gt 0 ]; then
  echo -e "${YELLOW}=== WARNINGS (${#WARNINGS[@]}) ===${NC}"
  for warning in "${WARNINGS[@]}"; do
    echo -e "${YELLOW}  ⚠ $warning${NC}"
  done
  if [ "$STRICT_MODE" = true ]; then
    EXIT_CODE=1
  fi
  echo ""
fi

# Scanner fix suggestions
if [ "$FIX_SCANNER" = true ]; then
  echo "=== Scanner Migration Rename Suggestions ==="
  echo "# Run these commands to fix Scanner duplicate migrations:"
  echo ""

  SCANNER_DIR="$REPO_ROOT/src/Scanner/__Libraries/StellaOps.Scanner.Storage/Postgres/Migrations"
  if [ -d "$SCANNER_DIR" ]; then
    # Map old names to new sequential numbers
    cat << 'EOF'
# Before running: backup the schema_migrations table!
# After renaming: update schema_migrations.migration_name to match new names

cd src/Scanner/__Libraries/StellaOps.Scanner.Storage/Postgres/Migrations

# Fix duplicate 009 prefixes
git mv 009_call_graph_tables.sql 020_call_graph_tables.sql
git mv 009_smart_diff_tables_search_path.sql 021_smart_diff_tables_search_path.sql

# Fix duplicate 010 prefixes
git mv 010_reachability_drift_tables.sql 022_reachability_drift_tables.sql
git mv 010_scanner_api_ingestion.sql 023_scanner_api_ingestion.sql
git mv 010_smart_diff_priority_score_widen.sql 024_smart_diff_priority_score_widen.sql

# Fix duplicate 014 prefixes
git mv 014_epss_triage_columns.sql 025_epss_triage_columns.sql
git mv 014_vuln_surfaces.sql 026_vuln_surfaces.sql

# Renumber subsequent migrations
git mv 011_epss_raw_layer.sql 027_epss_raw_layer.sql
git mv 012_epss_signal_layer.sql 028_epss_signal_layer.sql
git mv 013_witness_storage.sql 029_witness_storage.sql
git mv 015_vuln_surface_triggers_update.sql 030_vuln_surface_triggers_update.sql
git mv 016_reach_cache.sql 031_reach_cache.sql
git mv 017_idempotency_keys.sql 032_idempotency_keys.sql
git mv 018_binary_evidence.sql 033_binary_evidence.sql
git mv 019_func_proof_tables.sql 034_func_proof_tables.sql
EOF
  fi
  echo ""
fi

# Summary
if [ $EXIT_CODE -eq 0 ]; then
  echo -e "${GREEN}=== VALIDATION PASSED ===${NC}"
else
  echo -e "${RED}=== VALIDATION FAILED ===${NC}"
fi

exit $EXIT_CODE
227
.gitea/workflows/container-scan.yml
Normal file
@@ -0,0 +1,227 @@
# Container Security Scanning Workflow
# Sprint: CI/CD Enhancement - Security Scanning
#
# Purpose: Scan container images for vulnerabilities beyond SBOM generation
# Triggers: Dockerfile changes, scheduled daily, manual dispatch
#
# Tool: PLACEHOLDER - Choose one: Trivy, Grype, or Snyk

name: Container Security Scan

on:
  push:
    paths:
      - '**/Dockerfile'
      - '**/Dockerfile.*'
      - 'devops/docker/**'
  pull_request:
    paths:
      - '**/Dockerfile'
      - '**/Dockerfile.*'
      - 'devops/docker/**'
  schedule:
    # Run daily at 4 AM UTC
    - cron: '0 4 * * *'
  workflow_dispatch:
    inputs:
      severity_threshold:
        description: 'Minimum severity to fail'
        required: false
        type: choice
        options:
          - CRITICAL
          - HIGH
          - MEDIUM
          - LOW
        default: HIGH
      image:
        description: 'Specific image to scan (optional)'
        required: false
        type: string

env:
  SEVERITY_THRESHOLD: ${{ github.event.inputs.severity_threshold || 'HIGH' }}

jobs:
  discover-images:
    name: Discover Container Images
    runs-on: ubuntu-latest
    outputs:
      images: ${{ steps.discover.outputs.images }}
      count: ${{ steps.discover.outputs.count }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Discover Dockerfiles
        id: discover
        run: |
          # Find all Dockerfiles
          DOCKERFILES=$(find . -name "Dockerfile" -o -name "Dockerfile.*" | grep -v node_modules | grep -v bin | grep -v obj || true)

          # Build image list
          IMAGES='[]'
          COUNT=0

          while IFS= read -r dockerfile; do
            if [[ -n "$dockerfile" ]]; then
              DIR=$(dirname "$dockerfile")
              NAME=$(basename "$DIR" | tr '[:upper:]' '[:lower:]' | tr '.' '-')

              # Get image name from directory structure
              if [[ "$DIR" == *"devops/docker"* ]]; then
                NAME=$(echo "$dockerfile" | sed 's|.*devops/docker/||' | sed 's|/Dockerfile.*||' | tr '/' '-')
              fi

              IMAGES=$(echo "$IMAGES" | jq --arg name "$NAME" --arg path "$dockerfile" '. + [{"name": $name, "dockerfile": $path}]')
              COUNT=$((COUNT + 1))
            fi
          done <<< "$DOCKERFILES"

          echo "Found $COUNT Dockerfile(s)"
          echo "images=$(echo "$IMAGES" | jq -c .)" >> $GITHUB_OUTPUT
          echo "count=$COUNT" >> $GITHUB_OUTPUT
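          # Illustrative output shape for two discovered Dockerfiles
          # (names hypothetical):
          #   images=[{"name":"scanner","dockerfile":"./devops/docker/scanner/Dockerfile"},...]
          #   count=2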

  scan-images:
    name: Scan ${{ matrix.image.name }}
    runs-on: ubuntu-latest
    needs: [discover-images]
    if: needs.discover-images.outputs.count != '0'
    strategy:
      fail-fast: false
      matrix:
        image: ${{ fromJson(needs.discover-images.outputs.images) }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Build image for scanning
        id: build
        run: |
          IMAGE_TAG="scan-${{ matrix.image.name }}:${{ github.sha }}"
          DOCKERFILE="${{ matrix.image.dockerfile }}"
          CONTEXT=$(dirname "$DOCKERFILE")

          echo "Building $IMAGE_TAG from $DOCKERFILE..."
          docker build -t "$IMAGE_TAG" -f "$DOCKERFILE" "$CONTEXT" || {
            echo "::warning::Failed to build $IMAGE_TAG - skipping scan"
            echo "skip=true" >> $GITHUB_OUTPUT
            exit 0
          }

          echo "image_tag=$IMAGE_TAG" >> $GITHUB_OUTPUT
          echo "skip=false" >> $GITHUB_OUTPUT

      # PLACEHOLDER: Choose your container scanner
      # Option 1: Trivy (recommended - comprehensive, free)
      # Option 2: Grype (Anchore - good integration with Syft SBOMs)
      # Option 3: Snyk (commercial, comprehensive)

      - name: Trivy Vulnerability Scan
        if: steps.build.outputs.skip != 'true'
        id: trivy
        # Uncomment when ready to use Trivy:
        # uses: aquasecurity/trivy-action@master
        # with:
        #   image-ref: ${{ steps.build.outputs.image_tag }}
        #   format: 'sarif'
        #   output: 'trivy-${{ matrix.image.name }}.sarif'
        #   severity: ${{ env.SEVERITY_THRESHOLD }},CRITICAL
        #   exit-code: '1'
        run: |
          echo "::notice::Container scanning placeholder - configure scanner below"
          echo ""
          echo "Image: ${{ steps.build.outputs.image_tag }}"
          echo "Severity threshold: ${{ env.SEVERITY_THRESHOLD }}"
          echo ""
          echo "Available scanners:"
          echo "  1. Trivy: aquasecurity/trivy-action@master"
          echo "  2. Grype: anchore/scan-action@v3"
          echo "  3. Snyk: snyk/actions/docker@master"

          # Create placeholder report
          mkdir -p scan-results
          echo '{"placeholder": true, "image": "${{ matrix.image.name }}"}' > scan-results/scan-${{ matrix.image.name }}.json

      # Alternative: Grype (works well with existing Syft SBOM workflow)
      # - name: Grype Vulnerability Scan
      #   if: steps.build.outputs.skip != 'true'
      #   uses: anchore/scan-action@v3
      #   with:
      #     image: ${{ steps.build.outputs.image_tag }}
      #     severity-cutoff: ${{ env.SEVERITY_THRESHOLD }}
      #     fail-build: true

      # Alternative: Snyk Container
      # - name: Snyk Container Scan
      #   if: steps.build.outputs.skip != 'true'
      #   uses: snyk/actions/docker@master
      #   env:
      #     SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
      #   with:
      #     image: ${{ steps.build.outputs.image_tag }}
      #     args: --severity-threshold=${{ env.SEVERITY_THRESHOLD }}

      - name: Upload scan results
        if: always() && steps.build.outputs.skip != 'true'
        uses: actions/upload-artifact@v4
        with:
          name: container-scan-${{ matrix.image.name }}
          path: |
            scan-results/
            *.sarif
            *.json
          retention-days: 30
          if-no-files-found: ignore

      - name: Cleanup
        if: always()
        run: |
          docker rmi "${{ steps.build.outputs.image_tag }}" 2>/dev/null || true

  summary:
    name: Scan Summary
    runs-on: ubuntu-latest
    needs: [discover-images, scan-images]
    if: always()

    steps:
      - name: Download all scan results
        uses: actions/download-artifact@v4
        with:
          pattern: container-scan-*
          path: all-results/
          merge-multiple: true
        continue-on-error: true

      - name: Generate summary
        run: |
          echo "## Container Security Scan Results" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Image | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|-------|--------|" >> $GITHUB_STEP_SUMMARY

          IMAGES='${{ needs.discover-images.outputs.images }}'
          SCAN_RESULT="${{ needs.scan-images.result }}"

          echo "$IMAGES" | jq -r '.[] | .name' | while read -r name; do
            if [[ "$SCAN_RESULT" == "success" ]]; then
              echo "| $name | No vulnerabilities found |" >> $GITHUB_STEP_SUMMARY
            elif [[ "$SCAN_RESULT" == "failure" ]]; then
              echo "| $name | Vulnerabilities detected |" >> $GITHUB_STEP_SUMMARY
            else
              echo "| $name | $SCAN_RESULT |" >> $GITHUB_STEP_SUMMARY
            fi
          done

          echo "" >> $GITHUB_STEP_SUMMARY
          echo "### Configuration" >> $GITHUB_STEP_SUMMARY
          echo "- **Scanner:** Placeholder (configure in workflow)" >> $GITHUB_STEP_SUMMARY
          echo "- **Severity Threshold:** ${{ env.SEVERITY_THRESHOLD }}" >> $GITHUB_STEP_SUMMARY
          echo "- **Images Scanned:** ${{ needs.discover-images.outputs.count }}" >> $GITHUB_STEP_SUMMARY
          echo "- **Trigger:** ${{ github.event_name }}" >> $GITHUB_STEP_SUMMARY
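As a reference for filling in the placeholder, a rough local equivalent of the commented Trivy option (assumes Trivy is installed; the image name and Dockerfile path below are hypothetical):

```bash
# Rough local equivalent of the commented Trivy step; image/paths are made up.
IMAGE_TAG="scan-scanner-worker:local"
docker build -t "$IMAGE_TAG" -f devops/docker/scanner-worker/Dockerfile devops/docker/scanner-worker

# Emit SARIF and exit non-zero when findings at or above the threshold exist,
# mirroring severity/format/exit-code in the commented action config.
trivy image \
  --severity HIGH,CRITICAL \
  --format sarif \
  --output trivy-scanner-worker.sarif \
  --exit-code 1 \
  "$IMAGE_TAG"
```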
204
.gitea/workflows/dependency-license-gate.yml
Normal file
@@ -0,0 +1,204 @@
# Dependency License Compliance Gate
# Sprint: CI/CD Enhancement - Dependency Management Automation
#
# Purpose: Validate that all dependencies use approved licenses
# Triggers: PRs modifying package files

name: License Compliance

on:
  pull_request:
    paths:
      - 'src/Directory.Packages.props'
      - '**/package.json'
      - '**/package-lock.json'
      - '**/*.csproj'

env:
  DOTNET_VERSION: '10.0.100'
  # Blocked licenses (incompatible with AGPL-3.0)
  BLOCKED_LICENSES: 'GPL-2.0-only,SSPL-1.0,BUSL-1.1,Proprietary,Commercial'
  # Allowed licenses
  ALLOWED_LICENSES: 'MIT,Apache-2.0,BSD-2-Clause,BSD-3-Clause,ISC,0BSD,Unlicense,CC0-1.0,LGPL-2.1,LGPL-3.0,MPL-2.0,AGPL-3.0,GPL-3.0'

jobs:
  check-nuget-licenses:
    name: NuGet License Check
    runs-on: ubuntu-latest
    outputs:
      # Expose the step-level result so the gate job can read it via needs.*
      blocked: ${{ steps.nuget-check.outputs.blocked }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Install dotnet-delice
        run: dotnet tool install --global dotnet-delice

      - name: Restore packages
        run: dotnet restore src/StellaOps.sln

      - name: Check NuGet licenses
        id: nuget-check
        run: |
          mkdir -p license-reports

          echo "Checking NuGet package licenses..."

          # Run delice on the solution
          dotnet delice src/StellaOps.sln \
            --output license-reports/nuget-licenses.json \
            --format json \
            2>&1 | tee license-reports/nuget-check.log || true

          # Check for blocked licenses
          BLOCKED_FOUND=0
          BLOCKED_PACKAGES=""

          IFS=',' read -ra BLOCKED_ARRAY <<< "$BLOCKED_LICENSES"
          for license in "${BLOCKED_ARRAY[@]}"; do
            if grep -qi "\"$license\"" license-reports/nuget-licenses.json 2>/dev/null; then
              BLOCKED_FOUND=1
              PACKAGES=$(grep -B5 "\"$license\"" license-reports/nuget-licenses.json | grep -o '"[^"]*"' | head -1 || echo "unknown")
              BLOCKED_PACKAGES="$BLOCKED_PACKAGES\n- $license: $PACKAGES"
            fi
          done

          if [[ $BLOCKED_FOUND -eq 1 ]]; then
            echo "::error::Blocked licenses found in NuGet packages:$BLOCKED_PACKAGES"
            echo "blocked=true" >> $GITHUB_OUTPUT
            echo "blocked_packages<<EOF" >> $GITHUB_OUTPUT
            echo -e "$BLOCKED_PACKAGES" >> $GITHUB_OUTPUT
            echo "EOF" >> $GITHUB_OUTPUT
          else
            echo "All NuGet packages have approved licenses"
            echo "blocked=false" >> $GITHUB_OUTPUT
          fi

      - name: Upload NuGet license report
        uses: actions/upload-artifact@v4
        with:
          name: nuget-license-report
          path: license-reports/
          retention-days: 30

  check-npm-licenses:
    name: npm License Check
    runs-on: ubuntu-latest
    outputs:
      # Expose the step-level result so the gate job can read it via needs.*
      blocked: ${{ steps.npm-check.outputs.blocked }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Find package.json files
        id: find-packages
        run: |
          PACKAGES=$(find . -name "package.json" -not -path "*/node_modules/*" -not -path "*/bin/*" -not -path "*/obj/*" | head -10)
          echo "Found package.json files:"
          echo "$PACKAGES"
          echo "packages<<EOF" >> $GITHUB_OUTPUT
          echo "$PACKAGES" >> $GITHUB_OUTPUT
          echo "EOF" >> $GITHUB_OUTPUT

      - name: Install license-checker
        run: npm install -g license-checker

      - name: Check npm licenses
        id: npm-check
        run: |
          mkdir -p license-reports
          BLOCKED_FOUND=0
          BLOCKED_PACKAGES=""

          # Check each package.json directory
          while IFS= read -r pkg; do
            if [[ -z "$pkg" ]]; then continue; fi

            DIR=$(dirname "$pkg")
            echo "Checking $DIR..."

            cd "$DIR"
            if [[ -f "package-lock.json" ]] || [[ -f "yarn.lock" ]]; then
              npm install --ignore-scripts 2>/dev/null || true

              # Run license checker
              license-checker --json > "${GITHUB_WORKSPACE}/license-reports/npm-$(basename $DIR).json" 2>/dev/null || true

              # Check for blocked licenses
              IFS=',' read -ra BLOCKED_ARRAY <<< "$BLOCKED_LICENSES"
              for license in "${BLOCKED_ARRAY[@]}"; do
                if grep -qi "\"$license\"" "${GITHUB_WORKSPACE}/license-reports/npm-$(basename $DIR).json" 2>/dev/null; then
                  BLOCKED_FOUND=1
                  BLOCKED_PACKAGES="$BLOCKED_PACKAGES\n- $license in $DIR"
                fi
              done
            fi
            cd "$GITHUB_WORKSPACE"
          done <<< "${{ steps.find-packages.outputs.packages }}"

          if [[ $BLOCKED_FOUND -eq 1 ]]; then
            echo "::error::Blocked licenses found in npm packages:$BLOCKED_PACKAGES"
            echo "blocked=true" >> $GITHUB_OUTPUT
          else
            echo "All npm packages have approved licenses"
            echo "blocked=false" >> $GITHUB_OUTPUT
          fi

      - name: Upload npm license report
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: npm-license-report
          path: license-reports/
          retention-days: 30

  gate:
    name: License Gate
    runs-on: ubuntu-latest
    needs: [check-nuget-licenses, check-npm-licenses]
    if: always()
    steps:
      - name: Check results
        run: |
          NUGET_BLOCKED="${{ needs.check-nuget-licenses.outputs.blocked }}"
          NPM_BLOCKED="${{ needs.check-npm-licenses.outputs.blocked }}"

          echo "## License Compliance Results" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Check | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|-------|--------|" >> $GITHUB_STEP_SUMMARY

          if [[ "$NUGET_BLOCKED" == "true" ]]; then
            echo "| NuGet | ❌ Blocked licenses found |" >> $GITHUB_STEP_SUMMARY
          else
            echo "| NuGet | ✅ Approved |" >> $GITHUB_STEP_SUMMARY
          fi

          if [[ "$NPM_BLOCKED" == "true" ]]; then
            echo "| npm | ❌ Blocked licenses found |" >> $GITHUB_STEP_SUMMARY
          else
            echo "| npm | ✅ Approved |" >> $GITHUB_STEP_SUMMARY
          fi

          if [[ "$NUGET_BLOCKED" == "true" ]] || [[ "$NPM_BLOCKED" == "true" ]]; then
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "### Blocked Licenses" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "The following licenses are not compatible with AGPL-3.0:" >> $GITHUB_STEP_SUMMARY
            echo "\`$BLOCKED_LICENSES\`" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "Please replace the offending packages or request an exception." >> $GITHUB_STEP_SUMMARY

            echo "::error::License compliance check failed"
            exit 1
          fi

          echo "" >> $GITHUB_STEP_SUMMARY
          echo "✅ All dependencies use approved licenses" >> $GITHUB_STEP_SUMMARY
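For what it's worth, license-checker can enforce the block list itself via `--failOn`, which could replace the manual grep loop above; a sketch, with the flag list abbreviated to three of the blocked identifiers:

```bash
# Sketch: let license-checker fail the build directly instead of grepping
# its JSON output. --failOn takes a semicolon-separated license list.
npx license-checker --json \
  --failOn 'GPL-2.0-only;SSPL-1.0;BUSL-1.1' \
  > license-reports/npm-licenses.json
```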
249
.gitea/workflows/dependency-security-scan.yml
Normal file
@@ -0,0 +1,249 @@
# Dependency Security Scan
# Sprint: CI/CD Enhancement - Dependency Management Automation
#
# Purpose: Scan dependencies for known vulnerabilities
# Schedule: Weekly and on PRs modifying package files

name: Dependency Security Scan

on:
  schedule:
    # Run weekly on Sundays at 02:00 UTC
    - cron: '0 2 * * 0'
  pull_request:
    paths:
      - 'src/Directory.Packages.props'
      - '**/package.json'
      - '**/package-lock.json'
      - '**/*.csproj'
  workflow_dispatch:
    inputs:
      fail_on_vulnerabilities:
        description: 'Fail if vulnerabilities found'
        required: false
        type: boolean
        default: true

env:
  DOTNET_VERSION: '10.0.100'

jobs:
  scan-nuget:
    name: NuGet Vulnerability Scan
    runs-on: ubuntu-latest
    outputs:
      vulnerabilities_found: ${{ steps.scan.outputs.vulnerabilities_found }}
      critical_count: ${{ steps.scan.outputs.critical_count }}
      high_count: ${{ steps.scan.outputs.high_count }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Restore packages
        run: dotnet restore src/StellaOps.sln

      - name: Scan for vulnerabilities
        id: scan
        run: |
          mkdir -p security-reports

          echo "Scanning NuGet packages for vulnerabilities..."

          # Run vulnerability check
          dotnet list src/StellaOps.sln package --vulnerable --include-transitive \
            > security-reports/nuget-vulnerabilities.txt 2>&1 || true

          # Parse results. grep -c prints 0 and exits non-zero on zero matches,
          # so swallow the failure with || true instead of echoing a second "0".
          CRITICAL=$(grep -c "Critical" security-reports/nuget-vulnerabilities.txt 2>/dev/null || true)
          HIGH=$(grep -c "High" security-reports/nuget-vulnerabilities.txt 2>/dev/null || true)
          MEDIUM=$(grep -c "Medium" security-reports/nuget-vulnerabilities.txt 2>/dev/null || true)
          LOW=$(grep -c "Low" security-reports/nuget-vulnerabilities.txt 2>/dev/null || true)

          TOTAL=$(( ${CRITICAL:-0} + ${HIGH:-0} + ${MEDIUM:-0} + ${LOW:-0} ))

          echo "=== Vulnerability Summary ==="
          echo "Critical: ${CRITICAL:-0}"
          echo "High: ${HIGH:-0}"
          echo "Medium: ${MEDIUM:-0}"
          echo "Low: ${LOW:-0}"
          echo "Total: $TOTAL"

          echo "critical_count=${CRITICAL:-0}" >> $GITHUB_OUTPUT
          echo "high_count=${HIGH:-0}" >> $GITHUB_OUTPUT
          echo "medium_count=${MEDIUM:-0}" >> $GITHUB_OUTPUT
          echo "low_count=${LOW:-0}" >> $GITHUB_OUTPUT

          if [[ $TOTAL -gt 0 ]]; then
            echo "vulnerabilities_found=true" >> $GITHUB_OUTPUT
          else
            echo "vulnerabilities_found=false" >> $GITHUB_OUTPUT
          fi

          # Show detailed report
          echo ""
          echo "=== Detailed Report ==="
          cat security-reports/nuget-vulnerabilities.txt

      - name: Upload NuGet security report
        uses: actions/upload-artifact@v4
        with:
          name: nuget-security-report
          path: security-reports/
          retention-days: 90
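The NuGet scan is easy to reproduce locally with the same CLI call the step uses:

```bash
# Local reproduction of the scan step; requires a restored solution.
dotnet restore src/StellaOps.sln
dotnet list src/StellaOps.sln package --vulnerable --include-transitive
```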

  scan-npm:
    name: npm Vulnerability Scan
    runs-on: ubuntu-latest
    outputs:
      vulnerabilities_found: ${{ steps.scan.outputs.vulnerabilities_found }}
      critical_count: ${{ steps.scan.outputs.critical_count }}
      high_count: ${{ steps.scan.outputs.high_count }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Find and scan package.json files
        id: scan
        run: |
          mkdir -p security-reports

          TOTAL_CRITICAL=0
          TOTAL_HIGH=0
          TOTAL_MEDIUM=0
          TOTAL_LOW=0
          VULNERABILITIES_FOUND=false

          # Find all package.json files
          PACKAGES=$(find . -name "package.json" -not -path "*/node_modules/*" -not -path "*/bin/*" -not -path "*/obj/*")

          for pkg in $PACKAGES; do
            DIR=$(dirname "$pkg")
            if [[ ! -f "$DIR/package-lock.json" ]] && [[ ! -f "$DIR/yarn.lock" ]]; then
              continue
            fi

            echo "Scanning $DIR..."
            cd "$DIR"

            # Install dependencies
            npm install --ignore-scripts 2>/dev/null || true

            # Run npm audit
            REPORT_FILE="${GITHUB_WORKSPACE}/security-reports/npm-audit-$(basename $DIR).json"
            npm audit --json > "$REPORT_FILE" 2>/dev/null || true

            # Parse results
            if [[ -f "$REPORT_FILE" ]]; then
              CRITICAL=$(jq '.metadata.vulnerabilities.critical // 0' "$REPORT_FILE" 2>/dev/null || echo "0")
              HIGH=$(jq '.metadata.vulnerabilities.high // 0' "$REPORT_FILE" 2>/dev/null || echo "0")
              MEDIUM=$(jq '.metadata.vulnerabilities.moderate // 0' "$REPORT_FILE" 2>/dev/null || echo "0")
              LOW=$(jq '.metadata.vulnerabilities.low // 0' "$REPORT_FILE" 2>/dev/null || echo "0")

              TOTAL_CRITICAL=$((TOTAL_CRITICAL + CRITICAL))
              TOTAL_HIGH=$((TOTAL_HIGH + HIGH))
              TOTAL_MEDIUM=$((TOTAL_MEDIUM + MEDIUM))
              TOTAL_LOW=$((TOTAL_LOW + LOW))

              if [[ $((CRITICAL + HIGH + MEDIUM + LOW)) -gt 0 ]]; then
                VULNERABILITIES_FOUND=true
              fi
            fi

            cd "$GITHUB_WORKSPACE"
          done

          echo "=== npm Vulnerability Summary ==="
          echo "Critical: $TOTAL_CRITICAL"
          echo "High: $TOTAL_HIGH"
          echo "Medium: $TOTAL_MEDIUM"
          echo "Low: $TOTAL_LOW"

          echo "critical_count=$TOTAL_CRITICAL" >> $GITHUB_OUTPUT
          echo "high_count=$TOTAL_HIGH" >> $GITHUB_OUTPUT
          echo "vulnerabilities_found=$VULNERABILITIES_FOUND" >> $GITHUB_OUTPUT

      - name: Upload npm security report
        uses: actions/upload-artifact@v4
        with:
          name: npm-security-report
          path: security-reports/
          retention-days: 90
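Likewise, the per-directory npm counts can be checked locally with the same `npm audit --json` plus jq pipeline (run inside any lockfile-bearing package directory):

```bash
# Local reproduction of the npm parse step.
npm audit --json > audit.json 2>/dev/null || true
jq '.metadata.vulnerabilities | {critical: (.critical // 0), high: (.high // 0)}' audit.json
```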

  summary:
    name: Security Summary
    runs-on: ubuntu-latest
    needs: [scan-nuget, scan-npm]
    if: always()

    steps:
      - name: Generate summary
        run: |
          NUGET_VULNS="${{ needs.scan-nuget.outputs.vulnerabilities_found }}"
          NPM_VULNS="${{ needs.scan-npm.outputs.vulnerabilities_found }}"

          NUGET_CRITICAL="${{ needs.scan-nuget.outputs.critical_count }}"
          NUGET_HIGH="${{ needs.scan-nuget.outputs.high_count }}"
          NPM_CRITICAL="${{ needs.scan-npm.outputs.critical_count }}"
          NPM_HIGH="${{ needs.scan-npm.outputs.high_count }}"

          echo "## Dependency Security Scan Results" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "### NuGet Packages" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Severity | Count |" >> $GITHUB_STEP_SUMMARY
          echo "|----------|-------|" >> $GITHUB_STEP_SUMMARY
          echo "| Critical | ${NUGET_CRITICAL:-0} |" >> $GITHUB_STEP_SUMMARY
          echo "| High | ${NUGET_HIGH:-0} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          echo "### npm Packages" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Severity | Count |" >> $GITHUB_STEP_SUMMARY
          echo "|----------|-------|" >> $GITHUB_STEP_SUMMARY
          echo "| Critical | ${NPM_CRITICAL:-0} |" >> $GITHUB_STEP_SUMMARY
          echo "| High | ${NPM_HIGH:-0} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          # Determine overall status
          TOTAL_CRITICAL=$((${NUGET_CRITICAL:-0} + ${NPM_CRITICAL:-0}))
          TOTAL_HIGH=$((${NUGET_HIGH:-0} + ${NPM_HIGH:-0}))

          if [[ $TOTAL_CRITICAL -gt 0 ]]; then
            echo "### ⚠️ Critical Vulnerabilities Found" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "Please review and remediate critical vulnerabilities before merging." >> $GITHUB_STEP_SUMMARY
          elif [[ $TOTAL_HIGH -gt 0 ]]; then
            echo "### ⚠️ High Severity Vulnerabilities Found" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "Please review high severity vulnerabilities." >> $GITHUB_STEP_SUMMARY
          else
            echo "### ✅ No Critical or High Vulnerabilities" >> $GITHUB_STEP_SUMMARY
          fi

      - name: Check gate
        if: github.event.inputs.fail_on_vulnerabilities == 'true' || github.event_name == 'pull_request'
        run: |
          NUGET_CRITICAL="${{ needs.scan-nuget.outputs.critical_count }}"
          NPM_CRITICAL="${{ needs.scan-npm.outputs.critical_count }}"

          TOTAL_CRITICAL=$((${NUGET_CRITICAL:-0} + ${NPM_CRITICAL:-0}))

          if [[ $TOTAL_CRITICAL -gt 0 ]]; then
            echo "::error::$TOTAL_CRITICAL critical vulnerabilities found in dependencies"
            exit 1
          fi

          echo "Security scan passed - no critical vulnerabilities"
512
.gitea/workflows/migration-test.yml
Normal file
@@ -0,0 +1,512 @@
# .gitea/workflows/migration-test.yml
# Database Migration Testing Workflow
# Sprint: CI/CD Enhancement - Migration Safety
#
# Purpose: Validate database migrations work correctly in both directions
# - Forward migrations (upgrade)
# - Backward migrations (rollback)
# - Idempotency checks (re-running migrations)
# - Data integrity verification
#
# Triggers:
# - Pull requests that modify migration files
# - Scheduled daily validation
# - Manual dispatch for full migration suite
#
# Prerequisites:
# - PostgreSQL 16+ database
# - EF Core migrations in src/**/Migrations/
# - Migration scripts in devops/database/migrations/

name: Migration Testing

on:
  push:
    branches: [main]
    paths:
      - '**/Migrations/**'
      - 'devops/database/**'
  pull_request:
    paths:
      - '**/Migrations/**'
      - 'devops/database/**'
  schedule:
    - cron: '30 4 * * *' # Daily at 4:30 AM UTC
  workflow_dispatch:
    inputs:
      test_rollback:
        description: 'Test rollback migrations'
        type: boolean
        default: true
      test_idempotency:
        description: 'Test migration idempotency'
        type: boolean
        default: true
      target_module:
        description: 'Specific module to test (empty = all)'
        type: string
        default: ''
      baseline_version:
        description: 'Baseline version to test from'
        type: string
        default: ''

env:
  DOTNET_VERSION: '10.0.100'
  DOTNET_NOLOGO: 1
  DOTNET_CLI_TELEMETRY_OPTOUT: 1
  TZ: UTC
  POSTGRES_HOST: localhost
  POSTGRES_PORT: 5432
  POSTGRES_USER: stellaops_migration
  POSTGRES_PASSWORD: migration_test_password
  POSTGRES_DB: stellaops_migration_test

jobs:
  # ===========================================================================
  # DISCOVER MODULES WITH MIGRATIONS
  # ===========================================================================

  discover:
    name: Discover Migrations
    runs-on: ubuntu-22.04
    outputs:
      modules: ${{ steps.find.outputs.modules }}
      module_count: ${{ steps.find.outputs.count }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Find modules with migrations
        id: find
        run: |
          # Find all EF Core migration directories
          MODULES=$(find src -type d -name "Migrations" -path "*/Persistence/*" | \
            sed 's|/Migrations||' | \
            sort -u | \
            jq -R -s -c 'split("\n") | map(select(length > 0))')

          COUNT=$(echo "$MODULES" | jq 'length')

          echo "Found $COUNT modules with migrations"
          echo "$MODULES" | jq -r '.[]'

          # Filter by target module if specified
          if [[ -n "${{ github.event.inputs.target_module }}" ]]; then
            MODULES=$(echo "$MODULES" | jq -c --arg target "${{ github.event.inputs.target_module }}" \
              'map(select(contains($target)))')
            COUNT=$(echo "$MODULES" | jq 'length')
            echo "Filtered to $COUNT modules matching: ${{ github.event.inputs.target_module }}"
          fi

          echo "modules=$MODULES" >> $GITHUB_OUTPUT
          echo "count=$COUNT" >> $GITHUB_OUTPUT

      - name: Display discovered modules
        run: |
          echo "## Discovered Migration Modules" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Module | Path |" >> $GITHUB_STEP_SUMMARY
          echo "|--------|------|" >> $GITHUB_STEP_SUMMARY
          for path in $(echo '${{ steps.find.outputs.modules }}' | jq -r '.[]'); do
            module=$(basename $(dirname "$path"))
            echo "| $module | $path |" >> $GITHUB_STEP_SUMMARY
          done

  # ===========================================================================
  # FORWARD MIGRATION TESTS
  # ===========================================================================

  forward-migrations:
    name: Forward Migration
    runs-on: ubuntu-22.04
    timeout-minutes: 30
    needs: discover
    if: needs.discover.outputs.module_count != '0'
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: ${{ env.POSTGRES_USER }}
          POSTGRES_PASSWORD: ${{ env.POSTGRES_PASSWORD }}
          POSTGRES_DB: ${{ env.POSTGRES_DB }}
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    strategy:
      fail-fast: false
      matrix:
        module: ${{ fromJson(needs.discover.outputs.modules) }}

    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Install EF Core tools
        run: dotnet tool install -g dotnet-ef

      - name: Get module name
        id: module
        run: |
          MODULE_NAME=$(basename $(dirname "${{ matrix.module }}"))
          echo "name=$MODULE_NAME" >> $GITHUB_OUTPUT
          echo "Testing module: $MODULE_NAME"

      - name: Find project file
        id: project
        run: |
          # Find the csproj file in the persistence directory
          PROJECT_FILE=$(find "${{ matrix.module }}" -maxdepth 1 -name "*.csproj" | head -1)
          if [[ -z "$PROJECT_FILE" ]]; then
            echo "::error::No project file found in ${{ matrix.module }}"
            exit 1
          fi
          echo "project=$PROJECT_FILE" >> $GITHUB_OUTPUT
          echo "Found project: $PROJECT_FILE"

      - name: Create fresh database
        run: |
          PGPASSWORD=${{ env.POSTGRES_PASSWORD }} psql -h ${{ env.POSTGRES_HOST }} \
            -U ${{ env.POSTGRES_USER }} -d postgres \
            -c "DROP DATABASE IF EXISTS ${{ env.POSTGRES_DB }}_${{ steps.module.outputs.name }};"
          PGPASSWORD=${{ env.POSTGRES_PASSWORD }} psql -h ${{ env.POSTGRES_HOST }} \
            -U ${{ env.POSTGRES_USER }} -d postgres \
            -c "CREATE DATABASE ${{ env.POSTGRES_DB }}_${{ steps.module.outputs.name }};"

      - name: Apply all migrations (forward)
        id: forward
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
        run: |
          echo "Applying migrations for ${{ steps.module.outputs.name }}..."

          # List available migrations first
          dotnet ef migrations list --project "${{ steps.project.outputs.project }}" \
            --no-build 2>/dev/null || true

          # Apply all migrations
          START_TIME=$(date +%s)
          dotnet ef database update --project "${{ steps.project.outputs.project }}"
          END_TIME=$(date +%s)
          DURATION=$((END_TIME - START_TIME))

          echo "duration=$DURATION" >> $GITHUB_OUTPUT
          echo "Migration completed in ${DURATION}s"

      - name: Verify schema
        env:
          PGPASSWORD: ${{ env.POSTGRES_PASSWORD }}
        run: |
          echo "## Schema verification for ${{ steps.module.outputs.name }}" >> $GITHUB_STEP_SUMMARY

          # Get table count
          TABLE_COUNT=$(psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} \
            -d "${{ env.POSTGRES_DB }}_${{ steps.module.outputs.name }}" -t -c \
            "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'public';")

          echo "- Tables created: $TABLE_COUNT" >> $GITHUB_STEP_SUMMARY
          echo "- Migration time: ${{ steps.forward.outputs.duration }}s" >> $GITHUB_STEP_SUMMARY

          # List tables
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "<details><summary>Tables</summary>" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo '```' >> $GITHUB_STEP_SUMMARY
          psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} \
            -d "${{ env.POSTGRES_DB }}_${{ steps.module.outputs.name }}" -c \
            "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public' ORDER BY table_name;" >> $GITHUB_STEP_SUMMARY
          echo '```' >> $GITHUB_STEP_SUMMARY
          echo "</details>" >> $GITHUB_STEP_SUMMARY

      - name: Upload migration log
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: migration-forward-${{ steps.module.outputs.name }}
          path: |
            **/*.migration.log
          retention-days: 7
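A minimal local walk-through of the forward path, assuming a Postgres instance on localhost and a hypothetical module path:

```bash
# Local sketch of the forward-migration job. The module path is hypothetical;
# the connection string mirrors the workflow's env values.
export ConnectionStrings__Default="Host=localhost;Port=5432;Database=stellaops_migration_test;Username=stellaops_migration;Password=migration_test_password"
dotnet tool install -g dotnet-ef
dotnet ef migrations list --project src/SomeModule/StellaOps.SomeModule.Persistence
dotnet ef database update --project src/SomeModule/StellaOps.SomeModule.Persistence
```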

  # ===========================================================================
  # ROLLBACK MIGRATION TESTS
  # ===========================================================================

  rollback-migrations:
    name: Rollback Migration
    runs-on: ubuntu-22.04
    timeout-minutes: 30
    needs: [discover, forward-migrations]
    if: |
      needs.discover.outputs.module_count != '0' &&
      (github.event_name == 'schedule' || github.event.inputs.test_rollback == 'true')
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: ${{ env.POSTGRES_USER }}
          POSTGRES_PASSWORD: ${{ env.POSTGRES_PASSWORD }}
          POSTGRES_DB: ${{ env.POSTGRES_DB }}
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    strategy:
      fail-fast: false
      matrix:
        module: ${{ fromJson(needs.discover.outputs.modules) }}

    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Install EF Core tools
        run: dotnet tool install -g dotnet-ef

      - name: Get module info
        id: module
        run: |
          MODULE_NAME=$(basename $(dirname "${{ matrix.module }}"))
          echo "name=$MODULE_NAME" >> $GITHUB_OUTPUT

          PROJECT_FILE=$(find "${{ matrix.module }}" -maxdepth 1 -name "*.csproj" | head -1)
          echo "project=$PROJECT_FILE" >> $GITHUB_OUTPUT

      - name: Create and migrate database
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_rb_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
          PGPASSWORD: ${{ env.POSTGRES_PASSWORD }}
        run: |
          # Create database
          psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} -d postgres \
            -c "DROP DATABASE IF EXISTS ${{ env.POSTGRES_DB }}_rb_${{ steps.module.outputs.name }};"
          psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} -d postgres \
            -c "CREATE DATABASE ${{ env.POSTGRES_DB }}_rb_${{ steps.module.outputs.name }};"

          # Apply all migrations
          dotnet ef database update --project "${{ steps.module.outputs.project }}"

      - name: Get migration list
        id: migrations
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_rb_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
        run: |
          # Get list of applied migrations. grep -E does not understand \d, so
          # match [0-9]; || true keeps the step alive when nothing matches.
          MIGRATIONS=$(dotnet ef migrations list --project "${{ steps.module.outputs.project }}" \
            --no-build 2>/dev/null | grep -E "^[0-9]{14}_" | tail -5 || true)

          MIGRATION_COUNT=$(echo "$MIGRATIONS" | wc -l)
          echo "count=$MIGRATION_COUNT" >> $GITHUB_OUTPUT

          if [[ $MIGRATION_COUNT -gt 1 ]]; then
            # Get the second-to-last migration for rollback target
            ROLLBACK_TARGET=$(echo "$MIGRATIONS" | tail -2 | head -1)
            echo "rollback_to=$ROLLBACK_TARGET" >> $GITHUB_OUTPUT
            echo "Will rollback to: $ROLLBACK_TARGET"
          else
            echo "rollback_to=" >> $GITHUB_OUTPUT
            echo "Not enough migrations to test rollback"
          fi

      - name: Test rollback
        if: steps.migrations.outputs.rollback_to != ''
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_rb_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
        run: |
          echo "Rolling back to: ${{ steps.migrations.outputs.rollback_to }}"
          dotnet ef database update "${{ steps.migrations.outputs.rollback_to }}" \
            --project "${{ steps.module.outputs.project }}"

          echo "Rollback successful!"

      - name: Test re-apply after rollback
        if: steps.migrations.outputs.rollback_to != ''
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_rb_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
        run: |
          echo "Re-applying migrations after rollback..."
          dotnet ef database update --project "${{ steps.module.outputs.project }}"

          echo "Re-apply successful!"

      - name: Report rollback results
        if: always()
        run: |
          echo "## Rollback Test: ${{ steps.module.outputs.name }}" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          if [[ -n "${{ steps.migrations.outputs.rollback_to }}" ]]; then
            echo "- Rollback target: ${{ steps.migrations.outputs.rollback_to }}" >> $GITHUB_STEP_SUMMARY
            echo "- Status: Tested" >> $GITHUB_STEP_SUMMARY
          else
            echo "- Status: Skipped (insufficient migrations)" >> $GITHUB_STEP_SUMMARY
          fi

  # ===========================================================================
  # IDEMPOTENCY TESTS
  # ===========================================================================

  idempotency:
    name: Idempotency Test
    runs-on: ubuntu-22.04
    timeout-minutes: 20
    needs: [discover, forward-migrations]
    if: |
      needs.discover.outputs.module_count != '0' &&
      (github.event_name == 'schedule' || github.event.inputs.test_idempotency == 'true')
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: ${{ env.POSTGRES_USER }}
          POSTGRES_PASSWORD: ${{ env.POSTGRES_PASSWORD }}
          POSTGRES_DB: ${{ env.POSTGRES_DB }}
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    strategy:
      fail-fast: false
      matrix:
        module: ${{ fromJson(needs.discover.outputs.modules) }}

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Install EF Core tools
        run: dotnet tool install -g dotnet-ef

      - name: Get module info
        id: module
        run: |
          MODULE_NAME=$(basename $(dirname "${{ matrix.module }}"))
          echo "name=$MODULE_NAME" >> $GITHUB_OUTPUT

          PROJECT_FILE=$(find "${{ matrix.module }}" -maxdepth 1 -name "*.csproj" | head -1)
          echo "project=$PROJECT_FILE" >> $GITHUB_OUTPUT

      - name: Setup database
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_idem_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
          PGPASSWORD: ${{ env.POSTGRES_PASSWORD }}
        run: |
          psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} -d postgres \
            -c "DROP DATABASE IF EXISTS ${{ env.POSTGRES_DB }}_idem_${{ steps.module.outputs.name }};"
          psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} -d postgres \
            -c "CREATE DATABASE ${{ env.POSTGRES_DB }}_idem_${{ steps.module.outputs.name }};"

      - name: First migration run
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_idem_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
        run: |
          dotnet ef database update --project "${{ steps.module.outputs.project }}"

      - name: Get initial schema hash
        id: hash1
        env:
          PGPASSWORD: ${{ env.POSTGRES_PASSWORD }}
        run: |
          SCHEMA_HASH=$(psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} \
            -d "${{ env.POSTGRES_DB }}_idem_${{ steps.module.outputs.name }}" -t -c \
            "SELECT md5(string_agg(table_name || column_name || data_type, '' ORDER BY table_name, column_name))
             FROM information_schema.columns WHERE table_schema = 'public';")
          echo "hash=$SCHEMA_HASH" >> $GITHUB_OUTPUT
          echo "Initial schema hash: $SCHEMA_HASH"

      - name: Second migration run (idempotency test)
        env:
          ConnectionStrings__Default: "Host=${{ env.POSTGRES_HOST }};Port=${{ env.POSTGRES_PORT }};Database=${{ env.POSTGRES_DB }}_idem_${{ steps.module.outputs.name }};Username=${{ env.POSTGRES_USER }};Password=${{ env.POSTGRES_PASSWORD }}"
        run: |
          # Running migrations again should be a no-op
          dotnet ef database update --project "${{ steps.module.outputs.project }}"

      - name: Get final schema hash
        id: hash2
        env:
          PGPASSWORD: ${{ env.POSTGRES_PASSWORD }}
        run: |
          SCHEMA_HASH=$(psql -h ${{ env.POSTGRES_HOST }} -U ${{ env.POSTGRES_USER }} \
            -d "${{ env.POSTGRES_DB }}_idem_${{ steps.module.outputs.name }}" -t -c \
            "SELECT md5(string_agg(table_name || column_name || data_type, '' ORDER BY table_name, column_name))
             FROM information_schema.columns WHERE table_schema = 'public';")
          echo "hash=$SCHEMA_HASH" >> $GITHUB_OUTPUT
          echo "Final schema hash: $SCHEMA_HASH"

      - name: Verify idempotency
        run: |
          HASH1="${{ steps.hash1.outputs.hash }}"
          HASH2="${{ steps.hash2.outputs.hash }}"

          echo "## Idempotency Test: ${{ steps.module.outputs.name }}" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "- Initial schema hash: $HASH1" >> $GITHUB_STEP_SUMMARY
          echo "- Final schema hash: $HASH2" >> $GITHUB_STEP_SUMMARY

          if [[ "$HASH1" == "$HASH2" ]]; then
            echo "- Result: PASS (schemas identical)" >> $GITHUB_STEP_SUMMARY
          else
            echo "- Result: FAIL (schemas differ)" >> $GITHUB_STEP_SUMMARY
            echo "::error::Idempotency test failed for ${{ steps.module.outputs.name }}"
            exit 1
          fi
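The idempotency check boils down to hashing the column catalog before and after a second `database update`; a condensed local sketch of the same technique (host, credentials, and the `$PROJECT` path are assumptions):

```bash
# Condensed local version of the idempotency check (assumes a local Postgres
# reachable with the workflow's credentials; $PROJECT is a hypothetical
# path to a Persistence csproj).
hash_schema() {
  PGPASSWORD=migration_test_password psql -h localhost -U stellaops_migration -d "$1" -t -A -c \
    "SELECT md5(string_agg(table_name || column_name || data_type, '' ORDER BY table_name, column_name))
     FROM information_schema.columns WHERE table_schema = 'public';"
}
H1=$(hash_schema stellaops_migration_test)
dotnet ef database update --project "$PROJECT"   # second run should be a no-op
H2=$(hash_schema stellaops_migration_test)
[[ "$H1" == "$H2" ]] && echo "idempotent" || { echo "schema drifted"; exit 1; }
```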

  # ===========================================================================
  # SUMMARY
  # ===========================================================================

  summary:
    name: Migration Summary
    runs-on: ubuntu-22.04
    needs: [discover, forward-migrations, rollback-migrations, idempotency]
    if: always()
    steps:
      - name: Generate Summary
        run: |
          echo "## Migration Test Summary" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Test | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|------|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| Discovery | ${{ needs.discover.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Forward Migrations | ${{ needs.forward-migrations.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Rollback Migrations | ${{ needs.rollback-migrations.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Idempotency | ${{ needs.idempotency.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "### Modules Tested: ${{ needs.discover.outputs.module_count }}" >> $GITHUB_STEP_SUMMARY

      - name: Check for failures
        if: contains(needs.*.result, 'failure')
        run: exit 1
483
.gitea/workflows/nightly-regression.yml
Normal file
@@ -0,0 +1,483 @@
# .gitea/workflows/nightly-regression.yml
# Nightly Full-Suite Regression Testing
# Sprint: CI/CD Enhancement - Comprehensive Testing
#
# Purpose: Run comprehensive regression tests that are too expensive for PR gating
# - Full test matrix (all categories)
# - Extended integration tests
# - Performance benchmarks with historical comparison
# - Cross-module dependency validation
# - Determinism verification
#
# Schedule: Daily at 2:00 AM UTC (off-peak hours)
#
# Notifications: Slack/Teams on failure

name: Nightly Regression

on:
  schedule:
    - cron: '0 2 * * *' # Daily at 2:00 AM UTC
  workflow_dispatch:
    inputs:
      skip_performance:
        description: 'Skip performance tests'
        type: boolean
        default: false
      skip_determinism:
        description: 'Skip determinism tests'
        type: boolean
        default: false
      notify_on_success:
        description: 'Send notification on success'
        type: boolean
        default: false

env:
  DOTNET_VERSION: '10.0.100'
  DOTNET_NOLOGO: 1
  DOTNET_CLI_TELEMETRY_OPTOUT: 1
  DOTNET_SYSTEM_GLOBALIZATION_INVARIANT: 1
  TZ: UTC

jobs:
  # ===========================================================================
  # PREPARE NIGHTLY RUN
  # ===========================================================================

  prepare:
    name: Prepare Nightly Run
    runs-on: ubuntu-22.04
    outputs:
      run_id: ${{ steps.metadata.outputs.run_id }}
      run_date: ${{ steps.metadata.outputs.run_date }}
      commit_sha: ${{ steps.metadata.outputs.commit_sha }}
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Generate run metadata
        id: metadata
        run: |
          RUN_ID="nightly-$(date -u +%Y%m%d-%H%M%S)"
          RUN_DATE=$(date -u +%Y-%m-%d)
          COMMIT_SHA=$(git rev-parse HEAD)

          echo "run_id=$RUN_ID" >> $GITHUB_OUTPUT
          echo "run_date=$RUN_DATE" >> $GITHUB_OUTPUT
          echo "commit_sha=$COMMIT_SHA" >> $GITHUB_OUTPUT

          echo "## Nightly Regression Run" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "- **Run ID:** $RUN_ID" >> $GITHUB_STEP_SUMMARY
          echo "- **Date:** $RUN_DATE" >> $GITHUB_STEP_SUMMARY
          echo "- **Commit:** $COMMIT_SHA" >> $GITHUB_STEP_SUMMARY

      - name: Check recent commits
        run: |
          echo "### Recent Commits" >> $GITHUB_STEP_SUMMARY
          echo '```' >> $GITHUB_STEP_SUMMARY
          git log --oneline -10 >> $GITHUB_STEP_SUMMARY
          echo '```' >> $GITHUB_STEP_SUMMARY

  # ===========================================================================
  # FULL BUILD VERIFICATION
  # ===========================================================================

  build:
    name: Full Build
    runs-on: ubuntu-22.04
    timeout-minutes: 30
    needs: prepare
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Restore dependencies
        run: dotnet restore src/StellaOps.sln

      - name: Build solution (Release)
        run: |
          START_TIME=$(date +%s)
          dotnet build src/StellaOps.sln --configuration Release --no-restore
          END_TIME=$(date +%s)
          DURATION=$((END_TIME - START_TIME))
          echo "build_time=$DURATION" >> $GITHUB_ENV
          echo "Build completed in ${DURATION}s"

      - name: Report build metrics
        run: |
          echo "### Build Metrics" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "- **Build Time:** ${{ env.build_time }}s" >> $GITHUB_STEP_SUMMARY
          echo "- **Configuration:** Release" >> $GITHUB_STEP_SUMMARY

  # ===========================================================================
  # COMPREHENSIVE TEST SUITE
  # ===========================================================================

  test-pr-gating:
    name: PR-Gating Tests
    runs-on: ubuntu-22.04
    timeout-minutes: 45
    needs: build
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: stellaops
          POSTGRES_PASSWORD: stellaops
          POSTGRES_DB: stellaops_test
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    strategy:
      fail-fast: false
      matrix:
        category:
          - Unit
          - Architecture
          - Contract
          - Integration
          - Security
          - Golden

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Run ${{ matrix.category }} Tests
        env:
          STELLAOPS_TEST_POSTGRES_CONNECTION: "Host=localhost;Port=5432;Database=stellaops_test;Username=stellaops;Password=stellaops"
        run: |
          chmod +x .gitea/scripts/test/run-test-category.sh
          .gitea/scripts/test/run-test-category.sh "${{ matrix.category }}"

      - name: Upload Test Results
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: nightly-test-${{ matrix.category }}
          path: ./TestResults/${{ matrix.category }}
          retention-days: 30

  test-extended:
    name: Extended Tests
    runs-on: ubuntu-22.04
    timeout-minutes: 60
    needs: build
    if: github.event.inputs.skip_performance != 'true'

    strategy:
      fail-fast: false
      matrix:
        category:
          - Performance
          - Benchmark
          - Resilience
          - Observability

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Run ${{ matrix.category }} Tests
        run: |
          chmod +x .gitea/scripts/test/run-test-category.sh
          .gitea/scripts/test/run-test-category.sh "${{ matrix.category }}"

      - name: Upload Test Results
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: nightly-extended-${{ matrix.category }}
          path: ./TestResults/${{ matrix.category }}
          retention-days: 30

  # ===========================================================================
  # DETERMINISM VERIFICATION
  # ===========================================================================

  determinism:
    name: Determinism Verification
    runs-on: ubuntu-22.04
    timeout-minutes: 45
    needs: build
    if: github.event.inputs.skip_determinism != 'true'
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: First build
        run: |
          dotnet build src/StellaOps.sln --configuration Release -o ./build-1
          # Strip the output directory from paths so the two checksum lists
          # are comparable line-for-line
          find ./build-1 -name "*.dll" -exec sha256sum {} \; | sed 's|^\./build-1/||' | sort > checksums-1.txt

      - name: Clean and rebuild
        run: |
          rm -rf ./build-1
          dotnet clean src/StellaOps.sln
          dotnet build src/StellaOps.sln --configuration Release -o ./build-2
          find ./build-2 -name "*.dll" -exec sha256sum {} \; | sed 's|^\./build-2/||' | sort > checksums-2.txt

      - name: Compare builds
        id: compare
        run: |
          echo "### Determinism Check" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          if diff checksums-1.txt checksums-2.txt > /dev/null; then
            echo "PASS: Builds are deterministic" >> $GITHUB_STEP_SUMMARY
            echo "deterministic=true" >> $GITHUB_OUTPUT
          else
            echo "FAIL: Builds differ" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "<details><summary>Differences</summary>" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo '```diff' >> $GITHUB_STEP_SUMMARY
            diff checksums-1.txt checksums-2.txt >> $GITHUB_STEP_SUMMARY || true
            echo '```' >> $GITHUB_STEP_SUMMARY
            echo "</details>" >> $GITHUB_STEP_SUMMARY
            echo "deterministic=false" >> $GITHUB_OUTPUT
            exit 1
          fi

      - name: Upload checksums
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: nightly-determinism-checksums
          path: checksums-*.txt
          retention-days: 30
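For the byte-for-byte comparison to have a chance of passing, the compiler itself must build deterministically. Whether StellaOps already sets this in its build props is an assumption here; if not, the standard Roslyn/MSBuild switches can be passed on the command line:

```bash
# Standard MSBuild determinism switches; whether the repo already sets these
# in Directory.Build.props is an assumption.
dotnet build src/StellaOps.sln --configuration Release \
  -p:Deterministic=true \
  -p:ContinuousIntegrationBuild=true \
  -o ./build-1
```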

  # ===========================================================================
  # CROSS-MODULE VALIDATION
  # ===========================================================================

  cross-module:
    name: Cross-Module Validation
    runs-on: ubuntu-22.04
    timeout-minutes: 30
    needs: build
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Check for circular dependencies
        run: |
          echo "### Dependency Analysis" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          # Build dependency graph
          echo "Analyzing project dependencies..."
          for proj in $(find src -name "*.csproj" ! -path "*/bin/*" ! -path "*/obj/*" | head -50); do
            # Extract ProjectReference entries
            refs=$(grep -oP 'ProjectReference Include="\K[^"]+' "$proj" 2>/dev/null || true)
            if [[ -n "$refs" ]]; then
              basename "$proj" >> deps.txt
              echo "$refs" | while read ref; do
                echo "  -> $(basename "$ref")" >> deps.txt
              done
            fi
          done

          if [[ -f deps.txt ]]; then
            echo "<details><summary>Project Dependencies (first 50)</summary>" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo '```' >> $GITHUB_STEP_SUMMARY
            head -100 deps.txt >> $GITHUB_STEP_SUMMARY
            echo '```' >> $GITHUB_STEP_SUMMARY
            echo "</details>" >> $GITHUB_STEP_SUMMARY
          fi

      - name: Validate no deprecated APIs
        run: |
          # Check for use of deprecated patterns. Swallow grep's non-zero exit
          # on zero matches so the count stays a single "0".
          DEPRECATED_COUNT=$(grep -r "Obsolete" src --include="*.cs" 2>/dev/null | wc -l || true)
          echo "- Obsolete attribute usages: ${DEPRECATED_COUNT:-0}" >> $GITHUB_STEP_SUMMARY

  # ===========================================================================
  # CODE COVERAGE REPORT
  # ===========================================================================

  coverage:
    name: Code Coverage
    runs-on: ubuntu-22.04
    timeout-minutes: 45
    needs: build
    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: stellaops
          POSTGRES_PASSWORD: stellaops
          POSTGRES_DB: stellaops_test
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Run tests with coverage
        env:
          STELLAOPS_TEST_POSTGRES_CONNECTION: "Host=localhost;Port=5432;Database=stellaops_test;Username=stellaops;Password=stellaops"
        run: |
          dotnet test src/StellaOps.sln \
            --configuration Release \
            --collect:"XPlat Code Coverage" \
            --results-directory ./TestResults/Coverage \
            --filter "Category=Unit|Category=Integration" \
            --verbosity minimal \
            -- DataCollectionRunSettings.DataCollectors.DataCollector.Configuration.Format=cobertura

      - name: Install ReportGenerator
        run: dotnet tool install -g dotnet-reportgenerator-globaltool

      - name: Generate coverage report
        run: |
          reportgenerator \
            -reports:"./TestResults/Coverage/**/coverage.cobertura.xml" \
            -targetdir:"./TestResults/CoverageReport" \
            -reporttypes:"Html;MarkdownSummary;Cobertura" \
            || true

      - name: Add coverage to summary
        run: |
          echo "### Code Coverage Report" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          if [[ -f "./TestResults/CoverageReport/Summary.md" ]]; then
            cat "./TestResults/CoverageReport/Summary.md" >> $GITHUB_STEP_SUMMARY
          else
            echo "Coverage report generation failed or no coverage data collected." >> $GITHUB_STEP_SUMMARY
          fi

      - name: Upload coverage report
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: nightly-coverage-report
          path: ./TestResults/CoverageReport
          retention-days: 30

  # ===========================================================================
  # SUMMARY AND NOTIFICATION
  # ===========================================================================

  summary:
    name: Nightly Summary
    runs-on: ubuntu-22.04
    needs:
      - prepare
      - build
      - test-pr-gating
      - test-extended
      - determinism
      - cross-module
      - coverage
    if: always()
    steps:
      - name: Generate final summary
        run: |
          echo "## Nightly Regression Summary" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "**Run ID:** ${{ needs.prepare.outputs.run_id }}" >> $GITHUB_STEP_SUMMARY
          echo "**Date:** ${{ needs.prepare.outputs.run_date }}" >> $GITHUB_STEP_SUMMARY
          echo "**Commit:** ${{ needs.prepare.outputs.commit_sha }}" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "### Job Results" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Job | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|-----|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| Build | ${{ needs.build.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| PR-Gating Tests | ${{ needs.test-pr-gating.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Extended Tests | ${{ needs.test-extended.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Determinism | ${{ needs.determinism.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Cross-Module | ${{ needs.cross-module.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Coverage | ${{ needs.coverage.result }} |" >> $GITHUB_STEP_SUMMARY

      - name: Determine overall status
        id: status
        run: |
          if [[ "${{ needs.build.result }}" == "failure" ]] || \
             [[ "${{ needs.test-pr-gating.result }}" == "failure" ]] || \
             [[ "${{ needs.determinism.result }}" == "failure" ]]; then
            echo "status=failure" >> $GITHUB_OUTPUT
          else
            echo "status=success" >> $GITHUB_OUTPUT
          fi

      # Placeholder for notifications - configure webhook URL in secrets
      - name: Send failure notification
        if: steps.status.outputs.status == 'failure'
        run: |
          echo "::warning::Nightly regression failed - notification would be sent here"
          # Uncomment and configure when webhook is available:
          # curl -X POST "${{ secrets.SLACK_WEBHOOK_URL }}" \
          #   -H "Content-Type: application/json" \
          #   -d '{
          #     "text": "Nightly Regression Failed",
          #     "attachments": [{
          #       "color": "danger",
          #       "fields": [
          #         {"title": "Run ID", "value": "${{ needs.prepare.outputs.run_id }}", "short": true},
          #         {"title": "Commit", "value": "${{ needs.prepare.outputs.commit_sha }}", "short": true}
          #       ]
          #     }]
          #   }'

      - name: Send success notification
        if: steps.status.outputs.status == 'success' && github.event.inputs.notify_on_success == 'true'
        run: |
          echo "::notice::Nightly regression passed"

      - name: Exit with appropriate code
        if: steps.status.outputs.status == 'failure'
        run: exit 1
@@ -532,6 +532,233 @@ jobs:
          path: out/release
          retention-days: 90

  # ===========================================================================
  # GENERATE CHANGELOG (AI-assisted)
  # ===========================================================================

  generate-changelog:
    name: Generate Changelog
    runs-on: ubuntu-22.04
    needs: [validate, build-modules]
    if: always() && needs.validate.result == 'success'
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Find previous release tag
        id: prev-tag
        run: |
          PREV_TAG=$(git tag -l "suite-*" --sort=-creatordate | head -1)
          echo "Previous tag: ${PREV_TAG:-none}"
          echo "prev_tag=${PREV_TAG}" >> $GITHUB_OUTPUT

      - name: Generate changelog
        env:
          AI_API_KEY: ${{ secrets.AI_API_KEY }}
        run: |
          VERSION="${{ needs.validate.outputs.version }}"
          CODENAME="${{ needs.validate.outputs.codename }}"
          PREV_TAG="${{ steps.prev-tag.outputs.prev_tag }}"

          mkdir -p out/docs

          ARGS="$VERSION --codename $CODENAME --output out/docs/CHANGELOG.md"
          if [[ -n "$PREV_TAG" ]]; then
            ARGS="$ARGS --from-tag $PREV_TAG"
          fi
          if [[ -n "$AI_API_KEY" ]]; then
            ARGS="$ARGS --ai"
          fi

          python3 .gitea/scripts/release/generate_changelog.py $ARGS
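
          # The call above expands to something like (illustrative values only;
          # codename and tags are hypothetical):
          #   generate_changelog.py 2025.04 --codename Aurora \
          #     --output out/docs/CHANGELOG.md --from-tag suite-2025.03 --ai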

          echo "=== Generated Changelog ==="
          head -50 out/docs/CHANGELOG.md

      - name: Upload changelog
        uses: actions/upload-artifact@v4
        with:
          name: changelog-${{ needs.validate.outputs.version }}
          path: out/docs/CHANGELOG.md
          retention-days: 90

  # ===========================================================================
  # GENERATE SUITE DOCUMENTATION
  # ===========================================================================

  generate-suite-docs:
    name: Generate Suite Docs
    runs-on: ubuntu-22.04
    needs: [validate, generate-changelog, release-manifest]
    if: always() && needs.validate.result == 'success'
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Install dependencies
        run: pip install python-dateutil

      - name: Download changelog
        uses: actions/download-artifact@v4
        with:
          name: changelog-${{ needs.validate.outputs.version }}
          path: changelog

      - name: Find previous version
        id: prev-version
        run: |
          PREV_TAG=$(git tag -l "suite-*" --sort=-creatordate | head -1)
          if [[ -n "$PREV_TAG" ]]; then
            PREV_VERSION=$(echo "$PREV_TAG" | sed 's/suite-//')
            echo "prev_version=$PREV_VERSION" >> $GITHUB_OUTPUT
          fi

      - name: Generate suite documentation
        run: |
          VERSION="${{ needs.validate.outputs.version }}"
          CODENAME="${{ needs.validate.outputs.codename }}"
          CHANNEL="${{ needs.validate.outputs.channel }}"
          PREV="${{ steps.prev-version.outputs.prev_version }}"

          ARGS="$VERSION $CODENAME --channel $CHANNEL"
          if [[ -f "changelog/CHANGELOG.md" ]]; then
            ARGS="$ARGS --changelog changelog/CHANGELOG.md"
          fi
          if [[ -n "$PREV" ]]; then
            ARGS="$ARGS --previous $PREV"
          fi

          python3 .gitea/scripts/release/generate_suite_docs.py $ARGS

          echo "=== Generated Documentation ==="
          ls -la docs/releases/$VERSION/

      - name: Upload suite docs
        uses: actions/upload-artifact@v4
        with:
          name: suite-docs-${{ needs.validate.outputs.version }}
          path: docs/releases/${{ needs.validate.outputs.version }}
          retention-days: 90

  # ===========================================================================
  # GENERATE DOCKER COMPOSE FILES
  # ===========================================================================

  generate-compose:
    name: Generate Docker Compose
    runs-on: ubuntu-22.04
    needs: [validate, release-manifest]
    if: always() && needs.validate.result == 'success'
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Generate Docker Compose files
        run: |
          VERSION="${{ needs.validate.outputs.version }}"
          CODENAME="${{ needs.validate.outputs.codename }}"

          mkdir -p out/compose

          # Standard compose
          python3 .gitea/scripts/release/generate_compose.py \
            "$VERSION" "$CODENAME" \
            --output out/compose/docker-compose.yml

          # Air-gap variant
          python3 .gitea/scripts/release/generate_compose.py \
            "$VERSION" "$CODENAME" \
            --airgap \
            --output out/compose/docker-compose.airgap.yml

          echo "=== Generated Compose Files ==="
          ls -la out/compose/

      - name: Upload compose files
        uses: actions/upload-artifact@v4
        with:
          name: compose-${{ needs.validate.outputs.version }}
          path: out/compose
          retention-days: 90

  # ===========================================================================
  # COMMIT DOCS TO REPOSITORY
  # ===========================================================================

  commit-docs:
    name: Commit Documentation
    runs-on: ubuntu-22.04
    needs: [validate, generate-suite-docs, generate-compose, create-release]
    if: needs.validate.outputs.dry_run != 'true' && needs.create-release.result == 'success'
    steps:
      - name: Checkout
        uses: actions/checkout@v4
        with:
          token: ${{ secrets.GITEA_TOKEN }}
          fetch-depth: 0

      - name: Download suite docs
        uses: actions/download-artifact@v4
        with:
          name: suite-docs-${{ needs.validate.outputs.version }}
          path: docs/releases/${{ needs.validate.outputs.version }}

      - name: Download compose files
        uses: actions/download-artifact@v4
        with:
          name: compose-${{ needs.validate.outputs.version }}
          path: docs/releases/${{ needs.validate.outputs.version }}

      - name: Commit documentation
        run: |
          VERSION="${{ needs.validate.outputs.version }}"
          CODENAME="${{ needs.validate.outputs.codename }}"

          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"

          git add "docs/releases/${VERSION}"

          if git diff --cached --quiet; then
            echo "No documentation changes to commit"
          else
            git commit -m "docs: add release documentation for ${VERSION} ${CODENAME}

          Generated documentation for StellaOps ${VERSION} \"${CODENAME}\"

          - README.md
          - CHANGELOG.md
          - services.md
          - upgrade-guide.md
          - docker-compose.yml
          - docker-compose.airgap.yml
          - manifest.yaml

          🤖 Generated with [Claude Code](https://claude.com/claude-code)

          Co-Authored-By: github-actions[bot] <github-actions[bot]@users.noreply.github.com>"

            git push
            echo "Documentation committed and pushed"
          fi

  # ===========================================================================
  # CREATE GITEA RELEASE
  # ===========================================================================
@@ -651,7 +878,7 @@ jobs:
  summary:
    name: Release Summary
    runs-on: ubuntu-22.04
    needs: [validate, build-modules, build-containers, build-cli, build-helm, release-manifest, create-release]
    needs: [validate, build-modules, build-containers, build-cli, build-helm, release-manifest, generate-changelog, generate-suite-docs, generate-compose, create-release, commit-docs]
    if: always()
    steps:
      - name: Generate Summary
@@ -674,7 +901,11 @@ jobs:
          echo "| Build CLI | ${{ needs.build-cli.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Build Helm | ${{ needs.build-helm.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Release Manifest | ${{ needs.release-manifest.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Generate Changelog | ${{ needs.generate-changelog.result || 'skipped' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Generate Suite Docs | ${{ needs.generate-suite-docs.result || 'skipped' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Generate Compose | ${{ needs.generate-compose.result || 'skipped' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Create Release | ${{ needs.create-release.result || 'skipped' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Commit Documentation | ${{ needs.commit-docs.result || 'skipped' }} |" >> $GITHUB_STEP_SUMMARY

      - name: Check for failures
        if: contains(needs.*.result, 'failure')
114
.gitea/workflows/renovate.yml
Normal file
@@ -0,0 +1,114 @@
# Renovate Bot Workflow for Gitea
# Sprint: CI/CD Enhancement - Dependency Management Automation
#
# Purpose: Run Renovate Bot to automatically update dependencies
# Schedule: Twice daily (03:00 and 15:00 UTC)
#
# Requirements:
# - RENOVATE_TOKEN secret with repo write access
# - renovate.json configuration in repo root
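#
# A minimal renovate.json sketch (illustrative assumption, not the repo's
# actual config; adjust the presets to local policy):
#
#   {
#     "$schema": "https://docs.renovatebot.com/renovate-schema.json",
#     "extends": ["config:recommended"],
#     "timezone": "UTC"
#   }
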
name: Renovate

on:
  schedule:
    # Run at 03:00 and 15:00 UTC
    - cron: '0 3,15 * * *'
  workflow_dispatch:
    inputs:
      dry_run:
        description: 'Dry run (no PRs created)'
        required: false
        type: boolean
        default: false
      log_level:
        description: 'Log level'
        required: false
        type: choice
        options:
          - debug
          - info
          - warn
        default: 'info'

env:
  RENOVATE_VERSION: '37.100.0'
  LOG_LEVEL: ${{ github.event.inputs.log_level || 'info' }}

jobs:
  renovate:
    name: Run Renovate
    runs-on: ubuntu-latest
    timeout-minutes: 30

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Validate configuration
        run: |
          if [[ ! -f "renovate.json" ]]; then
            echo "::error::renovate.json not found in repository root"
            exit 1
          fi
          echo "Renovate configuration found"
          head -20 renovate.json

      - name: Run Renovate
        env:
          RENOVATE_TOKEN: ${{ secrets.RENOVATE_TOKEN }}
          RENOVATE_PLATFORM: gitea
          RENOVATE_ENDPOINT: ${{ github.server_url }}/api/v1
          RENOVATE_REPOSITORIES: ${{ github.repository }}
          RENOVATE_DRY_RUN: ${{ github.event.inputs.dry_run == 'true' && 'full' || 'null' }}
          LOG_LEVEL: ${{ env.LOG_LEVEL }}
        run: |
          # Install Renovate
          npm install -g renovate@${{ env.RENOVATE_VERSION }}

          # Configure Renovate
          export RENOVATE_CONFIG_FILE="${GITHUB_WORKSPACE}/renovate.json"

          # Announce dry run mode (RENOVATE_DRY_RUN is already exported via step env)
          if [[ "$RENOVATE_DRY_RUN" == "full" ]]; then
            echo "Running in DRY RUN mode - no PRs will be created"
          fi

          # Run Renovate
          renovate \
            --platform="$RENOVATE_PLATFORM" \
            --endpoint="$RENOVATE_ENDPOINT" \
            --token="$RENOVATE_TOKEN" \
            "$RENOVATE_REPOSITORIES" \
            2>&1 | tee renovate.log

      - name: Upload Renovate log
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: renovate-log-${{ github.run_id }}
          path: renovate.log
          retention-days: 7

      - name: Summary
        if: always()
        run: |
          echo "## Renovate Run Summary" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Property | Value |" >> $GITHUB_STEP_SUMMARY
          echo "|----------|-------|" >> $GITHUB_STEP_SUMMARY
          echo "| Version | ${{ env.RENOVATE_VERSION }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Log Level | ${{ env.LOG_LEVEL }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Dry Run | ${{ github.event.inputs.dry_run || 'false' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Trigger | ${{ github.event_name }} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          if [[ -f renovate.log ]]; then
            # Count PRs created/updated; grep -c prints the count (even "0")
            # on stdout while exiting non-zero on no match, so only fall back
            # to a default when the log cannot be read at all
            CREATED=$(grep -c "PR created" renovate.log 2>/dev/null || true)
            UPDATED=$(grep -c "PR updated" renovate.log 2>/dev/null || true)
            echo "### Results" >> $GITHUB_STEP_SUMMARY
            echo "- PRs Created: ${CREATED:-0}" >> $GITHUB_STEP_SUMMARY
            echo "- PRs Updated: ${UPDATED:-0}" >> $GITHUB_STEP_SUMMARY
          fi
277
.gitea/workflows/rollback.yml
Normal file
@@ -0,0 +1,277 @@
# Emergency Rollback Workflow
# Sprint: CI/CD Enhancement - Deployment Safety
#
# Purpose: Automated rollback to previous known-good version
# Triggers: Manual dispatch only (emergency procedure)
#
# SLA Target: < 5 minutes from trigger to rollback complete

name: Emergency Rollback

on:
  workflow_dispatch:
    inputs:
      environment:
        description: 'Target environment'
        required: true
        type: choice
        options:
          - staging
          - production
      service:
        description: 'Service to rollback (or "all" for full rollback)'
        required: true
        type: choice
        options:
          - all
          - authority
          - attestor
          - concelier
          - scanner
          - policy
          - excititor
          - gateway
          - scheduler
          - cli
      target_version:
        description: 'Version to rollback to (leave empty for previous version)'
        required: false
        type: string
      reason:
        description: 'Reason for rollback'
        required: true
        type: string
      skip_health_check:
        description: 'Skip health check (use only in emergencies)'
        required: false
        type: boolean
        default: false

env:
  ROLLBACK_TIMEOUT: 300 # 5 minutes

jobs:
  validate:
    name: Validate Rollback Request
    runs-on: ubuntu-latest
    outputs:
      target_version: ${{ steps.resolve.outputs.version }}
      services: ${{ steps.resolve.outputs.services }}
      approved: ${{ steps.validate.outputs.approved }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Validate inputs
        id: validate
        run: |
          echo "## Rollback Request Validation" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Parameter | Value |" >> $GITHUB_STEP_SUMMARY
          echo "|-----------|-------|" >> $GITHUB_STEP_SUMMARY
          echo "| Environment | ${{ inputs.environment }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Service | ${{ inputs.service }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Target Version | ${{ inputs.target_version || 'previous' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Reason | ${{ inputs.reason }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Triggered By | ${{ github.actor }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Timestamp | $(date -u +"%Y-%m-%dT%H:%M:%SZ") |" >> $GITHUB_STEP_SUMMARY

          # Production requires additional validation
          if [[ "${{ inputs.environment }}" == "production" ]]; then
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "### Production Rollback Warning" >> $GITHUB_STEP_SUMMARY
            echo "This will affect production users immediately." >> $GITHUB_STEP_SUMMARY
          fi

          echo "approved=true" >> $GITHUB_OUTPUT

      - name: Resolve target version
        id: resolve
        run: |
          VERSION="${{ inputs.target_version }}"
          SERVICE="${{ inputs.service }}"

          # If no version specified, get previous from manifest
          if [[ -z "$VERSION" ]]; then
            MANIFEST="devops/releases/service-versions.json"
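            # Expected manifest shape (illustrative assumption; the real field
            # names may differ):
            #   {"services": {"scanner": {"version": "1.2.3", "previousVersion": "1.2.2"}}}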
            if [[ -f "$MANIFEST" ]]; then
              if [[ "$SERVICE" == "all" ]]; then
                # Get oldest version across all services
                VERSION=$(jq -r '.services | to_entries | map(.value.version) | sort | first // "unknown"' "$MANIFEST")
              else
                VERSION=$(jq -r --arg svc "$SERVICE" '.services[$svc].previousVersion // .services[$svc].version // "unknown"' "$MANIFEST")
              fi
            fi
          fi

          # Determine services to rollback
          if [[ "$SERVICE" == "all" ]]; then
            SERVICES='["authority","attestor","concelier","scanner","policy","excititor","gateway","scheduler"]'
          else
            SERVICES="[\"$SERVICE\"]"
          fi

          echo "Resolved version: $VERSION"
          echo "Services: $SERVICES"

          echo "version=$VERSION" >> $GITHUB_OUTPUT
          echo "services=$SERVICES" >> $GITHUB_OUTPUT

  rollback:
    name: Execute Rollback
    runs-on: ubuntu-latest
    needs: [validate]
    if: needs.validate.outputs.approved == 'true'
    environment: ${{ inputs.environment }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Setup kubectl
        uses: azure/setup-kubectl@v3
        with:
          version: 'latest'

      - name: Setup Helm
        uses: azure/setup-helm@v3
        with:
          version: 'latest'

      - name: Configure deployment access
        run: |
          echo "::notice::Configure deployment access for ${{ inputs.environment }}"
          # TODO: Configure kubectl context / kubeconfig
          # kubectl config use-context ${{ inputs.environment }}

      - name: Execute rollback
        id: rollback
        run: |
          echo "Starting rollback..."
          START_TIME=$(date +%s)

          TARGET_VERSION="${{ needs.validate.outputs.target_version }}"
          SERVICES='${{ needs.validate.outputs.services }}'
          ENVIRONMENT="${{ inputs.environment }}"

          # Execute rollback script
          if [[ -f ".gitea/scripts/release/rollback.sh" ]]; then
            .gitea/scripts/release/rollback.sh \
              --environment "$ENVIRONMENT" \
              --version "$TARGET_VERSION" \
              --services "$SERVICES" \
              --reason "${{ inputs.reason }}"
          else
            echo "::warning::Rollback script not found - using placeholder"
            echo ""
            echo "Rollback would execute:"
            echo "  Environment: $ENVIRONMENT"
            echo "  Version: $TARGET_VERSION"
            echo "  Services: $SERVICES"
            echo ""
            echo "TODO: Implement rollback.sh script"
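
            # A minimal sketch of what rollback.sh could do, assuming services
            # are deployed as Helm releases named "stellaops-<service>"
            # (hypothetical naming; adjust to the real chart layout):
            #   for svc in $(echo "$SERVICES" | jq -r '.[]'); do
            #     helm rollback "stellaops-${svc}" --namespace "$ENVIRONMENT" \
            #       --wait --timeout "${ROLLBACK_TIMEOUT}s"
            #   done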
          fi

          END_TIME=$(date +%s)
          DURATION=$((END_TIME - START_TIME))

          echo "duration=$DURATION" >> $GITHUB_OUTPUT
          echo "Rollback completed in ${DURATION}s"

      - name: Health check
        if: inputs.skip_health_check != true
        run: |
          echo "Running health checks..."

          SERVICES='${{ needs.validate.outputs.services }}'

          echo "$SERVICES" | jq -r '.[]' | while read -r service; do
            echo "Checking $service..."
            # TODO: Implement service-specific health checks
            # curl -sf "https://${service}.${{ inputs.environment }}.stella-ops.org/health" || exit 1
            echo "  Status: OK (placeholder)"
          done

          echo "All health checks passed"

      - name: Rollback summary
        if: always()
        run: |
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "## Rollback Execution" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          if [[ "${{ steps.rollback.outcome }}" == "success" ]]; then
            echo "### Rollback Successful" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "- Duration: ${{ steps.rollback.outputs.duration }}s" >> $GITHUB_STEP_SUMMARY
            echo "- Target Version: ${{ needs.validate.outputs.target_version }}" >> $GITHUB_STEP_SUMMARY
          else
            echo "### Rollback Failed" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "Please investigate immediately and consider manual intervention." >> $GITHUB_STEP_SUMMARY
          fi

  notify:
    name: Send Notifications
    runs-on: ubuntu-latest
    needs: [validate, rollback]
    if: always()

    steps:
      - name: Notify team
        run: |
          STATUS="${{ needs.rollback.result }}"
          ENVIRONMENT="${{ inputs.environment }}"
          SERVICE="${{ inputs.service }}"
          ACTOR="${{ github.actor }}"
          REASON="${{ inputs.reason }}"
          VERSION="${{ needs.validate.outputs.target_version }}"

          # Build notification message
          if [[ "$STATUS" == "success" ]]; then
            EMOJI="white_check_mark"
            TITLE="Rollback Completed Successfully"
          else
            EMOJI="x"
            TITLE="Rollback Failed - Immediate Attention Required"
          fi

          echo "Notification:"
          echo "  Title: $TITLE"
          echo "  Environment: $ENVIRONMENT"
          echo "  Service: $SERVICE"
          echo "  Version: $VERSION"
          echo "  Actor: $ACTOR"
          echo "  Reason: $REASON"

          # TODO: Send to Slack/Teams/PagerDuty
          # - name: Slack notification
          #   uses: slackapi/slack-github-action@v1
          #   with:
          #     payload: |
          #       {
          #         "text": "${{ env.TITLE }}",
          #         "blocks": [...]
          #       }

      - name: Create incident record
        run: |
          echo "Creating incident record..."

          # Log to incident tracking
          INCIDENT_LOG="devops/incidents/$(date +%Y-%m-%d)-rollback.json"
          echo "{
            \"timestamp\": \"$(date -u +"%Y-%m-%dT%H:%M:%SZ")\",
            \"type\": \"rollback\",
            \"environment\": \"${{ inputs.environment }}\",
            \"service\": \"${{ inputs.service }}\",
            \"target_version\": \"${{ needs.validate.outputs.target_version }}\",
            \"reason\": \"${{ inputs.reason }}\",
            \"actor\": \"${{ github.actor }}\",
            \"status\": \"${{ needs.rollback.result }}\",
            \"run_id\": \"${{ github.run_id }}\"
          }"

          echo "::notice::Incident record would be created at $INCIDENT_LOG"
386
.gitea/workflows/sast-scan.yml
Normal file
@@ -0,0 +1,386 @@
# .gitea/workflows/sast-scan.yml
# Static Application Security Testing (SAST) Workflow
# Sprint: CI/CD Enhancement - Security Scanning (Tier 2)
#
# Purpose: Detect security vulnerabilities in source code through static analysis
# - Code injection vulnerabilities
# - Authentication/authorization issues
# - Cryptographic weaknesses
# - Data exposure risks
# - OWASP Top 10 detection
#
# Supported Languages: C#/.NET, JavaScript/TypeScript, Python, YAML, Dockerfile
#
# PLACEHOLDER: Choose your SAST scanner implementation below
# Options:
# 1. Semgrep - Fast, open-source, good .NET support
# 2. CodeQL - GitHub's analysis engine
# 3. SonarQube - Enterprise-grade with dashboards
# 4. Snyk Code - Commercial with good accuracy

name: SAST Scanning

on:
  push:
    branches: [main, develop]
    paths:
      - 'src/**'
      - '*.csproj'
      - '*.cs'
      - '*.ts'
      - '*.js'
      - '*.py'
      - 'Dockerfile*'
  pull_request:
    paths:
      - 'src/**'
      - '*.csproj'
      - '*.cs'
      - '*.ts'
      - '*.js'
      - '*.py'
      - 'Dockerfile*'
  schedule:
    - cron: '30 3 * * 1' # Weekly on Monday at 3:30 AM UTC
  workflow_dispatch:
    inputs:
      scan_level:
        description: 'Scan thoroughness level'
        type: choice
        options:
          - quick
          - standard
          - comprehensive
        default: standard
      fail_on_findings:
        description: 'Fail workflow on findings'
        type: boolean
        default: true

env:
  DOTNET_VERSION: '10.0.100'
  TZ: UTC

jobs:
  # ===========================================================================
  # PLACEHOLDER SAST IMPLEMENTATION
  # ===========================================================================
  #
  # IMPORTANT: Configure your preferred SAST tool by uncommenting ONE of the
  # implementation options below. Each option includes the necessary steps
  # and configuration for that specific tool.
  #
  # ===========================================================================

  sast-scan:
    name: SAST Analysis
    runs-on: ubuntu-22.04
    timeout-minutes: 30
    permissions:
      security-events: write
      contents: read

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      # =========================================================================
      # PLACEHOLDER: Uncomment your preferred SAST tool configuration
      # =========================================================================

      - name: SAST Scan Placeholder
        run: |
          echo "::notice::SAST scanning placeholder - configure your scanner below"
          echo ""
          echo "Available SAST options:"
          echo ""
          echo "1. SEMGREP (Recommended for open-source)"
          echo "   Uncomment the Semgrep section below"
          echo "   - Fast, accurate, good .NET support"
          echo "   - Free for open-source projects"
          echo ""
          echo "2. CODEQL (GitHub native)"
          echo "   Uncomment the CodeQL section below"
          echo "   - Deep analysis capabilities"
          echo "   - Native GitHub integration"
          echo ""
          echo "3. SONARQUBE (Enterprise)"
          echo "   Uncomment the SonarQube section below"
          echo "   - Comprehensive dashboards"
          echo "   - Technical debt tracking"
          echo ""
          echo "4. SNYK CODE (Commercial)"
          echo "   Uncomment the Snyk section below"
          echo "   - High accuracy"
          echo "   - Good IDE integration"

      # =========================================================================
      # OPTION 1: SEMGREP
      # =========================================================================
      # Uncomment the following section to use Semgrep:
      #
      # - name: Run Semgrep
      #   uses: returntocorp/semgrep-action@v1
      #   with:
      #     config: >-
      #       p/default
      #       p/security-audit
      #       p/owasp-top-ten
      #       p/csharp
      #       p/javascript
      #       p/typescript
      #       p/python
      #       p/docker
      #   env:
      #     SEMGREP_APP_TOKEN: ${{ secrets.SEMGREP_APP_TOKEN }}

      # =========================================================================
      # OPTION 2: CODEQL
      # =========================================================================
      # Uncomment the following section to use CodeQL:
      #
      # - name: Initialize CodeQL
      #   uses: github/codeql-action/init@v3
      #   with:
      #     languages: csharp, javascript
      #     queries: security-and-quality
      #
      # - name: Build for CodeQL
      #   run: |
      #     dotnet build src/StellaOps.sln --configuration Release
      #
      # - name: Perform CodeQL Analysis
      #   uses: github/codeql-action/analyze@v3
      #   with:
      #     category: "/language:csharp"

      # =========================================================================
      # OPTION 3: SONARQUBE
      # =========================================================================
      # Uncomment the following section to use SonarQube:
      #
      # - name: SonarQube Scan
      #   uses: SonarSource/sonarqube-scan-action@master
      #   env:
      #     SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
      #     SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
      #   with:
      #     args: >
      #       -Dsonar.projectKey=stellaops
      #       -Dsonar.sources=src/
      #       -Dsonar.exclusions=**/bin/**,**/obj/**,**/node_modules/**

      # =========================================================================
      # OPTION 4: SNYK CODE
      # =========================================================================
      # Uncomment the following section to use Snyk Code:
      #
      # - name: Setup Snyk
      #   uses: snyk/actions/setup@master
      #
      # - name: Snyk Code Test
      #   run: snyk code test --sarif-file-output=snyk-code.sarif
      #   env:
      #     SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
      #   continue-on-error: true
      #
      # - name: Upload Snyk results
      #   uses: github/codeql-action/upload-sarif@v3
      #   with:
      #     sarif_file: snyk-code.sarif

  # ===========================================================================
  # .NET SECURITY ANALYSIS (built-in)
  # ===========================================================================

  dotnet-security:
    name: .NET Security Analysis
    runs-on: ubuntu-22.04
    timeout-minutes: 20
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Restore packages
        run: dotnet restore src/StellaOps.sln

      - name: Run Security Code Analysis
        run: |
          # Treat the .NET security analyzer rules listed below as errors
          dotnet build src/StellaOps.sln \
            --configuration Release \
            --no-restore \
            /p:TreatWarningsAsErrors=false \
            /p:EnableNETAnalyzers=true \
            /p:AnalysisLevel=latest \
            /warnaserror:CA2100,CA2109,CA2119,CA2153,CA2300,CA2301,CA2302,CA2305,CA2310,CA2311,CA2312,CA2315,CA2321,CA2322,CA2326,CA2327,CA2328,CA2329,CA2330,CA2350,CA2351,CA2352,CA2353,CA2354,CA2355,CA2356,CA2361,CA2362,CA3001,CA3002,CA3003,CA3004,CA3005,CA3006,CA3007,CA3008,CA3009,CA3010,CA3011,CA3012,CA3061,CA3075,CA3076,CA3077,CA3147,CA5350,CA5351,CA5358,CA5359,CA5360,CA5361,CA5362,CA5363,CA5364,CA5365,CA5366,CA5367,CA5368,CA5369,CA5370,CA5371,CA5372,CA5373,CA5374,CA5375,CA5376,CA5377,CA5378,CA5379,CA5380,CA5381,CA5382,CA5383,CA5384,CA5385,CA5386,CA5387,CA5388,CA5389,CA5390,CA5391,CA5392,CA5393,CA5394,CA5395,CA5396,CA5397,CA5398,CA5399,CA5400,CA5401,CA5402,CA5403 \
            2>&1 | tee build-security.log || true
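          # Rule families elevated above (summary, based on the .NET analyzer
          # docs): CA2xxx security usage/review rules, CA3xxx injection and
          # XML attack surface, CA5xxx cryptography and transport security.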

      - name: Parse security warnings
        run: |
          echo "### .NET Security Analysis" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          # Count security warnings (wc -l always prints a count, even zero)
          SECURITY_WARNINGS=$(grep -E "warning CA[235][0-9]{3}" build-security.log | wc -l)
          echo "- Security warnings found: $SECURITY_WARNINGS" >> $GITHUB_STEP_SUMMARY

          if [[ $SECURITY_WARNINGS -gt 0 ]]; then
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "<details><summary>Security Warnings</summary>" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo '```' >> $GITHUB_STEP_SUMMARY
            grep -E "warning CA[235][0-9]{3}" build-security.log | head -50 >> $GITHUB_STEP_SUMMARY
            echo '```' >> $GITHUB_STEP_SUMMARY
            echo "</details>" >> $GITHUB_STEP_SUMMARY
          fi

      - name: Upload security log
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: sast-dotnet-security-log
          path: build-security.log
          retention-days: 14

  # ===========================================================================
  # DEPENDENCY VULNERABILITY CHECK
  # ===========================================================================

  dependency-check:
    name: Dependency Vulnerabilities
    runs-on: ubuntu-22.04
    timeout-minutes: 15
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Run vulnerability audit
        run: |
          echo "### Dependency Vulnerability Audit" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          # Check for known vulnerabilities in NuGet packages
          dotnet list src/StellaOps.sln package --vulnerable --include-transitive 2>&1 | tee vuln-report.txt || true

          # Parse results (grep -c prints "0" but exits non-zero on no match,
          # so don't append a fallback echo to the substitution)
          VULN_COUNT=$(grep -c "has the following vulnerable packages" vuln-report.txt || true)
          VULN_COUNT=${VULN_COUNT:-0}

          if [[ $VULN_COUNT -gt 0 ]]; then
            echo "::warning::Found $VULN_COUNT projects with vulnerable dependencies"
            echo "- Projects with vulnerabilities: $VULN_COUNT" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "<details><summary>Vulnerability Report</summary>" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo '```' >> $GITHUB_STEP_SUMMARY
            cat vuln-report.txt >> $GITHUB_STEP_SUMMARY
            echo '```' >> $GITHUB_STEP_SUMMARY
            echo "</details>" >> $GITHUB_STEP_SUMMARY
          else
            echo "No known vulnerabilities found in dependencies." >> $GITHUB_STEP_SUMMARY
          fi

      - name: Upload vulnerability report
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: sast-vulnerability-report
          path: vuln-report.txt
          retention-days: 14

  # ===========================================================================
  # DOCKERFILE SECURITY LINTING
  # ===========================================================================

  dockerfile-lint:
    name: Dockerfile Security
    runs-on: ubuntu-22.04
    timeout-minutes: 10
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Find Dockerfiles
        id: find
        run: |
          DOCKERFILES=$(find . -name "Dockerfile*" -type f ! -path "./node_modules/*" | jq -R -s -c 'split("\n") | map(select(length > 0))')
          COUNT=$(echo "$DOCKERFILES" | jq 'length')
          echo "files=$DOCKERFILES" >> $GITHUB_OUTPUT
          echo "count=$COUNT" >> $GITHUB_OUTPUT
          echo "Found $COUNT Dockerfiles"

      - name: Install Hadolint
        if: steps.find.outputs.count != '0'
        run: |
          wget -qO hadolint https://github.com/hadolint/hadolint/releases/download/v2.12.0/hadolint-Linux-x86_64
          chmod +x hadolint
          sudo mv hadolint /usr/local/bin/

      - name: Lint Dockerfiles
        if: steps.find.outputs.count != '0'
        run: |
          echo "### Dockerfile Security Lint" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          TOTAL_ISSUES=0

          for dockerfile in $(echo '${{ steps.find.outputs.files }}' | jq -r '.[]'); do
            echo "Linting: $dockerfile"
            # hadolint exits non-zero when it finds issues but still prints the
            # JSON report, so capture it and only default to [] when empty
            ISSUES=$(hadolint --format json "$dockerfile" 2>/dev/null || true)
            ISSUES=${ISSUES:-"[]"}
            ISSUE_COUNT=$(echo "$ISSUES" | jq 'length')
            TOTAL_ISSUES=$((TOTAL_ISSUES + ISSUE_COUNT))

            if [[ $ISSUE_COUNT -gt 0 ]]; then
              echo "- **$dockerfile**: $ISSUE_COUNT issues" >> $GITHUB_STEP_SUMMARY
            fi
          done

          echo "" >> $GITHUB_STEP_SUMMARY
          echo "**Total issues found: $TOTAL_ISSUES**" >> $GITHUB_STEP_SUMMARY

          if [[ $TOTAL_ISSUES -gt 0 ]] && [[ "${{ github.event.inputs.fail_on_findings }}" == "true" ]]; then
            echo "::warning::Found $TOTAL_ISSUES Dockerfile security issues"
          fi

  # ===========================================================================
  # SUMMARY
  # ===========================================================================

  summary:
    name: SAST Summary
    runs-on: ubuntu-22.04
    needs: [sast-scan, dotnet-security, dependency-check, dockerfile-lint]
    if: always()
    steps:
      - name: Generate summary
        run: |
          echo "## SAST Scan Summary" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Check | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|-------|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| SAST Analysis | ${{ needs.sast-scan.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| .NET Security | ${{ needs.dotnet-security.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Dependency Check | ${{ needs.dependency-check.result }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Dockerfile Lint | ${{ needs.dockerfile-lint.result }} |" >> $GITHUB_STEP_SUMMARY

      - name: Check for failures
        if: |
          github.event.inputs.fail_on_findings == 'true' &&
          (needs.sast-scan.result == 'failure' ||
           needs.dotnet-security.result == 'failure' ||
           needs.dependency-check.result == 'failure')
        run: exit 1
105
.gitea/workflows/secrets-scan.yml
Normal file
@@ -0,0 +1,105 @@
# Secrets Scanning Workflow
# Sprint: CI/CD Enhancement - Security Scanning
#
# Purpose: Detect hardcoded secrets, API keys, and credentials in code
# Triggers: Push to main/develop, all PRs
#
# Tool: PLACEHOLDER - Choose one: TruffleHog, Gitleaks, or Semgrep

name: Secrets Scanning

on:
  push:
    branches: [main, develop]
  pull_request:
  workflow_dispatch:
    inputs:
      scan_history:
        description: 'Scan full git history'
        required: false
        type: boolean
        default: false

jobs:
  secrets-scan:
    name: Scan for Secrets
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: ${{ github.event.inputs.scan_history == 'true' && 0 || 50 }}

      # PLACEHOLDER: Choose your secrets scanner
      # Option 1: TruffleHog (recommended - comprehensive, low false positives)
      # Option 2: Gitleaks (fast, good for CI)
      # Option 3: Semgrep (if already using for SAST)

      - name: TruffleHog Scan
        id: trufflehog
        # Uncomment when ready to use TruffleHog:
        # uses: trufflesecurity/trufflehog@main
        # with:
        #   extra_args: --only-verified
        run: |
          echo "::notice::Secrets scanning placeholder - configure scanner below"
          echo ""
          echo "Available options:"
          echo "  1. TruffleHog: trufflesecurity/trufflehog@main"
          echo "  2. Gitleaks: gitleaks/gitleaks-action@v2"
          echo "  3. Semgrep: returntocorp/semgrep-action@v1"
          echo ""
          echo "To enable, uncomment the appropriate action above"

      # Alternative: Gitleaks
      # - name: Gitleaks Scan
      #   uses: gitleaks/gitleaks-action@v2
      #   env:
      #     GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      #     GITLEAKS_LICENSE: ${{ secrets.GITLEAKS_LICENSE }}

      # Alternative: Semgrep (secrets rules)
      # - name: Semgrep Secrets Scan
      #   uses: returntocorp/semgrep-action@v1
      #   with:
      #     config: p/secrets

      - name: Upload scan results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: secrets-scan-results
          path: |
            **/trufflehog-*.json
            **/gitleaks-*.json
            **/semgrep-*.json
          retention-days: 30
          if-no-files-found: ignore

  summary:
    name: Scan Summary
    runs-on: ubuntu-latest
    needs: [secrets-scan]
    if: always()

    steps:
      - name: Generate summary
        run: |
          echo "## Secrets Scanning Results" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          if [[ "${{ needs.secrets-scan.result }}" == "success" ]]; then
            echo "### No secrets detected" >> $GITHUB_STEP_SUMMARY
          elif [[ "${{ needs.secrets-scan.result }}" == "failure" ]]; then
            echo "### Secrets detected - review required" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "Please review the scan artifacts for details." >> $GITHUB_STEP_SUMMARY
          else
            echo "### Scan status: ${{ needs.secrets-scan.result }}" >> $GITHUB_STEP_SUMMARY
          fi

          echo "" >> $GITHUB_STEP_SUMMARY
          echo "**Scanner:** Placeholder (configure in workflow)" >> $GITHUB_STEP_SUMMARY
          echo "**Trigger:** ${{ github.event_name }}" >> $GITHUB_STEP_SUMMARY
          echo "**Branch:** ${{ github.ref_name }}" >> $GITHUB_STEP_SUMMARY
490
.gitea/workflows/service-release.yml
Normal file
@@ -0,0 +1,490 @@
# Service Release Pipeline
# Sprint: CI/CD Enhancement - Per-Service Auto-Versioning
#
# Purpose: Automated per-service release pipeline with semantic versioning
# and Docker tag format: {semver}+{YYYYMMDDHHmmss}
#
# Triggers:
# - Tag: service-{name}-v{semver} (e.g., service-scanner-v1.2.3)
# - Manual dispatch with service selection and bump type
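#
# Illustrative example (an assumption: the exact separator is whatever
# .gitea/scripts/release/generate-docker-tag.sh emits, since "+" is not a
# legal character in Docker tags): releasing scanner 1.2.3 at
# 2025-04-01 09:30:42 UTC would yield a tag shaped like 1.2.3-20250401093042.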

name: Service Release

on:
  push:
    tags:
      - 'service-*-v*'
  workflow_dispatch:
    inputs:
      service:
        description: 'Service to release'
        required: true
        type: choice
        options:
          - authority
          - attestor
          - concelier
          - scanner
          - policy
          - signer
          - excititor
          - gateway
          - scheduler
          - cli
          - orchestrator
          - notify
          - sbomservice
          - vexhub
          - evidencelocker
      bump_type:
        description: 'Version bump type'
        required: true
        type: choice
        options:
          - patch
          - minor
          - major
        default: 'patch'
      dry_run:
        description: 'Dry run (no actual release)'
        required: false
        type: boolean
        default: false
      skip_tests:
        description: 'Skip tests (use with caution)'
        required: false
        type: boolean
        default: false

env:
  DOTNET_VERSION: '10.0.100'
  DOTNET_SKIP_FIRST_TIME_EXPERIENCE: true
  DOTNET_CLI_TELEMETRY_OPTOUT: true
  REGISTRY: git.stella-ops.org/stella-ops.org
  SYFT_VERSION: '1.21.0'

jobs:
  # ===========================================================================
  # Parse tag or manual inputs to determine service and version
  # ===========================================================================
  resolve:
    name: Resolve Release Parameters
    runs-on: ubuntu-latest
    outputs:
      service: ${{ steps.resolve.outputs.service }}
      bump_type: ${{ steps.resolve.outputs.bump_type }}
      current_version: ${{ steps.resolve.outputs.current_version }}
      new_version: ${{ steps.resolve.outputs.new_version }}
      docker_tag: ${{ steps.resolve.outputs.docker_tag }}
      is_dry_run: ${{ steps.resolve.outputs.is_dry_run }}
      skip_tests: ${{ steps.resolve.outputs.skip_tests }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Resolve parameters
        id: resolve
        run: |
          if [[ "${{ github.event_name }}" == "push" ]]; then
            # Parse tag: service-{name}-v{version}
            TAG="${GITHUB_REF#refs/tags/}"
            echo "Processing tag: $TAG"

            if [[ "$TAG" =~ ^service-([a-z]+)-v([0-9]+\.[0-9]+\.[0-9]+)$ ]]; then
              SERVICE="${BASH_REMATCH[1]}"
              VERSION="${BASH_REMATCH[2]}"
              BUMP_TYPE="explicit"
            else
              echo "::error::Invalid tag format: $TAG (expected: service-{name}-v{semver})"
              exit 1
            fi

            IS_DRY_RUN="false"
            SKIP_TESTS="false"
          else
            # Manual dispatch
            SERVICE="${{ github.event.inputs.service }}"
            BUMP_TYPE="${{ github.event.inputs.bump_type }}"
            VERSION="" # Will be calculated
            IS_DRY_RUN="${{ github.event.inputs.dry_run }}"
            SKIP_TESTS="${{ github.event.inputs.skip_tests }}"
          fi

          # Read current version
          CURRENT_VERSION=$(.gitea/scripts/release/read-service-version.sh "$SERVICE")
          echo "Current version: $CURRENT_VERSION"

          # Calculate new version
          if [[ -n "$VERSION" ]]; then
            NEW_VERSION="$VERSION"
          else
            NEW_VERSION=$(python3 .gitea/scripts/release/bump-service-version.py "$SERVICE" "$BUMP_TYPE" --output-version)
          fi
          echo "New version: $NEW_VERSION"

          # Generate Docker tag
          DOCKER_TAG=$(.gitea/scripts/release/generate-docker-tag.sh --version "$NEW_VERSION")
          echo "Docker tag: $DOCKER_TAG"

          # Set outputs
          echo "service=$SERVICE" >> $GITHUB_OUTPUT
          echo "bump_type=$BUMP_TYPE" >> $GITHUB_OUTPUT
          echo "current_version=$CURRENT_VERSION" >> $GITHUB_OUTPUT
          echo "new_version=$NEW_VERSION" >> $GITHUB_OUTPUT
          echo "docker_tag=$DOCKER_TAG" >> $GITHUB_OUTPUT
          echo "is_dry_run=$IS_DRY_RUN" >> $GITHUB_OUTPUT
          echo "skip_tests=$SKIP_TESTS" >> $GITHUB_OUTPUT

      - name: Summary
        run: |
          echo "## Release Parameters" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Parameter | Value |" >> $GITHUB_STEP_SUMMARY
          echo "|-----------|-------|" >> $GITHUB_STEP_SUMMARY
          echo "| Service | ${{ steps.resolve.outputs.service }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Current Version | ${{ steps.resolve.outputs.current_version }} |" >> $GITHUB_STEP_SUMMARY
          echo "| New Version | ${{ steps.resolve.outputs.new_version }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Docker Tag | ${{ steps.resolve.outputs.docker_tag }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Dry Run | ${{ steps.resolve.outputs.is_dry_run }} |" >> $GITHUB_STEP_SUMMARY

  # ===========================================================================
  # Update version in source files
  # ===========================================================================
  update-version:
    name: Update Version
    runs-on: ubuntu-latest
    needs: [resolve]
    if: needs.resolve.outputs.is_dry_run != 'true'

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          token: ${{ secrets.GITEA_TOKEN }}
          fetch-depth: 0

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.12'

      - name: Update version
        run: |
          python3 .gitea/scripts/release/bump-service-version.py \
            "${{ needs.resolve.outputs.service }}" \
            "${{ needs.resolve.outputs.new_version }}" \
            --docker-tag "${{ needs.resolve.outputs.docker_tag }}" \
            --git-sha "${{ github.sha }}"

      - name: Commit version update
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"

          git add src/Directory.Versions.props devops/releases/service-versions.json

          if git diff --cached --quiet; then
            echo "No version changes to commit"
          else
            git commit -m "chore(${{ needs.resolve.outputs.service }}): release v${{ needs.resolve.outputs.new_version }}

          Docker tag: ${{ needs.resolve.outputs.docker_tag }}

          🤖 Generated with [Claude Code](https://claude.com/claude-code)

          Co-Authored-By: github-actions[bot] <github-actions[bot]@users.noreply.github.com>"

            git push
          fi

  # ===========================================================================
  # Build and test the service
  # ===========================================================================
  build-test:
    name: Build and Test
    runs-on: ubuntu-latest
    needs: [resolve, update-version]
    if: always() && (needs.update-version.result == 'success' || needs.update-version.result == 'skipped')

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          ref: ${{ github.ref }}

      - name: Setup .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: ${{ env.DOTNET_VERSION }}
          include-prerelease: true

      - name: Restore dependencies
        run: dotnet restore src/StellaOps.sln

      - name: Build solution
        run: |
          dotnet build src/StellaOps.sln \
            --configuration Release \
            --no-restore \
            -p:StellaOpsServiceVersion=${{ needs.resolve.outputs.new_version }}

      - name: Run tests
        if: needs.resolve.outputs.skip_tests != 'true'
        run: |
          SERVICE="${{ needs.resolve.outputs.service }}"
          SERVICE_PASCAL=$(echo "$SERVICE" | sed -r 's/(^|-)(\w)/\U\2/g')

          # Find and run tests for this service
          TEST_PROJECTS=$(find src -path "*/${SERVICE_PASCAL}/*" -name "*.Tests.csproj" -o -path "*/${SERVICE_PASCAL}*Tests*" -name "*.csproj" | head -20)

          if [[ -n "$TEST_PROJECTS" ]]; then
            echo "Running tests for: $TEST_PROJECTS"
            echo "$TEST_PROJECTS" | xargs -I{} dotnet test {} --configuration Release --no-build --verbosity normal
          else
            echo "::warning::No test projects found for service: $SERVICE"
          fi

      - name: Upload build artifacts
        uses: actions/upload-artifact@v4
        with:
          name: build-${{ needs.resolve.outputs.service }}
          path: |
            src/**/bin/Release/**/*.dll
            src/**/bin/Release/**/*.exe
            src/**/bin/Release/**/*.pdb
          retention-days: 7

  # ===========================================================================
  # Build and publish Docker image
  # ===========================================================================
  publish-container:
    name: Publish Container
    runs-on: ubuntu-latest
    needs: [resolve, build-test]
    if: needs.resolve.outputs.is_dry_run != 'true'
    outputs:
      image_digest: ${{ steps.push.outputs.digest }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.REGISTRY_USERNAME }}
          password: ${{ secrets.REGISTRY_PASSWORD }}

      - name: Determine Dockerfile path
        id: dockerfile
        run: |
          SERVICE="${{ needs.resolve.outputs.service }}"
          SERVICE_PASCAL=$(echo "$SERVICE" | sed -r 's/(^|-)(\w)/\U\2/g')

          # Look for service-specific Dockerfile
          DOCKERFILE_PATHS=(
            "devops/docker/${SERVICE}/Dockerfile"
            "devops/docker/${SERVICE_PASCAL}/Dockerfile"
            "src/${SERVICE_PASCAL}/Dockerfile"
            "src/${SERVICE_PASCAL}/StellaOps.${SERVICE_PASCAL}.WebService/Dockerfile"
            "devops/docker/platform/Dockerfile"
          )

          for path in "${DOCKERFILE_PATHS[@]}"; do
            if [[ -f "$path" ]]; then
              echo "dockerfile=$path" >> $GITHUB_OUTPUT
              echo "Found Dockerfile: $path"
              exit 0
            fi
          done

          echo "::error::No Dockerfile found for service: $SERVICE"
          exit 1

      - name: Build and push image
        id: push
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ${{ steps.dockerfile.outputs.dockerfile }}
          push: true
          tags: |
            ${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}:${{ needs.resolve.outputs.docker_tag }}
            ${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}:${{ needs.resolve.outputs.new_version }}
            ${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}:latest
          labels: |
            org.opencontainers.image.title=${{ needs.resolve.outputs.service }}
            org.opencontainers.image.version=${{ needs.resolve.outputs.new_version }}
            org.opencontainers.image.revision=${{ github.sha }}
            org.opencontainers.image.source=${{ github.server_url }}/${{ github.repository }}
            com.stellaops.service.name=${{ needs.resolve.outputs.service }}
            com.stellaops.service.version=${{ needs.resolve.outputs.new_version }}
            com.stellaops.docker.tag=${{ needs.resolve.outputs.docker_tag }}
          build-args: |
            VERSION=${{ needs.resolve.outputs.new_version }}
            GIT_SHA=${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Image summary
        run: |
          echo "## Container Image" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Property | Value |" >> $GITHUB_STEP_SUMMARY
          echo "|----------|-------|" >> $GITHUB_STEP_SUMMARY
          echo "| Image | \`${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}\` |" >> $GITHUB_STEP_SUMMARY
          echo "| Tag | \`${{ needs.resolve.outputs.docker_tag }}\` |" >> $GITHUB_STEP_SUMMARY
          echo "| Digest | \`${{ steps.push.outputs.digest }}\` |" >> $GITHUB_STEP_SUMMARY

  # ===========================================================================
  # Generate SBOM
  # ===========================================================================
  generate-sbom:
    name: Generate SBOM
    runs-on: ubuntu-latest
    needs: [resolve, publish-container]
    if: needs.resolve.outputs.is_dry_run != 'true'

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install Syft
        run: |
          curl -sSfL https://raw.githubusercontent.com/anchore/syft/main/install.sh | \
            sh -s -- -b /usr/local/bin v${{ env.SYFT_VERSION }}

      - name: Login to registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.REGISTRY_USERNAME }}
          password: ${{ secrets.REGISTRY_PASSWORD }}

      - name: Generate SBOM
        run: |
          IMAGE="${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}:${{ needs.resolve.outputs.docker_tag }}"

          syft "$IMAGE" \
            --output cyclonedx-json=sbom.cyclonedx.json \
            --output spdx-json=sbom.spdx.json

          echo "Generated SBOMs for: $IMAGE"
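
          # The CycloneDX document can be sanity-checked locally, e.g.:
          #   jq '.components | length' sbom.cyclonedx.json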

      - name: Upload SBOM artifacts
        uses: actions/upload-artifact@v4
        with:
          name: sbom-${{ needs.resolve.outputs.service }}-${{ needs.resolve.outputs.new_version }}
          path: |
            sbom.cyclonedx.json
            sbom.spdx.json
          retention-days: 90

  # ===========================================================================
  # Sign artifacts with Cosign
  # ===========================================================================
  sign-artifacts:
    name: Sign Artifacts
    runs-on: ubuntu-latest
    needs: [resolve, publish-container, generate-sbom]
    if: needs.resolve.outputs.is_dry_run != 'true'

    steps:
      - name: Install Cosign
        uses: sigstore/cosign-installer@v3

      - name: Login to registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.REGISTRY_USERNAME }}
          password: ${{ secrets.REGISTRY_PASSWORD }}

      - name: Sign container image
        if: env.COSIGN_PRIVATE_KEY_B64 != ''
        env:
          COSIGN_PRIVATE_KEY_B64: ${{ secrets.COSIGN_PRIVATE_KEY_B64 }}
          COSIGN_PASSWORD: ${{ secrets.COSIGN_PASSWORD }}
        run: |
          echo "$COSIGN_PRIVATE_KEY_B64" | base64 -d > cosign.key

          IMAGE="${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}@${{ needs.publish-container.outputs.image_digest }}"

          cosign sign --key cosign.key \
            -a "service=${{ needs.resolve.outputs.service }}" \
            -a "version=${{ needs.resolve.outputs.new_version }}" \
            -a "docker-tag=${{ needs.resolve.outputs.docker_tag }}" \
            "$IMAGE"

          rm -f cosign.key
          echo "Signed: $IMAGE"
|
||||
- name: Download SBOM
|
||||
uses: actions/download-artifact@v4
|
||||
with:
|
||||
name: sbom-${{ needs.resolve.outputs.service }}-${{ needs.resolve.outputs.new_version }}
|
||||
path: sbom/
|
||||
|
||||
- name: Attach SBOM to image
|
||||
if: env.COSIGN_PRIVATE_KEY_B64 != ''
|
||||
env:
|
||||
COSIGN_PRIVATE_KEY_B64: ${{ secrets.COSIGN_PRIVATE_KEY_B64 }}
|
||||
COSIGN_PASSWORD: ${{ secrets.COSIGN_PASSWORD }}
|
||||
run: |
|
||||
echo "$COSIGN_PRIVATE_KEY_B64" | base64 -d > cosign.key
|
||||
|
||||
IMAGE="${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}@${{ needs.publish-container.outputs.image_digest }}"
|
||||
|
||||
cosign attach sbom --sbom sbom/sbom.cyclonedx.json "$IMAGE"
|
||||
cosign sign --key cosign.key --attachment sbom "$IMAGE"
|
||||
|
||||
rm -f cosign.key
|
||||
|
||||
# ===========================================================================
|
||||
# Release summary
|
||||
# ===========================================================================
|
||||
summary:
|
||||
name: Release Summary
|
||||
runs-on: ubuntu-latest
|
||||
needs: [resolve, build-test, publish-container, generate-sbom, sign-artifacts]
|
||||
if: always()
|
||||
|
||||
steps:
|
||||
- name: Generate summary
|
||||
run: |
|
||||
echo "# Service Release: ${{ needs.resolve.outputs.service }}" >> $GITHUB_STEP_SUMMARY
|
||||
echo "" >> $GITHUB_STEP_SUMMARY
|
||||
echo "## Release Details" >> $GITHUB_STEP_SUMMARY
|
||||
echo "" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Property | Value |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "|----------|-------|" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Service | ${{ needs.resolve.outputs.service }} |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Version | ${{ needs.resolve.outputs.new_version }} |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Previous | ${{ needs.resolve.outputs.current_version }} |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Docker Tag | \`${{ needs.resolve.outputs.docker_tag }}\` |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Git SHA | \`${{ github.sha }}\` |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "" >> $GITHUB_STEP_SUMMARY
|
||||
echo "## Job Results" >> $GITHUB_STEP_SUMMARY
|
||||
echo "" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Job | Status |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "|-----|--------|" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Build & Test | ${{ needs.build-test.result }} |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Publish Container | ${{ needs.publish-container.result }} |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Generate SBOM | ${{ needs.generate-sbom.result }} |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "| Sign Artifacts | ${{ needs.sign-artifacts.result }} |" >> $GITHUB_STEP_SUMMARY
|
||||
echo "" >> $GITHUB_STEP_SUMMARY
|
||||
|
||||
if [[ "${{ needs.resolve.outputs.is_dry_run }}" == "true" ]]; then
|
||||
echo "⚠️ **This was a dry run. No artifacts were published.**" >> $GITHUB_STEP_SUMMARY
|
||||
else
|
||||
echo "## Pull Image" >> $GITHUB_STEP_SUMMARY
|
||||
echo "" >> $GITHUB_STEP_SUMMARY
|
||||
echo "\`\`\`bash" >> $GITHUB_STEP_SUMMARY
|
||||
echo "docker pull ${{ env.REGISTRY }}/${{ needs.resolve.outputs.service }}:${{ needs.resolve.outputs.docker_tag }}" >> $GITHUB_STEP_SUMMARY
|
||||
echo "\`\`\`" >> $GITHUB_STEP_SUMMARY
|
||||
fi
|
||||
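Consumers can check the signature and pull the attached SBOM without access to the pipeline. A minimal sketch, assuming the public half of the signing key is distributed as `cosign.pub` (the path and image reference here are illustrative, not pipeline outputs):

```bash
# Use the digest reported in the release summary
IMAGE="registry.example.com/scanner@sha256:<digest>"

# Verify the image signature and inspect the annotations added at signing time
cosign verify --key cosign.pub "$IMAGE"

# Fetch the CycloneDX SBOM that was attached alongside the signature
cosign download sbom "$IMAGE" > sbom.cyclonedx.json
```
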
267
.gitea/workflows/templates/replay-verify.yml
Normal file
@@ -0,0 +1,267 @@
# =============================================================================
# replay-verify.yml
# Sprint: SPRINT_20251228_001_BE_replay_manifest_ci (T4)
# Description: CI workflow template for SBOM hash drift detection
# =============================================================================
#
# This workflow verifies that SBOM generation and verdict computation are
# deterministic by comparing replay manifest hashes across builds.
#
# Usage:
#   1. Copy this template to your project's .gitea/workflows/ directory
#   2. Adjust the image name and scan parameters as needed
#   3. Optionally enable the SBOM attestation step
#
# Exit codes:
#   0 - Verification passed, all hashes match
#   1 - Drift detected, hashes differ
#   2 - Verification error (missing inputs, invalid manifest)
#
# =============================================================================
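
# Example of a local dry run (illustrative; the command shapes mirror the CI
# steps below and are not a stable CLI contract):
#
#   stella scan --image "$IMAGE" --output-sbom sbom.json --output-verdict verdict.json
#   stella replay export --image "$IMAGE" --output replay.json
#   stella replay export verify --manifest replay.json --fail-on-drift
#   echo $?   # 0 = hashes match, 1 = drift, 2 = verification error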

name: SBOM Replay Verification

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main]
  workflow_dispatch:
    inputs:
      fail_on_drift:
        description: 'Fail build if hash drift detected'
        required: false
        default: 'true'
        type: boolean
      strict_mode:
        description: 'Enable strict verification mode'
        required: false
        default: 'false'
        type: boolean

env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}
  STELLAOPS_VERSION: '1.0.0'

jobs:
  build-and-scan:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
      id-token: write  # For OIDC-based signing

    outputs:
      image_digest: ${{ steps.build.outputs.digest }}
      sbom_digest: ${{ steps.scan.outputs.sbom_digest }}
      verdict_digest: ${{ steps.scan.outputs.verdict_digest }}
      replay_manifest: ${{ steps.scan.outputs.replay_manifest }}

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to container registry
        if: github.event_name != 'pull_request'
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Extract metadata for Docker
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=sha,prefix=
            type=ref,event=branch
            type=ref,event=pr

      - name: Build and push image
        id: build
        uses: docker/build-push-action@v5
        with:
          context: .
          push: ${{ github.event_name != 'pull_request' }}
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
          provenance: true
          sbom: false  # We generate our own SBOM

      - name: Install StellaOps CLI
        run: |
          curl -sSfL https://stellaops.io/install.sh | sh -s -- -v ${{ env.STELLAOPS_VERSION }}
          echo "$HOME/.stellaops/bin" >> $GITHUB_PATH

      - name: Scan image and generate replay manifest
        id: scan
        env:
          IMAGE_REF: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}@${{ steps.build.outputs.digest }}
        run: |
          # Scan image with StellaOps
          stella scan \
            --image "${IMAGE_REF}" \
            --output-sbom sbom.json \
            --output-findings findings.json \
            --output-verdict verdict.json \
            --format cyclonedx-1.6

          # Export replay manifest for CI verification
          stella replay export \
            --image "${IMAGE_REF}" \
            --output replay.json \
            --include-feeds \
            --include-reachability \
            --pretty

          # Extract digests for outputs
          SBOM_DIGEST=$(sha256sum sbom.json | cut -d' ' -f1)
          VERDICT_DIGEST=$(sha256sum verdict.json | cut -d' ' -f1)

          echo "sbom_digest=sha256:${SBOM_DIGEST}" >> $GITHUB_OUTPUT
          echo "verdict_digest=sha256:${VERDICT_DIGEST}" >> $GITHUB_OUTPUT
          echo "replay_manifest=replay.json" >> $GITHUB_OUTPUT

          # Display summary
          echo "### Scan Results" >> $GITHUB_STEP_SUMMARY
          echo "| Artifact | Digest |" >> $GITHUB_STEP_SUMMARY
          echo "|----------|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| Image | \`${{ steps.build.outputs.digest }}\` |" >> $GITHUB_STEP_SUMMARY
          echo "| SBOM | \`sha256:${SBOM_DIGEST}\` |" >> $GITHUB_STEP_SUMMARY
          echo "| Verdict | \`sha256:${VERDICT_DIGEST}\` |" >> $GITHUB_STEP_SUMMARY

      - name: Upload scan artifacts
        uses: actions/upload-artifact@v4
        with:
          name: scan-artifacts-${{ github.sha }}
          path: |
            sbom.json
            findings.json
            verdict.json
            replay.json
          retention-days: 30

  verify-determinism:
    runs-on: ubuntu-latest
    needs: build-and-scan

    steps:
      - name: Download scan artifacts
        uses: actions/download-artifact@v4
        with:
          name: scan-artifacts-${{ github.sha }}

      - name: Install StellaOps CLI
        run: |
          curl -sSfL https://stellaops.io/install.sh | sh -s -- -v ${{ env.STELLAOPS_VERSION }}
          echo "$HOME/.stellaops/bin" >> $GITHUB_PATH

      - name: Verify SBOM determinism
        id: verify
        env:
          FAIL_ON_DRIFT: ${{ inputs.fail_on_drift || 'true' }}
          STRICT_MODE: ${{ inputs.strict_mode || 'false' }}
        run: |
          # Build verification flags
          VERIFY_FLAGS="--manifest replay.json"
          if [ "${FAIL_ON_DRIFT}" = "true" ]; then
            VERIFY_FLAGS="${VERIFY_FLAGS} --fail-on-drift"
          fi
          if [ "${STRICT_MODE}" = "true" ]; then
            VERIFY_FLAGS="${VERIFY_FLAGS} --strict-mode"
          fi

          # Run verification, capturing the exit code explicitly so a drift
          # failure doesn't abort the step before the status is reported below
          EXIT_CODE=0
          stella replay export verify ${VERIFY_FLAGS} || EXIT_CODE=$?

          # Report results
          if [ $EXIT_CODE -eq 0 ]; then
            echo "✅ Verification passed - all hashes match" >> $GITHUB_STEP_SUMMARY
            echo "status=success" >> $GITHUB_OUTPUT
          elif [ $EXIT_CODE -eq 1 ]; then
            echo "⚠️ Drift detected - hashes differ from expected" >> $GITHUB_STEP_SUMMARY
            echo "status=drift" >> $GITHUB_OUTPUT
          else
            echo "❌ Verification error" >> $GITHUB_STEP_SUMMARY
            echo "status=error" >> $GITHUB_OUTPUT
          fi

          exit $EXIT_CODE

      - name: Comment on PR (on drift)
        if: failure() && github.event_name == 'pull_request'
        uses: actions/github-script@v7
        with:
          script: |
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: `## ⚠️ SBOM Determinism Check Failed

            Hash drift detected between scan runs. This may indicate non-deterministic build or scan behavior.

            **Expected digests:**
            - SBOM: \`${{ needs.build-and-scan.outputs.sbom_digest }}\`
            - Verdict: \`${{ needs.build-and-scan.outputs.verdict_digest }}\`

            **Possible causes:**
            - Non-deterministic build artifacts (timestamps, random values)
            - Changed dependencies between runs
            - Environment differences

            **Next steps:**
            1. Review the replay manifest in the artifacts
            2. Check build logs for non-deterministic elements
            3. Consider using \`--strict-mode\` for detailed drift analysis`
            })

  # Optional: Attest SBOM to OCI registry
  attest-sbom:
    runs-on: ubuntu-latest
    needs: [build-and-scan, verify-determinism]
    if: github.event_name != 'pull_request' && success()
    permissions:
      packages: write
      id-token: write

    steps:
      - name: Download scan artifacts
        uses: actions/download-artifact@v4
        with:
          name: scan-artifacts-${{ github.sha }}

      - name: Install StellaOps CLI
        run: |
          curl -sSfL https://stellaops.io/install.sh | sh -s -- -v ${{ env.STELLAOPS_VERSION }}
          echo "$HOME/.stellaops/bin" >> $GITHUB_PATH

      - name: Log in to container registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}

      - name: Attach SBOM attestation
        env:
          IMAGE_REF: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}@${{ needs.build-and-scan.outputs.image_digest }}
        run: |
          # Sign and attach SBOM as in-toto attestation
          stella attest attach \
            --image "${IMAGE_REF}" \
            --sbom sbom.json \
            --predicate-type https://cyclonedx.org/bom/v1.6 \
            --sign keyless

          echo "### SBOM Attestation" >> $GITHUB_STEP_SUMMARY
          echo "SBOM attached to \`${IMAGE_REF}\`" >> $GITHUB_STEP_SUMMARY
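Because `attest-sbom` signs keyless, downstream consumers verify against the CI identity rather than a key. A minimal sketch with cosign, assuming the StellaOps keyless signature is Sigstore-compatible (the identity regexp and issuer depend on the OIDC provider your runners use and are illustrative):

```bash
IMAGE="ghcr.io/acme/app@sha256:<digest>"   # digest from the build-and-scan outputs

# Verify the in-toto attestation and print the predicate type it carries
cosign verify-attestation \
  --type https://cyclonedx.org/bom/v1.6 \
  --certificate-identity-regexp 'https://gitea.example.com/.+' \
  --certificate-oidc-issuer 'https://gitea.example.com' \
  "$IMAGE" | jq -r '.payload' | base64 -d | jq '.predicateType'
```
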
.gitea/workflows/test-matrix.yml
@@ -1,9 +1,10 @@
 # .gitea/workflows/test-matrix.yml
 # Unified test matrix pipeline with TRX reporting for all test categories
 # Sprint: SPRINT_20251226_007_CICD - Dynamic test discovery
+# Refactored: SPRINT_CICD_Enhancement - DRY principle, matrix strategy
 #
-# WORKFLOW INTEGRATION STRATEGY (Sprint 20251226_003_CICD):
-# =========================================================
+# WORKFLOW INTEGRATION STRATEGY:
+# ==============================
 # This workflow is the PRIMARY test execution workflow for PR gating.
 # It dynamically discovers and runs ALL test projects by Category trait.
 #
@@ -12,8 +13,6 @@
 #
 # Scheduled/On-Demand Categories:
 # Performance, Benchmark, AirGap, Chaos, Determinism, Resilience, Observability
 #
 # For build/deploy operations, see: build-test-deploy.yml (runs in parallel)

 name: Test Matrix

@@ -85,10 +84,6 @@ jobs:
       - name: Find all test projects
         id: find
         run: |
-          # Find all test project files, including non-standard naming conventions:
-          # - *.Tests.csproj (standard)
-          # - *UnitTests.csproj, *SmokeTests.csproj, *FixtureTests.csproj, *IntegrationTests.csproj
-          # Exclude: TestKit, Testing libraries, node_modules, bin, obj
           PROJECTS=$(find src \( \
             -name "*.Tests.csproj" \
             -o -name "*UnitTests.csproj" \
@@ -104,11 +99,9 @@
             ! -name "*Testing.csproj" \
             | sort)

-          # Count projects
           COUNT=$(echo "$PROJECTS" | grep -c '.csproj' || echo "0")
           echo "Found $COUNT test projects"

-          # Output as JSON array for matrix
           echo "projects=$(echo "$PROJECTS" | jq -R -s -c 'split("\n") | map(select(length > 0))')" >> $GITHUB_OUTPUT
           echo "count=$COUNT" >> $GITHUB_OUTPUT

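The `discover` job feeds the matrix through that `jq` pipeline: newline-separated `find` output becomes a compact JSON array. A quick way to sanity-check it locally (the project paths are illustrative):

```bash
printf 'src/Foo/Foo.Tests.csproj\nsrc/Bar/Bar.UnitTests.csproj\n' \
  | jq -R -s -c 'split("\n") | map(select(length > 0))'
# ["src/Foo/Foo.Tests.csproj","src/Bar/Bar.UnitTests.csproj"]
```
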
@@ -122,13 +115,34 @@

   # ===========================================================================
   # PR-GATING TESTS (run on every push/PR)
+  # Uses matrix strategy to run all categories in parallel
   # ===========================================================================

-  unit:
-    name: Unit Tests
+  pr-gating-tests:
+    name: ${{ matrix.category }} Tests
     runs-on: ubuntu-22.04
-    timeout-minutes: 20
+    timeout-minutes: ${{ matrix.timeout }}
     needs: discover
+    strategy:
+      fail-fast: false
+      matrix:
+        include:
+          - category: Unit
+            timeout: 20
+            collect_coverage: true
+          - category: Architecture
+            timeout: 15
+            collect_coverage: false
+          - category: Contract
+            timeout: 15
+            collect_coverage: false
+          - category: Security
+            timeout: 25
+            collect_coverage: false
+          - category: Golden
+            timeout: 25
+            collect_coverage: false

     steps:
       - name: Checkout
         uses: actions/checkout@v4
@@ -141,165 +155,26 @@
           dotnet-version: ${{ env.DOTNET_VERSION }}
           include-prerelease: true

-      - name: Run Unit Tests (all test projects)
+      - name: Run ${{ matrix.category }} Tests
         run: |
-          mkdir -p ./TestResults/Unit
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          # Find and run all test projects with Unit category
-          # Use expanded pattern to include non-standard naming conventions
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-
-            # Create unique TRX filename using path hash to avoid duplicates
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-unit.trx
-
-            # Restore and build in one step, then test
-            if dotnet test "$proj" \
-              --filter "Category=Unit" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Unit \
-              --collect:"XPlat Code Coverage" \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-              echo "✓ $proj passed"
-            else
-              # Check if it was just "no tests matched" which is not a failure
-              if [ $? -eq 0 ] || grep -q "No test matches" /tmp/test-output.txt 2>/dev/null; then
-                SKIPPED=$((SKIPPED + 1))
-                echo "○ $proj skipped (no Unit tests)"
-              else
-                FAILED=$((FAILED + 1))
-                echo "✗ $proj failed"
-              fi
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Unit Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Failed: $FAILED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-          # Fail if any tests failed
-          if [ $FAILED -gt 0 ]; then
-            exit 1
+          chmod +x .gitea/scripts/test/run-test-category.sh
+          if [[ "${{ matrix.collect_coverage }}" == "true" ]]; then
+            .gitea/scripts/test/run-test-category.sh "${{ matrix.category }}" --collect-coverage
+          else
+            .gitea/scripts/test/run-test-category.sh "${{ matrix.category }}"
+          fi

       - name: Upload Test Results
         uses: actions/upload-artifact@v4
         if: always()
         with:
-          name: test-results-unit
-          path: ./TestResults/Unit
+          name: test-results-${{ matrix.category }}
+          path: ./TestResults/${{ matrix.category }}
           retention-days: 14

-  architecture:
-    name: Architecture Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 15
-    needs: discover
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Architecture Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Architecture
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-architecture.trx
-            if dotnet test "$proj" \
-              --filter "Category=Architecture" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Architecture \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Architecture Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-architecture
-          path: ./TestResults/Architecture
-          retention-days: 14
-
-  contract:
-    name: Contract Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 15
-    needs: discover
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Contract Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Contract
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-contract.trx
-            if dotnet test "$proj" \
-              --filter "Category=Contract" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Contract \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Contract Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-contract
-          path: ./TestResults/Contract
-          retention-days: 14
   # ===========================================================================
   # INTEGRATION TESTS (separate due to service dependency)
   # ===========================================================================

   integration:
     name: Integration Tests
@@ -332,520 +207,112 @@
           dotnet-version: ${{ env.DOTNET_VERSION }}
           include-prerelease: true

-      - name: Run Integration Tests (all test projects)
+      - name: Run Integration Tests
         env:
           STELLAOPS_TEST_POSTGRES_CONNECTION: "Host=localhost;Port=5432;Database=stellaops_test;Username=stellaops;Password=stellaops"
         run: |
-          mkdir -p ./TestResults/Integration
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-integration.trx
-            if dotnet test "$proj" \
-              --filter "Category=Integration" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Integration \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Integration Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
+          chmod +x .gitea/scripts/test/run-test-category.sh
+          .gitea/scripts/test/run-test-category.sh Integration

       - name: Upload Test Results
         uses: actions/upload-artifact@v4
         if: always()
         with:
-          name: test-results-integration
+          name: test-results-Integration
           path: ./TestResults/Integration
           retention-days: 14

-  security:
-    name: Security Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 25
-    needs: discover
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Security Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Security
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-security.trx
-            if dotnet test "$proj" \
-              --filter "Category=Security" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Security \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Security Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-security
-          path: ./TestResults/Security
-          retention-days: 14
-
-  golden:
-    name: Golden Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 25
-    needs: discover
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Golden Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Golden
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-golden.trx
-            if dotnet test "$proj" \
-              --filter "Category=Golden" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Golden \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Golden Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-golden
-          path: ./TestResults/Golden
-          retention-days: 14

   # ===========================================================================
   # SCHEDULED/ON-DEMAND TESTS
+  # Uses matrix strategy for extended test categories
   # ===========================================================================

-  performance:
-    name: Performance Tests
+  extended-tests:
+    name: ${{ matrix.category }} Tests
     runs-on: ubuntu-22.04
-    timeout-minutes: 45
+    timeout-minutes: ${{ matrix.timeout }}
     needs: discover
-    if: github.event_name == 'schedule' || github.event.inputs.include_performance == 'true'
+    if: >-
+      github.event_name == 'schedule' ||
+      github.event.inputs.include_performance == 'true' ||
+      github.event.inputs.include_benchmark == 'true' ||
+      github.event.inputs.include_airgap == 'true' ||
+      github.event.inputs.include_chaos == 'true' ||
+      github.event.inputs.include_determinism == 'true' ||
+      github.event.inputs.include_resilience == 'true' ||
+      github.event.inputs.include_observability == 'true'
+    strategy:
+      fail-fast: false
+      matrix:
+        include:
+          - category: Performance
+            timeout: 45
+            trigger_input: include_performance
+            run_on_schedule: true
+          - category: Benchmark
+            timeout: 60
+            trigger_input: include_benchmark
+            run_on_schedule: true
+          - category: AirGap
+            timeout: 45
+            trigger_input: include_airgap
+            run_on_schedule: false
+          - category: Chaos
+            timeout: 45
+            trigger_input: include_chaos
+            run_on_schedule: false
+          - category: Determinism
+            timeout: 45
+            trigger_input: include_determinism
+            run_on_schedule: false
+          - category: Resilience
+            timeout: 45
+            trigger_input: include_resilience
+            run_on_schedule: false
+          - category: Observability
+            timeout: 30
+            trigger_input: include_observability
+            run_on_schedule: false

     steps:
+      - name: Check if should run
+        id: should_run
+        run: |
+          SHOULD_RUN="false"
+          if [[ "${{ github.event_name }}" == "schedule" && "${{ matrix.run_on_schedule }}" == "true" ]]; then
+            SHOULD_RUN="true"
+          fi
+          if [[ "${{ github.event.inputs[matrix.trigger_input] }}" == "true" ]]; then
+            SHOULD_RUN="true"
+          fi
+          echo "run=$SHOULD_RUN" >> $GITHUB_OUTPUT
+          echo "Should run ${{ matrix.category }}: $SHOULD_RUN"
+
       - name: Checkout
+        if: steps.should_run.outputs.run == 'true'
         uses: actions/checkout@v4
         with:
           fetch-depth: 0

       - name: Setup .NET
+        if: steps.should_run.outputs.run == 'true'
         uses: actions/setup-dotnet@v4
         with:
           dotnet-version: ${{ env.DOTNET_VERSION }}
           include-prerelease: true

-      - name: Run Performance Tests (all test projects)
+      - name: Run ${{ matrix.category }} Tests
+        if: steps.should_run.outputs.run == 'true'
         run: |
-          mkdir -p ./TestResults/Performance
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-performance.trx
-            if dotnet test "$proj" \
-              --filter "Category=Performance" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Performance \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Performance Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
+          chmod +x .gitea/scripts/test/run-test-category.sh
+          .gitea/scripts/test/run-test-category.sh "${{ matrix.category }}"

       - name: Upload Test Results
         uses: actions/upload-artifact@v4
-        if: always()
+        if: always() && steps.should_run.outputs.run == 'true'
         with:
-          name: test-results-performance
-          path: ./TestResults/Performance
           retention-days: 14

-  benchmark:
-    name: Benchmark Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 60
-    needs: discover
-    if: github.event_name == 'schedule' || github.event.inputs.include_benchmark == 'true'
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Benchmark Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Benchmark
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-benchmark.trx
-            if dotnet test "$proj" \
-              --filter "Category=Benchmark" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Benchmark \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Benchmark Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-benchmark
-          path: ./TestResults/Benchmark
-          retention-days: 14
-
-  airgap:
-    name: AirGap Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 45
-    needs: discover
-    if: github.event.inputs.include_airgap == 'true'
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run AirGap Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/AirGap
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-airgap.trx
-            if dotnet test "$proj" \
-              --filter "Category=AirGap" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/AirGap \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## AirGap Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-airgap
-          path: ./TestResults/AirGap
-          retention-days: 14
-
-  chaos:
-    name: Chaos Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 45
-    needs: discover
-    if: github.event.inputs.include_chaos == 'true'
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Chaos Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Chaos
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-chaos.trx
-            if dotnet test "$proj" \
-              --filter "Category=Chaos" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Chaos \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Chaos Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-chaos
-          path: ./TestResults/Chaos
-          retention-days: 14
-
-  determinism:
-    name: Determinism Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 45
-    needs: discover
-    if: github.event.inputs.include_determinism == 'true'
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Determinism Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Determinism
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-determinism.trx
-            if dotnet test "$proj" \
-              --filter "Category=Determinism" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Determinism \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Determinism Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-determinism
-          path: ./TestResults/Determinism
-          retention-days: 14
-
-  resilience:
-    name: Resilience Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 45
-    needs: discover
-    if: github.event.inputs.include_resilience == 'true'
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Resilience Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Resilience
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-resilience.trx
-            if dotnet test "$proj" \
-              --filter "Category=Resilience" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Resilience \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Resilience Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-resilience
-          path: ./TestResults/Resilience
-          retention-days: 14
-
-  observability:
-    name: Observability Tests
-    runs-on: ubuntu-22.04
-    timeout-minutes: 30
-    needs: discover
-    if: github.event.inputs.include_observability == 'true'
-    steps:
-      - name: Checkout
-        uses: actions/checkout@v4
-        with:
-          fetch-depth: 0
-
-      - name: Setup .NET
-        uses: actions/setup-dotnet@v4
-        with:
-          dotnet-version: ${{ env.DOTNET_VERSION }}
-          include-prerelease: true
-
-      - name: Run Observability Tests (all test projects)
-        run: |
-          mkdir -p ./TestResults/Observability
-          FAILED=0
-          PASSED=0
-          SKIPPED=0
-
-          for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
-            echo "::group::Testing $proj"
-            TRX_NAME=$(echo "$proj" | sed 's|/|_|g' | sed 's|\.csproj||')-observability.trx
-            if dotnet test "$proj" \
-              --filter "Category=Observability" \
-              --configuration Release \
-              --logger "trx;LogFileName=$TRX_NAME" \
-              --results-directory ./TestResults/Observability \
-              --verbosity minimal 2>&1; then
-              PASSED=$((PASSED + 1))
-            else
-              SKIPPED=$((SKIPPED + 1))
-            fi
-            echo "::endgroup::"
-          done
-
-          echo "## Observability Test Summary" >> $GITHUB_STEP_SUMMARY
-          echo "- Passed: $PASSED" >> $GITHUB_STEP_SUMMARY
-          echo "- Skipped: $SKIPPED" >> $GITHUB_STEP_SUMMARY
-
-      - name: Upload Test Results
-        uses: actions/upload-artifact@v4
-        if: always()
-        with:
-          name: test-results-observability
-          path: ./TestResults/Observability
+          name: test-results-${{ matrix.category }}
+          path: ./TestResults/${{ matrix.category }}
           retention-days: 14

   # ===========================================================================
@@ -855,7 +322,7 @@
   summary:
     name: Test Summary
     runs-on: ubuntu-22.04
-    needs: [discover, unit, architecture, contract, integration, security, golden]
+    needs: [discover, pr-gating-tests, integration]
     if: always()
     steps:
       - name: Download all test results
@@ -885,18 +352,14 @@
           echo "| Category | Status |" >> $GITHUB_STEP_SUMMARY
           echo "|----------|--------|" >> $GITHUB_STEP_SUMMARY
           echo "| Discover | ${{ needs.discover.result }} |" >> $GITHUB_STEP_SUMMARY
-          echo "| Unit | ${{ needs.unit.result }} |" >> $GITHUB_STEP_SUMMARY
-          echo "| Architecture | ${{ needs.architecture.result }} |" >> $GITHUB_STEP_SUMMARY
-          echo "| Contract | ${{ needs.contract.result }} |" >> $GITHUB_STEP_SUMMARY
+          echo "| PR-Gating Matrix | ${{ needs.pr-gating-tests.result }} |" >> $GITHUB_STEP_SUMMARY
           echo "| Integration | ${{ needs.integration.result }} |" >> $GITHUB_STEP_SUMMARY
-          echo "| Security | ${{ needs.security.result }} |" >> $GITHUB_STEP_SUMMARY
-          echo "| Golden | ${{ needs.golden.result }} |" >> $GITHUB_STEP_SUMMARY
           echo "" >> $GITHUB_STEP_SUMMARY
           echo "### Test Projects Discovered: ${{ needs.discover.outputs.test-count }}" >> $GITHUB_STEP_SUMMARY

       - name: Count TRX files
         run: |
-          TRX_COUNT=$(find ./TestResults -name "*.trx" | wc -l)
+          TRX_COUNT=$(find ./TestResults -name "*.trx" 2>/dev/null | wc -l || echo "0")
           echo "### Total TRX Files Generated: $TRX_COUNT" >> $GITHUB_STEP_SUMMARY

       - name: Upload Combined Results
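
The refactor above funnels every category through `.gitea/scripts/test/run-test-category.sh`, whose contents are not shown in this commit view. A minimal sketch of what it presumably does, inferred from the inline loops it replaces (the `--collect-coverage` flag comes from the matrix job; everything else here is a hypothetical reconstruction, not the actual script):

```bash
#!/usr/bin/env bash
# run-test-category.sh <Category> [--collect-coverage]
# Hypothetical reconstruction of the shared helper: run every discovered test
# project filtered by category, emit TRX files, fail only on real test failures.
set -o pipefail

CATEGORY="${1:?usage: run-test-category.sh <Category> [--collect-coverage]}"
COVERAGE_ARGS=()
if [[ "${2:-}" == "--collect-coverage" ]]; then
  COVERAGE_ARGS=(--collect:"XPlat Code Coverage")
fi

RESULTS="./TestResults/${CATEGORY}"
mkdir -p "$RESULTS"
PASSED=0
FAILED=0

for proj in $(find src \( -name "*.Tests.csproj" -o -name "*UnitTests.csproj" \
    -o -name "*SmokeTests.csproj" -o -name "*FixtureTests.csproj" \
    -o -name "*IntegrationTests.csproj" \) -type f ! -path "*/node_modules/*" \
    ! -name "StellaOps.TestKit.csproj" ! -name "*Testing.csproj" | sort); do
  # Unique TRX name derived from the project path, as in the old inline loops
  TRX_NAME="$(echo "$proj" | sed 's|/|_|g; s|\.csproj||')-${CATEGORY,,}.trx"
  if dotnet test "$proj" \
      --filter "Category=${CATEGORY}" \
      --configuration Release \
      --logger "trx;LogFileName=${TRX_NAME}" \
      --results-directory "$RESULTS" \
      "${COVERAGE_ARGS[@]}" \
      --verbosity minimal; then
    PASSED=$((PASSED + 1))
  else
    FAILED=$((FAILED + 1))
  fi
done

echo "## ${CATEGORY} Test Summary" >> "${GITHUB_STEP_SUMMARY:-/dev/null}"
echo "- Passed: $PASSED" >> "${GITHUB_STEP_SUMMARY:-/dev/null}"
echo "- Failed: $FAILED" >> "${GITHUB_STEP_SUMMARY:-/dev/null}"

[[ $FAILED -gt 0 ]] && exit 1
exit 0
```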