Compare commits

8 Commits

- d437e6ff36
- 93f1b726ce
- c58c5d3eff
- 60e8c7f9a8
- 272ddf17c2
- b0bf71aa61
- 0cb26df68b
- a7c2eb1d9b
CHANGELOG.md (new file, 74 lines)
@@ -0,0 +1,74 @@
# Changelog

All notable changes to this project will be documented in this file.

## [1.5.0] - 2026-03-29

### Added

- **wfe-rustlang**: New crate with Rust toolchain step executors
  - Cargo steps: `cargo-build`, `cargo-test`, `cargo-check`, `cargo-clippy`, `cargo-fmt`, `cargo-doc`, `cargo-publish`
  - External tool steps with auto-install: `cargo-audit`, `cargo-deny`, `cargo-nextest`, `cargo-llvm-cov`
  - Rustup steps: `rust-install`, `rustup-toolchain`, `rustup-component`, `rustup-target`
  - `cargo-doc-mdx`: generates MDX documentation from rustdoc JSON output using the `rustdoc-types` crate
- **wfe-yaml**: `rustlang` feature flag enabling all cargo/rustup step types
- **wfe-yaml**: Schema fields for Rust steps (`package`, `features`, `toolchain`, `profile`, `output_dir`, etc.)
- **wfe-containerd**: Remote daemon support via `WFE_IO_DIR` environment variable
- **wfe-containerd**: Image chain ID resolution from content store for proper rootfs snapshots
- **wfe-containerd**: Docker-default Linux capabilities for root containers
- Lima `wfe-test` VM config (Alpine + containerd + BuildKit, TCP socat proxy)
- Containerd integration tests running Rust toolchain in containers

### Fixed

- **wfe-containerd**: Empty rootfs — snapshot parent now resolved from image chain ID instead of empty string
- **wfe-containerd**: FIFO deadlock with remote daemons — replaced with regular file I/O
- **wfe-containerd**: `sh: not found` — use absolute `/bin/sh` path in OCI process spec
- **wfe-containerd**: `setgroups: Operation not permitted` — grant capabilities when running as UID 0

### Changed

- Lima `wfe-test` VM uses Alpine apk packages instead of GitHub release binaries
- Container tests use TCP proxy (`http://127.0.0.1:2500`) instead of Unix socket forwarding
- CI pipeline (`workflows.yaml`) updated with `wfe-rustlang` in test, package, and publish steps

879 tests. 88.8% coverage on wfe-rustlang.
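The new step schema can be sketched as a hypothetical workflow fragment; only the step name `cargo-build` and the field names listed above come from this changelog, while the surrounding `steps:`/`id:` layout is assumed:

```yaml
# Hypothetical shape: `cargo-build` and the field names
# (`package`, `features`, `toolchain`, `profile`) come from the
# changelog; the enclosing structure is invented for illustration.
steps:
  - id: build
    type: cargo-build
    package: wfe-rustlang
    toolchain: stable
    profile: release
    features: ["rustlang"]
```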
## [1.4.0] - 2026-03-26

### Added

- Type-safe `when:` conditions on workflow steps with compile-time validation
- Full boolean combinator set: `all` (AND), `any` (OR), `none` (NOR), `one_of` (XOR), `not` (NOT)
- Task file includes with cycle detection
- Self-hosting CI pipeline (`workflows.yaml`) demonstrating all features
- `readFile()` op for the Deno runtime
- Auto-typed `##wfe[output]` annotations (bool, number conversion)
- Multi-workflow YAML files, SubWorkflow step type, typed input/output schemas
- HostContext for programmatic child workflow invocation
- BuildKit image builder and containerd container runner as standalone crates
- gRPC clients generated from official upstream proto files (tonic 0.14)

### Fixed

- Pipeline coverage step produces valid JSON; Deno reads it with `readFile()`
- Host context field added to container executor test contexts
- `.outputs.` paths resolved flat for child workflows
- Pointer status conversion for `Skipped` in the Postgres provider

629 tests. 87.7% coverage.
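A hedged sketch of how these combinators might nest; the combinator names come from the entry above, while the leaf-condition syntax is invented for illustration:

```yaml
# Hypothetical nesting: `all`, `any`, and `not` are the combinator
# names from the changelog; the leaf expressions are assumptions.
when:
  all:
    - any:
        - inputs.env == "prod"
        - inputs.env == "staging"
    - not: inputs.skip_tests
```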
## [1.0.0] - 2026-03-23

### Added

- **wfe-core**: Workflow engine with step primitives, executor, fluent builder API
- **wfe**: WorkflowHost, registry, sync runner, and purger
- **wfe-sqlite**: SQLite persistence provider
- **wfe-postgres**: PostgreSQL persistence provider
- **wfe-opensearch**: OpenSearch search index provider
- **wfe-valkey**: Valkey provider for locks, queues, and lifecycle events
- **wfe-yaml**: YAML workflow definitions with shell and deno executors
- **wfe-yaml**: Deno JS/TS runtime with sandboxed permissions, HTTP ops, npm support via esm.sh
- OpenTelemetry tracing support behind `otel` feature flag
- In-memory test support providers
Cargo.toml (21 changed lines)
@@ -1,9 +1,9 @@
[workspace]
members = ["wfe-core", "wfe-sqlite", "wfe-postgres", "wfe-opensearch", "wfe-valkey", "wfe", "wfe-yaml", "wfe-buildkit", "wfe-containerd", "wfe-containerd-protos", "wfe-buildkit-protos"]
members = ["wfe-core", "wfe-sqlite", "wfe-postgres", "wfe-opensearch", "wfe-valkey", "wfe", "wfe-yaml", "wfe-buildkit", "wfe-containerd", "wfe-containerd-protos", "wfe-buildkit-protos", "wfe-rustlang"]
resolver = "2"

[workspace.package]
version = "1.4.0"
version = "1.5.0"
edition = "2024"
license = "MIT"
repository = "https://src.sunbeam.pt/studio/wfe"

@@ -38,14 +38,15 @@ redis = { version = "0.27", features = ["tokio-comp", "connection-manager"] }
opensearch = "2"

# Internal crates
wfe-core = { version = "1.4.0", path = "wfe-core" }
wfe-sqlite = { version = "1.4.0", path = "wfe-sqlite" }
wfe-postgres = { version = "1.4.0", path = "wfe-postgres" }
wfe-opensearch = { version = "1.4.0", path = "wfe-opensearch" }
wfe-valkey = { version = "1.4.0", path = "wfe-valkey" }
wfe-yaml = { version = "1.4.0", path = "wfe-yaml" }
wfe-buildkit = { version = "1.4.0", path = "wfe-buildkit" }
wfe-containerd = { version = "1.4.0", path = "wfe-containerd" }
wfe-core = { version = "1.5.0", path = "wfe-core", registry = "sunbeam" }
wfe-sqlite = { version = "1.5.0", path = "wfe-sqlite", registry = "sunbeam" }
wfe-postgres = { version = "1.5.0", path = "wfe-postgres", registry = "sunbeam" }
wfe-opensearch = { version = "1.5.0", path = "wfe-opensearch", registry = "sunbeam" }
wfe-valkey = { version = "1.5.0", path = "wfe-valkey", registry = "sunbeam" }
wfe-yaml = { version = "1.5.0", path = "wfe-yaml", registry = "sunbeam" }
wfe-buildkit = { version = "1.5.0", path = "wfe-buildkit", registry = "sunbeam" }
wfe-containerd = { version = "1.5.0", path = "wfe-containerd", registry = "sunbeam" }
wfe-rustlang = { version = "1.5.0", path = "wfe-rustlang", registry = "sunbeam" }

# YAML
serde_yaml = "0.9"

test/lima/wfe-test.yaml
@@ -1,18 +1,22 @@
# WFE Test VM — BuildKit + containerd with host-accessible sockets
# WFE Test VM — Alpine + containerd + BuildKit
#
# Provides both buildkitd and containerd daemons with Unix sockets
# forwarded to the host for integration testing.
# Lightweight VM for running wfe-buildkit and wfe-containerd integration tests.
# Provides system-level containerd and BuildKit daemons with Unix sockets
# forwarded to the host.
#
# Usage:
#   limactl start ./test/lima/wfe-test.yaml
#   limactl create --name wfe-test ./test/lima/wfe-test.yaml
#   limactl start wfe-test
#
# Sockets (on host after start):
#   BuildKit:   unix://$HOME/.lima/wfe-test/sock/buildkitd.sock
#   containerd: unix://$HOME/.lima/wfe-test/sock/containerd.sock
#   BuildKit:   unix://$HOME/.lima/wfe-test/buildkitd.sock
#   containerd: unix://$HOME/.lima/wfe-test/containerd.sock
#
# Verify:
#   BUILDKIT_HOST="unix://$HOME/.lima/wfe-test/sock/buildkitd.sock" buildctl debug workers
#   # containerd accessible via gRPC at unix://$HOME/.lima/wfe-test/sock/containerd.sock
# Run tests:
#   WFE_BUILDKIT_ADDR="unix://$HOME/.lima/wfe-test/buildkitd.sock" \
#   WFE_CONTAINERD_ADDR="unix://$HOME/.lima/wfe-test/containerd.sock" \
#   cargo test -p wfe-buildkit -p wfe-containerd --test integration
#   cargo test -p wfe-yaml --features rustlang,containerd --test rustlang_containerd -- --ignored
#
# Teardown:
#   limactl stop wfe-test
@@ -21,30 +25,117 @@
message: |
  WFE integration test VM is ready.

  BuildKit socket:   unix://{{.Dir}}/sock/buildkitd.sock
  containerd socket: unix://{{.Dir}}/sock/containerd.sock

  Verify BuildKit:
    BUILDKIT_HOST="unix://{{.Dir}}/sock/buildkitd.sock" buildctl debug workers
  containerd: http://127.0.0.1:2500 (TCP proxy, use for gRPC)
  BuildKit:   http://127.0.0.1:2501 (TCP proxy, use for gRPC)

  Run tests:
    WFE_BUILDKIT_ADDR="unix://{{.Dir}}/sock/buildkitd.sock" \
    WFE_CONTAINERD_ADDR="unix://{{.Dir}}/sock/containerd.sock" \
    cargo nextest run -p wfe-buildkit -p wfe-containerd
    WFE_CONTAINERD_ADDR="http://127.0.0.1:2500" \
    WFE_BUILDKIT_ADDR="http://127.0.0.1:2501" \
    cargo test -p wfe-yaml --features rustlang,containerd --test rustlang_containerd -- --ignored

minimumLimaVersion: 2.0.0
minimumLimaVersion: "2.0.0"
base: template:_images/ubuntu-lts
vmType: vz
mountType: virtiofs
cpus: 2
memory: 4GiB
disk: 20GiB

images:
  - location: "https://dl-cdn.alpinelinux.org/alpine/v3.21/releases/cloud/nocloud_alpine-3.21.6-aarch64-uefi-cloudinit-r0.qcow2"
    arch: "aarch64"
  - location: "https://dl-cdn.alpinelinux.org/alpine/v3.21/releases/cloud/nocloud_alpine-3.21.6-x86_64-uefi-cloudinit-r0.qcow2"
    arch: "x86_64"

mounts:
  # Share /tmp so the containerd shim can access FIFOs created by the host-side executor
  - location: /tmp/wfe-io
    mountPoint: /tmp/wfe-io
    writable: true

containerd:
  system: false
  user: true
  user: false
provision:
  # 1. Base packages + containerd + buildkit from Alpine repos (musl-compatible)
  - mode: system
    script: |
      #!/bin/sh
      set -eux
      apk update
      apk add --no-cache \
        curl bash coreutils findutils grep tar gzip pigz \
        containerd containerd-openrc \
        runc \
        buildkit buildkit-openrc \
        nerdctl

  # 2. Start containerd
  - mode: system
    script: |
      #!/bin/sh
      set -eux
      rc-update add containerd default 2>/dev/null || true
      rc-service containerd start 2>/dev/null || true
      # Wait for socket
      for i in $(seq 1 15); do
        [ -S /run/containerd/containerd.sock ] && break
        sleep 1
      done
      chmod 666 /run/containerd/containerd.sock 2>/dev/null || true

  # 3. Start BuildKit (Alpine package names the service "buildkitd")
  - mode: system
    script: |
      #!/bin/sh
      set -eux
      rc-update add buildkitd default 2>/dev/null || true
      rc-service buildkitd start 2>/dev/null || true

  # 4. Fix socket permissions + TCP proxy for gRPC access (persists across reboots)
  - mode: system
    script: |
      #!/bin/sh
      set -eux
      apk add --no-cache socat
      mkdir -p /etc/local.d
      cat > /etc/local.d/fix-sockets.start << 'EOF'
      #!/bin/sh
      # Wait for daemons
      for i in $(seq 1 30); do
        [ -S /run/buildkit/buildkitd.sock ] && break
        sleep 1
      done
      # Fix permissions for Lima socket forwarding
      chmod 755 /run/buildkit /run/containerd 2>/dev/null
      chmod 666 /run/buildkit/buildkitd.sock /run/containerd/containerd.sock 2>/dev/null
      # TCP proxy for gRPC (Lima socket forwarding breaks HTTP/2)
      socat TCP4-LISTEN:2500,fork,reuseaddr UNIX-CONNECT:/run/containerd/containerd.sock &
      socat TCP4-LISTEN:2501,fork,reuseaddr UNIX-CONNECT:/run/buildkit/buildkitd.sock &
      EOF
      chmod +x /etc/local.d/fix-sockets.start
      rc-update add local default 2>/dev/null || true
      /etc/local.d/fix-sockets.start
probes:
  - script: |
      #!/bin/sh
      set -eux
      sudo test -S /run/containerd/containerd.sock
      sudo chmod 755 /run/containerd 2>/dev/null
      sudo chmod 666 /run/containerd/containerd.sock 2>/dev/null
    hint: "Waiting for containerd socket"
  - script: |
      #!/bin/sh
      set -eux
      sudo test -S /run/buildkit/buildkitd.sock
      sudo chmod 755 /run/buildkit 2>/dev/null
      sudo chmod 666 /run/buildkit/buildkitd.sock 2>/dev/null
    hint: "Waiting for BuildKit socket"
portForwards:
  # BuildKit daemon socket
  - guestSocket: "/run/user/{{.UID}}/buildkit-default/buildkitd.sock"
    hostSocket: "{{.Dir}}/sock/buildkitd.sock"

  # containerd daemon socket (rootless)
  - guestSocket: "/run/user/{{.UID}}/containerd/containerd.sock"
    hostSocket: "{{.Dir}}/sock/containerd.sock"
  - guestSocket: "/run/buildkit/buildkitd.sock"
    hostSocket: "{{.Dir}}/buildkitd.sock"
  - guestSocket: "/run/containerd/containerd.sock"
    hostSocket: "{{.Dir}}/containerd.sock"
@@ -16,7 +16,7 @@ async-trait = { workspace = true }
tracing = { workspace = true }
thiserror = { workspace = true }
regex = { workspace = true }
wfe-buildkit-protos = { path = "../wfe-buildkit-protos" }
wfe-buildkit-protos = { version = "1.5.0", path = "../wfe-buildkit-protos", registry = "sunbeam" }
tonic = "0.14"
tower = { version = "0.4", features = ["util"] }
hyper-util = { version = "0.1", features = ["tokio"] }
@@ -9,7 +9,7 @@ description = "containerd container runner executor for WFE"
[dependencies]
wfe-core = { workspace = true }
wfe-containerd-protos = { path = "../wfe-containerd-protos" }
wfe-containerd-protos = { version = "1.5.0", path = "../wfe-containerd-protos", registry = "sunbeam" }
tokio = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }

@@ -21,7 +21,8 @@ tower = "0.5"
hyper-util = { version = "0.1", features = ["tokio"] }
prost-types = "0.14"
uuid = { version = "1", features = ["v4"] }
libc = "0.2"
sha2 = "0.10"
tokio-stream = "0.1"

[dev-dependencies]
pretty_assertions = { workspace = true }
@@ -1,3 +1,50 @@
//! Containerd container executor for WFE.
//!
//! Runs workflow steps as isolated OCI containers via the containerd gRPC API.
//!
//! # Remote daemon support
//!
//! The executor creates named pipes (FIFOs) on the **local** filesystem for
//! stdout/stderr capture, then passes those paths to the containerd task spec.
//! The containerd shim opens the FIFOs from **its** side. This means the FIFO
//! paths must be accessible to both the executor process and the containerd
//! daemon.
//!
//! When containerd runs on a different machine (e.g. a Lima VM), you need:
//!
//! 1. **Shared filesystem** — mount a host directory into the VM so both sides
//!    see the same FIFO files. With Lima + virtiofs:
//!    ```yaml
//!    # lima config
//!    mounts:
//!      - location: /tmp/wfe-io
//!        mountPoint: /tmp/wfe-io
//!        writable: true
//!    ```
//!
//! 2. **`WFE_IO_DIR` env var** — point the executor at the shared directory:
//!    ```sh
//!    export WFE_IO_DIR=/tmp/wfe-io
//!    ```
//!    Without this, FIFOs are created under `std::env::temp_dir()` which is
//!    only visible to the host.
//!
//! 3. **gRPC transport** — Lima's Unix socket forwarding is unreliable for
//!    HTTP/2 (gRPC). Use a TCP socat proxy inside the VM instead:
//!    ```sh
//!    # Inside the VM:
//!    socat TCP4-LISTEN:2500,fork,reuseaddr UNIX-CONNECT:/run/containerd/containerd.sock &
//!    ```
//!    Then connect via `WFE_CONTAINERD_ADDR=http://127.0.0.1:2500` (Lima
//!    auto-forwards guest TCP ports).
//!
//! 4. **FIFO permissions** — the FIFOs are created with mode `0666` and a
//!    temporarily cleared umask so the remote shim (running as root) can open
//!    them through the shared mount.
//!
//! See `test/lima/wfe-test.yaml` for a complete VM configuration that sets all
//! of this up.

pub mod config;
pub mod step;
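Taken together, the four requirements above amount to an environment like this before running tests against a Lima VM. The paths and port mirror the module docs above; the actual test command is left commented out because it needs a live daemon:

```shell
# Shared IO directory (requirements 1 and 2): must match the virtiofs mount.
export WFE_IO_DIR=/tmp/wfe-io
# gRPC via the in-VM socat TCP proxy (requirement 3).
export WFE_CONTAINERD_ADDR=http://127.0.0.1:2500
# With a live daemon one could then run, e.g.:
#   cargo test -p wfe-containerd --test integration
echo "$WFE_IO_DIR $WFE_CONTAINERD_ADDR"
# → /tmp/wfe-io http://127.0.0.1:2500
```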
@@ -11,6 +11,9 @@ use wfe_containerd_protos::containerd::services::containers::v1::{
    containers_client::ContainersClient, Container, CreateContainerRequest,
    DeleteContainerRequest, container::Runtime,
};
use wfe_containerd_protos::containerd::services::content::v1::{
    content_client::ContentClient, ReadContentRequest,
};
use wfe_containerd_protos::containerd::services::images::v1::{
    images_client::ImagesClient, GetImageRequest,
};
@@ -134,6 +137,153 @@ impl ContainerdStep {
        }
    }

    /// Resolve the snapshot chain ID for an image.
    ///
    /// This reads the image manifest and config from the content store to
    /// compute the chain ID of the topmost layer. The chain ID is used as
    /// the parent snapshot when preparing a writable rootfs for a container.
    ///
    /// Chain ID computation follows the OCI image spec:
    ///   chain_id[0] = diff_id[0]
    ///   chain_id[n] = sha256(chain_id[n-1] + " " + diff_id[n])
    async fn resolve_image_chain_id(
        channel: &Channel,
        image: &str,
        namespace: &str,
    ) -> Result<String, WfeError> {
        use sha2::{Sha256, Digest};

        // 1. Get the image record to find the manifest digest.
        let mut images_client = ImagesClient::new(channel.clone());
        let req = Self::with_namespace(
            GetImageRequest { name: image.to_string() },
            namespace,
        );
        let image_resp = images_client.get(req).await.map_err(|e| {
            WfeError::StepExecution(format!("failed to get image '{image}': {e}"))
        })?;
        let img = image_resp.into_inner().image.ok_or_else(|| {
            WfeError::StepExecution(format!("image '{image}' has no record"))
        })?;
        let target = img.target.ok_or_else(|| {
            WfeError::StepExecution(format!("image '{image}' has no target descriptor"))
        })?;

        // The target might be an index (multi-platform) or a manifest.
        // Read the content and determine based on mediaType.
        let manifest_digest = target.digest.clone();
        let manifest_bytes = Self::read_content(channel, &manifest_digest, namespace).await?;
        let manifest_json: serde_json::Value = serde_json::from_slice(&manifest_bytes)
            .map_err(|e| WfeError::StepExecution(format!("failed to parse manifest: {e}")))?;

        // 2. If it's an index, pick the matching platform manifest.
        let manifest_json = if manifest_json.get("manifests").is_some() {
            // OCI image index — find the platform-matching manifest.
            let arch = std::env::consts::ARCH;
            let oci_arch = match arch {
                "aarch64" => "arm64",
                "x86_64" => "amd64",
                other => other,
            };
            let manifests = manifest_json["manifests"].as_array().ok_or_else(|| {
                WfeError::StepExecution("image index has no manifests array".to_string())
            })?;
            let platform_manifest = manifests.iter().find(|m| {
                m.get("platform")
                    .and_then(|p| p.get("architecture"))
                    .and_then(|a| a.as_str())
                    == Some(oci_arch)
            }).ok_or_else(|| {
                WfeError::StepExecution(format!(
                    "no manifest for architecture '{oci_arch}' in image index"
                ))
            })?;
            let digest = platform_manifest["digest"].as_str().ok_or_else(|| {
                WfeError::StepExecution("platform manifest has no digest".to_string())
            })?;
            let bytes = Self::read_content(channel, digest, namespace).await?;
            serde_json::from_slice(&bytes)
                .map_err(|e| WfeError::StepExecution(format!("failed to parse platform manifest: {e}")))?
        } else {
            manifest_json
        };

        // 3. Get the config digest from the manifest.
        let config_digest = manifest_json["config"]["digest"]
            .as_str()
            .ok_or_else(|| {
                WfeError::StepExecution("manifest has no config.digest".to_string())
            })?;

        // 4. Read the image config.
        let config_bytes = Self::read_content(channel, config_digest, namespace).await?;
        let config_json: serde_json::Value = serde_json::from_slice(&config_bytes)
            .map_err(|e| WfeError::StepExecution(format!("failed to parse image config: {e}")))?;

        // 5. Extract diff_ids and compute chain ID.
        let diff_ids = config_json["rootfs"]["diff_ids"]
            .as_array()
            .ok_or_else(|| {
                WfeError::StepExecution("image config has no rootfs.diff_ids".to_string())
            })?;

        if diff_ids.is_empty() {
            return Err(WfeError::StepExecution(
                "image has no layers (empty diff_ids)".to_string(),
            ));
        }

        let mut chain_id = diff_ids[0]
            .as_str()
            .ok_or_else(|| WfeError::StepExecution("diff_id is not a string".to_string()))?
            .to_string();

        for diff_id in &diff_ids[1..] {
            let diff = diff_id.as_str().ok_or_else(|| {
                WfeError::StepExecution("diff_id is not a string".to_string())
            })?;
            let mut hasher = Sha256::new();
            hasher.update(format!("{chain_id} {diff}"));
            chain_id = format!("sha256:{:x}", hasher.finalize());
        }

        tracing::debug!(image = image, chain_id = %chain_id, "resolved image chain ID");
        Ok(chain_id)
    }
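The chain-ID rule implemented above can be reproduced standalone; a minimal sketch, assuming GNU coreutils `sha256sum` is available:

```shell
# Sketch of the OCI chain-ID rule:
#   chain_id[0] = diff_id[0]
#   chain_id[n] = sha256(chain_id[n-1] + " " + diff_id[n])
compute_chain_id() {
  chain_id=""
  for diff_id in "$@"; do
    if [ -z "$chain_id" ]; then
      # First layer: the chain ID is the diff ID itself.
      chain_id="$diff_id"
    else
      # Hash "previous-chain-id SPACE diff-id" and re-prefix with sha256:
      sum=$(printf '%s %s' "$chain_id" "$diff_id" | sha256sum | awk '{print $1}')
      chain_id="sha256:$sum"
    fi
  done
  printf '%s\n' "$chain_id"
}

# A single-layer image's chain ID equals its diff ID:
compute_chain_id "sha256:aaaa"
# → sha256:aaaa
```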
    /// Read content from the containerd content store by digest.
    async fn read_content(
        channel: &Channel,
        digest: &str,
        namespace: &str,
    ) -> Result<Vec<u8>, WfeError> {
        use tokio_stream::StreamExt;

        let mut client = ContentClient::new(channel.clone());
        let req = Self::with_namespace(
            ReadContentRequest {
                digest: digest.to_string(),
                offset: 0,
                size: 0, // read all
            },
            namespace,
        );

        let mut stream = client.read(req).await.map_err(|e| {
            WfeError::StepExecution(format!("failed to read content {digest}: {e}"))
        })?.into_inner();

        let mut data = Vec::new();
        while let Some(chunk) = stream.next().await {
            let chunk = chunk.map_err(|e| {
                WfeError::StepExecution(format!("error reading content {digest}: {e}"))
            })?;
            data.extend_from_slice(&chunk.data);
        }

        Ok(data)
    }

    /// Build a minimal OCI runtime spec as a `prost_types::Any`.
    ///
    /// The spec is serialized as JSON and wrapped in a protobuf Any with
@@ -144,7 +294,7 @@ impl ContainerdStep {
    ) -> prost_types::Any {
        // Build the args array for the process.
        let args: Vec<String> = if let Some(ref run) = self.config.run {
            vec!["sh".to_string(), "-c".to_string(), run.clone()]
            vec!["/bin/sh".to_string(), "-c".to_string(), run.clone()]
        } else if let Some(ref command) = self.config.command {
            command.clone()
        } else {
@@ -206,13 +356,24 @@ impl ContainerdStep {
            "cwd": self.config.working_dir.as_deref().unwrap_or("/"),
        });

        // Add capabilities (minimal set).
        // Add capabilities. When running as root, grant the default Docker
        // capability set so tools like apt-get work. Non-root gets nothing.
        let caps = if uid == 0 {
            serde_json::json!([
                "CAP_AUDIT_WRITE", "CAP_CHOWN", "CAP_DAC_OVERRIDE",
                "CAP_FOWNER", "CAP_FSETID", "CAP_KILL", "CAP_MKNOD",
                "CAP_NET_BIND_SERVICE", "CAP_NET_RAW", "CAP_SETFCAP",
                "CAP_SETGID", "CAP_SETPCAP", "CAP_SETUID", "CAP_SYS_CHROOT",
            ])
        } else {
            serde_json::json!([])
        };
        process["capabilities"] = serde_json::json!({
            "bounding": [],
            "effective": [],
            "inheritable": [],
            "permitted": [],
            "ambient": [],
            "bounding": caps,
            "effective": caps,
            "inheritable": caps,
            "permitted": caps,
            "ambient": caps,
        });

        let spec = serde_json::json!({
@@ -400,15 +561,11 @@ impl StepBody for ContainerdStep {
            WfeError::StepExecution(format!("failed to create container: {e}"))
        })?;

        // 6. Prepare snapshot to get rootfs mounts.
        // 6. Prepare snapshot with the image's layers as parent.
        let mut snapshots_client = SnapshotsClient::new(channel.clone());

        // Get the image's chain ID to use as parent for the snapshot.
        // We try to get mounts from the snapshot (already committed by image unpack).
        // If snapshot already exists, use Mounts; otherwise Prepare from the image's
        // snapshot key (same as container_id for our flow).
        let mounts = {
            // First try: see if the snapshot was already prepared.
            // First try: see if a snapshot was already prepared for this container.
            let mounts_req = Self::with_namespace(
                MountsRequest {
                    snapshotter: DEFAULT_SNAPSHOTTER.to_string(),
@@ -420,12 +577,18 @@ impl StepBody for ContainerdStep {
            match snapshots_client.mounts(mounts_req).await {
                Ok(resp) => resp.into_inner().mounts,
                Err(_) => {
                    // Try to prepare a fresh snapshot.
                    // Resolve the image's chain ID to use as snapshot parent.
                    let parent = if should_check {
                        Self::resolve_image_chain_id(&channel, &self.config.image, namespace).await?
                    } else {
                        String::new()
                    };

                    let prepare_req = Self::with_namespace(
                        PrepareSnapshotRequest {
                            snapshotter: DEFAULT_SNAPSHOTTER.to_string(),
                            key: container_id.clone(),
                            parent: String::new(),
                            parent,
                            labels: HashMap::new(),
                        },
                        namespace,
@@ -445,7 +608,12 @@ impl StepBody for ContainerdStep {
        };

        // 7. Create FIFO paths for stdout/stderr capture.
        let tmp_dir = std::env::temp_dir().join(format!("wfe-io-{container_id}"));
        // Use WFE_IO_DIR if set (e.g., a shared mount with a remote containerd daemon),
        // otherwise fall back to the system temp directory.
        let io_base = std::env::var("WFE_IO_DIR")
            .map(std::path::PathBuf::from)
            .unwrap_or_else(|_| std::env::temp_dir());
        let tmp_dir = io_base.join(format!("wfe-io-{container_id}"));
        std::fs::create_dir_all(&tmp_dir).map_err(|e| {
            WfeError::StepExecution(format!("failed to create IO temp dir: {e}"))
        })?;
@@ -453,19 +621,26 @@ impl StepBody for ContainerdStep {
        let stdout_path = tmp_dir.join("stdout");
        let stderr_path = tmp_dir.join("stderr");

        // Create named pipes (FIFOs) for the task I/O.
        // Create empty files for the shim to write stdout/stderr to.
        // We use regular files instead of FIFOs because FIFOs don't work
        // across filesystem boundaries (e.g., virtiofs mounts with Lima VMs).
        for path in [&stdout_path, &stderr_path] {
            // Remove if exists from a previous run.
            let _ = std::fs::remove_file(path);
            nix_mkfifo(path).map_err(|e| {
                WfeError::StepExecution(format!("failed to create FIFO {}: {e}", path.display()))
            std::fs::File::create(path).map_err(|e| {
                WfeError::StepExecution(format!("failed to create IO file {}: {e}", path.display()))
            })?;
            // Ensure the remote shim can write to it.
            #[cfg(unix)]
            {
                use std::os::unix::fs::PermissionsExt;
                std::fs::set_permissions(path, std::fs::Permissions::from_mode(0o666)).ok();
            }
        }

        let stdout_str = stdout_path.to_string_lossy().to_string();
        let stderr_str = stderr_path.to_string_lossy().to_string();

        // 8. Create and start task.
        // 8. Create task.
        let mut tasks_client = TasksClient::new(channel.clone());

        let create_task_req = Self::with_namespace(
@@ -487,17 +662,6 @@ impl StepBody for ContainerdStep {
            WfeError::StepExecution(format!("failed to create task: {e}"))
        })?;

        // Spawn readers for FIFOs before starting the task (FIFOs block on open
        // until both ends connect).
        let stdout_reader = {
            let path = stdout_path.clone();
            tokio::spawn(async move { read_fifo(&path).await })
        };
        let stderr_reader = {
            let path = stderr_path.clone();
            tokio::spawn(async move { read_fifo(&path).await })
        };

        // Start the task.
        let start_req = Self::with_namespace(
            StartRequest {
@@ -555,14 +719,12 @@ impl StepBody for ContainerdStep {
            }
        };

        // 10. Read captured output.
        let stdout_content = stdout_reader
        // 10. Read captured output from files.
        let stdout_content = tokio::fs::read_to_string(&stdout_path)
            .await
            .unwrap_or_else(|_| Ok(String::new()))
            .unwrap_or_default();
        let stderr_content = stderr_reader
        let stderr_content = tokio::fs::read_to_string(&stderr_path)
            .await
            .unwrap_or_else(|_| Ok(String::new()))
            .unwrap_or_default();

        // 11. Cleanup: delete task, then container.
@@ -629,38 +791,6 @@ impl ContainerdStep {
    }
}

/// Create a named pipe (FIFO) at the given path. This is a thin wrapper
/// around the `mkfifo` libc call, avoiding an extra dependency.
fn nix_mkfifo(path: &Path) -> std::io::Result<()> {
    use std::ffi::CString;
    use std::os::unix::ffi::OsStrExt;

    let c_path = CString::new(path.as_os_str().as_bytes())
        .map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidInput, e))?;

    // SAFETY: c_path is a valid null-terminated C string and 0o622 is a
    // standard FIFO permission mode.
    let ret = unsafe { libc::mkfifo(c_path.as_ptr(), 0o622) };
    if ret != 0 {
        Err(std::io::Error::last_os_error())
    } else {
        Ok(())
    }
}

/// Read the entire contents of a FIFO into a String. This opens the FIFO
/// in read mode (which blocks until a writer opens the other end) and reads
/// until EOF.
async fn read_fifo(path: &Path) -> Result<String, std::io::Error> {
    use tokio::io::AsyncReadExt;

    let file = tokio::fs::File::open(path).await?;
    let mut reader = tokio::io::BufReader::new(file);
    let mut buf = String::new();
    reader.read_to_string(&mut buf).await?;
    Ok(buf)
}

#[cfg(test)]
mod tests {
    use super::*;
@@ -864,7 +994,7 @@ mod tests {
        // Deserialize and verify.
        let parsed: serde_json::Value = serde_json::from_slice(&spec.value).unwrap();
        assert_eq!(parsed["ociVersion"], "1.0.2");
        assert_eq!(parsed["process"]["args"][0], "sh");
        assert_eq!(parsed["process"]["args"][0], "/bin/sh");
        assert_eq!(parsed["process"]["args"][1], "-c");
        assert_eq!(parsed["process"]["args"][2], "echo hello");
        assert_eq!(parsed["process"]["user"]["uid"], 65534);

@@ -1033,22 +1163,6 @@
        assert_eq!(step.config.containerd_addr, "/run/containerd/containerd.sock");
    }

    // ── nix_mkfifo ─────────────────────────────────────────────────────

    #[test]
    fn mkfifo_creates_and_removes_fifo() {
        let tmp = tempfile::tempdir().unwrap();
        let fifo_path = tmp.path().join("test.fifo");
        nix_mkfifo(&fifo_path).unwrap();
        assert!(fifo_path.exists());
        std::fs::remove_file(&fifo_path).unwrap();
    }

    #[test]
    fn mkfifo_invalid_path_returns_error() {
        let result = nix_mkfifo(Path::new("/nonexistent-dir/fifo"));
        assert!(result.is_err());
    }
}

/// Integration tests that require a live containerd daemon.
@@ -2,7 +2,7 @@
//!
//! These tests require a live containerd daemon. They are skipped when the
//! socket is not available. Set `WFE_CONTAINERD_ADDR` to point to a custom
-//! socket, or use the default `~/.lima/wfe-test/sock/containerd.sock`.
+//! socket, or use the default `~/.lima/wfe-test/containerd.sock`.
//!
//! Before running, ensure the test image is pre-pulled:
//!     ctr -n default image pull docker.io/library/alpine:3.18
@@ -15,19 +15,27 @@ use wfe_containerd::ContainerdStep;
use wfe_core::models::{ExecutionPointer, WorkflowInstance, WorkflowStep};
use wfe_core::traits::step::{StepBody, StepExecutionContext};

-/// Returns the containerd socket address if available, or None.
+/// Returns the containerd address if available, or None.
+/// Set `WFE_CONTAINERD_ADDR` to a TCP address (http://host:port) or
+/// Unix socket path (unix:///path). Defaults to the Lima wfe-test
+/// TCP proxy at http://127.0.0.1:2500.
fn containerd_addr() -> Option<String> {
-   let addr = std::env::var("WFE_CONTAINERD_ADDR").unwrap_or_else(|_| {
-       format!(
-           "unix://{}/.lima/wfe-test/sock/containerd.sock",
-           std::env::var("HOME").unwrap_or_else(|_| "/root".to_string())
-       )
-   });
-   let socket_path = addr.strip_prefix("unix://").unwrap_or(addr.as_str());
-
-   if Path::new(socket_path).exists() {
-       Some(addr)
+   if let Ok(addr) = std::env::var("WFE_CONTAINERD_ADDR") {
+       if addr.starts_with("http://") || addr.starts_with("tcp://") {
+           return Some(addr);
+       }
+       let socket_path = addr.strip_prefix("unix://").unwrap_or(addr.as_str());
+       if Path::new(socket_path).exists() {
+           return Some(addr);
+       }
+       return None;
+   }
+
+   // Default: check if the Lima wfe-test socket exists (for lightweight tests).
+   let home = std::env::var("HOME").unwrap_or_else(|_| "/root".to_string());
+   let socket = format!("{home}/.lima/wfe-test/containerd.sock");
+   if Path::new(&socket).exists() {
+       Some(format!("unix://{socket}"))
    } else {
        None
    }
@@ -151,6 +159,142 @@ async fn skip_image_check_when_pull_never() {
    );
}

// ── Run a real container end-to-end ──────────────────────────────────

#[tokio::test]
async fn run_echo_hello_in_container() {
    let Some(addr) = containerd_addr() else {
        eprintln!("SKIP: containerd socket not available");
        return;
    };

    let mut config = minimal_config(&addr);
    config.image = "docker.io/library/alpine:3.18".to_string();
    config.run = Some("echo hello-from-container".to_string());
    config.pull = "if-not-present".to_string();
    config.user = "0:0".to_string();
    config.timeout_ms = Some(30_000);
    let mut step = ContainerdStep::new(config);

    let mut wf_step = WorkflowStep::new(0, "containerd");
    wf_step.name = Some("echo-test".to_string());
    let workflow = WorkflowInstance::new("test-wf", 1, serde_json::json!({}));
    let pointer = ExecutionPointer::new(0);
    let ctx = make_context(&wf_step, &workflow, &pointer);

    let result = step.run(&ctx).await;
    match &result {
        Ok(r) => {
            eprintln!("SUCCESS: {:?}", r.output_data);
            let data = r.output_data.as_ref().unwrap().as_object().unwrap();
            let stdout = data.get("echo-test.stdout").unwrap().as_str().unwrap();
            assert!(stdout.contains("hello-from-container"), "stdout: {stdout}");
        }
        Err(e) => panic!("container step failed: {e}"),
    }
}
// ── Run a container with a volume mount ──────────────────────────────

#[tokio::test]
async fn run_container_with_volume_mount() {
    let Some(addr) = containerd_addr() else {
        eprintln!("SKIP: containerd socket not available");
        return;
    };

    let shared_dir = std::env::var("WFE_IO_DIR")
        .unwrap_or_else(|_| "/tmp/wfe-io".to_string());
    let vol_dir = format!("{shared_dir}/test-vol");
    std::fs::create_dir_all(&vol_dir).unwrap();

    let mut config = minimal_config(&addr);
    config.image = "docker.io/library/alpine:3.18".to_string();
    config.run = Some("echo hello > /mnt/test/output.txt && cat /mnt/test/output.txt".to_string());
    config.pull = "if-not-present".to_string();
    config.user = "0:0".to_string();
    config.timeout_ms = Some(30_000);
    config.volumes = vec![wfe_containerd::VolumeMountConfig {
        source: vol_dir.clone(),
        target: "/mnt/test".to_string(),
        readonly: false,
    }];
    let mut step = ContainerdStep::new(config);

    let mut wf_step = WorkflowStep::new(0, "containerd");
    wf_step.name = Some("vol-test".to_string());
    let workflow = WorkflowInstance::new("test-wf", 1, serde_json::json!({}));
    let pointer = ExecutionPointer::new(0);
    let ctx = make_context(&wf_step, &workflow, &pointer);

    match step.run(&ctx).await {
        Ok(r) => {
            let data = r.output_data.as_ref().unwrap().as_object().unwrap();
            let stdout = data.get("vol-test.stdout").unwrap().as_str().unwrap();
            assert!(stdout.contains("hello"), "stdout: {stdout}");
        }
        Err(e) => panic!("container step with volume failed: {e}"),
    }

    std::fs::remove_dir_all(&vol_dir).ok();
}
// ── Run a container with volume mount and network (simulates install step) ──

#[tokio::test]
async fn run_debian_with_volume_and_network() {
    let Some(addr) = containerd_addr() else {
        eprintln!("SKIP: containerd socket not available");
        return;
    };

    let shared_dir = std::env::var("WFE_IO_DIR")
        .unwrap_or_else(|_| "/tmp/wfe-io".to_string());
    let cargo_dir = format!("{shared_dir}/test-cargo");
    let rustup_dir = format!("{shared_dir}/test-rustup");
    std::fs::create_dir_all(&cargo_dir).unwrap();
    std::fs::create_dir_all(&rustup_dir).unwrap();

    let mut config = minimal_config(&addr);
    config.image = "docker.io/library/debian:bookworm-slim".to_string();
    config.run = Some("echo hello && ls /cargo && ls /rustup".to_string());
    config.pull = "if-not-present".to_string();
    config.user = "0:0".to_string();
    config.network = "host".to_string();
    config.timeout_ms = Some(30_000);
    config.env.insert("CARGO_HOME".to_string(), "/cargo".to_string());
    config.env.insert("RUSTUP_HOME".to_string(), "/rustup".to_string());
    config.volumes = vec![
        wfe_containerd::VolumeMountConfig {
            source: cargo_dir.clone(),
            target: "/cargo".to_string(),
            readonly: false,
        },
        wfe_containerd::VolumeMountConfig {
            source: rustup_dir.clone(),
            target: "/rustup".to_string(),
            readonly: false,
        },
    ];
    let mut step = ContainerdStep::new(config);

    let mut wf_step = WorkflowStep::new(0, "containerd");
    wf_step.name = Some("debian-test".to_string());
    let workflow = WorkflowInstance::new("test-wf", 1, serde_json::json!({}));
    let pointer = ExecutionPointer::new(0);
    let ctx = make_context(&wf_step, &workflow, &pointer);

    match step.run(&ctx).await {
        Ok(r) => {
            eprintln!("SUCCESS: {:?}", r.output_data);
        }
        Err(e) => panic!("debian container with volumes failed: {e}"),
    }

    std::fs::remove_dir_all(&cargo_dir).ok();
    std::fs::remove_dir_all(&rustup_dir).ok();
}

// ── Step name defaults to "unknown" when None ────────────────────────

#[tokio::test]
22	wfe-rustlang/Cargo.toml	Normal file
@@ -0,0 +1,22 @@
[package]
name = "wfe-rustlang"
version.workspace = true
edition.workspace = true
license.workspace = true
repository.workspace = true
homepage.workspace = true
description = "Rust toolchain step executors (cargo, rustup) for WFE"

[dependencies]
wfe-core = { workspace = true }
tokio = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
async-trait = { workspace = true }
tracing = { workspace = true }
rustdoc-types = "0.38"

[dev-dependencies]
pretty_assertions = { workspace = true }
tokio = { workspace = true, features = ["test-util", "process"] }
tempfile = { workspace = true }
301	wfe-rustlang/src/cargo/config.rs	Normal file
@@ -0,0 +1,301 @@
use std::collections::HashMap;

use serde::{Deserialize, Serialize};

/// Which cargo subcommand to run.
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
#[serde(rename_all = "kebab-case")]
pub enum CargoCommand {
    Build,
    Test,
    Check,
    Clippy,
    Fmt,
    Doc,
    Publish,
    Audit,
    Deny,
    Nextest,
    LlvmCov,
    DocMdx,
}

impl CargoCommand {
    pub fn as_str(&self) -> &'static str {
        match self {
            Self::Build => "build",
            Self::Test => "test",
            Self::Check => "check",
            Self::Clippy => "clippy",
            Self::Fmt => "fmt",
            Self::Doc => "doc",
            Self::Publish => "publish",
            Self::Audit => "audit",
            Self::Deny => "deny",
            Self::Nextest => "nextest",
            Self::LlvmCov => "llvm-cov",
            Self::DocMdx => "doc-mdx",
        }
    }

    /// Returns the subcommand arg(s) to pass to cargo.
    /// Most commands are a single arg, but nextest needs "nextest run".
    /// DocMdx uses `rustdoc` (the actual cargo subcommand).
    pub fn subcommand_args(&self) -> Vec<&'static str> {
        match self {
            Self::Nextest => vec!["nextest", "run"],
            Self::DocMdx => vec!["rustdoc"],
            other => vec![other.as_str()],
        }
    }

    /// Returns the cargo-install package name if this is an external tool.
    /// Returns `None` for built-in cargo subcommands.
    pub fn install_package(&self) -> Option<&'static str> {
        match self {
            Self::Audit => Some("cargo-audit"),
            Self::Deny => Some("cargo-deny"),
            Self::Nextest => Some("cargo-nextest"),
            Self::LlvmCov => Some("cargo-llvm-cov"),
            _ => None,
        }
    }

    /// Returns the binary name to probe for availability.
    pub fn binary_name(&self) -> Option<&'static str> {
        match self {
            Self::Audit => Some("cargo-audit"),
            Self::Deny => Some("cargo-deny"),
            Self::Nextest => Some("cargo-nextest"),
            Self::LlvmCov => Some("cargo-llvm-cov"),
            _ => None,
        }
    }
}
/// Shared configuration for all cargo step types.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CargoConfig {
    pub command: CargoCommand,
    /// Rust toolchain override (e.g. "nightly", "1.78.0").
    #[serde(default)]
    pub toolchain: Option<String>,
    /// Target package (`-p`).
    #[serde(default)]
    pub package: Option<String>,
    /// Features to enable (`--features`).
    #[serde(default)]
    pub features: Vec<String>,
    /// Enable all features (`--all-features`).
    #[serde(default)]
    pub all_features: bool,
    /// Disable default features (`--no-default-features`).
    #[serde(default)]
    pub no_default_features: bool,
    /// Build in release mode (`--release`).
    #[serde(default)]
    pub release: bool,
    /// Compilation target triple (`--target`).
    #[serde(default)]
    pub target: Option<String>,
    /// Build profile (`--profile`).
    #[serde(default)]
    pub profile: Option<String>,
    /// Additional arguments appended to the command.
    #[serde(default)]
    pub extra_args: Vec<String>,
    /// Environment variables.
    #[serde(default)]
    pub env: HashMap<String, String>,
    /// Working directory.
    #[serde(default)]
    pub working_dir: Option<String>,
    /// Execution timeout in milliseconds.
    #[serde(default)]
    pub timeout_ms: Option<u64>,
    /// Output directory for generated files (e.g., MDX docs).
    #[serde(default)]
    pub output_dir: Option<String>,
}
#[cfg(test)]
mod tests {
    use super::*;
    use pretty_assertions::assert_eq;

    #[test]
    fn serde_round_trip_minimal() {
        let config = CargoConfig {
            command: CargoCommand::Build,
            toolchain: None,
            package: None,
            features: vec![],
            all_features: false,
            no_default_features: false,
            release: false,
            target: None,
            profile: None,
            extra_args: vec![],
            env: HashMap::new(),
            working_dir: None,
            timeout_ms: None,
            output_dir: None,
        };
        let json = serde_json::to_string(&config).unwrap();
        let de: CargoConfig = serde_json::from_str(&json).unwrap();
        assert_eq!(de.command, CargoCommand::Build);
        assert!(de.features.is_empty());
        assert!(!de.release);
    }

    #[test]
    fn serde_round_trip_full() {
        let mut env = HashMap::new();
        env.insert("RUSTFLAGS".to_string(), "-D warnings".to_string());

        let config = CargoConfig {
            command: CargoCommand::Clippy,
            toolchain: Some("nightly".to_string()),
            package: Some("my-crate".to_string()),
            features: vec!["feat1".to_string(), "feat2".to_string()],
            all_features: false,
            no_default_features: true,
            release: true,
            target: Some("x86_64-unknown-linux-gnu".to_string()),
            profile: None,
            extra_args: vec!["--".to_string(), "-D".to_string(), "warnings".to_string()],
            env,
            working_dir: Some("/src".to_string()),
            timeout_ms: Some(60_000),
            output_dir: None,
        };
        let json = serde_json::to_string(&config).unwrap();
        let de: CargoConfig = serde_json::from_str(&json).unwrap();
        assert_eq!(de.command, CargoCommand::Clippy);
        assert_eq!(de.toolchain, Some("nightly".to_string()));
        assert_eq!(de.package, Some("my-crate".to_string()));
        assert_eq!(de.features, vec!["feat1", "feat2"]);
        assert!(de.no_default_features);
        assert!(de.release);
        assert_eq!(de.extra_args, vec!["--", "-D", "warnings"]);
        assert_eq!(de.timeout_ms, Some(60_000));
    }

    #[test]
    fn command_as_str() {
        assert_eq!(CargoCommand::Build.as_str(), "build");
        assert_eq!(CargoCommand::Test.as_str(), "test");
        assert_eq!(CargoCommand::Check.as_str(), "check");
        assert_eq!(CargoCommand::Clippy.as_str(), "clippy");
        assert_eq!(CargoCommand::Fmt.as_str(), "fmt");
        assert_eq!(CargoCommand::Doc.as_str(), "doc");
        assert_eq!(CargoCommand::Publish.as_str(), "publish");
        assert_eq!(CargoCommand::Audit.as_str(), "audit");
        assert_eq!(CargoCommand::Deny.as_str(), "deny");
        assert_eq!(CargoCommand::Nextest.as_str(), "nextest");
        assert_eq!(CargoCommand::LlvmCov.as_str(), "llvm-cov");
        assert_eq!(CargoCommand::DocMdx.as_str(), "doc-mdx");
    }

    #[test]
    fn command_serde_kebab_case() {
        let json = r#""build""#;
        let cmd: CargoCommand = serde_json::from_str(json).unwrap();
        assert_eq!(cmd, CargoCommand::Build);

        let serialized = serde_json::to_string(&CargoCommand::Build).unwrap();
        assert_eq!(serialized, r#""build""#);

        // External tools
        let json = r#""llvm-cov""#;
        let cmd: CargoCommand = serde_json::from_str(json).unwrap();
        assert_eq!(cmd, CargoCommand::LlvmCov);

        let json = r#""nextest""#;
        let cmd: CargoCommand = serde_json::from_str(json).unwrap();
        assert_eq!(cmd, CargoCommand::Nextest);

        let json = r#""doc-mdx""#;
        let cmd: CargoCommand = serde_json::from_str(json).unwrap();
        assert_eq!(cmd, CargoCommand::DocMdx);
    }

    #[test]
    fn subcommand_args_single() {
        assert_eq!(CargoCommand::Build.subcommand_args(), vec!["build"]);
        assert_eq!(CargoCommand::Audit.subcommand_args(), vec!["audit"]);
        assert_eq!(CargoCommand::LlvmCov.subcommand_args(), vec!["llvm-cov"]);
    }

    #[test]
    fn subcommand_args_nextest_has_run() {
        assert_eq!(CargoCommand::Nextest.subcommand_args(), vec!["nextest", "run"]);
    }

    #[test]
    fn subcommand_args_doc_mdx_uses_rustdoc() {
        assert_eq!(CargoCommand::DocMdx.subcommand_args(), vec!["rustdoc"]);
    }

    #[test]
    fn install_package_external_tools() {
        assert_eq!(CargoCommand::Audit.install_package(), Some("cargo-audit"));
        assert_eq!(CargoCommand::Deny.install_package(), Some("cargo-deny"));
        assert_eq!(CargoCommand::Nextest.install_package(), Some("cargo-nextest"));
        assert_eq!(CargoCommand::LlvmCov.install_package(), Some("cargo-llvm-cov"));
    }

    #[test]
    fn install_package_builtin_returns_none() {
        assert_eq!(CargoCommand::Build.install_package(), None);
        assert_eq!(CargoCommand::Test.install_package(), None);
        assert_eq!(CargoCommand::Check.install_package(), None);
        assert_eq!(CargoCommand::Clippy.install_package(), None);
        assert_eq!(CargoCommand::Fmt.install_package(), None);
        assert_eq!(CargoCommand::Doc.install_package(), None);
        assert_eq!(CargoCommand::Publish.install_package(), None);
        assert_eq!(CargoCommand::DocMdx.install_package(), None);
    }

    #[test]
    fn binary_name_external_tools() {
        assert_eq!(CargoCommand::Audit.binary_name(), Some("cargo-audit"));
        assert_eq!(CargoCommand::Deny.binary_name(), Some("cargo-deny"));
        assert_eq!(CargoCommand::Nextest.binary_name(), Some("cargo-nextest"));
        assert_eq!(CargoCommand::LlvmCov.binary_name(), Some("cargo-llvm-cov"));
    }

    #[test]
    fn binary_name_builtin_returns_none() {
        assert_eq!(CargoCommand::Build.binary_name(), None);
        assert_eq!(CargoCommand::Test.binary_name(), None);
    }

    #[test]
    fn config_defaults() {
        let json = r#"{"command": "test"}"#;
        let config: CargoConfig = serde_json::from_str(json).unwrap();
        assert_eq!(config.command, CargoCommand::Test);
        assert!(config.toolchain.is_none());
        assert!(config.package.is_none());
        assert!(config.features.is_empty());
        assert!(!config.all_features);
        assert!(!config.no_default_features);
        assert!(!config.release);
        assert!(config.target.is_none());
        assert!(config.profile.is_none());
        assert!(config.extra_args.is_empty());
        assert!(config.env.is_empty());
        assert!(config.working_dir.is_none());
        assert!(config.timeout_ms.is_none());
        assert!(config.output_dir.is_none());
    }

    #[test]
    fn config_with_output_dir() {
        let json = r#"{"command": "doc-mdx", "output_dir": "docs/api"}"#;
        let config: CargoConfig = serde_json::from_str(json).unwrap();
        assert_eq!(config.command, CargoCommand::DocMdx);
        assert_eq!(config.output_dir, Some("docs/api".to_string()));
    }
}
5	wfe-rustlang/src/cargo/mod.rs	Normal file
@@ -0,0 +1,5 @@
pub mod config;
pub mod step;

pub use config::{CargoCommand, CargoConfig};
pub use step::CargoStep;
532	wfe-rustlang/src/cargo/step.rs	Normal file
@@ -0,0 +1,532 @@
use async_trait::async_trait;
use wfe_core::models::ExecutionResult;
use wfe_core::traits::step::{StepBody, StepExecutionContext};
use wfe_core::WfeError;

use crate::cargo::config::{CargoCommand, CargoConfig};

pub struct CargoStep {
    config: CargoConfig,
}

impl CargoStep {
    pub fn new(config: CargoConfig) -> Self {
        Self { config }
    }

    pub fn build_command(&self) -> tokio::process::Command {
        // DocMdx requires nightly for --output-format json.
        let toolchain = if matches!(self.config.command, CargoCommand::DocMdx) {
            Some(self.config.toolchain.as_deref().unwrap_or("nightly"))
        } else {
            self.config.toolchain.as_deref()
        };

        let mut cmd = if let Some(tc) = toolchain {
            let mut c = tokio::process::Command::new("rustup");
            c.args(["run", tc, "cargo"]);
            c
        } else {
            tokio::process::Command::new("cargo")
        };

        for arg in self.config.command.subcommand_args() {
            cmd.arg(arg);
        }

        if let Some(ref pkg) = self.config.package {
            cmd.args(["-p", pkg]);
        }

        if !self.config.features.is_empty() {
            cmd.args(["--features", &self.config.features.join(",")]);
        }

        if self.config.all_features {
            cmd.arg("--all-features");
        }

        if self.config.no_default_features {
            cmd.arg("--no-default-features");
        }

        if self.config.release {
            cmd.arg("--release");
        }

        if let Some(ref target) = self.config.target {
            cmd.args(["--target", target]);
        }

        if let Some(ref profile) = self.config.profile {
            cmd.args(["--profile", profile]);
        }

        for arg in &self.config.extra_args {
            cmd.arg(arg);
        }

        // DocMdx appends rustdoc-specific flags after user extra_args.
        if matches!(self.config.command, CargoCommand::DocMdx) {
            cmd.args(["--", "-Z", "unstable-options", "--output-format", "json"]);
        }

        for (key, value) in &self.config.env {
            cmd.env(key, value);
        }

        if let Some(ref dir) = self.config.working_dir {
            cmd.current_dir(dir);
        }

        cmd.stdout(std::process::Stdio::piped());
        cmd.stderr(std::process::Stdio::piped());

        cmd
    }
    /// Ensures an external cargo tool is installed before running it.
    /// For built-in cargo subcommands, this is a no-op.
    async fn ensure_tool_available(&self) -> Result<(), WfeError> {
        let (binary, package) =
            match (self.config.command.binary_name(), self.config.command.install_package()) {
                (Some(b), Some(p)) => (b, p),
                _ => return Ok(()),
            };

        // Probe for the binary.
        let probe = tokio::process::Command::new(binary)
            .arg("--version")
            .stdout(std::process::Stdio::null())
            .stderr(std::process::Stdio::null())
            .status()
            .await;

        if let Ok(status) = probe {
            if status.success() {
                return Ok(());
            }
        }

        tracing::info!(package = package, "cargo tool not found, installing");

        // For llvm-cov, ensure the rustup component is present first.
        if matches!(self.config.command, CargoCommand::LlvmCov) {
            let component = tokio::process::Command::new("rustup")
                .args(["component", "add", "llvm-tools-preview"])
                .stdout(std::process::Stdio::piped())
                .stderr(std::process::Stdio::piped())
                .output()
                .await
                .map_err(|e| WfeError::StepExecution(format!(
                    "Failed to add llvm-tools-preview component: {e}"
                )))?;

            if !component.status.success() {
                let stderr = String::from_utf8_lossy(&component.stderr);
                return Err(WfeError::StepExecution(format!(
                    "Failed to add llvm-tools-preview component: {stderr}"
                )));
            }
        }

        let install = tokio::process::Command::new("cargo")
            .args(["install", package])
            .stdout(std::process::Stdio::piped())
            .stderr(std::process::Stdio::piped())
            .output()
            .await
            .map_err(|e| WfeError::StepExecution(format!(
                "Failed to install {package}: {e}"
            )))?;

        if !install.status.success() {
            let stderr = String::from_utf8_lossy(&install.stderr);
            return Err(WfeError::StepExecution(format!(
                "Failed to install {package}: {stderr}"
            )));
        }

        tracing::info!(package = package, "cargo tool installed successfully");
        Ok(())
    }
    /// Post-process rustdoc JSON output into MDX files.
    fn transform_rustdoc_json(
        &self,
        outputs: &mut serde_json::Map<String, serde_json::Value>,
    ) -> Result<(), WfeError> {
        use crate::rustdoc::transformer::{transform_to_mdx, write_mdx_files};

        // Find the JSON file in target/doc/.
        let working_dir = self.config.working_dir.as_deref().unwrap_or(".");
        let doc_dir = std::path::Path::new(working_dir).join("target/doc");

        let json_path = std::fs::read_dir(&doc_dir)
            .map_err(|e| WfeError::StepExecution(format!(
                "failed to read target/doc: {e}"
            )))?
            .filter_map(|entry| entry.ok())
            .find(|entry| {
                entry.path().extension().is_some_and(|ext| ext == "json")
            })
            .map(|entry| entry.path())
            .ok_or_else(|| WfeError::StepExecution(
                "no JSON file found in target/doc/ — did rustdoc --output-format json succeed?".to_string()
            ))?;

        tracing::info!(path = %json_path.display(), "reading rustdoc JSON");

        let json_content = std::fs::read_to_string(&json_path).map_err(|e| {
            WfeError::StepExecution(format!("failed to read {}: {e}", json_path.display()))
        })?;

        let krate: rustdoc_types::Crate = serde_json::from_str(&json_content).map_err(|e| {
            WfeError::StepExecution(format!("failed to parse rustdoc JSON: {e}"))
        })?;

        let mdx_files = transform_to_mdx(&krate);

        let output_dir = self.config.output_dir
            .as_deref()
            .unwrap_or("target/doc/mdx");
        let output_path = std::path::Path::new(working_dir).join(output_dir);

        write_mdx_files(&mdx_files, &output_path).map_err(|e| {
            WfeError::StepExecution(format!("failed to write MDX files: {e}"))
        })?;

        let file_count = mdx_files.len();
        tracing::info!(
            output_dir = %output_path.display(),
            file_count,
            "generated MDX documentation"
        );

        outputs.insert(
            "mdx.output_dir".to_string(),
            serde_json::Value::String(output_path.to_string_lossy().to_string()),
        );
        outputs.insert(
            "mdx.file_count".to_string(),
            serde_json::Value::Number(file_count.into()),
        );
        let file_paths: Vec<_> = mdx_files.iter().map(|f| f.path.clone()).collect();
        outputs.insert(
            "mdx.files".to_string(),
            serde_json::Value::Array(
                file_paths.into_iter().map(serde_json::Value::String).collect(),
            ),
        );

        Ok(())
    }
}
#[async_trait]
impl StepBody for CargoStep {
    async fn run(&mut self, context: &StepExecutionContext<'_>) -> wfe_core::Result<ExecutionResult> {
        let step_name = context.step.name.as_deref().unwrap_or("unknown");
        let subcmd = self.config.command.as_str();

        // Ensure external tools are installed before running.
        self.ensure_tool_available().await?;

        tracing::info!(step = step_name, command = subcmd, "running cargo");

        let mut cmd = self.build_command();

        let output = if let Some(timeout_ms) = self.config.timeout_ms {
            let duration = std::time::Duration::from_millis(timeout_ms);
            match tokio::time::timeout(duration, cmd.output()).await {
                Ok(result) => result.map_err(|e| {
                    WfeError::StepExecution(format!("Failed to spawn cargo {subcmd}: {e}"))
                })?,
                Err(_) => {
                    return Err(WfeError::StepExecution(format!(
                        "cargo {subcmd} timed out after {timeout_ms}ms"
                    )));
                }
            }
        } else {
            cmd.output()
                .await
                .map_err(|e| WfeError::StepExecution(format!("Failed to spawn cargo {subcmd}: {e}")))?
        };

        let stdout = String::from_utf8_lossy(&output.stdout).to_string();
        let stderr = String::from_utf8_lossy(&output.stderr).to_string();

        if !output.status.success() {
            let code = output.status.code().unwrap_or(-1);
            return Err(WfeError::StepExecution(format!(
                "cargo {subcmd} exited with code {code}\nstdout: {stdout}\nstderr: {stderr}"
            )));
        }

        let mut outputs = serde_json::Map::new();
        outputs.insert(
            format!("{step_name}.stdout"),
            serde_json::Value::String(stdout),
        );
        outputs.insert(
            format!("{step_name}.stderr"),
            serde_json::Value::String(stderr),
        );

        // DocMdx post-processing: transform rustdoc JSON → MDX files.
        if matches!(self.config.command, CargoCommand::DocMdx) {
            self.transform_rustdoc_json(&mut outputs)?;
        }

        Ok(ExecutionResult {
            proceed: true,
            output_data: Some(serde_json::Value::Object(outputs)),
            ..Default::default()
        })
    }
}
#[cfg(test)]
mod tests {
    use super::*;
    use crate::cargo::config::{CargoCommand, CargoConfig};
    use std::collections::HashMap;

    fn minimal_config(command: CargoCommand) -> CargoConfig {
        CargoConfig {
            command,
            toolchain: None,
            package: None,
            features: vec![],
            all_features: false,
            no_default_features: false,
            release: false,
            target: None,
            profile: None,
            extra_args: vec![],
            env: HashMap::new(),
            working_dir: None,
            timeout_ms: None,
            output_dir: None,
        }
    }

    #[test]
    fn build_command_minimal() {
        let step = CargoStep::new(minimal_config(CargoCommand::Build));
        let cmd = step.build_command();
        let prog = cmd.as_std().get_program().to_str().unwrap();
        assert_eq!(prog, "cargo");
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["build"]);
    }

    #[test]
    fn build_command_with_toolchain() {
        let mut config = minimal_config(CargoCommand::Test);
        config.toolchain = Some("nightly".to_string());
        let step = CargoStep::new(config);
        let cmd = step.build_command();
        let prog = cmd.as_std().get_program().to_str().unwrap();
        assert_eq!(prog, "rustup");
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["run", "nightly", "cargo", "test"]);
    }

    #[test]
    fn build_command_with_package_and_features() {
        let mut config = minimal_config(CargoCommand::Check);
        config.package = Some("my-crate".to_string());
        config.features = vec!["feat1".to_string(), "feat2".to_string()];
        let step = CargoStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["check", "-p", "my-crate", "--features", "feat1,feat2"]);
    }

    #[test]
    fn build_command_release_and_target() {
        let mut config = minimal_config(CargoCommand::Build);
        config.release = true;
        config.target = Some("aarch64-unknown-linux-gnu".to_string());
        let step = CargoStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["build", "--release", "--target", "aarch64-unknown-linux-gnu"]);
    }

    #[test]
    fn build_command_all_flags() {
        let mut config = minimal_config(CargoCommand::Clippy);
        config.all_features = true;
        config.no_default_features = true;
        config.profile = Some("dev".to_string());
        config.extra_args = vec!["--".to_string(), "-D".to_string(), "warnings".to_string()];
        let step = CargoStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(
            args,
            vec!["clippy", "--all-features", "--no-default-features", "--profile", "dev", "--", "-D", "warnings"]
        );
    }

    #[test]
    fn build_command_fmt() {
        let mut config = minimal_config(CargoCommand::Fmt);
        config.extra_args = vec!["--check".to_string()];
        let step = CargoStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["fmt", "--check"]);
    }

    #[test]
    fn build_command_publish_dry_run() {
        let mut config = minimal_config(CargoCommand::Publish);
        config.extra_args = vec!["--dry-run".to_string(), "--registry".to_string(), "my-reg".to_string()];
        let step = CargoStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["publish", "--dry-run", "--registry", "my-reg"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_doc() {
|
||||
let mut config = minimal_config(CargoCommand::Doc);
|
||||
config.extra_args = vec!["--no-deps".to_string()];
|
||||
config.release = true;
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(args, vec!["doc", "--release", "--no-deps"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_env_vars() {
|
||||
let mut config = minimal_config(CargoCommand::Build);
|
||||
config.env.insert("RUSTFLAGS".to_string(), "-D warnings".to_string());
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
let envs: Vec<_> = cmd.as_std().get_envs().collect();
|
||||
assert!(envs.iter().any(|(k, v)| *k == "RUSTFLAGS" && v == &Some("-D warnings".as_ref())));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_working_dir() {
|
||||
let mut config = minimal_config(CargoCommand::Test);
|
||||
config.working_dir = Some("/my/project".to_string());
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
assert_eq!(cmd.as_std().get_current_dir(), Some(std::path::Path::new("/my/project")));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_audit() {
|
||||
let step = CargoStep::new(minimal_config(CargoCommand::Audit));
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(args, vec!["audit"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_deny() {
|
||||
let mut config = minimal_config(CargoCommand::Deny);
|
||||
config.extra_args = vec!["check".to_string()];
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(args, vec!["deny", "check"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_nextest() {
|
||||
let step = CargoStep::new(minimal_config(CargoCommand::Nextest));
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(args, vec!["nextest", "run"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_nextest_with_features() {
|
||||
let mut config = minimal_config(CargoCommand::Nextest);
|
||||
config.features = vec!["feat1".to_string()];
|
||||
config.extra_args = vec!["--no-fail-fast".to_string()];
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(args, vec!["nextest", "run", "--features", "feat1", "--no-fail-fast"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_llvm_cov() {
|
||||
let step = CargoStep::new(minimal_config(CargoCommand::LlvmCov));
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(args, vec!["llvm-cov"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_llvm_cov_with_args() {
|
||||
let mut config = minimal_config(CargoCommand::LlvmCov);
|
||||
config.extra_args = vec!["--html".to_string(), "--output-dir".to_string(), "coverage".to_string()];
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(args, vec!["llvm-cov", "--html", "--output-dir", "coverage"]);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_doc_mdx_forces_nightly() {
|
||||
let step = CargoStep::new(minimal_config(CargoCommand::DocMdx));
|
||||
let cmd = step.build_command();
|
||||
let prog = cmd.as_std().get_program().to_str().unwrap();
|
||||
assert_eq!(prog, "rustup");
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(
|
||||
args,
|
||||
vec!["run", "nightly", "cargo", "rustdoc", "--", "-Z", "unstable-options", "--output-format", "json"]
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_doc_mdx_with_package() {
|
||||
let mut config = minimal_config(CargoCommand::DocMdx);
|
||||
config.package = Some("my-crate".to_string());
|
||||
config.extra_args = vec!["--no-deps".to_string()];
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert_eq!(
|
||||
args,
|
||||
vec!["run", "nightly", "cargo", "rustdoc", "-p", "my-crate", "--no-deps", "--", "-Z", "unstable-options", "--output-format", "json"]
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn build_command_doc_mdx_custom_toolchain() {
|
||||
let mut config = minimal_config(CargoCommand::DocMdx);
|
||||
config.toolchain = Some("nightly-2024-06-01".to_string());
|
||||
let step = CargoStep::new(config);
|
||||
let cmd = step.build_command();
|
||||
let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
|
||||
assert!(args.contains(&"nightly-2024-06-01"));
|
||||
}
|
||||
|
||||
#[tokio::test]
|
||||
async fn ensure_tool_builtin_is_noop() {
|
||||
let step = CargoStep::new(minimal_config(CargoCommand::Build));
|
||||
// Should return Ok immediately for built-in commands.
|
||||
step.ensure_tool_available().await.unwrap();
|
||||
}
|
||||
|
||||
#[tokio::test]
|
||||
async fn ensure_tool_already_installed_succeeds() {
|
||||
// cargo-audit/nextest/etc may or may not be installed,
|
||||
// but we can test with a known-installed tool: cargo itself
|
||||
// is always available. Test the flow by verifying the
|
||||
// built-in path returns Ok.
|
||||
let step = CargoStep::new(minimal_config(CargoCommand::Check));
|
||||
step.ensure_tool_available().await.unwrap();
|
||||
}
|
||||
}
|
||||
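The tests above assert a toolchain-wrapping pattern: with no toolchain configured, the step runs `cargo <subcommand>` directly; with one configured, it wraps the invocation as `rustup run <toolchain> cargo <subcommand>`. Below is a standalone sketch of that pattern using only `std::process::Command` — `build_command` here is an illustrative stand-in, not the actual `CargoStep` implementation from this diff.

```rust
use std::process::Command;

// Illustrative sketch of the wrapping the tests assert: no toolchain means
// plain `cargo <subcommand>`; a toolchain means `rustup run <tc> cargo <subcommand>`.
fn build_command(toolchain: Option<&str>, subcommand: &str, extra_args: &[&str]) -> Command {
    let mut cmd = match toolchain {
        Some(tc) => {
            let mut c = Command::new("rustup");
            c.args(["run", tc, "cargo", subcommand]);
            c
        }
        None => {
            let mut c = Command::new("cargo");
            c.arg(subcommand);
            c
        }
    };
    cmd.args(extra_args);
    cmd
}

fn main() {
    let cmd = build_command(Some("nightly"), "test", &[]);
    // Inspect the assembled command without spawning it.
    let args: Vec<_> = cmd.get_args().map(|a| a.to_str().unwrap()).collect();
    assert_eq!(cmd.get_program().to_str().unwrap(), "rustup");
    assert_eq!(args, vec!["run", "nightly", "cargo", "test"]);
}
```

Inspecting the command via `get_program()`/`get_args()` instead of spawning it is what lets these tests run without cargo or rustup present.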
wfe-rustlang/src/lib.rs (new file, 6 lines)
@@ -0,0 +1,6 @@
pub mod cargo;
pub mod rustdoc;
pub mod rustup;

pub use cargo::{CargoCommand, CargoConfig, CargoStep};
pub use rustup::{RustupCommand, RustupConfig, RustupStep};
wfe-rustlang/src/rustdoc/mod.rs (new file, 3 lines)
@@ -0,0 +1,3 @@
pub mod transformer;

pub use transformer::transform_to_mdx;
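Before the full `transformer.rs` diff below, here is a standalone sketch (not part of the diff) of the path mapping its `transform_to_mdx`/`resolve_module_path` perform: an item's rustdoc path minus its own name becomes the module path, `::` becomes `/` in the generated file name, and an empty module path lands in `index.mdx`. The helper names here are hypothetical.

```rust
// Mirror of resolve_module_path: drop the item's own name to get its module.
fn module_path_of(item_path: &[&str]) -> String {
    if item_path.len() > 1 {
        item_path[..item_path.len() - 1].join("::")
    } else if !item_path.is_empty() {
        item_path[0].to_string()
    } else {
        String::new()
    }
}

// Mirror of the file-naming step in transform_to_mdx.
fn mdx_file_for(item_path: &[&str]) -> String {
    let module_path = module_path_of(item_path);
    if module_path.is_empty() {
        "index.mdx".to_string()
    } else {
        format!("{}.mdx", module_path.replace("::", "/"))
    }
}

fn main() {
    // "my_crate::hello" lands in my_crate.mdx, matching the
    // transform_crate_with_function test in transformer.rs.
    assert_eq!(mdx_file_for(&["my_crate", "hello"]), "my_crate.mdx");
    // Items with no resolved module path land in index.mdx.
    assert_eq!(mdx_file_for(&[]), "index.mdx");
}
```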
wfe-rustlang/src/rustdoc/transformer.rs (new file, 847 lines)
@@ -0,0 +1,847 @@
use std::collections::HashMap;
use std::path::Path;

use rustdoc_types::{Crate, Id, Item, ItemEnum, Type};

/// A generated MDX file with its relative path and content.
#[derive(Debug, Clone)]
pub struct MdxFile {
    /// Relative path (e.g., `my_crate/utils.mdx`).
    pub path: String,
    /// MDX content.
    pub content: String,
}

/// Transform a rustdoc JSON `Crate` into a set of MDX files.
///
/// Generates one MDX file per module, with all items in that module
/// grouped by kind (structs, enums, functions, traits, etc.).
pub fn transform_to_mdx(krate: &Crate) -> Vec<MdxFile> {
    let mut files = Vec::new();
    let mut module_items: HashMap<String, Vec<(&Item, &str)>> = HashMap::new();

    for (id, item) in &krate.index {
        let module_path = resolve_module_path(krate, id);
        let kind_label = item_kind_label(&item.inner);
        if let Some(label) = kind_label {
            module_items
                .entry(module_path)
                .or_default()
                .push((item, label));
        }
    }

    let mut paths: Vec<_> = module_items.keys().cloned().collect();
    paths.sort();

    for module_path in paths {
        let items = &module_items[&module_path];
        let content = render_module(&module_path, items, krate);
        let file_path = if module_path.is_empty() {
            "index.mdx".to_string()
        } else {
            format!("{}.mdx", module_path.replace("::", "/"))
        };
        files.push(MdxFile {
            path: file_path,
            content,
        });
    }

    files
}
/// Write MDX files to the output directory.
pub fn write_mdx_files(files: &[MdxFile], output_dir: &Path) -> std::io::Result<()> {
    for file in files {
        let path = output_dir.join(&file.path);
        if let Some(parent) = path.parent() {
            std::fs::create_dir_all(parent)?;
        }
        std::fs::write(&path, &file.content)?;
    }
    Ok(())
}

fn resolve_module_path(krate: &Crate, id: &Id) -> String {
    if let Some(summary) = krate.paths.get(id) {
        let path = &summary.path;
        if path.len() > 1 {
            path[..path.len() - 1].join("::")
        } else if !path.is_empty() {
            path[0].clone()
        } else {
            String::new()
        }
    } else {
        String::new()
    }
}
fn item_kind_label(inner: &ItemEnum) -> Option<&'static str> {
    match inner {
        ItemEnum::Module(_) => Some("Modules"),
        ItemEnum::Struct(_) => Some("Structs"),
        ItemEnum::Enum(_) => Some("Enums"),
        ItemEnum::Function(_) => Some("Functions"),
        ItemEnum::Trait(_) => Some("Traits"),
        ItemEnum::TypeAlias(_) => Some("Type Aliases"),
        ItemEnum::Constant { .. } => Some("Constants"),
        ItemEnum::Static(_) => Some("Statics"),
        ItemEnum::Macro(_) => Some("Macros"),
        _ => None,
    }
}
fn render_module(module_path: &str, items: &[(&Item, &str)], krate: &Crate) -> String {
    let mut out = String::new();

    let title = if module_path.is_empty() {
        krate
            .index
            .get(&krate.root)
            .and_then(|i| i.name.clone())
            .unwrap_or_else(|| "crate".to_string())
    } else {
        module_path.to_string()
    };

    let description = if module_path.is_empty() {
        krate
            .index
            .get(&krate.root)
            .and_then(|i| i.docs.as_ref())
            .map(|d| first_sentence(d))
            .unwrap_or_default()
    } else {
        items
            .iter()
            .find(|(item, kind)| {
                *kind == "Modules"
                    && item.name.as_deref() == module_path.split("::").last()
            })
            .and_then(|(item, _)| item.docs.as_ref())
            .map(|d| first_sentence(d))
            .unwrap_or_default()
    };

    out.push_str(&format!(
        "---\ntitle: \"{title}\"\ndescription: \"{}\"\n---\n\n",
        description.replace('"', "\\\"")
    ));

    let mut by_kind: HashMap<&str, Vec<&Item>> = HashMap::new();
    for (item, kind) in items {
        by_kind.entry(kind).or_default().push(item);
    }

    let kind_order = [
        "Modules", "Structs", "Enums", "Traits", "Functions",
        "Type Aliases", "Constants", "Statics", "Macros",
    ];

    for kind in &kind_order {
        if let Some(kind_items) = by_kind.get(kind) {
            let mut sorted: Vec<_> = kind_items.iter().collect();
            sorted.sort_by_key(|item| &item.name);

            out.push_str(&format!("## {kind}\n\n"));

            for item in sorted {
                render_item(&mut out, item, krate);
            }
        }
    }

    out
}
fn render_item(out: &mut String, item: &Item, krate: &Crate) {
    let name = item.name.as_deref().unwrap_or("_");

    out.push_str(&format!("### `{name}`\n\n"));

    if let Some(sig) = render_signature(item, krate) {
        out.push_str("```rust\n");
        out.push_str(&sig);
        out.push('\n');
        out.push_str("```\n\n");
    }

    if let Some(ref docs) = item.docs {
        out.push_str(docs);
        out.push_str("\n\n");
    }
}
fn render_signature(item: &Item, krate: &Crate) -> Option<String> {
    let name = item.name.as_deref()?;
    match &item.inner {
        ItemEnum::Function(f) => {
            let mut sig = String::new();
            if f.header.is_const {
                sig.push_str("const ");
            }
            if f.header.is_async {
                sig.push_str("async ");
            }
            if f.header.is_unsafe {
                sig.push_str("unsafe ");
            }
            sig.push_str("fn ");
            sig.push_str(name);
            if !f.generics.params.is_empty() {
                sig.push('<');
                let params: Vec<_> = f.generics.params.iter().map(|p| p.name.clone()).collect();
                sig.push_str(&params.join(", "));
                sig.push('>');
            }
            sig.push('(');
            let params: Vec<_> = f
                .sig
                .inputs
                .iter()
                .map(|(pname, ty)| format!("{pname}: {}", render_type(ty, krate)))
                .collect();
            sig.push_str(&params.join(", "));
            sig.push(')');
            if let Some(ref output) = f.sig.output {
                sig.push_str(&format!(" -> {}", render_type(output, krate)));
            }
            Some(sig)
        }
        ItemEnum::Struct(s) => {
            let mut sig = String::from("pub struct ");
            sig.push_str(name);
            if !s.generics.params.is_empty() {
                sig.push('<');
                let params: Vec<_> = s.generics.params.iter().map(|p| p.name.clone()).collect();
                sig.push_str(&params.join(", "));
                sig.push('>');
            }
            match &s.kind {
                rustdoc_types::StructKind::Unit => sig.push(';'),
                rustdoc_types::StructKind::Tuple(_) => sig.push_str("(...)"),
                rustdoc_types::StructKind::Plain { fields, .. } => {
                    sig.push_str(" { ");
                    let field_names: Vec<_> = fields
                        .iter()
                        .filter_map(|fid| krate.index.get(fid))
                        .filter_map(|f| f.name.as_deref())
                        .collect();
                    sig.push_str(&field_names.join(", "));
                    sig.push_str(" }");
                }
            }
            Some(sig)
        }
        ItemEnum::Enum(e) => {
            let mut sig = String::from("pub enum ");
            sig.push_str(name);
            if !e.generics.params.is_empty() {
                sig.push('<');
                let params: Vec<_> = e.generics.params.iter().map(|p| p.name.clone()).collect();
                sig.push_str(&params.join(", "));
                sig.push('>');
            }
            sig.push_str(" { ");
            let variant_names: Vec<_> = e
                .variants
                .iter()
                .filter_map(|vid| krate.index.get(vid))
                .filter_map(|v| v.name.as_deref())
                .collect();
            sig.push_str(&variant_names.join(", "));
            sig.push_str(" }");
            Some(sig)
        }
        ItemEnum::Trait(t) => {
            let mut sig = String::from("pub trait ");
            sig.push_str(name);
            if !t.generics.params.is_empty() {
                sig.push('<');
                let params: Vec<_> = t.generics.params.iter().map(|p| p.name.clone()).collect();
                sig.push_str(&params.join(", "));
                sig.push('>');
            }
            Some(sig)
        }
        ItemEnum::TypeAlias(ta) => {
            Some(format!("pub type {name} = {}", render_type(&ta.type_, krate)))
        }
        ItemEnum::Constant { type_, const_: c } => {
            Some(format!(
                "pub const {name}: {} = {}",
                render_type(type_, krate),
                c.value.as_deref().unwrap_or("...")
            ))
        }
        ItemEnum::Macro(_) => Some(format!("macro_rules! {name} {{ ... }}")),
        _ => None,
    }
}
fn render_type(ty: &Type, krate: &Crate) -> String {
    match ty {
        Type::ResolvedPath(p) => {
            let mut s = p.path.clone();
            if let Some(ref args) = p.args {
                if let rustdoc_types::GenericArgs::AngleBracketed { args, .. } = args.as_ref() {
                    if !args.is_empty() {
                        s.push('<');
                        let rendered: Vec<_> = args
                            .iter()
                            .map(|a| match a {
                                rustdoc_types::GenericArg::Type(t) => render_type(t, krate),
                                rustdoc_types::GenericArg::Lifetime(l) => l.clone(),
                                rustdoc_types::GenericArg::Const(c) => {
                                    c.value.clone().unwrap_or_else(|| c.expr.clone())
                                }
                                rustdoc_types::GenericArg::Infer => "_".to_string(),
                            })
                            .collect();
                        s.push_str(&rendered.join(", "));
                        s.push('>');
                    }
                }
            }
            s
        }
        Type::Generic(name) => name.clone(),
        Type::Primitive(name) => name.clone(),
        Type::BorrowedRef { lifetime, is_mutable, type_ } => {
            let mut s = String::from("&");
            if let Some(lt) = lifetime {
                s.push_str(lt);
                s.push(' ');
            }
            if *is_mutable {
                s.push_str("mut ");
            }
            s.push_str(&render_type(type_, krate));
            s
        }
        Type::Tuple(types) => {
            let inner: Vec<_> = types.iter().map(|t| render_type(t, krate)).collect();
            format!("({})", inner.join(", "))
        }
        Type::Slice(ty) => format!("[{}]", render_type(ty, krate)),
        Type::Array { type_, len } => format!("[{}; {}]", render_type(type_, krate), len),
        Type::RawPointer { is_mutable, type_ } => {
            if *is_mutable {
                format!("*mut {}", render_type(type_, krate))
            } else {
                format!("*const {}", render_type(type_, krate))
            }
        }
        Type::ImplTrait(bounds) => {
            let rendered: Vec<_> = bounds
                .iter()
                .filter_map(|b| match b {
                    rustdoc_types::GenericBound::TraitBound { trait_, .. } => {
                        Some(trait_.path.clone())
                    }
                    _ => None,
                })
                .collect();
            format!("impl {}", rendered.join(" + "))
        }
        Type::QualifiedPath { name, self_type, trait_, .. } => {
            let self_str = render_type(self_type, krate);
            if let Some(t) = trait_ {
                format!("<{self_str} as {}>::{name}", t.path)
            } else {
                format!("{self_str}::{name}")
            }
        }
        Type::DynTrait(dt) => {
            let traits: Vec<_> = dt.traits.iter().map(|pb| pb.trait_.path.clone()).collect();
            format!("dyn {}", traits.join(" + "))
        }
        Type::FunctionPointer(fp) => {
            let params: Vec<_> = fp
                .sig
                .inputs
                .iter()
                .map(|(_, t)| render_type(t, krate))
                .collect();
            let ret = fp
                .sig
                .output
                .as_ref()
                .map(|t| format!(" -> {}", render_type(t, krate)))
                .unwrap_or_default();
            format!("fn({}){ret}", params.join(", "))
        }
        Type::Pat { type_, .. } => render_type(type_, krate),
        Type::Infer => "_".to_string(),
    }
}
fn first_sentence(docs: &str) -> String {
    docs.split('\n')
        .next()
        .unwrap_or("")
        .trim()
        .trim_end_matches('.')
        .to_string()
}
#[cfg(test)]
mod tests {
    use super::*;
    use rustdoc_types::*;

    fn empty_crate() -> Crate {
        Crate {
            root: Id(0),
            crate_version: Some("0.1.0".to_string()),
            includes_private: false,
            index: HashMap::new(),
            paths: HashMap::new(),
            external_crates: HashMap::new(),
            format_version: 38,
        }
    }

    fn make_function(name: &str, params: Vec<(&str, Type)>, output: Option<Type>) -> Item {
        Item {
            id: Id(1),
            crate_id: 0,
            name: Some(name.to_string()),
            span: None,
            visibility: Visibility::Public,
            docs: Some(format!("Documentation for {name}.")),
            links: HashMap::new(),
            attrs: vec![],
            deprecation: None,
            inner: ItemEnum::Function(Function {
                sig: FunctionSignature {
                    inputs: params.into_iter().map(|(n, t)| (n.to_string(), t)).collect(),
                    output,
                    is_c_variadic: false,
                },
                generics: Generics { params: vec![], where_predicates: vec![] },
                header: FunctionHeader {
                    is_const: false,
                    is_unsafe: false,
                    is_async: false,
                    abi: Abi::Rust,
                },
                has_body: true,
            }),
        }
    }
    fn make_struct(name: &str) -> Item {
        Item {
            id: Id(2),
            crate_id: 0,
            name: Some(name.to_string()),
            span: None,
            visibility: Visibility::Public,
            docs: Some(format!("A {name} struct.")),
            links: HashMap::new(),
            attrs: vec![],
            deprecation: None,
            inner: ItemEnum::Struct(Struct {
                kind: StructKind::Unit,
                generics: Generics { params: vec![], where_predicates: vec![] },
                impls: vec![],
            }),
        }
    }

    fn make_enum(name: &str) -> Item {
        Item {
            id: Id(3),
            crate_id: 0,
            name: Some(name.to_string()),
            span: None,
            visibility: Visibility::Public,
            docs: Some(format!("The {name} enum.")),
            links: HashMap::new(),
            attrs: vec![],
            deprecation: None,
            inner: ItemEnum::Enum(Enum {
                generics: Generics { params: vec![], where_predicates: vec![] },
                variants: vec![],
                has_stripped_variants: false,
                impls: vec![],
            }),
        }
    }

    fn make_trait(name: &str) -> Item {
        Item {
            id: Id(4),
            crate_id: 0,
            name: Some(name.to_string()),
            span: None,
            visibility: Visibility::Public,
            docs: Some(format!("The {name} trait.")),
            links: HashMap::new(),
            attrs: vec![],
            deprecation: None,
            inner: ItemEnum::Trait(Trait {
                is_auto: false,
                is_unsafe: false,
                is_dyn_compatible: true,
                items: vec![],
                generics: Generics { params: vec![], where_predicates: vec![] },
                bounds: vec![],
                implementations: vec![],
            }),
        }
    }
    #[test]
    fn first_sentence_basic() {
        assert_eq!(first_sentence("Hello world."), "Hello world");
        assert_eq!(first_sentence("First line.\nSecond line."), "First line");
        assert_eq!(first_sentence(""), "");
    }

    #[test]
    fn render_type_primitives() {
        let krate = empty_crate();
        assert_eq!(render_type(&Type::Primitive("u32".into()), &krate), "u32");
        assert_eq!(render_type(&Type::Primitive("bool".into()), &krate), "bool");
    }

    #[test]
    fn render_type_generic() {
        let krate = empty_crate();
        assert_eq!(render_type(&Type::Generic("T".into()), &krate), "T");
    }

    #[test]
    fn render_type_reference() {
        let krate = empty_crate();
        let ty = Type::BorrowedRef {
            lifetime: Some("'a".into()),
            is_mutable: false,
            type_: Box::new(Type::Primitive("str".into())),
        };
        assert_eq!(render_type(&ty, &krate), "&'a str");
    }

    #[test]
    fn render_type_mut_reference() {
        let krate = empty_crate();
        let ty = Type::BorrowedRef {
            lifetime: None,
            is_mutable: true,
            type_: Box::new(Type::Primitive("u8".into())),
        };
        assert_eq!(render_type(&ty, &krate), "&mut u8");
    }

    #[test]
    fn render_type_tuple() {
        let krate = empty_crate();
        let ty = Type::Tuple(vec![Type::Primitive("u32".into()), Type::Primitive("String".into())]);
        assert_eq!(render_type(&ty, &krate), "(u32, String)");
    }

    #[test]
    fn render_type_slice() {
        let krate = empty_crate();
        assert_eq!(render_type(&Type::Slice(Box::new(Type::Primitive("u8".into()))), &krate), "[u8]");
    }

    #[test]
    fn render_type_array() {
        let krate = empty_crate();
        let ty = Type::Array { type_: Box::new(Type::Primitive("u8".into())), len: "32".into() };
        assert_eq!(render_type(&ty, &krate), "[u8; 32]");
    }

    #[test]
    fn render_type_raw_pointer() {
        let krate = empty_crate();
        let ty = Type::RawPointer { is_mutable: true, type_: Box::new(Type::Primitive("u8".into())) };
        assert_eq!(render_type(&ty, &krate), "*mut u8");
    }
    #[test]
    fn render_function_signature() {
        let krate = empty_crate();
        let item = make_function(
            "add",
            vec![("a", Type::Primitive("u32".into())), ("b", Type::Primitive("u32".into()))],
            Some(Type::Primitive("u32".into())),
        );
        assert_eq!(render_signature(&item, &krate).unwrap(), "fn add(a: u32, b: u32) -> u32");
    }

    #[test]
    fn render_function_no_return() {
        let krate = empty_crate();
        let item = make_function("do_thing", vec![], None);
        assert_eq!(render_signature(&item, &krate).unwrap(), "fn do_thing()");
    }

    #[test]
    fn render_struct_signature() {
        let krate = empty_crate();
        assert_eq!(render_signature(&make_struct("MyStruct"), &krate).unwrap(), "pub struct MyStruct;");
    }

    #[test]
    fn render_enum_signature() {
        let krate = empty_crate();
        assert_eq!(render_signature(&make_enum("Color"), &krate).unwrap(), "pub enum Color { }");
    }

    #[test]
    fn render_trait_signature() {
        let krate = empty_crate();
        assert_eq!(render_signature(&make_trait("Drawable"), &krate).unwrap(), "pub trait Drawable");
    }

    #[test]
    fn item_kind_labels() {
        assert_eq!(
            item_kind_label(&ItemEnum::Module(Module { is_crate: false, items: vec![], is_stripped: false })),
            Some("Modules")
        );
        assert_eq!(
            item_kind_label(&ItemEnum::Struct(Struct {
                kind: StructKind::Unit,
                generics: Generics { params: vec![], where_predicates: vec![] },
                impls: vec![],
            })),
            Some("Structs")
        );
    }

    #[test]
    fn transform_empty_crate() {
        assert!(transform_to_mdx(&empty_crate()).is_empty());
    }
    #[test]
    fn transform_crate_with_function() {
        let mut krate = empty_crate();
        let func = make_function("hello", vec![], None);
        let id = Id(1);
        krate.index.insert(id.clone(), func);
        krate.paths.insert(
            id,
            ItemSummary { crate_id: 0, path: vec!["my_crate".into(), "hello".into()], kind: ItemKind::Function },
        );

        let files = transform_to_mdx(&krate);
        assert_eq!(files.len(), 1);
        assert_eq!(files[0].path, "my_crate.mdx");
        assert!(files[0].content.contains("### `hello`"));
        assert!(files[0].content.contains("fn hello()"));
        assert!(files[0].content.contains("Documentation for hello."));
    }

    #[test]
    fn transform_crate_with_multiple_kinds() {
        let mut krate = empty_crate();
        let func = make_function("do_thing", vec![], None);
        krate.index.insert(Id(1), func);
        krate.paths.insert(
            Id(1),
            ItemSummary { crate_id: 0, path: vec!["mc".into(), "do_thing".into()], kind: ItemKind::Function },
        );

        let st = make_struct("Widget");
        krate.index.insert(Id(2), st);
        krate.paths.insert(
            Id(2),
            ItemSummary { crate_id: 0, path: vec!["mc".into(), "Widget".into()], kind: ItemKind::Struct },
        );

        let files = transform_to_mdx(&krate);
        assert_eq!(files.len(), 1);
        let content = &files[0].content;
        assert!(content.find("## Structs").unwrap() < content.find("## Functions").unwrap());
    }

    #[test]
    fn frontmatter_escapes_quotes() {
        // Put a module with quoted docs as the root so it becomes the frontmatter description.
        let mut krate = empty_crate();
        let root_module = Item {
            id: Id(0),
            crate_id: 0,
            name: Some("mylib".into()),
            span: None,
            visibility: Visibility::Public,
            docs: Some("A \"quoted\" crate.".into()),
            links: HashMap::new(),
            attrs: vec![],
            deprecation: None,
            inner: ItemEnum::Module(Module { is_crate: true, items: vec![Id(1)], is_stripped: false }),
        };
        krate.root = Id(0);
        krate.index.insert(Id(0), root_module);

        // Add a function so the module generates a file.
        let func = make_function("f", vec![], None);
        krate.index.insert(Id(1), func);
        krate.paths.insert(Id(1), ItemSummary { crate_id: 0, path: vec!["f".into()], kind: ItemKind::Function });

        let files = transform_to_mdx(&krate);
        // The root module's description in frontmatter should have escaped quotes.
        let index = files.iter().find(|f| f.path == "index.mdx").unwrap();
        assert!(index.content.contains("\\\"quoted\\\""), "content: {}", index.content);
    }
    #[test]
    fn render_type_resolved_path_with_args() {
        let krate = empty_crate();
        let ty = Type::ResolvedPath(rustdoc_types::Path {
            path: "Option".into(),
            id: Id(99),
            args: Some(Box::new(rustdoc_types::GenericArgs::AngleBracketed {
                args: vec![rustdoc_types::GenericArg::Type(Type::Primitive("u32".into()))],
                constraints: vec![],
            })),
        });
        assert_eq!(render_type(&ty, &krate), "Option<u32>");
    }

    #[test]
    fn render_type_impl_trait() {
        let krate = empty_crate();
        let ty = Type::ImplTrait(vec![
            rustdoc_types::GenericBound::TraitBound {
                trait_: rustdoc_types::Path { path: "Display".into(), id: Id(99), args: None },
                generic_params: vec![],
                modifier: rustdoc_types::TraitBoundModifier::None,
            },
        ]);
        assert_eq!(render_type(&ty, &krate), "impl Display");
    }

    #[test]
    fn render_type_dyn_trait() {
        let krate = empty_crate();
        let ty = Type::DynTrait(rustdoc_types::DynTrait {
            traits: vec![rustdoc_types::PolyTrait {
                trait_: rustdoc_types::Path { path: "Error".into(), id: Id(99), args: None },
                generic_params: vec![],
            }],
            lifetime: None,
        });
        assert_eq!(render_type(&ty, &krate), "dyn Error");
    }

    #[test]
    fn render_type_function_pointer() {
        let krate = empty_crate();
        let ty = Type::FunctionPointer(Box::new(rustdoc_types::FunctionPointer {
            sig: FunctionSignature {
                inputs: vec![("x".into(), Type::Primitive("u32".into()))],
                output: Some(Type::Primitive("bool".into())),
                is_c_variadic: false,
            },
            generic_params: vec![],
            header: FunctionHeader { is_const: false, is_unsafe: false, is_async: false, abi: Abi::Rust },
        }));
        assert_eq!(render_type(&ty, &krate), "fn(u32) -> bool");
    }

    #[test]
    fn render_type_const_pointer() {
        let krate = empty_crate();
        let ty = Type::RawPointer { is_mutable: false, type_: Box::new(Type::Primitive("u8".into())) };
        assert_eq!(render_type(&ty, &krate), "*const u8");
    }

    #[test]
    fn render_type_infer() {
        let krate = empty_crate();
        assert_eq!(render_type(&Type::Infer, &krate), "_");
    }

    #[test]
    fn render_type_qualified_path() {
        let krate = empty_crate();
        let ty = Type::QualifiedPath {
            name: "Item".into(),
            args: Box::new(rustdoc_types::GenericArgs::AngleBracketed { args: vec![], constraints: vec![] }),
            self_type: Box::new(Type::Generic("T".into())),
            trait_: Some(rustdoc_types::Path { path: "Iterator".into(), id: Id(99), args: None }),
        };
        assert_eq!(render_type(&ty, &krate), "<T as Iterator>::Item");
    }

    #[test]
    fn item_kind_label_all_variants() {
        // Test the remaining untested variants
        assert_eq!(item_kind_label(&ItemEnum::Enum(Enum {
            generics: Generics { params: vec![], where_predicates: vec![] },
            variants: vec![], has_stripped_variants: false, impls: vec![],
        })), Some("Enums"));
        assert_eq!(item_kind_label(&ItemEnum::Trait(Trait {
            is_auto: false, is_unsafe: false, is_dyn_compatible: true,
            items: vec![], generics: Generics { params: vec![], where_predicates: vec![] },
            bounds: vec![], implementations: vec![],
        })), Some("Traits"));
        assert_eq!(item_kind_label(&ItemEnum::Macro("".into())), Some("Macros"));
        assert_eq!(item_kind_label(&ItemEnum::Static(rustdoc_types::Static {
            type_: Type::Primitive("u32".into()),
            is_mutable: false,
            is_unsafe: false,
            expr: String::new(),
        })), Some("Statics"));
        // Impl blocks should be skipped
        assert_eq!(item_kind_label(&ItemEnum::Impl(rustdoc_types::Impl {
|
||||
is_unsafe: false, generics: Generics { params: vec![], where_predicates: vec![] },
|
||||
provided_trait_methods: vec![], trait_: None, for_: Type::Primitive("u32".into()),
|
||||
items: vec![], is_negative: false, is_synthetic: false,
|
||||
blanket_impl: None,
|
||||
})), None);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn render_constant_signature() {
|
||||
let krate = empty_crate();
|
||||
let item = Item {
|
||||
id: Id(5), crate_id: 0,
|
||||
name: Some("MAX_SIZE".into()), span: None,
|
||||
visibility: Visibility::Public, docs: None,
|
||||
links: HashMap::new(), attrs: vec![], deprecation: None,
|
||||
inner: ItemEnum::Constant {
|
||||
type_: Type::Primitive("usize".into()),
|
||||
const_: rustdoc_types::Constant { expr: "1024".into(), value: Some("1024".into()), is_literal: true },
|
||||
},
|
||||
};
|
||||
assert_eq!(render_signature(&item, &krate).unwrap(), "pub const MAX_SIZE: usize = 1024");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn render_type_alias_signature() {
|
||||
let krate = empty_crate();
|
||||
let item = Item {
|
||||
id: Id(6), crate_id: 0,
|
||||
name: Some("Result".into()), span: None,
|
||||
visibility: Visibility::Public, docs: None,
|
||||
links: HashMap::new(), attrs: vec![], deprecation: None,
|
||||
inner: ItemEnum::TypeAlias(rustdoc_types::TypeAlias {
|
||||
type_: Type::Primitive("u32".into()),
|
||||
generics: Generics { params: vec![], where_predicates: vec![] },
|
||||
}),
|
||||
};
|
||||
assert_eq!(render_signature(&item, &krate).unwrap(), "pub type Result = u32");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn render_macro_signature() {
|
||||
let krate = empty_crate();
|
||||
let item = Item {
|
||||
id: Id(7), crate_id: 0,
|
||||
name: Some("my_macro".into()), span: None,
|
||||
visibility: Visibility::Public, docs: None,
|
||||
links: HashMap::new(), attrs: vec![], deprecation: None,
|
||||
inner: ItemEnum::Macro("macro body".into()),
|
||||
};
|
||||
assert_eq!(render_signature(&item, &krate).unwrap(), "macro_rules! my_macro { ... }");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn render_item_without_docs() {
|
||||
let krate = empty_crate();
|
||||
let mut item = make_struct("NoDocs");
|
||||
item.docs = None;
|
||||
let mut out = String::new();
|
||||
render_item(&mut out, &item, &krate);
|
||||
assert!(out.contains("### `NoDocs`"));
|
||||
assert!(out.contains("pub struct NoDocs;"));
|
||||
// Should not have trailing doc content
|
||||
assert!(!out.contains("A NoDocs struct."));
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn write_mdx_files_creates_directories() {
|
||||
let tmp = tempfile::tempdir().unwrap();
|
||||
let files = vec![MdxFile { path: "nested/module.mdx".into(), content: "# Test\n".into() }];
|
||||
write_mdx_files(&files, tmp.path()).unwrap();
|
||||
assert!(tmp.path().join("nested/module.mdx").exists());
|
||||
assert_eq!(std::fs::read_to_string(tmp.path().join("nested/module.mdx")).unwrap(), "# Test\n");
|
||||
}
|
||||
}
|
||||
183
wfe-rustlang/src/rustup/config.rs
Normal file
@@ -0,0 +1,183 @@
use serde::{Deserialize, Serialize};

/// Which rustup operation to perform.
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
#[serde(rename_all = "kebab-case")]
pub enum RustupCommand {
    /// Install Rust via rustup-init.
    Install,
    /// Install a toolchain (`rustup toolchain install`).
    ToolchainInstall,
    /// Add a component (`rustup component add`).
    ComponentAdd,
    /// Add a compilation target (`rustup target add`).
    TargetAdd,
}

impl RustupCommand {
    pub fn as_str(&self) -> &'static str {
        match self {
            Self::Install => "install",
            Self::ToolchainInstall => "toolchain-install",
            Self::ComponentAdd => "component-add",
            Self::TargetAdd => "target-add",
        }
    }
}

/// Configuration for rustup step types.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct RustupConfig {
    pub command: RustupCommand,
    /// Toolchain to install or scope components/targets to (e.g. "nightly", "1.78.0").
    #[serde(default)]
    pub toolchain: Option<String>,
    /// Components to add (e.g. ["clippy", "rustfmt", "rust-src"]).
    #[serde(default)]
    pub components: Vec<String>,
    /// Compilation targets to add (e.g. ["wasm32-unknown-unknown"]).
    #[serde(default)]
    pub targets: Vec<String>,
    /// Rustup profile for initial install: "minimal", "default", or "complete".
    #[serde(default)]
    pub profile: Option<String>,
    /// Default toolchain to set during install.
    #[serde(default)]
    pub default_toolchain: Option<String>,
    /// Additional arguments appended to the command.
    #[serde(default)]
    pub extra_args: Vec<String>,
    /// Execution timeout in milliseconds.
    #[serde(default)]
    pub timeout_ms: Option<u64>,
}
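For reference, a hypothetical YAML step driving this config might look like the following. The `rustup-target` step type name comes from the changelog; the field layout under `config` is a sketch inferred from the struct above, not verified schema:

```yaml
# Hypothetical step definition; field names mirror RustupConfig above.
- name: add-wasm-target
  type: rustup-target
  config:
    toolchain: stable
    targets:
      - wasm32-unknown-unknown
    timeout: 300s
```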

#[cfg(test)]
mod tests {
    use super::*;
    use pretty_assertions::assert_eq;

    #[test]
    fn command_as_str() {
        assert_eq!(RustupCommand::Install.as_str(), "install");
        assert_eq!(RustupCommand::ToolchainInstall.as_str(), "toolchain-install");
        assert_eq!(RustupCommand::ComponentAdd.as_str(), "component-add");
        assert_eq!(RustupCommand::TargetAdd.as_str(), "target-add");
    }

    #[test]
    fn command_serde_kebab_case() {
        let json = r#""toolchain-install""#;
        let cmd: RustupCommand = serde_json::from_str(json).unwrap();
        assert_eq!(cmd, RustupCommand::ToolchainInstall);

        let serialized = serde_json::to_string(&RustupCommand::ComponentAdd).unwrap();
        assert_eq!(serialized, r#""component-add""#);
    }

    #[test]
    fn serde_round_trip_install() {
        let config = RustupConfig {
            command: RustupCommand::Install,
            toolchain: None,
            components: vec![],
            targets: vec![],
            profile: Some("minimal".to_string()),
            default_toolchain: Some("stable".to_string()),
            extra_args: vec![],
            timeout_ms: Some(300_000),
        };
        let json = serde_json::to_string(&config).unwrap();
        let de: RustupConfig = serde_json::from_str(&json).unwrap();
        assert_eq!(de.command, RustupCommand::Install);
        assert_eq!(de.profile, Some("minimal".to_string()));
        assert_eq!(de.default_toolchain, Some("stable".to_string()));
        assert_eq!(de.timeout_ms, Some(300_000));
    }

    #[test]
    fn serde_round_trip_toolchain_install() {
        let config = RustupConfig {
            command: RustupCommand::ToolchainInstall,
            toolchain: Some("nightly-2024-06-01".to_string()),
            components: vec![],
            targets: vec![],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let json = serde_json::to_string(&config).unwrap();
        let de: RustupConfig = serde_json::from_str(&json).unwrap();
        assert_eq!(de.command, RustupCommand::ToolchainInstall);
        assert_eq!(de.toolchain, Some("nightly-2024-06-01".to_string()));
    }

    #[test]
    fn serde_round_trip_component_add() {
        let config = RustupConfig {
            command: RustupCommand::ComponentAdd,
            toolchain: Some("nightly".to_string()),
            components: vec!["clippy".to_string(), "rustfmt".to_string(), "rust-src".to_string()],
            targets: vec![],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let json = serde_json::to_string(&config).unwrap();
        let de: RustupConfig = serde_json::from_str(&json).unwrap();
        assert_eq!(de.command, RustupCommand::ComponentAdd);
        assert_eq!(de.components, vec!["clippy", "rustfmt", "rust-src"]);
        assert_eq!(de.toolchain, Some("nightly".to_string()));
    }

    #[test]
    fn serde_round_trip_target_add() {
        let config = RustupConfig {
            command: RustupCommand::TargetAdd,
            toolchain: Some("stable".to_string()),
            components: vec![],
            targets: vec!["wasm32-unknown-unknown".to_string(), "aarch64-linux-android".to_string()],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let json = serde_json::to_string(&config).unwrap();
        let de: RustupConfig = serde_json::from_str(&json).unwrap();
        assert_eq!(de.command, RustupCommand::TargetAdd);
        assert_eq!(de.targets, vec!["wasm32-unknown-unknown", "aarch64-linux-android"]);
    }

    #[test]
    fn config_defaults() {
        let json = r#"{"command": "install"}"#;
        let config: RustupConfig = serde_json::from_str(json).unwrap();
        assert_eq!(config.command, RustupCommand::Install);
        assert!(config.toolchain.is_none());
        assert!(config.components.is_empty());
        assert!(config.targets.is_empty());
        assert!(config.profile.is_none());
        assert!(config.default_toolchain.is_none());
        assert!(config.extra_args.is_empty());
        assert!(config.timeout_ms.is_none());
    }

    #[test]
    fn serde_with_extra_args() {
        let config = RustupConfig {
            command: RustupCommand::ToolchainInstall,
            toolchain: Some("nightly".to_string()),
            components: vec![],
            targets: vec![],
            profile: Some("minimal".to_string()),
            default_toolchain: None,
            extra_args: vec!["--force".to_string()],
            timeout_ms: None,
        };
        let json = serde_json::to_string(&config).unwrap();
        let de: RustupConfig = serde_json::from_str(&json).unwrap();
        assert_eq!(de.extra_args, vec!["--force"]);
    }
}
5
wfe-rustlang/src/rustup/mod.rs
Normal file
@@ -0,0 +1,5 @@
pub mod config;
pub mod step;

pub use config::{RustupCommand, RustupConfig};
pub use step::RustupStep;
357
wfe-rustlang/src/rustup/step.rs
Normal file
@@ -0,0 +1,357 @@
use async_trait::async_trait;
use wfe_core::models::ExecutionResult;
use wfe_core::traits::step::{StepBody, StepExecutionContext};
use wfe_core::WfeError;

use crate::rustup::config::{RustupCommand, RustupConfig};

pub struct RustupStep {
    config: RustupConfig,
}

impl RustupStep {
    pub fn new(config: RustupConfig) -> Self {
        Self { config }
    }

    pub fn build_command(&self) -> tokio::process::Command {
        match self.config.command {
            RustupCommand::Install => self.build_install_command(),
            RustupCommand::ToolchainInstall => self.build_toolchain_install_command(),
            RustupCommand::ComponentAdd => self.build_component_add_command(),
            RustupCommand::TargetAdd => self.build_target_add_command(),
        }
    }

    fn build_install_command(&self) -> tokio::process::Command {
        let mut cmd = tokio::process::Command::new("sh");
        // Pipe rustup-init through sh with non-interactive flag.
        let mut script = "curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y".to_string();

        if let Some(ref profile) = self.config.profile {
            script.push_str(&format!(" --profile {profile}"));
        }

        if let Some(ref tc) = self.config.default_toolchain {
            script.push_str(&format!(" --default-toolchain {tc}"));
        }

        for arg in &self.config.extra_args {
            script.push_str(&format!(" {arg}"));
        }

        cmd.arg("-c").arg(&script);
        cmd.stdout(std::process::Stdio::piped());
        cmd.stderr(std::process::Stdio::piped());
        cmd
    }

    fn build_toolchain_install_command(&self) -> tokio::process::Command {
        let mut cmd = tokio::process::Command::new("rustup");
        cmd.args(["toolchain", "install"]);

        if let Some(ref tc) = self.config.toolchain {
            cmd.arg(tc);
        }

        if let Some(ref profile) = self.config.profile {
            cmd.args(["--profile", profile]);
        }

        for arg in &self.config.extra_args {
            cmd.arg(arg);
        }

        cmd.stdout(std::process::Stdio::piped());
        cmd.stderr(std::process::Stdio::piped());
        cmd
    }

    fn build_component_add_command(&self) -> tokio::process::Command {
        let mut cmd = tokio::process::Command::new("rustup");
        cmd.args(["component", "add"]);

        for component in &self.config.components {
            cmd.arg(component);
        }

        if let Some(ref tc) = self.config.toolchain {
            cmd.args(["--toolchain", tc]);
        }

        for arg in &self.config.extra_args {
            cmd.arg(arg);
        }

        cmd.stdout(std::process::Stdio::piped());
        cmd.stderr(std::process::Stdio::piped());
        cmd
    }

    fn build_target_add_command(&self) -> tokio::process::Command {
        let mut cmd = tokio::process::Command::new("rustup");
        cmd.args(["target", "add"]);

        for target in &self.config.targets {
            cmd.arg(target);
        }

        if let Some(ref tc) = self.config.toolchain {
            cmd.args(["--toolchain", tc]);
        }

        for arg in &self.config.extra_args {
            cmd.arg(arg);
        }

        cmd.stdout(std::process::Stdio::piped());
        cmd.stderr(std::process::Stdio::piped());
        cmd
    }
}

#[async_trait]
impl StepBody for RustupStep {
    async fn run(&mut self, context: &StepExecutionContext<'_>) -> wfe_core::Result<ExecutionResult> {
        let step_name = context.step.name.as_deref().unwrap_or("unknown");
        let subcmd = self.config.command.as_str();

        tracing::info!(step = step_name, command = subcmd, "running rustup");

        let mut cmd = self.build_command();

        let output = if let Some(timeout_ms) = self.config.timeout_ms {
            let duration = std::time::Duration::from_millis(timeout_ms);
            match tokio::time::timeout(duration, cmd.output()).await {
                Ok(result) => result.map_err(|e| {
                    WfeError::StepExecution(format!("Failed to spawn rustup {subcmd}: {e}"))
                })?,
                Err(_) => {
                    return Err(WfeError::StepExecution(format!(
                        "rustup {subcmd} timed out after {timeout_ms}ms"
                    )));
                }
            }
        } else {
            cmd.output()
                .await
                .map_err(|e| WfeError::StepExecution(format!("Failed to spawn rustup {subcmd}: {e}")))?
        };

        let stdout = String::from_utf8_lossy(&output.stdout).to_string();
        let stderr = String::from_utf8_lossy(&output.stderr).to_string();

        if !output.status.success() {
            let code = output.status.code().unwrap_or(-1);
            return Err(WfeError::StepExecution(format!(
                "rustup {subcmd} exited with code {code}\nstdout: {stdout}\nstderr: {stderr}"
            )));
        }

        let mut outputs = serde_json::Map::new();
        outputs.insert(
            format!("{step_name}.stdout"),
            serde_json::Value::String(stdout),
        );
        outputs.insert(
            format!("{step_name}.stderr"),
            serde_json::Value::String(stderr),
        );

        Ok(ExecutionResult {
            proceed: true,
            output_data: Some(serde_json::Value::Object(outputs)),
            ..Default::default()
        })
    }
}
#[cfg(test)]
mod tests {
    use super::*;

    fn install_config() -> RustupConfig {
        RustupConfig {
            command: RustupCommand::Install,
            toolchain: None,
            components: vec![],
            targets: vec![],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        }
    }

    #[test]
    fn build_install_command_minimal() {
        let step = RustupStep::new(install_config());
        let cmd = step.build_command();
        let prog = cmd.as_std().get_program().to_str().unwrap();
        assert_eq!(prog, "sh");
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args[0], "-c");
        assert!(args[1].contains("rustup.rs"));
        assert!(args[1].contains("-y"));
    }

    #[test]
    fn build_install_command_with_profile_and_toolchain() {
        let mut config = install_config();
        config.profile = Some("minimal".to_string());
        config.default_toolchain = Some("nightly".to_string());
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert!(args[1].contains("--profile minimal"));
        assert!(args[1].contains("--default-toolchain nightly"));
    }

    #[test]
    fn build_install_command_with_extra_args() {
        let mut config = install_config();
        config.extra_args = vec!["--no-modify-path".to_string()];
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert!(args[1].contains("--no-modify-path"));
    }

    #[test]
    fn build_toolchain_install_command() {
        let config = RustupConfig {
            command: RustupCommand::ToolchainInstall,
            toolchain: Some("nightly-2024-06-01".to_string()),
            components: vec![],
            targets: vec![],
            profile: Some("minimal".to_string()),
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let prog = cmd.as_std().get_program().to_str().unwrap();
        assert_eq!(prog, "rustup");
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["toolchain", "install", "nightly-2024-06-01", "--profile", "minimal"]);
    }

    #[test]
    fn build_toolchain_install_with_extra_args() {
        let config = RustupConfig {
            command: RustupCommand::ToolchainInstall,
            toolchain: Some("stable".to_string()),
            components: vec![],
            targets: vec![],
            profile: None,
            default_toolchain: None,
            extra_args: vec!["--force".to_string()],
            timeout_ms: None,
        };
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["toolchain", "install", "stable", "--force"]);
    }

    #[test]
    fn build_component_add_command() {
        let config = RustupConfig {
            command: RustupCommand::ComponentAdd,
            toolchain: Some("nightly".to_string()),
            components: vec!["clippy".to_string(), "rustfmt".to_string()],
            targets: vec![],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let prog = cmd.as_std().get_program().to_str().unwrap();
        assert_eq!(prog, "rustup");
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["component", "add", "clippy", "rustfmt", "--toolchain", "nightly"]);
    }

    #[test]
    fn build_component_add_without_toolchain() {
        let config = RustupConfig {
            command: RustupCommand::ComponentAdd,
            toolchain: None,
            components: vec!["rust-src".to_string()],
            targets: vec![],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["component", "add", "rust-src"]);
    }

    #[test]
    fn build_target_add_command() {
        let config = RustupConfig {
            command: RustupCommand::TargetAdd,
            toolchain: Some("stable".to_string()),
            components: vec![],
            targets: vec!["wasm32-unknown-unknown".to_string()],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let prog = cmd.as_std().get_program().to_str().unwrap();
        assert_eq!(prog, "rustup");
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["target", "add", "wasm32-unknown-unknown", "--toolchain", "stable"]);
    }

    #[test]
    fn build_target_add_multiple_targets() {
        let config = RustupConfig {
            command: RustupCommand::TargetAdd,
            toolchain: None,
            components: vec![],
            targets: vec![
                "wasm32-unknown-unknown".to_string(),
                "aarch64-linux-android".to_string(),
            ],
            profile: None,
            default_toolchain: None,
            extra_args: vec![],
            timeout_ms: None,
        };
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(args, vec!["target", "add", "wasm32-unknown-unknown", "aarch64-linux-android"]);
    }

    #[test]
    fn build_target_add_with_extra_args() {
        let config = RustupConfig {
            command: RustupCommand::TargetAdd,
            toolchain: Some("nightly".to_string()),
            components: vec![],
            targets: vec!["x86_64-unknown-linux-musl".to_string()],
            profile: None,
            default_toolchain: None,
            extra_args: vec!["--force".to_string()],
            timeout_ms: None,
        };
        let step = RustupStep::new(config);
        let cmd = step.build_command();
        let args: Vec<_> = cmd.as_std().get_args().map(|a| a.to_str().unwrap()).collect();
        assert_eq!(
            args,
            vec!["target", "add", "x86_64-unknown-linux-musl", "--toolchain", "nightly", "--force"]
        );
    }
}
@@ -9,6 +9,7 @@ default = []
deno = ["deno_core", "deno_error", "url", "reqwest"]
buildkit = ["wfe-buildkit"]
containerd = ["wfe-containerd"]
rustlang = ["wfe-rustlang"]

[dependencies]
wfe-core = { workspace = true }
@@ -27,6 +28,7 @@ url = { workspace = true, optional = true }
reqwest = { workspace = true, optional = true }
wfe-buildkit = { workspace = true, optional = true }
wfe-containerd = { workspace = true, optional = true }
wfe-rustlang = { workspace = true, optional = true }

[dev-dependencies]
pretty_assertions = { workspace = true }
@@ -36,3 +38,4 @@ wfe-core = { workspace = true, features = ["test-support"] }
wfe = { path = "../wfe" }
wiremock = { workspace = true }
tempfile = { workspace = true }
tracing-subscriber = { workspace = true }
@@ -13,6 +13,8 @@ use crate::executors::deno::{DenoConfig, DenoPermissions, DenoStep};
use wfe_buildkit::{BuildkitConfig, BuildkitStep};
#[cfg(feature = "containerd")]
use wfe_containerd::{ContainerdConfig, ContainerdStep};
#[cfg(feature = "rustlang")]
use wfe_rustlang::{CargoCommand, CargoConfig, CargoStep, RustupCommand, RustupConfig, RustupStep};
use wfe_core::primitives::sub_workflow::SubWorkflowStep;
use wfe_core::models::condition::{ComparisonOp, FieldComparison, StepCondition};

@@ -454,6 +456,38 @@ fn build_step_config_and_factory(
            });
            Ok((key, value, factory))
        }
        #[cfg(feature = "rustlang")]
        "cargo-build" | "cargo-test" | "cargo-check" | "cargo-clippy" | "cargo-fmt"
        | "cargo-doc" | "cargo-publish" | "cargo-audit" | "cargo-deny" | "cargo-nextest"
        | "cargo-llvm-cov" | "cargo-doc-mdx" => {
            let config = build_cargo_config(step, step_type)?;
            let key = format!("wfe_yaml::cargo::{}", step.name);
            let value = serde_json::to_value(&config).map_err(|e| {
                YamlWorkflowError::Compilation(format!(
                    "Failed to serialize cargo config: {e}"
                ))
            })?;
            let config_clone = config.clone();
            let factory: StepFactory = Box::new(move || {
                Box::new(CargoStep::new(config_clone.clone())) as Box<dyn StepBody>
            });
            Ok((key, value, factory))
        }
        #[cfg(feature = "rustlang")]
        "rust-install" | "rustup-toolchain" | "rustup-component" | "rustup-target" => {
            let config = build_rustup_config(step, step_type)?;
            let key = format!("wfe_yaml::rustup::{}", step.name);
            let value = serde_json::to_value(&config).map_err(|e| {
                YamlWorkflowError::Compilation(format!(
                    "Failed to serialize rustup config: {e}"
                ))
            })?;
            let config_clone = config.clone();
            let factory: StepFactory = Box::new(move || {
                Box::new(RustupStep::new(config_clone.clone())) as Box<dyn StepBody>
            });
            Ok((key, value, factory))
        }
        "workflow" => {
            let config = step.config.as_ref().ok_or_else(|| {
                YamlWorkflowError::Compilation(format!(
@@ -576,6 +610,88 @@ fn build_shell_config(step: &YamlStep) -> Result<ShellConfig, YamlWorkflowError>
    })
}
#[cfg(feature = "rustlang")]
fn build_cargo_config(
    step: &YamlStep,
    step_type: &str,
) -> Result<CargoConfig, YamlWorkflowError> {
    let command = match step_type {
        "cargo-build" => CargoCommand::Build,
        "cargo-test" => CargoCommand::Test,
        "cargo-check" => CargoCommand::Check,
        "cargo-clippy" => CargoCommand::Clippy,
        "cargo-fmt" => CargoCommand::Fmt,
        "cargo-doc" => CargoCommand::Doc,
        "cargo-publish" => CargoCommand::Publish,
        "cargo-audit" => CargoCommand::Audit,
        "cargo-deny" => CargoCommand::Deny,
        "cargo-nextest" => CargoCommand::Nextest,
        "cargo-llvm-cov" => CargoCommand::LlvmCov,
        "cargo-doc-mdx" => CargoCommand::DocMdx,
        _ => {
            return Err(YamlWorkflowError::Compilation(format!(
                "Unknown cargo step type: '{step_type}'"
            )));
        }
    };

    let config = step.config.as_ref();
    let timeout_ms = config
        .and_then(|c| c.timeout.as_ref())
        .and_then(|t| parse_duration_ms(t));

    Ok(CargoConfig {
        command,
        toolchain: config.and_then(|c| c.toolchain.clone()),
        package: config.and_then(|c| c.package.clone()),
        features: config.map(|c| c.features.clone()).unwrap_or_default(),
        all_features: config.and_then(|c| c.all_features).unwrap_or(false),
        no_default_features: config.and_then(|c| c.no_default_features).unwrap_or(false),
        release: config.and_then(|c| c.release).unwrap_or(false),
        target: config.and_then(|c| c.target.clone()),
        profile: config.and_then(|c| c.profile.clone()),
        extra_args: config.map(|c| c.extra_args.clone()).unwrap_or_default(),
        env: config.map(|c| c.env.clone()).unwrap_or_default(),
        working_dir: config.and_then(|c| c.working_dir.clone()),
        timeout_ms,
        output_dir: config.and_then(|c| c.output_dir.clone()),
    })
}
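build_cargo_config pulls each of these fields out of the step's config block; a hypothetical cargo-build step exercising them could be declared like this (the step type name comes from the changelog; the package and feature names and the exact YAML layout are illustrative assumptions):

```yaml
# Hypothetical step definition; fields mirror the CargoConfig mapping above.
- name: build-release
  type: cargo-build
  config:
    package: my-crate
    features:
      - extra
    release: true
    timeout: 600s
```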
#[cfg(feature = "rustlang")]
fn build_rustup_config(
    step: &YamlStep,
    step_type: &str,
) -> Result<RustupConfig, YamlWorkflowError> {
    let command = match step_type {
        "rust-install" => RustupCommand::Install,
        "rustup-toolchain" => RustupCommand::ToolchainInstall,
        "rustup-component" => RustupCommand::ComponentAdd,
        "rustup-target" => RustupCommand::TargetAdd,
        _ => {
            return Err(YamlWorkflowError::Compilation(format!(
                "Unknown rustup step type: '{step_type}'"
            )));
        }
    };

    let config = step.config.as_ref();
    let timeout_ms = config
        .and_then(|c| c.timeout.as_ref())
        .and_then(|t| parse_duration_ms(t));

    Ok(RustupConfig {
        command,
        toolchain: config.and_then(|c| c.toolchain.clone()),
        components: config.map(|c| c.components.clone()).unwrap_or_default(),
        targets: config.map(|c| c.targets.clone()).unwrap_or_default(),
        profile: config.and_then(|c| c.profile.clone()),
        default_toolchain: config.and_then(|c| c.default_toolchain.clone()),
        extra_args: config.map(|c| c.extra_args.clone()).unwrap_or_default(),
        timeout_ms,
    })
}

fn parse_duration_ms(s: &str) -> Option<u64> {
    let s = s.trim();
    // Check "ms" before "s" since strip_suffix('s') would also match "500ms"
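The hunk above cuts off parse_duration_ms right after its first comment. A minimal sketch consistent with that comment (an assumption about the rest of the function, not the committed code) could look like:

```rust
// Hypothetical reconstruction of parse_duration_ms; only the "ms"-before-"s"
// ordering is attested by the diff, the rest is an assumption.
fn parse_duration_ms(s: &str) -> Option<u64> {
    let s = s.trim();
    // Check "ms" before "s" since strip_suffix('s') would also match "500ms".
    if let Some(n) = s.strip_suffix("ms") {
        return n.trim().parse().ok();
    }
    if let Some(n) = s.strip_suffix('s') {
        return n.trim().parse::<u64>().ok().map(|v| v * 1000);
    }
    // Bare numbers are treated as milliseconds.
    s.parse().ok()
}

fn main() {
    assert_eq!(parse_duration_ms("500ms"), Some(500));
    assert_eq!(parse_duration_ms("5s"), Some(5000));
    assert_eq!(parse_duration_ms("250"), Some(250));
    assert_eq!(parse_duration_ms("abc"), None);
}
```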
@@ -164,6 +164,39 @@ pub struct StepConfig {
    pub containerd_addr: Option<String>,
    /// CLI binary name for containerd steps: "nerdctl" (default) or "docker".
    pub cli: Option<String>,

    // Cargo fields
    /// Target package for cargo steps (`-p`).
    pub package: Option<String>,
    /// Features to enable for cargo steps.
    #[serde(default)]
    pub features: Vec<String>,
    /// Enable all features for cargo steps.
    #[serde(default)]
    pub all_features: Option<bool>,
    /// Disable default features for cargo steps.
    #[serde(default)]
    pub no_default_features: Option<bool>,
    /// Build in release mode for cargo steps.
    #[serde(default)]
    pub release: Option<bool>,
    /// Build profile for cargo steps (`--profile`).
    pub profile: Option<String>,
    /// Rust toolchain override for cargo steps (e.g. "nightly").
    pub toolchain: Option<String>,
    /// Additional arguments for cargo/rustup steps.
    #[serde(default)]
    pub extra_args: Vec<String>,
    /// Output directory for generated files (e.g., MDX docs).
    pub output_dir: Option<String>,

    // Rustup fields
    /// Components to add for rustup steps (e.g. ["clippy", "rustfmt"]).
    #[serde(default)]
    pub components: Vec<String>,
    /// Compilation targets to add for rustup steps (e.g. ["wasm32-unknown-unknown"]).
    #[serde(default)]
    pub targets: Vec<String>,
    /// Default toolchain for rust-install steps.
    pub default_toolchain: Option<String>,

    // Workflow (sub-workflow) fields
    /// Child workflow ID (for `type: workflow` steps).
    #[serde(rename = "workflow")]
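In YAML, the new cargo fields above map onto a step like the following illustrative fragment (field names come from the struct; the step name and values are made up for the example):

```yaml
- name: build
  type: cargo-build
  config:
    package: my-crate           # package: Option<String>
    features: [foo, bar]        # features: Vec<String>
    no_default_features: true   # no_default_features: Option<bool>
    release: true               # release: Option<bool>
    toolchain: nightly          # toolchain: Option<String>
    extra_args: ["--jobs", "4"] # extra_args: Vec<String>
```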
777
wfe-yaml/tests/rustlang.rs
Normal file
@@ -0,0 +1,777 @@
#![cfg(feature = "rustlang")]

use std::collections::HashMap;
use std::sync::Arc;
use std::time::Duration;

use wfe::models::WorkflowStatus;
use wfe::{WorkflowHostBuilder, run_workflow_sync};
use wfe_core::test_support::{
    InMemoryLockProvider, InMemoryPersistenceProvider, InMemoryQueueProvider,
};
use wfe_yaml::load_single_workflow_from_str;

fn has_factory(compiled: &wfe_yaml::compiler::CompiledWorkflow, key: &str) -> bool {
    compiled.step_factories.iter().any(|(k, _)| k == key)
}

async fn run_yaml_workflow(yaml: &str) -> wfe::models::WorkflowInstance {
    let config = HashMap::new();
    let compiled = load_single_workflow_from_str(yaml, &config).unwrap();

    let persistence = Arc::new(InMemoryPersistenceProvider::new());
    let lock = Arc::new(InMemoryLockProvider::new());
    let queue = Arc::new(InMemoryQueueProvider::new());

    let host = WorkflowHostBuilder::new()
        .use_persistence(persistence as Arc<dyn wfe_core::traits::PersistenceProvider>)
        .use_lock_provider(lock as Arc<dyn wfe_core::traits::DistributedLockProvider>)
        .use_queue_provider(queue as Arc<dyn wfe_core::traits::QueueProvider>)
        .build()
        .unwrap();

    for (key, factory) in compiled.step_factories {
        host.register_step_factory(&key, factory).await;
    }

    host.register_workflow_definition(compiled.definition.clone())
        .await;
    host.start().await.unwrap();

    let instance = run_workflow_sync(
        &host,
        &compiled.definition.id,
        compiled.definition.version,
        serde_json::json!({}),
        Duration::from_secs(30),
    )
    .await
    .unwrap();

    host.stop().await;
    instance
}

// ---------------------------------------------------------------------------
// Compiler tests — verify YAML compiles to correct step types and configs
// ---------------------------------------------------------------------------

#[test]
fn compile_cargo_build_step() {
    let yaml = r#"
workflow:
  id: cargo-build-wf
  version: 1
  steps:
    - name: build
      type: cargo-build
      config:
        release: true
        package: my-crate
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    let step = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("build"))
        .unwrap();
    assert_eq!(step.step_type, "wfe_yaml::cargo::build");
    assert!(has_factory(&compiled, "wfe_yaml::cargo::build"));
}
#[test]
fn compile_cargo_test_step() {
    let yaml = r#"
workflow:
  id: cargo-test-wf
  version: 1
  steps:
    - name: test
      type: cargo-test
      config:
        features:
          - feat1
          - feat2
        extra_args:
          - "--"
          - "--nocapture"
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    let step = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("test"))
        .unwrap();
    assert_eq!(step.step_type, "wfe_yaml::cargo::test");
}

#[test]
fn compile_cargo_check_step() {
    let yaml = r#"
workflow:
  id: cargo-check-wf
  version: 1
  steps:
    - name: check
      type: cargo-check
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::check"));
}

#[test]
fn compile_cargo_clippy_step() {
    let yaml = r#"
workflow:
  id: cargo-clippy-wf
  version: 1
  steps:
    - name: lint
      type: cargo-clippy
      config:
        all_features: true
        extra_args:
          - "--"
          - "-D"
          - "warnings"
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::lint"));
}

#[test]
fn compile_cargo_fmt_step() {
    let yaml = r#"
workflow:
  id: cargo-fmt-wf
  version: 1
  steps:
    - name: format
      type: cargo-fmt
      config:
        extra_args:
          - "--check"
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::format"));
}
#[test]
fn compile_cargo_doc_step() {
    let yaml = r#"
workflow:
  id: cargo-doc-wf
  version: 1
  steps:
    - name: docs
      type: cargo-doc
      config:
        extra_args:
          - "--no-deps"
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::docs"));
}

#[test]
fn compile_cargo_publish_step() {
    let yaml = r#"
workflow:
  id: cargo-publish-wf
  version: 1
  steps:
    - name: publish
      type: cargo-publish
      config:
        extra_args:
          - "--dry-run"
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::publish"));
}

#[test]
fn compile_cargo_step_with_toolchain() {
    let yaml = r#"
workflow:
  id: nightly-wf
  version: 1
  steps:
    - name: nightly-check
      type: cargo-check
      config:
        toolchain: nightly
        no_default_features: true
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::nightly-check"));
}

#[test]
fn compile_cargo_step_with_timeout() {
    let yaml = r#"
workflow:
  id: timeout-wf
  version: 1
  steps:
    - name: slow-build
      type: cargo-build
      config:
        timeout: 5m
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::slow-build"));
}

#[test]
fn compile_cargo_step_without_config() {
    let yaml = r#"
workflow:
  id: bare-wf
  version: 1
  steps:
    - name: bare-check
      type: cargo-check
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::bare-check"));
}

#[test]
fn compile_cargo_multi_step_pipeline() {
    let yaml = r#"
workflow:
  id: ci-pipeline
  version: 1
  steps:
    - name: fmt
      type: cargo-fmt
      config:
        extra_args: ["--check"]
    - name: check
      type: cargo-check
    - name: clippy
      type: cargo-clippy
      config:
        extra_args: ["--", "-D", "warnings"]
    - name: test
      type: cargo-test
    - name: build
      type: cargo-build
      config:
        release: true
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::fmt"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::check"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::clippy"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::test"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::build"));
}

#[test]
fn compile_cargo_step_with_all_shared_flags() {
    let yaml = r#"
workflow:
  id: full-flags-wf
  version: 1
  steps:
    - name: full
      type: cargo-build
      config:
        package: my-crate
        features: [foo, bar]
        all_features: false
        no_default_features: true
        release: true
        toolchain: stable
        profile: release
        extra_args: ["--jobs", "4"]
        working_dir: /tmp/project
        timeout: 30s
        env:
          RUSTFLAGS: "-C target-cpu=native"
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::full"));
}
#[test]
fn compile_cargo_step_preserves_step_config_json() {
    let yaml = r#"
workflow:
  id: config-json-wf
  version: 1
  steps:
    - name: build
      type: cargo-build
      config:
        release: true
        package: wfe-core
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    let step = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("build"))
        .unwrap();

    let step_config = step.step_config.as_ref().unwrap();
    assert_eq!(step_config["command"], "build");
    assert_eq!(step_config["release"], true);
    assert_eq!(step_config["package"], "wfe-core");
}

// ---------------------------------------------------------------------------
// Integration tests — run actual cargo commands through the workflow engine
// ---------------------------------------------------------------------------

#[tokio::test]
async fn cargo_check_on_self_succeeds() {
    let yaml = r#"
workflow:
  id: self-check
  version: 1
  steps:
    - name: check
      type: cargo-check
      config:
        working_dir: .
        timeout: 120s
"#;
    let instance = run_yaml_workflow(yaml).await;
    assert_eq!(instance.status, WorkflowStatus::Complete);

    let data = instance.data.as_object().unwrap();
    assert!(data.contains_key("check.stdout") || data.contains_key("check.stderr"));
}

#[tokio::test]
async fn cargo_fmt_check_compiles() {
    let yaml = r#"
workflow:
  id: fmt-check
  version: 1
  steps:
    - name: fmt
      type: cargo-fmt
      config:
        working_dir: .
        extra_args: ["--check"]
        timeout: 60s
"#;
    let config = HashMap::new();
    let compiled = load_single_workflow_from_str(yaml, &config).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::fmt"));
}

// ---------------------------------------------------------------------------
// Rustup compiler tests
// ---------------------------------------------------------------------------
#[test]
fn compile_rust_install_step() {
    let yaml = r#"
workflow:
  id: rust-install-wf
  version: 1
  steps:
    - name: install-rust
      type: rust-install
      config:
        profile: minimal
        default_toolchain: stable
        timeout: 5m
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    let step = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("install-rust"))
        .unwrap();
    assert_eq!(step.step_type, "wfe_yaml::rustup::install-rust");
    assert!(has_factory(&compiled, "wfe_yaml::rustup::install-rust"));
}

#[test]
fn compile_rustup_toolchain_step() {
    let yaml = r#"
workflow:
  id: tc-install-wf
  version: 1
  steps:
    - name: add-nightly
      type: rustup-toolchain
      config:
        toolchain: nightly-2024-06-01
        profile: minimal
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::rustup::add-nightly"));
}

#[test]
fn compile_rustup_component_step() {
    let yaml = r#"
workflow:
  id: comp-add-wf
  version: 1
  steps:
    - name: add-tools
      type: rustup-component
      config:
        components: [clippy, rustfmt, rust-src]
        toolchain: nightly
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::rustup::add-tools"));
}

#[test]
fn compile_rustup_target_step() {
    let yaml = r#"
workflow:
  id: target-add-wf
  version: 1
  steps:
    - name: add-wasm
      type: rustup-target
      config:
        targets: [wasm32-unknown-unknown]
        toolchain: stable
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::rustup::add-wasm"));
}

#[test]
fn compile_rustup_step_without_config() {
    let yaml = r#"
workflow:
  id: bare-install-wf
  version: 1
  steps:
    - name: install
      type: rust-install
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::rustup::install"));
}

#[test]
fn compile_rustup_step_preserves_config_json() {
    let yaml = r#"
workflow:
  id: config-json-wf
  version: 1
  steps:
    - name: tc
      type: rustup-toolchain
      config:
        toolchain: nightly
        profile: minimal
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    let step = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("tc"))
        .unwrap();

    let step_config = step.step_config.as_ref().unwrap();
    assert_eq!(step_config["command"], "toolchain-install");
    assert_eq!(step_config["toolchain"], "nightly");
    assert_eq!(step_config["profile"], "minimal");
}
#[test]
fn compile_full_rust_ci_pipeline() {
    let yaml = r#"
workflow:
  id: full-rust-ci
  version: 1
  steps:
    - name: install
      type: rust-install
      config:
        profile: minimal
        default_toolchain: stable
    - name: add-nightly
      type: rustup-toolchain
      config:
        toolchain: nightly
    - name: add-components
      type: rustup-component
      config:
        components: [clippy, rustfmt]
    - name: add-wasm
      type: rustup-target
      config:
        targets: [wasm32-unknown-unknown]
    - name: fmt
      type: cargo-fmt
      config:
        extra_args: ["--check"]
    - name: check
      type: cargo-check
    - name: clippy
      type: cargo-clippy
      config:
        extra_args: ["--", "-D", "warnings"]
    - name: test
      type: cargo-test
    - name: build
      type: cargo-build
      config:
        release: true
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::rustup::install"));
    assert!(has_factory(&compiled, "wfe_yaml::rustup::add-nightly"));
    assert!(has_factory(&compiled, "wfe_yaml::rustup::add-components"));
    assert!(has_factory(&compiled, "wfe_yaml::rustup::add-wasm"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::fmt"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::check"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::clippy"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::test"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::build"));
}

#[test]
fn compile_rustup_component_with_extra_args() {
    let yaml = r#"
workflow:
  id: comp-extra-wf
  version: 1
  steps:
    - name: add-llvm
      type: rustup-component
      config:
        components: [llvm-tools-preview]
        extra_args: ["--force"]
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::rustup::add-llvm"));
}

#[test]
fn compile_rustup_target_multiple() {
    let yaml = r#"
workflow:
  id: multi-target-wf
  version: 1
  steps:
    - name: cross-targets
      type: rustup-target
      config:
        targets:
          - wasm32-unknown-unknown
          - aarch64-linux-android
          - x86_64-unknown-linux-musl
        toolchain: nightly
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::rustup::cross-targets"));

    let step_config = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("cross-targets"))
        .unwrap()
        .step_config
        .as_ref()
        .unwrap();
    assert_eq!(step_config["command"], "target-add");
    let targets = step_config["targets"].as_array().unwrap();
    assert_eq!(targets.len(), 3);
}

// ---------------------------------------------------------------------------
// External cargo tool step compiler tests
// ---------------------------------------------------------------------------
#[test]
fn compile_cargo_audit_step() {
    let yaml = r#"
workflow:
  id: audit-wf
  version: 1
  steps:
    - name: audit
      type: cargo-audit
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::audit"));

    let step_config = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("audit"))
        .unwrap()
        .step_config
        .as_ref()
        .unwrap();
    assert_eq!(step_config["command"], "audit");
}

#[test]
fn compile_cargo_deny_step() {
    let yaml = r#"
workflow:
  id: deny-wf
  version: 1
  steps:
    - name: license-check
      type: cargo-deny
      config:
        extra_args: ["check"]
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::license-check"));
}

#[test]
fn compile_cargo_nextest_step() {
    let yaml = r#"
workflow:
  id: nextest-wf
  version: 1
  steps:
    - name: fast-test
      type: cargo-nextest
      config:
        features: [foo]
        extra_args: ["--no-fail-fast"]
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::fast-test"));

    let step_config = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("fast-test"))
        .unwrap()
        .step_config
        .as_ref()
        .unwrap();
    assert_eq!(step_config["command"], "nextest");
}

#[test]
fn compile_cargo_llvm_cov_step() {
    let yaml = r#"
workflow:
  id: cov-wf
  version: 1
  steps:
    - name: coverage
      type: cargo-llvm-cov
      config:
        extra_args: ["--html", "--output-dir", "coverage"]
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::coverage"));

    let step_config = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("coverage"))
        .unwrap()
        .step_config
        .as_ref()
        .unwrap();
    assert_eq!(step_config["command"], "llvm-cov");
}

#[test]
fn compile_full_ci_with_external_tools() {
    let yaml = r#"
workflow:
  id: full-ci-external
  version: 1
  steps:
    - name: audit
      type: cargo-audit
    - name: deny
      type: cargo-deny
      config:
        extra_args: ["check", "licenses"]
    - name: test
      type: cargo-nextest
    - name: coverage
      type: cargo-llvm-cov
      config:
        extra_args: ["--summary-only"]
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::audit"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::deny"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::test"));
    assert!(has_factory(&compiled, "wfe_yaml::cargo::coverage"));
}

#[test]
fn compile_cargo_doc_mdx_step() {
    let yaml = r#"
workflow:
  id: doc-mdx-wf
  version: 1
  steps:
    - name: docs
      type: cargo-doc-mdx
      config:
        package: my-crate
        output_dir: docs/api
        extra_args: ["--no-deps"]
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::docs"));

    let step_config = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("docs"))
        .unwrap()
        .step_config
        .as_ref()
        .unwrap();
    assert_eq!(step_config["command"], "doc-mdx");
    assert_eq!(step_config["package"], "my-crate");
    assert_eq!(step_config["output_dir"], "docs/api");
}

#[test]
fn compile_cargo_doc_mdx_minimal() {
    let yaml = r#"
workflow:
  id: doc-mdx-minimal-wf
  version: 1
  steps:
    - name: generate-docs
      type: cargo-doc-mdx
"#;
    let compiled = load_single_workflow_from_str(yaml, &HashMap::new()).unwrap();
    assert!(has_factory(&compiled, "wfe_yaml::cargo::generate-docs"));

    let step_config = compiled
        .definition
        .steps
        .iter()
        .find(|s| s.name.as_deref() == Some("generate-docs"))
        .unwrap()
        .step_config
        .as_ref()
        .unwrap();
    assert_eq!(step_config["command"], "doc-mdx");
    assert!(step_config["output_dir"].is_null());
}
474
wfe-yaml/tests/rustlang_containerd.rs
Normal file
@@ -0,0 +1,474 @@
//! End-to-end integration tests for the Rust toolchain steps running inside
//! containerd containers.
//!
//! These tests start from a bare Debian image (no Rust installed) and exercise
//! the full Rust CI pipeline: install Rust, install external tools, create a
//! test project, and run every cargo operation.
//!
//! Requirements:
//! - A running containerd daemon (Lima/colima or native)
//! - Set `WFE_CONTAINERD_ADDR` to point to the socket
//!
//! These tests are gated behind `rustlang` + `containerd` features and are
//! marked `#[ignore]` so they don't run in normal CI. Run them explicitly:
//!     cargo test -p wfe-yaml --features rustlang,containerd --test rustlang_containerd -- --ignored

#![cfg(all(feature = "rustlang", feature = "containerd"))]

use std::collections::HashMap;
use std::path::Path;
use std::sync::Arc;
use std::time::Duration;

use wfe::models::WorkflowStatus;
use wfe::{WorkflowHostBuilder, run_workflow_sync};
use wfe_core::test_support::{
    InMemoryLockProvider, InMemoryPersistenceProvider, InMemoryQueueProvider,
};
use wfe_yaml::load_single_workflow_from_str;

/// Returns the containerd address if available, or None.
/// Supports both Unix sockets (`unix:///path`) and TCP (`http://host:port`).
fn containerd_addr() -> Option<String> {
    let addr = std::env::var("WFE_CONTAINERD_ADDR").unwrap_or_else(|_| {
        // Default: TCP proxy on the Lima VM (socat forwarding the containerd socket)
        "http://127.0.0.1:2500".to_string()
    });

    // For TCP addresses, assume reachable (the test will fail fast if not).
    if addr.starts_with("http://") || addr.starts_with("tcp://") {
        return Some(addr);
    }

    // For Unix sockets, check that the file exists.
    let socket_path = addr.strip_prefix("unix://").unwrap_or(addr.as_str());
    if Path::new(socket_path).exists() {
        Some(addr)
    } else {
        None
    }
}
async fn run_yaml_workflow_with_config(
    yaml: &str,
    config: &HashMap<String, serde_json::Value>,
) -> wfe::models::WorkflowInstance {
    let compiled = load_single_workflow_from_str(yaml, config).unwrap();
    for step in &compiled.definition.steps {
        eprintln!(
            " step: {:?} type={} config={:?}",
            step.name, step.step_type, step.step_config
        );
    }
    eprintln!(
        " factories: {:?}",
        compiled
            .step_factories
            .iter()
            .map(|(k, _)| k.clone())
            .collect::<Vec<_>>()
    );

    let persistence = Arc::new(InMemoryPersistenceProvider::new());
    let lock = Arc::new(InMemoryLockProvider::new());
    let queue = Arc::new(InMemoryQueueProvider::new());

    let host = WorkflowHostBuilder::new()
        .use_persistence(persistence as Arc<dyn wfe_core::traits::PersistenceProvider>)
        .use_lock_provider(lock as Arc<dyn wfe_core::traits::DistributedLockProvider>)
        .use_queue_provider(queue as Arc<dyn wfe_core::traits::QueueProvider>)
        .build()
        .unwrap();

    for (key, factory) in compiled.step_factories {
        host.register_step_factory(&key, factory).await;
    }

    host.register_workflow_definition(compiled.definition.clone())
        .await;
    host.start().await.unwrap();

    let instance = run_workflow_sync(
        &host,
        &compiled.definition.id,
        compiled.definition.version,
        serde_json::json!({}),
        Duration::from_secs(1800),
    )
    .await
    .unwrap();

    host.stop().await;
    instance
}
/// Shared env block and volume template for containerd steps.
/// Uses format! to avoid the Rust 2024 reserved `##` token in raw strings.
fn containerd_step_yaml(
    name: &str,
    network: &str,
    pull: &str,
    timeout: &str,
    working_dir: Option<&str>,
    mount_workspace: bool,
    run_script: &str,
) -> String {
    let wfe = "##wfe";
    let wd = working_dir
        .map(|d| format!(" working_dir: {d}"))
        .unwrap_or_default();
    let ws_volume = if mount_workspace {
        " - source: ((workspace))\n target: /workspace"
    } else {
        ""
    };

    format!(
        r#" - name: {name}
   type: containerd
   config:
     image: docker.io/library/debian:bookworm-slim
     containerd_addr: ((containerd_addr))
     user: "0:0"
     network: {network}
     pull: {pull}
     timeout: {timeout}
{wd}
     env:
       CARGO_HOME: /cargo
       RUSTUP_HOME: /rustup
       PATH: /cargo/bin:/rustup/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
     volumes:
       - source: ((cargo_home))
         target: /cargo
       - source: ((rustup_home))
         target: /rustup
{ws_volume}
     run: |
{run_script}
       echo "{wfe}[output {name}.status=ok]"
"#
    )
}
/// Base directory for shared state between the host and the containerd VM.
/// Must be inside the virtiofs mount defined in test/lima/wfe-test.yaml.
fn shared_dir() -> std::path::PathBuf {
    let base = std::env::var("WFE_IO_DIR")
        .map(std::path::PathBuf::from)
        .unwrap_or_else(|_| std::path::PathBuf::from("/tmp/wfe-io"));
    std::fs::create_dir_all(&base).unwrap();
    base
}

/// Create a temporary directory inside the shared mount so containerd can see it.
fn shared_tempdir(name: &str) -> std::path::PathBuf {
    let id = std::time::SystemTime::now()
        .duration_since(std::time::UNIX_EPOCH)
        .unwrap()
        .as_nanos();
    let dir = shared_dir().join(format!("{name}-{id}"));
    std::fs::create_dir_all(&dir).unwrap();
    dir
}
fn make_config(
    addr: &str,
    cargo_home: &Path,
    rustup_home: &Path,
    workspace: Option<&Path>,
) -> HashMap<String, serde_json::Value> {
    let mut config = HashMap::new();
    config.insert(
        "containerd_addr".to_string(),
        serde_json::Value::String(addr.to_string()),
    );
    config.insert(
        "cargo_home".to_string(),
        serde_json::Value::String(cargo_home.to_str().unwrap().to_string()),
    );
    config.insert(
        "rustup_home".to_string(),
        serde_json::Value::String(rustup_home.to_str().unwrap().to_string()),
    );
    if let Some(ws) = workspace {
        config.insert(
            "workspace".to_string(),
            serde_json::Value::String(ws.to_str().unwrap().to_string()),
        );
    }
    config
}
// ---------------------------------------------------------------------------
|
||||
// Minimal: just echo hello in a containerd step through the workflow engine
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
#[tokio::test]
|
||||
#[ignore = "requires containerd daemon"]
|
||||
async fn minimal_echo_in_containerd_via_workflow() {
|
||||
let _ = tracing_subscriber::fmt().with_env_filter("wfe_containerd=debug,wfe_core::executor=debug").try_init();
|
||||
let Some(addr) = containerd_addr() else {
|
||||
eprintln!("SKIP: containerd not available");
|
||||
return;
|
||||
};
|
||||
|
||||
let mut config = HashMap::new();
|
||||
config.insert(
|
||||
"containerd_addr".to_string(),
|
||||
serde_json::Value::String(addr),
|
||||
);
|
||||
|
||||
let wfe = "##wfe";
|
||||
let yaml = format!(
|
||||
r#"workflow:
|
||||
id: minimal-containerd
|
||||
version: 1
|
||||
error_behavior:
|
||||
type: terminate
|
||||
steps:
|
||||
- name: echo
|
||||
type: containerd
|
||||
config:
|
||||
image: docker.io/library/alpine:3.18
|
||||
containerd_addr: ((containerd_addr))
|
||||
user: "0:0"
|
||||
network: none
|
||||
pull: if-not-present
|
||||
timeout: 30s
|
||||
run: |
|
||||
echo hello-from-workflow
|
||||
echo "{wfe}[output echo.status=ok]"
|
||||
"#
|
||||
);
|
||||
|
||||
let instance = run_yaml_workflow_with_config(&yaml, &config).await;
|
||||
|
||||
eprintln!("Status: {:?}, Data: {:?}", instance.status, instance.data);
|
||||
assert_eq!(instance.status, WorkflowStatus::Complete);
|
||||
let data = instance.data.as_object().unwrap();
|
||||
assert_eq!(
|
||||
data.get("echo.status").and_then(|v| v.as_str()),
|
||||
Some("ok"),
|
||||
);
|
||||
}
|
||||
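The `{wfe}[output key=value]` echo is the convention by which a step hands structured outputs back to the engine; the assertions above then read them from `instance.data`. As a rough sketch of that convention — a hypothetical helper, not the engine's actual parser — a marker line can be picked apart like this:

```rust
/// Sketch of the `##wfe[output key=value]` stdout convention used by the
/// tests above. Hypothetical helper, not the engine's real parser.
fn parse_wfe_output(line: &str) -> Option<(&str, &str)> {
    // Accept only lines shaped exactly like `##wfe[output k=v]`.
    let body = line.trim().strip_prefix("##wfe[output ")?.strip_suffix(']')?;
    body.split_once('=')
}
```

Any other stdout line (like `hello-from-workflow` above) yields `None` and is treated as plain log output.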

// ---------------------------------------------------------------------------
// Full Rust CI pipeline in a container: install → build → test → lint → cover
// ---------------------------------------------------------------------------

#[tokio::test]
#[ignore = "requires containerd daemon"]
async fn full_rust_pipeline_in_container() {
    let Some(addr) = containerd_addr() else {
        eprintln!("SKIP: containerd socket not available");
        return;
    };

    let cargo_home = shared_tempdir("cargo");
    let rustup_home = shared_tempdir("rustup");
    let workspace = shared_tempdir("workspace");

    let config = make_config(&addr, &cargo_home, &rustup_home, Some(&workspace));

    let steps = [
        containerd_step_yaml(
            "install-rust", "host", "if-not-present", "10m", None, false,
            " apt-get update && apt-get install -y curl gcc pkg-config libssl-dev\n\
            \x20 curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --profile minimal --default-toolchain stable",
        ),
        containerd_step_yaml(
            "install-tools", "host", "never", "10m", None, false,
            " rustup component add clippy rustfmt llvm-tools-preview\n\
            \x20 cargo install cargo-audit cargo-deny cargo-nextest cargo-llvm-cov",
        ),
        containerd_step_yaml(
            "create-project", "host", "never", "2m", None, true,
            " cargo init /workspace/test-crate --name test-crate\n\
            \x20 cd /workspace/test-crate\n\
            \x20 echo '#[cfg(test)] mod tests { #[test] fn it_works() { assert_eq!(2+2,4); } }' >> src/main.rs",
        ),
        containerd_step_yaml(
            "cargo-fmt", "none", "never", "2m",
            Some("/workspace/test-crate"), true,
            " cargo fmt -- --check || cargo fmt",
        ),
        containerd_step_yaml(
            "cargo-check", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo check",
        ),
        containerd_step_yaml(
            "cargo-clippy", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo clippy -- -D warnings",
        ),
        containerd_step_yaml(
            "cargo-test", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo test",
        ),
        containerd_step_yaml(
            "cargo-build", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo build --release",
        ),
        containerd_step_yaml(
            "cargo-nextest", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo nextest run",
        ),
        containerd_step_yaml(
            "cargo-llvm-cov", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo llvm-cov --summary-only",
        ),
        containerd_step_yaml(
            "cargo-audit", "host", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo audit || true",
        ),
        containerd_step_yaml(
            "cargo-deny", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo deny init\n\
            \x20 cargo deny check || true",
        ),
        containerd_step_yaml(
            "cargo-doc", "none", "never", "5m",
            Some("/workspace/test-crate"), true,
            " cargo doc --no-deps",
        ),
    ];

    let yaml = format!(
        "workflow:\n  id: rust-container-pipeline\n  version: 1\n  error_behavior:\n    type: terminate\n  steps:\n{}",
        steps.join("\n")
    );

    let instance = run_yaml_workflow_with_config(&yaml, &config).await;

    assert_eq!(
        instance.status,
        WorkflowStatus::Complete,
        "workflow should complete successfully, data: {:?}",
        instance.data
    );

    let data = instance.data.as_object().unwrap();

    for key in [
        "install-rust.status",
        "install-tools.status",
        "create-project.status",
        "cargo-fmt.status",
        "cargo-check.status",
        "cargo-clippy.status",
        "cargo-test.status",
        "cargo-build.status",
        "cargo-nextest.status",
        "cargo-llvm-cov.status",
        "cargo-audit.status",
        "cargo-deny.status",
        "cargo-doc.status",
    ] {
        assert_eq!(
            data.get(key).and_then(|v| v.as_str()),
            Some("ok"),
            "step output '{key}' should be 'ok', got: {:?}",
            data.get(key)
        );
    }
}
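The `containerd_step_yaml` helper called above is defined outside this excerpt. Judging from its call sites and the inline workflow YAML in the neighbouring tests, it plausibly renders one `containerd` step as a YAML list item; the following is a hypothetical reconstruction under those assumptions (image name, field order, and volume wiring are guesses, not the module's real definition):

```rust
// Hypothetical reconstruction of `containerd_step_yaml` -- illustrative only.
fn containerd_step_yaml(
    name: &str,
    network: &str,
    pull: &str,
    timeout: &str,
    workdir: Option<&str>,
    mount_workspace: bool,
    script: &str,
) -> String {
    // Fixed header fields, mirroring the inline workflows in this file.
    let mut step = format!(
        "    - name: {name}\n      type: containerd\n      config:\n        \
         image: docker.io/library/debian:bookworm-slim\n        \
         containerd_addr: ((containerd_addr))\n        user: \"0:0\"\n        \
         network: {network}\n        pull: {pull}\n        timeout: {timeout}\n"
    );
    if let Some(dir) = workdir {
        step.push_str(&format!("        workdir: {dir}\n"));
    }
    if mount_workspace {
        // Bind the shared workspace tempdir into the container.
        step.push_str(
            "        volumes:\n          - source: ((workspace))\n            target: /workspace\n",
        );
    }
    step.push_str("        run: |\n");
    step.push_str(script); // caller supplies pre-indented script lines
    step
}
```

This shape explains why the call sites pass their scripts with leading spaces and `\x20` continuations: the script text must already carry the YAML block-scalar indentation.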

// ---------------------------------------------------------------------------
// Focused test: just rust-install in a bare container
// ---------------------------------------------------------------------------

#[tokio::test]
#[ignore = "requires containerd daemon"]
async fn rust_install_in_bare_container() {
    let Some(addr) = containerd_addr() else {
        eprintln!("SKIP: containerd socket not available");
        return;
    };

    let cargo_home = shared_tempdir("cargo");
    let rustup_home = shared_tempdir("rustup");

    let config = make_config(&addr, &cargo_home, &rustup_home, None);

    let wfe = "##wfe";
    let yaml = format!(
        r#"workflow:
  id: rust-install-container
  version: 1
  error_behavior:
    type: terminate
  steps:
    - name: install
      type: containerd
      config:
        image: docker.io/library/debian:bookworm-slim
        containerd_addr: ((containerd_addr))
        user: "0:0"
        network: host
        pull: if-not-present
        timeout: 10m
        env:
          CARGO_HOME: /cargo
          RUSTUP_HOME: /rustup
          PATH: /cargo/bin:/rustup/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
        volumes:
          - source: ((cargo_home))
            target: /cargo
          - source: ((rustup_home))
            target: /rustup
        run: |
          apt-get update && apt-get install -y curl
          curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y --profile minimal --default-toolchain stable
          rustc --version
          cargo --version
          echo "{wfe}[output rustc_installed=true]"

    - name: verify
      type: containerd
      config:
        image: docker.io/library/debian:bookworm-slim
        containerd_addr: ((containerd_addr))
        user: "0:0"
        network: none
        pull: if-not-present
        timeout: 2m
        env:
          CARGO_HOME: /cargo
          RUSTUP_HOME: /rustup
          PATH: /cargo/bin:/rustup/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
        volumes:
          - source: ((cargo_home))
            target: /cargo
          - source: ((rustup_home))
            target: /rustup
        run: |
          rustc --version
          cargo --version
          echo "{wfe}[output verify.status=ok]"
"#
    );

    let instance = run_yaml_workflow_with_config(&yaml, &config).await;

    assert_eq!(
        instance.status,
        WorkflowStatus::Complete,
        "install workflow should complete, data: {:?}",
        instance.data
    );

    let data = instance.data.as_object().unwrap();
    eprintln!("Workflow data: {:?}", instance.data);
    assert!(
        data.get("rustc_installed").is_some(),
        "rustc_installed should be set, got data: {:?}",
        data
    );
    assert_eq!(
        data.get("verify.status").and_then(|v| v.as_str()),
        Some("ok"),
    );
}

@@ -158,7 +158,8 @@ workflows:
       config:
         run: |
           cd "$WORKSPACE_DIR"
-          cargo nextest run -p wfe-yaml --features buildkit,containerd -P ci
+          cargo nextest run -p wfe-yaml --features buildkit,containerd,rustlang -P ci
+          cargo nextest run -p wfe-rustlang -P ci

 # ─── Workflow: test-integration ──────────────────────────────────

@@ -299,12 +300,12 @@ workflows:
           }
         fi

-        # Wait for sockets to be available
+        # Wait for TCP proxy ports (socat bridges to containerd/buildkit sockets)
         for i in $(seq 1 30); do
-          if [ -S "$HOME/.lima/wfe-test/sock/buildkitd.sock" ]; then
+          if curl -sf http://127.0.0.1:2500 >/dev/null 2>&1 || [ $? -eq 56 ]; then
             break
           fi
-          echo "Waiting for buildkitd socket... ($i/30)"
+          echo "Waiting for containerd TCP proxy... ($i/30)"
           sleep 2
         done

@@ -320,7 +321,7 @@ workflows:
       config:
         run: |
           cd "$WORKSPACE_DIR"
-          export WFE_BUILDKIT_ADDR="unix://$HOME/.lima/wfe-test/sock/buildkitd.sock"
+          export WFE_BUILDKIT_ADDR="http://127.0.0.1:2501"
           cargo nextest run -p wfe-buildkit -P ci
           echo "##wfe[output buildkit_ok=true]"

@@ -334,8 +335,11 @@ workflows:
       config:
         run: |
           cd "$WORKSPACE_DIR"
-          export WFE_CONTAINERD_ADDR="unix://$HOME/.lima/wfe-test/sock/containerd.sock"
+          export WFE_CONTAINERD_ADDR="http://127.0.0.1:2500"
+          export WFE_IO_DIR="/tmp/wfe-io"
+          mkdir -p "$WFE_IO_DIR"
           cargo nextest run -p wfe-containerd -P ci
+          cargo nextest run -p wfe-yaml --features rustlang,containerd --test rustlang_containerd -P ci -- --ignored
           echo "##wfe[output containerd_ok=true]"

       ensure:

@@ -475,7 +479,7 @@ workflows:
           cd "$WORKSPACE_DIR"
           for crate in wfe-core wfe-sqlite wfe-postgres wfe-opensearch wfe-valkey \
                        wfe-buildkit-protos wfe-containerd-protos wfe-buildkit wfe-containerd \
-                       wfe wfe-yaml; do
+                       wfe-rustlang wfe wfe-yaml; do
             echo "Packaging $crate..."
             cargo package -p "$crate" --no-verify --allow-dirty 2>&1 || exit 1
           done

@@ -619,7 +623,7 @@ workflows:
           exit 0
           cd "$WORKSPACE_DIR"
           REGISTRY="${REGISTRY:-sunbeam}"
-          for crate in wfe-buildkit wfe-containerd; do
+          for crate in wfe-buildkit wfe-containerd wfe-rustlang; do
            echo "Publishing $crate..."
            cargo publish -p "$crate" --registry "$REGISTRY" 2>&1 || echo "Already published: $crate"
          done
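The readiness loop in the `@@ -299,12 +300,12 @@` hunk probes the socat TCP proxy with `curl`, treating exit code 56 (connection accepted but no HTTP reply) as success — for a gRPC port, a completed TCP handshake is all that matters. The same probe can be sketched in plain Rust (an illustrative helper, not part of the engine):

```rust
use std::net::{SocketAddr, TcpStream};
use std::thread::sleep;
use std::time::Duration;

/// Poll a TCP endpoint until something accepts a connection, mirroring the
/// `for i in $(seq 1 30); do ... sleep 2; done` loop in the workflow above.
/// Illustrative helper; attempt counts and delays are arbitrary.
fn wait_for_port(addr: SocketAddr, attempts: u32, delay: Duration) -> bool {
    for i in 1..=attempts {
        // A successful connect is enough: like curl exiting 56, we only
        // care that a listener exists, not that it speaks HTTP.
        if TcpStream::connect_timeout(&addr, Duration::from_secs(1)).is_ok() {
            return true;
        }
        eprintln!("Waiting for containerd TCP proxy... ({i}/{attempts})");
        sleep(delay);
    }
    false
}
```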