Declarative YAML Specs
Instead of writing Rust code, you can define runs as YAML files and execute them directly with the CLI.
1. Supported kinds
The runtime currently supports four top-level spec kinds:
- `agent` — a single agent run inside one isolated VM
- `pipeline` — multiple agent boxes composed into explicit stages
- `workflow` — command-oriented workflow steps executed in a shared sandbox
- `sandbox` — a bare VM definition used for interactive or shell-oriented flows
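Only the `agent` and `pipeline` kinds are shown in full below. As an illustration, a bare `sandbox` spec might look like the following sketch; the top-level layout and field names are assumptions carried over from the `sandbox` block of the agent example, not a confirmed schema:

```yaml
# Hypothetical sketch of a bare sandbox spec. The field names are
# assumed from the sandbox block of the agent example below.
api_version: v1
kind: sandbox
name: dev-shell

sandbox:
  mode: auto
  memory_mb: 512
  vcpus: 1
  network: true
```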
2. Agent spec
A single agent spec declares the sandbox, LLM provider, prompt, and skills:
```yaml
api_version: v1
kind: agent
name: hn_researcher

sandbox:
  mode: auto
  memory_mb: 1024
  vcpus: 1
  network: true

llm:
  provider: claude

agent:
  prompt: >
    Analyze current HackerNews top stories and produce
    a tactical briefing for AI engineering teams.
  skills:
    - "file:examples/hackernews/skills/hackernews-api.md"
  timeout_secs: 600
```
3. Pipeline spec
A pipeline spec chains multiple boxes with explicit stage ordering:
```yaml
api_version: v1
kind: pipeline
name: quick-pipeline

sandbox:
  mode: auto
  memory_mb: 512
  network: true

pipeline:
  boxes:
    - name: analyst
      prompt: Extract key facts from the input.
      skills: ["agent:claude-code"]
      timeout_secs: 300
    - name: writer
      prompt: Turn facts into a concise executive summary.
      skills: ["agent:claude-code"]
      timeout_secs: 300
  stages:
    - type: box
      name: analyst
    - type: box
      name: writer
```
4. Running with the CLI
From a clone of void-box, use `voidbox run --file` to execute any spec file:

```shell
voidbox run --file examples/hackernews/hackernews_agent.yaml
voidbox run --file examples/specs/pipeline.yaml
```
5. Environment overrides
Override the LLM provider at runtime without changing the spec:
```shell
VOIDBOX_LLM_PROVIDER=ollama VOIDBOX_LLM_MODEL=phi4-mini \
  voidbox run --file examples/specs/pipeline.yaml
```
Environment variables take precedence over spec-level `llm.provider` settings.
6. Next
Learn how to compose pipelines in Rust, or set up local LLMs with Ollama.