16 Commits

Author SHA1 Message Date
b7dfdb18e0 fix: escape pipe chars in mermaid TUI diagram 2026-03-24 13:07:15 +00:00
7f5c27a868 docs: add coding agent section to README and docs index 2026-03-24 13:02:16 +00:00
789a08a353 docs: add sunbeam code terminal coding agent documentation
Comprehensive doc covering project discovery, symbol indexing, tool
execution with permissions, LSP auto-detection, TUI layout, session
resumption, reindex-code command, and the three-layer architecture.
2026-03-24 12:58:51 +00:00
04f10d2794 feat: sunbeam reindex-code CLI verb + ReindexCode proto
Proto: ReindexCode RPC with org/repo/branch filters.
CLI: sunbeam reindex-code [--org studio] [--repo owner/name] [--endpoint ...]
Calls Sol's gRPC ReindexCode endpoint, prints indexed symbol count.
2026-03-24 09:38:02 +00:00
8726e8fbe7 feat(lsp): client-side LSP toolkit with 5 tools + integration tests
LSP client (lsp/client.rs):
- JSON-RPC framing over subprocess stdio
- Async request/response with oneshot channels
- Background read loop routing responses to pending requests
- 30s timeout per request, graceful shutdown

LSP manager (lsp/manager.rs):
- Auto-detect: Cargo.toml → rust-analyzer, package.json → tsserver,
  pyproject.toml → pyright, go.mod → gopls
- Initialize handshake, lazy textDocument/didOpen
- High-level methods: definition, references, hover, document_symbols,
  workspace_symbols
- Graceful degradation when binary not on PATH

LSP tools (tools.rs):
- lsp_definition, lsp_references, lsp_hover, lsp_diagnostics, lsp_symbols
- execute_lsp() async dispatch, is_lsp_tool() check
- All routed as ToolSide::Client in orchestrator

Tool schemas registered in Sol's build_tool_definitions() for Mistral.

Integration tests (6 new):
- Language detection for Rust project
- is_lsp_tool routing
- LSP initialize + hover on src/main.rs
- Document symbols (finds main function)
- Workspace symbols with retry (waits for rust-analyzer indexing)
- Graceful degradation with bad project path
2026-03-24 00:58:05 +00:00
73d7d6c15b feat(code): tree-sitter symbol extraction + auto-indexing
Symbol extraction (symbols.rs):
- tree-sitter parsers for Rust, TypeScript, Python
- Extracts: functions, structs, enums, traits, classes, interfaces
- Signatures, docstrings, line ranges for each symbol
- extract_project_symbols() walks project directory
- Skips hidden/vendor/target/node_modules, files >100KB

Proto: IndexSymbols + SymbolEntry messages for client→server symbol relay

Client: after SessionReady, extracts symbols and sends IndexSymbols
to Sol for indexing into the code search index.

14 unit tests for symbol extraction across Rust/TS/Python.
2026-03-24 00:42:03 +00:00
c6d6dbe5c8 fix(tests): update mock SessionReady with resumed + history fields 2026-03-23 21:45:03 +00:00
32f6ebacea feat(tui): wire approval prompt with key handlers
- ApprovalPrompt gains call_id for routing decisions
- Up/Down navigates options, Enter selects
- "yes, always allow {tool}" sends ApprovedAlways
- Input/cursor blocked while approval prompt is active
- AgentEvent::ApprovalNeeded populates the prompt
2026-03-23 21:35:35 +00:00
5f1fb09abb feat(client): emit ChatEvent::ToolCall with approval metadata
ToolCall events carry call_id, name, args, needs_approval — agent
layer uses these to route through the permission/approval flow.
2026-03-23 21:34:57 +00:00
8e73d52776 feat(agent): approval channel + per-tool permission checks
- ApprovalDecision enum (Approved/Denied/ApprovedAlways)
- Approval channel (crossbeam) from TUI to agent loop
- Agent checks config.permission_for() on each client tool call
- "always" auto-executes, "never" auto-denies, "ask" prompts
- ApprovedAlways upgrades session permission for future calls
- Unit tests for permissions, decisions, error messages
2026-03-23 21:27:10 +00:00
e06f74ed5e feat(config): permission_for() + upgrade_to_always()
LoadedConfig gains methods for tool approval policy:
- permission_for(tool_name) → "always" | "ask" | "never"
- upgrade_to_always(tool_name) — session-only override
2026-03-23 21:24:33 +00:00
d7c5a677da feat(code): friendly errors, batch history, persistent command history
- Agent errors sanitized: raw hyper/h2/gRPC dumps replaced with
  human-readable messages ("sol disconnected", "connection lost", etc.)
- Batch history loading: single viewport rebuild instead of per-entry
- Persistent command history: saved to .sunbeam/history, loaded on start
- Default model: mistral-medium-latest (personality adherence)
2026-03-23 17:08:24 +00:00
8b4f187d1b feat(code): async agent bus, virtual viewport, event drain
- Agent service (crossbeam channels): TUI never blocks on gRPC I/O.
  Chat runs on a background tokio task, events flow back via bounded
  crossbeam channel. Designed as a library-friendly internal RPC.

- Virtual viewport: pre-wrap text with textwrap on content/width change,
  slice only visible rows for rendering. Paragraph gets no Wrap, no
  scroll() — pure O(viewport) per frame.

- Event drain loop: coalesce all queued terminal events before drawing.
  Filters MouseEventKind::Moved (crossterm's EnableMouseCapture floods
  these via ?1003h any-event tracking). Single redraw per batch.

- Conditional drawing: skip frames when nothing changed (needs_redraw).

- Mouse wheel + PageUp/Down + Home/End scrolling, command history
  (Up/Down, persistent to .sunbeam/history), Alt+L debug log overlay.

- Proto: SessionReady now includes history entries + resumed flag.
  Session resume loads conversation from Matrix room on reconnect.

- Default model: devstral-small-latest (was devstral-small-2506).
2026-03-23 15:57:15 +00:00
cc9f169264 feat(code): wire TUI into real code path, /exit, color swap
- user input: white text, dim > prompt
- sol responses: warm yellow
- /exit slash command quits cleanly
- TUI replaces stdin loop in sunbeam code start
- hidden demo mode for testing (sunbeam code demo)
2026-03-23 12:53:34 +00:00
02e4d7fb37 feat(code): CLI client with gRPC connection + local tools
phase 3 client core:
- sunbeam code subcommand with project discovery, config loading
- gRPC client connects to Sol, starts bidirectional session
- 6 client-side tool executors: file_read, file_write, search_replace,
  grep, bash, list_directory
- project context: .sunbeam/prompt.md, .sunbeam/config.toml, git info
- tool permission config (always/ask/never per tool)
- simple stdin loop (ratatui TUI in phase 4)
- aligned sunbeam-proto to tonic 0.14
2026-03-23 11:57:24 +00:00
f3e67e589b feat(code): add sunbeam-proto crate with gRPC service definition
shared protobuf definitions for the sunbeam code agent:
- CodeAgent service with bidirectional Session streaming
- ClientMessage: StartSession, UserInput, ToolResult, ToolApproval
- ServerMessage: TextDelta, ToolCall, ApprovalNeeded, Status
- ToolDef for client-side tool registration

this is the transport layer between `sunbeam code` (TUI client)
and Sol (server-side agent loop). next: JWT middleware for OIDC
auth, Sol gRPC server stub, CLI subcommand stub.
2026-03-23 11:12:51 +00:00
44 changed files with 6398 additions and 2216 deletions

.gitignore (vendored)

@@ -11,4 +11,3 @@ build/
# Environment
.envrc
.DS_Store

.sunbeam/history (new file)

@@ -0,0 +1,78 @@
hmm
just testing the ux
/exit
/exit
hmm, scrolling is very slow. needs to be async
/exit
/exit
/exit
/exit
/exit
/exit
/exit
[<35;52;20M/exit
/exit
/exit
hey you
who are you?
hmm.
that's not right.
you're supposed to be `sol`
what's on page 2 of hackernews today?
don't you have fetch tooling?
/exit
hey.
hey
/exot
/exit
hey
say hello from sunbeam code!
can you search the web for me and tell me what's on page 2 of hackernews?
/exit
hey boo
tell me about yourself
can you please do some googling and some research to see if i can use devstral-medium as an agent?
/exit
/exit
hey can you give me a full demo of markdown text please? i'm accessing you from a terminal and want to make sure my text processing is working as expected
no i mean give me a bunch of markdown artifacts that i can use to test the markdown renderer with
i don't want you to write a file, i want you to output it as text.
DO NOT FUCKING NEST IT IN MARKDOWN BLOCKS YOU STUPID FUCKING CHUD JUST GIVE ME RAW FUCKING MARKDOWN
imagine you are a terminal output and you needed to output exactly what the user is asking for, which are native markdown tokens. do that
that's exactly what i needed, thank you
/exit
hey
can you please do a deep dive into the mechanics of black holes?
i would like a technical breakdown
/exit
yeah, run me though paradox resolutions
/exit
yeah please dive into the most recent 3-4 so i can understand them
/exit
hello
/exit
tell me about astrophysics
give me a deep dive into black holes
/exit
go deeper into the paradoxes
yeah zoom in on ER=EPR
hey
/exit
yes please do
/exit
hey you.
what's up?
how are you today?
what's the weather in amsterdam right now?
/exit
hey.
what can you tell me about black holes?
/exit
yo dawg
/exit
yo dawg
/exit
hey beautiful
yes
idk, mostly i'm just tryna figure your ui/ux out. cuz you know you're a coding bot in this context, yeah?
what are you up to?


@@ -1,108 +0,0 @@
# Changelog
## v1.1.2
- 30dc4f9 fix(opensearch): make ML model registration idempotent
- 3d2d16d feat(secrets): add xchacha20-poly1305 cipher key seeding for Kratos
- 80ab6d6 feat: enable Meet external API, fix SDK path
- b08a80d refactor: nest infra commands under `sunbeam platform`
## v1.1.1
- cd80a57 fix: DynamicBearer auth, retry on 500/429, upload resilience
- de5c807 fix: progress bar tracks files not bytes, retry on 502, dedup folders
- 2ab2fd5 fix: polish Drive upload progress UI
- 27536b4 feat: parallel Drive upload with indicatif progress UI
## v1.1.0
- 477006e chore: bump to v1.1.0, update package description
- ca0748b feat: encrypted vault keystore, JWT auth, Drive upload
- 13e3f5d fix opensearch pod resolution + sol-agent vault policy
- faf5255 feat: async SunbeamClient factory with unified auth resolution
## v1.0.1
- 34647e6 feat: seed Sol agent vault policy + gitea creds, bump v1.0.1
## v1.0.0
- 051e17d chore: bump to v1.0.0, drop native-tls for pure rustls
- 7ebf900 feat: wire 15 service subcommands into CLI, remove old user command
- f867805 feat: CLI modules for all 25+ service clients
- 3d7a2d5 feat: OutputFormat enum + render/render_list/read_json_input helpers
- 756fbc5 chore: update Cargo.lock
- 97976e0 fix: include build module (was gitignored)
- f06a167 feat: BuildKit client + integration test suite (651 tests)
- b60e22e feat: La Suite clients — 7 DRF services (75 endpoints)
- 915f0b2 feat: monitoring clients — Prometheus, Loki, Grafana (57 endpoints)
- 21f9e18 feat: LiveKitClient — real-time media API (15 endpoints + JWT)
- a33697c feat: S3Client — object storage API (21 endpoints)
- 329c18b feat: OpenSearchClient — search and analytics API (60 endpoints)
- 2888d59 feat: MatrixClient — chat and collaboration API (80 endpoints)
- 890d7b8 feat: GiteaClient — unified git forge API (50+ endpoints)
- c597234 feat: HydraClient — OAuth2/OIDC admin API (35 endpoints)
- f0bc363 feat: KratosClient — identity management (30 endpoints)
- 6823772 feat: ServiceClient trait, HttpTransport, and SunbeamClient factory
- 31fde1a fix: forge URL derivation for bare IP hosts, add Cargo registry config
- 46d2133 docs: update README for Rust workspace layout
- 3ef3fc0 feat: Python upstream — Sol bot registration TODO
- e0961cc refactor: binary crate — thin main.rs + cli.rs dispatch
- 8e5d295 refactor: SDK small command modules — services, cluster, manifests, gitea, update, auth
- 6c7e1cd refactor: SDK users, pm, and checks modules with submodule splits
- bc65b91 refactor: SDK images and secrets modules with submodule splits
- 8e51e0b refactor: SDK kube, openbao, and tools modules
- b92700d refactor: SDK core modules — error, config, output, constants
- 2ffedb9 refactor: workspace scaffolding — sunbeam-sdk + sunbeam binary crate
- b6daf60 chore: suppress dead_code warning on exit code constants
- b92c6ad feat: Python upstream — onboard/offboard, mailbox, Projects, --no-cache
- 8d6e815 feat: --no-cache build flag and Sol build target
- f75f61f feat: user provisioning — mailbox, Projects, welcome email
- c6aa1bd feat: complete pm subcommands with board discovery and user resolution
- ffc0fe9 feat: split auth into sso/git, Planka token exchange, board discovery
- ded0ab4 refactor: remove --env flag, use --context like kubectl
- 88b02ac feat: kubectl-style contexts with per-domain auth tokens
- 3a5e1c6 fix: use predictable client_id via pre-seeded K8s secret
- 1029ff0 fix: auth login UX — timeout, Ctrl+C, suppress K8s error, center HTML
- 43b5a4e fix: URL-encode scope parameter with %20 instead of +
- 7fab2a7 fix: auth login domain resolution with --domain flag
- 184ad85 fix: install rustls ring crypto provider at startup
- 5bdb789 feat: unified project management across Planka and Gitea
- d4421d3 feat: OAuth2 CLI authentication with PKCE and token caching
- aad469e fix: stdin password, port-forward retry, seed advisory lock
- dff4588 fix: employee ID pagination, add async tests
- 019c73e fix: S3 auth signature tested against AWS reference vector
- e95ee4f fix: rewrite users.rs to fully async (was blocking tokio runtime)
- 24e98b4 fix: CNPG readiness, DKIM SPKI format, kv_patch, container name
- 6ec0666 fix: SSH tunnel leak, cmd_bao injection, discovery cache, DNS async
- bcfb443 refactor: deduplicate constants, fix secret key mismatch, add VSS pruning
- 503e407 feat: implement OpenSearch ML setup and model_id injection
- bc5eeaa feat: implement secrets.rs with OpenBao HTTP API
- 7fd8874 refactor: migrate all modules from anyhow to SunbeamError
- cc0b6a8 refactor: add thiserror error tree and tracing logging
- ec23568 feat: Phase 2 feature modules + comprehensive test suite (142 tests)
- 42c2a74 feat: Phase 1 foundations — kube-rs client, OpenBao HTTP client, self-update
- 80c67d3 feat: Rust rewrite scaffolding with embedded kustomize+helm
- d5b9632 refactor: cross-platform tool downloads, configurable infra dir and ACME email
- c82f15b feat: add tuwunel/matrix support with OpenSearch ML post-apply hooks
- 928323e fix(cli): unify proxy build path, fix Gitea password sync
- 956a883 chore: added AGENTS.md file for various models.
- 507b4d3 feat(config): add production host and infrastructure directory configuration
- cbf5c12 docs: update repository URLs to use HTTPS remotes for src.sunbeam.pt
- 133fc98 docs: add comprehensive README with professional documentation
- 33d7774 chore: added license
- 1a97781 docs: add comprehensive documentation for sunbeam CLI
- 28c266e feat(cli): partial apply with namespace filter
- 2569978 feat(cli): meet build/seed support, production kube tunnel, gitea OIDC bootstrap
- c759f2c feat(users): add disable/enable lockout commands; fix table output
- cb5a290 feat: auto-restart deployments on ConfigMap change after sunbeam apply
- 1a3df1f feat: add sunbeam build integration target
- de12847 feat: add impress image mirroring and docs secret seeding
- 14dd685 feat: add kratos-admin-ui build target and user management commands
- b917aa3 fix: specify -c openbao container in cmd_bao kubectl exec
- 352f0b6 feat: add sunbeam k8s kubectl passthrough; fix kube_exec container arg
- fb3fd93 fix: sunbeam apply and bootstrap reliability
- 0acbf66 check: rewrite seaweedfs probe with S3 SigV4 auth
- 6bd59ab sunbeam check: parallel execution, 5s timeout, external S3 check
- 39a2f70 Fix sunbeam check: group by namespace, never crash on network errors
- 1573faa Add sunbeam check verb with service-level health probes

Cargo.lock (generated): diff suppressed because it is too large

Cargo.toml

@@ -1,3 +1,3 @@
[workspace]
members = ["sunbeam-sdk", "sunbeam"]
members = ["sunbeam-sdk", "sunbeam", "sunbeam-proto"]
resolver = "3"


@@ -157,6 +157,17 @@ sunbeam check # Run all functional probes
sunbeam check devtools # Scoped to namespace
```
### Coding Agent
```bash
sunbeam code # Terminal coding agent (connects to Sol via gRPC)
sunbeam code start --model devstral-small # Override model
sunbeam code demo # Demo TUI without Sol connection
sunbeam reindex-code --org studio # Index repos into Sol's code search
```
See [docs/sol-code.md](docs/sol-code.md) for full documentation.
### Passthrough
```bash


@@ -57,6 +57,7 @@ sunbeam logs ory/kratos
## Documentation Structure
- **[CLI Reference](cli-reference)**: Complete command reference
- **[Sol Code](sol-code)**: Terminal coding agent powered by Sol
- **[Core Modules](core-modules)**: Detailed module documentation
- **[Architecture](architecture)**: System architecture and design
- **[Usage Examples](usage-examples)**: Practical usage scenarios

docs/sol-code.md (new file)

@@ -0,0 +1,205 @@
# sunbeam code — Terminal Coding Agent
`sunbeam code` is a terminal-based coding agent powered by Sol. It connects to Sol's gRPC `CodeAgent` service and provides an interactive TUI for writing code, asking questions, and executing tools — with Sol handling the AI reasoning and the CLI handling local file operations.
## Quick Start
```bash
sunbeam code # start a session (auto-detects project)
sunbeam code start --model devstral-small # override the model
sunbeam code start --endpoint http://sol:50051 # custom Sol endpoint
sunbeam code demo # demo the TUI without Sol
```
## How It Works
```mermaid
sequenceDiagram
participant User
participant TUI as sunbeam code TUI
participant Agent as Background Agent
participant Sol as Sol gRPC
User->>TUI: sunbeam code
TUI->>TUI: Discover project context
TUI->>Agent: Spawn background tasks
Agent->>Sol: StartSession (project, capabilities)
Agent->>Sol: IndexSymbols (tree-sitter symbols)
Sol-->>Agent: SessionReady (session_id, model)
Agent-->>TUI: Connected
User->>TUI: Type message, press Enter
TUI->>Agent: Chat request
Agent->>Sol: UserInput (text)
loop Tool calls
Sol-->>Agent: ToolCall (is_local=true)
Agent->>Agent: Check permissions
alt needs approval
Agent-->>TUI: Show approval prompt
User->>TUI: yes / always / no
TUI->>Agent: Decision
end
Agent->>Agent: Execute tool locally
Agent->>Sol: ToolResult
end
Sol-->>Agent: TextDone (response + tokens)
Agent-->>TUI: Display response
```
## Project Discovery
On startup, the CLI discovers project context from the current working directory:
- **Project name** — directory basename
- **Custom instructions** — `.sunbeam/prompt.md` (injected into Sol's system prompt)
- **Tool configuration** — `.sunbeam/config.toml` (model + tool permissions)
- **Git state** — current branch + `git status --short`
- **File tree** — recursive scan (max depth 2, skips `target/`, `node_modules/`, hidden dirs)
All of this is sent to Sol in the `StartSession` message so it has full project context.
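The file-tree scan described above can be sketched as a small predicate. This is a dependency-free sketch, not the CLI's actual implementation; the skip list and the `include_in_tree` name are assumptions based on the behavior described here.

```rust
use std::path::Path;

// Directories excluded from the StartSession file tree (assumed list,
// matching the "skips target/, node_modules/, hidden dirs" rule above).
const SKIP_DIRS: &[&str] = &["target", "node_modules"];

/// Decide whether a directory entry belongs in the file tree:
/// max depth 2, no hidden entries, no build/vendor directories.
fn include_in_tree(path: &Path, depth: usize) -> bool {
    if depth > 2 {
        return false;
    }
    match path.file_name().and_then(|n| n.to_str()) {
        Some(name) => !name.starts_with('.') && !SKIP_DIRS.contains(&name),
        None => false,
    }
}

fn main() {
    println!("{}", include_in_tree(Path::new("src"), 1));
    println!("{}", include_in_tree(Path::new("target"), 1));
    println!("{}", include_in_tree(Path::new(".sunbeam"), 1));
}
```

A real walker would apply this predicate recursively while collecting relative paths into the `file_tree` field of `StartSession`.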
## Symbol Indexing
After connecting, the CLI extracts code symbols from the project using tree-sitter and sends them to Sol via `IndexSymbols`. Sol indexes these in OpenSearch for code search during the session.
Supported languages:
- **Rust** — functions, structs, enums, traits
- **TypeScript/JavaScript** — functions, classes, interfaces, types
- **Python** — functions, classes, methods
Each symbol includes name, kind, signature, docstring, line numbers, and a preview of the body.
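The eligibility rules for indexing (supported language, files under 100 KB) can be sketched as two small functions. The extension map and function names are assumptions for illustration; the real extractor uses tree-sitter parsers rather than extension matching alone.

```rust
/// Map a file extension to a supported language (assumed mapping,
/// mirroring the Rust / TypeScript / Python support listed above).
fn language_for(path: &str) -> Option<&'static str> {
    let ext = path.rsplit('.').next()?;
    match ext {
        "rs" => Some("rust"),
        "ts" | "tsx" | "js" | "jsx" => Some("typescript"),
        "py" => Some("python"),
        _ => None,
    }
}

/// A file is indexed only if it has a supported language and is <= 100 KB,
/// per the size limit mentioned in the symbol-extraction commit.
fn should_index(path: &str, size_bytes: u64) -> bool {
    size_bytes <= 100 * 1024 && language_for(path).is_some()
}

fn main() {
    println!("{:?}", language_for("src/main.rs"));
    println!("{}", should_index("huge.rs", 200 * 1024));
}
```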
## Tool Execution
Sol decides which tools to call. Tools marked `is_local=true` execute on your machine; everything else runs on the server.
### Client-Side Tools
| Tool | Default Permission | Description |
|------|-------------------|-------------|
| `file_read` | always | Read file contents (with optional line ranges) |
| `file_write` | ask | Write or create files |
| `search_replace` | ask | Apply SEARCH/REPLACE diffs to files |
| `grep` | always | Search files with ripgrep or grep |
| `bash` | ask | Execute shell commands |
| `list_directory` | always | List directory tree (with depth limit) |
### LSP Tools
Auto-detected based on project files:
| Project File | Server |
|-------------|--------|
| `Cargo.toml` | `rust-analyzer` |
| `package.json` or `tsconfig.json` | `typescript-language-server` |
| `pyproject.toml`, `setup.py`, `requirements.txt` | `pyright-langserver` |
| `go.mod` | `gopls` |
LSP tools: `lsp_definition`, `lsp_references`, `lsp_hover`, `lsp_diagnostics`, `lsp_symbols`. These are advertised as client capabilities in `StartSession` — Sol only registers tools for LSP servers the client can actually spawn.
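The auto-detection table above amounts to a lookup over project marker files. A minimal sketch (the real manager additionally checks that the server binary is actually on PATH before advertising the tools, and degrades gracefully when it is not):

```rust
/// Pick an LSP server binary from project marker files, following the
/// detection table above. Ordering is an assumption for this sketch.
fn detect_lsp_server(project_files: &[&str]) -> Option<&'static str> {
    let has = |f: &str| project_files.contains(&f);
    if has("Cargo.toml") {
        Some("rust-analyzer")
    } else if has("package.json") || has("tsconfig.json") {
        Some("typescript-language-server")
    } else if has("pyproject.toml") || has("setup.py") || has("requirements.txt") {
        Some("pyright-langserver")
    } else if has("go.mod") {
        Some("gopls")
    } else {
        None
    }
}

fn main() {
    println!("{:?}", detect_lsp_server(&["Cargo.toml", "src"]));
    println!("{:?}", detect_lsp_server(&["README.md"]));
}
```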
### Server-Side Tools
Sol can also call its own server-side tools during coding sessions: `search_code`, `search_archive`, `search_web`, `research`, and others. These execute on Sol's side — no local action needed.
## Tool Permissions
Configure in `.sunbeam/config.toml`:
```toml
[model]
name = "devstral-2" # override default model
[tools]
file_read = "always" # always, ask, never
file_write = "ask"
bash = "never" # block shell commands entirely
search_replace = "ask"
grep = "always"
list_directory = "always"
```
Permissions:
- **`always`** — execute immediately, no prompt
- **`ask`** — show approval prompt with three choices: *yes*, *yes, always allow*, *no*
- **`never`** — deny without prompting; Sol receives an error response
Choosing "yes, always allow" upgrades the permission to `always` for the rest of the session (in-memory only).
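The `permission_for()` / `upgrade_to_always()` pair from the config commit can be sketched as follows. The struct shape and the fallback policy for unknown tools are assumptions, not the real `LoadedConfig`:

```rust
/// In-memory tool policy: configured values plus session-only upgrades.
struct ToolPolicy {
    configured: std::collections::HashMap<String, String>,
    session_always: Vec<String>,
}

impl ToolPolicy {
    /// Session upgrades win over configured values. Unknown tools
    /// default to "ask" here (an assumption for this sketch).
    fn permission_for(&self, tool: &str) -> &str {
        if self.session_always.iter().any(|t| t == tool) {
            return "always";
        }
        self.configured.get(tool).map(String::as_str).unwrap_or("ask")
    }

    /// "yes, always allow": in-memory upgrade for the rest of the session.
    fn upgrade_to_always(&mut self, tool: &str) {
        self.session_always.push(tool.to_string());
    }
}

fn main() {
    let mut p = ToolPolicy {
        configured: std::collections::HashMap::from([("bash".into(), "ask".into())]),
        session_always: Vec::new(),
    };
    println!("{}", p.permission_for("bash"));
    p.upgrade_to_always("bash");
    println!("{}", p.permission_for("bash"));
}
```

Because the upgrade lives only in `session_always`, it vanishes when the session ends, matching the in-memory-only behavior described above.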
## TUI
```mermaid
flowchart TD
subgraph Layout
title["Title Bar<br/>project, branch, model, tokens, connection"]
conversation["Conversation Area<br/>user + assistant messages,<br/>tool output, status"]
input["Input Bar<br/>current line"]
end
title --> conversation
conversation --> input
```
**Key bindings:**
| Key | Action |
|-----|--------|
| Enter | Send message |
| Ctrl+C | Quit |
| Alt+L | Toggle debug log view |
| Up/Down | Navigate input history |
| Page Up/Down | Scroll conversation |
The TUI shows real-time status updates as Sol thinks and executes tools. Approval prompts appear inline when a tool needs permission.
## Session Resumption
Sessions are tied to a project path + git branch. If a session already exists for the current context, Sol resumes it — the TUI loads conversation history and you can continue where you left off.
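Since a session is keyed by project path plus branch, the lookup key could be derived along these lines. The exact key format Sol uses is an assumption; this only illustrates the identity rule:

```rust
/// Derive a session identity from project path + git branch
/// (hypothetical format; Sol's real key derivation may differ).
fn session_key(project_path: &str, branch: &str) -> String {
    format!("{}#{}", project_path.trim_end_matches('/'), branch)
}

fn main() {
    println!("{}", session_key("/home/me/cli/", "main"));
}
```

Switching branches in the same directory yields a different key, so each branch gets its own resumable conversation.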
## Code Reindexing
Separately from coding sessions, you can trigger repo indexing into Sol's code search:
```bash
sunbeam reindex-code # all repos
sunbeam reindex-code --org studio # specific org
sunbeam reindex-code --repo studio/sol --branch main # specific repo + branch
```
This calls Sol's `ReindexCode` gRPC endpoint, which walks Gitea repos, extracts symbols via tree-sitter, and indexes them to OpenSearch.
## Architecture
The `sunbeam code` command is structured as three concurrent layers:
```mermaid
flowchart TD
subgraph "Main Thread"
tui[TUI Event Loop<br/>ratatui, 50ms poll]
end
subgraph "Tokio Runtime"
agent[Agent Loop<br/>chat processing,<br/>tool execution]
heartbeat[Heartbeat<br/>1s ping to Sol]
end
subgraph "Sol (remote)"
grpc_service[gRPC CodeAgent]
orchestrator[Orchestrator]
mistral[Mistral AI]
end
tui <--> |crossbeam channels| agent
agent <--> |gRPC stream| grpc_service
heartbeat --> |health check| grpc_service
grpc_service --> orchestrator
orchestrator --> mistral
```
- **TUI** (main thread) — Ratatui event loop, renders conversation, handles input, shows tool approval prompts
- **Agent** (tokio task) — Manages the gRPC session, executes client-side tools, bridges between TUI and Sol via crossbeam channels
- **Heartbeat** (tokio task) — Pings Sol every second, updates the connection indicator in the title bar
The TUI never blocks on network calls. All gRPC communication happens in the agent task, with events flowing back via bounded channels.
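The bounded-channel bridge between the TUI thread and the agent task can be sketched with `std::sync::mpsc` standing in for crossbeam (an assumption made to keep the sketch dependency-free). The TUI drains with `try_recv()` so a slow network never stalls a frame:

```rust
use std::sync::mpsc;
use std::thread;

// Simplified stand-in for the real AgentEvent enum.
enum AgentEvent {
    TextDelta(String),
    Done,
}

/// Non-blocking drain: coalesce everything queued, as the event
/// drain loop above does before a single redraw.
fn drain(rx: &mpsc::Receiver<AgentEvent>) -> String {
    let mut text = String::new();
    while let Ok(ev) = rx.try_recv() {
        match ev {
            AgentEvent::TextDelta(t) => text.push_str(&t),
            AgentEvent::Done => break,
        }
    }
    text
}

fn main() {
    // Bounded channel, like the real bus between TUI and agent.
    let (tx, rx) = mpsc::sync_channel(64);

    // "Agent" thread: in the real client this owns the gRPC stream.
    let agent = thread::spawn(move || {
        tx.send(AgentEvent::TextDelta("hello ".into())).unwrap();
        tx.send(AgentEvent::TextDelta("world".into())).unwrap();
        tx.send(AgentEvent::Done).unwrap();
    });

    agent.join().unwrap();
    println!("{}", drain(&rx));
}
```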

markdown_test_artifacts.md (new file)

@@ -0,0 +1,163 @@
# markdown test artifacts
---
## 1. headers
# h1: the quick brown fox
## h2: jumps over
### h3: the lazy dog
#### h4: 42
##### h5: why not
###### h6: minimum viable header
---
## 2. text formatting
**bold**, *italic*, ***bold italic***, ~~strikethrough~~, `inline code`, ==highlight== (if supported).
---
## 3. lists
### unordered
- top level
- nested
- deeply nested
- back to top
### ordered
1. first
1. nested first
2. nested second
2. second
3. third
### task lists
- [ ] unchecked
- [x] checked
- [ ] partially done (if supported)
---
## 4. code blocks
### inline `code` example
### fenced blocks
```python
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)
```
```bash
# shebang test
#!/bin/bash
echo "hello world"
```
```
plaintext with no language
preserves spaces and newlines
```
---
## 5. tables
| syntax | description | test |
|-------------|-------------|------|
| header | title | here |
| paragraph | text | more |
| `code` | **bold** | *italics* |
---
## 6. blockquotes
> single line
> multi-line
> continuation
>> nested blockquote
---
## 7. horizontal rules
text
---
text
***
text
___
---
## 8. links & images
[regular link](https://example.com)
[reference-style link][1]
[1]: https://example.com "title"
![image alt](https://via.placeholder.com/150 "placeholder")
---
## 9. footnotes
here's a footnote[^1].
[^1]: this is the footnote text.
---
## 10. html (if supported)
<span style="color: red">red text</span>
<br>
<button disabled>interactive (but not here)</button>
---
## 11. edge cases
### whitespace
line with irregular spaces
### unicode
emoji: 🚀 ✨ 🦊
symbols: ← ↑ → ↓ ↔ ↕ ⇄ ⇅
math: 30° ½ ¼ ¾ ± × ÷ ≠ ≤ ≥ ≈ ∞
### escapes
\*not bold\* \`not code\` \[not a link\](https://example.com)
### empty elements
[]
()
{}
---
## 12. mixed nesting
1. ordered item
> with a blockquote
> - and a nested list
2. another item
```
code block inside list
```
---
## 13. long content
lorem ipsum dolor sit amet, consectetur adipiscing elit. sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
---
## 14. definition lists (if supported)
term 1
: definition 1
term 2
: definition 2a
: definition 2b
---
## 15. math (if supported)
$E = mc^2$
$$\int_a^b f(x) dx$$

sunbeam-proto/Cargo.toml (new file)

@@ -0,0 +1,14 @@
[package]
name = "sunbeam-proto"
version = "0.1.0"
edition = "2024"
description = "Shared protobuf definitions for Sunbeam gRPC services"
[dependencies]
tonic = "0.14"
tonic-prost = "0.14"
prost = "0.14"
[build-dependencies]
tonic-build = "0.14"
tonic-prost-build = "0.14"

sunbeam-proto/build.rs (new file)

@@ -0,0 +1,4 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
    tonic_prost_build::compile_protos("proto/code.proto")?;
    Ok(())
}

sunbeam-proto/proto/code.proto (new file)

@@ -0,0 +1,160 @@
syntax = "proto3";
package sunbeam.code.v1;
// Sol's coding agent service. Bidirectional streaming between
// the `sunbeam code` TUI client and Sol's server-side agent loop.
service CodeAgent {
rpc Session(stream ClientMessage) returns (stream ServerMessage);
rpc ReindexCode(ReindexCodeRequest) returns (ReindexCodeResponse);
}
message ReindexCodeRequest {
string org = 1; // optional: filter to an org (empty = all)
string repo = 2; // optional: specific repo (empty = all)
string branch = 3; // optional: specific branch (empty = default)
}
message ReindexCodeResponse {
uint32 repos_indexed = 1;
uint32 symbols_indexed = 2;
string error = 3; // empty on success
}
// ── Client → Sol ───────────────────────────────────────────────
message ClientMessage {
oneof payload {
StartSession start = 1;
UserInput input = 2;
ToolResult tool_result = 3;
ToolApproval approval = 4;
EndSession end = 5;
IndexSymbols index_symbols = 6;
}
}
message IndexSymbols {
string project_name = 1;
string branch = 2;
repeated SymbolEntry symbols = 3;
}
message SymbolEntry {
string file_path = 1;
string name = 2;
string kind = 3;
string signature = 4;
string docstring = 5;
int32 start_line = 6;
int32 end_line = 7;
string language = 8;
string content = 9;
}
message StartSession {
string project_path = 1;
string prompt_md = 2;
string config_toml = 3;
string git_branch = 4;
string git_status = 5;
repeated string file_tree = 6;
string model = 7;
repeated ToolDef client_tools = 8;
}
message UserInput {
string text = 1;
}
message ToolResult {
string call_id = 1;
string result = 2;
bool is_error = 3;
}
message ToolApproval {
string call_id = 1;
bool approved = 2;
}
message EndSession {}
// ── Sol → Client ───────────────────────────────────────────────
message ServerMessage {
oneof payload {
SessionReady ready = 1;
TextDelta delta = 2;
TextDone done = 3;
ToolCall tool_call = 4;
ApprovalNeeded approval = 5;
Status status = 6;
SessionEnd end = 7;
Error error = 8;
}
}
message SessionReady {
string session_id = 1;
string room_id = 2;
string model = 3;
bool resumed = 4;
repeated HistoryEntry history = 5;
}
message HistoryEntry {
string role = 1; // "user" or "assistant"
string content = 2;
}
message TextDelta {
string text = 1;
}
message TextDone {
string full_text = 1;
uint32 input_tokens = 2;
uint32 output_tokens = 3;
}
message ToolCall {
string call_id = 1;
string name = 2;
string args_json = 3;
bool is_local = 4;
bool needs_approval = 5;
}
message ApprovalNeeded {
string call_id = 1;
string name = 2;
string args_json = 3;
string summary = 4;
}
message Status {
string message = 1;
StatusKind kind = 2;
}
enum StatusKind {
INFO = 0;
TOOL_RUNNING = 1;
TOOL_DONE = 2;
THINKING = 3;
}
message SessionEnd {
string summary = 1;
}
message Error {
string message = 1;
bool fatal = 2;
}
message ToolDef {
string name = 1;
string description = 2;
string schema_json = 3;
}

sunbeam-proto/src/lib.rs (new file)

@@ -0,0 +1,3 @@
pub mod sunbeam_code_v1 {
    tonic::include_proto!("sunbeam.code.v1");
}

sunbeam-sdk/Cargo.toml

@@ -1,8 +1,8 @@
[package]
name = "sunbeam-sdk"
version = "1.1.2"
version = "1.0.1"
edition = "2024"
description = "Sunbeam Studios SDK, CLI, and ecosystem integrations"
description = "Sunbeam SDK — reusable library for cluster management"
repository = "https://src.sunbeam.pt/studio/cli"
license = "MIT"
publish = ["sunbeam"]
@@ -53,9 +53,6 @@ sha2 = "0.10"
hmac = "0.12"
base64 = "0.22"
rand = "0.8"
aes-gcm = "0.10"
argon2 = "0.5"
indicatif = { version = "0.17", features = ["tokio"] }
# Certificate generation
rcgen = "0.14"


@@ -674,28 +674,6 @@ pub fn get_gitea_token() -> Result<String> {
})
}
/// Get cached SSO access token synchronously (reads from cache file).
/// If the token was recently refreshed by the async `get_token()`, this
/// returns the fresh one. Used by DynamicBearer for per-request auth.
pub fn get_token_sync() -> Result<String> {
let cached = read_cache().map_err(|_| {
SunbeamError::identity("Not logged in. Run `sunbeam auth login` first.")
})?;
Ok(cached.access_token)
}
/// Get cached OIDC id_token (JWT).
pub fn get_id_token() -> Result<String> {
let tokens = read_cache().map_err(|_| {
SunbeamError::identity("Not logged in. Run `sunbeam auth login` first.")
})?;
tokens.id_token.ok_or_else(|| {
SunbeamError::identity(
"No id_token cached. Run `sunbeam auth sso` to get one.",
)
})
}
/// Remove cached auth tokens.
pub async fn cmd_auth_logout() -> Result<()> {
let path = cache_path();


@@ -20,8 +20,6 @@ pub enum AuthMethod {
None,
/// Bearer token (`Authorization: Bearer <token>`).
Bearer(String),
/// Dynamic bearer — resolves token fresh on each request (survives expiry).
DynamicBearer,
/// Custom header (e.g. `X-Vault-Token`).
Header { name: &'static str, value: String },
/// Gitea-style PAT (`Authorization: token <pat>`).
@@ -86,12 +84,6 @@ impl HttpTransport {
AuthMethod::Bearer(token) => {
req = req.bearer_auth(token);
}
AuthMethod::DynamicBearer => {
// Resolve token fresh on each request — survives token expiry/refresh.
if let Ok(token) = crate::auth::get_token_sync() {
req = req.bearer_auth(token);
}
}
AuthMethod::Header { name, value } => {
req = req.header(*name, value);
}
@@ -343,11 +335,6 @@ impl SunbeamClient {
crate::auth::get_gitea_token()
}
- /// Get cached OIDC id_token (JWT with claims including admin flag).
- fn id_token(&self) -> Result<String> {
-     crate::auth::get_id_token()
- }
// -- Lazy async accessors (each feature-gated) ---------------------------
//
// Each accessor resolves the appropriate auth and constructs the client
@@ -435,122 +422,72 @@ impl SunbeamClient {
#[cfg(feature = "lasuite")]
pub async fn people(&self) -> Result<&crate::lasuite::PeopleClient> {
- // Ensure we have a valid token (triggers refresh if expired).
- self.sso_token().await?;
self.people.get_or_try_init(|| async {
- let url = format!("https://people.{}/external_api/v1.0", self.domain);
- Ok(crate::lasuite::PeopleClient::from_parts(url, AuthMethod::DynamicBearer))
+ let token = self.sso_token().await?;
+ let url = format!("https://people.{}/api/v1.0", self.domain);
+ Ok(crate::lasuite::PeopleClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
}
#[cfg(feature = "lasuite")]
pub async fn docs(&self) -> Result<&crate::lasuite::DocsClient> {
- self.sso_token().await?;
self.docs.get_or_try_init(|| async {
- let url = format!("https://docs.{}/external_api/v1.0", self.domain);
- Ok(crate::lasuite::DocsClient::from_parts(url, AuthMethod::DynamicBearer))
+ let token = self.sso_token().await?;
+ let url = format!("https://docs.{}/api/v1.0", self.domain);
+ Ok(crate::lasuite::DocsClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
}
#[cfg(feature = "lasuite")]
pub async fn meet(&self) -> Result<&crate::lasuite::MeetClient> {
- self.sso_token().await?;
self.meet.get_or_try_init(|| async {
- let url = format!("https://meet.{}/external-api/v1.0", self.domain);
- Ok(crate::lasuite::MeetClient::from_parts(url, AuthMethod::DynamicBearer))
+ let token = self.sso_token().await?;
+ let url = format!("https://meet.{}/api/v1.0", self.domain);
+ Ok(crate::lasuite::MeetClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
}
#[cfg(feature = "lasuite")]
pub async fn drive(&self) -> Result<&crate::lasuite::DriveClient> {
- self.sso_token().await?;
self.drive.get_or_try_init(|| async {
- let url = format!("https://drive.{}/external_api/v1.0", self.domain);
- Ok(crate::lasuite::DriveClient::from_parts(url, AuthMethod::DynamicBearer))
+ let token = self.sso_token().await?;
+ let url = format!("https://drive.{}/api/v1.0", self.domain);
+ Ok(crate::lasuite::DriveClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
}
#[cfg(feature = "lasuite")]
pub async fn messages(&self) -> Result<&crate::lasuite::MessagesClient> {
- self.sso_token().await?;
self.messages.get_or_try_init(|| async {
- let url = format!("https://mail.{}/external_api/v1.0", self.domain);
- Ok(crate::lasuite::MessagesClient::from_parts(url, AuthMethod::DynamicBearer))
+ let token = self.sso_token().await?;
+ let url = format!("https://mail.{}/api/v1.0", self.domain);
+ Ok(crate::lasuite::MessagesClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
}
#[cfg(feature = "lasuite")]
pub async fn calendars(&self) -> Result<&crate::lasuite::CalendarsClient> {
- self.sso_token().await?;
self.calendars.get_or_try_init(|| async {
- let url = format!("https://calendar.{}/external_api/v1.0", self.domain);
- Ok(crate::lasuite::CalendarsClient::from_parts(url, AuthMethod::DynamicBearer))
+ let token = self.sso_token().await?;
+ let url = format!("https://calendar.{}/api/v1.0", self.domain);
+ Ok(crate::lasuite::CalendarsClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
}
#[cfg(feature = "lasuite")]
pub async fn find(&self) -> Result<&crate::lasuite::FindClient> {
- self.sso_token().await?;
self.find.get_or_try_init(|| async {
- let url = format!("https://find.{}/external_api/v1.0", self.domain);
- Ok(crate::lasuite::FindClient::from_parts(url, AuthMethod::DynamicBearer))
+ let token = self.sso_token().await?;
+ let url = format!("https://find.{}/api/v1.0", self.domain);
+ Ok(crate::lasuite::FindClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
}
pub async fn bao(&self) -> Result<&crate::openbao::BaoClient> {
self.bao.get_or_try_init(|| async {
+ let token = self.sso_token().await?;
let url = format!("https://vault.{}", self.domain);
- let id_token = self.id_token()?;
- let bearer = self.sso_token().await?;
- // Authenticate to OpenBao via JWT auth method using the OIDC id_token.
- // Try admin role first (for users with admin: true), fall back to reader.
- let http = reqwest::Client::new();
- let vault_token = {
-     let mut token = None;
-     for role in &["cli-admin", "cli-reader"] {
-         let resp = http
-             .post(format!("{url}/v1/auth/jwt/login"))
-             .bearer_auth(&bearer)
-             .json(&serde_json::json!({ "jwt": id_token, "role": role }))
-             .send()
-             .await;
-         match resp {
-             Ok(r) => {
-                 let status = r.status();
-                 if status.is_success() {
-                     if let Ok(body) = r.json::<serde_json::Value>().await {
-                         if let Some(t) = body["auth"]["client_token"].as_str() {
-                             tracing::debug!("vault JWT login ok (role={role})");
-                             token = Some(t.to_string());
-                             break;
-                         }
-                     }
-                 } else {
-                     let body = r.text().await.unwrap_or_default();
-                     tracing::debug!("vault JWT login {status} (role={role}): {body}");
-                 }
-             }
-             Err(e) => {
-                 tracing::debug!("vault JWT login request failed (role={role}): {e}");
-             }
-         }
-     }
-     match token {
-         Some(t) => t,
-         None => {
-             tracing::debug!("vault JWT auth failed, falling back to local keystore");
-             match crate::vault_keystore::load_keystore(&self.domain) {
-                 Ok(ks) => ks.root_token,
-                 Err(_) => return Err(SunbeamError::secrets(
-                     "Vault auth failed: no valid JWT role and no local keystore. Run `sunbeam auth sso` and retry."
-                 )),
-             }
-         }
-     }
- };
- Ok(crate::openbao::BaoClient::with_proxy_auth(&url, &vault_token, &bearer))
+ Ok(crate::openbao::BaoClient::with_token(&url, &token))
}).await
}
}

View File

@@ -296,9 +296,6 @@ pub async fn create_secret(ns: &str, name: &str, data: HashMap<String, String>)
"metadata": {
"name": name,
"namespace": ns,
- "labels": {
-     "sunbeam.dev/managed-by": "sunbeam"
- },
},
"type": "Opaque",
"data": encoded,

View File

@@ -550,18 +550,6 @@ pub enum DriveCommand {
#[command(subcommand)]
action: PermissionAction,
},
- /// Upload a local file or directory to a Drive folder.
- Upload {
-     /// Local path to upload (file or directory).
-     #[arg(short, long)]
-     path: String,
-     /// Target Drive folder ID.
-     #[arg(short = 't', long)]
-     folder_id: String,
-     /// Number of concurrent uploads.
-     #[arg(long, default_value = "3")]
-     parallel: usize,
- },
}
#[derive(Subcommand, Debug)]
@@ -626,14 +614,13 @@ pub async fn dispatch_drive(
let page_data = drive.list_files(page).await?;
output::render_list(
&page_data.results,
- &["ID", "TITLE", "TYPE", "SIZE", "MIMETYPE"],
+ &["ID", "NAME", "SIZE", "MIME_TYPE"],
|f| {
vec![
f.id.clone(),
- f.title.clone().unwrap_or_default(),
- f.item_type.clone().unwrap_or_default(),
+ f.name.clone().unwrap_or_default(),
f.size.map_or("-".into(), |s| s.to_string()),
- f.mimetype.clone().unwrap_or_default(),
+ f.mime_type.clone().unwrap_or_default(),
]
},
fmt,
@@ -659,13 +646,12 @@ pub async fn dispatch_drive(
let page_data = drive.list_folders(page).await?;
output::render_list(
&page_data.results,
- &["ID", "TITLE", "CHILDREN", "CREATED"],
+ &["ID", "NAME", "PARENT_ID"],
|f| {
vec![
f.id.clone(),
- f.title.clone().unwrap_or_default(),
- f.numchild.map_or("-".into(), |n| n.to_string()),
- f.created_at.clone().unwrap_or_default(),
+ f.name.clone().unwrap_or_default(),
+ f.parent_id.clone().unwrap_or_default(),
]
},
fmt,
@@ -701,397 +687,8 @@ pub async fn dispatch_drive(
)
}
},
DriveCommand::Upload { path, folder_id, parallel } => {
upload_recursive(drive, &path, &folder_id, parallel).await
}
}
}
/// A file that needs uploading, collected during the directory-walk phase.
struct UploadJob {
local_path: std::path::PathBuf,
parent_id: String,
file_size: u64,
relative_path: String,
}
/// Recursively upload a local file or directory to a Drive folder.
async fn upload_recursive(
drive: &super::DriveClient,
local_path: &str,
parent_id: &str,
parallel: usize,
) -> Result<()> {
use indicatif::{HumanBytes, MultiProgress, ProgressBar, ProgressStyle};
use std::sync::Arc;
use tokio::sync::Semaphore;
let path = std::path::Path::new(local_path);
if !path.exists() {
return Err(crate::error::SunbeamError::Other(format!(
"Path does not exist: {local_path}"
)));
}
// Phase 1 — Walk and collect: create folders sequentially, gather file jobs.
let mut jobs = Vec::new();
if path.is_file() {
let file_size = std::fs::metadata(path)
.map_err(|e| crate::error::SunbeamError::Other(format!("stat: {e}")))?
.len();
let filename = path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("unnamed");
if !filename.starts_with('.') {
jobs.push(UploadJob {
local_path: path.to_path_buf(),
parent_id: parent_id.to_string(),
file_size,
relative_path: filename.to_string(),
});
}
} else if path.is_dir() {
collect_upload_jobs(drive, path, parent_id, "", &mut jobs).await?;
} else {
return Err(crate::error::SunbeamError::Other(format!(
"Not a file or directory: {local_path}"
)));
}
if jobs.is_empty() {
output::ok("Nothing to upload.");
return Ok(());
}
let total_files = jobs.len() as u64;
let total_bytes: u64 = jobs.iter().map(|j| j.file_size).sum();
// Clear the folder creation line
eprint!("\r\x1b[K");
// Phase 2 — Parallel upload with progress bars.
let multi = MultiProgress::new();
// Overall bar tracks file count. Bandwidth is computed manually in the message.
let overall_style = ProgressStyle::with_template(
" {spinner:.green} [{elapsed_precise}] {bar:40.cyan/blue} {pos}/{len} files {msg}",
)
.unwrap()
.progress_chars("█▓░");
let overall = multi.add(ProgressBar::new(total_files));
overall.set_style(overall_style);
overall.enable_steady_tick(std::time::Duration::from_millis(100));
let completed_bytes = std::sync::Arc::new(std::sync::atomic::AtomicU64::new(0));
let file_style = ProgressStyle::with_template(
" {spinner:.cyan} {wide_msg}",
)
.unwrap();
let sem = Arc::new(Semaphore::new(parallel));
let drive = Arc::new(drive.clone());
let mut handles = Vec::new();
let start = std::time::Instant::now();
for job in jobs {
let permit = sem.clone().acquire_owned().await.unwrap();
let drive = Arc::clone(&drive);
let multi = multi.clone();
let overall = overall.clone();
let file_style = file_style.clone();
let job_size = job.file_size;
let completed_bytes = Arc::clone(&completed_bytes);
let total_bytes = total_bytes;
let start = start.clone();
let handle = tokio::spawn(async move {
let pb = multi.add(ProgressBar::new_spinner());
pb.set_style(file_style);
pb.set_message(job.relative_path.clone());
pb.enable_steady_tick(std::time::Duration::from_millis(80));
let result = upload_single_file_with_progress(&drive, &job, &pb).await;
pb.finish_and_clear();
multi.remove(&pb);
// Update overall — increment file count, compute bandwidth from bytes
overall.inc(1);
let done_bytes = completed_bytes.fetch_add(job_size, std::sync::atomic::Ordering::Relaxed) + job_size;
let elapsed = start.elapsed().as_secs_f64();
let speed = if elapsed > 1.0 { done_bytes as f64 / elapsed } else { 0.0 };
let remaining = total_bytes.saturating_sub(done_bytes);
let eta = if speed > 0.0 { remaining as f64 / speed } else { 0.0 };
let eta_m = eta as u64 / 60;
let eta_s = eta as u64 % 60;
overall.set_message(format!(
"{}/{} {}/s ETA: {}m {:02}s",
indicatif::HumanBytes(done_bytes),
indicatif::HumanBytes(total_bytes),
indicatif::HumanBytes(speed as u64),
eta_m, eta_s,
));
drop(permit);
result
});
handles.push(handle);
}
let mut errors = 0u64;
for handle in handles {
match handle.await {
Ok(Ok(())) => {}
Ok(Err(e)) => {
errors += 1;
multi.suspend(|| eprintln!(" ERROR: {e}"));
}
Err(e) => {
errors += 1;
multi.suspend(|| eprintln!(" ERROR: task panic: {e}"));
}
}
}
overall.finish_and_clear();
multi.clear().ok();
let elapsed = start.elapsed();
let secs = elapsed.as_secs_f64();
let speed = if secs > 0.0 {
total_bytes as f64 / secs
} else {
0.0
};
let mins = elapsed.as_secs() / 60;
let secs_rem = elapsed.as_secs() % 60;
let uploaded = total_files - errors;
if errors > 0 {
println!(
"✓ Uploaded {uploaded}/{total_files} files ({}) in {mins}m {secs_rem}s ({}/s) — {errors} failed",
HumanBytes(total_bytes),
HumanBytes(speed as u64),
);
} else {
println!(
"✓ Uploaded {total_files} files ({}) in {mins}m {secs_rem}s ({}/s)",
HumanBytes(total_bytes),
HumanBytes(speed as u64),
);
}
Ok(())
}
/// Phase 1: Walk a directory recursively, create folders in Drive sequentially,
/// and collect [`UploadJob`]s for every regular file.
async fn collect_upload_jobs(
drive: &super::DriveClient,
dir: &std::path::Path,
parent_id: &str,
prefix: &str,
jobs: &mut Vec<UploadJob>,
) -> Result<()> {
let dir_name = dir
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("unnamed");
// Skip hidden directories
if dir_name.starts_with('.') {
return Ok(());
}
// Build the display prefix for children
let display_prefix = if prefix.is_empty() {
dir_name.to_string()
} else {
format!("{prefix}/{dir_name}")
};
eprint!("\r\x1b[K Scanning: {display_prefix} ");
// Check if folder already exists under the parent.
let existing = drive.list_children(parent_id, None).await.ok();
let existing_folder_id = existing.and_then(|page| {
page.results.iter().find_map(|item| {
let is_folder = item.get("type").and_then(|v| v.as_str()) == Some("folder");
let title_matches = item.get("title").and_then(|v| v.as_str()) == Some(dir_name);
if is_folder && title_matches {
item.get("id").and_then(|v| v.as_str()).map(String::from)
} else {
None
}
})
});
let folder_id = if let Some(id) = existing_folder_id {
id
} else {
let folder = drive
.create_child(
parent_id,
&serde_json::json!({
"title": dir_name,
"type": "folder",
}),
)
.await?;
folder["id"]
.as_str()
.ok_or_else(|| crate::error::SunbeamError::Other("No folder ID in response".into()))?
.to_string()
};
// Build a set of existing file titles in this folder to skip duplicates.
let existing_file_titles: std::collections::HashSet<String> = {
let mut titles = std::collections::HashSet::new();
if let Ok(page) = drive.list_children(&folder_id, None).await {
for item in &page.results {
if item.get("type").and_then(|v| v.as_str()) == Some("file") {
if let Some(title) = item.get("title").and_then(|v| v.as_str()) {
titles.insert(title.to_string());
}
}
}
}
titles
};
let mut entries: Vec<_> = std::fs::read_dir(dir)
.map_err(|e| crate::error::SunbeamError::Other(format!("reading dir: {e}")))?
.filter_map(|e| e.ok())
.collect();
entries.sort_by_key(|e| e.file_name());
for entry in entries {
let entry_path = entry.path();
let name = entry
.file_name()
.to_str()
.unwrap_or_default()
.to_string();
// Skip hidden entries
if name.starts_with('.') {
continue;
}
if entry_path.is_dir() {
Box::pin(collect_upload_jobs(
drive,
&entry_path,
&folder_id,
&display_prefix,
jobs,
))
.await?;
} else if entry_path.is_file() {
// Skip if a file with this title already exists in the folder.
if existing_file_titles.contains(&name) {
continue;
}
let file_size = std::fs::metadata(&entry_path)
.map_err(|e| crate::error::SunbeamError::Other(format!("stat: {e}")))?
.len();
jobs.push(UploadJob {
local_path: entry_path,
parent_id: folder_id.clone(),
file_size,
relative_path: format!("{display_prefix}/{name}"),
});
}
}
Ok(())
}
/// Upload a single file to Drive, updating the progress bar.
/// Retries on 429/500/502/503 up to 5 times with exponential backoff.
async fn upload_single_file_with_progress(
drive: &super::DriveClient,
job: &UploadJob,
pb: &indicatif::ProgressBar,
) -> Result<()> {
let filename = job
.local_path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("unnamed");
// Create the file item in Drive (with retry)
let body = serde_json::json!({
"title": filename,
"filename": filename,
"type": "file",
});
let item = retry_drive_call(|| drive.create_child(&job.parent_id, &body), 5).await?;
let item_id = item["id"]
.as_str()
.ok_or_else(|| crate::error::SunbeamError::Other("No item ID in response".into()))?;
let upload_url = item["policy"]
.as_str()
.ok_or_else(|| {
crate::error::SunbeamError::Other(
"No upload policy URL in response \u{2014} is the item a file?".into(),
)
})?;
tracing::debug!("S3 presigned URL: {upload_url}");
// Read the file and upload to S3
let data = std::fs::read(&job.local_path)
.map_err(|e| crate::error::SunbeamError::Other(format!("reading file: {e}")))?;
let len = data.len() as u64;
drive
.upload_to_s3(upload_url, bytes::Bytes::from(data))
.await?;
pb.set_position(len);
// Notify Drive the upload is complete (with retry)
retry_drive_call(|| drive.upload_ended(item_id), 5).await?;
Ok(())
}
/// Retry a Drive API call on 429/500/502/503 with exponential backoff.
async fn retry_drive_call<F, Fut, T>(f: F, max_retries: u32) -> Result<T>
where
F: Fn() -> Fut,
Fut: std::future::Future<Output = Result<T>>,
{
let mut last_err = None;
for attempt in 0..=max_retries {
match f().await {
Ok(v) => return Ok(v),
Err(e) => {
let msg = e.to_string();
let retryable = msg.contains("429")
|| msg.contains("500")
|| msg.contains("502")
|| msg.contains("503")
|| msg.contains("request failed");
if retryable && attempt < max_retries {
// On 500, try refreshing the SSO token (may have expired)
if msg.contains("500") {
let _ = crate::auth::get_token().await;
}
let delay = std::time::Duration::from_millis(
500 * 2u64.pow(attempt.min(4)),
);
tokio::time::sleep(delay).await;
last_err = Some(e);
continue;
}
return Err(e);
}
}
}
Err(last_err.unwrap())
}
// ═══════════════════════════════════════════════════════════════════════════
// Mail (Messages)

View File
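The `retry_drive_call` helper in the removed upload code backs off exponentially before each retry: 500 ms doubled per attempt, with the exponent capped at 4. A minimal, self-contained sketch of that schedule (the `backoff_ms` name is illustrative, not part of the crate):

```rust
/// Backoff delay in milliseconds for a given attempt, mirroring the
/// `500 * 2u64.pow(attempt.min(4))` expression in `retry_drive_call`.
fn backoff_ms(attempt: u32) -> u64 {
    500 * 2u64.pow(attempt.min(4))
}

fn main() {
    // Attempts 0 through 5 wait 500, 1000, 2000, 4000, 8000, 8000 ms:
    // the cap keeps later attempts from waiting unboundedly long.
    let schedule: Vec<u64> = (0..6).map(backoff_ms).collect();
    println!("{schedule:?}");
}
```

Capping the exponent bounds the worst-case wait at 8 s while still spreading out the early retries enough to ride out transient 429/5xx responses.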

@@ -6,7 +6,6 @@ use reqwest::Method;
use super::types::*;
/// Client for the La Suite Drive API.
- #[derive(Clone)]
pub struct DriveClient {
pub(crate) transport: HttpTransport,
}
@@ -40,164 +39,70 @@ impl DriveClient {
self
}
- // -- Items --------------------------------------------------------------
+ // -- Files --------------------------------------------------------------
- /// List items with optional pagination and type filter.
- pub async fn list_items(
-     &self,
-     page: Option<u32>,
-     item_type: Option<&str>,
- ) -> Result<DRFPage<DriveFile>> {
-     let mut path = String::from("items/?");
-     if let Some(p) = page {
-         path.push_str(&format!("page={p}&"));
-     }
-     if let Some(t) = item_type {
-         path.push_str(&format!("type={t}&"));
-     }
+ /// List files with optional pagination.
+ pub async fn list_files(&self, page: Option<u32>) -> Result<DRFPage<DriveFile>> {
+     let path = match page {
+         Some(p) => format!("files/?page={p}"),
+         None => "files/".to_string(),
+     };
self.transport
-     .json(Method::GET, &path, Option::<&()>::None, "drive list items")
+     .json(Method::GET, &path, Option::<&()>::None, "drive list files")
.await
}
- /// List files (items with type=file).
- pub async fn list_files(&self, page: Option<u32>) -> Result<DRFPage<DriveFile>> {
-     self.list_items(page, Some("file")).await
+ /// Get a single file by ID.
+ pub async fn get_file(&self, id: &str) -> Result<DriveFile> {
+     self.transport
+         .json(
+             Method::GET,
+             &format!("files/{id}/"),
+             Option::<&()>::None,
+             "drive get file",
+         )
.await
}
- /// List folders (items with type=folder).
- pub async fn list_folders(&self, page: Option<u32>) -> Result<DRFPage<DriveFolder>> {
-     let mut path = String::from("items/?type=folder&");
-     if let Some(p) = page {
-         path.push_str(&format!("page={p}&"));
+ /// Upload a new file.
+ pub async fn upload_file(&self, body: &serde_json::Value) -> Result<DriveFile> {
+     self.transport
+         .json(Method::POST, "files/", Some(body), "drive upload file")
.await
}
+ /// Delete a file.
+ pub async fn delete_file(&self, id: &str) -> Result<()> {
+     self.transport
+         .send(
+             Method::DELETE,
+             &format!("files/{id}/"),
+             Option::<&()>::None,
+             "drive delete file",
+         )
+         .await
+ }
+ // -- Folders ------------------------------------------------------------
+ /// List folders with optional pagination.
+ pub async fn list_folders(&self, page: Option<u32>) -> Result<DRFPage<DriveFolder>> {
+     let path = match page {
+         Some(p) => format!("folders/?page={p}"),
+         None => "folders/".to_string(),
+     };
self.transport
.json(Method::GET, &path, Option::<&()>::None, "drive list folders")
.await
}
- /// Get a single item by ID.
- pub async fn get_file(&self, id: &str) -> Result<DriveFile> {
-     self.transport
-         .json(
-             Method::GET,
-             &format!("items/{id}/"),
-             Option::<&()>::None,
-             "drive get item",
-         )
-         .await
- }
- /// Create a new item (file or folder) at the root level.
- pub async fn upload_file(&self, body: &serde_json::Value) -> Result<DriveFile> {
-     self.transport
-         .json(Method::POST, "items/", Some(body), "drive create item")
-         .await
- }
- /// Delete an item.
- pub async fn delete_file(&self, id: &str) -> Result<()> {
-     self.transport
-         .send(
-             Method::DELETE,
-             &format!("items/{id}/"),
-             Option::<&()>::None,
-             "drive delete item",
-         )
-         .await
- }
- /// Create a new folder at the root level.
+ /// Create a new folder.
pub async fn create_folder(&self, body: &serde_json::Value) -> Result<DriveFolder> {
self.transport
-     .json(Method::POST, "items/", Some(body), "drive create folder")
+     .json(Method::POST, "folders/", Some(body), "drive create folder")
.await
}
- // -- Items (children API) ------------------------------------------------
- /// Create a child item under a parent folder.
- /// Returns the created item including its upload_url for files.
- pub async fn create_child(
-     &self,
-     parent_id: &str,
-     body: &serde_json::Value,
- ) -> Result<serde_json::Value> {
-     self.transport
-         .json(
-             Method::POST,
-             &format!("items/{parent_id}/children/"),
-             Some(body),
-             "drive create child",
-         )
-         .await
- }
- /// List children of an item (folder).
- pub async fn list_children(
-     &self,
-     parent_id: &str,
-     page: Option<u32>,
- ) -> Result<DRFPage<serde_json::Value>> {
-     let path = match page {
-         Some(p) => format!("items/{parent_id}/children/?page={p}"),
-         None => format!("items/{parent_id}/children/"),
-     };
-     self.transport
-         .json(Method::GET, &path, Option::<&()>::None, "drive list children")
-         .await
- }
- /// Notify Drive that a file upload to S3 is complete.
- pub async fn upload_ended(&self, item_id: &str) -> Result<serde_json::Value> {
-     self.transport
-         .json(
-             Method::POST,
-             &format!("items/{item_id}/upload-ended/"),
-             Option::<&()>::None,
-             "drive upload ended",
-         )
-         .await
- }
- /// Upload file bytes directly to a presigned S3 URL.
- /// The presigned URL's SigV4 signature covers host + x-amz-acl headers.
- /// Retries up to 3 times on 502/503/connection errors.
- pub async fn upload_to_s3(&self, presigned_url: &str, data: bytes::Bytes) -> Result<()> {
-     let max_retries = 3;
-     for attempt in 0..=max_retries {
-         let resp = self.transport.http
-             .put(presigned_url)
-             .header("x-amz-acl", "private")
-             .body(data.clone())
-             .send()
-             .await;
-         match resp {
-             Ok(r) if r.status().is_success() => return Ok(()),
-             Ok(r) if (r.status() == 502 || r.status() == 503) && attempt < max_retries => {
-                 tokio::time::sleep(std::time::Duration::from_millis(500 * (attempt as u64 + 1))).await;
-                 continue;
-             }
-             Ok(r) => {
-                 let status = r.status();
-                 let body = r.text().await.unwrap_or_default();
-                 return Err(crate::error::SunbeamError::network(format!(
-                     "S3 upload: HTTP {status}: {body}"
-                 )));
-             }
-             Err(_) if attempt < max_retries => {
-                 tokio::time::sleep(std::time::Duration::from_millis(500 * (attempt as u64 + 1))).await;
-                 continue;
-             }
-             Err(e) => {
-                 return Err(crate::error::SunbeamError::network(format!("S3 upload: {e}")));
-             }
-         }
-     }
-     Ok(())
- }
// -- Shares -------------------------------------------------------------
/// Share a file with a user.

View File
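Both retry loops above share one shape: call, check whether the failure is retryable, sleep, try again, and give up after a bounded number of attempts. A hypothetical synchronous analogue of that shape (the `retry_call` name and closures are illustrative, not part of the client):

```rust
/// Retry a fallible call while `is_retryable` approves the error,
/// giving up after `max_retries` retries (so at most max_retries + 1 calls).
fn retry_call<T, E>(
    mut call: impl FnMut() -> Result<T, E>,
    is_retryable: impl Fn(&E) -> bool,
    max_retries: u32,
) -> Result<T, E> {
    let mut last_err = None;
    for attempt in 0..=max_retries {
        match call() {
            Ok(v) => return Ok(v),
            // Transient failure with budget left: remember it and loop.
            Err(e) if is_retryable(&e) && attempt < max_retries => last_err = Some(e),
            // Non-retryable, or budget exhausted mid-match: fail now.
            Err(e) => return Err(e),
        }
    }
    Err(last_err.expect("loop ran at least once"))
}

fn main() {
    // Simulate a call that returns HTTP 503 twice, then succeeds.
    let mut failures = 2;
    let result = retry_call(
        || {
            if failures > 0 {
                failures -= 1;
                Err("HTTP 503".to_string())
            } else {
                Ok("uploaded")
            }
        },
        |e: &String| e.contains("502") || e.contains("503"),
        3,
    );
    assert_eq!(result, Ok("uploaded"));
    println!("{result:?}");
}
```

The real code classifies retryability by matching status-code substrings in the error message, which is why both helpers list 429/500/502/503 explicitly.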

@@ -219,17 +219,13 @@ pub struct DriveFile {
#[serde(default)]
pub id: String,
#[serde(default)]
- pub title: Option<String>,
- #[serde(default)]
- pub filename: Option<String>,
- #[serde(default, rename = "type")]
- pub item_type: Option<String>,
+ pub name: Option<String>,
#[serde(default)]
pub size: Option<u64>,
#[serde(default)]
- pub mimetype: Option<String>,
+ pub mime_type: Option<String>,
#[serde(default)]
- pub upload_state: Option<String>,
+ pub folder_id: Option<String>,
#[serde(default)]
pub url: Option<String>,
#[serde(default)]
@@ -238,17 +234,15 @@ pub struct DriveFile {
pub updated_at: Option<String>,
}
- /// A folder in the Drive service (same API, type=folder).
+ /// A folder in the Drive service.
#[derive(Debug, Clone, Default, Serialize, Deserialize)]
pub struct DriveFolder {
#[serde(default)]
pub id: String,
#[serde(default)]
- pub title: Option<String>,
- #[serde(default, rename = "type")]
- pub item_type: Option<String>,
+ pub name: Option<String>,
#[serde(default)]
- pub numchild: Option<u32>,
+ pub parent_id: Option<String>,
#[serde(default)]
pub created_at: Option<String>,
#[serde(default)]

View File

@@ -19,7 +19,6 @@ pub mod secrets;
pub mod services;
pub mod update;
pub mod users;
- pub mod vault_keystore;
// Feature-gated service client modules
#[cfg(feature = "identity")]

View File

@@ -617,14 +617,10 @@ async fn ensure_opensearch_ml() {
already_deployed = true;
break;
}
- // Any existing model (even DEPLOY_FAILED) — reuse it instead of
- // registering a new version. This prevents accumulating stale
- // copies in .plugins-ml-model when the pod restarts.
- _ => {
-     if model_id.is_none() && !id.is_empty() {
+ "REGISTERED" | "DEPLOYING" => {
model_id = Some(id.to_string());
}
- }
+ _ => {}
}
}

View File

@@ -66,12 +66,6 @@ pub enum VaultCommand {
#[arg(short, long)]
data: Option<String>,
},
- /// Re-initialize the vault (destructive — wipes all secrets).
- Reinit,
- /// Show local keystore status.
- Keys,
- /// Export vault keys as plaintext (for machine migration).
- ExportKeys,
}
#[derive(Subcommand, Debug)]
@@ -236,89 +230,12 @@ pub async fn dispatch(
client: &SunbeamClient,
fmt: OutputFormat,
) -> Result<()> {
- // -- Commands that don't need a BaoClient -------------------------------
- match cmd {
-     VaultCommand::Keys => {
-         let domain = crate::config::domain();
-         let path = crate::vault_keystore::keystore_path(domain);
-         if !crate::vault_keystore::keystore_exists(domain) {
-             output::warn(&format!("No local keystore found at {}", path.display()));
-             output::warn("Run `sunbeam seed` to create one, or `sunbeam vault reinit` to start fresh.");
-             return Ok(());
-         }
-         match crate::vault_keystore::verify_vault_keys(domain) {
-             Ok(ks) => {
-                 output::ok(&format!("Domain: {}", ks.domain));
-                 output::ok(&format!("Created: {}", ks.created_at.format("%Y-%m-%d %H:%M:%S UTC")));
-                 output::ok(&format!("Updated: {}", ks.updated_at.format("%Y-%m-%d %H:%M:%S UTC")));
-                 output::ok(&format!("Shares: {}/{}", ks.key_threshold, ks.key_shares));
-                 output::ok(&format!(
-                     "Token: {}...{}",
-                     &ks.root_token[..8.min(ks.root_token.len())],
-                     &ks.root_token[ks.root_token.len().saturating_sub(4)..]
-                 ));
-                 output::ok(&format!("Unseal keys: {}", ks.unseal_keys_b64.len()));
-                 output::ok(&format!("Path: {}", path.display()));
-             }
-             Err(e) => {
-                 output::warn(&format!("Keystore at {} is invalid: {e}", path.display()));
-             }
-         }
-         return Ok(());
-     }
-     VaultCommand::ExportKeys => {
-         let domain = crate::config::domain();
-         output::warn("WARNING: This prints vault root token and unseal keys in PLAINTEXT.");
-         output::warn("Only use this for machine migration. Do not share or log this output.");
-         eprint!(" Type 'export' to confirm: ");
-         let mut answer = String::new();
-         std::io::stdin()
-             .read_line(&mut answer)
-             .map_err(|e| crate::error::SunbeamError::Other(format!("stdin: {e}")))?;
-         if answer.trim() != "export" {
-             output::ok("Aborted.");
-             return Ok(());
-         }
-         let json = crate::vault_keystore::export_plaintext(domain)?;
-         println!("{json}");
-         return Ok(());
-     }
-     VaultCommand::Reinit => {
-         return dispatch_reinit().await;
-     }
-     // All other commands need a BaoClient — fall through.
-     _ => {}
- }
let bao = client.bao().await?;
match cmd {
// -- Status ---------------------------------------------------------
VaultCommand::Status => {
let status = bao.seal_status().await?;
- output::render(&status, fmt)?;
- // Show local keystore status
- let domain = crate::config::domain();
- if crate::vault_keystore::keystore_exists(domain) {
-     match crate::vault_keystore::load_keystore(domain) {
-         Ok(ks) => {
-             output::ok(&format!(
-                 "Local keystore: valid (updated {})",
-                 ks.updated_at.format("%Y-%m-%d %H:%M:%S UTC")
-             ));
-         }
-         Err(e) => {
-             output::warn(&format!("Local keystore: corrupt ({e})"));
-         }
-     }
- } else {
-     output::warn("Local keystore: not found");
- }
- Ok(())
+ output::render(&status, fmt)
}
// -- Init -----------------------------------------------------------
@@ -418,194 +335,5 @@ pub async fn dispatch(
output::render(&resp, fmt)
}
}
// Already handled above; unreachable.
VaultCommand::Keys | VaultCommand::ExportKeys | VaultCommand::Reinit => unreachable!(),
}
}
// ═══════════════════════════════════════════════════════════════════════════
// Reinit
// ═══════════════════════════════════════════════════════════════════════════
/// Run a kubectl command, returning Ok(()) on success.
async fn kubectl(args: &[&str]) -> Result<()> {
crate::kube::ensure_tunnel().await?;
let ctx = format!("--context={}", crate::kube::context());
let status = tokio::process::Command::new("kubectl")
.arg(&ctx)
.args(args)
.stdin(std::process::Stdio::null())
.stdout(std::process::Stdio::inherit())
.stderr(std::process::Stdio::inherit())
.status()
.await
.map_err(|e| crate::error::SunbeamError::Other(format!("kubectl: {e}")))?;
if !status.success() {
return Err(crate::error::SunbeamError::Other(format!(
"kubectl {} exited with {}",
args.join(" "),
status.code().unwrap_or(-1)
)));
}
Ok(())
}
/// Port-forward guard — cancels the background forwarder on drop.
struct PortForwardGuard {
_abort_handle: tokio::task::AbortHandle,
pub local_port: u16,
}
impl Drop for PortForwardGuard {
fn drop(&mut self) {
self._abort_handle.abort();
}
}
/// Open a kube-rs port-forward to `pod_name` in `namespace` on `remote_port`.
async fn port_forward(namespace: &str, pod_name: &str, remote_port: u16) -> Result<PortForwardGuard> {
use k8s_openapi::api::core::v1::Pod;
use kube::api::{Api, ListParams};
use tokio::net::TcpListener;
let client = crate::kube::get_client().await?;
let pods: Api<Pod> = Api::namespaced(client.clone(), namespace);
let listener = TcpListener::bind("127.0.0.1:0")
.await
.map_err(|e| crate::error::SunbeamError::Other(format!("bind: {e}")))?;
let local_port = listener
.local_addr()
.map_err(|e| crate::error::SunbeamError::Other(format!("local_addr: {e}")))?
.port();
let pod_name = pod_name.to_string();
let ns = namespace.to_string();
let task = tokio::spawn(async move {
let mut current_pod = pod_name;
loop {
let (mut client_stream, _) = match listener.accept().await {
Ok(s) => s,
Err(_) => break,
};
let pf_result = pods.portforward(&current_pod, &[remote_port]).await;
let mut pf = match pf_result {
Ok(pf) => pf,
Err(e) => {
tracing::warn!("Port-forward failed, re-resolving pod: {e}");
if let Ok(new_client) = crate::kube::get_client().await {
let new_pods: Api<Pod> = Api::namespaced(new_client.clone(), &ns);
let lp = ListParams::default();
if let Ok(pod_list) = new_pods.list(&lp).await {
if let Some(name) = pod_list
.items
.iter()
.find(|p| {
p.metadata
.name
.as_deref()
.map(|n| n.starts_with(current_pod.split('-').next().unwrap_or("")))
.unwrap_or(false)
})
.and_then(|p| p.metadata.name.clone())
{
current_pod = name;
}
}
}
continue;
}
};
let mut upstream = match pf.take_stream(remote_port) {
Some(s) => s,
None => continue,
};
tokio::spawn(async move {
let _ = tokio::io::copy_bidirectional(&mut client_stream, &mut upstream).await;
});
}
});
let abort_handle = task.abort_handle();
tokio::time::sleep(std::time::Duration::from_millis(100)).await;
Ok(PortForwardGuard {
_abort_handle: abort_handle,
local_port,
})
}
/// Destructive vault re-initialization workflow.
async fn dispatch_reinit() -> Result<()> {
output::warn("This will DESTROY all vault secrets. You must re-run `sunbeam seed` after.");
eprint!(" Type 'reinit' to confirm: ");
let mut answer = String::new();
std::io::stdin()
.read_line(&mut answer)
.map_err(|e| crate::error::SunbeamError::Other(format!("stdin: {e}")))?;
if answer.trim() != "reinit" {
output::ok("Aborted.");
return Ok(());
}
output::step("Re-initializing vault...");
// Delete PVC and pod
output::ok("Deleting vault storage...");
let _ = kubectl(&["-n", "data", "delete", "pvc", "data-openbao-0", "--ignore-not-found"]).await;
let _ = kubectl(&["-n", "data", "delete", "pod", "openbao-0", "--ignore-not-found"]).await;
// Wait for pod to come back
output::ok("Waiting for vault pod to restart...");
tokio::time::sleep(std::time::Duration::from_secs(15)).await;
let _ = kubectl(&[
"-n", "data", "wait", "--for=condition=Ready", "pod/openbao-0",
"--timeout=120s",
])
.await;
// Port-forward and init
let pf = port_forward("data", "openbao-0", 8200).await?;
let bao_url = format!("http://127.0.0.1:{}", pf.local_port);
let fresh_bao = crate::openbao::BaoClient::new(&bao_url);
let init = fresh_bao.init(1, 1).await?;
let unseal_key = init.unseal_keys_b64[0].clone();
let root_token = init.root_token.clone();
// Save to local keystore
let domain = crate::config::domain();
let ks = crate::vault_keystore::VaultKeystore {
version: 1,
domain: domain.to_string(),
created_at: chrono::Utc::now(),
updated_at: chrono::Utc::now(),
root_token: root_token.clone(),
unseal_keys_b64: vec![unseal_key.clone()],
key_shares: 1,
key_threshold: 1,
};
crate::vault_keystore::save_keystore(&ks)?;
output::ok(&format!(
"Keys saved to local keystore at {}",
crate::vault_keystore::keystore_path(domain).display()
));
// Save to K8s Secret
let mut data = HashMap::new();
data.insert("key".to_string(), unseal_key.clone());
data.insert("root-token".to_string(), root_token.clone());
crate::kube::create_secret("data", "openbao-keys", data).await?;
output::ok("Keys stored in K8s Secret openbao-keys.");
// Unseal
fresh_bao.unseal(&unseal_key).await?;
output::ok("Vault unsealed.");
output::step("Vault re-initialized. Run `sunbeam seed` now to restore all secrets.");
Ok(())
}


@@ -15,8 +15,6 @@ use std::collections::HashMap;
pub struct BaoClient {
pub base_url: String,
pub token: Option<String>,
/// Optional bearer token for proxy auth_request (separate from vault token).
pub bearer_token: Option<String>,
http: reqwest::Client,
}
@@ -69,26 +67,17 @@ impl BaoClient {
Self {
base_url: base_url.trim_end_matches('/').to_string(),
token: None,
bearer_token: None,
http: reqwest::Client::new(),
}
}
/// Create a client with a vault authentication token.
/// Create a client with an authentication token.
pub fn with_token(base_url: &str, token: &str) -> Self {
let mut client = Self::new(base_url);
client.token = Some(token.to_string());
client
}
/// Create a client with both a vault token and a bearer token for proxy auth.
pub fn with_proxy_auth(base_url: &str, vault_token: &str, bearer_token: &str) -> Self {
let mut client = Self::new(base_url);
client.token = Some(vault_token.to_string());
client.bearer_token = Some(bearer_token.to_string());
client
}
fn url(&self, path: &str) -> String {
format!("{}/v1/{}", self.base_url, path.trim_start_matches('/'))
}
@@ -98,9 +87,6 @@ impl BaoClient {
if let Some(ref token) = self.token {
req = req.header("X-Vault-Token", token);
}
if let Some(ref bearer) = self.bearer_token {
req = req.header("Authorization", format!("Bearer {bearer}"));
}
req
}
@@ -109,7 +95,8 @@ impl BaoClient {
/// Get the seal status of the OpenBao instance.
pub async fn seal_status(&self) -> Result<SealStatusResponse> {
let resp = self
.request(reqwest::Method::GET, "sys/seal-status")
.http
.get(format!("{}/v1/sys/seal-status", self.base_url))
.send()
.await
.ctx("Failed to connect to OpenBao")?;


@@ -102,15 +102,6 @@ fn rand_token_n(n: usize) -> String {
base64::engine::general_purpose::URL_SAFE_NO_PAD.encode(buf)
}
/// Generate an alphanumeric random string of exactly `n` characters.
/// Used for secrets that require a fixed character length (e.g. xchacha20-poly1305 cipher keys).
pub(crate) fn rand_alphanum(n: usize) -> String {
use rand::rngs::OsRng;
use rand::Rng;
const CHARSET: &[u8] = b"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
(0..n).map(|_| CHARSET[OsRng.gen_range(0..CHARSET.len())] as char).collect()
}
// ── Port-forward helper ─────────────────────────────────────────────────────
/// Port-forward guard — cancels the background forwarder on drop.


@@ -11,8 +11,8 @@ use crate::openbao::BaoClient;
use crate::output::{ok, warn};
use super::{
gen_dkim_key_pair, gen_fernet_key, port_forward, rand_alphanum, rand_token, rand_token_n,
scw_config, wait_pod_running, delete_resource, GITEA_ADMIN_USER, SMTP_URI,
gen_dkim_key_pair, gen_fernet_key, port_forward, rand_token, rand_token_n, scw_config,
wait_pod_running, delete_resource, GITEA_ADMIN_USER, SMTP_URI,
};
/// Internal result from seed_openbao, used by cmd_seed.
@@ -101,21 +101,6 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
data.insert("root-token".to_string(), root_token.clone());
k::create_secret("data", "openbao-keys", data).await?;
ok("Initialized -- keys stored in secret/openbao-keys.");
// Save to local keystore
let domain = crate::config::domain();
let ks = crate::vault_keystore::VaultKeystore {
version: 1,
domain: domain.to_string(),
created_at: chrono::Utc::now(),
updated_at: chrono::Utc::now(),
root_token: root_token.clone(),
unseal_keys_b64: vec![unseal_key.clone()],
key_shares: 1,
key_threshold: 1,
};
crate::vault_keystore::save_keystore(&ks)?;
ok(&format!("Keys backed up to local keystore at {}", crate::vault_keystore::keystore_path(domain).display()));
}
Err(e) => {
warn(&format!(
@@ -129,30 +114,6 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
}
} else {
ok("Already initialized.");
let domain = crate::config::domain();
// Try local keystore first (survives K8s Secret overwrites)
if crate::vault_keystore::keystore_exists(domain) {
match crate::vault_keystore::load_keystore(domain) {
Ok(ks) => {
unseal_key = ks.unseal_keys_b64.first().cloned().unwrap_or_default();
root_token = ks.root_token.clone();
ok("Loaded keys from local keystore.");
// Restore K8s Secret if it was wiped
let k8s_token = k::kube_get_secret_field("data", "openbao-keys", "root-token").await.unwrap_or_default();
if k8s_token.is_empty() && !root_token.is_empty() {
warn("K8s Secret openbao-keys is empty — restoring from local keystore.");
let mut data = HashMap::new();
data.insert("key".to_string(), unseal_key.clone());
data.insert("root-token".to_string(), root_token.clone());
k::create_secret("data", "openbao-keys", data).await?;
ok("Restored openbao-keys from local keystore.");
}
}
Err(e) => {
warn(&format!("Failed to load local keystore: {e}"));
// Fall back to K8s Secret
if let Ok(key) = k::kube_get_secret_field("data", "openbao-keys", "key").await {
unseal_key = key;
}
@@ -160,36 +121,6 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
root_token = token;
}
}
}
} else {
// No local keystore — read from K8s Secret and backfill
if let Ok(key) = k::kube_get_secret_field("data", "openbao-keys", "key").await {
unseal_key = key;
}
if let Ok(token) = k::kube_get_secret_field("data", "openbao-keys", "root-token").await {
root_token = token;
}
// Backfill local keystore if we got keys from the cluster
if !root_token.is_empty() && !unseal_key.is_empty() {
let ks = crate::vault_keystore::VaultKeystore {
version: 1,
domain: domain.to_string(),
created_at: chrono::Utc::now(),
updated_at: chrono::Utc::now(),
root_token: root_token.clone(),
unseal_keys_b64: vec![unseal_key.clone()],
key_shares: 1,
key_threshold: 1,
};
if let Err(e) = crate::vault_keystore::save_keystore(&ks) {
warn(&format!("Failed to backfill local keystore: {e}"));
} else {
ok(&format!("Backfilled local keystore at {}", crate::vault_keystore::keystore_path(domain).display()));
}
}
}
}
// Unseal if needed
let status = bao.seal_status().await.unwrap_or_else(|_| {
@@ -238,14 +169,12 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
.await?;
let smtp_uri_fn = || SMTP_URI.to_string();
let cipher_fn = || rand_alphanum(32);
let kratos = get_or_create(
&bao,
"kratos",
&[
("secrets-default", &rand_token as &dyn Fn() -> String),
("secrets-cookie", &rand_token),
("secrets-cipher", &cipher_fn),
("smtp-connection-uri", &smtp_uri_fn),
],
&mut dirty_paths,
@@ -539,32 +468,10 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
for (path, data) in all_paths {
if dirty_paths.contains(*path) {
// Use kv_put for new paths (patch fails with 404 on nonexistent keys).
// Try patch first (preserves manually-set fields), fall back to put.
if bao.kv_patch("secret", path, data).await.is_err() {
bao.kv_put("secret", path, data).await?;
bao.kv_patch("secret", path, data).await?;
}
}
}
}
// Seed resource server allowed audiences for La Suite external APIs.
// Combines the static sunbeam-cli client ID with dynamic service client IDs.
ok("Configuring La Suite resource server audiences...");
{
let mut rs_audiences = HashMap::new();
// sunbeam-cli is always static (OAuth2Client CRD name)
let mut audiences = vec!["sunbeam-cli".to_string()];
// Read the messages client ID from the oidc-messages secret if available
if let Ok(client_id) = crate::kube::kube_get_secret_field("lasuite", "oidc-messages", "CLIENT_ID").await {
audiences.push(client_id);
}
rs_audiences.insert(
"OIDC_RS_ALLOWED_AUDIENCES".to_string(),
audiences.join(","),
);
bao.kv_put("secret", "drive-rs-audiences", &rs_audiences).await?;
}
// Patch gitea admin credentials into secret/sol for Sol's Gitea integration.
// Uses kv_patch to preserve manually-set keys (matrix-access-token etc.).
@@ -577,9 +484,7 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
sol_gitea.insert("gitea-admin-password".to_string(), p.clone());
}
if !sol_gitea.is_empty() {
if bao.kv_patch("secret", "sol", &sol_gitea).await.is_err() {
bao.kv_put("secret", "sol", &sol_gitea).await?;
}
bao.kv_patch("secret", "sol", &sol_gitea).await?;
}
}
@@ -632,63 +537,6 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
)
.await?;
// ── JWT auth for CLI (OIDC via Hydra) ─────────────────────────────
// Enables `sunbeam vault` commands to authenticate with SSO tokens
// instead of the root token. Users with `admin: true` in their
// Kratos metadata_admin get full vault access.
ok("Configuring JWT/OIDC auth for CLI...");
let _ = bao.auth_enable("jwt", "jwt").await;
let domain = crate::config::domain();
bao.write(
"auth/jwt/config",
&serde_json::json!({
"oidc_discovery_url": format!("https://auth.{domain}/"),
"default_role": "cli-reader"
}),
)
.await?;
// Admin role — full access for users with admin: true in JWT
let admin_policy_hcl = concat!(
"path \"*\" { capabilities = [\"create\", \"read\", \"update\", \"delete\", \"list\", \"sudo\"] }\n",
);
bao.write_policy("cli-admin", admin_policy_hcl).await?;
bao.write(
"auth/jwt/role/cli-admin",
&serde_json::json!({
"role_type": "jwt",
"bound_audiences": ["sunbeam-cli"],
"user_claim": "sub",
"bound_claims": { "admin": true },
"policies": ["cli-admin"],
"ttl": "1h"
}),
)
.await?;
// Reader role — read-only access for non-admin SSO users
let cli_reader_hcl = concat!(
"path \"secret/data/*\" { capabilities = [\"read\"] }\n",
"path \"secret/metadata/*\" { capabilities = [\"read\", \"list\"] }\n",
"path \"sys/health\" { capabilities = [\"read\", \"sudo\"] }\n",
"path \"sys/seal-status\" { capabilities = [\"read\"] }\n",
);
bao.write_policy("cli-reader", cli_reader_hcl).await?;
bao.write(
"auth/jwt/role/cli-reader",
&serde_json::json!({
"role_type": "jwt",
"bound_audiences": ["sunbeam-cli"],
"user_claim": "sub",
"policies": ["cli-reader"],
"ttl": "1h"
}),
)
.await?;
// Build credentials map
let mut creds = HashMap::new();
let field_map: &[(&str, &str, &HashMap<String, String>)] = &[


@@ -1,644 +0,0 @@
//! Encrypted local keystore for OpenBao vault keys.
//!
//! Stores root tokens and unseal keys locally, encrypted with AES-256-GCM.
//! Key derivation uses Argon2id with a machine-specific salt. This ensures
//! vault keys survive K8s Secret overwrites and are never lost.
use crate::error::{Result, SunbeamError};
use aes_gcm::aead::{Aead, KeyInit, OsRng};
use aes_gcm::{Aes256Gcm, Nonce};
use chrono::{DateTime, Utc};
use rand::RngCore;
use serde::{Deserialize, Serialize};
use std::path::{Path, PathBuf};
/// AES-256-GCM nonce size.
const NONCE_LEN: usize = 12;
/// Argon2 salt size.
const SALT_LEN: usize = 16;
/// Machine salt size (stored in .machine-salt file).
const MACHINE_SALT_LEN: usize = 32;
/// Vault keys stored in the encrypted keystore.
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct VaultKeystore {
pub version: u32,
pub domain: String,
pub created_at: DateTime<Utc>,
pub updated_at: DateTime<Utc>,
pub root_token: String,
pub unseal_keys_b64: Vec<String>,
pub key_shares: u32,
pub key_threshold: u32,
}
/// Result of comparing local keystore with cluster state.
#[derive(Debug, Clone, PartialEq)]
pub enum SyncStatus {
/// Local and cluster keys match.
InSync,
/// Local keystore exists but cluster secret is missing/empty.
ClusterMissing,
/// Cluster secret exists but no local keystore.
LocalMissing,
/// Both exist but differ.
Mismatch,
/// Neither exists.
NoKeys,
}
// ---------------------------------------------------------------------------
// Path helpers
// ---------------------------------------------------------------------------
/// Base directory for vault keystore files.
fn base_dir(override_dir: Option<&Path>) -> PathBuf {
if let Some(d) = override_dir {
return d.to_path_buf();
}
dirs::data_dir()
.unwrap_or_else(|| {
dirs::home_dir()
.unwrap_or_else(|| PathBuf::from("."))
.join(".local/share")
})
.join("sunbeam")
.join("vault")
}
/// Path to the encrypted keystore file for a domain.
pub fn keystore_path(domain: &str) -> PathBuf {
keystore_path_in(domain, None)
}
fn keystore_path_in(domain: &str, override_dir: Option<&Path>) -> PathBuf {
let dir = base_dir(override_dir);
let safe = domain.replace(['/', '\\', ':'], "_");
let name = if safe.is_empty() { "default" } else { &safe };
dir.join(format!("{name}.enc"))
}
/// Whether a local keystore exists for this domain.
pub fn keystore_exists(domain: &str) -> bool {
keystore_path(domain).exists()
}
fn keystore_exists_in(domain: &str, dir: Option<&Path>) -> bool {
keystore_path_in(domain, dir).exists()
}
// ---------------------------------------------------------------------------
// Machine salt
// ---------------------------------------------------------------------------
fn machine_salt_path(override_dir: Option<&Path>) -> PathBuf {
base_dir(override_dir).join(".machine-salt")
}
fn load_or_create_machine_salt(override_dir: Option<&Path>) -> Result<Vec<u8>> {
let path = machine_salt_path(override_dir);
if path.exists() {
let data = std::fs::read(&path)
.map_err(|e| SunbeamError::Other(format!("reading machine salt: {e}")))?;
if data.len() == MACHINE_SALT_LEN {
return Ok(data);
}
// Wrong length — regenerate
}
// Create parent directories
if let Some(parent) = path.parent() {
std::fs::create_dir_all(parent)
.map_err(|e| SunbeamError::Other(format!("creating vault dir: {e}")))?;
}
// Generate new salt
let mut salt = vec![0u8; MACHINE_SALT_LEN];
OsRng.fill_bytes(&mut salt);
std::fs::write(&path, &salt)
.map_err(|e| SunbeamError::Other(format!("writing machine salt: {e}")))?;
// Set 0600 permissions
#[cfg(unix)]
{
use std::os::unix::fs::PermissionsExt;
let perms = std::fs::Permissions::from_mode(0o600);
std::fs::set_permissions(&path, perms)
.map_err(|e| SunbeamError::Other(format!("setting salt permissions: {e}")))?;
}
Ok(salt)
}
// ---------------------------------------------------------------------------
// Key derivation
// ---------------------------------------------------------------------------
fn derive_key(domain: &str, argon2_salt: &[u8], override_dir: Option<&Path>) -> Result<[u8; 32]> {
let machine_salt = load_or_create_machine_salt(override_dir)?;
// Combine machine salt + domain for input
let mut input = machine_salt;
input.extend_from_slice(b"sunbeam-vault-keystore:");
input.extend_from_slice(domain.as_bytes());
let mut key = [0u8; 32];
argon2::Argon2::default()
.hash_password_into(&input, argon2_salt, &mut key)
.map_err(|e| SunbeamError::Other(format!("argon2 key derivation: {e}")))?;
Ok(key)
}
// ---------------------------------------------------------------------------
// Encrypt / decrypt
// ---------------------------------------------------------------------------
fn encrypt(plaintext: &[u8], domain: &str, override_dir: Option<&Path>) -> Result<Vec<u8>> {
// Generate random nonce and argon2 salt
let mut nonce_bytes = [0u8; NONCE_LEN];
let mut argon2_salt = [0u8; SALT_LEN];
OsRng.fill_bytes(&mut nonce_bytes);
OsRng.fill_bytes(&mut argon2_salt);
let key = derive_key(domain, &argon2_salt, override_dir)?;
let cipher = Aes256Gcm::new_from_slice(&key)
.map_err(|e| SunbeamError::Other(format!("AES init: {e}")))?;
let nonce = Nonce::from_slice(&nonce_bytes);
let ciphertext = cipher
.encrypt(nonce, plaintext)
.map_err(|e| SunbeamError::Other(format!("AES encrypt: {e}")))?;
// Output: [nonce (12)][argon2_salt (16)][ciphertext+tag]
let mut output = Vec::with_capacity(NONCE_LEN + SALT_LEN + ciphertext.len());
output.extend_from_slice(&nonce_bytes);
output.extend_from_slice(&argon2_salt);
output.extend_from_slice(&ciphertext);
Ok(output)
}
fn decrypt(data: &[u8], domain: &str, override_dir: Option<&Path>) -> Result<Vec<u8>> {
let header_len = NONCE_LEN + SALT_LEN;
if data.len() < header_len + 16 {
// 16 bytes minimum for AES-GCM tag
return Err(SunbeamError::Other(
"vault keystore file is too short or corrupt".into(),
));
}
let nonce_bytes = &data[..NONCE_LEN];
let argon2_salt = &data[NONCE_LEN..header_len];
let ciphertext = &data[header_len..];
let key = derive_key(domain, argon2_salt, override_dir)?;
let cipher = Aes256Gcm::new_from_slice(&key)
.map_err(|e| SunbeamError::Other(format!("AES init: {e}")))?;
let nonce = Nonce::from_slice(nonce_bytes);
cipher
.decrypt(nonce, ciphertext)
.map_err(|_| SunbeamError::Other("vault keystore decryption failed — file is corrupt or was encrypted on a different machine".into()))
}
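The on-disk layout that `encrypt` writes and `decrypt` reads back — `[nonce (12)][argon2_salt (16)][ciphertext+tag]` — can be sketched as a standalone splitter. This is a minimal illustration only; `split_keystore_blob` is a hypothetical helper name, not part of the module above:

```rust
const NONCE_LEN: usize = 12; // AES-256-GCM nonce
const SALT_LEN: usize = 16;  // Argon2 salt
const TAG_LEN: usize = 16;   // AES-GCM authentication tag (minimum trailing bytes)

/// Split an encrypted keystore blob into (nonce, argon2_salt, ciphertext+tag).
/// Returns None for blobs too short to contain even an empty GCM payload,
/// mirroring the "too short or corrupt" check in decrypt().
fn split_keystore_blob(data: &[u8]) -> Option<(&[u8], &[u8], &[u8])> {
    let header = NONCE_LEN + SALT_LEN;
    if data.len() < header + TAG_LEN {
        return None;
    }
    Some((&data[..NONCE_LEN], &data[NONCE_LEN..header], &data[header..]))
}
```

An empty plaintext still produces a 44-byte file (12 + 16 + 16), which is why `decrypt` rejects anything shorter.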
// ---------------------------------------------------------------------------
// Public API
// ---------------------------------------------------------------------------
/// Save a keystore, encrypted, to the local filesystem.
pub fn save_keystore(ks: &VaultKeystore) -> Result<()> {
save_keystore_in(ks, None)
}
fn save_keystore_in(ks: &VaultKeystore, override_dir: Option<&Path>) -> Result<()> {
let path = keystore_path_in(&ks.domain, override_dir);
// Create parent directories
if let Some(parent) = path.parent() {
std::fs::create_dir_all(parent)
.map_err(|e| SunbeamError::Other(format!("creating vault dir: {e}")))?;
}
let plaintext = serde_json::to_vec_pretty(ks)?;
let encrypted = encrypt(&plaintext, &ks.domain, override_dir)?;
std::fs::write(&path, &encrypted)
.map_err(|e| SunbeamError::Other(format!("writing keystore: {e}")))?;
// Set 0600 permissions
#[cfg(unix)]
{
use std::os::unix::fs::PermissionsExt;
let perms = std::fs::Permissions::from_mode(0o600);
std::fs::set_permissions(&path, perms)
.map_err(|e| SunbeamError::Other(format!("setting keystore permissions: {e}")))?;
}
Ok(())
}
/// Load and decrypt a keystore from the local filesystem.
pub fn load_keystore(domain: &str) -> Result<VaultKeystore> {
load_keystore_in(domain, None)
}
fn load_keystore_in(domain: &str, override_dir: Option<&Path>) -> Result<VaultKeystore> {
let path = keystore_path_in(domain, override_dir);
if !path.exists() {
return Err(SunbeamError::Other(format!(
"no vault keystore found for domain '{domain}' at {}",
path.display()
)));
}
let data = std::fs::read(&path)
.map_err(|e| SunbeamError::Other(format!("reading keystore: {e}")))?;
if data.is_empty() {
return Err(SunbeamError::Other("vault keystore file is empty".into()));
}
let plaintext = decrypt(&data, domain, override_dir)?;
let ks: VaultKeystore = serde_json::from_slice(&plaintext)
.map_err(|e| SunbeamError::Other(format!("parsing keystore JSON: {e}")))?;
if ks.version > 1 {
return Err(SunbeamError::Other(format!(
"vault keystore version {} is not supported (max: 1)",
ks.version
)));
}
Ok(ks)
}
/// Load and validate a keystore — fails if any critical fields are empty.
pub fn verify_vault_keys(domain: &str) -> Result<VaultKeystore> {
verify_vault_keys_in(domain, None)
}
fn verify_vault_keys_in(domain: &str, override_dir: Option<&Path>) -> Result<VaultKeystore> {
let ks = load_keystore_in(domain, override_dir)?;
if ks.root_token.is_empty() {
return Err(SunbeamError::Other(
"vault keystore has empty root_token".into(),
));
}
if ks.unseal_keys_b64.is_empty() {
return Err(SunbeamError::Other(
"vault keystore has no unseal keys".into(),
));
}
if ks.key_shares == 0 {
return Err(SunbeamError::Other(
"vault keystore has key_shares=0".into(),
));
}
if ks.key_threshold == 0 || ks.key_threshold > ks.key_shares {
return Err(SunbeamError::Other(format!(
"vault keystore has invalid threshold={}/shares={}",
ks.key_threshold, ks.key_shares
)));
}
Ok(ks)
}
/// Export keystore as plaintext JSON (for machine migration).
pub fn export_plaintext(domain: &str) -> Result<String> {
let ks = load_keystore(domain)?;
serde_json::to_string_pretty(&ks)
.map_err(|e| SunbeamError::Other(format!("serializing keystore: {e}")))
}
// ---------------------------------------------------------------------------
// Tests
// ---------------------------------------------------------------------------
#[cfg(test)]
mod tests {
use super::*;
use tempfile::TempDir;
fn test_keystore(domain: &str) -> VaultKeystore {
VaultKeystore {
version: 1,
domain: domain.to_string(),
created_at: Utc::now(),
updated_at: Utc::now(),
root_token: "hvs.test-root-token-abc123".to_string(),
unseal_keys_b64: vec!["dGVzdC11bnNlYWwta2V5".to_string()],
key_shares: 1,
key_threshold: 1,
}
}
// -- Encryption roundtrip ------------------------------------------------
#[test]
fn test_encrypt_decrypt_roundtrip() {
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
let loaded = load_keystore_in("sunbeam.pt", Some(dir.path())).unwrap();
assert_eq!(loaded.root_token, ks.root_token);
assert_eq!(loaded.unseal_keys_b64, ks.unseal_keys_b64);
assert_eq!(loaded.domain, ks.domain);
assert_eq!(loaded.key_shares, ks.key_shares);
assert_eq!(loaded.key_threshold, ks.key_threshold);
}
#[test]
fn test_encrypt_decrypt_large_token() {
let dir = TempDir::new().unwrap();
let mut ks = test_keystore("sunbeam.pt");
ks.root_token = format!("hvs.{}", "a".repeat(200));
save_keystore_in(&ks, Some(dir.path())).unwrap();
let loaded = load_keystore_in("sunbeam.pt", Some(dir.path())).unwrap();
assert_eq!(loaded.root_token, ks.root_token);
}
#[test]
fn test_different_domains_different_ciphertext() {
let dir = TempDir::new().unwrap();
let ks_a = test_keystore("a.example.com");
let ks_b = VaultKeystore {
domain: "b.example.com".into(),
..test_keystore("b.example.com")
};
save_keystore_in(&ks_a, Some(dir.path())).unwrap();
save_keystore_in(&ks_b, Some(dir.path())).unwrap();
let file_a = std::fs::read(keystore_path_in("a.example.com", Some(dir.path()))).unwrap();
let file_b = std::fs::read(keystore_path_in("b.example.com", Some(dir.path()))).unwrap();
// Different ciphertext (random nonce + different key derivation)
assert_ne!(file_a, file_b);
}
#[test]
fn test_domain_binding() {
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
// Try to load with wrong domain — should fail decryption
let path_a = keystore_path_in("sunbeam.pt", Some(dir.path()));
let path_b = keystore_path_in("evil.com", Some(dir.path()));
std::fs::copy(&path_a, &path_b).unwrap();
let result = load_keystore_in("evil.com", Some(dir.path()));
assert!(result.is_err());
}
// -- Machine salt --------------------------------------------------------
#[test]
fn test_machine_salt_created_on_first_use() {
let dir = TempDir::new().unwrap();
let salt_path = machine_salt_path(Some(dir.path()));
assert!(!salt_path.exists());
let salt = load_or_create_machine_salt(Some(dir.path())).unwrap();
assert!(salt_path.exists());
assert_eq!(salt.len(), MACHINE_SALT_LEN);
}
#[test]
fn test_machine_salt_reused_on_subsequent_calls() {
let dir = TempDir::new().unwrap();
let salt1 = load_or_create_machine_salt(Some(dir.path())).unwrap();
let salt2 = load_or_create_machine_salt(Some(dir.path())).unwrap();
assert_eq!(salt1, salt2);
}
#[cfg(unix)]
#[test]
fn test_machine_salt_permissions() {
use std::os::unix::fs::PermissionsExt;
let dir = TempDir::new().unwrap();
load_or_create_machine_salt(Some(dir.path())).unwrap();
let path = machine_salt_path(Some(dir.path()));
let perms = std::fs::metadata(&path).unwrap().permissions();
assert_eq!(perms.mode() & 0o777, 0o600);
}
#[test]
fn test_machine_salt_32_bytes() {
let dir = TempDir::new().unwrap();
let salt = load_or_create_machine_salt(Some(dir.path())).unwrap();
assert_eq!(salt.len(), 32);
}
// -- File integrity ------------------------------------------------------
#[test]
fn test_corrupt_nonce() {
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
let path = keystore_path_in("sunbeam.pt", Some(dir.path()));
let mut data = std::fs::read(&path).unwrap();
data[0] ^= 0xFF; // flip bits in nonce
std::fs::write(&path, &data).unwrap();
assert!(load_keystore_in("sunbeam.pt", Some(dir.path())).is_err());
}
#[test]
fn test_corrupt_ciphertext() {
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
let path = keystore_path_in("sunbeam.pt", Some(dir.path()));
let mut data = std::fs::read(&path).unwrap();
let last = data.len() - 1;
data[last] ^= 0xFF; // flip bits in ciphertext
std::fs::write(&path, &data).unwrap();
assert!(load_keystore_in("sunbeam.pt", Some(dir.path())).is_err());
}
#[test]
fn test_truncated_file() {
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
let path = keystore_path_in("sunbeam.pt", Some(dir.path()));
std::fs::write(&path, &[0u8; 10]).unwrap(); // too short
let result = load_keystore_in("sunbeam.pt", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("too short"));
}
#[test]
fn test_empty_file() {
let dir = TempDir::new().unwrap();
let path = keystore_path_in("sunbeam.pt", Some(dir.path()));
std::fs::create_dir_all(path.parent().unwrap()).unwrap();
std::fs::write(&path, &[]).unwrap();
let result = load_keystore_in("sunbeam.pt", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("empty"));
}
#[test]
fn test_wrong_version() {
let dir = TempDir::new().unwrap();
let mut ks = test_keystore("sunbeam.pt");
ks.version = 99;
save_keystore_in(&ks, Some(dir.path())).unwrap();
let result = load_keystore_in("sunbeam.pt", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("not supported"));
}
// -- Concurrency / edge cases -------------------------------------------
#[test]
fn test_save_overwrites_existing() {
let dir = TempDir::new().unwrap();
let ks1 = test_keystore("sunbeam.pt");
save_keystore_in(&ks1, Some(dir.path())).unwrap();
let mut ks2 = test_keystore("sunbeam.pt");
ks2.root_token = "hvs.new-token".into();
save_keystore_in(&ks2, Some(dir.path())).unwrap();
let loaded = load_keystore_in("sunbeam.pt", Some(dir.path())).unwrap();
assert_eq!(loaded.root_token, "hvs.new-token");
}
#[test]
fn test_load_nonexistent_domain() {
let dir = TempDir::new().unwrap();
let result = load_keystore_in("nonexistent.example.com", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("no vault keystore"));
}
#[test]
fn test_keystore_exists_true() {
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
assert!(keystore_exists_in("sunbeam.pt", Some(dir.path())));
}
#[test]
fn test_keystore_exists_false() {
let dir = TempDir::new().unwrap();
assert!(!keystore_exists_in("sunbeam.pt", Some(dir.path())));
}
#[test]
fn test_save_creates_parent_directories() {
let dir = TempDir::new().unwrap();
let nested = dir.path().join("deeply").join("nested").join("vault");
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(&nested)).unwrap();
assert!(keystore_path_in("sunbeam.pt", Some(&nested)).exists());
}
// -- Field validation ---------------------------------------------------
#[test]
fn test_verify_rejects_empty_root_token() {
let dir = TempDir::new().unwrap();
let mut ks = test_keystore("sunbeam.pt");
ks.root_token = String::new();
save_keystore_in(&ks, Some(dir.path())).unwrap();
let result = verify_vault_keys_in("sunbeam.pt", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("empty root_token"));
}
#[test]
fn test_verify_rejects_empty_unseal_keys() {
let dir = TempDir::new().unwrap();
let mut ks = test_keystore("sunbeam.pt");
ks.unseal_keys_b64 = vec![];
save_keystore_in(&ks, Some(dir.path())).unwrap();
let result = verify_vault_keys_in("sunbeam.pt", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("no unseal keys"));
}
#[test]
fn test_verify_rejects_zero_shares() {
let dir = TempDir::new().unwrap();
let mut ks = test_keystore("sunbeam.pt");
ks.key_shares = 0;
save_keystore_in(&ks, Some(dir.path())).unwrap();
let result = verify_vault_keys_in("sunbeam.pt", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("key_shares=0"));
}
#[test]
fn test_verify_rejects_invalid_threshold() {
let dir = TempDir::new().unwrap();
let mut ks = test_keystore("sunbeam.pt");
ks.key_shares = 3;
ks.key_threshold = 5; // threshold > shares
save_keystore_in(&ks, Some(dir.path())).unwrap();
let result = verify_vault_keys_in("sunbeam.pt", Some(dir.path()));
assert!(result.is_err());
assert!(result.unwrap_err().to_string().contains("invalid threshold"));
}
// -- Integration-style ---------------------------------------------------
#[test]
fn test_full_lifecycle() {
let dir = TempDir::new().unwrap();
// Create
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
// Verify
let verified = verify_vault_keys_in("sunbeam.pt", Some(dir.path())).unwrap();
assert_eq!(verified.root_token, ks.root_token);
// Modify
let mut ks2 = verified;
ks2.root_token = "hvs.rotated-token".into();
ks2.updated_at = Utc::now();
save_keystore_in(&ks2, Some(dir.path())).unwrap();
// Reload
let reloaded = load_keystore_in("sunbeam.pt", Some(dir.path())).unwrap();
assert_eq!(reloaded.root_token, "hvs.rotated-token");
}
#[test]
fn test_export_plaintext_format() {
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
// Export by loading and serializing (mirrors the public function logic)
let loaded = load_keystore_in("sunbeam.pt", Some(dir.path())).unwrap();
let json = serde_json::to_string_pretty(&loaded).unwrap();
assert!(json.contains("hvs.test-root-token-abc123"));
assert!(json.contains("sunbeam.pt"));
assert!(json.contains("\"version\": 1"));
}
#[test]
fn test_reinit_flow() {
let dir = TempDir::new().unwrap();
// Initial save
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
// Simulate: cluster keys are lost — local keystore still has them
let recovered = load_keystore_in("sunbeam.pt", Some(dir.path())).unwrap();
assert_eq!(recovered.root_token, ks.root_token);
assert_eq!(recovered.unseal_keys_b64, ks.unseal_keys_b64);
// Simulate: reinit with new keys
let mut new_ks = test_keystore("sunbeam.pt");
new_ks.root_token = "hvs.new-after-reinit".into();
new_ks.unseal_keys_b64 = vec!["bmV3LXVuc2VhbC1rZXk=".into()];
save_keystore_in(&new_ks, Some(dir.path())).unwrap();
let final_ks = load_keystore_in("sunbeam.pt", Some(dir.path())).unwrap();
assert_eq!(final_ks.root_token, "hvs.new-after-reinit");
}
#[cfg(unix)]
#[test]
fn test_keystore_file_permissions() {
use std::os::unix::fs::PermissionsExt;
let dir = TempDir::new().unwrap();
let ks = test_keystore("sunbeam.pt");
save_keystore_in(&ks, Some(dir.path())).unwrap();
let path = keystore_path_in("sunbeam.pt", Some(dir.path()));
let perms = std::fs::metadata(&path).unwrap().permissions();
assert_eq!(perms.mode() & 0o777, 0o600);
}
}


@@ -1,8 +1,8 @@
[package]
name = "sunbeam"
version = "1.1.2"
version = "1.0.1"
edition = "2024"
description = "Sunbeam Studios SDK, CLI, and ecosystem integrations"
description = "Sunbeam local dev stack manager"
[[bin]]
name = "sunbeam"
@@ -10,9 +10,31 @@ path = "src/main.rs"
[dependencies]
sunbeam-sdk = { path = "../sunbeam-sdk", features = ["all", "cli"] }
sunbeam-proto = { path = "../sunbeam-proto" }
tokio = { version = "1", features = ["full"] }
tokio-stream = "0.1"
clap = { version = "4", features = ["derive"] }
chrono = "0.4"
tracing = "0.1"
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
rustls = { version = "0.23", features = ["ring"] }
tonic = "0.14"
ratatui = "0.29"
crossterm = "0.28"
serde = { version = "1", features = ["derive"] }
serde_json = "1"
toml = "0.8"
anyhow = "1"
futures = "0.3"
crossbeam-channel = "0.5"
textwrap = "0.16"
tui-markdown = "=0.3.6"
tree-sitter = "0.24"
tree-sitter-rust = "0.23"
tree-sitter-typescript = "0.23"
tree-sitter-python = "0.23"
lsp-types = "0.97"
url = "2"
[dev-dependencies]
tokio-stream = { version = "0.1", features = ["net"] }


@@ -5,7 +5,7 @@ use clap::{Parser, Subcommand};
/// Sunbeam local dev stack manager.
#[derive(Parser, Debug)]
#[command(name = "sunbeam", about = "Sunbeam Studios CLI")]
#[command(name = "sunbeam", about = "Sunbeam local dev stack manager")]
pub struct Cli {
/// Named context to use (overrides current-context from config).
#[arg(long)]
@@ -30,121 +30,18 @@ pub struct Cli {
#[derive(Subcommand, Debug)]
pub enum Verb {
/// Platform operations (cluster, builds, deploys).
Platform {
#[command(subcommand)]
action: PlatformAction,
},
// -- Infrastructure commands (preserved) ----------------------------------
/// Manage sunbeam configuration.
Config {
#[command(subcommand)]
action: Option<ConfigAction>,
},
/// Project management.
Pm {
#[command(subcommand)]
action: Option<PmAction>,
},
/// Self-update from latest mainline commit.
Update,
/// Print version info.
Version,
// -- Service commands -----------------------------------------------------
/// Authentication, identity & OAuth2 management.
Auth {
#[command(subcommand)]
action: sunbeam_sdk::identity::cli::AuthCommand,
},
/// Version control.
Vcs {
#[command(subcommand)]
action: sunbeam_sdk::gitea::cli::VcsCommand,
},
/// Chat and messaging.
Chat {
#[command(subcommand)]
action: sunbeam_sdk::matrix::cli::ChatCommand,
},
/// Search engine.
Search {
#[command(subcommand)]
action: sunbeam_sdk::search::cli::SearchCommand,
},
/// Object storage.
Storage {
#[command(subcommand)]
action: sunbeam_sdk::storage::cli::StorageCommand,
},
/// Media and video.
Media {
#[command(subcommand)]
action: sunbeam_sdk::media::cli::MediaCommand,
},
/// Monitoring.
Mon {
#[command(subcommand)]
action: sunbeam_sdk::monitoring::cli::MonCommand,
},
/// Secrets management.
Vault {
#[command(subcommand)]
action: sunbeam_sdk::openbao::cli::VaultCommand,
},
/// Video meetings.
Meet {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::MeetCommand,
},
/// File storage.
Drive {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::DriveCommand,
},
/// Email.
Mail {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::MailCommand,
},
/// Calendar.
Cal {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::CalCommand,
},
/// Search across services.
Find {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::FindCommand,
},
}
#[derive(Subcommand, Debug)]
pub enum PlatformAction {
/// Full cluster bring-up.
Up,
/// Pod health (optionally scoped).
Status {
/// namespace or namespace/name
target: Option<String>,
},
/// Build and apply manifests.
/// kustomize build + domain subst + kubectl apply.
Apply {
/// Limit apply to one namespace.
namespace: Option<String>,
@@ -158,11 +55,14 @@ pub enum PlatformAction {
#[arg(long, default_value = "")]
email: String,
},
/// Seed credentials and secrets.
/// Generate/store all credentials in OpenBao.
Seed,
/// End-to-end integration test.
/// E2E VSO + OpenBao integration test.
Verify,
/// View service logs.
/// kubectl logs for a service.
Logs {
/// namespace/name
target: String,
@@ -170,19 +70,22 @@ pub enum PlatformAction {
#[arg(short, long)]
follow: bool,
},
/// Get a resource (ns/name).
/// Raw kubectl get for a pod (ns/name).
Get {
/// namespace/name
target: String,
/// Output format (yaml, json, wide).
/// kubectl output format (yaml, json, wide).
#[arg(long = "kubectl-output", default_value = "yaml", value_parser = ["yaml", "json", "wide"])]
output: String,
},
/// Rolling restart of services.
Restart {
/// namespace or namespace/name
target: Option<String>,
},
/// Build an artifact.
Build {
/// What to build.
@@ -193,25 +96,169 @@ pub enum PlatformAction {
/// Apply manifests and rollout restart after pushing (implies --push).
#[arg(long)]
deploy: bool,
/// Disable layer cache.
/// Disable buildkitd layer cache.
#[arg(long)]
no_cache: bool,
},
/// Service health checks.
/// Functional service health checks.
Check {
/// namespace or namespace/name
target: Option<String>,
},
/// Mirror container images.
/// Mirror amd64-only La Suite images.
Mirror,
/// Bootstrap orgs, repos, and services.
/// Create Gitea orgs/repos; bootstrap services.
Bootstrap,
/// Manage sunbeam configuration.
Config {
#[command(subcommand)]
action: Option<ConfigAction>,
},
/// kubectl passthrough.
K8s {
/// arguments forwarded verbatim to kubectl
#[arg(trailing_var_arg = true, allow_hyphen_values = true)]
kubectl_args: Vec<String>,
},
/// bao CLI passthrough (runs inside OpenBao pod with root token).
Bao {
/// arguments forwarded verbatim to bao
#[arg(trailing_var_arg = true, allow_hyphen_values = true)]
bao_args: Vec<String>,
},
/// Project management across Planka and Gitea.
Pm {
#[command(subcommand)]
action: Option<PmAction>,
},
/// Terminal coding agent powered by Sol.
Code {
#[command(subcommand)]
action: Option<crate::code::CodeCommand>,
},
/// Reindex Gitea repos into Sol's code search index.
#[command(name = "reindex-code")]
ReindexCode {
/// Filter to a specific org.
#[arg(long)]
org: Option<String>,
/// Index a specific repo (owner/name format).
#[arg(long)]
repo: Option<String>,
/// Index a specific branch (default: repo's default branch).
#[arg(long)]
branch: Option<String>,
/// Sol gRPC endpoint.
#[arg(long, default_value = "http://127.0.0.1:50051")]
endpoint: String,
},
/// Self-update from latest mainline commit.
Update,
/// Print version info.
Version,
// -- Service commands (new) -----------------------------------------------
/// Authentication, identity & OAuth2 management.
Auth {
#[command(subcommand)]
action: sunbeam_sdk::identity::cli::AuthCommand,
},
/// Version control (Gitea).
Vcs {
#[command(subcommand)]
action: sunbeam_sdk::gitea::cli::VcsCommand,
},
/// Chat / messaging (Matrix).
Chat {
#[command(subcommand)]
action: sunbeam_sdk::matrix::cli::ChatCommand,
},
/// Search engine (OpenSearch).
Search {
#[command(subcommand)]
action: sunbeam_sdk::search::cli::SearchCommand,
},
/// Object storage (S3).
Storage {
#[command(subcommand)]
action: sunbeam_sdk::storage::cli::StorageCommand,
},
/// Media / video (LiveKit).
Media {
#[command(subcommand)]
action: sunbeam_sdk::media::cli::MediaCommand,
},
/// Monitoring (Prometheus, Loki, Grafana).
Mon {
#[command(subcommand)]
action: sunbeam_sdk::monitoring::cli::MonCommand,
},
/// Secrets management (OpenBao/Vault).
Vault {
#[command(subcommand)]
action: sunbeam_sdk::openbao::cli::VaultCommand,
},
/// People / contacts (La Suite).
People {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::PeopleCommand,
},
/// Documents (La Suite).
Docs {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::DocsCommand,
},
/// Video meetings (La Suite).
Meet {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::MeetCommand,
},
/// File storage (La Suite).
Drive {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::DriveCommand,
},
/// Email (La Suite).
Mail {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::MailCommand,
},
/// Calendar (La Suite).
Cal {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::CalCommand,
},
/// Search across La Suite services.
Find {
#[command(subcommand)]
action: sunbeam_sdk::lasuite::cli::FindCommand,
},
}
#[derive(Subcommand, Debug)]
@@ -308,16 +355,16 @@ mod tests {
// 1. test_up
#[test]
fn test_up() {
let cli = parse(&["sunbeam", "platform", "up"]);
assert!(matches!(cli.verb, Some(Verb::Platform { action: PlatformAction::Up })));
let cli = parse(&["sunbeam", "up"]);
assert!(matches!(cli.verb, Some(Verb::Up)));
}
// 2. test_status_no_target
#[test]
fn test_status_no_target() {
let cli = parse(&["sunbeam", "platform", "status"]);
let cli = parse(&["sunbeam", "status"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Status { target } }) => assert!(target.is_none()),
Some(Verb::Status { target }) => assert!(target.is_none()),
_ => panic!("expected Status"),
}
}
@@ -325,9 +372,9 @@ mod tests {
// 3. test_status_with_namespace
#[test]
fn test_status_with_namespace() {
let cli = parse(&["sunbeam", "platform", "status", "ory"]);
let cli = parse(&["sunbeam", "status", "ory"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Status { target } }) => assert_eq!(target.unwrap(), "ory"),
Some(Verb::Status { target }) => assert_eq!(target.unwrap(), "ory"),
_ => panic!("expected Status"),
}
}
@@ -335,9 +382,9 @@ mod tests {
// 4. test_logs_no_follow
#[test]
fn test_logs_no_follow() {
let cli = parse(&["sunbeam", "platform", "logs", "ory/kratos"]);
let cli = parse(&["sunbeam", "logs", "ory/kratos"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Logs { target, follow } }) => {
Some(Verb::Logs { target, follow }) => {
assert_eq!(target, "ory/kratos");
assert!(!follow);
}
@@ -348,9 +395,9 @@ mod tests {
// 5. test_logs_follow_short
#[test]
fn test_logs_follow_short() {
let cli = parse(&["sunbeam", "platform", "logs", "ory/kratos", "-f"]);
let cli = parse(&["sunbeam", "logs", "ory/kratos", "-f"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Logs { follow, .. } }) => assert!(follow),
Some(Verb::Logs { follow, .. }) => assert!(follow),
_ => panic!("expected Logs"),
}
}
@@ -358,9 +405,9 @@ mod tests {
// 6. test_build_proxy
#[test]
fn test_build_proxy() {
let cli = parse(&["sunbeam", "platform", "build", "proxy"]);
let cli = parse(&["sunbeam", "build", "proxy"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { what, push, deploy, no_cache } }) => {
Some(Verb::Build { what, push, deploy, no_cache }) => {
assert!(matches!(what, BuildTarget::Proxy));
assert!(!push);
assert!(!deploy);
@@ -373,9 +420,9 @@ mod tests {
// 7. test_build_deploy_flag
#[test]
fn test_build_deploy_flag() {
let cli = parse(&["sunbeam", "platform", "build", "proxy", "--deploy"]);
let cli = parse(&["sunbeam", "build", "proxy", "--deploy"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { deploy, push, no_cache, .. } }) => {
Some(Verb::Build { deploy, push, no_cache, .. }) => {
assert!(deploy);
// clap does not imply --push; that logic is in dispatch()
assert!(!push);
@@ -388,16 +435,16 @@ mod tests {
// 8. test_build_invalid_target
#[test]
fn test_build_invalid_target() {
let result = Cli::try_parse_from(&["sunbeam", "platform", "build", "notavalidtarget"]);
let result = Cli::try_parse_from(&["sunbeam", "build", "notavalidtarget"]);
assert!(result.is_err());
}
// 12. test_apply_no_namespace
#[test]
fn test_apply_no_namespace() {
let cli = parse(&["sunbeam", "platform", "apply"]);
let cli = parse(&["sunbeam", "apply"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Apply { namespace, .. } }) => assert!(namespace.is_none()),
Some(Verb::Apply { namespace, .. }) => assert!(namespace.is_none()),
_ => panic!("expected Apply"),
}
}
@@ -405,9 +452,9 @@ mod tests {
// 13. test_apply_with_namespace
#[test]
fn test_apply_with_namespace() {
let cli = parse(&["sunbeam", "platform", "apply", "lasuite"]);
let cli = parse(&["sunbeam", "apply", "lasuite"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Apply { namespace, .. } }) => assert_eq!(namespace.unwrap(), "lasuite"),
Some(Verb::Apply { namespace, .. }) => assert_eq!(namespace.unwrap(), "lasuite"),
_ => panic!("expected Apply"),
}
}
@@ -458,9 +505,9 @@ mod tests {
// 17. test_get_json_output
#[test]
fn test_get_json_output() {
let cli = parse(&["sunbeam", "platform", "get", "ory/kratos-abc", "--kubectl-output", "json"]);
let cli = parse(&["sunbeam", "get", "ory/kratos-abc", "--kubectl-output", "json"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Get { target, output } }) => {
Some(Verb::Get { target, output }) => {
assert_eq!(target, "ory/kratos-abc");
assert_eq!(output, "json");
}
@@ -471,9 +518,9 @@ mod tests {
// 18. test_check_with_target
#[test]
fn test_check_with_target() {
let cli = parse(&["sunbeam", "platform", "check", "devtools"]);
let cli = parse(&["sunbeam", "check", "devtools"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Check { target } }) => assert_eq!(target.unwrap(), "devtools"),
Some(Verb::Check { target }) => assert_eq!(target.unwrap(), "devtools"),
_ => panic!("expected Check"),
}
}
@@ -481,9 +528,9 @@ mod tests {
// 19. test_build_messages_components
#[test]
fn test_build_messages_backend() {
let cli = parse(&["sunbeam", "platform", "build", "messages-backend"]);
let cli = parse(&["sunbeam", "build", "messages-backend"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { what, .. } }) => {
Some(Verb::Build { what, .. }) => {
assert!(matches!(what, BuildTarget::MessagesBackend));
}
_ => panic!("expected Build"),
@@ -492,9 +539,9 @@ mod tests {
#[test]
fn test_build_messages_frontend() {
let cli = parse(&["sunbeam", "platform", "build", "messages-frontend"]);
let cli = parse(&["sunbeam", "build", "messages-frontend"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { what, .. } }) => {
Some(Verb::Build { what, .. }) => {
assert!(matches!(what, BuildTarget::MessagesFrontend));
}
_ => panic!("expected Build"),
@@ -503,9 +550,9 @@ mod tests {
#[test]
fn test_build_messages_mta_in() {
let cli = parse(&["sunbeam", "platform", "build", "messages-mta-in"]);
let cli = parse(&["sunbeam", "build", "messages-mta-in"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { what, .. } }) => {
Some(Verb::Build { what, .. }) => {
assert!(matches!(what, BuildTarget::MessagesMtaIn));
}
_ => panic!("expected Build"),
@@ -514,9 +561,9 @@ mod tests {
#[test]
fn test_build_messages_mta_out() {
let cli = parse(&["sunbeam", "platform", "build", "messages-mta-out"]);
let cli = parse(&["sunbeam", "build", "messages-mta-out"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { what, .. } }) => {
Some(Verb::Build { what, .. }) => {
assert!(matches!(what, BuildTarget::MessagesMtaOut));
}
_ => panic!("expected Build"),
@@ -525,9 +572,9 @@ mod tests {
#[test]
fn test_build_messages_mpa() {
let cli = parse(&["sunbeam", "platform", "build", "messages-mpa"]);
let cli = parse(&["sunbeam", "build", "messages-mpa"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { what, .. } }) => {
Some(Verb::Build { what, .. }) => {
assert!(matches!(what, BuildTarget::MessagesMpa));
}
_ => panic!("expected Build"),
@@ -536,9 +583,9 @@ mod tests {
#[test]
fn test_build_messages_socks_proxy() {
let cli = parse(&["sunbeam", "platform", "build", "messages-socks-proxy"]);
let cli = parse(&["sunbeam", "build", "messages-socks-proxy"]);
match cli.verb {
Some(Verb::Platform { action: PlatformAction::Build { what, .. } }) => {
Some(Verb::Build { what, .. }) => {
assert!(matches!(what, BuildTarget::MessagesSocksProxy));
}
_ => panic!("expected Build"),
@@ -619,6 +666,18 @@ mod tests {
assert!(matches!(cli.verb, Some(Verb::Vault { .. })));
}
#[test]
fn test_people_contact_list() {
let cli = parse(&["sunbeam", "people", "contact", "list"]);
assert!(matches!(cli.verb, Some(Verb::People { .. })));
}
#[test]
fn test_docs_document_list() {
let cli = parse(&["sunbeam", "docs", "document", "list"]);
assert!(matches!(cli.verb, Some(Verb::Docs { .. })));
}
#[test]
fn test_meet_room_list() {
let cli = parse(&["sunbeam", "meet", "room", "list"]);
@@ -658,12 +717,12 @@ mod tests {
#[test]
fn test_infra_commands_preserved() {
// Verify all old infra commands still parse under platform
assert!(matches!(parse(&["sunbeam", "platform", "up"]).verb, Some(Verb::Platform { action: PlatformAction::Up })));
assert!(matches!(parse(&["sunbeam", "platform", "seed"]).verb, Some(Verb::Platform { action: PlatformAction::Seed })));
assert!(matches!(parse(&["sunbeam", "platform", "verify"]).verb, Some(Verb::Platform { action: PlatformAction::Verify })));
assert!(matches!(parse(&["sunbeam", "platform", "mirror"]).verb, Some(Verb::Platform { action: PlatformAction::Mirror })));
assert!(matches!(parse(&["sunbeam", "platform", "bootstrap"]).verb, Some(Verb::Platform { action: PlatformAction::Bootstrap })));
// Verify all old infra commands still parse
assert!(matches!(parse(&["sunbeam", "up"]).verb, Some(Verb::Up)));
assert!(matches!(parse(&["sunbeam", "seed"]).verb, Some(Verb::Seed)));
assert!(matches!(parse(&["sunbeam", "verify"]).verb, Some(Verb::Verify)));
assert!(matches!(parse(&["sunbeam", "mirror"]).verb, Some(Verb::Mirror)));
assert!(matches!(parse(&["sunbeam", "bootstrap"]).verb, Some(Verb::Bootstrap)));
assert!(matches!(parse(&["sunbeam", "update"]).verb, Some(Verb::Update)));
assert!(matches!(parse(&["sunbeam", "version"]).verb, Some(Verb::Version)));
}
@@ -703,19 +762,18 @@ pub async fn dispatch() -> Result<()> {
Ok(())
}
Some(Verb::Platform { action }) => match action {
PlatformAction::Up => sunbeam_sdk::cluster::cmd_up().await,
Some(Verb::Up) => sunbeam_sdk::cluster::cmd_up().await,
PlatformAction::Status { target } => {
Some(Verb::Status { target }) => {
sunbeam_sdk::services::cmd_status(target.as_deref()).await
}
PlatformAction::Apply {
Some(Verb::Apply {
namespace,
apply_all,
domain,
email,
} => {
}) => {
let is_production = !sunbeam_sdk::config::active_context().ssh_host.is_empty();
let env_str = if is_production { "production" } else { "local" };
let domain = if domain.is_empty() {
@@ -747,39 +805,34 @@ pub async fn dispatch() -> Result<()> {
sunbeam_sdk::manifests::cmd_apply(&env_str, &domain, &email, &ns).await
}
PlatformAction::Seed => sunbeam_sdk::secrets::cmd_seed().await,
Some(Verb::Seed) => sunbeam_sdk::secrets::cmd_seed().await,
PlatformAction::Verify => sunbeam_sdk::secrets::cmd_verify().await,
Some(Verb::Verify) => sunbeam_sdk::secrets::cmd_verify().await,
PlatformAction::Logs { target, follow } => {
Some(Verb::Logs { target, follow }) => {
sunbeam_sdk::services::cmd_logs(&target, follow).await
}
PlatformAction::Get { target, output } => {
Some(Verb::Get { target, output }) => {
sunbeam_sdk::services::cmd_get(&target, &output).await
}
PlatformAction::Restart { target } => {
Some(Verb::Restart { target }) => {
sunbeam_sdk::services::cmd_restart(target.as_deref()).await
}
PlatformAction::Build { what, push, deploy, no_cache } => {
Some(Verb::Build { what, push, deploy, no_cache }) => {
let push = push || deploy;
sunbeam_sdk::images::cmd_build(&what, push, deploy, no_cache).await
}
PlatformAction::Check { target } => {
Some(Verb::Check { target }) => {
sunbeam_sdk::checks::cmd_check(target.as_deref()).await
}
PlatformAction::Mirror => sunbeam_sdk::images::cmd_mirror().await,
Some(Verb::Mirror) => sunbeam_sdk::images::cmd_mirror().await,
PlatformAction::Bootstrap => sunbeam_sdk::gitea::cmd_bootstrap().await,
PlatformAction::K8s { kubectl_args } => {
sunbeam_sdk::kube::cmd_k8s(&kubectl_args).await
}
},
Some(Verb::Bootstrap) => sunbeam_sdk::gitea::cmd_bootstrap().await,
Some(Verb::Config { action }) => match action {
None => {
@@ -878,6 +931,14 @@ pub async fn dispatch() -> Result<()> {
Some(ConfigAction::Clear) => sunbeam_sdk::config::clear_config(),
},
Some(Verb::K8s { kubectl_args }) => {
sunbeam_sdk::kube::cmd_k8s(&kubectl_args).await
}
Some(Verb::Bao { bao_args }) => {
sunbeam_sdk::kube::cmd_bao(&bao_args).await
}
Some(Verb::Auth { action }) => {
let sc = sunbeam_sdk::client::SunbeamClient::from_context(
&sunbeam_sdk::config::active_context(),
@@ -934,6 +995,20 @@ pub async fn dispatch() -> Result<()> {
sunbeam_sdk::openbao::cli::dispatch(action, &sc, cli.output_format).await
}
Some(Verb::People { action }) => {
let sc = sunbeam_sdk::client::SunbeamClient::from_context(
&sunbeam_sdk::config::active_context(),
);
sunbeam_sdk::lasuite::cli::dispatch_people(action, &sc, cli.output_format).await
}
Some(Verb::Docs { action }) => {
let sc = sunbeam_sdk::client::SunbeamClient::from_context(
&sunbeam_sdk::config::active_context(),
);
sunbeam_sdk::lasuite::cli::dispatch_docs(action, &sc, cli.output_format).await
}
Some(Verb::Meet { action }) => {
let sc = sunbeam_sdk::client::SunbeamClient::from_context(
&sunbeam_sdk::config::active_context(),
@@ -1001,6 +1076,35 @@ pub async fn dispatch() -> Result<()> {
}
},
Some(Verb::Code { action }) => crate::code::cmd_code(action).await,
Some(Verb::ReindexCode { org, repo, branch, endpoint }) => {
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
use sunbeam_proto::sunbeam_code_v1::ReindexCodeRequest;
tracing::info!(endpoint = endpoint.as_str(), "Connecting to Sol for reindex");
let mut client = CodeAgentClient::connect(endpoint)
.await
.map_err(|e| sunbeam_sdk::error::SunbeamError::Other(format!("Failed to connect: {e}")))?;
let request = ReindexCodeRequest {
org: org.unwrap_or_default(),
repo: repo.unwrap_or_default(),
branch: branch.unwrap_or_default(),
};
let response = client.reindex_code(request)
.await
.map_err(|e| sunbeam_sdk::error::SunbeamError::Other(format!("Reindex failed: {e}")))?;
let resp = response.into_inner();
if resp.error.is_empty() {
println!("Indexed {} symbols across {} repos", resp.symbols_indexed, resp.repos_indexed);
} else {
eprintln!("Error: {}", resp.error);
}
Ok(())
}
Some(Verb::Update) => sunbeam_sdk::update::cmd_update().await,
Some(Verb::Version) => {

sunbeam/src/code/agent.rs Normal file

@@ -0,0 +1,386 @@
//! Agent service — async message bus between TUI and Sol gRPC session.
//!
//! The TUI sends `AgentRequest`s and receives `AgentEvent`s through
//! crossbeam channels. The gRPC session runs on a background tokio task,
//! so the UI thread never blocks on network I/O.
//!
//! Tool approval: when a client tool requires approval ("ask" in config),
//! the agent emits `ApprovalNeeded` and waits for a `decide()` call from
//! the TUI before executing or denying.
//!
//! This module is designed to be usable as a library — nothing here
//! depends on ratatui or terminal state.
use crossbeam_channel::{Receiver, Sender};
use super::client::{self, CodeSession};
use super::config::LoadedConfig;
/// Turn raw internal errors into something a human can read.
fn friendly_error(e: &str) -> String {
let lower = e.to_lowercase();
if lower.contains("broken pipe") || lower.contains("stream closed") || lower.contains("h2 protocol") {
"sol disconnected — try again or restart with /exit".into()
} else if lower.contains("channel closed") || lower.contains("send on closed") {
"connection to sol lost".into()
} else if lower.contains("timed out") || lower.contains("timeout") {
"request timed out — sol may be overloaded".into()
} else if lower.contains("connection refused") {
"can't reach sol — is it running?".into()
} else if lower.contains("not found") && lower.contains("agent") {
"sol's agent was reset — reconnect with /exit".into()
} else if lower.contains("invalid_request_error") {
if let Some(start) = e.find("\"msg\":\"") {
let rest = &e[start + 7..];
if let Some(end) = rest.find('"') {
return rest[..end].to_string();
}
}
"request error from sol".into()
} else {
let clean = e.replace("\\n", " ").replace("\\\"", "'");
if clean.len() > 120 { format!("{}…", &clean[..117]) } else { clean }
}
}
// ── Requests (TUI → Agent) ──────────────────────────────────────────────
/// A request from the UI to the agent backend.
pub enum AgentRequest {
/// Send a chat message to Sol.
Chat { text: String },
/// End the session gracefully.
End,
}
// ── Approval (TUI → Agent) ─────────────────────────────────────────────
/// A tool approval decision from the UI.
#[derive(Debug, Clone)]
pub enum ApprovalDecision {
/// User approved — execute the tool.
Approved { call_id: String },
/// User denied — return error to model.
Denied { call_id: String },
/// User approved AND upgraded permission to "always" for this session.
ApprovedAlways { call_id: String, tool_name: String },
// Future: ApprovedRemote { call_id } — execute on server sidecar
}
// ── Events (Agent → TUI) ───────────────────────────────────────────────
/// An event from the agent backend to the UI.
#[derive(Clone, Debug)]
pub enum AgentEvent {
/// Sol started generating a response.
Generating,
/// A tool needs user approval before execution.
ApprovalNeeded { call_id: String, name: String, args_summary: String },
/// Tool was approved and is now executing.
ToolExecuting { name: String, detail: String },
/// A tool finished executing.
ToolDone { name: String, success: bool },
/// Sol's full response text with token usage.
Response { text: String, input_tokens: u32, output_tokens: u32 },
/// A non-fatal error from Sol.
Error { message: String },
/// Status update (shown in title bar).
Status { message: String },
/// Connection health: true = reachable, false = unreachable.
Health { connected: bool },
/// Session ended.
SessionEnded,
}
// ── Agent handle (owned by TUI) ────────────────────────────────────────
/// Handle for the TUI to communicate with the background agent task.
pub struct AgentHandle {
req_tx: Sender<AgentRequest>,
approval_tx: Sender<ApprovalDecision>,
pub rx: Receiver<AgentEvent>,
}
impl AgentHandle {
/// Send a chat message. Non-blocking.
pub fn chat(&self, text: &str) {
let _ = self.req_tx.try_send(AgentRequest::Chat { text: text.to_string() });
}
/// Request session end. Non-blocking.
pub fn end(&self) {
let _ = self.req_tx.try_send(AgentRequest::End);
}
/// Submit a tool approval decision. Non-blocking.
pub fn decide(&self, decision: ApprovalDecision) {
let _ = self.approval_tx.try_send(decision);
}
/// Drain all pending events. Non-blocking.
pub fn poll_events(&self) -> Vec<AgentEvent> {
let mut events = Vec::new();
while let Ok(event) = self.rx.try_recv() {
events.push(event);
}
events
}
}
// ── Spawn ──────────────────────────────────────────────────────────────
/// Spawn the agent background task. Returns a handle for the TUI.
pub fn spawn(
session: CodeSession,
endpoint: String,
config: LoadedConfig,
project_path: String,
) -> AgentHandle {
let (req_tx, req_rx) = crossbeam_channel::bounded::<AgentRequest>(32);
let (evt_tx, evt_rx) = crossbeam_channel::bounded::<AgentEvent>(256);
let (approval_tx, approval_rx) = crossbeam_channel::bounded::<ApprovalDecision>(8);
tokio::spawn(agent_loop(session, config, project_path, req_rx, approval_rx, evt_tx.clone()));
tokio::spawn(heartbeat_loop(endpoint, evt_tx));
AgentHandle { req_tx, approval_tx, rx: evt_rx }
}
/// Ping the gRPC endpoint every second to check if Sol is reachable.
async fn heartbeat_loop(endpoint: String, evt_tx: Sender<AgentEvent>) {
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
let mut last_state = true;
let _ = evt_tx.try_send(AgentEvent::Health { connected: true });
loop {
tokio::time::sleep(std::time::Duration::from_secs(1)).await;
let connected = CodeAgentClient::connect(endpoint.clone()).await.is_ok();
if connected != last_state {
let _ = evt_tx.try_send(AgentEvent::Health { connected });
last_state = connected;
}
}
}
/// The background agent loop. Reads requests, calls gRPC, handles tool
/// approval and execution.
async fn agent_loop(
mut session: CodeSession,
mut config: LoadedConfig,
project_path: String,
req_rx: Receiver<AgentRequest>,
approval_rx: Receiver<ApprovalDecision>,
evt_tx: Sender<AgentEvent>,
) {
loop {
let req = match tokio::task::block_in_place(|| req_rx.recv()) {
Ok(req) => req,
Err(_) => break,
};
match req {
AgentRequest::Chat { text } => {
let _ = evt_tx.try_send(AgentEvent::Generating);
match session.chat(&text).await {
Ok(resp) => {
// Process events — handle tool calls with approval
for event in &resp.events {
match event {
client::ChatEvent::ToolCall { call_id, name, args, needs_approval } => {
let perm = config.permission_for(name);
match perm {
"always" => {
// Execute immediately
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: truncate_args(args),
});
let _ = super::tools::execute(name, args, &project_path);
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: true,
});
// Tool result already sent by client.rs
}
"never" => {
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: false,
});
// Tool denial already sent by client.rs
}
_ => {
// "ask" — need user approval
let _ = evt_tx.try_send(AgentEvent::ApprovalNeeded {
call_id: call_id.clone(),
name: name.clone(),
args_summary: truncate_args(args),
});
// Wait for approval decision (blocking on crossbeam)
match tokio::task::block_in_place(|| approval_rx.recv()) {
Ok(ApprovalDecision::Approved { .. }) => {
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: truncate_args(args),
});
// Tool already executed by client.rs
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: true,
});
}
Ok(ApprovalDecision::ApprovedAlways { tool_name, .. }) => {
config.upgrade_to_always(&tool_name);
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: truncate_args(args),
});
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: true,
});
}
Ok(ApprovalDecision::Denied { .. }) => {
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: false,
});
}
Err(_) => break,
}
}
}
}
client::ChatEvent::ToolStart { name, detail } => {
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: detail.clone(),
});
}
client::ChatEvent::ToolDone { name, success } => {
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: *success,
});
}
client::ChatEvent::Status(msg) => {
let _ = evt_tx.try_send(AgentEvent::Status {
message: msg.clone(),
});
}
client::ChatEvent::Error(msg) => {
let _ = evt_tx.try_send(AgentEvent::Error {
message: friendly_error(msg),
});
}
}
}
let _ = evt_tx.try_send(AgentEvent::Response {
text: resp.text,
input_tokens: resp.input_tokens,
output_tokens: resp.output_tokens,
});
}
Err(e) => {
let _ = evt_tx.try_send(AgentEvent::Error {
message: friendly_error(&e.to_string()),
});
}
}
}
AgentRequest::End => {
let _ = session.end().await;
let _ = evt_tx.try_send(AgentEvent::SessionEnded);
break;
}
}
}
}
fn truncate_args(args: &str) -> String {
// Note: byte-index slice; assumes the JSON args are ASCII-ish.
if args.len() <= 80 { args.to_string() } else { format!("{}…", &args[..77]) }
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_approval_decision_variants() {
let approved = ApprovalDecision::Approved { call_id: "c1".into() };
assert!(matches!(approved, ApprovalDecision::Approved { .. }));
let denied = ApprovalDecision::Denied { call_id: "c2".into() };
assert!(matches!(denied, ApprovalDecision::Denied { .. }));
let always = ApprovalDecision::ApprovedAlways {
call_id: "c3".into(),
tool_name: "bash".into(),
};
assert!(matches!(always, ApprovalDecision::ApprovedAlways { .. }));
}
#[test]
fn test_permission_routing() {
let config = LoadedConfig::default();
// "always" tools should not need approval
assert_eq!(config.permission_for("file_read"), "always");
assert_eq!(config.permission_for("grep"), "always");
assert_eq!(config.permission_for("list_directory"), "always");
// "ask" tools need approval
assert_eq!(config.permission_for("file_write"), "ask");
assert_eq!(config.permission_for("bash"), "ask");
assert_eq!(config.permission_for("search_replace"), "ask");
// unknown defaults to ask
assert_eq!(config.permission_for("unknown_tool"), "ask");
}
#[test]
fn test_permission_upgrade() {
let mut config = LoadedConfig::default();
assert_eq!(config.permission_for("bash"), "ask");
config.upgrade_to_always("bash");
assert_eq!(config.permission_for("bash"), "always");
// Other tools unchanged
assert_eq!(config.permission_for("file_write"), "ask");
}
#[test]
fn test_friendly_error_messages() {
assert_eq!(
friendly_error("h2 protocol error: stream closed because of a broken pipe"),
"sol disconnected — try again or restart with /exit"
);
assert_eq!(
friendly_error("channel closed"),
"connection to sol lost"
);
assert_eq!(
friendly_error("connection refused"),
"can't reach sol — is it running?"
);
assert_eq!(
friendly_error("request timed out"),
"request timed out — sol may be overloaded"
);
}
#[test]
fn test_truncate_args() {
assert_eq!(truncate_args("short"), "short");
let long = "a".repeat(100);
let truncated = truncate_args(&long);
assert!(truncated.len() <= 81);
assert!(truncated.ends_with('…'));
}
}

266
sunbeam/src/code/client.rs Normal file

@@ -0,0 +1,266 @@
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
use sunbeam_proto::sunbeam_code_v1::*;
use tokio::sync::mpsc;
use tokio_stream::wrappers::ReceiverStream;
use tonic::Request;
use tracing::{debug, error, info, warn};
use super::config::LoadedConfig;
use super::project::ProjectContext;
/// Events produced during a chat turn, for the TUI to render.
pub enum ChatEvent {
/// A client-side tool call that needs execution (possibly with approval).
ToolCall { call_id: String, name: String, args: String, needs_approval: bool },
ToolStart { name: String, detail: String },
ToolDone { name: String, success: bool },
Status(String),
Error(String),
}
/// Result of a chat turn.
pub struct ChatResponse {
pub text: String,
pub events: Vec<ChatEvent>,
pub input_tokens: u32,
pub output_tokens: u32,
}
fn truncate_args(args_json: &str) -> String {
// Extract a short summary from the JSON args; append an ellipsis when truncated.
if args_json.len() <= 80 {
args_json.to_string()
} else {
let mut cut = 77;
while !args_json.is_char_boundary(cut) { cut -= 1; }
format!("{}…", &args_json[..cut])
}
}
/// A history entry from a resumed session.
pub struct HistoryMessage {
pub role: String,
pub content: String,
}
/// An active coding session connected to Sol via gRPC.
pub struct CodeSession {
pub session_id: String,
pub room_id: String,
pub model: String,
pub project_path: String,
pub resumed: bool,
pub history: Vec<HistoryMessage>,
tx: mpsc::Sender<ClientMessage>,
rx: tonic::Streaming<ServerMessage>,
}
/// Connect to Sol's gRPC server and start a coding session.
pub async fn connect(
endpoint: &str,
project: &ProjectContext,
config: &LoadedConfig,
model: &str,
) -> anyhow::Result<CodeSession> {
let mut client = CodeAgentClient::connect(endpoint.to_string())
.await
.map_err(|e| anyhow::anyhow!("Failed to connect to Sol at {endpoint}: {e}"))?;
info!(endpoint, "Connected to Sol gRPC server");
// Create the bidirectional stream
let (tx, client_rx) = mpsc::channel::<ClientMessage>(32);
let client_stream = ReceiverStream::new(client_rx);
// TODO: add JWT auth token to the request metadata
let response = client.session(client_stream).await?;
let mut rx = response.into_inner();
// Send StartSession
tx.send(ClientMessage {
payload: Some(client_message::Payload::Start(StartSession {
project_path: project.path.clone(),
prompt_md: project.prompt_md.clone(),
config_toml: project.config_toml.clone(),
git_branch: project.git_branch.clone().unwrap_or_default(),
git_status: project.git_status.clone().unwrap_or_default(),
file_tree: project.file_tree.clone(),
model: model.into(),
client_tools: vec![], // TODO: send client tool schemas
})),
})
.await?;
// Wait for SessionReady
let ready = loop {
match rx.message().await? {
Some(ServerMessage {
payload: Some(server_message::Payload::Ready(r)),
}) => break r,
Some(ServerMessage {
payload: Some(server_message::Payload::Error(e)),
}) => anyhow::bail!("Session start failed: {}", e.message),
Some(_) => continue,
None => anyhow::bail!("Stream closed before SessionReady"),
}
};
// Extract and send symbols for code index (fire-and-forget)
let symbols = super::symbols::extract_project_symbols(&project.path);
if !symbols.is_empty() {
let branch = project.git_branch.clone().unwrap_or_else(|| "mainline".into());
let proto_symbols: Vec<_> = symbols
.iter()
.map(|s| SymbolEntry {
file_path: s.file_path.clone(),
name: s.name.clone(),
kind: s.kind.clone(),
signature: s.signature.clone(),
docstring: s.docstring.clone(),
start_line: s.start_line as i32,
end_line: s.end_line as i32,
language: s.language.clone(),
content: s.content.clone(),
})
.collect();
// Last non-empty path segment (handles trailing slashes).
let project_name = project.path.rsplit('/').find(|s| !s.is_empty()).unwrap_or("unknown").to_string();
let _ = tx
.send(ClientMessage {
payload: Some(client_message::Payload::IndexSymbols(IndexSymbols {
project_name,
branch,
symbols: proto_symbols,
})),
})
.await;
info!(count = symbols.len(), "Sent project symbols for indexing");
}
let history = ready
.history
.into_iter()
.map(|h| HistoryMessage {
role: h.role,
content: h.content,
})
.collect();
Ok(CodeSession {
session_id: ready.session_id,
room_id: ready.room_id,
model: ready.model,
project_path: project.path.clone(),
resumed: ready.resumed,
history,
tx,
rx,
})
}
impl CodeSession {
/// Send a chat message and collect the response.
/// Handles tool calls by executing them locally and sending results back.
/// Returns a ChatResponse whose events are for the TUI to display.
pub async fn chat(&mut self, text: &str) -> anyhow::Result<ChatResponse> {
self.tx
.send(ClientMessage {
payload: Some(client_message::Payload::Input(UserInput {
text: text.into(),
})),
})
.await?;
let mut events = Vec::new();
// Read server messages until we get TextDone
loop {
match self.rx.message().await? {
Some(ServerMessage {
payload: Some(server_message::Payload::Delta(_)),
}) => {
// Streaming text — we'll use full_text from Done
}
Some(ServerMessage {
payload: Some(server_message::Payload::Done(d)),
}) => {
return Ok(ChatResponse {
text: d.full_text,
events,
input_tokens: d.input_tokens,
output_tokens: d.output_tokens,
});
}
Some(ServerMessage {
payload: Some(server_message::Payload::ToolCall(tc)),
}) => {
if tc.is_local {
// Emit ToolCall event — agent handles approval + execution
events.push(ChatEvent::ToolCall {
call_id: tc.call_id.clone(),
name: tc.name.clone(),
args: tc.args_json.clone(),
needs_approval: tc.needs_approval,
});
// Execute immediately for now — approval is handled
// by the agent layer which wraps this method.
// When approval flow is active, the agent will call
// execute + send_tool_result separately.
let result =
super::tools::execute(&tc.name, &tc.args_json, &self.project_path);
self.tx
.send(ClientMessage {
payload: Some(client_message::Payload::ToolResult(ToolResult {
call_id: tc.call_id,
result,
is_error: false,
})),
})
.await?;
} else {
events.push(ChatEvent::ToolStart {
name: format!("{} (server)", tc.name),
detail: String::new(),
});
}
}
Some(ServerMessage {
payload: Some(server_message::Payload::Status(s)),
}) => {
events.push(ChatEvent::Status(s.message));
}
Some(ServerMessage {
payload: Some(server_message::Payload::Error(e)),
}) => {
if e.fatal {
anyhow::bail!("Fatal error: {}", e.message);
}
events.push(ChatEvent::Error(e.message));
}
Some(ServerMessage {
payload: Some(server_message::Payload::End(_)),
}) => {
return Ok(ChatResponse {
text: "Session ended by server.".into(),
events,
input_tokens: 0,
output_tokens: 0,
});
}
Some(_) => continue,
None => anyhow::bail!("Stream closed unexpectedly"),
}
}
}
/// End the session.
pub async fn end(&self) -> anyhow::Result<()> {
self.tx
.send(ClientMessage {
payload: Some(client_message::Payload::End(EndSession {})),
})
.await?;
Ok(())
}
}

146
sunbeam/src/code/config.rs Normal file

@@ -0,0 +1,146 @@
use serde::Deserialize;
/// Project-level configuration from .sunbeam/config.toml.
#[derive(Debug, Default, Deserialize)]
pub struct ProjectConfig {
#[serde(default)]
pub model: Option<ModelConfig>,
#[serde(default)]
pub tools: Option<ToolPermissions>,
}
#[derive(Debug, Deserialize)]
pub struct ModelConfig {
pub name: Option<String>,
}
#[derive(Debug, Default, Deserialize)]
pub struct ToolPermissions {
#[serde(default)]
pub file_read: Option<String>,
#[serde(default)]
pub file_write: Option<String>,
#[serde(default)]
pub search_replace: Option<String>,
#[serde(default)]
pub grep: Option<String>,
#[serde(default)]
pub bash: Option<String>,
#[serde(default)]
pub list_directory: Option<String>,
}
/// Convenience wrapper with flattened fields.
pub struct LoadedConfig {
pub model_name: Option<String>,
pub file_read_perm: String,
pub file_write_perm: String,
pub search_replace_perm: String,
pub grep_perm: String,
pub bash_perm: String,
pub list_directory_perm: String,
}
impl Default for LoadedConfig {
fn default() -> Self {
Self {
model_name: None,
file_read_perm: "always".into(),
file_write_perm: "ask".into(),
search_replace_perm: "ask".into(),
grep_perm: "always".into(),
bash_perm: "ask".into(),
list_directory_perm: "always".into(),
}
}
}
impl LoadedConfig {
/// Get the permission level for a tool. Returns "always", "ask", or "never".
pub fn permission_for(&self, tool_name: &str) -> &str {
match tool_name {
"file_read" => &self.file_read_perm,
"file_write" => &self.file_write_perm,
"search_replace" => &self.search_replace_perm,
"grep" => &self.grep_perm,
"bash" => &self.bash_perm,
"list_directory" => &self.list_directory_perm,
_ => "ask", // unknown tools default to ask
}
}
/// Upgrade a tool's permission to "always" for this session (in-memory only).
pub fn upgrade_to_always(&mut self, tool_name: &str) {
let target = match tool_name {
"file_read" => &mut self.file_read_perm,
"file_write" => &mut self.file_write_perm,
"search_replace" => &mut self.search_replace_perm,
"grep" => &mut self.grep_perm,
"bash" => &mut self.bash_perm,
"list_directory" => &mut self.list_directory_perm,
_ => return,
};
*target = "always".into();
}
}
/// Load project config from .sunbeam/config.toml.
pub fn load_project_config(project_path: &str) -> LoadedConfig {
let config_path = std::path::Path::new(project_path)
.join(".sunbeam")
.join("config.toml");
let raw = match std::fs::read_to_string(&config_path) {
Ok(s) => s,
Err(_) => return LoadedConfig::default(),
};
let parsed: ProjectConfig = match toml::from_str(&raw) {
Ok(c) => c,
Err(e) => {
eprintln!("warning: failed to parse .sunbeam/config.toml: {e}");
return LoadedConfig::default();
}
};
let tools = parsed.tools.unwrap_or_default();
LoadedConfig {
model_name: parsed.model.and_then(|m| m.name),
file_read_perm: tools.file_read.unwrap_or_else(|| "always".into()),
file_write_perm: tools.file_write.unwrap_or_else(|| "ask".into()),
search_replace_perm: tools.search_replace.unwrap_or_else(|| "ask".into()),
grep_perm: tools.grep.unwrap_or_else(|| "always".into()),
bash_perm: tools.bash.unwrap_or_else(|| "ask".into()),
list_directory_perm: tools.list_directory.unwrap_or_else(|| "always".into()),
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_default_config() {
let cfg = LoadedConfig::default();
assert_eq!(cfg.file_read_perm, "always");
assert_eq!(cfg.file_write_perm, "ask");
assert_eq!(cfg.bash_perm, "ask");
assert!(cfg.model_name.is_none());
}
#[test]
fn test_parse_config() {
let toml = r#"
[model]
name = "devstral-2"
[tools]
file_read = "always"
bash = "never"
"#;
let parsed: ProjectConfig = toml::from_str(toml).unwrap();
assert_eq!(parsed.model.unwrap().name.unwrap(), "devstral-2");
assert_eq!(parsed.tools.unwrap().bash.unwrap(), "never");
}
}
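
For reference, a `.sunbeam/config.toml` exercising every key the parser above accepts might look like this (values illustrative, not the defaults):

```toml
[model]
name = "devstral-2"        # overrides the built-in model default

[tools]
# "always" = run without prompting, "ask" = prompt per call, "never" = refuse
file_read      = "always"
grep           = "always"
list_directory = "always"
file_write     = "ask"
search_replace = "ask"
bash           = "never"
```

Any key left out falls back to the `LoadedConfig::default()` value for that tool.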


@@ -0,0 +1,205 @@
//! Low-level LSP client — JSON-RPC framing over subprocess stdio.
use std::collections::HashMap;
use std::sync::atomic::{AtomicI64, Ordering};
use std::sync::Arc;
use tokio::io::{AsyncBufReadExt, AsyncReadExt, AsyncWriteExt, BufReader};
use tokio::process::{Child, ChildStdin, ChildStdout};
use tokio::sync::{oneshot, Mutex};
use tracing::{debug, warn};
/// A low-level LSP client connected to a language server via stdio.
pub struct LspClient {
child: Child,
stdin: ChildStdin,
next_id: Arc<AtomicI64>,
pending: Arc<Mutex<HashMap<i64, oneshot::Sender<serde_json::Value>>>>,
_reader_handle: tokio::task::JoinHandle<()>,
}
impl LspClient {
/// Spawn a language server subprocess.
pub async fn spawn(binary: &str, args: &[String], cwd: &str) -> anyhow::Result<Self> {
use std::process::Stdio;
use tokio::process::Command;
let mut child = Command::new(binary)
.args(args)
.current_dir(cwd)
.stdin(Stdio::piped())
.stdout(Stdio::piped())
.stderr(Stdio::null())
.kill_on_drop(true)
.spawn()
.map_err(|e| anyhow::anyhow!("Failed to spawn {binary}: {e}"))?;
let stdin = child.stdin.take().ok_or_else(|| anyhow::anyhow!("No stdin"))?;
let stdout = child.stdout.take().ok_or_else(|| anyhow::anyhow!("No stdout"))?;
let pending: Arc<Mutex<HashMap<i64, oneshot::Sender<serde_json::Value>>>> =
Arc::new(Mutex::new(HashMap::new()));
let pending_for_reader = pending.clone();
let _reader_handle = tokio::spawn(async move {
if let Err(e) = read_loop(stdout, pending_for_reader).await {
debug!("LSP read loop ended: {e}");
}
});
Ok(Self {
child,
stdin,
next_id: Arc::new(AtomicI64::new(1)),
pending,
_reader_handle,
})
}
/// Send a request and wait for the response.
pub async fn request(
&mut self,
method: &str,
params: serde_json::Value,
) -> anyhow::Result<serde_json::Value> {
let id = self.next_id.fetch_add(1, Ordering::SeqCst);
let message = serde_json::json!({
"jsonrpc": "2.0",
"id": id,
"method": method,
"params": params,
});
let (tx, rx) = oneshot::channel();
self.pending.lock().await.insert(id, tx);
self.send_framed(&message).await?;
let result = tokio::time::timeout(
std::time::Duration::from_secs(30),
rx,
)
.await
.map_err(|_| anyhow::anyhow!("LSP request timed out: {method}"))?
.map_err(|_| anyhow::anyhow!("LSP response channel dropped"))?;
Ok(result)
}
/// Send a notification (no response expected).
pub async fn notify(
&mut self,
method: &str,
params: serde_json::Value,
) -> anyhow::Result<()> {
let message = serde_json::json!({
"jsonrpc": "2.0",
"method": method,
"params": params,
});
self.send_framed(&message).await
}
/// Send with LSP Content-Length framing.
async fn send_framed(&mut self, message: &serde_json::Value) -> anyhow::Result<()> {
let body = serde_json::to_string(message)?;
let frame = format!("Content-Length: {}\r\n\r\n{}", body.len(), body);
self.stdin.write_all(frame.as_bytes()).await?;
self.stdin.flush().await?;
Ok(())
}
/// Shutdown the language server gracefully.
pub async fn shutdown(&mut self) {
// Send shutdown request
let _ = self.request("shutdown", serde_json::json!(null)).await;
// Send exit notification
let _ = self.notify("exit", serde_json::json!(null)).await;
// Wait briefly then kill
tokio::time::sleep(std::time::Duration::from_millis(500)).await;
let _ = self.child.kill().await;
}
}
/// Background read loop: parse LSP framed messages from stdout.
async fn read_loop(
stdout: ChildStdout,
pending: Arc<Mutex<HashMap<i64, oneshot::Sender<serde_json::Value>>>>,
) -> anyhow::Result<()> {
let mut reader = BufReader::new(stdout);
let mut header_line = String::new();
loop {
// Read Content-Length header
header_line.clear();
let bytes_read = reader.read_line(&mut header_line).await?;
if bytes_read == 0 {
break; // EOF
}
let content_length = if header_line.starts_with("Content-Length:") {
header_line
.split(':')
.nth(1)
.and_then(|s| s.trim().parse::<usize>().ok())
.unwrap_or(0)
} else {
continue; // skip non-header lines
};
if content_length == 0 {
continue;
}
// Skip remaining headers until blank line
loop {
header_line.clear();
reader.read_line(&mut header_line).await?;
if header_line.trim().is_empty() {
break;
}
}
// Read the JSON body
let mut body = vec![0u8; content_length];
reader.read_exact(&mut body).await?;
let message: serde_json::Value = match serde_json::from_slice(&body) {
Ok(m) => m,
Err(e) => {
warn!("Failed to parse LSP message: {e}");
continue;
}
};
// Route responses to pending requests
if let Some(id) = message.get("id").and_then(|v| v.as_i64()) {
let result = if let Some(err) = message.get("error") {
// LSP error response
serde_json::json!({ "error": err })
} else {
message.get("result").cloned().unwrap_or(serde_json::Value::Null)
};
if let Some(tx) = pending.lock().await.remove(&id) {
let _ = tx.send(result);
}
}
// Server notifications (diagnostics, progress, etc.) are silently dropped for now
// TODO: capture publishDiagnostics
}
Ok(())
}
#[cfg(test)]
mod tests {
#[test]
fn test_framing_format() {
let body = r#"{"jsonrpc":"2.0","id":1,"method":"initialize"}"#;
let frame = format!("Content-Length: {}\r\n\r\n{}", body.len(), body);
assert!(frame.starts_with("Content-Length: 46\r\n\r\n"));
assert!(frame.ends_with("}"));
}
}
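
As a standalone illustration of the framing used by `send_framed` and `read_loop` above, here is a synchronous round-trip sketch (std-only, not part of the crate):

```rust
// Build a frame the way send_framed does, then parse it back the way
// read_loop does: split headers from body, read Content-Length, slice.
fn frame(body: &str) -> String {
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}

fn parse(framed: &str) -> Option<&str> {
    // Headers end at the first blank line.
    let (headers, rest) = framed.split_once("\r\n\r\n")?;
    let len: usize = headers
        .lines()
        .find_map(|h| h.strip_prefix("Content-Length:"))?
        .trim()
        .parse()
        .ok()?;
    rest.get(..len)
}

fn main() {
    let body = r#"{"jsonrpc":"2.0","id":1,"method":"shutdown"}"#;
    assert_eq!(parse(&frame(body)), Some(body));
}
```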


@@ -0,0 +1,97 @@
//! Language server detection — auto-detect which LSP servers to spawn.
use std::path::Path;
/// Configuration for a language server to spawn.
#[derive(Debug, Clone)]
pub struct LspServerConfig {
/// Language identifier (e.g., "rust", "typescript", "python").
pub language_id: String,
/// Binary name to spawn (must be on PATH).
pub binary: String,
/// Arguments to pass (typically ["--stdio"]).
pub args: Vec<String>,
/// File extensions this server handles.
pub extensions: Vec<String>,
}
/// Detect which language servers should be spawned for a project.
pub fn detect_servers(project_root: &str) -> Vec<LspServerConfig> {
let root = Path::new(project_root);
let mut configs = Vec::new();
if root.join("Cargo.toml").exists() {
configs.push(LspServerConfig {
language_id: "rust".into(),
binary: "rust-analyzer".into(),
args: vec![],
extensions: vec!["rs".into()],
});
}
if root.join("package.json").exists() || root.join("tsconfig.json").exists() {
configs.push(LspServerConfig {
language_id: "typescript".into(),
binary: "typescript-language-server".into(),
args: vec!["--stdio".into()],
extensions: vec!["ts".into(), "tsx".into(), "js".into(), "jsx".into()],
});
}
if root.join("pyproject.toml").exists()
|| root.join("setup.py").exists()
|| root.join("requirements.txt").exists()
{
configs.push(LspServerConfig {
language_id: "python".into(),
binary: "pyright-langserver".into(),
args: vec!["--stdio".into()],
extensions: vec!["py".into()],
});
}
if root.join("go.mod").exists() {
configs.push(LspServerConfig {
language_id: "go".into(),
binary: "gopls".into(),
args: vec!["serve".into()],
extensions: vec!["go".into()],
});
}
configs
}
/// Get the language ID for a file extension.
pub fn language_for_extension(ext: &str) -> Option<&'static str> {
match ext {
"rs" => Some("rust"),
"ts" | "tsx" | "js" | "jsx" => Some("typescript"),
"py" => Some("python"),
"go" => Some("go"),
_ => None,
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_language_for_extension() {
assert_eq!(language_for_extension("rs"), Some("rust"));
assert_eq!(language_for_extension("ts"), Some("typescript"));
assert_eq!(language_for_extension("py"), Some("python"));
assert_eq!(language_for_extension("go"), Some("go"));
assert_eq!(language_for_extension("md"), None);
}
#[test]
fn test_detect_servers_rust_project() {
// This test runs in the cli-worktree which has Cargo.toml
let configs = detect_servers(".");
let rust = configs.iter().find(|c| c.language_id == "rust");
assert!(rust.is_some(), "Should detect Rust project");
assert_eq!(rust.unwrap().binary, "rust-analyzer");
}
}


@@ -0,0 +1,388 @@
//! LSP manager — spawns and manages language servers for a project.
//!
//! Provides high-level tool methods (definition, references, hover, etc.)
//! that Sol calls via the client tool dispatch.
use std::collections::HashMap;
use std::path::Path;
use tracing::{info, warn};
use super::client::LspClient;
use super::detect::{self, LspServerConfig};
/// Manages LSP servers for a coding session.
pub struct LspManager {
servers: HashMap<String, LspClient>, // language_id -> client
configs: Vec<LspServerConfig>,
project_root: String,
initialized: bool,
}
impl LspManager {
/// Create a new manager. Does NOT spawn servers yet — call `initialize()`.
pub fn new(project_root: &str) -> Self {
let configs = detect::detect_servers(project_root);
// Canonicalize so Url::from_file_path works
let abs_root = std::fs::canonicalize(project_root)
.map(|p| p.to_string_lossy().to_string())
.unwrap_or_else(|_| project_root.to_string());
Self {
servers: HashMap::new(),
configs,
project_root: abs_root,
initialized: false,
}
}
/// Spawn and initialize all detected language servers.
pub async fn initialize(&mut self) {
// Clone so the loop body can mutably borrow `self`.
let configs = self.configs.clone();
for config in &configs {
match LspClient::spawn(&config.binary, &config.args, &self.project_root).await {
Ok(mut client) => {
// Send initialize request
let root_uri = url::Url::from_file_path(&self.project_root)
.unwrap_or_else(|_| url::Url::parse("file:///").unwrap());
let init_params = serde_json::json!({
"processId": std::process::id(),
"rootUri": root_uri.as_str(),
"capabilities": {
"textDocument": {
"definition": { "dynamicRegistration": false },
"references": { "dynamicRegistration": false },
"hover": { "contentFormat": ["markdown", "plaintext"] },
"documentSymbol": { "dynamicRegistration": false },
"publishDiagnostics": { "relatedInformation": true }
},
"workspace": {
"symbol": { "dynamicRegistration": false }
}
}
});
match client.request("initialize", init_params).await {
Ok(_) => {
let _ = client.notify("initialized", serde_json::json!({})).await;
info!(lang = config.language_id.as_str(), binary = config.binary.as_str(), "LSP server initialized");
self.servers.insert(config.language_id.clone(), client);
}
Err(e) => {
warn!(lang = config.language_id.as_str(), "LSP initialize failed: {e}");
}
}
}
Err(e) => {
warn!(
lang = config.language_id.as_str(),
binary = config.binary.as_str(),
"LSP server not available: {e}"
);
}
}
}
self.initialized = true;
}
/// Check if any LSP server is available.
pub fn is_available(&self) -> bool {
!self.servers.is_empty()
}
/// Get the server for a file path (by extension).
fn server_for_file(&mut self, path: &str) -> Option<&mut LspClient> {
let ext = Path::new(path).extension()?.to_str()?;
let lang = detect::language_for_extension(ext)?;
self.servers.get_mut(lang)
}
/// Ensure a file is opened in the LSP server (lazy didOpen).
async fn ensure_file_open(&mut self, path: &str) -> anyhow::Result<()> {
let abs_path = if Path::new(path).is_absolute() {
path.to_string()
} else {
format!("{}/{}", self.project_root, path)
};
let uri = url::Url::from_file_path(&abs_path)
.map_err(|_| anyhow::anyhow!("Invalid file path: {abs_path}"))?;
let content = std::fs::read_to_string(&abs_path)?;
let ext = Path::new(path).extension().and_then(|e| e.to_str()).unwrap_or("");
let lang_id = detect::language_for_extension(ext).unwrap_or("plaintext");
if let Some(server) = self.server_for_file(path) {
server.notify("textDocument/didOpen", serde_json::json!({
"textDocument": {
"uri": uri.as_str(),
"languageId": lang_id,
"version": 1,
"text": content,
}
})).await?;
}
Ok(())
}
fn make_uri(&self, path: &str) -> anyhow::Result<url::Url> {
let abs = if Path::new(path).is_absolute() {
path.to_string()
} else {
format!("{}/{}", self.project_root, path)
};
url::Url::from_file_path(&abs)
.map_err(|_| anyhow::anyhow!("Invalid path: {abs}"))
}
// ── Tool methods ────────────────────────────────────────────────────
/// Go to definition at file:line:column.
pub async fn definition(&mut self, path: &str, line: u32, column: u32) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/definition", serde_json::json!({
"textDocument": { "uri": uri.as_str() },
"position": { "line": line.saturating_sub(1), "character": column.saturating_sub(1) }
})).await?;
format_locations(&result, &self.project_root)
}
/// Find all references to symbol at file:line:column.
pub async fn references(&mut self, path: &str, line: u32, column: u32) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/references", serde_json::json!({
"textDocument": { "uri": uri.as_str() },
"position": { "line": line.saturating_sub(1), "character": column.saturating_sub(1) },
"context": { "includeDeclaration": true }
})).await?;
format_locations(&result, &self.project_root)
}
/// Get hover documentation at file:line:column.
pub async fn hover(&mut self, path: &str, line: u32, column: u32) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/hover", serde_json::json!({
"textDocument": { "uri": uri.as_str() },
"position": { "line": line.saturating_sub(1), "character": column.saturating_sub(1) }
})).await?;
if result.is_null() {
return Ok("No hover information available.".into());
}
// Extract markdown content from hover result
let contents = &result["contents"];
if let Some(value) = contents.get("value").and_then(|v| v.as_str()) {
Ok(value.to_string())
} else if let Some(s) = contents.as_str() {
Ok(s.to_string())
} else {
Ok(serde_json::to_string_pretty(&result)?)
}
}
/// Get document symbols (outline) for a file.
pub async fn document_symbols(&mut self, path: &str) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/documentSymbol", serde_json::json!({
"textDocument": { "uri": uri.as_str() }
})).await?;
format_symbols(&result)
}
/// Workspace-wide symbol search.
pub async fn workspace_symbols(&mut self, query: &str, lang: Option<&str>) -> anyhow::Result<String> {
// Use the first available server, or a specific one if lang is given
let server = if let Some(lang) = lang {
self.servers.get_mut(lang)
} else {
self.servers.values_mut().next()
}
.ok_or_else(|| anyhow::anyhow!("No LSP server available"))?;
let result = server.request("workspace/symbol", serde_json::json!({
"query": query
})).await?;
format_symbols(&result)
}
/// Shutdown all servers.
pub async fn shutdown(&mut self) {
for (lang, mut server) in self.servers.drain() {
info!(lang = lang.as_str(), "Shutting down LSP server");
server.shutdown().await;
}
}
}
/// Format LSP location results as readable text.
fn format_locations(result: &serde_json::Value, project_root: &str) -> anyhow::Result<String> {
let locations = if result.is_array() {
result.as_array().unwrap().clone()
} else if result.is_object() {
vec![result.clone()]
} else if result.is_null() {
return Ok("No results found.".into());
} else {
return Ok(format!("{result}"));
};
if locations.is_empty() {
return Ok("No results found.".into());
}
let mut lines = Vec::new();
for loc in &locations {
let uri = loc.get("uri").or_else(|| loc.get("targetUri"))
.and_then(|v| v.as_str())
.unwrap_or("?");
let range = loc.get("range").or_else(|| loc.get("targetRange"));
let line = range.and_then(|r| r["start"]["line"].as_u64()).unwrap_or(0) + 1;
let col = range.and_then(|r| r["start"]["character"].as_u64()).unwrap_or(0) + 1;
// Strip file:// prefix and project root for readability
let path = uri.strip_prefix("file://").unwrap_or(uri);
let rel_path = path.strip_prefix(project_root).unwrap_or(path);
let rel_path = rel_path.strip_prefix('/').unwrap_or(rel_path);
lines.push(format!("{rel_path}:{line}:{col}"));
}
Ok(lines.join("\n"))
}
/// Format LSP symbol results.
fn format_symbols(result: &serde_json::Value) -> anyhow::Result<String> {
// Servers return null rather than an empty array when there are no symbols.
if result.is_null() {
return Ok("No symbols found.".into());
}
let symbols = result.as_array().ok_or_else(|| anyhow::anyhow!("Expected array"))?;
if symbols.is_empty() {
return Ok("No symbols found.".into());
}
let mut lines = Vec::new();
for sym in symbols {
let name = sym.get("name").and_then(|v| v.as_str()).unwrap_or("?");
let kind_num = sym.get("kind").and_then(|v| v.as_u64()).unwrap_or(0);
let kind = symbol_kind_name(kind_num);
if let Some(loc) = sym.get("location") {
let line = loc["range"]["start"]["line"].as_u64().unwrap_or(0) + 1;
lines.push(format!("{kind} {name} (line {line})"));
} else if let Some(range) = sym.get("range") {
let line = range["start"]["line"].as_u64().unwrap_or(0) + 1;
lines.push(format!("{kind} {name} (line {line})"));
} else {
lines.push(format!("{kind} {name}"));
}
// Recurse into children (DocumentSymbol)
if let Some(children) = sym.get("children").and_then(|c| c.as_array()) {
for child in children {
let cname = child.get("name").and_then(|v| v.as_str()).unwrap_or("?");
let ckind = symbol_kind_name(child.get("kind").and_then(|v| v.as_u64()).unwrap_or(0));
let cline = child.get("range").and_then(|r| r["start"]["line"].as_u64()).unwrap_or(0) + 1;
lines.push(format!(" {ckind} {cname} (line {cline})"));
}
}
}
Ok(lines.join("\n"))
}
fn symbol_kind_name(kind: u64) -> &'static str {
match kind {
1 => "file",
2 => "module",
3 => "namespace",
4 => "package",
5 => "class",
6 => "method",
7 => "property",
8 => "field",
9 => "constructor",
10 => "enum",
11 => "interface",
12 => "function",
13 => "variable",
14 => "constant",
15 => "string",
16 => "number",
17 => "boolean",
18 => "array",
19 => "object",
20 => "key",
21 => "null",
22 => "enum_member",
23 => "struct",
24 => "event",
25 => "operator",
26 => "type_parameter",
_ => "unknown",
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_symbol_kind_names() {
assert_eq!(symbol_kind_name(12), "function");
assert_eq!(symbol_kind_name(5), "class");
assert_eq!(symbol_kind_name(23), "struct");
assert_eq!(symbol_kind_name(10), "enum");
assert_eq!(symbol_kind_name(999), "unknown");
}
#[test]
fn test_format_locations_empty() {
let result = serde_json::json!([]);
let formatted = format_locations(&result, "/project").unwrap();
assert_eq!(formatted, "No results found.");
}
#[test]
fn test_format_locations_single() {
let result = serde_json::json!([{
"uri": "file:///project/src/main.rs",
"range": { "start": { "line": 9, "character": 3 }, "end": { "line": 9, "character": 10 } }
}]);
let formatted = format_locations(&result, "/project").unwrap();
assert_eq!(formatted, "src/main.rs:10:4");
}
#[test]
fn test_format_symbols() {
let result = serde_json::json!([
{ "name": "main", "kind": 12, "range": { "start": { "line": 0 }, "end": { "line": 5 } } },
{ "name": "Config", "kind": 23, "range": { "start": { "line": 10 }, "end": { "line": 20 } } }
]);
let formatted = format_symbols(&result).unwrap();
assert!(formatted.contains("function main (line 1)"));
assert!(formatted.contains("struct Config (line 11)"));
}
}
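
The tool methods above take one-based line/column and subtract one on the way to the server, while `format_locations` adds one on the way back, because the LSP wire protocol is zero-based. A minimal standalone sketch of that convention (not part of the crate):

```rust
// 1-based (user-facing) <-> 0-based (LSP wire) line conversion, mirroring
// the saturating_sub(1) on requests and the +1 applied to results.
fn to_lsp(line_1based: u32) -> u32 {
    line_1based.saturating_sub(1)
}

fn from_lsp(line_0based: u64) -> u64 {
    line_0based + 1
}

fn main() {
    assert_eq!(to_lsp(10), 9);   // user asks about line 10 -> wire line 9
    assert_eq!(from_lsp(9), 10); // wire line 9 -> displayed as line 10
    assert_eq!(to_lsp(0), 0);    // saturating: a malformed 0 input can't underflow
}
```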


@@ -0,0 +1,8 @@
//! LSP client — spawns language servers and queries them for code intelligence.
//!
//! Manages per-language LSP subprocesses. Provides tools for Sol:
//! lsp_definition, lsp_references, lsp_hover, lsp_diagnostics, lsp_symbols.
pub mod client;
pub mod detect;
pub mod manager;

494
sunbeam/src/code/mod.rs Normal file

@@ -0,0 +1,494 @@
pub mod agent;
pub mod client;
pub mod config;
pub mod lsp;
pub mod project;
pub mod symbols;
pub mod tools;
pub mod tui;
use clap::Subcommand;
use tracing::info;
#[derive(Subcommand, Debug)]
pub enum CodeCommand {
/// Start a coding session (default — can omit subcommand)
Start {
/// Model override (e.g., devstral-small-latest)
#[arg(long)]
model: Option<String>,
/// Sol gRPC endpoint (default: from sunbeam config)
#[arg(long)]
endpoint: Option<String>,
/// Connect to localhost:50051 (dev mode)
#[arg(long, hide = true)]
localhost: bool,
},
/// Demo the TUI with sample data (no Sol connection needed)
#[command(hide = true)]
Demo,
}
pub async fn cmd_code(cmd: Option<CodeCommand>) -> sunbeam_sdk::error::Result<()> {
cmd_code_inner(cmd).await.map_err(|e| sunbeam_sdk::error::SunbeamError::Other(e.to_string()))
}
/// Install a tracing subscriber that writes to a LogBuffer instead of stderr.
/// Returns the guard — when dropped, the subscriber is unset.
fn install_tui_tracing(log_buffer: &tui::LogBuffer) -> tracing::subscriber::DefaultGuard {
use tracing_subscriber::fmt;
use tracing_subscriber::EnvFilter;
let subscriber = fmt::Subscriber::builder()
.with_env_filter(
EnvFilter::try_from_default_env()
.unwrap_or_else(|_| EnvFilter::new("sunbeam=info,sunbeam_sdk=info,warn")),
)
.with_target(false)
.with_ansi(false)
.with_writer(log_buffer.clone())
.finish();
tracing::subscriber::set_default(subscriber)
}
async fn cmd_code_inner(cmd: Option<CodeCommand>) -> anyhow::Result<()> {
let cmd = cmd.unwrap_or(CodeCommand::Start {
model: None,
endpoint: None,
localhost: false,
});
match cmd {
CodeCommand::Demo => {
return run_demo().await;
}
CodeCommand::Start { model, endpoint, localhost } => {
let endpoint = if localhost {
"http://127.0.0.1:50051".into()
} else {
endpoint.unwrap_or_else(|| "http://127.0.0.1:50051".into())
};
// Discover project context
let project = project::discover_project(".")?;
info!(
project = project.name.as_str(),
path = project.path.as_str(),
branch = project.git_branch.as_deref().unwrap_or("?"),
"Discovered project"
);
// Load project config
let cfg = config::load_project_config(&project.path);
let model = model
.or(cfg.model_name.clone())
.unwrap_or_else(|| "mistral-medium-latest".into());
// Connect to Sol
let mut session = client::connect(&endpoint, &project, &cfg, &model).await?;
info!(
session_id = session.session_id.as_str(),
room_id = session.room_id.as_str(),
model = session.model.as_str(),
resumed = session.resumed,
"Connected to Sol"
);
let resumed = session.resumed;
let history: Vec<_> = std::mem::take(&mut session.history);
// Switch tracing to in-memory buffer before entering TUI
let log_buffer = tui::LogBuffer::new();
let _guard = install_tui_tracing(&log_buffer);
// Spawn agent on background task
let project_path = project.path.clone();
let agent = agent::spawn(session, endpoint.clone(), cfg, project.path.clone());
// TUI event loop — never blocks on network I/O
use crossterm::event::{self, Event, KeyCode, KeyModifiers, MouseEventKind};
let mut terminal = tui::setup_terminal()?;
let branch = project.git_branch.as_deref().unwrap_or("?");
let mut app = tui::App::new(&project.name, branch, &model, log_buffer);
// Load persistent command history
app.load_history(&project_path);
// Load conversation history from resumed session (batch, single rebuild)
if resumed {
let entries: Vec<_> = history
.iter()
.filter_map(|msg| match msg.role.as_str() {
"user" => Some(tui::LogEntry::UserInput(msg.content.clone())),
"assistant" => Some(tui::LogEntry::AssistantText(msg.content.clone())),
_ => None,
})
.collect();
app.push_logs(entries);
}
let result = loop {
// 1. Process any pending agent events (non-blocking)
for evt in agent.poll_events() {
match evt {
agent::AgentEvent::ApprovalNeeded { call_id, name, args_summary } => {
app.approval = Some(tui::ApprovalPrompt {
call_id: call_id.clone(),
tool_name: name.clone(),
command: args_summary.clone(),
options: vec![
"yes".into(),
format!("yes, always allow {name}"),
"no".into(),
],
selected: 0,
});
app.needs_redraw = true;
}
agent::AgentEvent::Generating => {
app.is_thinking = true;
app.sol_status.clear();
app.thinking_message = tui::random_sol_status().to_string();
app.thinking_since = Some(std::time::Instant::now());
app.needs_redraw = true;
}
agent::AgentEvent::ToolExecuting { name, detail } => {
app.push_log(tui::LogEntry::ToolExecuting { name, detail });
}
agent::AgentEvent::ToolDone { name, success } => {
if success {
app.push_log(tui::LogEntry::ToolSuccess { name, detail: String::new() });
}
}
agent::AgentEvent::Status { message } => {
app.sol_status = message;
app.needs_redraw = true;
}
agent::AgentEvent::Response { text, input_tokens, output_tokens } => {
app.is_thinking = false;
app.sol_status.clear();
app.thinking_since = None;
app.last_turn_tokens = input_tokens + output_tokens;
app.input_tokens += input_tokens;
app.output_tokens += output_tokens;
app.push_log(tui::LogEntry::AssistantText(text));
}
agent::AgentEvent::Error { message } => {
app.is_thinking = false;
app.sol_status.clear();
app.thinking_since = None;
app.push_log(tui::LogEntry::Error(message));
}
agent::AgentEvent::Health { connected } => {
if app.sol_connected != connected {
app.sol_connected = connected;
app.needs_redraw = true;
}
}
agent::AgentEvent::SessionEnded => {
// A bare `break` here would only exit the event-drain `for` loop;
// flag the app so the outer session loop exits cleanly.
app.should_quit = true;
}
}
}
// 2. Draw only when something changed (or animating)
if app.needs_redraw || app.is_thinking {
terminal.draw(|frame| tui::draw(frame, &mut app))?;
app.needs_redraw = false;
}
// 3. Handle input — shorter poll when animating
let poll_ms = if app.is_thinking { 100 } else { 50 };
if event::poll(std::time::Duration::from_millis(poll_ms))? {
// Drain all queued events in one batch (coalesces rapid scroll)
while event::poll(std::time::Duration::ZERO)? {
match event::read()? {
Event::Mouse(mouse) => {
match mouse.kind {
MouseEventKind::ScrollUp | MouseEventKind::ScrollDown => {
app.needs_redraw = true;
let size = terminal.size().unwrap_or_default();
let viewport_h = size.height.saturating_sub(5);
let delta: i16 = if matches!(mouse.kind, MouseEventKind::ScrollUp) { -3 } else { 3 };
if app.show_logs {
if delta < 0 {
app.log_scroll = if app.log_scroll == u16::MAX { u16::MAX.saturating_sub(3) } else { app.log_scroll.saturating_sub(3) };
} else {
app.log_scroll = app.log_scroll.saturating_add(3);
}
} else {
app.resolve_scroll(size.width, viewport_h);
if delta < 0 {
app.scroll_offset = app.scroll_offset.saturating_sub(3);
} else {
app.scroll_offset = app.scroll_offset.saturating_add(3);
}
}
}
_ => {} // Ignore MouseEventKind::Moved and other mouse events
}
}
Event::Key(key) => {
app.needs_redraw = true;
match key.code {
KeyCode::Char('c') if key.modifiers.contains(KeyModifiers::CONTROL) => {
agent.end();
app.should_quit = true;
break; // exit drain loop
}
KeyCode::Char('l') if key.modifiers.contains(KeyModifiers::ALT) => {
app.show_logs = !app.show_logs;
app.log_scroll = u16::MAX;
}
// Approval prompt navigation
KeyCode::Up if app.approval.is_some() => {
if let Some(ref mut a) = app.approval {
a.selected = a.selected.saturating_sub(1);
}
}
KeyCode::Down if app.approval.is_some() => {
if let Some(ref mut a) = app.approval {
a.selected = (a.selected + 1).min(a.options.len() - 1);
}
}
KeyCode::Enter if app.approval.is_some() => {
if let Some(a) = app.approval.take() {
let decision = match a.selected {
0 => agent::ApprovalDecision::Approved {
call_id: a.call_id.clone(),
},
1 => agent::ApprovalDecision::ApprovedAlways {
call_id: a.call_id.clone(),
tool_name: a.tool_name.clone(),
},
_ => agent::ApprovalDecision::Denied {
call_id: a.call_id.clone(),
},
};
agent.decide(decision);
}
}
KeyCode::Char(c) if !app.show_logs && app.approval.is_none() => {
app.history_index = None;
app.input.insert(app.cursor_pos, c);
app.cursor_pos += 1;
}
KeyCode::Backspace if !app.show_logs && app.approval.is_none() => {
if app.cursor_pos > 0 {
app.history_index = None;
app.cursor_pos -= 1;
app.input.remove(app.cursor_pos);
}
}
KeyCode::Left if !app.show_logs && app.approval.is_none() => app.cursor_pos = app.cursor_pos.saturating_sub(1),
KeyCode::Right if !app.show_logs && app.approval.is_none() => app.cursor_pos = (app.cursor_pos + 1).min(app.input.len()),
KeyCode::Up if !app.show_logs => {
if !app.command_history.is_empty() {
let idx = match app.history_index {
None => {
app.input_saved = app.input.clone();
app.command_history.len() - 1
}
Some(i) => i.saturating_sub(1),
};
app.history_index = Some(idx);
app.input = app.command_history[idx].clone();
app.cursor_pos = app.input.len();
}
}
KeyCode::Down if !app.show_logs => {
if let Some(idx) = app.history_index {
if idx + 1 < app.command_history.len() {
let new_idx = idx + 1;
app.history_index = Some(new_idx);
app.input = app.command_history[new_idx].clone();
app.cursor_pos = app.input.len();
} else {
app.history_index = None;
app.input = app.input_saved.clone();
app.cursor_pos = app.input.len();
}
}
}
KeyCode::Up if app.show_logs => {
app.log_scroll = if app.log_scroll == u16::MAX { u16::MAX.saturating_sub(1) } else { app.log_scroll.saturating_sub(1) };
}
KeyCode::Down if app.show_logs => {
app.log_scroll = app.log_scroll.saturating_add(1);
}
KeyCode::PageUp => {
let size = terminal.size().unwrap_or_default();
app.resolve_scroll(size.width, size.height.saturating_sub(5));
app.scroll_offset = app.scroll_offset.saturating_sub(20);
}
KeyCode::PageDown => {
let size = terminal.size().unwrap_or_default();
app.resolve_scroll(size.width, size.height.saturating_sub(5));
app.scroll_offset = app.scroll_offset.saturating_add(20);
}
KeyCode::Home => app.scroll_offset = 0,
KeyCode::End => app.scroll_offset = u16::MAX,
KeyCode::Enter if !app.show_logs && !app.is_thinking => {
if !app.input.is_empty() {
let text = app.input.clone();
app.command_history.push(text.clone());
app.history_index = None;
app.input.clear();
app.cursor_pos = 0;
if text == "/exit" {
agent.end();
app.should_quit = true;
break; // exit drain loop
}
app.push_log(tui::LogEntry::UserInput(text.clone()));
agent.chat(&text);
}
}
_ => {}
}
}
_ => {}
} // match event::read
} // while poll(ZERO)
} // if poll(50ms)
if app.should_quit {
break Ok(());
}
};
app.save_history(&project_path);
tui::restore_terminal(&mut terminal)?;
result
}
}
}
async fn run_demo() -> anyhow::Result<()> {
use crossterm::event::{self, Event, KeyCode, KeyModifiers};
let log_buffer = tui::LogBuffer::new();
let _guard = install_tui_tracing(&log_buffer);
let mut terminal = tui::setup_terminal()?;
let mut app = tui::App::new("sol", "mainline ±", "devstral-small-latest", log_buffer);
// Populate with sample conversation
app.push_log(tui::LogEntry::UserInput("fix the token validation bug in auth.rs".into()));
app.push_log(tui::LogEntry::AssistantText(
"Looking at the auth module, I can see the issue on line 42 where the token \
is not properly validated before use. The expiry check is missing entirely."
.into(),
));
app.push_log(tui::LogEntry::ToolSuccess {
name: "file_read".into(),
detail: "src/auth.rs (127 lines)".into(),
});
app.push_log(tui::LogEntry::ToolOutput {
lines: vec![
"38│ fn validate_token(token: &str) -> bool {".into(),
"39│ let decoded = decode(token);".into(),
"40│ // BUG: missing expiry check".into(),
"41│ decoded.is_ok()".into(),
"42│ }".into(),
"43│".into(),
"44│ fn refresh_token(token: &str) -> Result<String> {".into(),
"45│ let client = reqwest::Client::new();".into(),
"46│ // ...".into(),
],
collapsed: true,
});
app.push_log(tui::LogEntry::ToolSuccess {
name: "search_replace".into(),
detail: "src/auth.rs — applied 1 replacement (line 41)".into(),
});
app.push_log(tui::LogEntry::ToolExecuting {
name: "bash".into(),
detail: "cargo test --lib".into(),
});
app.push_log(tui::LogEntry::ToolOutput {
lines: vec![
"running 23 tests".into(),
"test auth::tests::test_validate_token ... ok".into(),
"test auth::tests::test_expired_token ... ok".into(),
"test auth::tests::test_refresh_flow ... ok".into(),
"test result: ok. 23 passed; 0 failed".into(),
],
collapsed: false,
});
app.push_log(tui::LogEntry::AssistantText(
"Fixed. The token validation now checks expiry before use. All 23 tests pass."
.into(),
));
app.push_log(tui::LogEntry::UserInput("now add rate limiting to the auth endpoint".into()));
app.push_log(tui::LogEntry::ToolExecuting {
name: "file_read".into(),
detail: "src/routes/auth.rs".into(),
});
app.is_thinking = true;
app.input_tokens = 2400;
app.output_tokens = 890;
loop {
terminal.draw(|frame| tui::draw(frame, &mut app))?;
if event::poll(std::time::Duration::from_millis(100))? {
if let Event::Key(key) = event::read()? {
match key.code {
KeyCode::Char('c') if key.modifiers.contains(KeyModifiers::CONTROL) => break,
KeyCode::Char('q') => break,
KeyCode::Char('l') if key.modifiers.contains(KeyModifiers::ALT) => {
app.show_logs = !app.show_logs;
app.log_scroll = u16::MAX;
}
KeyCode::Char(c) => {
app.input.insert(app.cursor_pos, c);
app.cursor_pos += 1;
}
KeyCode::Backspace => {
if app.cursor_pos > 0 {
app.cursor_pos -= 1;
app.input.remove(app.cursor_pos);
}
}
KeyCode::Left => {
app.cursor_pos = app.cursor_pos.saturating_sub(1);
}
KeyCode::Right => {
app.cursor_pos = (app.cursor_pos + 1).min(app.input.len());
}
KeyCode::Enter => {
if !app.input.is_empty() {
let text = app.input.clone();
app.input.clear();
app.cursor_pos = 0;
if text == "/exit" {
break;
}
app.push_log(tui::LogEntry::UserInput(text));
app.is_thinking = true;
}
}
KeyCode::Up => {
app.scroll_offset = app.scroll_offset.saturating_sub(1);
}
KeyCode::Down => {
app.scroll_offset = app.scroll_offset.saturating_add(1);
}
_ => {}
}
}
}
}
tui::restore_terminal(&mut terminal)?;
Ok(())
}
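The Up/Down command-history handling in the input loop above (save the in-progress draft on first Up, walk back through history, restore the draft when Down moves past the newest entry) can be modeled as a small standalone type. This is an illustrative sketch only; the `HistoryNav` name and shape are not part of the codebase:

```rust
// Hypothetical model of the TUI's history navigation: Up saves the current
// draft and steps back; Down steps forward and restores the draft at the end.
struct HistoryNav {
    history: Vec<String>,
    index: Option<usize>,
    draft: String,
}

impl HistoryNav {
    fn up(&mut self, current: &str) -> Option<String> {
        if self.history.is_empty() {
            return None;
        }
        let idx = match self.index {
            None => {
                // First Up press: stash whatever was being typed.
                self.draft = current.to_string();
                self.history.len() - 1
            }
            Some(i) => i.saturating_sub(1),
        };
        self.index = Some(idx);
        Some(self.history[idx].clone())
    }

    fn down(&mut self) -> Option<String> {
        let idx = self.index?;
        if idx + 1 < self.history.len() {
            self.index = Some(idx + 1);
            Some(self.history[idx + 1].clone())
        } else {
            // Walked past the newest entry: restore the saved draft.
            self.index = None;
            Some(self.draft.clone())
        }
    }
}

fn main() {
    let mut nav = HistoryNav {
        history: vec!["a".into(), "b".into()],
        index: None,
        draft: String::new(),
    };
    assert_eq!(nav.up("draft"), Some("b".to_string()));
    assert_eq!(nav.up(""), Some("a".to_string()));
    assert_eq!(nav.down(), Some("b".to_string()));
    assert_eq!(nav.down(), Some("draft".to_string()));
    println!("ok");
}
```

Keeping the saved-draft restore in the same place as the index bookkeeping avoids the classic bug where the draft is lost after a full Up/Down round trip.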

131
sunbeam/src/code/project.rs Normal file

@@ -0,0 +1,131 @@
use std::path::{Path, PathBuf};
use std::process::Command;
/// Discovered project context sent to Sol on session start.
pub struct ProjectContext {
pub name: String,
pub path: String,
pub prompt_md: String,
pub config_toml: String,
pub git_branch: Option<String>,
pub git_status: Option<String>,
pub file_tree: Vec<String>,
}
/// Discover project context from the working directory.
pub fn discover_project(dir: &str) -> anyhow::Result<ProjectContext> {
let path = std::fs::canonicalize(dir)?;
let name = path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("unknown")
.to_string();
let prompt_md = read_optional(&path.join(".sunbeam").join("prompt.md"));
let config_toml = read_optional(&path.join(".sunbeam").join("config.toml"));
let git_branch = run_git(&path, &["rev-parse", "--abbrev-ref", "HEAD"]);
let git_status = run_git(&path, &["status", "--short"]);
let file_tree = list_tree(&path, 2);
Ok(ProjectContext {
name,
path: path.to_string_lossy().into(),
prompt_md,
config_toml,
git_branch,
git_status,
file_tree,
})
}
fn read_optional(path: &Path) -> String {
std::fs::read_to_string(path).unwrap_or_default()
}
fn run_git(dir: &Path, args: &[&str]) -> Option<String> {
Command::new("git")
.args(args)
.current_dir(dir)
.output()
.ok()
.filter(|o| o.status.success())
.map(|o| String::from_utf8_lossy(&o.stdout).trim().to_string())
}
fn list_tree(dir: &Path, max_depth: usize) -> Vec<String> {
let mut entries = Vec::new();
list_tree_inner(dir, dir, 0, max_depth, &mut entries);
entries
}
fn list_tree_inner(
base: &Path,
dir: &Path,
depth: usize,
max_depth: usize,
entries: &mut Vec<String>,
) {
if depth > max_depth {
return;
}
let Ok(read_dir) = std::fs::read_dir(dir) else {
return;
};
let mut items: Vec<_> = read_dir.filter_map(|e| e.ok()).collect();
items.sort_by_key(|e| e.file_name());
for entry in items {
let name = entry.file_name().to_string_lossy().to_string();
// Skip hidden entries (files and dirs) plus target, node_modules, vendor
if name.starts_with('.') || name == "target" || name == "node_modules" || name == "vendor"
{
continue;
}
let relative = entry
.path()
.strip_prefix(base)
.unwrap_or(&entry.path())
.to_string_lossy()
.to_string();
entries.push(relative);
if entry.file_type().map(|t| t.is_dir()).unwrap_or(false) {
list_tree_inner(base, &entry.path(), depth + 1, max_depth, entries);
}
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_discover_current_dir() {
// Should work in any directory
let ctx = discover_project(".").unwrap();
assert!(!ctx.name.is_empty());
assert!(!ctx.path.is_empty());
}
#[test]
fn test_list_tree_excludes_hidden() {
let dir = std::env::temp_dir().join("sunbeam-test-tree");
let _ = std::fs::create_dir_all(dir.join(".hidden"));
let _ = std::fs::create_dir_all(dir.join("visible"));
let _ = std::fs::write(dir.join("file.txt"), "test");
let tree = list_tree(&dir, 1);
assert!(tree.iter().any(|e| e == "visible"));
assert!(tree.iter().any(|e| e == "file.txt"));
assert!(!tree.iter().any(|e| e.contains(".hidden")));
let _ = std::fs::remove_dir_all(&dir);
}
}
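The `run_git` helper's pattern, spawn a subprocess in a directory, keep stdout only when the exit status is success, and trim it, generalizes to any command. A minimal runnable sketch, using `echo` in place of `git` so it needs no repository (`run_in` is an illustrative name, not part of the codebase):

```rust
use std::path::Path;
use std::process::Command;

// Sketch of the run_git pattern above: Ok + success status → Some(trimmed
// stdout); spawn failure or nonzero exit → None.
fn run_in(dir: &Path, program: &str, args: &[&str]) -> Option<String> {
    Command::new(program)
        .args(args)
        .current_dir(dir)
        .output()
        .ok()
        .filter(|o| o.status.success())
        .map(|o| String::from_utf8_lossy(&o.stdout).trim().to_string())
}

fn main() {
    let dir = std::env::temp_dir();
    // A command that succeeds yields its trimmed stdout.
    assert_eq!(run_in(&dir, "echo", &["hi"]), Some("hi".to_string()));
    // A binary that cannot be spawned yields None rather than an error.
    assert_eq!(run_in(&dir, "no-such-binary-xyz", &[]), None);
    println!("ok");
}
```

Collapsing both failure modes (spawn error, nonzero exit) to `None` is what lets `discover_project` treat "not a git repo" and "git not installed" identically.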

659
sunbeam/src/code/symbols.rs Normal file

@@ -0,0 +1,659 @@
//! Symbol extraction from source code using tree-sitter.
//!
//! Extracts function signatures, struct/enum/trait definitions, and
//! docstrings from Rust, TypeScript, and Python files. These symbols
//! are sent to Sol for indexing in the code search index.
use std::path::Path;
use tracing::debug;
/// An extracted code symbol with file context.
#[derive(Debug, Clone)]
pub struct ProjectSymbol {
pub file_path: String, // relative to project root
pub name: String,
pub kind: String,
pub signature: String,
pub docstring: String,
pub start_line: u32,
pub end_line: u32,
pub language: String,
pub content: String,
}
/// Extract symbols from all source files in a project.
pub fn extract_project_symbols(project_root: &str) -> Vec<ProjectSymbol> {
let root = Path::new(project_root);
let mut symbols = Vec::new();
walk_directory(root, root, &mut symbols);
debug!(count = symbols.len(), "Extracted project symbols");
symbols
}
fn walk_directory(dir: &Path, root: &Path, symbols: &mut Vec<ProjectSymbol>) {
let Ok(entries) = std::fs::read_dir(dir) else { return };
for entry in entries.flatten() {
let path = entry.path();
let name = entry.file_name().to_string_lossy().to_string();
// Skip hidden, vendor, target, node_modules, etc.
if name.starts_with('.') || name == "target" || name == "vendor"
|| name == "node_modules" || name == "dist" || name == "build"
|| name == "__pycache__" || name == ".git"
{
continue;
}
if path.is_dir() {
walk_directory(&path, root, symbols);
} else if path.is_file() {
let path_str = path.to_string_lossy().to_string();
if detect_language(&path_str).is_some() {
// Read file (skip large files)
if let Ok(content) = std::fs::read_to_string(&path) {
if content.len() > 100_000 { continue; } // skip >100KB
let rel_path = path.strip_prefix(root)
.map(|p| p.to_string_lossy().to_string())
.unwrap_or(path_str.clone());
for sym in extract_symbols(&path_str, &content) {
// Build content: signature + body up to 500 chars
let body_start = content.lines()
.take(sym.start_line as usize - 1)
.map(|l| l.len() + 1)
.sum::<usize>();
let body_end = content.lines()
.take(sym.end_line as usize)
.map(|l| l.len() + 1)
.sum::<usize>()
.min(content.len());
let body = &content[body_start..body_end];
let truncated = if body.len() > 500 {
// Back the cut point up to a char boundary so slicing
// can't panic mid-UTF-8 character.
let mut end = 500;
while !body.is_char_boundary(end) { end -= 1; }
body[..end].to_string()
} else {
body.to_string()
};
symbols.push(ProjectSymbol {
file_path: rel_path.clone(),
name: sym.name,
kind: sym.kind,
signature: sym.signature,
docstring: sym.docstring,
start_line: sym.start_line,
end_line: sym.end_line,
language: sym.language,
content: truncated,
});
}
}
}
}
}
}
/// An extracted code symbol.
#[derive(Debug, Clone)]
pub struct CodeSymbol {
pub name: String,
pub kind: String, // "function", "struct", "enum", "trait", "class", "interface", "method"
pub signature: String, // full signature line
pub docstring: String, // doc comment / docstring
pub start_line: u32, // 1-based
pub end_line: u32, // 1-based
pub language: String,
}
/// Detect language from file extension.
pub fn detect_language(path: &str) -> Option<&'static str> {
let ext = Path::new(path).extension()?.to_str()?;
match ext {
"rs" => Some("rust"),
"ts" | "tsx" => Some("typescript"),
"js" | "jsx" => Some("javascript"),
"py" => Some("python"),
_ => None,
}
}
/// Extract symbols from a source file's content.
pub fn extract_symbols(path: &str, content: &str) -> Vec<CodeSymbol> {
let Some(lang) = detect_language(path) else {
return Vec::new();
};
match lang {
"rust" => extract_rust_symbols(content),
"typescript" | "javascript" => extract_ts_symbols(content),
"python" => extract_python_symbols(content),
_ => Vec::new(),
}
}
// ── Rust ────────────────────────────────────────────────────────────────
fn extract_rust_symbols(content: &str) -> Vec<CodeSymbol> {
let mut parser = tree_sitter::Parser::new();
parser.set_language(&tree_sitter_rust::LANGUAGE.into()).ok();
let Some(tree) = parser.parse(content, None) else {
return Vec::new();
};
let mut symbols = Vec::new();
let root = tree.root_node();
let bytes = content.as_bytes();
walk_rust_node(root, bytes, content, &mut symbols);
symbols
}
fn walk_rust_node(
node: tree_sitter::Node,
bytes: &[u8],
source: &str,
symbols: &mut Vec<CodeSymbol>,
) {
match node.kind() {
"function_item" | "function_signature_item" => {
if let Some(sym) = extract_rust_function(node, bytes, source) {
symbols.push(sym);
}
}
"struct_item" => {
if let Some(sym) = extract_rust_type(node, bytes, source, "struct") {
symbols.push(sym);
}
}
"enum_item" => {
if let Some(sym) = extract_rust_type(node, bytes, source, "enum") {
symbols.push(sym);
}
}
"trait_item" => {
if let Some(sym) = extract_rust_type(node, bytes, source, "trait") {
symbols.push(sym);
}
}
"impl_item" => {
// Walk impl methods
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
if child.kind() == "declaration_list" {
walk_rust_node(child, bytes, source, symbols);
}
}
}
}
_ => {
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
walk_rust_node(child, bytes, source, symbols);
}
}
}
}
}
fn extract_rust_function(node: tree_sitter::Node, bytes: &[u8], source: &str) -> Option<CodeSymbol> {
let name = node.child_by_field_name("name")?;
let name_str = name.utf8_text(bytes).ok()?.to_string();
// Build signature: everything from start to the opening brace (or end if no body)
let start_byte = node.start_byte();
let sig_end = find_rust_sig_end(node, source);
let signature = source[start_byte..sig_end].trim().to_string();
// Extract doc comment (line comments starting with /// before the function)
let docstring = extract_rust_doc_comment(node, source);
Some(CodeSymbol {
name: name_str,
kind: "function".into(),
signature,
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "rust".into(),
})
}
fn extract_rust_type(node: tree_sitter::Node, bytes: &[u8], source: &str, kind: &str) -> Option<CodeSymbol> {
let name = node.child_by_field_name("name")?;
let name_str = name.utf8_text(bytes).ok()?.to_string();
// Signature: first line of the definition
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
let signature = source[start..first_line_end].trim().to_string();
let docstring = extract_rust_doc_comment(node, source);
Some(CodeSymbol {
name: name_str,
kind: kind.into(),
signature,
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "rust".into(),
})
}
fn find_rust_sig_end(node: tree_sitter::Node, source: &str) -> usize {
// Find the opening brace
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
if child.kind() == "block" || child.kind() == "field_declaration_list"
|| child.kind() == "enum_variant_list" || child.kind() == "declaration_list"
{
return child.start_byte();
}
}
}
// No body (e.g., trait method signature)
node.end_byte().min(source.len())
}
fn extract_rust_doc_comment(node: tree_sitter::Node, source: &str) -> String {
let start_line = node.start_position().row;
if start_line == 0 {
return String::new();
}
let lines: Vec<&str> = source.lines().collect();
let mut doc_lines = Vec::new();
// Walk backwards from the line before the node
let mut line_idx = start_line.saturating_sub(1);
loop {
if line_idx >= lines.len() {
break;
}
let line = lines[line_idx].trim();
if line.starts_with("///") {
doc_lines.push(line.trim_start_matches("///").trim());
} else if line.starts_with("#[") || line.is_empty() {
// Skip attributes and blank lines between doc and function
if line.is_empty() && !doc_lines.is_empty() {
break; // blank line after doc block = stop
}
} else {
break;
}
if line_idx == 0 {
break;
}
line_idx -= 1;
}
doc_lines.reverse();
doc_lines.join("\n")
}
// ── TypeScript / JavaScript ─────────────────────────────────────────────
fn extract_ts_symbols(content: &str) -> Vec<CodeSymbol> {
let mut parser = tree_sitter::Parser::new();
parser.set_language(&tree_sitter_typescript::LANGUAGE_TYPESCRIPT.into()).ok();
let Some(tree) = parser.parse(content, None) else {
return Vec::new();
};
let mut symbols = Vec::new();
walk_ts_node(tree.root_node(), content.as_bytes(), content, &mut symbols);
symbols
}
fn walk_ts_node(
node: tree_sitter::Node,
bytes: &[u8],
source: &str,
symbols: &mut Vec<CodeSymbol>,
) {
match node.kind() {
"function_declaration" | "method_definition" | "arrow_function" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
if !name_str.is_empty() {
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
symbols.push(CodeSymbol {
name: name_str,
kind: "function".into(),
signature: source[start..first_line_end].trim().to_string(),
docstring: String::new(), // TODO: JSDoc extraction
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "typescript".into(),
});
}
}
}
"class_declaration" | "interface_declaration" | "type_alias_declaration" | "enum_declaration" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
let kind = match node.kind() {
"class_declaration" => "class",
"interface_declaration" => "interface",
"enum_declaration" => "enum",
_ => "type",
};
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
symbols.push(CodeSymbol {
name: name_str,
kind: kind.into(),
signature: source[start..first_line_end].trim().to_string(),
docstring: String::new(),
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "typescript".into(),
});
}
}
_ => {}
}
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
walk_ts_node(child, bytes, source, symbols);
}
}
}
// ── Python ──────────────────────────────────────────────────────────────
fn extract_python_symbols(content: &str) -> Vec<CodeSymbol> {
let mut parser = tree_sitter::Parser::new();
parser.set_language(&tree_sitter_python::LANGUAGE.into()).ok();
let Some(tree) = parser.parse(content, None) else {
return Vec::new();
};
let mut symbols = Vec::new();
walk_python_node(tree.root_node(), content.as_bytes(), content, &mut symbols);
symbols
}
fn walk_python_node(
node: tree_sitter::Node,
bytes: &[u8],
source: &str,
symbols: &mut Vec<CodeSymbol>,
) {
match node.kind() {
"function_definition" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
let docstring = extract_python_docstring(node, bytes);
symbols.push(CodeSymbol {
name: name_str,
kind: "function".into(),
signature: source[start..first_line_end].trim().to_string(),
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "python".into(),
});
}
}
"class_definition" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
let docstring = extract_python_docstring(node, bytes);
symbols.push(CodeSymbol {
name: name_str,
kind: "class".into(),
signature: source[start..first_line_end].trim().to_string(),
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "python".into(),
});
}
}
_ => {}
}
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
walk_python_node(child, bytes, source, symbols);
}
}
}
fn extract_python_docstring(node: tree_sitter::Node, bytes: &[u8]) -> String {
// Python docstrings are the first expression_statement in the body
if let Some(body) = node.child_by_field_name("body") {
if let Some(first_stmt) = body.child(0) {
if first_stmt.kind() == "expression_statement" {
if let Some(expr) = first_stmt.child(0) {
if expr.kind() == "string" {
let text = expr.utf8_text(bytes).unwrap_or("");
// Strip triple quotes
let trimmed = text
.trim_start_matches("\"\"\"")
.trim_start_matches("'''")
.trim_end_matches("\"\"\"")
.trim_end_matches("'''")
.trim();
return trimmed.to_string();
}
}
}
}
}
String::new()
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_detect_language() {
assert_eq!(detect_language("src/main.rs"), Some("rust"));
assert_eq!(detect_language("app.ts"), Some("typescript"));
assert_eq!(detect_language("app.tsx"), Some("typescript"));
assert_eq!(detect_language("script.py"), Some("python"));
assert_eq!(detect_language("script.js"), Some("javascript"));
assert_eq!(detect_language("data.json"), None);
assert_eq!(detect_language("README.md"), None);
}
#[test]
fn test_extract_rust_function() {
let source = r#"
/// Generate a response.
pub async fn generate(&self, req: &GenerateRequest) -> Option<String> {
self.run_and_emit(req).await
}
"#;
let symbols = extract_rust_symbols(source);
assert!(!symbols.is_empty(), "Should extract at least one symbol");
let func = &symbols[0];
assert_eq!(func.name, "generate");
assert_eq!(func.kind, "function");
assert!(func.signature.contains("pub async fn generate"));
assert!(func.docstring.contains("Generate a response"));
assert_eq!(func.language, "rust");
}
#[test]
fn test_extract_rust_struct() {
let source = r#"
/// A request to generate.
pub struct GenerateRequest {
pub text: String,
pub user_id: String,
}
"#;
let symbols = extract_rust_symbols(source);
let structs: Vec<_> = symbols.iter().filter(|s| s.kind == "struct").collect();
assert!(!structs.is_empty());
assert_eq!(structs[0].name, "GenerateRequest");
assert!(structs[0].docstring.contains("request to generate"));
}
#[test]
fn test_extract_rust_enum() {
let source = r#"
/// Whether server or client.
pub enum ToolSide {
Server,
Client,
}
"#;
let symbols = extract_rust_symbols(source);
let enums: Vec<_> = symbols.iter().filter(|s| s.kind == "enum").collect();
assert!(!enums.is_empty());
assert_eq!(enums[0].name, "ToolSide");
}
#[test]
fn test_extract_rust_trait() {
let source = r#"
pub trait Executor {
fn execute(&self, args: &str) -> String;
}
"#;
let symbols = extract_rust_symbols(source);
let traits: Vec<_> = symbols.iter().filter(|s| s.kind == "trait").collect();
assert!(!traits.is_empty());
assert_eq!(traits[0].name, "Executor");
}
#[test]
fn test_extract_rust_impl_methods() {
let source = r#"
impl Orchestrator {
/// Create new.
pub fn new(config: Config) -> Self {
Self { config }
}
/// Subscribe to events.
pub fn subscribe(&self) -> Receiver {
self.tx.subscribe()
}
}
"#;
let symbols = extract_rust_symbols(source);
let fns: Vec<_> = symbols.iter().filter(|s| s.kind == "function").collect();
assert!(fns.len() >= 2, "Should find impl methods, got {}", fns.len());
let names: Vec<&str> = fns.iter().map(|s| s.name.as_str()).collect();
assert!(names.contains(&"new"));
assert!(names.contains(&"subscribe"));
}
#[test]
fn test_extract_ts_function() {
let source = r#"
function greet(name: string): string {
return `Hello, ${name}`;
}
"#;
let symbols = extract_ts_symbols(source);
assert!(!symbols.is_empty());
assert_eq!(symbols[0].name, "greet");
assert_eq!(symbols[0].kind, "function");
}
#[test]
fn test_extract_ts_class() {
let source = r#"
class UserService {
constructor(private db: Database) {}
async getUser(id: string): Promise<User> {
return this.db.find(id);
}
}
"#;
let symbols = extract_ts_symbols(source);
let classes: Vec<_> = symbols.iter().filter(|s| s.kind == "class").collect();
assert!(!classes.is_empty());
assert_eq!(classes[0].name, "UserService");
}
#[test]
fn test_extract_ts_interface() {
let source = r#"
interface User {
id: string;
name: string;
email?: string;
}
"#;
let symbols = extract_ts_symbols(source);
let ifaces: Vec<_> = symbols.iter().filter(|s| s.kind == "interface").collect();
assert!(!ifaces.is_empty());
assert_eq!(ifaces[0].name, "User");
}
#[test]
fn test_extract_python_function() {
let source = r#"
def process_data(items: list[str]) -> dict:
"""Process a list of items into a dictionary."""
return {item: len(item) for item in items}
"#;
let symbols = extract_python_symbols(source);
assert!(!symbols.is_empty());
assert_eq!(symbols[0].name, "process_data");
assert_eq!(symbols[0].kind, "function");
assert!(symbols[0].docstring.contains("Process a list"));
}
#[test]
fn test_extract_python_class() {
let source = r#"
class DataProcessor:
"""Processes data from various sources."""
def __init__(self, config):
self.config = config
def run(self):
pass
"#;
let symbols = extract_python_symbols(source);
let classes: Vec<_> = symbols.iter().filter(|s| s.kind == "class").collect();
assert!(!classes.is_empty());
assert_eq!(classes[0].name, "DataProcessor");
assert!(classes[0].docstring.contains("Processes data"));
}
#[test]
fn test_extract_symbols_unknown_language() {
let symbols = extract_symbols("data.json", "{}");
assert!(symbols.is_empty());
}
#[test]
fn test_extract_symbols_empty_file() {
let symbols = extract_symbols("empty.rs", "");
assert!(symbols.is_empty());
}
#[test]
fn test_line_numbers_are_1_based() {
let source = "fn first() {}\nfn second() {}\nfn third() {}";
let symbols = extract_rust_symbols(source);
assert!(symbols.len() >= 3);
assert_eq!(symbols[0].start_line, 1);
assert_eq!(symbols[1].start_line, 2);
assert_eq!(symbols[2].start_line, 3);
}
}

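The 1-based line-number convention asserted in `test_line_numbers_are_1_based` can be sketched in isolation. `extract_fn_lines` below is a hypothetical minimal scanner, not the real extractor; it only demonstrates the `enumerate() + 1` convention the tests rely on.

```rust
// Hypothetical minimal scanner (not the real extractor): reports 1-based
// start lines for `fn` definitions, matching the convention asserted by
// test_line_numbers_are_1_based.
fn extract_fn_lines(source: &str) -> Vec<(String, usize)> {
    source
        .lines()
        .enumerate()
        .filter_map(|(i, line)| {
            line.trim_start().strip_prefix("fn ").map(|rest| {
                let name = rest
                    .split(|c: char| c == '(' || c.is_whitespace())
                    .next()
                    .unwrap_or("")
                    .to_string();
                (name, i + 1) // enumerate() is 0-based; editor lines are 1-based
            })
        })
        .collect()
}
```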
345
sunbeam/src/code/tools.rs Normal file

@@ -0,0 +1,345 @@
use std::path::Path;
use std::process::Command;
use serde_json::Value;
use tracing::info;
/// Execute a client-side tool and return the result as a string.
pub fn execute(name: &str, args_json: &str, project_root: &str) -> String {
let args: Value = serde_json::from_str(args_json).unwrap_or_default();
match name {
"file_read" => file_read(&args, project_root),
"file_write" => file_write(&args, project_root),
"search_replace" => search_replace(&args, project_root),
"grep" => grep(&args, project_root),
"bash" => bash(&args, project_root),
"list_directory" => list_directory(&args, project_root),
_ => format!("Unknown client tool: {name}"),
}
}
/// Execute an LSP tool asynchronously. Returns None if tool is not an LSP tool.
pub async fn execute_lsp(
name: &str,
args_json: &str,
lsp: &mut super::lsp::manager::LspManager,
) -> Option<String> {
let args: Value = serde_json::from_str(args_json).unwrap_or_default();
let result = match name {
"lsp_definition" => {
let path = args["path"].as_str().unwrap_or("");
let line = args["line"].as_u64().unwrap_or(1) as u32;
let col = args["column"].as_u64().unwrap_or(1) as u32;
Some(lsp.definition(path, line, col).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
"lsp_references" => {
let path = args["path"].as_str().unwrap_or("");
let line = args["line"].as_u64().unwrap_or(1) as u32;
let col = args["column"].as_u64().unwrap_or(1) as u32;
Some(lsp.references(path, line, col).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
"lsp_hover" => {
let path = args["path"].as_str().unwrap_or("");
let line = args["line"].as_u64().unwrap_or(1) as u32;
let col = args["column"].as_u64().unwrap_or(1) as u32;
Some(lsp.hover(path, line, col).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
"lsp_diagnostics" => {
let path = args["path"].as_str().unwrap_or("");
if path.is_empty() {
Some("Specify a file path for diagnostics.".into())
} else {
// TODO: return cached diagnostics from publishDiagnostics
Some("Diagnostics not yet implemented. Use `bash` with `cargo check` or equivalent.".into())
}
}
"lsp_symbols" => {
let path = args["path"].as_str();
let query = args["query"].as_str().unwrap_or("");
if let Some(path) = path {
Some(lsp.document_symbols(path).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
} else {
Some(lsp.workspace_symbols(query, None).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
}
_ => None,
};
result
}
/// Check if a tool name is an LSP tool.
pub fn is_lsp_tool(name: &str) -> bool {
matches!(name, "lsp_definition" | "lsp_references" | "lsp_hover" | "lsp_diagnostics" | "lsp_symbols")
}
fn resolve_path(path: &str, project_root: &str) -> String {
let p = Path::new(path);
if p.is_absolute() {
path.to_string()
} else {
Path::new(project_root)
.join(path)
.to_string_lossy()
.into()
}
}
fn file_read(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or("");
let resolved = resolve_path(path, root);
let content = match std::fs::read_to_string(&resolved) {
Ok(c) => c,
Err(e) => return format!("Error reading {path}: {e}"),
};
let start = args["start_line"].as_u64().map(|n| n as usize);
let end = args["end_line"].as_u64().map(|n| n as usize);
match (start, end) {
(Some(s), Some(e)) => {
let lines: Vec<&str> = content.lines().collect();
let s = s.saturating_sub(1).min(lines.len());
let e = e.min(lines.len());
lines[s..e].join("\n")
}
(Some(s), None) => {
let lines: Vec<&str> = content.lines().collect();
let s = s.saturating_sub(1).min(lines.len());
lines[s..].join("\n")
}
_ => content,
}
}
fn file_write(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or("");
let content = args["content"].as_str().unwrap_or("");
let resolved = resolve_path(path, root);
// Ensure parent directory exists
if let Some(parent) = Path::new(&resolved).parent() {
let _ = std::fs::create_dir_all(parent);
}
match std::fs::write(&resolved, content) {
Ok(_) => format!("Wrote {} bytes to {path}", content.len()),
Err(e) => format!("Error writing {path}: {e}"),
}
}
fn search_replace(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or("");
let diff = args["diff"].as_str().unwrap_or("");
let resolved = resolve_path(path, root);
let content = match std::fs::read_to_string(&resolved) {
Ok(c) => c,
Err(e) => return format!("Error reading {path}: {e}"),
};
// Parse SEARCH/REPLACE blocks
let mut result = content.clone();
let mut replacements = 0;
for block in diff.split("<<<< SEARCH\n").skip(1) {
let parts: Vec<&str> = block.splitn(2, "=====\n").collect();
if parts.len() != 2 {
continue;
}
let search = parts[0].trim_end_matches('\n');
let rest: Vec<&str> = parts[1].splitn(2, ">>>>> REPLACE").collect();
if rest.is_empty() {
continue;
}
let replace = rest[0].trim_end_matches('\n');
if result.contains(search) {
result = result.replacen(search, replace, 1);
replacements += 1;
}
}
if replacements > 0 {
match std::fs::write(&resolved, &result) {
Ok(_) => format!("{replacements} replacement(s) applied to {path}"),
Err(e) => format!("Error writing {path}: {e}"),
}
} else {
format!("No matches found in {path}")
}
}
fn grep(args: &Value, root: &str) -> String {
let pattern = args["pattern"].as_str().unwrap_or("");
let path = args["path"].as_str().unwrap_or(".");
let resolved = resolve_path(path, root);
// Try rg first, fall back to grep
let output = Command::new("rg")
.args(["--no-heading", "--line-number", pattern, &resolved])
.output()
.or_else(|_| {
Command::new("grep")
.args(["-rn", pattern, &resolved])
.output()
});
match output {
Ok(o) => {
let stdout = String::from_utf8_lossy(&o.stdout);
if stdout.is_empty() {
format!("No matches for '{pattern}' in {path}")
} else {
// Truncate at a char boundary so multi-byte UTF-8 can't cause a slice panic
if stdout.len() > 8192 {
let mut end = 8192;
while !stdout.is_char_boundary(end) {
end -= 1;
}
format!("{}...\n(truncated)", &stdout[..end])
} else {
stdout.into()
}
}
}
Err(e) => format!("Error running grep: {e}"),
}
}
fn bash(args: &Value, root: &str) -> String {
let command = args["command"].as_str().unwrap_or("");
info!(command, "Executing bash command");
let output = Command::new("sh")
.args(["-c", command])
.current_dir(root)
.output();
match output {
Ok(o) => {
let stdout = String::from_utf8_lossy(&o.stdout);
let stderr = String::from_utf8_lossy(&o.stderr);
let mut result = String::new();
if !stdout.is_empty() {
result.push_str(&stdout);
}
if !stderr.is_empty() {
if !result.is_empty() {
result.push('\n');
}
result.push_str("stderr: ");
result.push_str(&stderr);
}
if !o.status.success() {
result.push_str(&format!("\nexit code: {}", o.status.code().unwrap_or(-1)));
}
if result.len() > 16384 {
// Truncate at a char boundary so multi-byte UTF-8 can't cause a slice panic
let mut end = 16384;
while !result.is_char_boundary(end) {
end -= 1;
}
format!("{}...\n(truncated)", &result[..end])
} else {
result
}
}
Err(e) => format!("Error: {e}"),
}
}
fn list_directory(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or(".");
let depth = args["depth"].as_u64().unwrap_or(1) as usize;
let resolved = resolve_path(path, root);
let mut entries = Vec::new();
list_dir_inner(Path::new(&resolved), Path::new(&resolved), 0, depth, &mut entries);
if entries.is_empty() {
format!("Empty directory: {path}")
} else {
entries.join("\n")
}
}
fn list_dir_inner(
base: &Path,
dir: &Path,
depth: usize,
max_depth: usize,
entries: &mut Vec<String>,
) {
if depth > max_depth {
return;
}
let Ok(read_dir) = std::fs::read_dir(dir) else {
return;
};
let mut items: Vec<_> = read_dir.filter_map(|e| e.ok()).collect();
items.sort_by_key(|e| e.file_name());
for entry in items {
let name = entry.file_name().to_string_lossy().to_string();
if name.starts_with('.') || name == "target" || name == "node_modules" || name == "vendor" {
continue;
}
let is_dir = entry.file_type().map(|t| t.is_dir()).unwrap_or(false);
let relative = entry
.path()
.strip_prefix(base)
.unwrap_or(&entry.path())
.to_string_lossy()
.to_string();
let prefix = " ".repeat(depth);
let marker = if is_dir { "/" } else { "" };
entries.push(format!("{prefix}{relative}{marker}"));
if is_dir {
list_dir_inner(base, &entry.path(), depth + 1, max_depth, entries);
}
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_resolve_path_relative() {
let resolved = resolve_path("src/main.rs", "/project");
assert_eq!(resolved, "/project/src/main.rs");
}
#[test]
fn test_resolve_path_absolute() {
let resolved = resolve_path("/etc/hosts", "/project");
assert_eq!(resolved, "/etc/hosts");
}
#[test]
fn test_file_read_nonexistent() {
let args = serde_json::json!({"path": "/nonexistent/file.txt"});
let result = file_read(&args, "/tmp");
assert!(result.contains("Error"));
}
#[test]
fn test_bash_echo() {
let args = serde_json::json!({"command": "echo hello"});
let result = bash(&args, "/tmp");
assert_eq!(result.trim(), "hello");
}
#[test]
fn test_bash_exit_code() {
let args = serde_json::json!({"command": "false"});
let result = bash(&args, "/tmp");
assert!(result.contains("exit code"));
}
}
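Both `grep` and `bash` above truncate long output at a fixed byte offset; slicing a `&str` at an arbitrary byte index panics if it lands inside a multi-byte UTF-8 character. A boundary-safe truncation can be sketched as follows (helper name is illustrative, not part of the module):

```rust
// Illustrative helper: truncate a &str to at most `max` bytes without
// splitting a multi-byte UTF-8 character (which would panic on slicing).
fn truncate_at_boundary(s: &str, max: usize) -> &str {
    if s.len() <= max {
        return s;
    }
    let mut end = max;
    // Walk back until the cut lands on a char boundary.
    while !s.is_char_boundary(end) {
        end -= 1;
    }
    &s[..end]
}
```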

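`search_replace` expects SEARCH/REPLACE blocks in the exact delimiter form parsed above (`<<<< SEARCH`, `=====`, `>>>>> REPLACE`). A standalone copy of the parsing loop, isolated from the filesystem, makes the block format easy to verify:

```rust
// Standalone copy of the SEARCH/REPLACE parsing used by search_replace,
// with the same delimiters, so the block format can be checked in isolation.
fn apply_blocks(content: &str, diff: &str) -> (String, usize) {
    let mut result = content.to_string();
    let mut replacements = 0;
    for block in diff.split("<<<< SEARCH\n").skip(1) {
        let parts: Vec<&str> = block.splitn(2, "=====\n").collect();
        if parts.len() != 2 {
            continue;
        }
        let search = parts[0].trim_end_matches('\n');
        let rest: Vec<&str> = parts[1].splitn(2, ">>>>> REPLACE").collect();
        let replace = rest[0].trim_end_matches('\n');
        if result.contains(search) {
            // Each block replaces at most one occurrence.
            result = result.replacen(search, replace, 1);
            replacements += 1;
        }
    }
    (result, replacements)
}
```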
838
sunbeam/src/code/tui.rs Normal file

@@ -0,0 +1,838 @@
use std::io;
use std::sync::{Arc, Mutex};
use crossterm::event::{self, Event, KeyCode, KeyEvent, KeyModifiers};
use crossterm::terminal::{self, EnterAlternateScreen, LeaveAlternateScreen};
use crossterm::execute;
use ratatui::backend::CrosstermBackend;
use ratatui::layout::{Constraint, Layout, Rect};
use ratatui::style::{Color, Modifier, Style};
use ratatui::text::{Line, Span, Text};
use ratatui::widgets::{Block, Borders, Paragraph, Wrap};
use ratatui::Terminal;
use tracing_subscriber::fmt::MakeWriter;
// ── Sol status messages (sun/fusion theme) ───────────────────────────────
const SOL_STATUS_MESSAGES: &[&str] = &[
"fusing hydrogen",
"solar flare",
"coronal mass",
"helium flash",
"photon escape",
"plasma arc",
"sunspot forming",
"chromosphere",
"radiating",
"nuclear fusion",
"proton chain",
"solar wind",
"burning bright",
"going nova",
"core ignition",
"stellar drift",
"dawn breaking",
"light bending",
"warmth spreading",
"horizon glow",
"golden hour",
"ray tracing",
"luminous flux",
"thermal bloom",
"heliosphere",
"magnetic storm",
"sun worship",
"solstice",
"perihelion",
"daybreak",
"photosphere",
"solar apex",
"corona pulse",
"neutrino bath",
"deuterium burn",
"kelvin climb",
"fusion yield",
"radiant heat",
"stellar core",
"light speed",
];
/// Pick a random status message for the generating indicator.
pub fn random_sol_status() -> &'static str {
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::time::SystemTime;
let mut hasher = DefaultHasher::new();
SystemTime::now()
.duration_since(SystemTime::UNIX_EPOCH)
.unwrap_or_default()
.as_millis()
.hash(&mut hasher);
let idx = hasher.finish() as usize % SOL_STATUS_MESSAGES.len();
SOL_STATUS_MESSAGES[idx]
}
// ── Sol color wave palette (warm amber gradient) ─────────────────────────
const WAVE_COLORS: &[(u8, u8, u8)] = &[
(255, 216, 0), // bright gold
(255, 197, 66), // sol yellow
(245, 175, 0), // amber
(232, 140, 30), // deep amber
(210, 110, 20), // burnt orange
];
/// Get the wave color for a character position at the current frame.
fn wave_color_at(pos: usize, frame: u64, text_len: usize) -> Color {
let total = text_len + 2; // text + padding
let cycle_len = total * 2; // bounce back and forth
let wave_pos = (frame as usize / 2) % cycle_len; // advance every 2 frames
let wave_pos = if wave_pos >= total {
cycle_len - wave_pos - 1 // bounce back
} else {
wave_pos
};
// Distance from wave front determines color index
let dist = if pos >= wave_pos { pos - wave_pos } else { wave_pos - pos };
let idx = dist.min(WAVE_COLORS.len() - 1);
let (r, g, b) = WAVE_COLORS[idx];
Color::Rgb(r, g, b)
}
// ── Sol color palette ──────────────────────────────────────────────────────
const SOL_YELLOW: Color = Color::Rgb(245, 197, 66);
const SOL_AMBER: Color = Color::Rgb(232, 168, 64);
const SOL_BLUE: Color = Color::Rgb(108, 166, 224);
const SOL_RED: Color = Color::Rgb(224, 88, 88);
const SOL_DIM: Color = Color::Rgb(138, 122, 90);
const SOL_GRAY: Color = Color::Rgb(112, 112, 112);
const SOL_FAINT: Color = Color::Rgb(80, 80, 80);
const SOL_APPROVAL_BG: Color = Color::Rgb(50, 42, 20);
const SOL_APPROVAL_CMD: Color = Color::Rgb(200, 180, 120);
// ── In-memory log buffer for tracing ─────────────────────────────────────
const LOG_BUFFER_CAPACITY: usize = 500;
#[derive(Clone)]
pub struct LogBuffer(Arc<Mutex<Vec<String>>>);
impl LogBuffer {
pub fn new() -> Self {
Self(Arc::new(Mutex::new(Vec::new())))
}
pub fn lines(&self) -> Vec<String> {
self.0.lock().unwrap().clone()
}
}
/// Writer that appends each line to the ring buffer.
pub struct LogBufferWriter(Arc<Mutex<Vec<String>>>);
impl io::Write for LogBufferWriter {
fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
let s = String::from_utf8_lossy(buf);
let mut lines = self.0.lock().unwrap();
for line in s.lines() {
if !line.is_empty() {
lines.push(line.to_string());
if lines.len() > LOG_BUFFER_CAPACITY {
lines.remove(0);
}
}
}
Ok(buf.len())
}
fn flush(&mut self) -> io::Result<()> {
Ok(())
}
}
impl<'a> MakeWriter<'a> for LogBuffer {
type Writer = LogBufferWriter;
fn make_writer(&'a self) -> Self::Writer {
LogBufferWriter(self.0.clone())
}
}
// ── Virtual viewport ─────────────────────────────────────────────────────
/// Cached pre-wrapped visual lines for the conversation log.
/// Text is wrapped using `textwrap` when content or width changes.
/// Drawing just slices the visible window — O(viewport), zero wrapping by ratatui.
pub struct Viewport {
/// Pre-wrapped visual lines (one Line per screen row). Already wrapped to width.
visual_lines: Vec<Line<'static>>,
/// Width used for the last wrap pass.
last_width: u16,
/// True when log content changed.
dirty: bool,
}
impl Viewport {
pub fn new() -> Self {
Self {
visual_lines: Vec::new(),
last_width: 0,
dirty: true,
}
}
pub fn invalidate(&mut self) {
self.dirty = true;
}
/// Total visual (screen) lines.
pub fn len(&self) -> u16 {
self.visual_lines.len() as u16
}
/// Rebuild pre-wrapped lines from log entries for a given width.
pub fn rebuild(&mut self, log: &[LogEntry], width: u16) {
let w = width.max(1) as usize;
self.visual_lines.clear();
for entry in log {
match entry {
LogEntry::UserInput(text) => {
self.visual_lines.push(Line::from(""));
// Wrap user input with "> " prefix
let prefixed = format!("> {text}");
for wrapped in wrap_styled(&prefixed, w, SOL_DIM, Color::White, 2) {
self.visual_lines.push(wrapped);
}
self.visual_lines.push(Line::from(""));
}
LogEntry::AssistantText(text) => {
// Render markdown to styled ratatui Lines
let md_text: Text<'_> = tui_markdown::from_str(text);
let base_style = Style::default().fg(SOL_YELLOW);
for line in md_text.lines {
// Apply base yellow color to spans that don't have explicit styling
let styled_spans: Vec<Span<'static>> = line
.spans
.into_iter()
.map(|span| {
let mut style = span.style;
if style.fg.is_none() {
style = style.fg(SOL_YELLOW);
}
Span::styled(span.content.into_owned(), style)
})
.collect();
let styled_line = Line::from(styled_spans);
// Wrap long lines
let line_width = styled_line.width();
if line_width <= w {
self.visual_lines.push(styled_line);
} else {
// For wrapped markdown lines, fall back to textwrap on the raw text
let raw: String = styled_line.spans.iter().map(|s| s.content.as_ref()).collect();
for wrapped in textwrap::wrap(&raw, w) {
self.visual_lines.push(Line::styled(wrapped.into_owned(), base_style));
}
}
}
}
LogEntry::ToolSuccess { name, detail } => {
self.visual_lines.push(Line::from(vec![
Span::styled("", Style::default().fg(SOL_BLUE)),
Span::styled(name.clone(), Style::default().fg(SOL_AMBER)),
Span::styled(format!(" {detail}"), Style::default().fg(SOL_DIM)),
]));
}
LogEntry::ToolExecuting { name, detail } => {
self.visual_lines.push(Line::from(vec![
Span::styled("", Style::default().fg(SOL_AMBER)),
Span::styled(name.clone(), Style::default().fg(SOL_AMBER)),
Span::styled(format!(" {detail}"), Style::default().fg(SOL_DIM)),
]));
}
LogEntry::ToolFailed { name, detail } => {
self.visual_lines.push(Line::from(vec![
Span::styled("", Style::default().fg(SOL_RED)),
Span::styled(name.clone(), Style::default().fg(SOL_RED)),
Span::styled(format!(" {detail}"), Style::default().fg(SOL_DIM)),
]));
}
LogEntry::ToolOutput { lines: output_lines, collapsed } => {
let show = if *collapsed { 5 } else { output_lines.len() };
let style = Style::default().fg(SOL_GRAY);
for line in output_lines.iter().take(show) {
self.visual_lines.push(Line::styled(format!(" {line}"), style));
}
if *collapsed && output_lines.len() > 5 {
self.visual_lines.push(Line::styled(
format!(" … +{} lines", output_lines.len() - 5),
Style::default().fg(SOL_FAINT),
));
}
}
LogEntry::Status(msg) => {
self.visual_lines.push(Line::styled(
format!(" [{msg}]"),
Style::default().fg(SOL_DIM),
));
}
LogEntry::Error(msg) => {
let style = Style::default().fg(SOL_RED);
for wrapped in textwrap::wrap(&format!(" error: {msg}"), w) {
self.visual_lines.push(Line::styled(wrapped.into_owned(), style));
}
}
}
}
self.dirty = false;
self.last_width = width;
}
/// Ensure lines are built for the given width. Rebuilds if width changed.
pub fn ensure(&mut self, log: &[LogEntry], width: u16) {
if self.dirty || self.last_width != width {
self.rebuild(log, width);
}
}
/// Get the visible slice of pre-wrapped lines for the scroll position.
/// Returns owned lines ready to render — NO wrapping by ratatui.
pub fn window(&self, height: u16, scroll_offset: u16) -> Vec<Line<'static>> {
let total = self.visual_lines.len() as u16;
let max_scroll = total.saturating_sub(height);
let scroll = if scroll_offset == u16::MAX {
max_scroll
} else {
scroll_offset.min(max_scroll)
};
let start = scroll as usize;
let end = (start + height as usize).min(self.visual_lines.len());
self.visual_lines[start..end].to_vec()
}
pub fn max_scroll(&self, height: u16) -> u16 {
(self.visual_lines.len() as u16).saturating_sub(height)
}
}
/// Wrap a "> text" line preserving the dim prefix style on the first line
/// and white text style for content. Returns pre-wrapped visual lines.
fn wrap_styled(text: &str, width: usize, prefix_color: Color, text_color: Color, prefix_len: usize) -> Vec<Line<'static>> {
let wrapped = textwrap::wrap(text, width);
let mut lines = Vec::with_capacity(wrapped.len());
for (i, w) in wrapped.iter().enumerate() {
let s = w.to_string();
if i == 0 && s.len() >= prefix_len {
// First line: split into styled prefix + text
lines.push(Line::from(vec![
Span::styled(s[..prefix_len].to_string(), Style::default().fg(prefix_color)),
Span::styled(s[prefix_len..].to_string(), Style::default().fg(text_color)),
]));
} else {
lines.push(Line::styled(s, Style::default().fg(text_color)));
}
}
lines
}
// ── Message types for the conversation log ─────────────────────────────────
#[derive(Clone)]
pub enum LogEntry {
UserInput(String),
AssistantText(String),
ToolSuccess { name: String, detail: String },
ToolExecuting { name: String, detail: String },
ToolFailed { name: String, detail: String },
ToolOutput { lines: Vec<String>, collapsed: bool },
Status(String),
Error(String),
}
// ── Approval state ─────────────────────────────────────────────────────────
pub struct ApprovalPrompt {
pub call_id: String,
pub tool_name: String,
pub command: String,
pub options: Vec<String>,
pub selected: usize,
}
// ── App state ──────────────────────────────────────────────────────────────
pub struct App {
pub log: Vec<LogEntry>,
pub viewport: Viewport,
pub input: String,
pub cursor_pos: usize,
pub scroll_offset: u16,
pub project_name: String,
pub branch: String,
pub model: String,
pub input_tokens: u32,
pub output_tokens: u32,
pub last_turn_tokens: u32,
pub approval: Option<ApprovalPrompt>,
pub is_thinking: bool,
pub sol_status: String,
pub sol_connected: bool,
pub thinking_since: Option<std::time::Instant>,
pub thinking_message: String,
pub should_quit: bool,
pub show_logs: bool,
pub log_buffer: LogBuffer,
pub log_scroll: u16,
pub command_history: Vec<String>,
pub history_index: Option<usize>,
pub input_saved: String,
pub needs_redraw: bool,
pub frame_count: u64,
}
impl App {
pub fn new(project_name: &str, branch: &str, model: &str, log_buffer: LogBuffer) -> Self {
Self {
log: Vec::new(),
viewport: Viewport::new(),
input: String::new(),
cursor_pos: 0,
scroll_offset: 0,
project_name: project_name.into(),
branch: branch.into(),
model: model.into(),
input_tokens: 0,
output_tokens: 0,
last_turn_tokens: 0,
approval: None,
is_thinking: false,
sol_status: String::new(),
sol_connected: true,
thinking_since: None,
thinking_message: String::new(),
should_quit: false,
show_logs: false,
log_buffer,
log_scroll: u16::MAX,
command_history: Vec::new(),
history_index: None,
input_saved: String::new(),
needs_redraw: true,
frame_count: 0,
}
}
pub fn push_log(&mut self, entry: LogEntry) {
self.log.push(entry);
self.viewport.invalidate();
self.scroll_offset = u16::MAX;
self.needs_redraw = true;
}
/// Batch-add log entries without per-entry viewport rebuilds.
pub fn push_logs(&mut self, entries: Vec<LogEntry>) {
self.log.extend(entries);
self.viewport.invalidate();
self.scroll_offset = u16::MAX;
self.needs_redraw = true;
}
/// Resolve the u16::MAX auto-scroll sentinel to the actual max scroll
/// position and clamp to the valid range. Call before applying any
/// relative scroll delta.
pub fn resolve_scroll(&mut self, width: u16, height: u16) {
self.viewport.ensure(&self.log, width);
let max = self.viewport.max_scroll(height);
if self.scroll_offset == u16::MAX {
self.scroll_offset = max;
} else {
self.scroll_offset = self.scroll_offset.min(max);
}
}
/// Load command history from a project's .sunbeam/history file.
pub fn load_history(&mut self, project_path: &str) {
let path = std::path::Path::new(project_path).join(".sunbeam").join("history");
if let Ok(contents) = std::fs::read_to_string(&path) {
self.command_history = contents.lines().map(String::from).collect();
}
}
/// Save command history to a project's .sunbeam/history file.
pub fn save_history(&self, project_path: &str) {
let dir = std::path::Path::new(project_path).join(".sunbeam");
let _ = std::fs::create_dir_all(&dir);
let path = dir.join("history");
// Keep last 500 entries
let start = self.command_history.len().saturating_sub(500);
let contents = self.command_history[start..].join("\n");
let _ = std::fs::write(&path, contents);
}
}
// ── Rendering ──────────────────────────────────────────────────────────────
pub fn draw(frame: &mut ratatui::Frame, app: &mut App) {
app.frame_count = app.frame_count.wrapping_add(1);
let area = frame.area();
// Layout: title (1) + log (flex) + input (3) — no status bar
let chunks = Layout::vertical([
Constraint::Length(1), // title bar (all system info)
Constraint::Min(5), // conversation log
Constraint::Length(3), // input area
])
.split(area);
draw_title_bar(frame, chunks[0], app);
if app.show_logs {
draw_debug_log(frame, chunks[1], app);
} else {
draw_log(frame, chunks[1], app);
}
if let Some(ref approval) = app.approval {
draw_approval(frame, chunks[2], approval);
} else {
draw_input(frame, chunks[2], app);
}
}
fn draw_title_bar(frame: &mut ratatui::Frame, area: Rect, app: &App) {
let health = if app.sol_connected { "☀️" } else { "⛈️" };
// Left: branding + project + branch
let left = vec![
Span::styled("sunbeam code", Style::default().fg(SOL_YELLOW).add_modifier(Modifier::BOLD)),
Span::styled(" · ", Style::default().fg(SOL_FAINT)),
Span::raw(&app.project_name),
Span::styled(" · ", Style::default().fg(SOL_FAINT)),
Span::styled(&app.branch, Style::default().fg(SOL_DIM)),
];
// Right: timer · status_wave · tokens · model · health
let mut right_parts: Vec<Span> = Vec::new();
if app.is_thinking {
// Elapsed timer first
if let Some(since) = app.thinking_since {
let elapsed = since.elapsed().as_secs();
right_parts.push(Span::styled(
format!("{elapsed}s "),
Style::default().fg(SOL_FAINT),
));
}
let status = if app.thinking_message.is_empty() {
"generating"
} else {
&app.thinking_message
};
let status_text = status.to_string();
// Per-character color wave + global dim/brighten pulse
let pulse = ((app.frame_count as f64 / 15.0).sin() + 1.0) / 2.0; // ranges 0.0..1.0
let text_len = status_text.chars().count();
for (i, ch) in status_text.chars().enumerate() {
let wave = wave_color_at(i, app.frame_count, text_len);
// Blend wave color with pulse brightness
let (wr, wg, wb) = match wave { Color::Rgb(r, g, b) => (r, g, b), _ => (245, 197, 66) };
let r = (wr as f64 * (0.4 + 0.6 * pulse)) as u8;
let g = (wg as f64 * (0.4 + 0.6 * pulse)) as u8;
let b = (wb as f64 * (0.4 + 0.6 * pulse)) as u8;
right_parts.push(Span::styled(
ch.to_string(),
Style::default().fg(Color::Rgb(r, g, b)).add_modifier(Modifier::BOLD),
));
}
right_parts.push(Span::styled(" · ", Style::default().fg(SOL_FAINT)));
}
// Token counters — context (last turn prompt) + total session tokens
let total = app.input_tokens + app.output_tokens;
if total > 0 {
right_parts.push(Span::styled(
format!("ctx:{} tot:{}", format_tokens(app.last_turn_tokens), format_tokens(total)),
Style::default().fg(SOL_DIM),
));
} else {
right_parts.push(Span::styled("", Style::default().fg(SOL_FAINT)));
}
right_parts.push(Span::styled(" · ", Style::default().fg(SOL_FAINT)));
right_parts.push(Span::styled(&app.model, Style::default().fg(SOL_DIM)));
right_parts.push(Span::styled(" ", Style::default().fg(SOL_FAINT)));
right_parts.push(Span::raw(health.to_string()));
let title_line = Line::from(left);
frame.render_widget(Paragraph::new(title_line), area);
let right_line = Line::from(right_parts);
let right_width = right_line.width() as u16 + 1;
let right_area = Rect {
x: area.width.saturating_sub(right_width),
y: area.y,
width: right_width,
height: 1,
};
frame.render_widget(Paragraph::new(right_line), right_area);
}
/// Format token count: 1234 → "1.2k", 123 → "123"
fn format_tokens(n: u32) -> String {
if n >= 1_000_000 {
format!("{:.1}M", n as f64 / 1_000_000.0)
} else if n >= 1_000 {
format!("{:.1}k", n as f64 / 1_000.0)
} else {
n.to_string()
}
}
fn draw_log(frame: &mut ratatui::Frame, area: Rect, app: &mut App) {
// Ensure pre-wrapped lines are built for current width
app.viewport.ensure(&app.log, area.width);
// Slice only the visible rows — O(viewport), no wrapping by ratatui
let window = app.viewport.window(area.height, app.scroll_offset);
frame.render_widget(Paragraph::new(window), area);
}
fn draw_debug_log(frame: &mut ratatui::Frame, area: Rect, app: &App) {
let log_lines = app.log_buffer.lines();
let lines: Vec<Line> = std::iter::once(
Line::from(Span::styled(
" debug log (Alt+L to close) ",
Style::default().fg(SOL_AMBER).add_modifier(Modifier::BOLD),
)),
)
.chain(log_lines.iter().map(|l| {
let color = if l.contains("ERROR") {
SOL_RED
} else if l.contains("WARN") {
SOL_YELLOW
} else {
SOL_GRAY
};
Line::from(Span::styled(l.as_str(), Style::default().fg(color)))
}))
.collect();
let total = lines.len() as u16;
let visible = area.height;
let max_scroll = total.saturating_sub(visible);
let scroll = if app.log_scroll == u16::MAX {
max_scroll
} else {
app.log_scroll.min(max_scroll)
};
let widget = Paragraph::new(Text::from(lines))
.wrap(Wrap { trim: false })
.scroll((scroll, 0));
frame.render_widget(widget, area);
}
fn draw_input(frame: &mut ratatui::Frame, area: Rect, app: &App) {
let input_block = Block::default()
.borders(Borders::TOP)
.border_style(Style::default().fg(SOL_FAINT));
let input_text = Line::from(vec![
Span::styled("> ", Style::default().fg(SOL_DIM)),
Span::raw(&app.input),
]);
let input_widget = Paragraph::new(input_text)
.block(input_block)
.wrap(Wrap { trim: false });
frame.render_widget(input_widget, area);
if !app.is_thinking {
// Only show cursor when not waiting for Sol
let cursor_x = area.x + 2 + app.cursor_pos as u16;
let cursor_y = area.y + 1;
frame.set_cursor_position((cursor_x, cursor_y));
}
}
fn draw_approval(frame: &mut ratatui::Frame, area: Rect, approval: &ApprovalPrompt) {
let block = Block::default()
.borders(Borders::TOP)
.border_style(Style::default().fg(SOL_FAINT));
let mut lines = vec![
Line::from(vec![
Span::styled("", Style::default().fg(SOL_YELLOW)),
Span::styled(&approval.tool_name, Style::default().fg(SOL_YELLOW).add_modifier(Modifier::BOLD)),
Span::styled(format!(" {}", approval.command), Style::default().fg(SOL_APPROVAL_CMD)),
]),
];
for (i, opt) in approval.options.iter().enumerate() {
let prefix = if i == approval.selected { "> " } else { "  " };
let style = if i == approval.selected {
Style::default().fg(SOL_YELLOW)
} else {
Style::default().fg(SOL_DIM)
};
lines.push(Line::from(Span::styled(format!("{prefix}{opt}"), style)));
}
let widget = Paragraph::new(Text::from(lines))
.block(block)
.style(Style::default().bg(SOL_APPROVAL_BG));
frame.render_widget(widget, area);
}
// ── Terminal setup/teardown ────────────────────────────────────────────────
pub fn setup_terminal() -> io::Result<Terminal<CrosstermBackend<io::Stdout>>> {
terminal::enable_raw_mode()?;
let mut stdout = io::stdout();
execute!(stdout, EnterAlternateScreen, crossterm::event::EnableMouseCapture)?;
let backend = CrosstermBackend::new(stdout);
Terminal::new(backend)
}
pub fn restore_terminal(terminal: &mut Terminal<CrosstermBackend<io::Stdout>>) -> io::Result<()> {
terminal::disable_raw_mode()?;
execute!(
terminal.backend_mut(),
LeaveAlternateScreen,
crossterm::event::DisableMouseCapture
)?;
terminal.show_cursor()?;
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_app_creation() {
let app = App::new("sol", "mainline", "devstral-2", LogBuffer::new());
assert_eq!(app.project_name, "sol");
assert!(!app.should_quit);
assert!(app.log.is_empty());
}
#[test]
fn test_push_log_auto_scrolls() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
app.scroll_offset = 0;
app.push_log(LogEntry::Status("test".into()));
assert_eq!(app.scroll_offset, u16::MAX); // auto-scroll to bottom
}
#[test]
fn test_color_constants() {
assert!(matches!(SOL_YELLOW, Color::Rgb(245, 197, 66)));
assert!(matches!(SOL_AMBER, Color::Rgb(232, 168, 64)));
assert!(matches!(SOL_BLUE, Color::Rgb(108, 166, 224)));
assert!(matches!(SOL_RED, Color::Rgb(224, 88, 88)));
// No green in the palette
assert!(!matches!(SOL_YELLOW, Color::Rgb(_, 255, _)));
assert!(!matches!(SOL_BLUE, Color::Rgb(_, 255, _)));
}
#[test]
fn test_log_entries_all_variants() {
let mut app = App::new("test", "main", "devstral-2", LogBuffer::new());
app.push_log(LogEntry::UserInput("hello".into()));
app.push_log(LogEntry::AssistantText("response".into()));
app.push_log(LogEntry::ToolSuccess { name: "file_read".into(), detail: "src/main.rs".into() });
app.push_log(LogEntry::ToolExecuting { name: "bash".into(), detail: "cargo test".into() });
app.push_log(LogEntry::ToolFailed { name: "grep".into(), detail: "no matches".into() });
app.push_log(LogEntry::ToolOutput { lines: vec!["line 1".into(), "line 2".into()], collapsed: true });
app.push_log(LogEntry::Status("thinking".into()));
app.push_log(LogEntry::Error("connection lost".into()));
assert_eq!(app.log.len(), 8);
}
#[test]
fn test_tool_output_collapse_threshold() {
// Collapsed output shows max 5 lines + "... +N lines"
let lines: Vec<String> = (0..20).map(|i| format!("line {i}")).collect();
let entry = LogEntry::ToolOutput { lines: lines.clone(), collapsed: true };
if let LogEntry::ToolOutput { lines, collapsed } = &entry {
assert!(lines.len() > 5);
assert!(*collapsed);
}
}
#[test]
fn test_approval_prompt() {
let approval = ApprovalPrompt {
call_id: "test-1".into(),
tool_name: "bash".into(),
command: "cargo test".into(),
options: vec![
"Yes".into(),
"Yes, always allow bash".into(),
"No".into(),
],
selected: 0,
};
assert_eq!(approval.options.len(), 3);
assert_eq!(approval.selected, 0);
}
#[test]
fn test_approval_navigation() {
let mut approval = ApprovalPrompt {
call_id: "test-2".into(),
tool_name: "bash".into(),
command: "rm -rf".into(),
options: vec!["Yes".into(), "No".into()],
selected: 0,
};
// Navigate down
approval.selected = (approval.selected + 1).min(approval.options.len() - 1);
assert_eq!(approval.selected, 1);
// Navigate down again (clamped)
approval.selected = (approval.selected + 1).min(approval.options.len() - 1);
assert_eq!(approval.selected, 1);
// Navigate up
approval.selected = approval.selected.saturating_sub(1);
assert_eq!(approval.selected, 0);
}
#[test]
fn test_thinking_state() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
assert!(!app.is_thinking);
app.is_thinking = true;
assert!(app.is_thinking);
}
#[test]
fn test_input_cursor() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
app.input = "hello world".into();
app.cursor_pos = 5;
assert_eq!(&app.input[..app.cursor_pos], "hello");
}
#[test]
fn test_token_tracking() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
app.input_tokens = 1200;
app.output_tokens = 340;
assert_eq!(app.input_tokens / 1000, 1);
assert_eq!(app.output_tokens / 1000, 0);
}
}

sunbeam/src/lib.rs

@@ -0,0 +1,2 @@
// Thin library export for integration tests.
pub mod code;


@@ -1,4 +1,5 @@
mod cli;
mod code;
#[tokio::main]
async fn main() {


@@ -0,0 +1,346 @@
/// Integration test: starts a mock gRPC server and connects the client.
/// Tests the full bidirectional stream lifecycle without needing Sol or Mistral.
use std::pin::Pin;
use std::sync::Arc;
use futures::Stream;
use sunbeam_proto::sunbeam_code_v1::code_agent_server::{CodeAgent, CodeAgentServer};
use sunbeam_proto::sunbeam_code_v1::*;
use tokio::sync::mpsc;
use tokio_stream::wrappers::ReceiverStream;
use tonic::{Request, Response, Status, Streaming};
/// Mock server that echoes back user input as assistant text.
struct MockCodeAgent;
#[tonic::async_trait]
impl CodeAgent for MockCodeAgent {
type SessionStream = Pin<Box<dyn Stream<Item = Result<ServerMessage, Status>> + Send>>;
async fn session(
&self,
request: Request<Streaming<ClientMessage>>,
) -> Result<Response<Self::SessionStream>, Status> {
let mut in_stream = request.into_inner();
let (tx, rx) = mpsc::channel(32);
tokio::spawn(async move {
// Wait for StartSession
if let Ok(Some(msg)) = in_stream.message().await {
if let Some(client_message::Payload::Start(start)) = msg.payload {
let _ = tx.send(Ok(ServerMessage {
payload: Some(server_message::Payload::Ready(SessionReady {
session_id: "test-session-123".into(),
room_id: "!test-room:local".into(),
model: if start.model.is_empty() {
"devstral-2".into()
} else {
start.model
},
resumed: false,
history: vec![],
})),
})).await;
}
}
// Echo loop
while let Ok(Some(msg)) = in_stream.message().await {
match msg.payload {
Some(client_message::Payload::Input(input)) => {
let _ = tx.send(Ok(ServerMessage {
payload: Some(server_message::Payload::Done(TextDone {
full_text: format!("[echo] {}", input.text),
input_tokens: 10,
output_tokens: 5,
})),
})).await;
}
Some(client_message::Payload::End(_)) => {
let _ = tx.send(Ok(ServerMessage {
payload: Some(server_message::Payload::End(SessionEnd {
summary: "Session ended.".into(),
})),
})).await;
break;
}
_ => {}
}
}
});
Ok(Response::new(Box::pin(ReceiverStream::new(rx))))
}
async fn reindex_code(&self, _req: Request<ReindexCodeRequest>) -> Result<Response<ReindexCodeResponse>, Status> {
Ok(Response::new(ReindexCodeResponse { repos_indexed: 0, symbols_indexed: 0, error: "mock".into() }))
}
}
#[tokio::test]
async fn test_session_lifecycle() {
// Start mock server on a random port
let listener = tokio::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
let addr = listener.local_addr().unwrap();
tokio::spawn(async move {
let incoming = tokio_stream::wrappers::TcpListenerStream::new(listener);
tonic::transport::Server::builder()
.add_service(CodeAgentServer::new(MockCodeAgent))
.serve_with_incoming(incoming)
.await
.unwrap();
});
// Give server a moment to start
tokio::time::sleep(std::time::Duration::from_millis(100)).await;
// Connect client
let endpoint = format!("http://{addr}");
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
let mut client = CodeAgentClient::connect(endpoint).await.unwrap();
let (tx, client_rx) = mpsc::channel::<ClientMessage>(32);
let client_stream = ReceiverStream::new(client_rx);
let response = client.session(client_stream).await.unwrap();
let mut rx = response.into_inner();
// Send StartSession
tx.send(ClientMessage {
payload: Some(client_message::Payload::Start(StartSession {
project_path: "/test/project".into(),
prompt_md: "test prompt".into(),
config_toml: String::new(),
git_branch: "main".into(),
git_status: String::new(),
file_tree: vec!["src/".into(), "Cargo.toml".into()],
model: "test-model".into(),
client_tools: vec![],
})),
}).await.unwrap();
// Receive SessionReady
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::Ready(ready)) => {
assert_eq!(ready.session_id, "test-session-123");
assert_eq!(ready.model, "test-model");
}
other => panic!("Expected SessionReady, got {other:?}"),
}
// Send a chat message
tx.send(ClientMessage {
payload: Some(client_message::Payload::Input(UserInput {
text: "hello sol".into(),
})),
}).await.unwrap();
// Receive echo response
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::Done(done)) => {
assert_eq!(done.full_text, "[echo] hello sol");
assert_eq!(done.input_tokens, 10);
assert_eq!(done.output_tokens, 5);
}
other => panic!("Expected TextDone, got {other:?}"),
}
// End session
tx.send(ClientMessage {
payload: Some(client_message::Payload::End(EndSession {})),
}).await.unwrap();
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::End(end)) => {
assert_eq!(end.summary, "Session ended.");
}
other => panic!("Expected SessionEnd, got {other:?}"),
}
}
#[tokio::test]
async fn test_multiple_messages() {
let listener = tokio::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
let addr = listener.local_addr().unwrap();
tokio::spawn(async move {
let incoming = tokio_stream::wrappers::TcpListenerStream::new(listener);
tonic::transport::Server::builder()
.add_service(CodeAgentServer::new(MockCodeAgent))
.serve_with_incoming(incoming)
.await
.unwrap();
});
tokio::time::sleep(std::time::Duration::from_millis(100)).await;
let endpoint = format!("http://{addr}");
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
let mut client = CodeAgentClient::connect(endpoint).await.unwrap();
let (tx, client_rx) = mpsc::channel::<ClientMessage>(32);
let client_stream = ReceiverStream::new(client_rx);
let response = client.session(client_stream).await.unwrap();
let mut rx = response.into_inner();
// Start
tx.send(ClientMessage {
payload: Some(client_message::Payload::Start(StartSession {
project_path: "/test".into(),
model: "devstral-2".into(),
..Default::default()
})),
}).await.unwrap();
let _ = rx.message().await.unwrap().unwrap(); // SessionReady
// Send 3 messages and verify each echo
for i in 0..3 {
tx.send(ClientMessage {
payload: Some(client_message::Payload::Input(UserInput {
text: format!("message {i}"),
})),
}).await.unwrap();
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::Done(done)) => {
assert_eq!(done.full_text, format!("[echo] message {i}"));
}
other => panic!("Expected TextDone for message {i}, got {other:?}"),
}
}
}
// ══════════════════════════════════════════════════════════════════════════
// LSP integration tests (requires rust-analyzer on PATH)
// ══════════════════════════════════════════════════════════════════════════
mod lsp_tests {
use sunbeam::code::lsp::detect;
use sunbeam::code::lsp::manager::LspManager;
use sunbeam::code::tools;
#[test]
fn test_detect_servers_in_cli_project() {
let configs = detect::detect_servers(".");
assert!(!configs.is_empty(), "Should detect at least one language server");
let rust = configs.iter().find(|c| c.language_id == "rust");
assert!(rust.is_some(), "Should detect Rust (Cargo.toml present)");
}
#[test]
fn test_is_lsp_tool() {
assert!(tools::is_lsp_tool("lsp_definition"));
assert!(tools::is_lsp_tool("lsp_references"));
assert!(tools::is_lsp_tool("lsp_hover"));
assert!(tools::is_lsp_tool("lsp_diagnostics"));
assert!(tools::is_lsp_tool("lsp_symbols"));
assert!(!tools::is_lsp_tool("file_read"));
assert!(!tools::is_lsp_tool("bash"));
}
#[tokio::test]
async fn test_lsp_manager_initialize_and_hover() {
// This test requires rust-analyzer on PATH
if std::process::Command::new("rust-analyzer").arg("--version").output().is_err() {
eprintln!("Skipping: rust-analyzer not on PATH");
return;
}
let mut manager = LspManager::new(".");
manager.initialize().await;
if !manager.is_available() {
eprintln!("Skipping: LSP initialization failed");
return;
}
// Hover on a known file in this project
let result = manager.hover("src/main.rs", 1, 1).await;
assert!(result.is_ok(), "Hover should not error: {:?}", result.err());
manager.shutdown().await;
}
#[tokio::test]
async fn test_lsp_document_symbols() {
if std::process::Command::new("rust-analyzer").arg("--version").output().is_err() {
eprintln!("Skipping: rust-analyzer not on PATH");
return;
}
let mut manager = LspManager::new(".");
manager.initialize().await;
if !manager.is_available() {
eprintln!("Skipping: LSP initialization failed");
return;
}
let result = manager.document_symbols("src/main.rs").await;
assert!(result.is_ok(), "Document symbols should not error: {:?}", result.err());
let symbols = result.unwrap();
assert!(!symbols.is_empty(), "Should find symbols in main.rs");
// main.rs should have at least a `main` function
assert!(
symbols.to_lowercase().contains("main"),
"Should find main function, got: {symbols}"
);
manager.shutdown().await;
}
#[tokio::test]
async fn test_lsp_workspace_symbols() {
if std::process::Command::new("rust-analyzer").arg("--version").output().is_err() {
eprintln!("Skipping: rust-analyzer not on PATH");
return;
}
let mut manager = LspManager::new(".");
manager.initialize().await;
if !manager.is_available() {
eprintln!("Skipping: LSP initialization failed");
return;
}
// Wait for rust-analyzer to finish indexing (workspace symbols need full index)
let mut found = false;
for attempt in 0..10 {
tokio::time::sleep(std::time::Duration::from_secs(1)).await;
let result = manager.workspace_symbols("CodeCommand", None).await;
if let Ok(ref symbols) = result {
if symbols.contains("CodeCommand") {
found = true;
break;
}
}
if attempt == 9 {
eprintln!("Skipping: rust-analyzer did not finish indexing within 10s");
manager.shutdown().await;
return;
}
}
assert!(found, "Should eventually find CodeCommand in workspace");
manager.shutdown().await;
}
#[tokio::test]
async fn test_lsp_graceful_degradation() {
// Use a non-existent binary
let mut manager = LspManager::new("/nonexistent/path");
manager.initialize().await;
assert!(!manager.is_available(), "Should not be available with bad path");
}
}