19 Commits

Author SHA1 Message Date
b7dfdb18e0 fix: escape pipe chars in mermaid TUI diagram 2026-03-24 13:07:15 +00:00
7f5c27a868 docs: add coding agent section to README and docs index 2026-03-24 13:02:16 +00:00
789a08a353 docs: add sunbeam code terminal coding agent documentation
Comprehensive doc covering project discovery, symbol indexing, tool
execution with permissions, LSP auto-detection, TUI layout, session
resumption, reindex-code command, and the three-layer architecture.
2026-03-24 12:58:51 +00:00
04f10d2794 feat: sunbeam reindex-code CLI verb + ReindexCode proto
Proto: ReindexCode RPC with org/repo/branch filters.
CLI: sunbeam reindex-code [--org studio] [--repo owner/name] [--endpoint ...]
Calls Sol's gRPC ReindexCode endpoint, prints indexed symbol count.
2026-03-24 09:38:02 +00:00
8726e8fbe7 feat(lsp): client-side LSP toolkit with 5 tools + integration tests
LSP client (lsp/client.rs):
- JSON-RPC framing over subprocess stdio
- Async request/response with oneshot channels
- Background read loop routing responses to pending requests
- 30s timeout per request, graceful shutdown
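The stdio framing here is the LSP base protocol: each JSON-RPC message is prefixed by a `Content-Length` header and a blank line. A minimal std-only sketch of that framing (generic over `Read`/`Write`; illustrative, not the crate's actual API):

```rust
use std::io::{BufRead, BufReader, Read, Write};

/// Write one JSON-RPC message with LSP base-protocol framing:
/// a Content-Length header, a blank line, then the JSON body.
fn write_message(w: &mut impl Write, body: &str) -> std::io::Result<()> {
    write!(w, "Content-Length: {}\r\n\r\n{}", body.len(), body)
}

/// Read one framed message: parse headers until the blank line,
/// then read exactly Content-Length bytes of body.
fn read_message(r: &mut impl BufRead) -> std::io::Result<String> {
    let mut len = 0usize;
    loop {
        let mut line = String::new();
        r.read_line(&mut line)?;
        let line = line.trim_end();
        if line.is_empty() {
            break; // end of headers
        }
        if let Some(v) = line.strip_prefix("Content-Length:") {
            len = v.trim().parse().unwrap_or(0);
        }
    }
    let mut buf = vec![0u8; len];
    r.read_exact(&mut buf)?;
    Ok(String::from_utf8_lossy(&buf).into_owned())
}
```

Round-tripping through an in-memory buffer exercises both sides; the real client points these at the language server's stdin/stdout.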

LSP manager (lsp/manager.rs):
- Auto-detect: Cargo.toml → rust-analyzer, package.json → tsserver,
  pyproject.toml → pyright, go.mod → gopls
- Initialize handshake, lazy textDocument/didOpen
- High-level methods: definition, references, hover, document_symbols,
  workspace_symbols
- Graceful degradation when binary not on PATH

LSP tools (tools.rs):
- lsp_definition, lsp_references, lsp_hover, lsp_diagnostics, lsp_symbols
- execute_lsp() async dispatch, is_lsp_tool() check
- All routed as ToolSide::Client in orchestrator

Tool schemas registered in Sol's build_tool_definitions() for Mistral.

Integration tests (6 new):
- Language detection for Rust project
- is_lsp_tool routing
- LSP initialize + hover on src/main.rs
- Document symbols (finds main function)
- Workspace symbols with retry (waits for rust-analyzer indexing)
- Graceful degradation with bad project path
2026-03-24 00:58:05 +00:00
73d7d6c15b feat(code): tree-sitter symbol extraction + auto-indexing
Symbol extraction (symbols.rs):
- tree-sitter parsers for Rust, TypeScript, Python
- Extracts: functions, structs, enums, traits, classes, interfaces
- Signatures, docstrings, line ranges for each symbol
- extract_project_symbols() walks project directory
- Skips hidden/vendor/target/node_modules, files >100KB
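The skip rules above amount to a path filter applied during the walk. A sketch under those rules (std-only; `should_index` is an illustrative name, not the crate's API):

```rust
use std::path::Path;

/// Decide whether a file is eligible for symbol extraction:
/// skip hidden/vendor/build directories and files over 100KB.
fn should_index(path: &Path, size_bytes: u64) -> bool {
    const MAX_SIZE: u64 = 100 * 1024; // files >100KB are skipped
    if size_bytes > MAX_SIZE {
        return false;
    }
    // Reject paths with a hidden or vendored component anywhere.
    !path.components().any(|c| {
        let name = c.as_os_str().to_string_lossy();
        name.starts_with('.') || matches!(name.as_ref(), "vendor" | "target" | "node_modules")
    })
}
```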

Proto: IndexSymbols + SymbolEntry messages for client→server symbol relay

Client: after SessionReady, extracts symbols and sends IndexSymbols
to Sol for indexing into the code search index.

14 unit tests for symbol extraction across Rust/TS/Python.
2026-03-24 00:42:03 +00:00
c6d6dbe5c8 fix(tests): update mock SessionReady with resumed + history fields 2026-03-23 21:45:03 +00:00
32f6ebacea feat(tui): wire approval prompt with key handlers
- ApprovalPrompt gains call_id for routing decisions
- Up/Down navigates options, Enter selects
- "yes, always allow {tool}" sends ApprovedAlways
- Input/cursor blocked while approval prompt is active
- AgentEvent::ApprovalNeeded populates the prompt
2026-03-23 21:35:35 +00:00
5f1fb09abb feat(client): emit ChatEvent::ToolCall with approval metadata
ToolCall events carry call_id, name, args, needs_approval — agent
layer uses these to route through the permission/approval flow.
2026-03-23 21:34:57 +00:00
8e73d52776 feat(agent): approval channel + per-tool permission checks
- ApprovalDecision enum (Approved/Denied/ApprovedAlways)
- Approval channel (crossbeam) from TUI to agent loop
- Agent checks config.permission_for() on each client tool call
- "always" auto-executes, "never" auto-denies, "ask" prompts
- ApprovedAlways upgrades session permission for future calls
- Unit tests for permissions, decisions, error messages
2026-03-23 21:27:10 +00:00
e06f74ed5e feat(config): permission_for() + upgrade_to_always()
LoadedConfig gains methods for tool approval policy:
- permission_for(tool_name) → "always" | "ask" | "never"
- upgrade_to_always(tool_name) — session-only override
2026-03-23 21:24:33 +00:00
d7c5a677da feat(code): friendly errors, batch history, persistent command history
- Agent errors sanitized: raw hyper/h2/gRPC dumps replaced with
  human-readable messages ("sol disconnected", "connection lost", etc.)
- Batch history loading: single viewport rebuild instead of per-entry
- Persistent command history: saved to .sunbeam/history, loaded on start
- Default model: mistral-medium-latest (personality adherence)
2026-03-23 17:08:24 +00:00
8b4f187d1b feat(code): async agent bus, virtual viewport, event drain
- Agent service (crossbeam channels): TUI never blocks on gRPC I/O.
  Chat runs on a background tokio task, events flow back via bounded
  crossbeam channel. Designed as a library-friendly internal RPC.

- Virtual viewport: pre-wrap text with textwrap on content/width change,
  slice only visible rows for rendering. Paragraph gets no Wrap, no
  scroll() — pure O(viewport) per frame.
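The pre-wrap-then-slice idea can be sketched with a naive character wrap standing in for textwrap (illustrative types, not the TUI's actual code):

```rust
/// Pre-wrapped line cache: wrap once when content or width changes,
/// then each frame only slices the visible rows.
struct Viewport {
    wrapped: Vec<String>,
}

impl Viewport {
    fn rebuild(content: &str, width: usize) -> Self {
        let mut wrapped = Vec::new();
        for line in content.lines() {
            if line.is_empty() {
                wrapped.push(String::new());
                continue;
            }
            // Naive character-based wrap; the real code uses textwrap.
            let chars: Vec<char> = line.chars().collect();
            for chunk in chars.chunks(width) {
                wrapped.push(chunk.iter().collect());
            }
        }
        Viewport { wrapped }
    }

    /// O(viewport) per frame: just slice the visible rows.
    fn visible(&self, scroll: usize, height: usize) -> &[String] {
        let end = (scroll + height).min(self.wrapped.len());
        &self.wrapped[scroll.min(end)..end]
    }
}
```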

- Event drain loop: coalesce all queued terminal events before drawing.
  Filters MouseEventKind::Moved (crossterm's EnableMouseCapture floods
  these via ?1003h any-event tracking). Single redraw per batch.
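The coalescing drain can be sketched with a std `mpsc` channel standing in for the crossterm event queue (the real loop uses `crossterm::event::poll`/`read`; names here are illustrative):

```rust
use std::sync::mpsc::Receiver;

#[derive(Debug, PartialEq)]
enum Event {
    Key(char),
    MouseMoved, // floods under ?1003h any-event tracking; filtered out
    Resize,
}

/// Drain everything currently queued, dropping MouseMoved noise.
/// The caller applies the whole batch, then redraws once.
fn drain_events(rx: &Receiver<Event>) -> Vec<Event> {
    let mut batch = Vec::new();
    while let Ok(ev) = rx.try_recv() {
        if ev != Event::MouseMoved {
            batch.push(ev);
        }
    }
    batch
}
```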

- Conditional drawing: skip frames when nothing changed (needs_redraw).

- Mouse wheel + PageUp/Down + Home/End scrolling, command history
  (Up/Down, persistent to .sunbeam/history), Alt+L debug log overlay.

- Proto: SessionReady now includes history entries + resumed flag.
  Session resume loads conversation from Matrix room on reconnect.

- Default model: devstral-small-latest (was devstral-small-2506).
2026-03-23 15:57:15 +00:00
cc9f169264 feat(code): wire TUI into real code path, /exit, color swap
- user input: white text, dim > prompt
- sol responses: warm yellow
- /exit slash command quits cleanly
- TUI replaces stdin loop in sunbeam code start
- hidden demo mode for testing (sunbeam code demo)
2026-03-23 12:53:34 +00:00
02e4d7fb37 feat(code): CLI client with gRPC connection + local tools
phase 3 client core:
- sunbeam code subcommand with project discovery, config loading
- gRPC client connects to Sol, starts bidirectional session
- 6 client-side tool executors: file_read, file_write, search_replace,
  grep, bash, list_directory

- project context: .sunbeam/prompt.md, .sunbeam/config.toml, git info
- tool permission config (always/ask/never per tool)
- simple stdin loop (ratatui TUI in phase 4)
- aligned sunbeam-proto to tonic 0.14
2026-03-23 11:57:24 +00:00
f3e67e589b feat(code): add sunbeam-proto crate with gRPC service definition
shared protobuf definitions for the sunbeam code agent:
- CodeAgent service with bidirectional Session streaming
- ClientMessage: StartSession, UserInput, ToolResult, ToolApproval
- ServerMessage: TextDelta, ToolCall, ApprovalNeeded, Status
- ToolDef for client-side tool registration

this is the transport layer between `sunbeam code` (TUI client)
and Sol (server-side agent loop). next: JWT middleware for OIDC
auth, Sol gRPC server stub, CLI subcommand stub.
2026-03-23 11:12:51 +00:00
13e3f5d42e fix opensearch pod resolution + sol-agent vault policy
os_api: resolve pod name by label instead of hardcoded opensearch-0.
added find_pod_by_label helper to kube.rs.

secrets.py: sol-agent policy (read/write sol-tokens/*) and k8s auth
role bound to matrix namespace default SA.
2026-03-23 08:48:33 +00:00
faf525522c feat: async SunbeamClient factory with unified auth resolution
SunbeamClient accessors are now async and resolve auth per-client:
- SSO bearer (get_token) for admin APIs, Matrix, La Suite, OpenSearch
- Gitea PAT (get_gitea_token) for VCS
- None for Prometheus, Loki, S3, LiveKit

Fixes client URLs to match deployed routes: hydra→hydra.{domain},
matrix→messages.{domain}, grafana→metrics.{domain},
prometheus→systemmetrics.{domain}, loki→systemlogs.{domain}.

Removes all ad-hoc token helpers from CLI modules (matrix_with_token,
os_client, people_client, etc). Every dispatch just calls
client.service().await?.
2026-03-22 18:57:22 +00:00
34647e6bcb feat: seed Sol agent vault policy + gitea creds, bump v1.0.1
Patches gitea admin credentials into secret/sol for Sol's Gitea
integration. Adds sol-agent vault policy with read/write access
to sol-tokens/* for user impersonation PATs, plus k8s auth role
bound to the matrix namespace.
2026-03-22 13:46:15 +00:00
51 changed files with 6420 additions and 264 deletions

.sunbeam/history (new file, 78 lines)

@@ -0,0 +1,78 @@
hmm
just testing the ux
/exit
/exit
hmm, scrolling is very slow. needs to be async
/exit
/exit
/exit
/exit
/exit
/exit
/exit
[<35;52;20M/exit
/exit
/exit
hey you
who are you?
hmm.
that's not right.
you're supposed to be `sol`
what's on page 2 of hackernews today?
don't you have fetch tooling?
/exit
hey.
hey
/exot
/exit
hey
say hello from sunbeam code!
can you search the web for me and tell me what's on page 2 of hackernews?
/exit
hey boo
tell me about yourself
can you please do some googling and some research to see if i can use devstral-medium as an agent?
/exit
/exit
hey can you give me a full demo of markdown text please? i'm accessing you from a terminal and want to make sure my text processing is working as expected
no i mean give me a bunch of markdown artifacts that i can use to test the markdown renderer with
i don't want you to write a file, i want you to output it as text.
DO NOT FUCKING NEST IT IN MARKDOWN BLOCKS YOU STUPID FUCKING CHUD JUST GIVE ME RAW FUCKING MARKDOWN
imagine you are a terminal output and you needed to output exactly what the user is asking for, which are native markdown tokens. do that
that's exactly what i needed, thank you
/exit
hey
can you please do a deep dive into the mechanics of black holes?
i would like a technical breakdown
/exit
yeah, run me though paradox resolutions
/exit
yeah please dive into the most recent 3-4 so i can understand them
/exit
hello
/exit
tell me about astrophysics
give me a deep dive into black holes
/exit
go deeper into the paradoxes
yeah zoom in on ER=EPR
hey
/exit
yes please do
/exit
hey you.
what's up?
how are you today?
what's the weather in amsterdam right now?
/exit
hey.
what can you tell me about black holes?
/exit
yo dawg
/exit
yo dawg
/exit
hey beautiful
yes
idk, mostly i'm just tryna figure your ui/ux out. cuz you know you're a coding bot in this context, yeah?
what are you up to?

Cargo.lock (generated, 1030 lines)

File diff suppressed because it is too large

View File

@@ -1,3 +1,3 @@
 [workspace]
-members = ["sunbeam-sdk", "sunbeam"]
+members = ["sunbeam-sdk", "sunbeam", "sunbeam-proto"]
 resolver = "3"

View File

@@ -157,6 +157,17 @@ sunbeam check # Run all functional probes
 sunbeam check devtools # Scoped to namespace
 ```
+### Coding Agent
+```bash
+sunbeam code # Terminal coding agent (connects to Sol via gRPC)
+sunbeam code start --model devstral-small # Override model
+sunbeam code demo # Demo TUI without Sol connection
+sunbeam reindex-code --org studio # Index repos into Sol's code search
+```
+See [docs/sol-code.md](docs/sol-code.md) for full documentation.
 ### Passthrough
 ```bash

View File

@@ -57,6 +57,7 @@ sunbeam logs ory/kratos
 ## Documentation Structure
 - **[CLI Reference](cli-reference)**: Complete command reference
+- **[Sol Code](sol-code)**: Terminal coding agent powered by Sol
 - **[Core Modules](core-modules)**: Detailed module documentation
 - **[Architecture](architecture)**: System architecture and design
 - **[Usage Examples](usage-examples)**: Practical usage scenarios

docs/sol-code.md (new file, 205 lines)

@@ -0,0 +1,205 @@
# sunbeam code — Terminal Coding Agent
`sunbeam code` is a terminal-based coding agent powered by Sol. It connects to Sol's gRPC `CodeAgent` service and provides an interactive TUI for writing code, asking questions, and executing tools — with Sol handling the AI reasoning and the CLI handling local file operations.
## Quick Start
```bash
sunbeam code # start a session (auto-detects project)
sunbeam code start --model devstral-small # override the model
sunbeam code start --endpoint http://sol:50051 # custom Sol endpoint
sunbeam code demo # demo the TUI without Sol
```
## How It Works
```mermaid
sequenceDiagram
participant User
participant TUI as sunbeam code TUI
participant Agent as Background Agent
participant Sol as Sol gRPC
User->>TUI: sunbeam code
TUI->>TUI: Discover project context
TUI->>Agent: Spawn background tasks
Agent->>Sol: StartSession (project, capabilities)
Agent->>Sol: IndexSymbols (tree-sitter symbols)
Sol-->>Agent: SessionReady (session_id, model)
Agent-->>TUI: Connected
User->>TUI: Type message, press Enter
TUI->>Agent: Chat request
Agent->>Sol: UserInput (text)
loop Tool calls
Sol-->>Agent: ToolCall (is_local=true)
Agent->>Agent: Check permissions
alt needs approval
Agent-->>TUI: Show approval prompt
User->>TUI: yes / always / no
TUI->>Agent: Decision
end
Agent->>Agent: Execute tool locally
Agent->>Sol: ToolResult
end
Sol-->>Agent: TextDone (response + tokens)
Agent-->>TUI: Display response
```
## Project Discovery
On startup, the CLI discovers project context from the current working directory:
- **Project name** — directory basename
- **Custom instructions** — `.sunbeam/prompt.md` (injected into Sol's system prompt)
- **Tool configuration** — `.sunbeam/config.toml` (model + tool permissions)
- **Git state** — current branch + `git status --short`
- **File tree** — recursive scan (max depth 2, skips `target/`, `node_modules/`, hidden dirs)
All of this is sent to Sol in the `StartSession` message so it has full project context.
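A minimal sketch of this discovery step, assuming only std (the struct and function names are illustrative, not the CLI's actual types):

```rust
use std::fs;
use std::path::Path;

/// Gather optional per-project context. Missing files are simply
/// absent; discovery never fails the session.
struct ProjectContext {
    name: String,
    prompt_md: Option<String>,
    config_toml: Option<String>,
}

fn discover(root: &Path) -> ProjectContext {
    ProjectContext {
        // Project name is the directory basename.
        name: root
            .file_name()
            .map(|n| n.to_string_lossy().into_owned())
            .unwrap_or_else(|| "unknown".into()),
        prompt_md: fs::read_to_string(root.join(".sunbeam/prompt.md")).ok(),
        config_toml: fs::read_to_string(root.join(".sunbeam/config.toml")).ok(),
    }
}
```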
## Symbol Indexing
After connecting, the CLI extracts code symbols from the project using tree-sitter and sends them to Sol via `IndexSymbols`. Sol indexes these in OpenSearch for code search during the session.
Supported languages:
- **Rust** — functions, structs, enums, traits
- **TypeScript/JavaScript** — functions, classes, interfaces, types
- **Python** — functions, classes, methods
Each symbol includes name, kind, signature, docstring, line numbers, and a preview of the body.
## Tool Execution
Sol decides which tools to call. Tools marked `is_local=true` execute on your machine; everything else runs on the server.
### Client-Side Tools
| Tool | Default Permission | Description |
|------|-------------------|-------------|
| `file_read` | always | Read file contents (with optional line ranges) |
| `file_write` | ask | Write or create files |
| `search_replace` | ask | Apply SEARCH/REPLACE diffs to files |
| `grep` | always | Search files with ripgrep or grep |
| `bash` | ask | Execute shell commands |
| `list_directory` | always | List directory tree (with depth limit) |
### LSP Tools
Auto-detected based on project files:
| Project File | Server |
|-------------|--------|
| `Cargo.toml` | `rust-analyzer` |
| `package.json` or `tsconfig.json` | `typescript-language-server` |
| `pyproject.toml`, `setup.py`, `requirements.txt` | `pyright-langserver` |
| `go.mod` | `gopls` |
LSP tools: `lsp_definition`, `lsp_references`, `lsp_hover`, `lsp_diagnostics`, `lsp_symbols`. These are advertised as client capabilities in `StartSession` — Sol only registers tools for LSP servers the client can actually spawn.
### Server-Side Tools
Sol can also call its own server-side tools during coding sessions: `search_code`, `search_archive`, `search_web`, `research`, and others. These execute on Sol's side — no local action needed.
## Tool Permissions
Configure in `.sunbeam/config.toml`:
```toml
[model]
name = "devstral-2" # override default model
[tools]
file_read = "always" # always, ask, never
file_write = "ask"
bash = "never" # block shell commands entirely
search_replace = "ask"
grep = "always"
list_directory = "always"
```
Permissions:
- **`always`** — execute immediately, no prompt
- **`ask`** — show approval prompt with three choices: *yes*, *yes, always allow*, *no*
- **`never`** — deny without prompting; Sol receives an error response
Choosing "yes, always allow" upgrades the permission to `always` for the rest of the session (in-memory only).
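The semantics above can be sketched as a small resolver (std-only sketch; `LoadedConfig`'s real implementation differs):

```rust
use std::collections::HashMap;

/// Session "always" upgrades win over the config file;
/// unknown tools default to "ask".
struct Permissions {
    config: HashMap<String, String>, // from .sunbeam/config.toml
    session_always: Vec<String>,     // in-memory upgrades only
}

impl Permissions {
    fn permission_for(&self, tool: &str) -> &str {
        if self.session_always.iter().any(|t| t == tool) {
            return "always";
        }
        self.config.get(tool).map(String::as_str).unwrap_or("ask")
    }

    fn upgrade_to_always(&mut self, tool: &str) {
        self.session_always.push(tool.to_string());
    }
}
```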
## TUI
```mermaid
flowchart TD
subgraph Layout
title["Title Bar<br/>project, branch, model, tokens, connection"]
conversation["Conversation Area<br/>user + assistant messages,<br/>tool output, status"]
input["Input Bar<br/>current line"]
end
title --> conversation
conversation --> input
```
**Key bindings:**
| Key | Action |
|-----|--------|
| Enter | Send message |
| Ctrl+C | Quit |
| Alt+L | Toggle debug log view |
| Up/Down | Navigate input history |
| Page Up/Down | Scroll conversation |
The TUI shows real-time status updates as Sol thinks and executes tools. Approval prompts appear inline when a tool needs permission.
## Session Resumption
Sessions are tied to a project path + git branch. If a session already exists for the current context, Sol resumes it — the TUI loads conversation history and you can continue where you left off.
## Code Reindexing
Separately from coding sessions, you can trigger repo indexing into Sol's code search:
```bash
sunbeam reindex-code # all repos
sunbeam reindex-code --org studio # specific org
sunbeam reindex-code --repo studio/sol --branch main # specific repo + branch
```
This calls Sol's `ReindexCode` gRPC endpoint, which walks Gitea repos, extracts symbols via tree-sitter, and indexes them to OpenSearch.
## Architecture
The `sunbeam code` command is structured as three concurrent layers:
```mermaid
flowchart TD
subgraph "Main Thread"
tui[TUI Event Loop<br/>ratatui, 50ms poll]
end
subgraph "Tokio Runtime"
agent[Agent Loop<br/>chat processing,<br/>tool execution]
heartbeat[Heartbeat<br/>1s ping to Sol]
end
subgraph "Sol (remote)"
grpc_service[gRPC CodeAgent]
orchestrator[Orchestrator]
mistral[Mistral AI]
end
tui <--> |crossbeam channels| agent
agent <--> |gRPC stream| grpc_service
heartbeat --> |health check| grpc_service
grpc_service --> orchestrator
orchestrator --> mistral
```
- **TUI** (main thread) — Ratatui event loop, renders conversation, handles input, shows tool approval prompts
- **Agent** (tokio task) — Manages the gRPC session, executes client-side tools, bridges between TUI and Sol via crossbeam channels
- **Heartbeat** (tokio task) — Pings Sol every second, updates the connection indicator in the title bar
The TUI never blocks on network calls. All gRPC communication happens in the agent task, with events flowing back via bounded channels.
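The never-block property can be sketched with a std thread and bounded channels standing in for the tokio task and crossbeam channels (illustrative only, not the actual module layout):

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

/// UI sends a request, the agent thread answers over a bounded
/// channel, and the UI only ever does non-blocking reads.
fn bridge_demo() -> Vec<String> {
    let (req_tx, req_rx) = sync_channel::<String>(16);
    let (ev_tx, ev_rx) = sync_channel::<String>(16);

    let agent = thread::spawn(move || {
        // Stand-in for the gRPC round trip to Sol.
        for input in req_rx {
            let _ = ev_tx.send(format!("echo: {input}"));
        }
    });

    req_tx.send("hello".to_string()).unwrap();
    drop(req_tx); // close the channel so the agent thread exits
    agent.join().unwrap();

    // UI side: drain whatever events are ready without blocking.
    let mut events = Vec::new();
    while let Ok(ev) = ev_rx.try_recv() {
        events.push(ev);
    }
    events
}
```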

markdown_test_artifacts.md (new file, 163 lines)

@@ -0,0 +1,163 @@
# markdown test artifacts
---
## 1. headers
# h1: the quick brown fox
## h2: jumps over
### h3: the lazy dog
#### h4: 42
##### h5: why not
###### h6: minimum viable header
---
## 2. text formatting
**bold**, *italic*, ***bold italic***, ~~strikethrough~~, `inline code`, ==highlight== (if supported).
---
## 3. lists
### unordered
- top level
  - nested
    - deeply nested
- back to top
### ordered
1. first
   1. nested first
   2. nested second
2. second
3. third
### task lists
- [ ] unchecked
- [x] checked
- [ ] partially done (if supported)
---
## 4. code blocks
### inline `code` example
### fenced blocks
```python
def factorial(n):
    return 1 if n <= 1 else n * factorial(n - 1)
```
```bash
# shebang test
#!/bin/bash
echo "hello world"
```
```
plaintext with no language
preserves spaces and newlines
```
---
## 5. tables
| syntax | description | test |
|-------------|-------------|------|
| header | title | here |
| paragraph | text | more |
| `code` | **bold** | *italics* |
---
## 6. blockquotes
> single line
> multi-line
> continuation
>> nested blockquote
---
## 7. horizontal rules
text
---
text
***
text
___
---
## 8. links & images
[regular link](https://example.com)
[reference-style link][1]
[1]: https://example.com "title"
![image alt](https://via.placeholder.com/150 "placeholder")
---
## 9. footnotes
here's a footnote[^1].
[^1]: this is the footnote text.
---
## 10. html (if supported)
<span style="color: red">red text</span>
<br>
<button disabled>interactive (but not here)</button>
---
## 11. edge cases
### whitespace
line with irregular spaces
### unicode
emoji: 🚀 ✨ 🦊
symbols: ← ↑ → ↓ ↔ ↕ ⇄ ⇅
math: 30° ½ ¼ ¾ ± × ÷ ≠ ≤ ≥ ≈ ∞
### escapes
\*not bold\* \`not code\` \[not a link\](https://example.com)
### empty elements
[]
()
{}
---
## 12. mixed nesting
1. ordered item
   > with a blockquote
   > - and a nested list
2. another item
   ```
   code block inside list
   ```
---
## 13. long content
lorem ipsum dolor sit amet, consectetur adipiscing elit. sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.
---
## 14. definition lists (if supported)
term 1
: definition 1
term 2
: definition 2a
: definition 2b
---
## 15. math (if supported)
$E = mc^2$
$$\int_a^b f(x) dx$$

View File

@@ -305,6 +305,25 @@ pub async fn create_secret(ns: &str, name: &str, data: HashMap<String, String>)
     Ok(())
 }
+
+/// Find the first Running pod matching a label selector in a namespace.
+pub async fn find_pod_by_label(ns: &str, label: &str) -> Option<String> {
+    let client = get_client().await.ok()?;
+    let pods: kube::Api<k8s_openapi::api::core::v1::Pod> =
+        kube::Api::namespaced(client, ns);
+    let lp = kube::api::ListParams::default().labels(label);
+    let pod_list = pods.list(&lp).await.ok()?;
+    pod_list
+        .items
+        .iter()
+        .find(|p| {
+            p.status
+                .as_ref()
+                .and_then(|s| s.phase.as_deref())
+                == Some("Running")
+        })
+        .and_then(|p| p.metadata.name.clone())
+}

 /// Execute a command in a pod and return (exit_code, stdout).
 #[allow(dead_code)]
 pub async fn kube_exec(

View File

@@ -475,10 +475,16 @@ async fn os_api(path: &str, method: &str, body: Option<&str>) -> Option<String>
         curl_args.extend_from_slice(&["-H", "Content-Type: application/json", "-d", &body_string]);
     }

-    // Build the full exec command: exec deploy/opensearch -n data -c opensearch -- curl ...
-    let exec_cmd = curl_args;
+    // Resolve the actual pod name from the app=opensearch label
+    let pod_name = match crate::kube::find_pod_by_label("data", "app=opensearch").await {
+        Some(name) => name,
+        None => {
+            crate::output::warn("No OpenSearch pod found in data namespace");
+            return None;
+        }
+    };

-    match crate::kube::kube_exec("data", "opensearch-0", &exec_cmd, Some("opensearch")).await {
+    match crate::kube::kube_exec("data", &pod_name, &curl_args, Some("opensearch")).await {
         Ok((0, out)) if !out.is_empty() => Some(out),
         _ => None,
     }

sunbeam-proto/Cargo.toml (new file, 14 lines)

@@ -0,0 +1,14 @@
[package]
name = "sunbeam-proto"
version = "0.1.0"
edition = "2024"
description = "Shared protobuf definitions for Sunbeam gRPC services"
[dependencies]
tonic = "0.14"
tonic-prost = "0.14"
prost = "0.14"
[build-dependencies]
tonic-build = "0.14"
tonic-prost-build = "0.14"

sunbeam-proto/build.rs (new file, 4 lines)

@@ -0,0 +1,4 @@
fn main() -> Result<(), Box<dyn std::error::Error>> {
    tonic_prost_build::compile_protos("proto/code.proto")?;
    Ok(())
}

View File

@@ -0,0 +1,160 @@
syntax = "proto3";
package sunbeam.code.v1;
// Sol's coding agent service. Bidirectional streaming between
// the `sunbeam code` TUI client and Sol's server-side agent loop.
service CodeAgent {
  rpc Session(stream ClientMessage) returns (stream ServerMessage);
  rpc ReindexCode(ReindexCodeRequest) returns (ReindexCodeResponse);
}

message ReindexCodeRequest {
  string org = 1;    // optional: filter to an org (empty = all)
  string repo = 2;   // optional: specific repo (empty = all)
  string branch = 3; // optional: specific branch (empty = default)
}

message ReindexCodeResponse {
  uint32 repos_indexed = 1;
  uint32 symbols_indexed = 2;
  string error = 3; // empty on success
}

// ── Client → Sol ───────────────────────────────────────────────

message ClientMessage {
  oneof payload {
    StartSession start = 1;
    UserInput input = 2;
    ToolResult tool_result = 3;
    ToolApproval approval = 4;
    EndSession end = 5;
    IndexSymbols index_symbols = 6;
  }
}

message IndexSymbols {
  string project_name = 1;
  string branch = 2;
  repeated SymbolEntry symbols = 3;
}

message SymbolEntry {
  string file_path = 1;
  string name = 2;
  string kind = 3;
  string signature = 4;
  string docstring = 5;
  int32 start_line = 6;
  int32 end_line = 7;
  string language = 8;
  string content = 9;
}

message StartSession {
  string project_path = 1;
  string prompt_md = 2;
  string config_toml = 3;
  string git_branch = 4;
  string git_status = 5;
  repeated string file_tree = 6;
  string model = 7;
  repeated ToolDef client_tools = 8;
}

message UserInput {
  string text = 1;
}

message ToolResult {
  string call_id = 1;
  string result = 2;
  bool is_error = 3;
}

message ToolApproval {
  string call_id = 1;
  bool approved = 2;
}

message EndSession {}

// ── Sol → Client ───────────────────────────────────────────────

message ServerMessage {
  oneof payload {
    SessionReady ready = 1;
    TextDelta delta = 2;
    TextDone done = 3;
    ToolCall tool_call = 4;
    ApprovalNeeded approval = 5;
    Status status = 6;
    SessionEnd end = 7;
    Error error = 8;
  }
}

message SessionReady {
  string session_id = 1;
  string room_id = 2;
  string model = 3;
  bool resumed = 4;
  repeated HistoryEntry history = 5;
}

message HistoryEntry {
  string role = 1; // "user" or "assistant"
  string content = 2;
}

message TextDelta {
  string text = 1;
}

message TextDone {
  string full_text = 1;
  uint32 input_tokens = 2;
  uint32 output_tokens = 3;
}

message ToolCall {
  string call_id = 1;
  string name = 2;
  string args_json = 3;
  bool is_local = 4;
  bool needs_approval = 5;
}

message ApprovalNeeded {
  string call_id = 1;
  string name = 2;
  string args_json = 3;
  string summary = 4;
}

message Status {
  string message = 1;
  StatusKind kind = 2;
}

enum StatusKind {
  INFO = 0;
  TOOL_RUNNING = 1;
  TOOL_DONE = 2;
  THINKING = 3;
}

message SessionEnd {
  string summary = 1;
}

message Error {
  string message = 1;
  bool fatal = 2;
}

message ToolDef {
  string name = 1;
  string description = 2;
  string schema_json = 3;
}

sunbeam-proto/src/lib.rs (new file, 3 lines)

@@ -0,0 +1,3 @@
pub mod sunbeam_code_v1 {
    tonic::include_proto!("sunbeam.code.v1");
}

View File

@@ -1,6 +1,6 @@
 [package]
 name = "sunbeam-sdk"
-version = "1.0.0"
+version = "1.0.1"
 edition = "2024"
 description = "Sunbeam SDK — reusable library for cluster management"
 repository = "https://src.sunbeam.pt/studio/cli"

View File

@@ -29,9 +29,9 @@ impl ServiceClient for HydraClient {
 }

 impl HydraClient {
-    /// Build a HydraClient from domain (e.g. `https://auth.{domain}`).
+    /// Build a HydraClient from domain (e.g. `https://hydra.{domain}`).
     pub fn connect(domain: &str) -> Self {
-        let base_url = format!("https://auth.{domain}");
+        let base_url = format!("https://hydra.{domain}");
         Self::from_parts(base_url, AuthMethod::None)
     }
@@ -467,7 +467,7 @@ mod tests {
     #[test]
     fn test_connect_url() {
         let c = HydraClient::connect("sunbeam.pt");
-        assert_eq!(c.base_url(), "https://auth.sunbeam.pt");
+        assert_eq!(c.base_url(), "https://hydra.sunbeam.pt");
         assert_eq!(c.service_name(), "hydra");
     }

View File

@@ -7,7 +7,7 @@
 use crate::error::{Result, ResultExt, SunbeamError};
 use reqwest::Method;
 use serde::{de::DeserializeOwned, Serialize};
-use std::sync::OnceLock;
+use tokio::sync::OnceCell;

 // ---------------------------------------------------------------------------
 // AuthMethod
@@ -222,51 +222,51 @@ impl HttpTransport {
 /// Unified entry point for all service clients.
 ///
 /// Lazily constructs and caches per-service clients from the active config
-/// context. Each accessor returns a `&Client` reference, constructing on
-/// first call via [`OnceLock`].
+/// context. Each accessor resolves auth and returns a `&Client` reference,
+/// constructing on first call via [`OnceCell`] (async-aware).
+///
+/// Auth is resolved per-client:
+/// - SSO bearer (`get_token()`) — admin APIs, Matrix, La Suite, OpenSearch
+/// - Gitea PAT (`get_gitea_token()`) — Gitea
+/// - None — Prometheus, Loki, S3, LiveKit
 pub struct SunbeamClient {
     ctx: crate::config::Context,
     domain: String,
-    // Phase 1
     #[cfg(feature = "identity")]
-    kratos: OnceLock<crate::identity::KratosClient>,
+    kratos: OnceCell<crate::identity::KratosClient>,
     #[cfg(feature = "identity")]
-    hydra: OnceLock<crate::auth::hydra::HydraClient>,
+    hydra: OnceCell<crate::auth::hydra::HydraClient>,
-    // Phase 2
     #[cfg(feature = "gitea")]
-    gitea: OnceLock<crate::gitea::GiteaClient>,
+    gitea: OnceCell<crate::gitea::GiteaClient>,
-    // Phase 3
     #[cfg(feature = "matrix")]
-    matrix: OnceLock<crate::matrix::MatrixClient>,
+    matrix: OnceCell<crate::matrix::MatrixClient>,
     #[cfg(feature = "opensearch")]
-    opensearch: OnceLock<crate::search::OpenSearchClient>,
+    opensearch: OnceCell<crate::search::OpenSearchClient>,
     #[cfg(feature = "s3")]
-    s3: OnceLock<crate::storage::S3Client>,
+    s3: OnceCell<crate::storage::S3Client>,
     #[cfg(feature = "livekit")]
-    livekit: OnceLock<crate::media::LiveKitClient>,
+    livekit: OnceCell<crate::media::LiveKitClient>,
     #[cfg(feature = "monitoring")]
-    prometheus: OnceLock<crate::monitoring::PrometheusClient>,
+    prometheus: OnceCell<crate::monitoring::PrometheusClient>,
     #[cfg(feature = "monitoring")]
-    loki: OnceLock<crate::monitoring::LokiClient>,
+    loki: OnceCell<crate::monitoring::LokiClient>,
     #[cfg(feature = "monitoring")]
-    grafana: OnceLock<crate::monitoring::GrafanaClient>,
+    grafana: OnceCell<crate::monitoring::GrafanaClient>,
-    // Phase 4
     #[cfg(feature = "lasuite")]
-    people: OnceLock<crate::lasuite::PeopleClient>,
+    people: OnceCell<crate::lasuite::PeopleClient>,
     #[cfg(feature = "lasuite")]
-    docs: OnceLock<crate::lasuite::DocsClient>,
+    docs: OnceCell<crate::lasuite::DocsClient>,
     #[cfg(feature = "lasuite")]
-    meet: OnceLock<crate::lasuite::MeetClient>,
+    meet: OnceCell<crate::lasuite::MeetClient>,
     #[cfg(feature = "lasuite")]
-    drive: OnceLock<crate::lasuite::DriveClient>,
+    drive: OnceCell<crate::lasuite::DriveClient>,
     #[cfg(feature = "lasuite")]
-    messages: OnceLock<crate::lasuite::MessagesClient>,
+    messages: OnceCell<crate::lasuite::MessagesClient>,
     #[cfg(feature = "lasuite")]
-    calendars: OnceLock<crate::lasuite::CalendarsClient>,
+    calendars: OnceCell<crate::lasuite::CalendarsClient>,
     #[cfg(feature = "lasuite")]
-    find: OnceLock<crate::lasuite::FindClient>,
+    find: OnceCell<crate::lasuite::FindClient>,
-    // Bao/Planka stay in their existing modules
-    bao: OnceLock<crate::openbao::BaoClient>,
+    bao: OnceCell<crate::openbao::BaoClient>,
 }

 impl SunbeamClient {
@@ -276,40 +276,40 @@ impl SunbeamClient {
domain: ctx.domain.clone(), domain: ctx.domain.clone(),
ctx: ctx.clone(), ctx: ctx.clone(),
#[cfg(feature = "identity")] #[cfg(feature = "identity")]
kratos: OnceLock::new(), kratos: OnceCell::new(),
#[cfg(feature = "identity")] #[cfg(feature = "identity")]
hydra: OnceLock::new(), hydra: OnceCell::new(),
#[cfg(feature = "gitea")] #[cfg(feature = "gitea")]
gitea: OnceLock::new(), gitea: OnceCell::new(),
#[cfg(feature = "matrix")] #[cfg(feature = "matrix")]
matrix: OnceLock::new(), matrix: OnceCell::new(),
#[cfg(feature = "opensearch")] #[cfg(feature = "opensearch")]
opensearch: OnceLock::new(), opensearch: OnceCell::new(),
#[cfg(feature = "s3")] #[cfg(feature = "s3")]
s3: OnceLock::new(), s3: OnceCell::new(),
#[cfg(feature = "livekit")] #[cfg(feature = "livekit")]
livekit: OnceLock::new(), livekit: OnceCell::new(),
#[cfg(feature = "monitoring")] #[cfg(feature = "monitoring")]
prometheus: OnceLock::new(), prometheus: OnceCell::new(),
#[cfg(feature = "monitoring")] #[cfg(feature = "monitoring")]
loki: OnceLock::new(), loki: OnceCell::new(),
#[cfg(feature = "monitoring")] #[cfg(feature = "monitoring")]
grafana: OnceLock::new(), grafana: OnceCell::new(),
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
people: OnceLock::new(), people: OnceCell::new(),
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
docs: OnceLock::new(), docs: OnceCell::new(),
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
meet: OnceLock::new(), meet: OnceCell::new(),
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
drive: OnceLock::new(), drive: OnceCell::new(),
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
messages: OnceLock::new(), messages: OnceCell::new(),
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
calendars: OnceLock::new(), calendars: OnceCell::new(),
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
find: OnceLock::new(), find: OnceCell::new(),
bao: OnceLock::new(), bao: OnceCell::new(),
} }
} }
@@ -323,131 +323,172 @@ impl SunbeamClient {
&self.ctx &self.ctx
} }
// -- Lazy accessors (each feature-gated) -------------------------------- // -- Auth helpers --------------------------------------------------------
/// Get cached SSO bearer token (from `sunbeam auth sso`).
async fn sso_token(&self) -> Result<String> {
crate::auth::get_token().await
}
/// Get cached Gitea PAT (from `sunbeam auth git`).
fn gitea_token(&self) -> Result<String> {
crate::auth::get_gitea_token()
}
// -- Lazy async accessors (each feature-gated) ---------------------------
//
// Each accessor resolves the appropriate auth and constructs the client
// with from_parts(url, auth). Cached after first call.
#[cfg(feature = "identity")] #[cfg(feature = "identity")]
pub fn kratos(&self) -> &crate::identity::KratosClient { pub async fn kratos(&self) -> Result<&crate::identity::KratosClient> {
self.kratos.get_or_init(|| { self.kratos.get_or_try_init(|| async {
crate::identity::KratosClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://id.{}", self.domain);
Ok(crate::identity::KratosClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "identity")] #[cfg(feature = "identity")]
pub fn hydra(&self) -> &crate::auth::hydra::HydraClient { pub async fn hydra(&self) -> Result<&crate::auth::hydra::HydraClient> {
self.hydra.get_or_init(|| { self.hydra.get_or_try_init(|| async {
crate::auth::hydra::HydraClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://hydra.{}", self.domain);
Ok(crate::auth::hydra::HydraClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "gitea")] #[cfg(feature = "gitea")]
pub fn gitea(&self) -> &crate::gitea::GiteaClient { pub async fn gitea(&self) -> Result<&crate::gitea::GiteaClient> {
self.gitea.get_or_init(|| { self.gitea.get_or_try_init(|| async {
crate::gitea::GiteaClient::connect(&self.domain) let token = self.gitea_token()?;
}) let url = format!("https://src.{}/api/v1", self.domain);
Ok(crate::gitea::GiteaClient::from_parts(url, AuthMethod::Token(token)))
}).await
} }
#[cfg(feature = "matrix")] #[cfg(feature = "matrix")]
pub fn matrix(&self) -> &crate::matrix::MatrixClient { pub async fn matrix(&self) -> Result<&crate::matrix::MatrixClient> {
self.matrix.get_or_init(|| { self.matrix.get_or_try_init(|| async {
crate::matrix::MatrixClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://messages.{}/_matrix", self.domain);
Ok(crate::matrix::MatrixClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "opensearch")] #[cfg(feature = "opensearch")]
pub fn opensearch(&self) -> &crate::search::OpenSearchClient { pub async fn opensearch(&self) -> Result<&crate::search::OpenSearchClient> {
self.opensearch.get_or_init(|| { self.opensearch.get_or_try_init(|| async {
crate::search::OpenSearchClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://search.{}", self.domain);
Ok(crate::search::OpenSearchClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "s3")] #[cfg(feature = "s3")]
pub fn s3(&self) -> &crate::storage::S3Client { pub async fn s3(&self) -> Result<&crate::storage::S3Client> {
self.s3.get_or_init(|| { self.s3.get_or_try_init(|| async {
crate::storage::S3Client::connect(&self.domain) Ok(crate::storage::S3Client::connect(&self.domain))
}) }).await
} }
#[cfg(feature = "livekit")] #[cfg(feature = "livekit")]
pub fn livekit(&self) -> &crate::media::LiveKitClient { pub async fn livekit(&self) -> Result<&crate::media::LiveKitClient> {
self.livekit.get_or_init(|| { self.livekit.get_or_try_init(|| async {
crate::media::LiveKitClient::connect(&self.domain) Ok(crate::media::LiveKitClient::connect(&self.domain))
}) }).await
} }
#[cfg(feature = "monitoring")] #[cfg(feature = "monitoring")]
pub fn prometheus(&self) -> &crate::monitoring::PrometheusClient { pub async fn prometheus(&self) -> Result<&crate::monitoring::PrometheusClient> {
self.prometheus.get_or_init(|| { self.prometheus.get_or_try_init(|| async {
crate::monitoring::PrometheusClient::connect(&self.domain) Ok(crate::monitoring::PrometheusClient::connect(&self.domain))
}) }).await
} }
#[cfg(feature = "monitoring")] #[cfg(feature = "monitoring")]
pub fn loki(&self) -> &crate::monitoring::LokiClient { pub async fn loki(&self) -> Result<&crate::monitoring::LokiClient> {
self.loki.get_or_init(|| { self.loki.get_or_try_init(|| async {
crate::monitoring::LokiClient::connect(&self.domain) Ok(crate::monitoring::LokiClient::connect(&self.domain))
}) }).await
} }
#[cfg(feature = "monitoring")] #[cfg(feature = "monitoring")]
pub fn grafana(&self) -> &crate::monitoring::GrafanaClient { pub async fn grafana(&self) -> Result<&crate::monitoring::GrafanaClient> {
self.grafana.get_or_init(|| { self.grafana.get_or_try_init(|| async {
crate::monitoring::GrafanaClient::connect(&self.domain) Ok(crate::monitoring::GrafanaClient::connect(&self.domain))
}) }).await
} }
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
pub fn people(&self) -> &crate::lasuite::PeopleClient { pub async fn people(&self) -> Result<&crate::lasuite::PeopleClient> {
self.people.get_or_init(|| { self.people.get_or_try_init(|| async {
crate::lasuite::PeopleClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://people.{}/api/v1.0", self.domain);
Ok(crate::lasuite::PeopleClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
pub fn docs(&self) -> &crate::lasuite::DocsClient { pub async fn docs(&self) -> Result<&crate::lasuite::DocsClient> {
self.docs.get_or_init(|| { self.docs.get_or_try_init(|| async {
crate::lasuite::DocsClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://docs.{}/api/v1.0", self.domain);
Ok(crate::lasuite::DocsClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
pub fn meet(&self) -> &crate::lasuite::MeetClient { pub async fn meet(&self) -> Result<&crate::lasuite::MeetClient> {
self.meet.get_or_init(|| { self.meet.get_or_try_init(|| async {
crate::lasuite::MeetClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://meet.{}/api/v1.0", self.domain);
Ok(crate::lasuite::MeetClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
pub fn drive(&self) -> &crate::lasuite::DriveClient { pub async fn drive(&self) -> Result<&crate::lasuite::DriveClient> {
self.drive.get_or_init(|| { self.drive.get_or_try_init(|| async {
crate::lasuite::DriveClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://drive.{}/api/v1.0", self.domain);
Ok(crate::lasuite::DriveClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
pub fn messages(&self) -> &crate::lasuite::MessagesClient { pub async fn messages(&self) -> Result<&crate::lasuite::MessagesClient> {
self.messages.get_or_init(|| { self.messages.get_or_try_init(|| async {
crate::lasuite::MessagesClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://mail.{}/api/v1.0", self.domain);
Ok(crate::lasuite::MessagesClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
pub fn calendars(&self) -> &crate::lasuite::CalendarsClient { pub async fn calendars(&self) -> Result<&crate::lasuite::CalendarsClient> {
self.calendars.get_or_init(|| { self.calendars.get_or_try_init(|| async {
crate::lasuite::CalendarsClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://calendar.{}/api/v1.0", self.domain);
Ok(crate::lasuite::CalendarsClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
#[cfg(feature = "lasuite")] #[cfg(feature = "lasuite")]
pub fn find(&self) -> &crate::lasuite::FindClient { pub async fn find(&self) -> Result<&crate::lasuite::FindClient> {
self.find.get_or_init(|| { self.find.get_or_try_init(|| async {
crate::lasuite::FindClient::connect(&self.domain) let token = self.sso_token().await?;
}) let url = format!("https://find.{}/api/v1.0", self.domain);
Ok(crate::lasuite::FindClient::from_parts(url, AuthMethod::Bearer(token)))
}).await
} }
pub fn bao(&self, base_url: &str) -> &crate::openbao::BaoClient { pub async fn bao(&self) -> Result<&crate::openbao::BaoClient> {
self.bao.get_or_init(|| { self.bao.get_or_try_init(|| async {
crate::openbao::BaoClient::new(base_url) let token = self.sso_token().await?;
}) let url = format!("https://vault.{}", self.domain);
Ok(crate::openbao::BaoClient::with_token(&url, &token))
}).await
} }
} }
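The accessor change above replaces `OnceLock::get_or_init` (infallible) with an async, fallible `get_or_try_init`: if token lookup fails, nothing is cached and the next call retries; once a client is built successfully, every later call reuses it. A minimal std-only sketch of that semantics (a synchronous analog on top of `std::sync::OnceLock`; the real code awaits an async initializer inside an async-aware once cell):

```rust
use std::sync::OnceLock;

// Sync analog of get_or_try_init: a failed init is NOT cached (so callers
// can retry), while the first successful value is cached for all later calls.
fn get_or_try_init<T, E>(
    cell: &OnceLock<T>,
    init: impl FnOnce() -> Result<T, E>,
) -> Result<&T, E> {
    if let Some(v) = cell.get() {
        return Ok(v); // already initialized: reuse the cached client
    }
    let v = init()?; // on Err nothing is stored, so the next call retries
    Ok(cell.get_or_init(|| v)) // first successful writer wins
}
```

Note this sketch can race under concurrency (two threads may both run `init`); the async cell used in the diff serializes initializers, which is part of why the swap from `OnceLock` was needed.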

View File

@@ -2,9 +2,9 @@
 use clap::Subcommand;
+use crate::client::SunbeamClient;
 use crate::error::{Result, SunbeamError};
 use crate::gitea::types::*;
-use crate::gitea::GiteaClient;
 use crate::output::{render, render_list, read_json_input, OutputFormat};
 // ---------------------------------------------------------------------------
@@ -435,7 +435,8 @@ fn notification_row(n: &Notification) -> Vec<String> {
 // Dispatch
 // ---------------------------------------------------------------------------
-pub async fn dispatch(cmd: VcsCommand, client: &GiteaClient, fmt: OutputFormat) -> Result<()> {
+pub async fn dispatch(cmd: VcsCommand, client: &SunbeamClient, fmt: OutputFormat) -> Result<()> {
+    let client = client.gitea().await?;
     match cmd {
         // -- Repo -----------------------------------------------------------
         VcsCommand::Repo { action } => match action {

View File

@@ -349,7 +349,7 @@ pub async fn dispatch(
         AuthCommand::Courier { action } => dispatch_courier(action, client, output).await,
         // -- Kratos: Health -----------------------------------------------------
         AuthCommand::Health => {
-            let status = client.kratos().alive().await?;
+            let status = client.kratos().await?.alive().await?;
             output::render(&status, output)
         }
         // -- Hydra: Client ------------------------------------------------------
@@ -384,7 +384,7 @@ async fn dispatch_identity(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let kratos = client.kratos();
+    let kratos = client.kratos().await?;
     match action {
         IdentityAction::List { page, page_size } => {
             let items = kratos.list_identities(page, page_size).await?;
@@ -437,7 +437,7 @@ async fn dispatch_session(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let kratos = client.kratos();
+    let kratos = client.kratos().await?;
     match action {
         SessionAction::List {
             page_size,
@@ -486,7 +486,7 @@ async fn dispatch_recovery(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let kratos = client.kratos();
+    let kratos = client.kratos().await?;
     match action {
         RecoveryAction::CreateCode { id, expires_in } => {
             let item = kratos
@@ -512,7 +512,7 @@ async fn dispatch_schema(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let kratos = client.kratos();
+    let kratos = client.kratos().await?;
     match action {
         SchemaAction::List => {
             let items = kratos.list_schemas().await?;
@@ -539,7 +539,7 @@ async fn dispatch_courier(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let kratos = client.kratos();
+    let kratos = client.kratos().await?;
     match action {
         CourierAction::List {
             page_size,
@@ -579,7 +579,7 @@ async fn dispatch_client(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let hydra = client.hydra();
+    let hydra = client.hydra().await?;
     match action {
         ClientAction::List { limit, offset } => {
             let items = hydra.list_clients(limit, offset).await?;
@@ -631,7 +631,7 @@ async fn dispatch_jwk(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let hydra = client.hydra();
+    let hydra = client.hydra().await?;
     match action {
         JwkAction::List { set_name } => {
             let item = hydra.get_jwk_set(&set_name).await?;
@@ -665,7 +665,7 @@ async fn dispatch_issuer(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let hydra = client.hydra();
+    let hydra = client.hydra().await?;
     match action {
         IssuerAction::List => {
             let items = hydra.list_trusted_issuers().await?;
@@ -711,7 +711,7 @@ async fn dispatch_token(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let hydra = client.hydra();
+    let hydra = client.hydra().await?;
     match action {
         TokenAction::Introspect { token } => {
             let item = hydra.introspect_token(&token).await?;

View File

@@ -308,6 +308,25 @@ pub async fn create_secret(ns: &str, name: &str, data: HashMap<String, String>)
     Ok(())
 }
+/// Find the first Running pod matching a label selector in a namespace.
+pub async fn find_pod_by_label(ns: &str, label: &str) -> Option<String> {
+    let client = get_client().await.ok()?;
+    let pods: kube::Api<k8s_openapi::api::core::v1::Pod> =
+        kube::Api::namespaced(client.clone(), ns);
+    let lp = kube::api::ListParams::default().labels(label);
+    let pod_list = pods.list(&lp).await.ok()?;
+    pod_list
+        .items
+        .iter()
+        .find(|p| {
+            p.status
+                .as_ref()
+                .and_then(|s| s.phase.as_deref())
+                == Some("Running")
+        })
+        .and_then(|p| p.metadata.name.clone())
+}
 /// Execute a command in a pod and return (exit_code, stdout).
 #[allow(dead_code)]
 pub async fn kube_exec(
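The selection logic in the new helper is a plain iterator scan: take the first pod whose status phase is "Running" and return its name. A kube-free sketch of that filter over simple (name, phase) pairs (the tuple shape here is illustrative, not the k8s-openapi types):

```rust
// Mirror of find_pod_by_label's filter, minus the Kubernetes API call:
// pick the first pod whose reported phase is "Running".
fn first_running_pod(pods: &[(&str, Option<&str>)]) -> Option<String> {
    pods.iter()
        .find(|(_, phase)| *phase == Some("Running")) // skip Pending/unknown pods
        .map(|(name, _)| name.to_string())
}
```

Returning `Option` (rather than `Result`) matches the diff: a missing or not-yet-Running pod degrades to `None`, and the caller decides how to report it.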

View File

@@ -6,44 +6,6 @@ use crate::client::SunbeamClient;
 use crate::error::Result;
 use crate::output::{self, OutputFormat};
-// ═══════════════════════════════════════════════════════════════════════════
-// Helper: build an authenticated La Suite client
-// ═══════════════════════════════════════════════════════════════════════════
-async fn people_client(domain: &str) -> Result<super::PeopleClient> {
-    let token = crate::auth::get_token().await?;
-    Ok(super::PeopleClient::connect(domain).with_token(&token))
-}
-async fn docs_client(domain: &str) -> Result<super::DocsClient> {
-    let token = crate::auth::get_token().await?;
-    Ok(super::DocsClient::connect(domain).with_token(&token))
-}
-async fn meet_client(domain: &str) -> Result<super::MeetClient> {
-    let token = crate::auth::get_token().await?;
-    Ok(super::MeetClient::connect(domain).with_token(&token))
-}
-async fn drive_client(domain: &str) -> Result<super::DriveClient> {
-    let token = crate::auth::get_token().await?;
-    Ok(super::DriveClient::connect(domain).with_token(&token))
-}
-async fn messages_client(domain: &str) -> Result<super::MessagesClient> {
-    let token = crate::auth::get_token().await?;
-    Ok(super::MessagesClient::connect(domain).with_token(&token))
-}
-async fn calendars_client(domain: &str) -> Result<super::CalendarsClient> {
-    let token = crate::auth::get_token().await?;
-    Ok(super::CalendarsClient::connect(domain).with_token(&token))
-}
-async fn find_client(domain: &str) -> Result<super::FindClient> {
-    let token = crate::auth::get_token().await?;
-    Ok(super::FindClient::connect(domain).with_token(&token))
-}
 // ═══════════════════════════════════════════════════════════════════════════
 // People
@@ -143,7 +105,7 @@ pub async fn dispatch_people(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let people = people_client(client.domain()).await?;
+    let people = client.people().await?;
     match cmd {
         PeopleCommand::Contact { action } => match action {
             ContactAction::List { page } => {
@@ -346,7 +308,7 @@ pub async fn dispatch_docs(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let docs = docs_client(client.domain()).await?;
+    let docs = client.docs().await?;
     match cmd {
         DocsCommand::Document { action } => match action {
             DocumentAction::List { page } => {
@@ -498,7 +460,7 @@ pub async fn dispatch_meet(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let meet = meet_client(client.domain()).await?;
+    let meet = client.meet().await?;
     match cmd {
         MeetCommand::Room { action } => match action {
             RoomAction::List { page } => {
@@ -645,7 +607,7 @@ pub async fn dispatch_drive(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let drive = drive_client(client.domain()).await?;
+    let drive = client.drive().await?;
     match cmd {
         DriveCommand::File { action } => match action {
             FileAction::List { page } => {
@@ -823,7 +785,7 @@ pub async fn dispatch_mail(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let mail = messages_client(client.domain()).await?;
+    let mail = client.messages().await?;
     match cmd {
         MailCommand::Mailbox { action } => match action {
             MailboxAction::List => {
@@ -1013,7 +975,7 @@ pub async fn dispatch_cal(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let cal = calendars_client(client.domain()).await?;
+    let cal = client.calendars().await?;
     match cmd {
         CalCommand::Calendar { action } => match action {
             CalendarAction::List => {
@@ -1124,7 +1086,7 @@ pub async fn dispatch_find(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let find = find_client(client.domain()).await?;
+    let find = client.find().await?;
     match cmd {
         FindCommand::Search { query, page } => {
             let page_data = find.search(&query, page).await?;

View File

@@ -475,10 +475,15 @@ async fn os_api(path: &str, method: &str, body: Option<&str>) -> Option<String>
         curl_args.extend_from_slice(&["-H", "Content-Type: application/json", "-d", &body_string]);
     }
-    // Build the full exec command: exec deploy/opensearch -n data -c opensearch -- curl ...
-    let exec_cmd = curl_args;
-    match crate::kube::kube_exec("data", "opensearch-0", &exec_cmd, Some("opensearch")).await {
+    let pod_name = match crate::kube::find_pod_by_label("data", "app=opensearch").await {
+        Some(name) => name,
+        None => {
+            crate::output::warn("No OpenSearch pod found in data namespace");
+            return None;
+        }
+    };
+    match crate::kube::kube_exec("data", &pod_name, &curl_args, Some("opensearch")).await {
         Ok((0, out)) if !out.is_empty() => Some(out),
         _ => None,
     }

View File

@@ -1,22 +1,10 @@
 //! CLI dispatch for Matrix chat commands.
+use crate::client::SunbeamClient;
 use crate::error::Result;
 use crate::output::{self, OutputFormat};
 use clap::Subcommand;
-// ---------------------------------------------------------------------------
-// Auth helper
-// ---------------------------------------------------------------------------
-/// Construct a [`MatrixClient`] with a valid access token from the credential
-/// cache. Fails if the user is not logged in.
-async fn matrix_with_token(domain: &str) -> Result<super::MatrixClient> {
-    let token = crate::auth::get_token().await?;
-    let mut m = super::MatrixClient::connect(domain);
-    m.set_token(&token);
-    Ok(m)
-}
 // ---------------------------------------------------------------------------
 // Command tree
 // ---------------------------------------------------------------------------
@@ -343,8 +331,8 @@ pub enum UserAction {
 // ---------------------------------------------------------------------------
 /// Dispatch a parsed [`ChatCommand`] against the Matrix homeserver.
-pub async fn dispatch(domain: &str, format: OutputFormat, cmd: ChatCommand) -> Result<()> {
-    let m = matrix_with_token(domain).await?;
+pub async fn dispatch(client: &SunbeamClient, format: OutputFormat, cmd: ChatCommand) -> Result<()> {
+    let m = client.matrix().await?;
     match cmd {
         // -- Whoami ---------------------------------------------------------

View File

@@ -32,9 +32,9 @@ impl ServiceClient for MatrixClient {
 }
 impl MatrixClient {
-    /// Build a MatrixClient from domain (e.g. `https://matrix.{domain}/_matrix`).
+    /// Build a MatrixClient from domain (e.g. `https://messages.{domain}/_matrix`).
     pub fn connect(domain: &str) -> Self {
-        let base_url = format!("https://matrix.{domain}/_matrix");
+        let base_url = format!("https://messages.{domain}/_matrix");
         Self::from_parts(base_url, AuthMethod::Bearer(String::new()))
     }
@@ -1204,7 +1204,7 @@ mod tests {
     #[test]
     fn test_connect_url() {
         let c = MatrixClient::connect("sunbeam.pt");
-        assert_eq!(c.base_url(), "https://matrix.sunbeam.pt/_matrix");
+        assert_eq!(c.base_url(), "https://messages.sunbeam.pt/_matrix");
         assert_eq!(c.service_name(), "matrix");
     }
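The rename in this hunk (and the Grafana/Loki/Prometheus hunks below) follows the deployment's subdomain convention: each `connect(domain)` derives the client's base URL from a service subdomain of the shared domain, and the unit test pins the resulting URL. A minimal sketch of the Matrix case after the change:

```rust
// After the rename, the Matrix base URL is built from the messages.*
// subdomain rather than matrix.* (per the diff above).
fn matrix_base_url(domain: &str) -> String {
    format!("https://messages.{domain}/_matrix")
}
```

Keeping the URL scheme in one `connect` function per client is what makes these renames one-line changes plus a test update.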

View File

@@ -177,7 +177,7 @@ async fn dispatch_room(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let lk = client.livekit();
+    let lk = client.livekit().await?;
     match action {
         RoomAction::List => {
             let resp = lk.list_rooms().await?;
@@ -227,7 +227,7 @@ async fn dispatch_participant(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let lk = client.livekit();
+    let lk = client.livekit().await?;
     match action {
         ParticipantAction::List { room } => {
             let resp = lk
@@ -278,7 +278,7 @@ async fn dispatch_egress(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let lk = client.livekit();
+    let lk = client.livekit().await?;
     match action {
         EgressAction::List { room } => {
             let resp = lk

View File

@@ -425,7 +425,7 @@ async fn dispatch_prometheus(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let prom = client.prometheus();
+    let prom = client.prometheus().await?;
     match action {
         PrometheusAction::Query { query, time } => {
             let res = prom.query(&query, time.as_deref()).await?;
@@ -511,7 +511,7 @@ async fn dispatch_loki(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let loki = client.loki();
+    let loki = client.loki().await?;
     match action {
         LokiAction::Query { query, limit, time } => {
             let res = loki.query(&query, limit, time.as_deref()).await?;
@@ -631,7 +631,7 @@ async fn dispatch_grafana_dashboard(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let grafana = client.grafana();
+    let grafana = client.grafana().await?;
     match action {
         GrafanaDashboardAction::List => {
             let items = grafana.list_dashboards().await?;
@@ -696,7 +696,7 @@ async fn dispatch_grafana_datasource(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let grafana = client.grafana();
+    let grafana = client.grafana().await?;
     match action {
         GrafanaDatasourceAction::List => {
             let items = grafana.list_datasources().await?;
@@ -746,7 +746,7 @@ async fn dispatch_grafana_folder(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let grafana = client.grafana();
+    let grafana = client.grafana().await?;
     match action {
         GrafanaFolderAction::List => {
             let items = grafana.list_folders().await?;
@@ -794,7 +794,7 @@ async fn dispatch_grafana_annotation(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let grafana = client.grafana();
+    let grafana = client.grafana().await?;
     match action {
         GrafanaAnnotationAction::List { params } => {
             let items = grafana.list_annotations(params.as_deref()).await?;
@@ -833,7 +833,7 @@ async fn dispatch_grafana_alert(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let grafana = client.grafana();
+    let grafana = client.grafana().await?;
     match action {
         GrafanaAlertAction::List => {
             let items = grafana.get_alert_rules().await?;
@@ -879,7 +879,7 @@ async fn dispatch_grafana_org(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let grafana = client.grafana();
+    let grafana = client.grafana().await?;
     match action {
         GrafanaOrgAction::Get => {
             let item = grafana.get_current_org().await?;

View File

@@ -27,9 +27,9 @@ impl ServiceClient for GrafanaClient {
 }
 impl GrafanaClient {
-    /// Build a GrafanaClient from domain (e.g. `https://grafana.{domain}/api`).
+    /// Build a GrafanaClient from domain (e.g. `https://metrics.{domain}/api`).
     pub fn connect(domain: &str) -> Self {
-        let base_url = format!("https://grafana.{domain}/api");
+        let base_url = format!("https://metrics.{domain}/api");
         Self::from_parts(base_url, AuthMethod::None)
     }
@@ -410,7 +410,7 @@ mod tests {
     #[test]
     fn test_connect_url() {
         let c = GrafanaClient::connect("sunbeam.pt");
-        assert_eq!(c.base_url(), "https://grafana.sunbeam.pt/api");
+        assert_eq!(c.base_url(), "https://metrics.sunbeam.pt/api");
         assert_eq!(c.service_name(), "grafana");
     }


@@ -27,9 +27,9 @@ impl ServiceClient for LokiClient {
 }
 impl LokiClient {
-    /// Build a LokiClient from domain (e.g. `https://loki.{domain}/loki/api/v1`).
+    /// Build a LokiClient from domain (e.g. `https://systemlogs.{domain}/loki/api/v1`).
     pub fn connect(domain: &str) -> Self {
-        let base_url = format!("https://loki.{domain}/loki/api/v1");
+        let base_url = format!("https://systemlogs.{domain}/loki/api/v1");
         Self::from_parts(base_url, AuthMethod::None)
     }
@@ -254,7 +254,7 @@ mod tests {
     #[test]
     fn test_connect_url() {
         let c = LokiClient::connect("sunbeam.pt");
-        assert_eq!(c.base_url(), "https://loki.sunbeam.pt/loki/api/v1");
+        assert_eq!(c.base_url(), "https://systemlogs.sunbeam.pt/loki/api/v1");
         assert_eq!(c.service_name(), "loki");
     }


@@ -27,9 +27,9 @@ impl ServiceClient for PrometheusClient {
 }
 impl PrometheusClient {
-    /// Build a PrometheusClient from domain (e.g. `https://prometheus.{domain}/api/v1`).
+    /// Build a PrometheusClient from domain (e.g. `https://systemmetrics.{domain}/api/v1`).
     pub fn connect(domain: &str) -> Self {
-        let base_url = format!("https://prometheus.{domain}/api/v1");
+        let base_url = format!("https://systemmetrics.{domain}/api/v1");
         Self::from_parts(base_url, AuthMethod::None)
     }
@@ -253,7 +253,7 @@ mod tests {
     #[test]
     fn test_connect_url() {
         let c = PrometheusClient::connect("sunbeam.pt");
-        assert_eq!(c.base_url(), "https://prometheus.sunbeam.pt/api/v1");
+        assert_eq!(c.base_url(), "https://systemmetrics.sunbeam.pt/api/v1");
         assert_eq!(c.service_name(), "prometheus");
     }

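The Grafana, Loki, and Prometheus clients all follow the same domain-to-base-URL convention, and these hunks only swap the subdomain. A standalone sketch of just that convention — `ServiceUrl` is an illustrative stand-in, not a real sunbeam type:

```rust
// Illustrative stand-in for the GrafanaClient/LokiClient/PrometheusClient
// pattern: connect() derives the full base URL from the stack domain.
struct ServiceUrl {
    base_url: String,
}

impl ServiceUrl {
    // subdomain and api_path vary per service, e.g. ("metrics", "/api")
    // for Grafana after this change.
    fn connect(subdomain: &str, domain: &str, api_path: &str) -> Self {
        Self {
            base_url: format!("https://{subdomain}.{domain}{api_path}"),
        }
    }
}

fn main() {
    let grafana = ServiceUrl::connect("metrics", "sunbeam.pt", "/api");
    assert_eq!(grafana.base_url, "https://metrics.sunbeam.pt/api");
    let loki = ServiceUrl::connect("systemlogs", "sunbeam.pt", "/loki/api/v1");
    assert_eq!(loki.base_url, "https://systemlogs.sunbeam.pt/loki/api/v1");
}
```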

@@ -4,6 +4,7 @@ use std::collections::HashMap;
 use clap::Subcommand;
+use crate::client::SunbeamClient;
 use crate::error::Result;
 use crate::output::{self, OutputFormat};
@@ -226,9 +227,10 @@ fn read_text_input(flag: Option<&str>) -> Result<String> {
 pub async fn dispatch(
     cmd: VaultCommand,
-    bao: &super::BaoClient,
+    client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
+    let bao = client.bao().await?;
     match cmd {
         // -- Status ---------------------------------------------------------
         VaultCommand::Status => {


@@ -6,17 +6,6 @@ use serde_json::json;
 use crate::error::Result;
 use crate::output::{self, OutputFormat};
-// ---------------------------------------------------------------------------
-// Client helper
-// ---------------------------------------------------------------------------
-async fn os_client(domain: &str) -> Result<super::OpenSearchClient> {
-    let token = crate::auth::get_token().await?;
-    let mut c = super::OpenSearchClient::connect(domain);
-    c.set_token(token);
-    Ok(c)
-}
 // ---------------------------------------------------------------------------
 // Top-level command enum
 // ---------------------------------------------------------------------------
@@ -413,7 +402,7 @@ pub async fn dispatch(
     client: &crate::client::SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let c = os_client(client.domain()).await?;
+    let c = client.opensearch().await?;
     match cmd {
         // -----------------------------------------------------------------


@@ -1103,4 +1103,50 @@ mod tests {
         ];
         assert_eq!(PG_USERS, &expected[..]);
     }
+
+    #[test]
+    fn test_sol_gitea_credential_mapping() {
+        let mut gitea = HashMap::new();
+        gitea.insert("admin-username".to_string(), "gitea_admin".to_string());
+        gitea.insert("admin-password".to_string(), "s3cret".to_string());
+        let mut sol_gitea = HashMap::new();
+        if let Some(u) = gitea.get("admin-username") {
+            sol_gitea.insert("gitea-admin-username".to_string(), u.clone());
+        }
+        if let Some(p) = gitea.get("admin-password") {
+            sol_gitea.insert("gitea-admin-password".to_string(), p.clone());
+        }
+        assert_eq!(sol_gitea.len(), 2);
+        assert_eq!(sol_gitea["gitea-admin-username"], "gitea_admin");
+        assert_eq!(sol_gitea["gitea-admin-password"], "s3cret");
+    }
+
+    #[test]
+    fn test_sol_gitea_credential_mapping_partial() {
+        let gitea: HashMap<String, String> = HashMap::new();
+        let mut sol_gitea = HashMap::new();
+        if let Some(u) = gitea.get("admin-username") {
+            sol_gitea.insert("gitea-admin-username".to_string(), u.clone());
+        }
+        if let Some(p) = gitea.get("admin-password") {
+            sol_gitea.insert("gitea-admin-password".to_string(), p.clone());
+        }
+        assert!(sol_gitea.is_empty(), "No creds should be mapped when gitea map is empty");
+    }
+
+    #[test]
+    fn test_sol_agent_policy_hcl() {
+        let sol_policy_hcl = concat!(
+            "path \"secret/data/sol-tokens/*\" { capabilities = [\"create\", \"read\", \"update\", \"delete\"] }\n",
+            "path \"secret/metadata/sol-tokens/*\" { capabilities = [\"read\", \"delete\", \"list\"] }\n",
+        );
+        assert!(sol_policy_hcl.contains("secret/data/sol-tokens/*"));
+        assert!(sol_policy_hcl.contains("secret/metadata/sol-tokens/*"));
+        assert!(sol_policy_hcl.contains("create"));
+        assert!(sol_policy_hcl.contains("delete"));
+        assert!(sol_policy_hcl.contains("list"));
+        assert_eq!(sol_policy_hcl.lines().count(), 2);
+    }
 }


@@ -473,6 +473,21 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
         }
     }
+
+    // Patch gitea admin credentials into secret/sol for Sol's Gitea integration.
+    // Uses kv_patch to preserve manually-set keys (matrix-access-token etc.).
+    {
+        let mut sol_gitea = HashMap::new();
+        if let Some(u) = gitea.get("admin-username") {
+            sol_gitea.insert("gitea-admin-username".to_string(), u.clone());
+        }
+        if let Some(p) = gitea.get("admin-password") {
+            sol_gitea.insert("gitea-admin-password".to_string(), p.clone());
+        }
+        if !sol_gitea.is_empty() {
+            bao.kv_patch("secret", "sol", &sol_gitea).await?;
+        }
+    }
+
     // ── Kubernetes auth for VSO ─────────────────────────────────────────
     ok("Configuring Kubernetes auth for VSO...");
     let _ = bao.auth_enable("kubernetes", "kubernetes").await;
@@ -503,6 +518,25 @@ pub async fn seed_openbao() -> Result<Option<SeedResult>> {
     )
     .await?;
+
+    // Sol agent policy — read/write access to sol-tokens/* for user impersonation PATs
+    ok("Configuring Kubernetes auth for Sol agent...");
+    let sol_policy_hcl = concat!(
+        "path \"secret/data/sol-tokens/*\" { capabilities = [\"create\", \"read\", \"update\", \"delete\"] }\n",
+        "path \"secret/metadata/sol-tokens/*\" { capabilities = [\"read\", \"delete\", \"list\"] }\n",
+    );
+    bao.write_policy("sol-agent", sol_policy_hcl).await?;
+    bao.write(
+        "auth/kubernetes/role/sol-agent",
+        &serde_json::json!({
+            "bound_service_account_names": "default",
+            "bound_service_account_namespaces": "matrix",
+            "policies": "sol-agent",
+            "ttl": "1h"
+        }),
+    )
+    .await?;
+
     // Build credentials map
     let mut creds = HashMap::new();
     let field_map: &[(&str, &str, &HashMap<String, String>)] = &[

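The comment above stresses patch-vs-put semantics: `kv_patch` merges into the existing `secret/sol` document instead of replacing it, so manually-set keys like `matrix-access-token` survive the seeding run. A minimal sketch of that distinction with plain `HashMap`s — `kv_put`/`kv_patch` here are illustrative stand-ins, not the real BaoClient methods:

```rust
use std::collections::HashMap;

type Secret = HashMap<String, String>;

// put replaces the whole secret: any manually-set keys are lost.
fn kv_put(store: &mut HashMap<String, Secret>, key: &str, data: Secret) {
    store.insert(key.to_string(), data);
}

// patch merges into the existing secret, preserving other keys —
// this is why the seeder uses patch for secret/sol.
fn kv_patch(store: &mut HashMap<String, Secret>, key: &str, data: Secret) {
    store.entry(key.to_string()).or_default().extend(data);
}

fn main() {
    let mut store: HashMap<String, Secret> = HashMap::new();
    store.insert(
        "sol".into(),
        HashMap::from([("matrix-access-token".into(), "tok123".into())]),
    );

    kv_patch(
        &mut store,
        "sol",
        HashMap::from([("gitea-admin-username".into(), "gitea_admin".into())]),
    );
    assert!(store["sol"].contains_key("matrix-access-token")); // preserved

    kv_put(
        &mut store,
        "sol",
        HashMap::from([("gitea-admin-username".into(), "gitea_admin".into())]),
    );
    assert!(!store["sol"].contains_key("matrix-access-token")); // clobbered
}
```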

@@ -152,7 +152,7 @@ async fn dispatch_bucket(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let s3 = client.s3();
+    let s3 = client.s3().await?;
     match action {
         BucketAction::List => {
             let resp = s3.list_buckets().await?;
@@ -194,7 +194,7 @@ async fn dispatch_object(
     client: &SunbeamClient,
     fmt: OutputFormat,
 ) -> Result<()> {
-    let s3 = client.s3();
+    let s3 = client.s3().await?;
     match action {
         ObjectAction::List {
             bucket,


@@ -1,6 +1,6 @@
 [package]
 name = "sunbeam"
-version = "1.0.0"
+version = "1.0.1"
 edition = "2024"
 description = "Sunbeam local dev stack manager"
@@ -10,9 +10,31 @@ path = "src/main.rs"
 [dependencies]
 sunbeam-sdk = { path = "../sunbeam-sdk", features = ["all", "cli"] }
+sunbeam-proto = { path = "../sunbeam-proto" }
 tokio = { version = "1", features = ["full"] }
+tokio-stream = "0.1"
 clap = { version = "4", features = ["derive"] }
 chrono = "0.4"
 tracing = "0.1"
 tracing-subscriber = { version = "0.3", features = ["env-filter"] }
 rustls = { version = "0.23", features = ["ring"] }
+tonic = "0.14"
+ratatui = "0.29"
+crossterm = "0.28"
+serde = { version = "1", features = ["derive"] }
+serde_json = "1"
+toml = "0.8"
+anyhow = "1"
+futures = "0.3"
+crossbeam-channel = "0.5"
+textwrap = "0.16"
+tui-markdown = "=0.3.6"
+tree-sitter = "0.24"
+tree-sitter-rust = "0.23"
+tree-sitter-typescript = "0.23"
+tree-sitter-python = "0.23"
+lsp-types = "0.97"
+url = "2"
+
+[dev-dependencies]
+tokio-stream = { version = "0.1", features = ["net"] }


@@ -383,6 +383,14 @@ def _seed_openbao() -> dict:
          "turn-secret": tuwunel["turn-secret"],
          "registration-token": tuwunel["registration-token"]})
+
+    # Patch gitea admin credentials into secret/sol for Sol's Gitea integration.
+    # Uses kv patch (not put) to preserve manually-set keys (matrix-access-token etc.).
+    ok("Patching Gitea admin credentials into secret/sol...")
+    bao(f"BAO_ADDR=http://127.0.0.1:8200 BAO_TOKEN='{root_token}' "
+        f"bao kv patch secret/sol "
+        f"gitea-admin-username='{gitea['admin-username']}' "
+        f"gitea-admin-password='{gitea['admin-password']}'")
+
     # Configure Kubernetes auth method so VSO can authenticate with OpenBao
     ok("Configuring Kubernetes auth for VSO...")
     bao(f"BAO_ADDR=http://127.0.0.1:8200 BAO_TOKEN='{root_token}' "
@@ -407,6 +415,23 @@ def _seed_openbao() -> dict:
         f"policies=vso-reader "
         f"ttl=1h")
+
+    # Sol agent policy — read/write access to sol-tokens/* for user impersonation PATs
+    ok("Configuring Kubernetes auth for Sol agent...")
+    sol_policy_hcl = (
+        'path "secret/data/sol-tokens/*" { capabilities = ["create", "read", "update", "delete"] }\n'
+        'path "secret/metadata/sol-tokens/*" { capabilities = ["read", "delete", "list"] }\n'
+    )
+    sol_policy_b64 = base64.b64encode(sol_policy_hcl.encode()).decode()
+    bao(f"BAO_ADDR=http://127.0.0.1:8200 BAO_TOKEN='{root_token}' "
+        f"sh -c 'echo {sol_policy_b64} | base64 -d | bao policy write sol-agent -'")
+    bao(f"BAO_ADDR=http://127.0.0.1:8200 BAO_TOKEN='{root_token}' "
+        f"bao write auth/kubernetes/role/sol-agent "
+        f"bound_service_account_names=default "
+        f"bound_service_account_namespaces=matrix "
+        f"policies=sol-agent "
+        f"ttl=1h")
+
     return {
         "hydra-system-secret": hydra["system-secret"],
         "hydra-cookie-secret": hydra["cookie-secret"],


@@ -139,6 +139,29 @@ pub enum Verb {
         action: Option<PmAction>,
     },
+
+    /// Terminal coding agent powered by Sol.
+    Code {
+        #[command(subcommand)]
+        action: Option<crate::code::CodeCommand>,
+    },
+
+    /// Reindex Gitea repos into Sol's code search index.
+    #[command(name = "reindex-code")]
+    ReindexCode {
+        /// Filter to a specific org.
+        #[arg(long)]
+        org: Option<String>,
+        /// Index a specific repo (owner/name format).
+        #[arg(long)]
+        repo: Option<String>,
+        /// Index a specific branch (default: repo's default branch).
+        #[arg(long)]
+        branch: Option<String>,
+        /// Sol gRPC endpoint.
+        #[arg(long, default_value = "http://127.0.0.1:50051")]
+        endpoint: String,
+    },
+
     /// Self-update from latest mainline commit.
     Update,
@@ -927,12 +950,14 @@ pub async fn dispatch() -> Result<()> {
             let sc = sunbeam_sdk::client::SunbeamClient::from_context(
                 &sunbeam_sdk::config::active_context(),
             );
-            sunbeam_sdk::gitea::cli::dispatch(action, sc.gitea(), cli.output_format).await
+            sunbeam_sdk::gitea::cli::dispatch(action, &sc, cli.output_format).await
         }
         Some(Verb::Chat { action }) => {
-            let domain = sunbeam_sdk::config::active_context().domain.clone();
-            sunbeam_sdk::matrix::cli::dispatch(&domain, cli.output_format, action).await
+            let sc = sunbeam_sdk::client::SunbeamClient::from_context(
+                &sunbeam_sdk::config::active_context(),
+            );
+            sunbeam_sdk::matrix::cli::dispatch(&sc, cli.output_format, action).await
         }
         Some(Verb::Search { action }) => {
@@ -964,8 +989,10 @@ pub async fn dispatch() -> Result<()> {
         }
         Some(Verb::Vault { action }) => {
-            let bao = sunbeam_sdk::openbao::BaoClient::new("http://127.0.0.1:8200");
-            sunbeam_sdk::openbao::cli::dispatch(action, &bao, cli.output_format).await
+            let sc = sunbeam_sdk::client::SunbeamClient::from_context(
+                &sunbeam_sdk::config::active_context(),
+            );
+            sunbeam_sdk::openbao::cli::dispatch(action, &sc, cli.output_format).await
         }
         Some(Verb::People { action }) => {
@@ -1049,6 +1076,35 @@ pub async fn dispatch() -> Result<()> {
         }
         },
+
+        Some(Verb::Code { action }) => crate::code::cmd_code(action).await,
+
+        Some(Verb::ReindexCode { org, repo, branch, endpoint }) => {
+            use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
+            use sunbeam_proto::sunbeam_code_v1::ReindexCodeRequest;
+
+            tracing::info!(endpoint = endpoint.as_str(), "Connecting to Sol for reindex");
+            let mut client = CodeAgentClient::connect(endpoint)
+                .await
+                .map_err(|e| sunbeam_sdk::error::SunbeamError::Other(format!("Failed to connect: {e}")))?;
+            let request = ReindexCodeRequest {
+                org: org.unwrap_or_default(),
+                repo: repo.unwrap_or_default(),
+                branch: branch.unwrap_or_default(),
+            };
+            let response = client.reindex_code(request)
+                .await
+                .map_err(|e| sunbeam_sdk::error::SunbeamError::Other(format!("Reindex failed: {e}")))?;
+            let resp = response.into_inner();
+            if resp.error.is_empty() {
+                println!("Indexed {} symbols across {} repos", resp.symbols_indexed, resp.repos_indexed);
+            } else {
+                eprintln!("Error: {}", resp.error);
+            }
+            Ok(())
+        }
+
         Some(Verb::Update) => sunbeam_sdk::update::cmd_update().await,
         Some(Verb::Version) => {

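The `ReindexCode` arm collapses each optional CLI filter into the proto's plain string fields via `unwrap_or_default()`. A standalone sketch of that mapping — the struct here is a local stand-in for the generated proto type, and reading an empty string as "no filter" is an assumption about the server's behavior, consistent with the commit message's org/repo/branch filters:

```rust
// Stand-in for the generated ReindexCodeRequest proto message:
// absent CLI filters become empty strings on the wire.
#[derive(Debug, Default, PartialEq)]
struct ReindexCodeRequest {
    org: String,
    repo: String,
    branch: String,
}

fn build_request(
    org: Option<String>,
    repo: Option<String>,
    branch: Option<String>,
) -> ReindexCodeRequest {
    ReindexCodeRequest {
        org: org.unwrap_or_default(),
        repo: repo.unwrap_or_default(),
        branch: branch.unwrap_or_default(),
    }
}

fn main() {
    // e.g. `sunbeam reindex-code --org studio`
    let r = build_request(Some("studio".into()), None, None);
    assert_eq!(r.org, "studio");
    assert_eq!(r.repo, "");
    assert_eq!(r.branch, "");
}
```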
sunbeam/src/code/agent.rs (new file, 386 lines)

@@ -0,0 +1,386 @@
//! Agent service — async message bus between TUI and Sol gRPC session.
//!
//! The TUI sends `AgentRequest`s and receives `AgentEvent`s through
//! crossbeam channels. The gRPC session runs on a background tokio task,
//! so the UI thread never blocks on network I/O.
//!
//! Tool approval: when a client tool requires approval ("ask" in config),
//! the agent emits `ApprovalNeeded` and waits for a `decide()` call from
//! the TUI before executing or denying.
//!
//! This module is designed to be usable as a library — nothing here
//! depends on ratatui or terminal state.
use crossbeam_channel::{Receiver, Sender};
use super::client::{self, CodeSession};
use super::config::LoadedConfig;
/// Turn raw internal errors into something a human can read.
fn friendly_error(e: &str) -> String {
let lower = e.to_lowercase();
if lower.contains("broken pipe") || lower.contains("stream closed") || lower.contains("h2 protocol") {
"sol disconnected — try again or restart with /exit".into()
} else if lower.contains("channel closed") || lower.contains("send on closed") {
"connection to sol lost".into()
} else if lower.contains("timed out") || lower.contains("timeout") {
"request timed out — sol may be overloaded".into()
} else if lower.contains("connection refused") {
"can't reach sol — is it running?".into()
} else if lower.contains("not found") && lower.contains("agent") {
"sol's agent was reset — reconnect with /exit".into()
} else if lower.contains("invalid_request_error") {
if let Some(start) = e.find("\"msg\":\"") {
let rest = &e[start + 7..];
if let Some(end) = rest.find('"') {
return rest[..end].to_string();
}
}
"request error from sol".into()
} else {
let clean = e.replace("\\n", " ").replace("\\\"", "'");
if clean.len() > 120 { format!("{}…", &clean[..117]) } else { clean }
}
}
// ── Requests (TUI → Agent) ──────────────────────────────────────────────
/// A request from the UI to the agent backend.
pub enum AgentRequest {
/// Send a chat message to Sol.
Chat { text: String },
/// End the session gracefully.
End,
}
// ── Approval (TUI → Agent) ─────────────────────────────────────────────
/// A tool approval decision from the UI.
#[derive(Debug, Clone)]
pub enum ApprovalDecision {
/// User approved — execute the tool.
Approved { call_id: String },
/// User denied — return error to model.
Denied { call_id: String },
/// User approved AND upgraded permission to "always" for this session.
ApprovedAlways { call_id: String, tool_name: String },
// Future: ApprovedRemote { call_id } — execute on server sidecar
}
// ── Events (Agent → TUI) ───────────────────────────────────────────────
/// An event from the agent backend to the UI.
#[derive(Clone, Debug)]
pub enum AgentEvent {
/// Sol started generating a response.
Generating,
/// A tool needs user approval before execution.
ApprovalNeeded { call_id: String, name: String, args_summary: String },
/// Tool was approved and is now executing.
ToolExecuting { name: String, detail: String },
/// A tool finished executing.
ToolDone { name: String, success: bool },
/// Sol's full response text with token usage.
Response { text: String, input_tokens: u32, output_tokens: u32 },
/// A non-fatal error from Sol.
Error { message: String },
/// Status update (shown in title bar).
Status { message: String },
/// Connection health: true = reachable, false = unreachable.
Health { connected: bool },
/// Session ended.
SessionEnded,
}
// ── Agent handle (owned by TUI) ────────────────────────────────────────
/// Handle for the TUI to communicate with the background agent task.
pub struct AgentHandle {
req_tx: Sender<AgentRequest>,
approval_tx: Sender<ApprovalDecision>,
pub rx: Receiver<AgentEvent>,
}
impl AgentHandle {
/// Send a chat message. Non-blocking.
pub fn chat(&self, text: &str) {
let _ = self.req_tx.try_send(AgentRequest::Chat { text: text.to_string() });
}
/// Request session end. Non-blocking.
pub fn end(&self) {
let _ = self.req_tx.try_send(AgentRequest::End);
}
/// Submit a tool approval decision. Non-blocking.
pub fn decide(&self, decision: ApprovalDecision) {
let _ = self.approval_tx.try_send(decision);
}
/// Drain all pending events. Non-blocking.
pub fn poll_events(&self) -> Vec<AgentEvent> {
let mut events = Vec::new();
while let Ok(event) = self.rx.try_recv() {
events.push(event);
}
events
}
}
// ── Spawn ──────────────────────────────────────────────────────────────
/// Spawn the agent background task. Returns a handle for the TUI.
pub fn spawn(
session: CodeSession,
endpoint: String,
config: LoadedConfig,
project_path: String,
) -> AgentHandle {
let (req_tx, req_rx) = crossbeam_channel::bounded::<AgentRequest>(32);
let (evt_tx, evt_rx) = crossbeam_channel::bounded::<AgentEvent>(256);
let (approval_tx, approval_rx) = crossbeam_channel::bounded::<ApprovalDecision>(8);
tokio::spawn(agent_loop(session, config, project_path, req_rx, approval_rx, evt_tx.clone()));
tokio::spawn(heartbeat_loop(endpoint, evt_tx));
AgentHandle { req_tx, approval_tx, rx: evt_rx }
}
/// Ping the gRPC endpoint every second to check if Sol is reachable.
async fn heartbeat_loop(endpoint: String, evt_tx: Sender<AgentEvent>) {
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
let mut last_state = true;
let _ = evt_tx.try_send(AgentEvent::Health { connected: true });
loop {
tokio::time::sleep(std::time::Duration::from_secs(1)).await;
let connected = CodeAgentClient::connect(endpoint.clone()).await.is_ok();
if connected != last_state {
let _ = evt_tx.try_send(AgentEvent::Health { connected });
last_state = connected;
}
}
}
/// The background agent loop. Reads requests, calls gRPC, handles tool
/// approval and execution.
async fn agent_loop(
mut session: CodeSession,
mut config: LoadedConfig,
project_path: String,
req_rx: Receiver<AgentRequest>,
approval_rx: Receiver<ApprovalDecision>,
evt_tx: Sender<AgentEvent>,
) {
loop {
let req = match tokio::task::block_in_place(|| req_rx.recv()) {
Ok(req) => req,
Err(_) => break,
};
match req {
AgentRequest::Chat { text } => {
let _ = evt_tx.try_send(AgentEvent::Generating);
match session.chat(&text).await {
Ok(resp) => {
// Process events — handle tool calls with approval
for event in &resp.events {
match event {
client::ChatEvent::ToolCall { call_id, name, args, needs_approval } => {
let perm = config.permission_for(name);
match perm {
"always" => {
// Execute immediately
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: truncate_args(args),
});
// client.rs already executed the tool and sent its result during
// chat(); re-running it here would apply side effects twice.
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: true,
});
// Tool result already sent by client.rs
}
"never" => {
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: false,
});
// Tool denial already sent by client.rs
}
_ => {
// "ask" — need user approval
let _ = evt_tx.try_send(AgentEvent::ApprovalNeeded {
call_id: call_id.clone(),
name: name.clone(),
args_summary: truncate_args(args),
});
// Wait for approval decision (blocking on crossbeam)
match tokio::task::block_in_place(|| approval_rx.recv()) {
Ok(ApprovalDecision::Approved { .. }) => {
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: truncate_args(args),
});
// Tool already executed by client.rs
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: true,
});
}
Ok(ApprovalDecision::ApprovedAlways { tool_name, .. }) => {
config.upgrade_to_always(&tool_name);
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: truncate_args(args),
});
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: true,
});
}
Ok(ApprovalDecision::Denied { .. }) => {
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: false,
});
}
Err(_) => break,
}
}
}
}
client::ChatEvent::ToolStart { name, detail } => {
let _ = evt_tx.try_send(AgentEvent::ToolExecuting {
name: name.clone(),
detail: detail.clone(),
});
}
client::ChatEvent::ToolDone { name, success } => {
let _ = evt_tx.try_send(AgentEvent::ToolDone {
name: name.clone(),
success: *success,
});
}
client::ChatEvent::Status(msg) => {
let _ = evt_tx.try_send(AgentEvent::Status {
message: msg.clone(),
});
}
client::ChatEvent::Error(msg) => {
let _ = evt_tx.try_send(AgentEvent::Error {
message: friendly_error(msg),
});
}
}
}
let _ = evt_tx.try_send(AgentEvent::Response {
text: resp.text,
input_tokens: resp.input_tokens,
output_tokens: resp.output_tokens,
});
}
Err(e) => {
let _ = evt_tx.try_send(AgentEvent::Error {
message: friendly_error(&e.to_string()),
});
}
}
}
AgentRequest::End => {
let _ = session.end().await;
let _ = evt_tx.try_send(AgentEvent::SessionEnded);
break;
}
}
}
}
fn truncate_args(args: &str) -> String {
    // Append an ellipsis when truncating so the TUI shows the summary was cut
    // (the test below asserts the trailing '…').
    if args.len() <= 80 { args.to_string() } else { format!("{}…", &args[..77]) }
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_approval_decision_variants() {
let approved = ApprovalDecision::Approved { call_id: "c1".into() };
assert!(matches!(approved, ApprovalDecision::Approved { .. }));
let denied = ApprovalDecision::Denied { call_id: "c2".into() };
assert!(matches!(denied, ApprovalDecision::Denied { .. }));
let always = ApprovalDecision::ApprovedAlways {
call_id: "c3".into(),
tool_name: "bash".into(),
};
assert!(matches!(always, ApprovalDecision::ApprovedAlways { .. }));
}
#[test]
fn test_permission_routing() {
let config = LoadedConfig::default();
// "always" tools should not need approval
assert_eq!(config.permission_for("file_read"), "always");
assert_eq!(config.permission_for("grep"), "always");
assert_eq!(config.permission_for("list_directory"), "always");
// "ask" tools need approval
assert_eq!(config.permission_for("file_write"), "ask");
assert_eq!(config.permission_for("bash"), "ask");
assert_eq!(config.permission_for("search_replace"), "ask");
// unknown defaults to ask
assert_eq!(config.permission_for("unknown_tool"), "ask");
}
#[test]
fn test_permission_upgrade() {
let mut config = LoadedConfig::default();
assert_eq!(config.permission_for("bash"), "ask");
config.upgrade_to_always("bash");
assert_eq!(config.permission_for("bash"), "always");
// Other tools unchanged
assert_eq!(config.permission_for("file_write"), "ask");
}
#[test]
fn test_friendly_error_messages() {
assert_eq!(
friendly_error("h2 protocol error: stream closed because of a broken pipe"),
"sol disconnected — try again or restart with /exit"
);
assert_eq!(
friendly_error("channel closed"),
"connection to sol lost"
);
assert_eq!(
friendly_error("connection refused"),
"can't reach sol — is it running?"
);
assert_eq!(
friendly_error("request timed out"),
"request timed out — sol may be overloaded"
);
}
#[test]
fn test_truncate_args() {
assert_eq!(truncate_args("short"), "short");
let long = "a".repeat(100);
let truncated = truncate_args(&long);
assert!(truncated.len() <= 81);
assert!(truncated.ends_with('…'));
}
}

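The message-bus pattern described in agent.rs's module doc — bounded request/event channels with a non-blocking `poll_events` drain on the UI side — can be sketched in isolation. This sketch uses `std::sync::mpsc::sync_channel` standing in for the bounded crossbeam channels, and the `Req`/`Evt` enums are illustrative stand-ins for `AgentRequest`/`AgentEvent`:

```rust
use std::sync::mpsc::{sync_channel, Receiver, SyncSender};
use std::thread;
use std::time::Duration;

// Illustrative stand-ins for AgentRequest / AgentEvent.
enum Req {
    Chat(String),
}
#[derive(Debug, PartialEq)]
enum Evt {
    Generating,
    Response(String),
}

// A worker thread plays the role of the background gRPC session task.
fn spawn_agent() -> (SyncSender<Req>, Receiver<Evt>) {
    let (req_tx, req_rx) = sync_channel::<Req>(32);
    let (evt_tx, evt_rx) = sync_channel::<Evt>(256);
    thread::spawn(move || {
        while let Ok(Req::Chat(text)) = req_rx.recv() {
            let _ = evt_tx.send(Evt::Generating);
            let _ = evt_tx.send(Evt::Response(format!("echo: {text}")));
        }
    });
    (req_tx, evt_rx)
}

// Non-blocking drain, mirroring AgentHandle::poll_events: the UI thread
// calls this once per frame and never blocks on the agent.
fn poll_events(rx: &Receiver<Evt>) -> Vec<Evt> {
    let mut out = Vec::new();
    while let Ok(e) = rx.try_recv() {
        out.push(e);
    }
    out
}

fn main() {
    let (tx, rx) = spawn_agent();
    tx.send(Req::Chat("hi".into())).unwrap();
    thread::sleep(Duration::from_millis(100)); // let the worker respond
    let events = poll_events(&rx);
    assert_eq!(events, vec![Evt::Generating, Evt::Response("echo: hi".into())]);
}
```

The same shape is why the real agent can expose a purely synchronous `AgentHandle` to ratatui while all gRPC I/O stays on a tokio task.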
sunbeam/src/code/client.rs (new file, 266 lines)

@@ -0,0 +1,266 @@
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
use sunbeam_proto::sunbeam_code_v1::*;
use tokio::sync::mpsc;
use tokio_stream::wrappers::ReceiverStream;
use tonic::Request;
use tracing::{debug, error, info, warn};
use super::config::LoadedConfig;
use super::project::ProjectContext;
/// Events produced during a chat turn, for the TUI to render.
pub enum ChatEvent {
/// A client-side tool call that needs execution (possibly with approval).
ToolCall { call_id: String, name: String, args: String, needs_approval: bool },
ToolStart { name: String, detail: String },
ToolDone { name: String, success: bool },
Status(String),
Error(String),
}
/// Result of a chat turn.
pub struct ChatResponse {
pub text: String,
pub events: Vec<ChatEvent>,
pub input_tokens: u32,
pub output_tokens: u32,
}
fn truncate_args(args_json: &str) -> String {
    // Extract a short summary from the JSON args, appending an ellipsis
    // when the string is cut so the TUI signals truncation.
    if args_json.len() <= 80 {
        args_json.to_string()
    } else {
        format!("{}…", &args_json[..77])
    }
}
/// A history entry from a resumed session.
pub struct HistoryMessage {
pub role: String,
pub content: String,
}
/// An active coding session connected to Sol via gRPC.
pub struct CodeSession {
pub session_id: String,
pub room_id: String,
pub model: String,
pub project_path: String,
pub resumed: bool,
pub history: Vec<HistoryMessage>,
tx: mpsc::Sender<ClientMessage>,
rx: tonic::Streaming<ServerMessage>,
}
/// Connect to Sol's gRPC server and start a coding session.
pub async fn connect(
endpoint: &str,
project: &ProjectContext,
config: &LoadedConfig,
model: &str,
) -> anyhow::Result<CodeSession> {
let mut client = CodeAgentClient::connect(endpoint.to_string())
.await
.map_err(|e| anyhow::anyhow!("Failed to connect to Sol at {endpoint}: {e}"))?;
info!(endpoint, "Connected to Sol gRPC server");
// Create the bidirectional stream
let (tx, client_rx) = mpsc::channel::<ClientMessage>(32);
let client_stream = ReceiverStream::new(client_rx);
// TODO: add JWT auth token to the request metadata
let response = client.session(client_stream).await?;
let mut rx = response.into_inner();
// Send StartSession
tx.send(ClientMessage {
payload: Some(client_message::Payload::Start(StartSession {
project_path: project.path.clone(),
prompt_md: project.prompt_md.clone(),
config_toml: project.config_toml.clone(),
git_branch: project.git_branch.clone().unwrap_or_default(),
git_status: project.git_status.clone().unwrap_or_default(),
file_tree: project.file_tree.clone(),
model: model.into(),
client_tools: vec![], // TODO: send client tool schemas
})),
})
.await?;
// Wait for SessionReady
let ready = loop {
match rx.message().await? {
Some(ServerMessage {
payload: Some(server_message::Payload::Ready(r)),
}) => break r,
Some(ServerMessage {
payload: Some(server_message::Payload::Error(e)),
}) => anyhow::bail!("Session start failed: {}", e.message),
Some(_) => continue,
None => anyhow::bail!("Stream closed before SessionReady"),
}
};
// Extract and send symbols for code index (fire-and-forget)
let symbols = super::symbols::extract_project_symbols(&project.path);
if !symbols.is_empty() {
let branch = project.git_branch.clone().unwrap_or_else(|| "mainline".into());
let proto_symbols: Vec<_> = symbols
.iter()
.map(|s| SymbolEntry {
file_path: s.file_path.clone(),
name: s.name.clone(),
kind: s.kind.clone(),
signature: s.signature.clone(),
docstring: s.docstring.clone(),
start_line: s.start_line as i32,
end_line: s.end_line as i32,
language: s.language.clone(),
content: s.content.clone(),
})
.collect();
let project_name = project.path.split('/').last().unwrap_or("unknown").to_string();
let _ = tx
.send(ClientMessage {
payload: Some(client_message::Payload::IndexSymbols(IndexSymbols {
project_name,
branch,
symbols: proto_symbols,
})),
})
.await;
info!(count = symbols.len(), "Sent project symbols for indexing");
}
let history = ready
.history
.into_iter()
.map(|h| HistoryMessage {
role: h.role,
content: h.content,
})
.collect();
Ok(CodeSession {
session_id: ready.session_id,
room_id: ready.room_id,
model: ready.model,
project_path: project.path.clone(),
resumed: ready.resumed,
history,
tx,
rx,
})
}
impl CodeSession {
/// Send a chat message and collect the response.
/// Handles tool calls by executing them locally and sending results back.
/// Returns (full_text, events) — events are for the TUI to display.
pub async fn chat(&mut self, text: &str) -> anyhow::Result<ChatResponse> {
self.tx
.send(ClientMessage {
payload: Some(client_message::Payload::Input(UserInput {
text: text.into(),
})),
})
.await?;
let mut events = Vec::new();
// Read server messages until we get TextDone
loop {
match self.rx.message().await? {
Some(ServerMessage {
payload: Some(server_message::Payload::Delta(_)),
}) => {
// Streaming text — we'll use full_text from Done
}
Some(ServerMessage {
payload: Some(server_message::Payload::Done(d)),
}) => {
return Ok(ChatResponse {
text: d.full_text,
events,
input_tokens: d.input_tokens,
output_tokens: d.output_tokens,
});
}
Some(ServerMessage {
payload: Some(server_message::Payload::ToolCall(tc)),
}) => {
if tc.is_local {
// Emit ToolCall event — agent handles approval + execution
events.push(ChatEvent::ToolCall {
call_id: tc.call_id.clone(),
name: tc.name.clone(),
args: tc.args_json.clone(),
needs_approval: tc.needs_approval,
});
// Execute immediately for now — approval is handled
// by the agent layer which wraps this method.
// When approval flow is active, the agent will call
// execute + send_tool_result separately.
let result =
super::tools::execute(&tc.name, &tc.args_json, &self.project_path);
self.tx
.send(ClientMessage {
payload: Some(client_message::Payload::ToolResult(ToolResult {
call_id: tc.call_id,
result,
is_error: false,
})),
})
.await?;
} else {
events.push(ChatEvent::ToolStart {
name: format!("{} (server)", tc.name),
detail: String::new(),
});
}
}
Some(ServerMessage {
payload: Some(server_message::Payload::Status(s)),
}) => {
events.push(ChatEvent::Status(s.message));
}
Some(ServerMessage {
payload: Some(server_message::Payload::Error(e)),
}) => {
if e.fatal {
anyhow::bail!("Fatal error: {}", e.message);
}
events.push(ChatEvent::Error(e.message));
}
Some(ServerMessage {
payload: Some(server_message::Payload::End(_)),
}) => {
return Ok(ChatResponse {
text: "Session ended by server.".into(),
events,
input_tokens: 0,
output_tokens: 0,
});
}
Some(_) => continue,
None => anyhow::bail!("Stream closed unexpectedly"),
}
}
}
/// End the session.
pub async fn end(&self) -> anyhow::Result<()> {
self.tx
.send(ClientMessage {
payload: Some(client_message::Payload::End(EndSession {})),
})
.await?;
Ok(())
}
}
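The turn protocol `CodeSession::chat` drives above can be restated as a small standalone state machine: deltas stream in, local tool calls are executed and answered, and `Done` carries the full text that ends the turn. All names below are illustrative stand-ins, not the crate's proto types.

```rust
// Illustrative sketch of the chat turn loop: Delta is streamed text,
// ToolCall is executed locally, Done ends the turn with the full text.
enum Server {
    Delta(String),
    ToolCall(String),
    Done(String),
}

fn run_turn(messages: Vec<Server>) -> (String, Vec<String>) {
    let mut tool_results = Vec::new();
    for msg in messages {
        match msg {
            // Streaming text; the final text arrives with Done.
            Server::Delta(_) => {}
            // A local tool call is executed and its result recorded.
            Server::ToolCall(name) => tool_results.push(format!("ran {name}")),
            // Done carries the full assistant text and ends the turn.
            Server::Done(full_text) => return (full_text, tool_results),
        }
    }
    (String::new(), tool_results)
}

fn main() {
    let (text, tools) = run_turn(vec![
        Server::Delta("hel".into()),
        Server::ToolCall("grep".into()),
        Server::Done("hello".into()),
    ]);
    assert_eq!(text, "hello");
    assert_eq!(tools, vec!["ran grep".to_string()]);
    println!("{text}");
}
```

In the real client the loop is driven by a gRPC stream and tool results are sent back over `tx`; the sketch only captures the control flow.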

sunbeam/src/code/config.rs

@@ -0,0 +1,146 @@
use serde::Deserialize;
/// Project-level configuration from .sunbeam/config.toml.
#[derive(Debug, Default, Deserialize)]
pub struct ProjectConfig {
#[serde(default)]
pub model: Option<ModelConfig>,
#[serde(default)]
pub tools: Option<ToolPermissions>,
}
#[derive(Debug, Deserialize)]
pub struct ModelConfig {
pub name: Option<String>,
}
#[derive(Debug, Default, Deserialize)]
pub struct ToolPermissions {
#[serde(default)]
pub file_read: Option<String>,
#[serde(default)]
pub file_write: Option<String>,
#[serde(default)]
pub search_replace: Option<String>,
#[serde(default)]
pub grep: Option<String>,
#[serde(default)]
pub bash: Option<String>,
#[serde(default)]
pub list_directory: Option<String>,
}
/// Convenience wrapper with flattened fields.
pub struct LoadedConfig {
pub model_name: Option<String>,
pub file_read_perm: String,
pub file_write_perm: String,
pub search_replace_perm: String,
pub grep_perm: String,
pub bash_perm: String,
pub list_directory_perm: String,
}
impl Default for LoadedConfig {
fn default() -> Self {
Self {
model_name: None,
file_read_perm: "always".into(),
file_write_perm: "ask".into(),
search_replace_perm: "ask".into(),
grep_perm: "always".into(),
bash_perm: "ask".into(),
list_directory_perm: "always".into(),
}
}
}
impl LoadedConfig {
/// Get the permission level for a tool. Returns "always", "ask", or "never".
pub fn permission_for(&self, tool_name: &str) -> &str {
match tool_name {
"file_read" => &self.file_read_perm,
"file_write" => &self.file_write_perm,
"search_replace" => &self.search_replace_perm,
"grep" => &self.grep_perm,
"bash" => &self.bash_perm,
"list_directory" => &self.list_directory_perm,
_ => "ask", // unknown tools default to ask
}
}
/// Upgrade a tool's permission to "always" for this session (in-memory only).
pub fn upgrade_to_always(&mut self, tool_name: &str) {
let target = match tool_name {
"file_read" => &mut self.file_read_perm,
"file_write" => &mut self.file_write_perm,
"search_replace" => &mut self.search_replace_perm,
"grep" => &mut self.grep_perm,
"bash" => &mut self.bash_perm,
"list_directory" => &mut self.list_directory_perm,
_ => return,
};
*target = "always".into();
}
}
/// Load project config from .sunbeam/config.toml.
pub fn load_project_config(project_path: &str) -> LoadedConfig {
let config_path = std::path::Path::new(project_path)
.join(".sunbeam")
.join("config.toml");
let raw = match std::fs::read_to_string(&config_path) {
Ok(s) => s,
Err(_) => return LoadedConfig::default(),
};
let parsed: ProjectConfig = match toml::from_str(&raw) {
Ok(c) => c,
Err(e) => {
eprintln!("warning: failed to parse .sunbeam/config.toml: {e}");
return LoadedConfig::default();
}
};
let tools = parsed.tools.unwrap_or_default();
LoadedConfig {
model_name: parsed.model.and_then(|m| m.name),
file_read_perm: tools.file_read.unwrap_or_else(|| "always".into()),
file_write_perm: tools.file_write.unwrap_or_else(|| "ask".into()),
search_replace_perm: tools.search_replace.unwrap_or_else(|| "ask".into()),
grep_perm: tools.grep.unwrap_or_else(|| "always".into()),
bash_perm: tools.bash.unwrap_or_else(|| "ask".into()),
list_directory_perm: tools.list_directory.unwrap_or_else(|| "always".into()),
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_default_config() {
let cfg = LoadedConfig::default();
assert_eq!(cfg.file_read_perm, "always");
assert_eq!(cfg.file_write_perm, "ask");
assert_eq!(cfg.bash_perm, "ask");
assert!(cfg.model_name.is_none());
}
#[test]
fn test_parse_config() {
let toml = r#"
[model]
name = "devstral-2"
[tools]
file_read = "always"
bash = "never"
"#;
let parsed: ProjectConfig = toml::from_str(toml).unwrap();
assert_eq!(parsed.model.unwrap().name.unwrap(), "devstral-2");
assert_eq!(parsed.tools.unwrap().bash.unwrap(), "never");
}
}
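The precedence `load_project_config` applies above (configured value if present, otherwise the built-in default) can be sketched as one tiny standalone function; the tool names in the comments are the ones from `ToolPermissions`.

```rust
// Minimal restatement of the per-tool permission fallback in load_project_config:
// take the value from .sunbeam/config.toml when set, else the built-in default.
fn effective_perm(configured: Option<&str>, default: &str) -> String {
    configured.map(str::to_string).unwrap_or_else(|| default.to_string())
}

fn main() {
    // bash = "never" in config overrides the "ask" default
    assert_eq!(effective_perm(Some("never"), "ask"), "never");
    // grep absent from config falls back to its "always" default
    assert_eq!(effective_perm(None, "always"), "always");
    println!("ok");
}
```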


@@ -0,0 +1,205 @@
//! Low-level LSP client — JSON-RPC framing over subprocess stdio.
use std::collections::HashMap;
use std::sync::atomic::{AtomicI64, Ordering};
use std::sync::Arc;
use tokio::io::{AsyncBufReadExt, AsyncReadExt, AsyncWriteExt, BufReader};
use tokio::process::{Child, ChildStdin, ChildStdout};
use tokio::sync::{oneshot, Mutex};
use tracing::{debug, warn};
/// A low-level LSP client connected to a language server via stdio.
pub struct LspClient {
child: Child,
stdin: ChildStdin,
next_id: Arc<AtomicI64>,
pending: Arc<Mutex<HashMap<i64, oneshot::Sender<serde_json::Value>>>>,
_reader_handle: tokio::task::JoinHandle<()>,
}
impl LspClient {
/// Spawn a language server subprocess.
pub async fn spawn(binary: &str, args: &[String], cwd: &str) -> anyhow::Result<Self> {
use std::process::Stdio;
use tokio::process::Command;
let mut child = Command::new(binary)
.args(args)
.current_dir(cwd)
.stdin(Stdio::piped())
.stdout(Stdio::piped())
.stderr(Stdio::null())
.kill_on_drop(true)
.spawn()
.map_err(|e| anyhow::anyhow!("Failed to spawn {binary}: {e}"))?;
let stdin = child.stdin.take().ok_or_else(|| anyhow::anyhow!("No stdin"))?;
let stdout = child.stdout.take().ok_or_else(|| anyhow::anyhow!("No stdout"))?;
let pending: Arc<Mutex<HashMap<i64, oneshot::Sender<serde_json::Value>>>> =
Arc::new(Mutex::new(HashMap::new()));
let pending_for_reader = pending.clone();
let _reader_handle = tokio::spawn(async move {
if let Err(e) = read_loop(stdout, pending_for_reader).await {
debug!("LSP read loop ended: {e}");
}
});
Ok(Self {
child,
stdin,
next_id: Arc::new(AtomicI64::new(1)),
pending,
_reader_handle,
})
}
/// Send a request and wait for the response.
pub async fn request(
&mut self,
method: &str,
params: serde_json::Value,
) -> anyhow::Result<serde_json::Value> {
let id = self.next_id.fetch_add(1, Ordering::SeqCst);
let message = serde_json::json!({
"jsonrpc": "2.0",
"id": id,
"method": method,
"params": params,
});
let (tx, rx) = oneshot::channel();
self.pending.lock().await.insert(id, tx);
self.send_framed(&message).await?;
let result = tokio::time::timeout(
std::time::Duration::from_secs(30),
rx,
)
.await
.map_err(|_| anyhow::anyhow!("LSP request timed out: {method}"))?
.map_err(|_| anyhow::anyhow!("LSP response channel dropped"))?;
Ok(result)
}
/// Send a notification (no response expected).
pub async fn notify(
&mut self,
method: &str,
params: serde_json::Value,
) -> anyhow::Result<()> {
let message = serde_json::json!({
"jsonrpc": "2.0",
"method": method,
"params": params,
});
self.send_framed(&message).await
}
/// Send with LSP Content-Length framing.
async fn send_framed(&mut self, message: &serde_json::Value) -> anyhow::Result<()> {
let body = serde_json::to_string(message)?;
let frame = format!("Content-Length: {}\r\n\r\n{}", body.len(), body);
self.stdin.write_all(frame.as_bytes()).await?;
self.stdin.flush().await?;
Ok(())
}
/// Shutdown the language server gracefully.
pub async fn shutdown(&mut self) {
// Send shutdown request
let _ = self.request("shutdown", serde_json::json!(null)).await;
// Send exit notification
let _ = self.notify("exit", serde_json::json!(null)).await;
// Wait briefly then kill
tokio::time::sleep(std::time::Duration::from_millis(500)).await;
let _ = self.child.kill().await;
}
}
/// Background read loop: parse LSP framed messages from stdout.
async fn read_loop(
stdout: ChildStdout,
pending: Arc<Mutex<HashMap<i64, oneshot::Sender<serde_json::Value>>>>,
) -> anyhow::Result<()> {
let mut reader = BufReader::new(stdout);
let mut header_line = String::new();
loop {
// Read Content-Length header
header_line.clear();
let bytes_read = reader.read_line(&mut header_line).await?;
if bytes_read == 0 {
break; // EOF
}
let content_length = if header_line.starts_with("Content-Length:") {
header_line
.split(':')
.nth(1)
.and_then(|s| s.trim().parse::<usize>().ok())
.unwrap_or(0)
} else {
continue; // skip non-header lines
};
if content_length == 0 {
continue;
}
// Skip remaining headers until blank line
loop {
header_line.clear();
reader.read_line(&mut header_line).await?;
if header_line.trim().is_empty() {
break;
}
}
// Read the JSON body
let mut body = vec![0u8; content_length];
reader.read_exact(&mut body).await?;
let message: serde_json::Value = match serde_json::from_slice(&body) {
Ok(m) => m,
Err(e) => {
warn!("Failed to parse LSP message: {e}");
continue;
}
};
// Route responses to pending requests
if let Some(id) = message.get("id").and_then(|v| v.as_i64()) {
let result = if let Some(err) = message.get("error") {
// LSP error response
serde_json::json!({ "error": err })
} else {
message.get("result").cloned().unwrap_or(serde_json::Value::Null)
};
if let Some(tx) = pending.lock().await.remove(&id) {
let _ = tx.send(result);
}
}
// Server notifications (diagnostics, progress, etc.) are silently dropped for now
// TODO: capture publishDiagnostics
}
Ok(())
}
#[cfg(test)]
mod tests {
#[test]
fn test_framing_format() {
let body = r#"{"jsonrpc":"2.0","id":1,"method":"initialize"}"#;
let frame = format!("Content-Length: {}\r\n\r\n{}", body.len(), body);
assert!(frame.starts_with("Content-Length: 46\r\n\r\n"));
assert!(frame.ends_with("}"));
}
}
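The base-protocol framing used by `send_framed` and `read_loop` above round-trips like this; a standalone sketch using only the standard library, with the same `Content-Length` header the client writes and parses.

```rust
// Build an LSP frame the way send_framed does: Content-Length in bytes,
// a blank line, then the JSON body.
fn build_frame(body: &str) -> String {
    format!("Content-Length: {}\r\n\r\n{}", body.len(), body)
}

// Parse the length back out of a header line, mirroring read_loop.
fn parse_content_length(header_line: &str) -> Option<usize> {
    header_line
        .strip_prefix("Content-Length:")
        .and_then(|rest| rest.trim().parse().ok())
}

fn main() {
    let body = r#"{"jsonrpc":"2.0","id":1,"method":"shutdown"}"#;
    let frame = build_frame(body);
    let header = frame.split("\r\n").next().unwrap();
    assert_eq!(parse_content_length(header), Some(body.len()));
    println!("{header}");
}
```

Note that `body.len()` is a byte count, which is what the LSP base protocol requires; for ASCII JSON it coincides with the character count.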


@@ -0,0 +1,97 @@
//! Language server detection — auto-detect which LSP servers to spawn.
use std::path::Path;
/// Configuration for a language server to spawn.
#[derive(Debug, Clone)]
pub struct LspServerConfig {
/// Language identifier (e.g., "rust", "typescript", "python").
pub language_id: String,
/// Binary name to spawn (must be on PATH).
pub binary: String,
/// Arguments to pass (typically ["--stdio"]).
pub args: Vec<String>,
/// File extensions this server handles.
pub extensions: Vec<String>,
}
/// Detect which language servers should be spawned for a project.
pub fn detect_servers(project_root: &str) -> Vec<LspServerConfig> {
let root = Path::new(project_root);
let mut configs = Vec::new();
if root.join("Cargo.toml").exists() {
configs.push(LspServerConfig {
language_id: "rust".into(),
binary: "rust-analyzer".into(),
args: vec![],
extensions: vec!["rs".into()],
});
}
if root.join("package.json").exists() || root.join("tsconfig.json").exists() {
configs.push(LspServerConfig {
language_id: "typescript".into(),
binary: "typescript-language-server".into(),
args: vec!["--stdio".into()],
extensions: vec!["ts".into(), "tsx".into(), "js".into(), "jsx".into()],
});
}
if root.join("pyproject.toml").exists()
|| root.join("setup.py").exists()
|| root.join("requirements.txt").exists()
{
configs.push(LspServerConfig {
language_id: "python".into(),
binary: "pyright-langserver".into(),
args: vec!["--stdio".into()],
extensions: vec!["py".into()],
});
}
if root.join("go.mod").exists() {
configs.push(LspServerConfig {
language_id: "go".into(),
binary: "gopls".into(),
args: vec!["serve".into()],
extensions: vec!["go".into()],
});
}
configs
}
/// Get the language ID for a file extension.
pub fn language_for_extension(ext: &str) -> Option<&'static str> {
match ext {
"rs" => Some("rust"),
"ts" | "tsx" | "js" | "jsx" => Some("typescript"),
"py" => Some("python"),
"go" => Some("go"),
_ => None,
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_language_for_extension() {
assert_eq!(language_for_extension("rs"), Some("rust"));
assert_eq!(language_for_extension("ts"), Some("typescript"));
assert_eq!(language_for_extension("py"), Some("python"));
assert_eq!(language_for_extension("go"), Some("go"));
assert_eq!(language_for_extension("md"), None);
}
#[test]
fn test_detect_servers_rust_project() {
// This test runs in the cli-worktree which has Cargo.toml
let configs = detect_servers(".");
let rust = configs.iter().find(|c| c.language_id == "rust");
assert!(rust.is_some(), "Should detect Rust project");
assert_eq!(rust.unwrap().binary, "rust-analyzer");
}
}
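The marker-file heuristic `detect_servers` uses above (a build manifest implies a language server) can be exercised standalone; the scratch directory name below is arbitrary.

```rust
use std::path::Path;

// Standalone restatement of the detection heuristic: map marker files
// to language IDs and keep the ones present under the project root.
fn detect_languages(root: &Path) -> Vec<&'static str> {
    let markers = [
        ("Cargo.toml", "rust"),
        ("package.json", "typescript"),
        ("pyproject.toml", "python"),
        ("go.mod", "go"),
    ];
    markers
        .iter()
        .filter(|(file, _)| root.join(file).exists())
        .map(|&(_, lang)| lang)
        .collect()
}

fn main() {
    // Hypothetical scratch directory; any dir containing a Cargo.toml works.
    let dir = std::env::temp_dir().join("lsp-detect-demo");
    std::fs::create_dir_all(&dir).unwrap();
    std::fs::write(dir.join("Cargo.toml"), "[package]\n").unwrap();
    assert!(detect_languages(&dir).contains(&"rust"));
    println!("{:?}", detect_languages(&dir));
}
```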


@@ -0,0 +1,388 @@
//! LSP manager — spawns and manages language servers for a project.
//!
//! Provides high-level tool methods (definition, references, hover, etc.)
//! that Sol calls via the client tool dispatch.
use std::collections::HashMap;
use std::path::Path;
use tracing::{info, warn};
use super::client::LspClient;
use super::detect::{self, LspServerConfig};
/// Manages LSP servers for a coding session.
pub struct LspManager {
servers: HashMap<String, LspClient>, // language_id -> client
configs: Vec<LspServerConfig>,
project_root: String,
initialized: bool,
}
impl LspManager {
/// Create a new manager. Does NOT spawn servers yet — call `initialize()`.
pub fn new(project_root: &str) -> Self {
let configs = detect::detect_servers(project_root);
// Canonicalize so Url::from_file_path works
let abs_root = std::fs::canonicalize(project_root)
.map(|p| p.to_string_lossy().to_string())
.unwrap_or_else(|_| project_root.to_string());
Self {
servers: HashMap::new(),
configs,
project_root: abs_root,
initialized: false,
}
}
/// Spawn and initialize all detected language servers.
pub async fn initialize(&mut self) {
let configs = self.configs.clone();
for config in &configs {
match LspClient::spawn(&config.binary, &config.args, &self.project_root).await {
Ok(mut client) => {
// Send initialize request
let root_uri = url::Url::from_file_path(&self.project_root)
.unwrap_or_else(|_| url::Url::parse("file:///").unwrap());
let init_params = serde_json::json!({
"processId": std::process::id(),
"rootUri": root_uri.as_str(),
"capabilities": {
"textDocument": {
"definition": { "dynamicRegistration": false },
"references": { "dynamicRegistration": false },
"hover": { "contentFormat": ["markdown", "plaintext"] },
"documentSymbol": { "dynamicRegistration": false },
"publishDiagnostics": { "relatedInformation": true }
},
"workspace": {
"symbol": { "dynamicRegistration": false }
}
}
});
match client.request("initialize", init_params).await {
Ok(_) => {
let _ = client.notify("initialized", serde_json::json!({})).await;
info!(lang = config.language_id.as_str(), binary = config.binary.as_str(), "LSP server initialized");
self.servers.insert(config.language_id.clone(), client);
}
Err(e) => {
warn!(lang = config.language_id.as_str(), "LSP initialize failed: {e}");
}
}
}
Err(e) => {
warn!(
lang = config.language_id.as_str(),
binary = config.binary.as_str(),
"LSP server not available: {e}"
);
}
}
}
self.initialized = true;
}
/// Check if any LSP server is available.
pub fn is_available(&self) -> bool {
!self.servers.is_empty()
}
/// Get the server for a file path (by extension).
fn server_for_file(&mut self, path: &str) -> Option<&mut LspClient> {
let ext = Path::new(path).extension()?.to_str()?;
let lang = detect::language_for_extension(ext)?;
self.servers.get_mut(lang)
}
/// Ensure a file is opened in the LSP server (lazy didOpen).
async fn ensure_file_open(&mut self, path: &str) -> anyhow::Result<()> {
let abs_path = if Path::new(path).is_absolute() {
path.to_string()
} else {
format!("{}/{}", self.project_root, path)
};
let uri = url::Url::from_file_path(&abs_path)
.map_err(|_| anyhow::anyhow!("Invalid file path: {abs_path}"))?;
let content = std::fs::read_to_string(&abs_path)?;
let ext = Path::new(path).extension().and_then(|e| e.to_str()).unwrap_or("");
let lang_id = detect::language_for_extension(ext).unwrap_or("plaintext");
if let Some(server) = self.server_for_file(path) {
server.notify("textDocument/didOpen", serde_json::json!({
"textDocument": {
"uri": uri.as_str(),
"languageId": lang_id,
"version": 1,
"text": content,
}
})).await?;
}
Ok(())
}
fn make_uri(&self, path: &str) -> anyhow::Result<url::Url> {
let abs = if Path::new(path).is_absolute() {
path.to_string()
} else {
format!("{}/{}", self.project_root, path)
};
url::Url::from_file_path(&abs)
.map_err(|_| anyhow::anyhow!("Invalid path: {abs}"))
}
// ── Tool methods ────────────────────────────────────────────────────
/// Go to definition at file:line:column.
pub async fn definition(&mut self, path: &str, line: u32, column: u32) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/definition", serde_json::json!({
"textDocument": { "uri": uri.as_str() },
"position": { "line": line.saturating_sub(1), "character": column.saturating_sub(1) }
})).await?;
format_locations(&result, &self.project_root)
}
/// Find all references to symbol at file:line:column.
pub async fn references(&mut self, path: &str, line: u32, column: u32) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/references", serde_json::json!({
"textDocument": { "uri": uri.as_str() },
"position": { "line": line.saturating_sub(1), "character": column.saturating_sub(1) },
"context": { "includeDeclaration": true }
})).await?;
format_locations(&result, &self.project_root)
}
/// Get hover documentation at file:line:column.
pub async fn hover(&mut self, path: &str, line: u32, column: u32) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/hover", serde_json::json!({
"textDocument": { "uri": uri.as_str() },
"position": { "line": line.saturating_sub(1), "character": column.saturating_sub(1) }
})).await?;
if result.is_null() {
return Ok("No hover information available.".into());
}
// Extract markdown content from hover result
let contents = &result["contents"];
if let Some(value) = contents.get("value").and_then(|v| v.as_str()) {
Ok(value.to_string())
} else if let Some(s) = contents.as_str() {
Ok(s.to_string())
} else {
Ok(serde_json::to_string_pretty(&result)?)
}
}
/// Get document symbols (outline) for a file.
pub async fn document_symbols(&mut self, path: &str) -> anyhow::Result<String> {
let _ = self.ensure_file_open(path).await;
let uri = self.make_uri(path)?;
let server = self.server_for_file(path)
.ok_or_else(|| anyhow::anyhow!("No LSP server for {path}"))?;
let result = server.request("textDocument/documentSymbol", serde_json::json!({
"textDocument": { "uri": uri.as_str() }
})).await?;
format_symbols(&result)
}
/// Workspace-wide symbol search.
pub async fn workspace_symbols(&mut self, query: &str, lang: Option<&str>) -> anyhow::Result<String> {
// Use the server for `lang` if given, otherwise the first available one
let server = if let Some(lang) = lang {
self.servers.get_mut(lang)
} else {
self.servers.values_mut().next()
}
.ok_or_else(|| anyhow::anyhow!("No LSP server available"))?;
let result = server.request("workspace/symbol", serde_json::json!({
"query": query
})).await?;
format_symbols(&result)
}
/// Shutdown all servers.
pub async fn shutdown(&mut self) {
for (lang, mut server) in self.servers.drain() {
info!(lang = lang.as_str(), "Shutting down LSP server");
server.shutdown().await;
}
}
}
/// Format LSP location results as readable text.
fn format_locations(result: &serde_json::Value, project_root: &str) -> anyhow::Result<String> {
let locations = if result.is_array() {
result.as_array().unwrap().clone()
} else if result.is_object() {
vec![result.clone()]
} else if result.is_null() {
return Ok("No results found.".into());
} else {
return Ok(format!("{result}"));
};
if locations.is_empty() {
return Ok("No results found.".into());
}
let mut lines = Vec::new();
for loc in &locations {
let uri = loc.get("uri").or_else(|| loc.get("targetUri"))
.and_then(|v| v.as_str())
.unwrap_or("?");
let range = loc.get("range").or_else(|| loc.get("targetRange"));
let line = range.and_then(|r| r["start"]["line"].as_u64()).unwrap_or(0) + 1;
let col = range.and_then(|r| r["start"]["character"].as_u64()).unwrap_or(0) + 1;
// Strip file:// prefix and project root for readability
let path = uri.strip_prefix("file://").unwrap_or(uri);
let rel_path = path.strip_prefix(project_root).unwrap_or(path);
let rel_path = rel_path.strip_prefix('/').unwrap_or(rel_path);
lines.push(format!("{rel_path}:{line}:{col}"));
}
Ok(lines.join("\n"))
}
/// Format LSP symbol results.
fn format_symbols(result: &serde_json::Value) -> anyhow::Result<String> {
let symbols = result.as_array().ok_or_else(|| anyhow::anyhow!("Expected array"))?;
if symbols.is_empty() {
return Ok("No symbols found.".into());
}
let mut lines = Vec::new();
for sym in symbols {
let name = sym.get("name").and_then(|v| v.as_str()).unwrap_or("?");
let kind_num = sym.get("kind").and_then(|v| v.as_u64()).unwrap_or(0);
let kind = symbol_kind_name(kind_num);
if let Some(loc) = sym.get("location") {
let line = loc["range"]["start"]["line"].as_u64().unwrap_or(0) + 1;
lines.push(format!("{kind} {name} (line {line})"));
} else if let Some(range) = sym.get("range") {
let line = range["start"]["line"].as_u64().unwrap_or(0) + 1;
lines.push(format!("{kind} {name} (line {line})"));
} else {
lines.push(format!("{kind} {name}"));
}
// Recurse into children (DocumentSymbol)
if let Some(children) = sym.get("children").and_then(|c| c.as_array()) {
for child in children {
let cname = child.get("name").and_then(|v| v.as_str()).unwrap_or("?");
let ckind = symbol_kind_name(child.get("kind").and_then(|v| v.as_u64()).unwrap_or(0));
let cline = child.get("range").and_then(|r| r["start"]["line"].as_u64()).unwrap_or(0) + 1;
lines.push(format!(" {ckind} {cname} (line {cline})"));
}
}
}
Ok(lines.join("\n"))
}
fn symbol_kind_name(kind: u64) -> &'static str {
match kind {
1 => "file",
2 => "module",
3 => "namespace",
4 => "package",
5 => "class",
6 => "method",
7 => "property",
8 => "field",
9 => "constructor",
10 => "enum",
11 => "interface",
12 => "function",
13 => "variable",
14 => "constant",
15 => "string",
16 => "number",
17 => "boolean",
18 => "array",
19 => "object",
20 => "key",
21 => "null",
22 => "enum_member",
23 => "struct",
24 => "event",
25 => "operator",
26 => "type_parameter",
_ => "unknown",
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_symbol_kind_names() {
assert_eq!(symbol_kind_name(12), "function");
assert_eq!(symbol_kind_name(5), "class");
assert_eq!(symbol_kind_name(23), "struct");
assert_eq!(symbol_kind_name(10), "enum");
assert_eq!(symbol_kind_name(999), "unknown");
}
#[test]
fn test_format_locations_empty() {
let result = serde_json::json!([]);
let formatted = format_locations(&result, "/project").unwrap();
assert_eq!(formatted, "No results found.");
}
#[test]
fn test_format_locations_single() {
let result = serde_json::json!([{
"uri": "file:///project/src/main.rs",
"range": { "start": { "line": 9, "character": 3 }, "end": { "line": 9, "character": 10 } }
}]);
let formatted = format_locations(&result, "/project").unwrap();
assert_eq!(formatted, "src/main.rs:10:4");
}
#[test]
fn test_format_symbols() {
let result = serde_json::json!([
{ "name": "main", "kind": 12, "range": { "start": { "line": 0 }, "end": { "line": 5 } } },
{ "name": "Config", "kind": 23, "range": { "start": { "line": 10 }, "end": { "line": 20 } } }
]);
let formatted = format_symbols(&result).unwrap();
assert!(formatted.contains("function main (line 1)"));
assert!(formatted.contains("struct Config (line 11)"));
}
}
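The URI cleanup inside `format_locations` above (strip `file://`, then the project root, then any leading slash) is worth seeing in isolation, since it is what turns raw LSP locations into the `src/main.rs:10:4` lines the tests assert on.

```rust
// Standalone sketch of the path cleanup in format_locations: reduce a
// file:// URI to a project-relative path for readable output.
fn relativize(uri: &str, project_root: &str) -> String {
    let path = uri.strip_prefix("file://").unwrap_or(uri);
    let rel = path.strip_prefix(project_root).unwrap_or(path);
    rel.strip_prefix('/').unwrap_or(rel).to_string()
}

fn main() {
    assert_eq!(relativize("file:///project/src/main.rs", "/project"), "src/main.rs");
    // Paths outside the project root keep their structure, minus the leading slash.
    assert_eq!(relativize("/other/file.rs", "/project"), "other/file.rs");
    println!("{}", relativize("file:///project/src/main.rs", "/project"));
}
```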


@@ -0,0 +1,8 @@
//! LSP client — spawns language servers and queries them for code intelligence.
//!
//! Manages per-language LSP subprocesses. Provides tools for Sol:
//! lsp_definition, lsp_references, lsp_hover, lsp_diagnostics, lsp_symbols.
pub mod client;
pub mod detect;
pub mod manager;

sunbeam/src/code/mod.rs

@@ -0,0 +1,494 @@
pub mod agent;
pub mod client;
pub mod config;
pub mod lsp;
pub mod project;
pub mod symbols;
pub mod tools;
pub mod tui;
use clap::Subcommand;
use tracing::info;
#[derive(Subcommand, Debug)]
pub enum CodeCommand {
/// Start a coding session (default — can omit subcommand)
Start {
/// Model override (e.g., devstral-small-latest)
#[arg(long)]
model: Option<String>,
/// Sol gRPC endpoint (default: http://127.0.0.1:50051)
#[arg(long)]
endpoint: Option<String>,
/// Connect to localhost:50051 (dev mode)
#[arg(long, hide = true)]
localhost: bool,
},
/// Demo the TUI with sample data (no Sol connection needed)
#[command(hide = true)]
Demo,
}
pub async fn cmd_code(cmd: Option<CodeCommand>) -> sunbeam_sdk::error::Result<()> {
cmd_code_inner(cmd).await.map_err(|e| sunbeam_sdk::error::SunbeamError::Other(e.to_string()))
}
/// Install a tracing subscriber that writes to a LogBuffer instead of stderr.
/// Returns the guard — when dropped, the subscriber is unset.
fn install_tui_tracing(log_buffer: &tui::LogBuffer) -> tracing::subscriber::DefaultGuard {
use tracing_subscriber::fmt;
use tracing_subscriber::EnvFilter;
let subscriber = fmt::Subscriber::builder()
.with_env_filter(
EnvFilter::try_from_default_env()
.unwrap_or_else(|_| EnvFilter::new("sunbeam=info,sunbeam_sdk=info,warn")),
)
.with_target(false)
.with_ansi(false)
.with_writer(log_buffer.clone())
.finish();
tracing::subscriber::set_default(subscriber)
}
async fn cmd_code_inner(cmd: Option<CodeCommand>) -> anyhow::Result<()> {
let cmd = cmd.unwrap_or(CodeCommand::Start {
model: None,
endpoint: None,
localhost: false,
});
match cmd {
CodeCommand::Demo => {
return run_demo().await;
}
CodeCommand::Start { model, endpoint, localhost } => {
let endpoint = if localhost {
"http://127.0.0.1:50051".into()
} else {
endpoint.unwrap_or_else(|| "http://127.0.0.1:50051".into())
};
// Discover project context
let project = project::discover_project(".")?;
info!(
project = project.name.as_str(),
path = project.path.as_str(),
branch = project.git_branch.as_deref().unwrap_or("?"),
"Discovered project"
);
// Load project config
let cfg = config::load_project_config(&project.path);
let model = model
.or(cfg.model_name.clone())
.unwrap_or_else(|| "mistral-medium-latest".into());
// Connect to Sol
let mut session = client::connect(&endpoint, &project, &cfg, &model).await?;
info!(
session_id = session.session_id.as_str(),
room_id = session.room_id.as_str(),
model = session.model.as_str(),
resumed = session.resumed,
"Connected to Sol"
);
let resumed = session.resumed;
let history: Vec<_> = std::mem::take(&mut session.history);
// Switch tracing to in-memory buffer before entering TUI
let log_buffer = tui::LogBuffer::new();
let _guard = install_tui_tracing(&log_buffer);
// Spawn agent on background task
let project_path = project.path.clone();
let agent = agent::spawn(session, endpoint.clone(), cfg, project.path.clone());
// TUI event loop — never blocks on network I/O
use crossterm::event::{self, Event, KeyCode, KeyModifiers, MouseEventKind};
let mut terminal = tui::setup_terminal()?;
let branch = project.git_branch.as_deref().unwrap_or("?");
let mut app = tui::App::new(&project.name, branch, &model, log_buffer);
// Load persistent command history
app.load_history(&project_path);
// Load conversation history from resumed session (batch, single rebuild)
if resumed {
let entries: Vec<_> = history
.iter()
.filter_map(|msg| match msg.role.as_str() {
"user" => Some(tui::LogEntry::UserInput(msg.content.clone())),
"assistant" => Some(tui::LogEntry::AssistantText(msg.content.clone())),
_ => None,
})
.collect();
app.push_logs(entries);
}
let result = loop {
// 1. Process any pending agent events (non-blocking)
for evt in agent.poll_events() {
match evt {
agent::AgentEvent::ApprovalNeeded { call_id, name, args_summary } => {
app.approval = Some(tui::ApprovalPrompt {
call_id: call_id.clone(),
tool_name: name.clone(),
command: args_summary.clone(),
options: vec![
"yes".into(),
format!("yes, always allow {name}"),
"no".into(),
],
selected: 0,
});
app.needs_redraw = true;
}
agent::AgentEvent::Generating => {
app.is_thinking = true;
app.sol_status.clear();
app.thinking_message = tui::random_sol_status().to_string();
app.thinking_since = Some(std::time::Instant::now());
app.needs_redraw = true;
}
agent::AgentEvent::ToolExecuting { name, detail } => {
app.push_log(tui::LogEntry::ToolExecuting { name, detail });
}
agent::AgentEvent::ToolDone { name, success } => {
if success {
app.push_log(tui::LogEntry::ToolSuccess { name, detail: String::new() });
}
}
agent::AgentEvent::Status { message } => {
app.sol_status = message;
app.needs_redraw = true;
}
agent::AgentEvent::Response { text, input_tokens, output_tokens } => {
app.is_thinking = false;
app.sol_status.clear();
app.thinking_since = None;
app.last_turn_tokens = input_tokens + output_tokens;
app.input_tokens += input_tokens;
app.output_tokens += output_tokens;
app.push_log(tui::LogEntry::AssistantText(text));
}
agent::AgentEvent::Error { message } => {
app.is_thinking = false;
app.sol_status.clear();
app.thinking_since = None;
app.push_log(tui::LogEntry::Error(message));
}
agent::AgentEvent::Health { connected } => {
if app.sol_connected != connected {
app.sol_connected = connected;
app.needs_redraw = true;
}
}
agent::AgentEvent::SessionEnded => {
break;
}
}
}
// 2. Draw only when something changed (or animating)
if app.needs_redraw || app.is_thinking {
terminal.draw(|frame| tui::draw(frame, &mut app))?;
app.needs_redraw = false;
}
// 3. Handle input — shorter poll when animating
let poll_ms = if app.is_thinking { 100 } else { 50 };
if event::poll(std::time::Duration::from_millis(poll_ms))? {
// Drain all queued events in one batch (coalesces rapid scroll)
while event::poll(std::time::Duration::ZERO)? {
match event::read()? {
Event::Mouse(mouse) => {
match mouse.kind {
MouseEventKind::ScrollUp | MouseEventKind::ScrollDown => {
app.needs_redraw = true;
let size = terminal.size().unwrap_or_default();
let viewport_h = size.height.saturating_sub(5);
let delta: i16 = if matches!(mouse.kind, MouseEventKind::ScrollUp) { -3 } else { 3 };
if app.show_logs {
if delta < 0 {
app.log_scroll = if app.log_scroll == u16::MAX { u16::MAX.saturating_sub(3) } else { app.log_scroll.saturating_sub(3) };
} else {
app.log_scroll = app.log_scroll.saturating_add(3);
}
} else {
app.resolve_scroll(size.width, viewport_h);
if delta < 0 {
app.scroll_offset = app.scroll_offset.saturating_sub(3);
} else {
app.scroll_offset = app.scroll_offset.saturating_add(3);
}
}
}
_ => {} // Ignore MouseEventKind::Moved and other mouse events
}
}
Event::Key(key) => {
app.needs_redraw = true;
match key.code {
KeyCode::Char('c') if key.modifiers.contains(KeyModifiers::CONTROL) => {
agent.end();
app.should_quit = true;
break; // exit drain loop
}
KeyCode::Char('l') if key.modifiers.contains(KeyModifiers::ALT) => {
app.show_logs = !app.show_logs;
app.log_scroll = u16::MAX;
}
// Approval prompt navigation
KeyCode::Up if app.approval.is_some() => {
if let Some(ref mut a) = app.approval {
a.selected = a.selected.saturating_sub(1);
}
}
KeyCode::Down if app.approval.is_some() => {
if let Some(ref mut a) = app.approval {
a.selected = (a.selected + 1).min(a.options.len() - 1);
}
}
KeyCode::Enter if app.approval.is_some() => {
if let Some(a) = app.approval.take() {
let decision = match a.selected {
0 => agent::ApprovalDecision::Approved {
call_id: a.call_id.clone(),
},
1 => agent::ApprovalDecision::ApprovedAlways {
call_id: a.call_id.clone(),
tool_name: a.tool_name.clone(),
},
_ => agent::ApprovalDecision::Denied {
call_id: a.call_id.clone(),
},
};
agent.decide(decision);
}
}
KeyCode::Char(c) if !app.show_logs && app.approval.is_none() => {
app.history_index = None;
app.input.insert(app.cursor_pos, c);
app.cursor_pos += 1;
}
KeyCode::Backspace if !app.show_logs && app.approval.is_none() => {
if app.cursor_pos > 0 {
app.history_index = None;
app.cursor_pos -= 1;
app.input.remove(app.cursor_pos);
}
}
KeyCode::Left if !app.show_logs && app.approval.is_none() => app.cursor_pos = app.cursor_pos.saturating_sub(1),
KeyCode::Right if !app.show_logs && app.approval.is_none() => app.cursor_pos = (app.cursor_pos + 1).min(app.input.len()),
KeyCode::Up if !app.show_logs => {
if !app.command_history.is_empty() {
let idx = match app.history_index {
None => {
app.input_saved = app.input.clone();
app.command_history.len() - 1
}
Some(i) => i.saturating_sub(1),
};
app.history_index = Some(idx);
app.input = app.command_history[idx].clone();
app.cursor_pos = app.input.len();
}
}
KeyCode::Down if !app.show_logs => {
if let Some(idx) = app.history_index {
if idx + 1 < app.command_history.len() {
let new_idx = idx + 1;
app.history_index = Some(new_idx);
app.input = app.command_history[new_idx].clone();
app.cursor_pos = app.input.len();
} else {
app.history_index = None;
app.input = app.input_saved.clone();
app.cursor_pos = app.input.len();
}
}
}
KeyCode::Up if app.show_logs => {
app.log_scroll = if app.log_scroll == u16::MAX { u16::MAX.saturating_sub(1) } else { app.log_scroll.saturating_sub(1) };
}
KeyCode::Down if app.show_logs => {
app.log_scroll = app.log_scroll.saturating_add(1);
}
KeyCode::PageUp => {
let size = terminal.size().unwrap_or_default();
app.resolve_scroll(size.width, size.height.saturating_sub(5));
app.scroll_offset = app.scroll_offset.saturating_sub(20);
}
KeyCode::PageDown => {
let size = terminal.size().unwrap_or_default();
app.resolve_scroll(size.width, size.height.saturating_sub(5));
app.scroll_offset = app.scroll_offset.saturating_add(20);
}
KeyCode::Home => app.scroll_offset = 0,
KeyCode::End => app.scroll_offset = u16::MAX,
KeyCode::Enter if !app.show_logs && !app.is_thinking => {
if !app.input.is_empty() {
let text = app.input.clone();
app.command_history.push(text.clone());
app.history_index = None;
app.input.clear();
app.cursor_pos = 0;
if text == "/exit" {
agent.end();
app.should_quit = true;
break; // exit drain loop
}
app.push_log(tui::LogEntry::UserInput(text.clone()));
agent.chat(&text);
}
}
_ => {}
}
}
_ => {}
} // match event::read
} // while poll(ZERO)
} // if poll(poll_ms)
if app.should_quit {
break Ok(());
}
};
app.save_history(&project_path);
tui::restore_terminal(&mut terminal)?;
result
}
}
}
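The drain loop above coalesces a burst of queued events into a single redraw: everything already pending is consumed with a zero-duration poll, and the next `terminal.draw` happens once. A minimal sketch of that pattern, with a plain queue standing in for crossterm's event source (illustrative only, not the app's actual types):

```rust
use std::collections::VecDeque;

fn main() {
    // Stand-in for a burst of queued scroll events (deltas of ±3 rows).
    let mut queue: VecDeque<i16> = VecDeque::from([3, 3, -3, 3]);
    let mut offset: i16 = 0;
    let mut redraws = 0u32;

    // Drain everything that is already queued, then draw once.
    while let Some(delta) = queue.pop_front() {
        offset += delta;
    }
    redraws += 1;

    assert_eq!(offset, 6);
    assert_eq!(redraws, 1);
}
```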
async fn run_demo() -> anyhow::Result<()> {
use crossterm::event::{self, Event, KeyCode, KeyModifiers};
let log_buffer = tui::LogBuffer::new();
let _guard = install_tui_tracing(&log_buffer);
let mut terminal = tui::setup_terminal()?;
let mut app = tui::App::new("sol", "mainline ±", "devstral-small-latest", log_buffer);
// Populate with sample conversation
app.push_log(tui::LogEntry::UserInput("fix the token validation bug in auth.rs".into()));
app.push_log(tui::LogEntry::AssistantText(
"Looking at the auth module, I can see the issue on line 42 where the token \
is not properly validated before use. The expiry check is missing entirely."
.into(),
));
app.push_log(tui::LogEntry::ToolSuccess {
name: "file_read".into(),
detail: "src/auth.rs (127 lines)".into(),
});
app.push_log(tui::LogEntry::ToolOutput {
lines: vec![
"38│ fn validate_token(token: &str) -> bool {".into(),
"39│ let decoded = decode(token);".into(),
"40│ // BUG: missing expiry check".into(),
"41│ decoded.is_ok()".into(),
"42│ }".into(),
"43│".into(),
"44│ fn refresh_token(token: &str) -> Result<String> {".into(),
"45│ let client = reqwest::Client::new();".into(),
"46│ // ...".into(),
],
collapsed: true,
});
app.push_log(tui::LogEntry::ToolSuccess {
name: "search_replace".into(),
detail: "src/auth.rs — applied 1 replacement (line 41)".into(),
});
app.push_log(tui::LogEntry::ToolExecuting {
name: "bash".into(),
detail: "cargo test --lib".into(),
});
app.push_log(tui::LogEntry::ToolOutput {
lines: vec![
"running 23 tests".into(),
"test auth::tests::test_validate_token ... ok".into(),
"test auth::tests::test_expired_token ... ok".into(),
"test auth::tests::test_refresh_flow ... ok".into(),
"test result: ok. 23 passed; 0 failed".into(),
],
collapsed: false,
});
app.push_log(tui::LogEntry::AssistantText(
"Fixed. The token validation now checks expiry before use. All 23 tests pass."
.into(),
));
app.push_log(tui::LogEntry::UserInput("now add rate limiting to the auth endpoint".into()));
app.push_log(tui::LogEntry::ToolExecuting {
name: "file_read".into(),
detail: "src/routes/auth.rs".into(),
});
app.is_thinking = true;
app.input_tokens = 2400;
app.output_tokens = 890;
loop {
terminal.draw(|frame| tui::draw(frame, &mut app))?;
if event::poll(std::time::Duration::from_millis(100))? {
if let Event::Key(key) = event::read()? {
match key.code {
KeyCode::Char('c') if key.modifiers.contains(KeyModifiers::CONTROL) => break,
KeyCode::Char('q') => break,
KeyCode::Char('l') if key.modifiers.contains(KeyModifiers::ALT) => {
app.show_logs = !app.show_logs;
app.log_scroll = u16::MAX;
}
KeyCode::Char(c) => {
app.input.insert(app.cursor_pos, c);
app.cursor_pos += 1;
}
KeyCode::Backspace => {
if app.cursor_pos > 0 {
app.cursor_pos -= 1;
app.input.remove(app.cursor_pos);
}
}
KeyCode::Left => {
app.cursor_pos = app.cursor_pos.saturating_sub(1);
}
KeyCode::Right => {
app.cursor_pos = (app.cursor_pos + 1).min(app.input.len());
}
KeyCode::Enter => {
if !app.input.is_empty() {
let text = app.input.clone();
app.input.clear();
app.cursor_pos = 0;
if text == "/exit" {
break;
}
app.push_log(tui::LogEntry::UserInput(text));
app.is_thinking = true;
}
}
KeyCode::Up => {
app.scroll_offset = app.scroll_offset.saturating_sub(1);
}
KeyCode::Down => {
app.scroll_offset = app.scroll_offset.saturating_add(1);
}
_ => {}
}
}
}
}
tui::restore_terminal(&mut terminal)?;
Ok(())
}

sunbeam/src/code/project.rs Normal file

@@ -0,0 +1,131 @@
use std::path::{Path, PathBuf};
use std::process::Command;
/// Discovered project context sent to Sol on session start.
pub struct ProjectContext {
pub name: String,
pub path: String,
pub prompt_md: String,
pub config_toml: String,
pub git_branch: Option<String>,
pub git_status: Option<String>,
pub file_tree: Vec<String>,
}
/// Discover project context from the working directory.
pub fn discover_project(dir: &str) -> anyhow::Result<ProjectContext> {
let path = std::fs::canonicalize(dir)?;
let name = path
.file_name()
.and_then(|n| n.to_str())
.unwrap_or("unknown")
.to_string();
let prompt_md = read_optional(&path.join(".sunbeam").join("prompt.md"));
let config_toml = read_optional(&path.join(".sunbeam").join("config.toml"));
let git_branch = run_git(&path, &["rev-parse", "--abbrev-ref", "HEAD"]);
let git_status = run_git(&path, &["status", "--short"]);
let file_tree = list_tree(&path, 2);
Ok(ProjectContext {
name,
path: path.to_string_lossy().into(),
prompt_md,
config_toml,
git_branch,
git_status,
file_tree,
})
}
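For illustration, the first two discovery steps (canonicalize the directory, derive the project name from its final component) can be run standalone. This is a sketch of those steps only, not the crate's API:

```rust
fn main() -> std::io::Result<()> {
    // Mirror of discover_project's opening steps: resolve the directory
    // and take its final path component as the project name.
    let path = std::fs::canonicalize(".")?;
    let name = path
        .file_name()
        .and_then(|n| n.to_str())
        .unwrap_or("unknown");
    println!("project '{name}' at {}", path.display());
    Ok(())
}
```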
fn read_optional(path: &Path) -> String {
std::fs::read_to_string(path).unwrap_or_default()
}
fn run_git(dir: &Path, args: &[&str]) -> Option<String> {
Command::new("git")
.args(args)
.current_dir(dir)
.output()
.ok()
.filter(|o| o.status.success())
.map(|o| String::from_utf8_lossy(&o.stdout).trim().to_string())
}
fn list_tree(dir: &Path, max_depth: usize) -> Vec<String> {
let mut entries = Vec::new();
list_tree_inner(dir, dir, 0, max_depth, &mut entries);
entries
}
fn list_tree_inner(
base: &Path,
dir: &Path,
depth: usize,
max_depth: usize,
entries: &mut Vec<String>,
) {
if depth > max_depth {
return;
}
let Ok(read_dir) = std::fs::read_dir(dir) else {
return;
};
let mut items: Vec<_> = read_dir.filter_map(|e| e.ok()).collect();
items.sort_by_key(|e| e.file_name());
for entry in items {
let name = entry.file_name().to_string_lossy().to_string();
// Skip hidden dirs, target, node_modules, vendor
if name.starts_with('.') || name == "target" || name == "node_modules" || name == "vendor"
{
continue;
}
let relative = entry
.path()
.strip_prefix(base)
.unwrap_or(&entry.path())
.to_string_lossy()
.to_string();
entries.push(relative);
if entry.file_type().map(|t| t.is_dir()).unwrap_or(false) {
list_tree_inner(base, &entry.path(), depth + 1, max_depth, entries);
}
}
}
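The same depth-limited walk, stripped to its core as a standalone sketch (the skip rule here is reduced to hidden entries only; the real function also skips `target`, `node_modules`, and `vendor`):

```rust
use std::path::Path;

// Depth-limited recursive listing, mirroring list_tree_inner.
fn walk(dir: &Path, depth: usize, max_depth: usize, out: &mut Vec<String>) {
    if depth > max_depth {
        return;
    }
    let Ok(read_dir) = std::fs::read_dir(dir) else { return };
    for entry in read_dir.flatten() {
        let name = entry.file_name().to_string_lossy().to_string();
        if name.starts_with('.') {
            continue; // skip hidden entries, as above
        }
        let is_dir = entry.file_type().map(|t| t.is_dir()).unwrap_or(false);
        out.push(name);
        if is_dir {
            walk(&entry.path(), depth + 1, max_depth, out);
        }
    }
}

fn main() {
    let mut entries = Vec::new();
    walk(Path::new("."), 0, 1, &mut entries);
    println!("{} entries within depth 1", entries.len());
}
```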
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_discover_current_dir() {
// Should work in any directory
let ctx = discover_project(".").unwrap();
assert!(!ctx.name.is_empty());
assert!(!ctx.path.is_empty());
}
#[test]
fn test_list_tree_excludes_hidden() {
let dir = std::env::temp_dir().join("sunbeam-test-tree");
let _ = std::fs::create_dir_all(dir.join(".hidden"));
let _ = std::fs::create_dir_all(dir.join("visible"));
let _ = std::fs::write(dir.join("file.txt"), "test");
let tree = list_tree(&dir, 1);
assert!(tree.iter().any(|e| e == "visible"));
assert!(tree.iter().any(|e| e == "file.txt"));
assert!(!tree.iter().any(|e| e.contains(".hidden")));
let _ = std::fs::remove_dir_all(&dir);
}
}

sunbeam/src/code/symbols.rs Normal file

@@ -0,0 +1,659 @@
//! Symbol extraction from source code using tree-sitter.
//!
//! Extracts function signatures, struct/enum/trait definitions, and
//! docstrings from Rust, TypeScript, and Python files. These symbols
//! are sent to Sol for indexing in the code search index.
use std::path::Path;
use tracing::debug;
/// An extracted code symbol with file context.
#[derive(Debug, Clone)]
pub struct ProjectSymbol {
pub file_path: String, // relative to project root
pub name: String,
pub kind: String,
pub signature: String,
pub docstring: String,
pub start_line: u32,
pub end_line: u32,
pub language: String,
pub content: String,
}
/// Extract symbols from all source files in a project.
pub fn extract_project_symbols(project_root: &str) -> Vec<ProjectSymbol> {
let root = Path::new(project_root);
let mut symbols = Vec::new();
walk_directory(root, root, &mut symbols);
debug!(count = symbols.len(), "Extracted project symbols");
symbols
}
fn walk_directory(dir: &Path, root: &Path, symbols: &mut Vec<ProjectSymbol>) {
let Ok(entries) = std::fs::read_dir(dir) else { return };
for entry in entries.flatten() {
let path = entry.path();
let name = entry.file_name().to_string_lossy().to_string();
// Skip hidden, vendor, target, node_modules, etc.
if name.starts_with('.') || name == "target" || name == "vendor"
|| name == "node_modules" || name == "dist" || name == "build"
|| name == "__pycache__" || name == ".git"
{
continue;
}
if path.is_dir() {
walk_directory(&path, root, symbols);
} else if path.is_file() {
let path_str = path.to_string_lossy().to_string();
if detect_language(&path_str).is_some() {
// Read file (skip large files)
if let Ok(content) = std::fs::read_to_string(&path) {
if content.len() > 100_000 { continue; } // skip >100KB
let rel_path = path.strip_prefix(root)
.map(|p| p.to_string_lossy().to_string())
.unwrap_or(path_str.clone());
for sym in extract_symbols(&path_str, &content) {
// Build content: signature + body up to 500 chars
let body_start = content.lines()
.take(sym.start_line as usize - 1)
.map(|l| l.len() + 1)
.sum::<usize>();
let body_end = content.lines()
.take(sym.end_line as usize)
.map(|l| l.len() + 1)
.sum::<usize>()
.min(content.len());
let body = &content[body_start..body_end];
let truncated = if body.len() > 500 {
// Truncate on a UTF-8 char boundary to avoid a slice panic
let mut end = 497;
while !body.is_char_boundary(end) { end -= 1; }
format!("{}...", &body[..end])
} else {
body.to_string()
};
symbols.push(ProjectSymbol {
file_path: rel_path.clone(),
name: sym.name,
kind: sym.kind,
signature: sym.signature,
docstring: sym.docstring,
start_line: sym.start_line,
end_line: sym.end_line,
language: sym.language,
content: truncated,
});
}
}
}
}
}
}
/// An extracted code symbol.
#[derive(Debug, Clone)]
pub struct CodeSymbol {
pub name: String,
pub kind: String, // "function", "struct", "enum", "trait", "class", "interface", "method"
pub signature: String, // full signature line
pub docstring: String, // doc comment / docstring
pub start_line: u32, // 1-based
pub end_line: u32, // 1-based
pub language: String,
}
/// Detect language from file extension.
pub fn detect_language(path: &str) -> Option<&'static str> {
let ext = Path::new(path).extension()?.to_str()?;
match ext {
"rs" => Some("rust"),
"ts" | "tsx" => Some("typescript"),
"js" | "jsx" => Some("javascript"),
"py" => Some("python"),
_ => None,
}
}
/// Extract symbols from a source file's content.
pub fn extract_symbols(path: &str, content: &str) -> Vec<CodeSymbol> {
let Some(lang) = detect_language(path) else {
return Vec::new();
};
match lang {
"rust" => extract_rust_symbols(content),
"typescript" | "javascript" => extract_ts_symbols(content),
"python" => extract_python_symbols(content),
_ => Vec::new(),
}
}
// ── Rust ────────────────────────────────────────────────────────────────
fn extract_rust_symbols(content: &str) -> Vec<CodeSymbol> {
let mut parser = tree_sitter::Parser::new();
parser.set_language(&tree_sitter_rust::LANGUAGE.into()).ok();
let Some(tree) = parser.parse(content, None) else {
return Vec::new();
};
let mut symbols = Vec::new();
let root = tree.root_node();
let bytes = content.as_bytes();
walk_rust_node(root, bytes, content, &mut symbols);
symbols
}
fn walk_rust_node(
node: tree_sitter::Node,
bytes: &[u8],
source: &str,
symbols: &mut Vec<CodeSymbol>,
) {
match node.kind() {
"function_item" | "function_signature_item" => {
if let Some(sym) = extract_rust_function(node, bytes, source) {
symbols.push(sym);
}
}
"struct_item" => {
if let Some(sym) = extract_rust_type(node, bytes, source, "struct") {
symbols.push(sym);
}
}
"enum_item" => {
if let Some(sym) = extract_rust_type(node, bytes, source, "enum") {
symbols.push(sym);
}
}
"trait_item" => {
if let Some(sym) = extract_rust_type(node, bytes, source, "trait") {
symbols.push(sym);
}
}
"impl_item" => {
// Walk impl methods
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
if child.kind() == "declaration_list" {
walk_rust_node(child, bytes, source, symbols);
}
}
}
}
_ => {
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
walk_rust_node(child, bytes, source, symbols);
}
}
}
}
}
fn extract_rust_function(node: tree_sitter::Node, bytes: &[u8], source: &str) -> Option<CodeSymbol> {
let name = node.child_by_field_name("name")?;
let name_str = name.utf8_text(bytes).ok()?.to_string();
// Build signature: everything from start to the opening brace (or end if no body)
let start_byte = node.start_byte();
let sig_end = find_rust_sig_end(node, source);
let signature = source[start_byte..sig_end].trim().to_string();
// Extract doc comment (line comments starting with /// before the function)
let docstring = extract_rust_doc_comment(node, source);
Some(CodeSymbol {
name: name_str,
kind: "function".into(),
signature,
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "rust".into(),
})
}
fn extract_rust_type(node: tree_sitter::Node, bytes: &[u8], source: &str, kind: &str) -> Option<CodeSymbol> {
let name = node.child_by_field_name("name")?;
let name_str = name.utf8_text(bytes).ok()?.to_string();
// Signature: first line of the definition
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
let signature = source[start..first_line_end].trim().to_string();
let docstring = extract_rust_doc_comment(node, source);
Some(CodeSymbol {
name: name_str,
kind: kind.into(),
signature,
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "rust".into(),
})
}
fn find_rust_sig_end(node: tree_sitter::Node, source: &str) -> usize {
// Find the opening brace
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
if child.kind() == "block" || child.kind() == "field_declaration_list"
|| child.kind() == "enum_variant_list" || child.kind() == "declaration_list"
{
return child.start_byte();
}
}
}
// No body (e.g., trait method signature)
node.end_byte().min(source.len())
}
fn extract_rust_doc_comment(node: tree_sitter::Node, source: &str) -> String {
let start_line = node.start_position().row;
if start_line == 0 {
return String::new();
}
let lines: Vec<&str> = source.lines().collect();
let mut doc_lines = Vec::new();
// Walk backwards from the line before the node
let mut line_idx = start_line.saturating_sub(1);
loop {
if line_idx >= lines.len() {
break;
}
let line = lines[line_idx].trim();
if line.starts_with("///") {
doc_lines.push(line.trim_start_matches("///").trim());
} else if line.starts_with("#[") || line.is_empty() {
// Skip attributes and blank lines between doc and function
if line.is_empty() && !doc_lines.is_empty() {
break; // blank line after doc block = stop
}
} else {
break;
}
if line_idx == 0 {
break;
}
line_idx -= 1;
}
doc_lines.reverse();
doc_lines.join("\n")
}
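The backward walk can be pictured without tree-sitter. A sketch over raw source lines, where `take_while` replaces the explicit loop (attribute and blank-line handling omitted):

```rust
fn main() {
    let source = "/// Adds two numbers.\n/// Returns the sum.\nfn add(a: i32, b: i32) -> i32 { a + b }";
    let lines: Vec<&str> = source.lines().collect();
    let target = 2; // 0-based index of the `fn add` line

    // Walk backwards collecting contiguous `///` lines, then restore order.
    let mut doc: Vec<&str> = lines[..target]
        .iter()
        .rev()
        .take_while(|l| l.trim().starts_with("///"))
        .map(|l| l.trim().trim_start_matches("///").trim())
        .collect();
    doc.reverse();

    assert_eq!(doc.join("\n"), "Adds two numbers.\nReturns the sum.");
}
```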
// ── TypeScript / JavaScript ─────────────────────────────────────────────
fn extract_ts_symbols(content: &str) -> Vec<CodeSymbol> {
let mut parser = tree_sitter::Parser::new();
parser.set_language(&tree_sitter_typescript::LANGUAGE_TYPESCRIPT.into()).ok();
let Some(tree) = parser.parse(content, None) else {
return Vec::new();
};
let mut symbols = Vec::new();
walk_ts_node(tree.root_node(), content.as_bytes(), content, &mut symbols);
symbols
}
fn walk_ts_node(
node: tree_sitter::Node,
bytes: &[u8],
source: &str,
symbols: &mut Vec<CodeSymbol>,
) {
match node.kind() {
"function_declaration" | "method_definition" | "arrow_function" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
if !name_str.is_empty() {
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
symbols.push(CodeSymbol {
name: name_str,
kind: "function".into(),
signature: source[start..first_line_end].trim().to_string(),
docstring: String::new(), // TODO: JSDoc extraction
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "typescript".into(),
});
}
}
}
"class_declaration" | "interface_declaration" | "type_alias_declaration" | "enum_declaration" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
let kind = match node.kind() {
"class_declaration" => "class",
"interface_declaration" => "interface",
"enum_declaration" => "enum",
_ => "type",
};
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
symbols.push(CodeSymbol {
name: name_str,
kind: kind.into(),
signature: source[start..first_line_end].trim().to_string(),
docstring: String::new(),
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "typescript".into(),
});
}
}
_ => {}
}
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
walk_ts_node(child, bytes, source, symbols);
}
}
}
// ── Python ──────────────────────────────────────────────────────────────
fn extract_python_symbols(content: &str) -> Vec<CodeSymbol> {
let mut parser = tree_sitter::Parser::new();
parser.set_language(&tree_sitter_python::LANGUAGE.into()).ok();
let Some(tree) = parser.parse(content, None) else {
return Vec::new();
};
let mut symbols = Vec::new();
walk_python_node(tree.root_node(), content.as_bytes(), content, &mut symbols);
symbols
}
fn walk_python_node(
node: tree_sitter::Node,
bytes: &[u8],
source: &str,
symbols: &mut Vec<CodeSymbol>,
) {
match node.kind() {
"function_definition" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
let docstring = extract_python_docstring(node, bytes);
symbols.push(CodeSymbol {
name: name_str,
kind: "function".into(),
signature: source[start..first_line_end].trim().to_string(),
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "python".into(),
});
}
}
"class_definition" => {
if let Some(name) = node.child_by_field_name("name") {
let name_str = name.utf8_text(bytes).unwrap_or("").to_string();
let start = node.start_byte();
let first_line_end = source[start..].find('\n').map(|i| start + i).unwrap_or(node.end_byte());
let docstring = extract_python_docstring(node, bytes);
symbols.push(CodeSymbol {
name: name_str,
kind: "class".into(),
signature: source[start..first_line_end].trim().to_string(),
docstring,
start_line: node.start_position().row as u32 + 1,
end_line: node.end_position().row as u32 + 1,
language: "python".into(),
});
}
}
_ => {}
}
for i in 0..node.child_count() {
if let Some(child) = node.child(i) {
walk_python_node(child, bytes, source, symbols);
}
}
}
fn extract_python_docstring(node: tree_sitter::Node, bytes: &[u8]) -> String {
// Python docstrings are the first expression_statement in the body
if let Some(body) = node.child_by_field_name("body") {
if let Some(first_stmt) = body.child(0) {
if first_stmt.kind() == "expression_statement" {
if let Some(expr) = first_stmt.child(0) {
if expr.kind() == "string" {
let text = expr.utf8_text(bytes).unwrap_or("");
// Strip triple quotes
let trimmed = text
.trim_start_matches("\"\"\"")
.trim_start_matches("'''")
.trim_end_matches("\"\"\"")
.trim_end_matches("'''")
.trim();
return trimmed.to_string();
}
}
}
}
}
String::new()
}
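The quote-stripping step in isolation, as a standalone sketch:

```rust
fn main() {
    // Strip triple quotes from a docstring literal, as above.
    let text = "\"\"\"Process a list of items.\"\"\"";
    let trimmed = text
        .trim_start_matches("\"\"\"")
        .trim_start_matches("'''")
        .trim_end_matches("\"\"\"")
        .trim_end_matches("'''")
        .trim();
    assert_eq!(trimmed, "Process a list of items.");
}
```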
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_detect_language() {
assert_eq!(detect_language("src/main.rs"), Some("rust"));
assert_eq!(detect_language("app.ts"), Some("typescript"));
assert_eq!(detect_language("app.tsx"), Some("typescript"));
assert_eq!(detect_language("script.py"), Some("python"));
assert_eq!(detect_language("script.js"), Some("javascript"));
assert_eq!(detect_language("data.json"), None);
assert_eq!(detect_language("README.md"), None);
}
#[test]
fn test_extract_rust_function() {
let source = r#"
/// Generate a response.
pub async fn generate(&self, req: &GenerateRequest) -> Option<String> {
self.run_and_emit(req).await
}
"#;
let symbols = extract_rust_symbols(source);
assert!(!symbols.is_empty(), "Should extract at least one symbol");
let func = &symbols[0];
assert_eq!(func.name, "generate");
assert_eq!(func.kind, "function");
assert!(func.signature.contains("pub async fn generate"));
assert!(func.docstring.contains("Generate a response"));
assert_eq!(func.language, "rust");
}
#[test]
fn test_extract_rust_struct() {
let source = r#"
/// A request to generate.
pub struct GenerateRequest {
pub text: String,
pub user_id: String,
}
"#;
let symbols = extract_rust_symbols(source);
let structs: Vec<_> = symbols.iter().filter(|s| s.kind == "struct").collect();
assert!(!structs.is_empty());
assert_eq!(structs[0].name, "GenerateRequest");
assert!(structs[0].docstring.contains("request to generate"));
}
#[test]
fn test_extract_rust_enum() {
let source = r#"
/// Whether server or client.
pub enum ToolSide {
Server,
Client,
}
"#;
let symbols = extract_rust_symbols(source);
let enums: Vec<_> = symbols.iter().filter(|s| s.kind == "enum").collect();
assert!(!enums.is_empty());
assert_eq!(enums[0].name, "ToolSide");
}
#[test]
fn test_extract_rust_trait() {
let source = r#"
pub trait Executor {
fn execute(&self, args: &str) -> String;
}
"#;
let symbols = extract_rust_symbols(source);
let traits: Vec<_> = symbols.iter().filter(|s| s.kind == "trait").collect();
assert!(!traits.is_empty());
assert_eq!(traits[0].name, "Executor");
}
#[test]
fn test_extract_rust_impl_methods() {
let source = r#"
impl Orchestrator {
/// Create new.
pub fn new(config: Config) -> Self {
Self { config }
}
/// Subscribe to events.
pub fn subscribe(&self) -> Receiver {
self.tx.subscribe()
}
}
"#;
let symbols = extract_rust_symbols(source);
let fns: Vec<_> = symbols.iter().filter(|s| s.kind == "function").collect();
assert!(fns.len() >= 2, "Should find impl methods, got {}", fns.len());
let names: Vec<&str> = fns.iter().map(|s| s.name.as_str()).collect();
assert!(names.contains(&"new"));
assert!(names.contains(&"subscribe"));
}
#[test]
fn test_extract_ts_function() {
let source = r#"
function greet(name: string): string {
return `Hello, ${name}`;
}
"#;
let symbols = extract_ts_symbols(source);
assert!(!symbols.is_empty());
assert_eq!(symbols[0].name, "greet");
assert_eq!(symbols[0].kind, "function");
}
#[test]
fn test_extract_ts_class() {
let source = r#"
class UserService {
constructor(private db: Database) {}
async getUser(id: string): Promise<User> {
return this.db.find(id);
}
}
"#;
let symbols = extract_ts_symbols(source);
let classes: Vec<_> = symbols.iter().filter(|s| s.kind == "class").collect();
assert!(!classes.is_empty());
assert_eq!(classes[0].name, "UserService");
}
#[test]
fn test_extract_ts_interface() {
let source = r#"
interface User {
id: string;
name: string;
email?: string;
}
"#;
let symbols = extract_ts_symbols(source);
let ifaces: Vec<_> = symbols.iter().filter(|s| s.kind == "interface").collect();
assert!(!ifaces.is_empty());
assert_eq!(ifaces[0].name, "User");
}
#[test]
fn test_extract_python_function() {
let source = r#"
def process_data(items: list[str]) -> dict:
"""Process a list of items into a dictionary."""
return {item: len(item) for item in items}
"#;
let symbols = extract_python_symbols(source);
assert!(!symbols.is_empty());
assert_eq!(symbols[0].name, "process_data");
assert_eq!(symbols[0].kind, "function");
assert!(symbols[0].docstring.contains("Process a list"));
}
#[test]
fn test_extract_python_class() {
let source = r#"
class DataProcessor:
"""Processes data from various sources."""
def __init__(self, config):
self.config = config
def run(self):
pass
"#;
let symbols = extract_python_symbols(source);
let classes: Vec<_> = symbols.iter().filter(|s| s.kind == "class").collect();
assert!(!classes.is_empty());
assert_eq!(classes[0].name, "DataProcessor");
assert!(classes[0].docstring.contains("Processes data"));
}
#[test]
fn test_extract_symbols_unknown_language() {
let symbols = extract_symbols("data.json", "{}");
assert!(symbols.is_empty());
}
#[test]
fn test_extract_symbols_empty_file() {
let symbols = extract_symbols("empty.rs", "");
assert!(symbols.is_empty());
}
#[test]
fn test_line_numbers_are_1_based() {
let source = "fn first() {}\nfn second() {}\nfn third() {}";
let symbols = extract_rust_symbols(source);
assert!(symbols.len() >= 3);
assert_eq!(symbols[0].start_line, 1);
assert_eq!(symbols[1].start_line, 2);
assert_eq!(symbols[2].start_line, 3);
}
}

sunbeam/src/code/tools.rs Normal file

@@ -0,0 +1,345 @@
use std::path::Path;
use std::process::Command;
use serde_json::Value;
use tracing::info;
/// Execute a client-side tool and return the result as a string.
pub fn execute(name: &str, args_json: &str, project_root: &str) -> String {
let args: Value = serde_json::from_str(args_json).unwrap_or_default();
match name {
"file_read" => file_read(&args, project_root),
"file_write" => file_write(&args, project_root),
"search_replace" => search_replace(&args, project_root),
"grep" => grep(&args, project_root),
"bash" => bash(&args, project_root),
"list_directory" => list_directory(&args, project_root),
_ => format!("Unknown client tool: {name}"),
}
}
/// Execute an LSP tool asynchronously. Returns None if tool is not an LSP tool.
pub async fn execute_lsp(
name: &str,
args_json: &str,
lsp: &mut super::lsp::manager::LspManager,
) -> Option<String> {
let args: Value = serde_json::from_str(args_json).unwrap_or_default();
let result = match name {
"lsp_definition" => {
let path = args["path"].as_str().unwrap_or("");
let line = args["line"].as_u64().unwrap_or(1) as u32;
let col = args["column"].as_u64().unwrap_or(1) as u32;
Some(lsp.definition(path, line, col).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
"lsp_references" => {
let path = args["path"].as_str().unwrap_or("");
let line = args["line"].as_u64().unwrap_or(1) as u32;
let col = args["column"].as_u64().unwrap_or(1) as u32;
Some(lsp.references(path, line, col).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
"lsp_hover" => {
let path = args["path"].as_str().unwrap_or("");
let line = args["line"].as_u64().unwrap_or(1) as u32;
let col = args["column"].as_u64().unwrap_or(1) as u32;
Some(lsp.hover(path, line, col).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
"lsp_diagnostics" => {
let path = args["path"].as_str().unwrap_or("");
if path.is_empty() {
Some("Specify a file path for diagnostics.".into())
} else {
// TODO: return cached diagnostics from publishDiagnostics
Some("Diagnostics not yet implemented. Use `bash` with `cargo check` or equivalent.".into())
}
}
"lsp_symbols" => {
let path = args["path"].as_str();
let query = args["query"].as_str().unwrap_or("");
if let Some(path) = path {
Some(lsp.document_symbols(path).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
} else {
Some(lsp.workspace_symbols(query, None).await
.unwrap_or_else(|e| format!("LSP error: {e}")))
}
}
_ => None,
};
result
}
/// Check if a tool name is an LSP tool.
pub fn is_lsp_tool(name: &str) -> bool {
matches!(name, "lsp_definition" | "lsp_references" | "lsp_hover" | "lsp_diagnostics" | "lsp_symbols")
}
fn resolve_path(path: &str, project_root: &str) -> String {
let p = Path::new(path);
if p.is_absolute() {
path.to_string()
} else {
Path::new(project_root)
.join(path)
.to_string_lossy()
.into()
}
}
fn file_read(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or("");
let resolved = resolve_path(path, root);
let content = match std::fs::read_to_string(&resolved) {
Ok(c) => c,
Err(e) => return format!("Error reading {path}: {e}"),
};
let start = args["start_line"].as_u64().map(|n| n as usize);
let end = args["end_line"].as_u64().map(|n| n as usize);
match (start, end) {
(Some(s), Some(e)) => {
let lines: Vec<&str> = content.lines().collect();
let s = s.saturating_sub(1).min(lines.len());
let e = e.min(lines.len());
lines[s..e].join("\n")
}
(Some(s), None) => {
let lines: Vec<&str> = content.lines().collect();
let s = s.saturating_sub(1).min(lines.len());
lines[s..].join("\n")
}
_ => content,
}
}
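The 1-based, inclusive line-range handling in `file_read` as a standalone sketch (the clamping mirrors the code above):

```rust
fn main() {
    let content = "line one\nline two\nline three\nline four";
    let (start_line, end_line) = (2usize, 3usize); // 1-based, inclusive

    let lines: Vec<&str> = content.lines().collect();
    let s = start_line.saturating_sub(1).min(lines.len());
    let e = end_line.min(lines.len());

    assert_eq!(lines[s..e].join("\n"), "line two\nline three");
}
```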
fn file_write(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or("");
let content = args["content"].as_str().unwrap_or("");
let resolved = resolve_path(path, root);
// Ensure parent directory exists
if let Some(parent) = Path::new(&resolved).parent() {
let _ = std::fs::create_dir_all(parent);
}
match std::fs::write(&resolved, content) {
Ok(_) => format!("Wrote {} bytes to {path}", content.len()),
Err(e) => format!("Error writing {path}: {e}"),
}
}
fn search_replace(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or("");
let diff = args["diff"].as_str().unwrap_or("");
let resolved = resolve_path(path, root);
let content = match std::fs::read_to_string(&resolved) {
Ok(c) => c,
Err(e) => return format!("Error reading {path}: {e}"),
};
// Parse SEARCH/REPLACE blocks
let mut result = content.clone();
let mut replacements = 0;
for block in diff.split("<<<< SEARCH\n").skip(1) {
let parts: Vec<&str> = block.splitn(2, "=====\n").collect();
if parts.len() != 2 {
continue;
}
let search = parts[0].trim_end_matches('\n');
let rest: Vec<&str> = parts[1].splitn(2, ">>>>> REPLACE").collect();
if rest.is_empty() {
continue;
}
let replace = rest[0].trim_end_matches('\n');
if result.contains(search) {
result = result.replacen(search, replace, 1);
replacements += 1;
}
}
if replacements > 0 {
match std::fs::write(&resolved, &result) {
Ok(_) => format!("{replacements} replacement(s) applied to {path}"),
Err(e) => format!("Error writing {path}: {e}"),
}
} else {
format!("No matches found in {path}")
}
}
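For reference, the block format `search_replace` expects, exercised end-to-end on an in-memory string. This is a sketch; the marker strings are taken verbatim from the parser above:

```rust
fn main() {
    let content = "fn main() {\n    old();\n}\n";
    let diff = "<<<< SEARCH\n    old();\n=====\n    new();\n>>>>> REPLACE\n";

    let mut result = content.to_string();
    for block in diff.split("<<<< SEARCH\n").skip(1) {
        let parts: Vec<&str> = block.splitn(2, "=====\n").collect();
        if parts.len() != 2 {
            continue;
        }
        let search = parts[0].trim_end_matches('\n');
        let replace = parts[1]
            .splitn(2, ">>>>> REPLACE")
            .next()
            .unwrap_or("")
            .trim_end_matches('\n');
        if result.contains(search) {
            result = result.replacen(search, replace, 1);
        }
    }

    assert_eq!(result, "fn main() {\n    new();\n}\n");
}
```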
fn grep(args: &Value, root: &str) -> String {
let pattern = args["pattern"].as_str().unwrap_or("");
let path = args["path"].as_str().unwrap_or(".");
let resolved = resolve_path(path, root);
// Try rg first, fall back to grep
let output = Command::new("rg")
.args(["--no-heading", "--line-number", pattern, &resolved])
.output()
.or_else(|_| {
Command::new("grep")
.args(["-rn", pattern, &resolved])
.output()
});
match output {
Ok(o) => {
let stdout = String::from_utf8_lossy(&o.stdout);
if stdout.is_empty() {
format!("No matches for '{pattern}' in {path}")
} else {
// Truncate if too long
if stdout.len() > 8192 {
// Back off to a char boundary so the byte slice can't split a UTF-8 char
let mut end = 8192;
while !stdout.is_char_boundary(end) { end -= 1; }
format!("{}...\n(truncated)", &stdout[..end])
} else {
stdout.into()
}
}
}
Err(e) => format!("Error running grep: {e}"),
}
}
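Tool output is truncated by byte length, so the cut must land on a char boundary to keep the slice valid UTF-8. A std-only sketch of that backoff:

```rust
// Sketch: truncate a string to at most `max` bytes without splitting a
// multi-byte UTF-8 character, stepping back to the nearest char boundary.
fn truncate_utf8(s: &str, max: usize) -> &str {
    if s.len() <= max {
        return s;
    }
    let mut end = max;
    while !s.is_char_boundary(end) {
        end -= 1;
    }
    &s[..end]
}

fn main() {
    // "héllo" is 6 bytes: h(1) é(2) l(1) l(1) o(1). Byte 2 falls mid-'é'.
    assert_eq!(truncate_utf8("héllo", 2), "h");
    assert_eq!(truncate_utf8("héllo", 3), "hé");
    assert_eq!(truncate_utf8("hello", 3), "hel");
}
```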
fn bash(args: &Value, root: &str) -> String {
let command = args["command"].as_str().unwrap_or("");
info!(command, "Executing bash command");
let output = Command::new("sh")
.args(["-c", command])
.current_dir(root)
.output();
match output {
Ok(o) => {
let stdout = String::from_utf8_lossy(&o.stdout);
let stderr = String::from_utf8_lossy(&o.stderr);
let mut result = String::new();
if !stdout.is_empty() {
result.push_str(&stdout);
}
if !stderr.is_empty() {
if !result.is_empty() {
result.push('\n');
}
result.push_str("stderr: ");
result.push_str(&stderr);
}
if !o.status.success() {
result.push_str(&format!("\nexit code: {}", o.status.code().unwrap_or(-1)));
}
if result.len() > 16384 {
// Back off to a char boundary so the byte slice can't split a UTF-8 char
let mut end = 16384;
while !result.is_char_boundary(end) { end -= 1; }
format!("{}...\n(truncated)", &result[..end])
} else {
result
}
}
Err(e) => format!("Error: {e}"),
}
}
fn list_directory(args: &Value, root: &str) -> String {
let path = args["path"].as_str().unwrap_or(".");
let depth = args["depth"].as_u64().unwrap_or(1) as usize;
let resolved = resolve_path(path, root);
let mut entries = Vec::new();
list_dir_inner(Path::new(&resolved), Path::new(&resolved), 0, depth, &mut entries);
if entries.is_empty() {
format!("Empty directory: {path}")
} else {
entries.join("\n")
}
}
fn list_dir_inner(
base: &Path,
dir: &Path,
depth: usize,
max_depth: usize,
entries: &mut Vec<String>,
) {
if depth > max_depth {
return;
}
let Ok(read_dir) = std::fs::read_dir(dir) else {
return;
};
let mut items: Vec<_> = read_dir.filter_map(|e| e.ok()).collect();
items.sort_by_key(|e| e.file_name());
for entry in items {
let name = entry.file_name().to_string_lossy().to_string();
if name.starts_with('.') || name == "target" || name == "node_modules" || name == "vendor" {
continue;
}
let is_dir = entry.file_type().map(|t| t.is_dir()).unwrap_or(false);
let relative = entry
.path()
.strip_prefix(base)
.unwrap_or(&entry.path())
.to_string_lossy()
.to_string();
let prefix = " ".repeat(depth);
let marker = if is_dir { "/" } else { "" };
entries.push(format!("{prefix}{relative}{marker}"));
if is_dir {
list_dir_inner(base, &entry.path(), depth + 1, max_depth, entries);
}
}
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_resolve_path_relative() {
let resolved = resolve_path("src/main.rs", "/project");
assert_eq!(resolved, "/project/src/main.rs");
}
#[test]
fn test_resolve_path_absolute() {
let resolved = resolve_path("/etc/hosts", "/project");
assert_eq!(resolved, "/etc/hosts");
}
#[test]
fn test_file_read_nonexistent() {
let args = serde_json::json!({"path": "/nonexistent/file.txt"});
let result = file_read(&args, "/tmp");
assert!(result.contains("Error"));
}
#[test]
fn test_bash_echo() {
let args = serde_json::json!({"command": "echo hello"});
let result = bash(&args, "/tmp");
assert_eq!(result.trim(), "hello");
}
#[test]
fn test_bash_exit_code() {
let args = serde_json::json!({"command": "false"});
let result = bash(&args, "/tmp");
assert!(result.contains("exit code"));
}
}

sunbeam/src/code/tui.rs Normal file

@@ -0,0 +1,838 @@
use std::io;
use std::sync::{Arc, Mutex};
use crossterm::event::{self, Event, KeyCode, KeyEvent, KeyModifiers};
use crossterm::terminal::{self, EnterAlternateScreen, LeaveAlternateScreen};
use crossterm::execute;
use ratatui::backend::CrosstermBackend;
use ratatui::layout::{Constraint, Layout, Rect};
use ratatui::style::{Color, Modifier, Style};
use ratatui::text::{Line, Span, Text};
use ratatui::widgets::{Block, Borders, Paragraph, Wrap};
use ratatui::Terminal;
use tracing_subscriber::fmt::MakeWriter;
// ── Sol status messages (sun/fusion theme) ───────────────────────────────
const SOL_STATUS_MESSAGES: &[&str] = &[
"fusing hydrogen",
"solar flare",
"coronal mass",
"helium flash",
"photon escape",
"plasma arc",
"sunspot forming",
"chromosphere",
"radiating",
"nuclear fusion",
"proton chain",
"solar wind",
"burning bright",
"going nova",
"core ignition",
"stellar drift",
"dawn breaking",
"light bending",
"warmth spreading",
"horizon glow",
"golden hour",
"ray tracing",
"luminous flux",
"thermal bloom",
"heliosphere",
"magnetic storm",
"sun worship",
"solstice",
"perihelion",
"daybreak",
"photosphere",
"solar apex",
"corona pulse",
"neutrino bath",
"deuterium burn",
"kelvin climb",
"fusion yield",
"radiant heat",
"stellar core",
"light speed",
];
/// Pick a random status message for the generating indicator.
pub fn random_sol_status() -> &'static str {
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::time::SystemTime;
let mut hasher = DefaultHasher::new();
SystemTime::now()
.duration_since(SystemTime::UNIX_EPOCH)
.unwrap_or_default()
.as_millis()
.hash(&mut hasher);
let idx = hasher.finish() as usize % SOL_STATUS_MESSAGES.len();
SOL_STATUS_MESSAGES[idx]
}
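The selection above is not cryptographic randomness, just a stable hash of the current millisecond reduced modulo the slice length. The same idea in isolation (seed parameterized here so it is testable):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Sketch: derive an index into a slice from an arbitrary hashable seed,
// as random_sol_status() does with the current time in milliseconds.
fn pick<'a>(items: &[&'a str], seed: u128) -> &'a str {
    let mut hasher = DefaultHasher::new();
    seed.hash(&mut hasher);
    items[hasher.finish() as usize % items.len()]
}

fn main() {
    let items = ["alpha", "beta", "gamma"];
    // DefaultHasher is deterministic in-process: same seed, same pick.
    assert_eq!(pick(&items, 42), pick(&items, 42));
    // The result is always a member of the slice.
    assert!(items.contains(&pick(&items, 7)));
}
```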
// ── Sol color wave palette (warm amber gradient) ─────────────────────────
const WAVE_COLORS: &[(u8, u8, u8)] = &[
(255, 216, 0), // bright gold
(255, 197, 66), // sol yellow
(245, 175, 0), // amber
(232, 140, 30), // deep amber
(210, 110, 20), // burnt orange
];
/// Get the wave color for a character position at the current frame.
fn wave_color_at(pos: usize, frame: u64, text_len: usize) -> Color {
let total = text_len + 2; // text + padding
let cycle_len = total * 2; // bounce back and forth
let wave_pos = (frame as usize / 2) % cycle_len; // advance every 2 frames
let wave_pos = if wave_pos >= total {
cycle_len - wave_pos - 1 // bounce back
} else {
wave_pos
};
// Distance from wave front determines color index
let dist = if pos >= wave_pos { pos - wave_pos } else { wave_pos - pos };
let idx = dist.min(WAVE_COLORS.len() - 1);
let (r, g, b) = WAVE_COLORS[idx];
Color::Rgb(r, g, b)
}
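The bounce arithmetic in `wave_color_at` (advance one cell every 2 frames, reflect at the ends of a `2 * total` cycle) can be checked in isolation:

```rust
// Sketch: the triangle-wave position used by wave_color_at(). The wave
// sweeps 0..total-1, then reflects, repeating every 2*cycle_len frames.
fn wave_pos(frame: u64, total: usize) -> usize {
    let cycle_len = total * 2;
    let pos = (frame as usize / 2) % cycle_len;
    if pos >= total { cycle_len - pos - 1 } else { pos }
}

fn main() {
    let total = 4; // e.g. 2 chars of text + 2 padding cells
    // Forward sweep: frames 0..8 give positions 0,0,1,1,2,2,3,3
    assert_eq!(wave_pos(0, total), 0);
    assert_eq!(wave_pos(6, total), 3);
    // Bounce back: frame 8 maps to raw pos 4, reflected to 3; frame 10 to 2
    assert_eq!(wave_pos(8, total), 3);
    assert_eq!(wave_pos(10, total), 2);
    // The pattern repeats every 2 * cycle_len = 16 frames
    assert_eq!(wave_pos(16, total), wave_pos(0, total));
}
```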
// ── Sol color palette ──────────────────────────────────────────────────────
const SOL_YELLOW: Color = Color::Rgb(245, 197, 66);
const SOL_AMBER: Color = Color::Rgb(232, 168, 64);
const SOL_BLUE: Color = Color::Rgb(108, 166, 224);
const SOL_RED: Color = Color::Rgb(224, 88, 88);
const SOL_DIM: Color = Color::Rgb(138, 122, 90);
const SOL_GRAY: Color = Color::Rgb(112, 112, 112);
const SOL_FAINT: Color = Color::Rgb(80, 80, 80);
const SOL_APPROVAL_BG: Color = Color::Rgb(50, 42, 20);
const SOL_APPROVAL_CMD: Color = Color::Rgb(200, 180, 120);
// ── In-memory log buffer for tracing ─────────────────────────────────────
const LOG_BUFFER_CAPACITY: usize = 500;
#[derive(Clone)]
pub struct LogBuffer(Arc<Mutex<Vec<String>>>);
impl LogBuffer {
pub fn new() -> Self {
Self(Arc::new(Mutex::new(Vec::new())))
}
pub fn lines(&self) -> Vec<String> {
self.0.lock().unwrap().clone()
}
}
/// Writer that appends each line to the ring buffer.
pub struct LogBufferWriter(Arc<Mutex<Vec<String>>>);
impl io::Write for LogBufferWriter {
fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
let s = String::from_utf8_lossy(buf);
let mut lines = self.0.lock().unwrap();
for line in s.lines() {
if !line.is_empty() {
lines.push(line.to_string());
if lines.len() > LOG_BUFFER_CAPACITY {
lines.remove(0);
}
}
}
Ok(buf.len())
}
fn flush(&mut self) -> io::Result<()> {
Ok(())
}
}
impl<'a> MakeWriter<'a> for LogBuffer {
type Writer = LogBufferWriter;
fn make_writer(&'a self) -> Self::Writer {
LogBufferWriter(self.0.clone())
}
}
// ── Virtual viewport ─────────────────────────────────────────────────────
/// Cached pre-wrapped visual lines for the conversation log.
/// Text is wrapped using `textwrap` when content or width changes.
/// Drawing just slices the visible window — O(viewport), zero wrapping by ratatui.
pub struct Viewport {
/// Pre-wrapped visual lines (one Line per screen row). Already wrapped to width.
visual_lines: Vec<Line<'static>>,
/// Width used for the last wrap pass.
last_width: u16,
/// True when log content changed.
dirty: bool,
}
impl Viewport {
pub fn new() -> Self {
Self {
visual_lines: Vec::new(),
last_width: 0,
dirty: true,
}
}
pub fn invalidate(&mut self) {
self.dirty = true;
}
/// Total visual (screen) lines.
pub fn len(&self) -> u16 {
self.visual_lines.len() as u16
}
/// Rebuild pre-wrapped lines from log entries for a given width.
pub fn rebuild(&mut self, log: &[LogEntry], width: u16) {
let w = width.max(1) as usize;
self.visual_lines.clear();
for entry in log {
match entry {
LogEntry::UserInput(text) => {
self.visual_lines.push(Line::from(""));
// Wrap user input with "> " prefix
let prefixed = format!("> {text}");
for wrapped in wrap_styled(&prefixed, w, SOL_DIM, Color::White, 2) {
self.visual_lines.push(wrapped);
}
self.visual_lines.push(Line::from(""));
}
LogEntry::AssistantText(text) => {
// Render markdown to styled ratatui Lines
let md_text: Text<'_> = tui_markdown::from_str(text);
let base_style = Style::default().fg(SOL_YELLOW);
for line in md_text.lines {
// Apply base yellow color to spans that don't have explicit styling
let styled_spans: Vec<Span<'static>> = line
.spans
.into_iter()
.map(|span| {
let mut style = span.style;
if style.fg.is_none() {
style = style.fg(SOL_YELLOW);
}
Span::styled(span.content.into_owned(), style)
})
.collect();
let styled_line = Line::from(styled_spans);
// Wrap long lines
let line_width = styled_line.width();
if line_width <= w {
self.visual_lines.push(styled_line);
} else {
// For wrapped markdown lines, fall back to textwrap on the raw text
let raw: String = styled_line.spans.iter().map(|s| s.content.as_ref()).collect();
for wrapped in textwrap::wrap(&raw, w) {
self.visual_lines.push(Line::styled(wrapped.into_owned(), base_style));
}
}
}
}
LogEntry::ToolSuccess { name, detail } => {
self.visual_lines.push(Line::from(vec![
Span::styled("", Style::default().fg(SOL_BLUE)),
Span::styled(name.clone(), Style::default().fg(SOL_AMBER)),
Span::styled(format!(" {detail}"), Style::default().fg(SOL_DIM)),
]));
}
LogEntry::ToolExecuting { name, detail } => {
self.visual_lines.push(Line::from(vec![
Span::styled("", Style::default().fg(SOL_AMBER)),
Span::styled(name.clone(), Style::default().fg(SOL_AMBER)),
Span::styled(format!(" {detail}"), Style::default().fg(SOL_DIM)),
]));
}
LogEntry::ToolFailed { name, detail } => {
self.visual_lines.push(Line::from(vec![
Span::styled("", Style::default().fg(SOL_RED)),
Span::styled(name.clone(), Style::default().fg(SOL_RED)),
Span::styled(format!(" {detail}"), Style::default().fg(SOL_DIM)),
]));
}
LogEntry::ToolOutput { lines: output_lines, collapsed } => {
let show = if *collapsed { 5 } else { output_lines.len() };
let style = Style::default().fg(SOL_GRAY);
for line in output_lines.iter().take(show) {
self.visual_lines.push(Line::styled(format!(" {line}"), style));
}
if *collapsed && output_lines.len() > 5 {
self.visual_lines.push(Line::styled(
format!(" … +{} lines", output_lines.len() - 5),
Style::default().fg(SOL_FAINT),
));
}
}
LogEntry::Status(msg) => {
self.visual_lines.push(Line::styled(
format!(" [{msg}]"),
Style::default().fg(SOL_DIM),
));
}
LogEntry::Error(msg) => {
let style = Style::default().fg(SOL_RED);
for wrapped in textwrap::wrap(&format!(" error: {msg}"), w) {
self.visual_lines.push(Line::styled(wrapped.into_owned(), style));
}
}
}
}
self.dirty = false;
self.last_width = width;
}
/// Ensure lines are built for the given width. Rebuilds if width changed.
pub fn ensure(&mut self, log: &[LogEntry], width: u16) {
if self.dirty || self.last_width != width {
self.rebuild(log, width);
}
}
/// Get the visible slice of pre-wrapped lines for the scroll position.
/// Returns owned lines ready to render — NO wrapping by ratatui.
pub fn window(&self, height: u16, scroll_offset: u16) -> Vec<Line<'static>> {
let total = self.visual_lines.len() as u16;
let max_scroll = total.saturating_sub(height);
let scroll = if scroll_offset == u16::MAX {
max_scroll
} else {
scroll_offset.min(max_scroll)
};
let start = scroll as usize;
let end = (start + height as usize).min(self.visual_lines.len());
self.visual_lines[start..end].to_vec()
}
pub fn max_scroll(&self, height: u16) -> u16 {
(self.visual_lines.len() as u16).saturating_sub(height)
}
}
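`window` treats `u16::MAX` as a "stick to the bottom" sentinel and clamps everything else; that clamping logic, extracted so the edge cases are visible:

```rust
// Sketch: the scroll resolution used by Viewport::window(). u16::MAX means
// "auto-scroll to bottom"; any other offset is clamped to the valid range.
fn resolve(total: u16, height: u16, scroll_offset: u16) -> u16 {
    let max_scroll = total.saturating_sub(height);
    if scroll_offset == u16::MAX {
        max_scroll
    } else {
        scroll_offset.min(max_scroll)
    }
}

fn main() {
    // 100 lines in a 20-row viewport: max scroll is 80.
    assert_eq!(resolve(100, 20, u16::MAX), 80); // sentinel -> bottom
    assert_eq!(resolve(100, 20, 999), 80);      // over-scroll clamped
    assert_eq!(resolve(100, 20, 10), 10);       // in-range value kept
    assert_eq!(resolve(5, 20, u16::MAX), 0);    // content shorter than view
}
```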
/// Wrap a "> text" line preserving the dim prefix style on the first line
/// and white text style for content. Returns pre-wrapped visual lines.
fn wrap_styled(text: &str, width: usize, prefix_color: Color, text_color: Color, prefix_len: usize) -> Vec<Line<'static>> {
let wrapped = textwrap::wrap(text, width);
let mut lines = Vec::with_capacity(wrapped.len());
for (i, w) in wrapped.iter().enumerate() {
let s = w.to_string();
if i == 0 && s.len() >= prefix_len {
// First line: split into styled prefix + text
lines.push(Line::from(vec![
Span::styled(s[..prefix_len].to_string(), Style::default().fg(prefix_color)),
Span::styled(s[prefix_len..].to_string(), Style::default().fg(text_color)),
]));
} else {
lines.push(Line::styled(s, Style::default().fg(text_color)));
}
}
lines
}
// ── Message types for the conversation log ─────────────────────────────────
#[derive(Clone)]
pub enum LogEntry {
UserInput(String),
AssistantText(String),
ToolSuccess { name: String, detail: String },
ToolExecuting { name: String, detail: String },
ToolFailed { name: String, detail: String },
ToolOutput { lines: Vec<String>, collapsed: bool },
Status(String),
Error(String),
}
// ── Approval state ─────────────────────────────────────────────────────────
pub struct ApprovalPrompt {
pub call_id: String,
pub tool_name: String,
pub command: String,
pub options: Vec<String>,
pub selected: usize,
}
// ── App state ──────────────────────────────────────────────────────────────
pub struct App {
pub log: Vec<LogEntry>,
pub viewport: Viewport,
pub input: String,
pub cursor_pos: usize,
pub scroll_offset: u16,
pub project_name: String,
pub branch: String,
pub model: String,
pub input_tokens: u32,
pub output_tokens: u32,
pub last_turn_tokens: u32,
pub approval: Option<ApprovalPrompt>,
pub is_thinking: bool,
pub sol_status: String,
pub sol_connected: bool,
pub thinking_since: Option<std::time::Instant>,
pub thinking_message: String,
pub should_quit: bool,
pub show_logs: bool,
pub log_buffer: LogBuffer,
pub log_scroll: u16,
pub command_history: Vec<String>,
pub history_index: Option<usize>,
pub input_saved: String,
pub needs_redraw: bool,
pub frame_count: u64,
}
impl App {
pub fn new(project_name: &str, branch: &str, model: &str, log_buffer: LogBuffer) -> Self {
Self {
log: Vec::new(),
viewport: Viewport::new(),
input: String::new(),
cursor_pos: 0,
scroll_offset: 0,
project_name: project_name.into(),
branch: branch.into(),
model: model.into(),
input_tokens: 0,
output_tokens: 0,
last_turn_tokens: 0,
approval: None,
is_thinking: false,
sol_status: String::new(),
sol_connected: true,
thinking_since: None,
thinking_message: String::new(),
should_quit: false,
show_logs: false,
log_buffer,
log_scroll: u16::MAX,
command_history: Vec::new(),
history_index: None,
input_saved: String::new(),
needs_redraw: true,
frame_count: 0,
}
}
pub fn push_log(&mut self, entry: LogEntry) {
self.log.push(entry);
self.viewport.invalidate();
self.scroll_offset = u16::MAX;
self.needs_redraw = true;
}
/// Batch-add log entries without per-entry viewport rebuilds.
pub fn push_logs(&mut self, entries: Vec<LogEntry>) {
self.log.extend(entries);
self.viewport.invalidate();
self.scroll_offset = u16::MAX;
self.needs_redraw = true;
}
/// Resolve the u16::MAX auto-scroll sentinel to the actual max scroll
/// position and clamp to the valid range. Call before applying any
/// relative scroll delta.
pub fn resolve_scroll(&mut self, width: u16, height: u16) {
self.viewport.ensure(&self.log, width);
let max = self.viewport.max_scroll(height);
if self.scroll_offset == u16::MAX {
self.scroll_offset = max;
} else {
self.scroll_offset = self.scroll_offset.min(max);
}
}
/// Load command history from a project's .sunbeam/history file.
pub fn load_history(&mut self, project_path: &str) {
let path = std::path::Path::new(project_path).join(".sunbeam").join("history");
if let Ok(contents) = std::fs::read_to_string(&path) {
self.command_history = contents.lines().map(String::from).collect();
}
}
/// Save command history to a project's .sunbeam/history file.
pub fn save_history(&self, project_path: &str) {
let dir = std::path::Path::new(project_path).join(".sunbeam");
let _ = std::fs::create_dir_all(&dir);
let path = dir.join("history");
// Keep last 500 entries
let start = self.command_history.len().saturating_sub(500);
let contents = self.command_history[start..].join("\n");
let _ = std::fs::write(&path, contents);
}
}
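`save_history` keeps only the tail of the history (last 500 entries) via `saturating_sub`; the same slicing on its own:

```rust
// Sketch: keep only the last `keep` entries, as save_history() does with 500.
fn tail(history: &[String], keep: usize) -> &[String] {
    let start = history.len().saturating_sub(keep);
    &history[start..]
}

fn main() {
    let history: Vec<String> = (0..600).map(|i| format!("cmd {i}")).collect();
    let kept = tail(&history, 500);
    assert_eq!(kept.len(), 500);
    assert_eq!(kept[0], "cmd 100"); // oldest 100 entries dropped
    // Histories shorter than the cap are kept whole.
    let short = vec!["ls".to_string()];
    assert_eq!(tail(&short, 500).len(), 1);
}
```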
// ── Rendering ──────────────────────────────────────────────────────────────
pub fn draw(frame: &mut ratatui::Frame, app: &mut App) {
app.frame_count = app.frame_count.wrapping_add(1);
let area = frame.area();
// Layout: title (1) + log (flex) + input (3) — no status bar
let chunks = Layout::vertical([
Constraint::Length(1), // title bar (all system info)
Constraint::Min(5), // conversation log
Constraint::Length(3), // input area
])
.split(area);
draw_title_bar(frame, chunks[0], app);
if app.show_logs {
draw_debug_log(frame, chunks[1], app);
} else {
draw_log(frame, chunks[1], app);
}
if let Some(ref approval) = app.approval {
draw_approval(frame, chunks[2], approval);
} else {
draw_input(frame, chunks[2], app);
}
}
fn draw_title_bar(frame: &mut ratatui::Frame, area: Rect, app: &App) {
let health = if app.sol_connected { "☀️" } else { "⛈️" };
// Left: branding + project + branch
let left = vec![
Span::styled("sunbeam code", Style::default().fg(SOL_YELLOW).add_modifier(Modifier::BOLD)),
Span::styled(" · ", Style::default().fg(SOL_FAINT)),
Span::raw(&app.project_name),
Span::styled(" · ", Style::default().fg(SOL_FAINT)),
Span::styled(&app.branch, Style::default().fg(SOL_DIM)),
];
// Right: timer · status_wave · tokens · model · health
let mut right_parts: Vec<Span> = Vec::new();
if app.is_thinking {
// Elapsed timer first
if let Some(since) = app.thinking_since {
let elapsed = since.elapsed().as_secs();
right_parts.push(Span::styled(
format!("{elapsed}s "),
Style::default().fg(SOL_FAINT),
));
}
let status = if app.thinking_message.is_empty() {
"generating"
} else {
&app.thinking_message
};
let status_text = status.to_string();
// Per-character color wave + global dim/brighten pulse
let pulse = ((app.frame_count as f64 / 15.0).sin() + 1.0) / 2.0; // ranges 0.0..=1.0
let text_len = status_text.chars().count();
for (i, ch) in status_text.chars().enumerate() {
let wave = wave_color_at(i, app.frame_count, text_len);
// Blend wave color with pulse brightness
let (wr, wg, wb) = match wave { Color::Rgb(r, g, b) => (r, g, b), _ => (245, 197, 66) };
let r = (wr as f64 * (0.4 + 0.6 * pulse)) as u8;
let g = (wg as f64 * (0.4 + 0.6 * pulse)) as u8;
let b = (wb as f64 * (0.4 + 0.6 * pulse)) as u8;
right_parts.push(Span::styled(
ch.to_string(),
Style::default().fg(Color::Rgb(r, g, b)).add_modifier(Modifier::BOLD),
));
}
right_parts.push(Span::styled(" · ", Style::default().fg(SOL_FAINT)));
}
// Token counters — context (last turn prompt) + total session tokens
let total = app.input_tokens + app.output_tokens;
if total > 0 {
right_parts.push(Span::styled(
format!("ctx:{} tot:{}", format_tokens(app.last_turn_tokens), format_tokens(total)),
Style::default().fg(SOL_DIM),
));
} else {
right_parts.push(Span::styled("", Style::default().fg(SOL_FAINT)));
}
right_parts.push(Span::styled(" · ", Style::default().fg(SOL_FAINT)));
right_parts.push(Span::styled(&app.model, Style::default().fg(SOL_DIM)));
right_parts.push(Span::styled(" ", Style::default().fg(SOL_FAINT)));
right_parts.push(Span::raw(health.to_string()));
let title_line = Line::from(left);
frame.render_widget(Paragraph::new(title_line), area);
let right_line = Line::from(right_parts);
let right_width = right_line.width() as u16 + 1;
let right_area = Rect {
x: area.width.saturating_sub(right_width),
y: area.y,
width: right_width,
height: 1,
};
frame.render_widget(Paragraph::new(right_line), right_area);
}
/// Format token count: 1234 → "1.2k", 123 → "123"
fn format_tokens(n: u32) -> String {
if n >= 1_000_000 {
format!("{:.1}M", n as f64 / 1_000_000.0)
} else if n >= 1_000 {
format!("{:.1}k", n as f64 / 1_000.0)
} else {
n.to_string()
}
}
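For reference, the rounding behavior at each threshold. The function body is reproduced verbatim from above so the examples run standalone; note the "1000.0k" edge just below the million mark:

```rust
// format_tokens reproduced verbatim to illustrate its thresholds.
fn format_tokens(n: u32) -> String {
    if n >= 1_000_000 {
        format!("{:.1}M", n as f64 / 1_000_000.0)
    } else if n >= 1_000 {
        format!("{:.1}k", n as f64 / 1_000.0)
    } else {
        n.to_string()
    }
}

fn main() {
    assert_eq!(format_tokens(123), "123");
    assert_eq!(format_tokens(1_234), "1.2k");
    assert_eq!(format_tokens(999_999), "1000.0k"); // just under the M threshold
    assert_eq!(format_tokens(1_500_000), "1.5M");
}
```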
fn draw_log(frame: &mut ratatui::Frame, area: Rect, app: &mut App) {
// Ensure pre-wrapped lines are built for current width
app.viewport.ensure(&app.log, area.width);
// Slice only the visible rows — O(viewport), no wrapping by ratatui
let window = app.viewport.window(area.height, app.scroll_offset);
frame.render_widget(Paragraph::new(window), area);
}
fn draw_debug_log(frame: &mut ratatui::Frame, area: Rect, app: &App) {
let log_lines = app.log_buffer.lines();
let lines: Vec<Line> = std::iter::once(
Line::from(Span::styled(
" debug log (Alt+L to close) ",
Style::default().fg(SOL_AMBER).add_modifier(Modifier::BOLD),
)),
)
.chain(log_lines.iter().map(|l| {
let color = if l.contains("ERROR") {
SOL_RED
} else if l.contains("WARN") {
SOL_YELLOW
} else {
SOL_GRAY
};
Line::from(Span::styled(l.as_str(), Style::default().fg(color)))
}))
.collect();
let total = lines.len() as u16;
let visible = area.height;
let max_scroll = total.saturating_sub(visible);
let scroll = if app.log_scroll == u16::MAX {
max_scroll
} else {
app.log_scroll.min(max_scroll)
};
let widget = Paragraph::new(Text::from(lines))
.wrap(Wrap { trim: false })
.scroll((scroll, 0));
frame.render_widget(widget, area);
}
fn draw_input(frame: &mut ratatui::Frame, area: Rect, app: &App) {
let input_block = Block::default()
.borders(Borders::TOP)
.border_style(Style::default().fg(SOL_FAINT));
let input_text = Line::from(vec![
Span::styled("> ", Style::default().fg(SOL_DIM)),
Span::raw(&app.input),
]);
let input_widget = Paragraph::new(input_text)
.block(input_block)
.wrap(Wrap { trim: false });
frame.render_widget(input_widget, area);
if !app.is_thinking {
// Only show cursor when not waiting for Sol
let cursor_x = area.x + 2 + app.cursor_pos as u16;
let cursor_y = area.y + 1;
frame.set_cursor_position((cursor_x, cursor_y));
}
}
fn draw_approval(frame: &mut ratatui::Frame, area: Rect, approval: &ApprovalPrompt) {
let block = Block::default()
.borders(Borders::TOP)
.border_style(Style::default().fg(SOL_FAINT));
let mut lines = vec![
Line::from(vec![
Span::styled("", Style::default().fg(SOL_YELLOW)),
Span::styled(&approval.tool_name, Style::default().fg(SOL_YELLOW).add_modifier(Modifier::BOLD)),
Span::styled(format!(" {}", approval.command), Style::default().fg(SOL_APPROVAL_CMD)),
]),
];
for (i, opt) in approval.options.iter().enumerate() {
let prefix = if i == approval.selected { " " } else { " " };
let style = if i == approval.selected {
Style::default().fg(SOL_YELLOW)
} else {
Style::default().fg(SOL_DIM)
};
lines.push(Line::from(Span::styled(format!("{prefix}{opt}"), style)));
}
let widget = Paragraph::new(Text::from(lines))
.block(block)
.style(Style::default().bg(SOL_APPROVAL_BG));
frame.render_widget(widget, area);
}
// ── Terminal setup/teardown ────────────────────────────────────────────────
pub fn setup_terminal() -> io::Result<Terminal<CrosstermBackend<io::Stdout>>> {
terminal::enable_raw_mode()?;
let mut stdout = io::stdout();
execute!(stdout, EnterAlternateScreen, crossterm::event::EnableMouseCapture)?;
let backend = CrosstermBackend::new(stdout);
Terminal::new(backend)
}
pub fn restore_terminal(terminal: &mut Terminal<CrosstermBackend<io::Stdout>>) -> io::Result<()> {
terminal::disable_raw_mode()?;
execute!(
terminal.backend_mut(),
LeaveAlternateScreen,
crossterm::event::DisableMouseCapture
)?;
terminal.show_cursor()?;
Ok(())
}
#[cfg(test)]
mod tests {
use super::*;
#[test]
fn test_app_creation() {
let app = App::new("sol", "mainline", "devstral-2", LogBuffer::new());
assert_eq!(app.project_name, "sol");
assert!(!app.should_quit);
assert!(app.log.is_empty());
}
#[test]
fn test_push_log_auto_scrolls() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
app.scroll_offset = 0;
app.push_log(LogEntry::Status("test".into()));
assert_eq!(app.scroll_offset, u16::MAX); // auto-scroll to bottom
}
#[test]
fn test_color_constants() {
assert!(matches!(SOL_YELLOW, Color::Rgb(245, 197, 66)));
assert!(matches!(SOL_AMBER, Color::Rgb(232, 168, 64)));
assert!(matches!(SOL_BLUE, Color::Rgb(108, 166, 224)));
assert!(matches!(SOL_RED, Color::Rgb(224, 88, 88)));
// No green in the palette
assert!(!matches!(SOL_YELLOW, Color::Rgb(_, 255, _)));
assert!(!matches!(SOL_BLUE, Color::Rgb(_, 255, _)));
}
#[test]
fn test_log_entries_all_variants() {
let mut app = App::new("test", "main", "devstral-2", LogBuffer::new());
app.push_log(LogEntry::UserInput("hello".into()));
app.push_log(LogEntry::AssistantText("response".into()));
app.push_log(LogEntry::ToolSuccess { name: "file_read".into(), detail: "src/main.rs".into() });
app.push_log(LogEntry::ToolExecuting { name: "bash".into(), detail: "cargo test".into() });
app.push_log(LogEntry::ToolFailed { name: "grep".into(), detail: "no matches".into() });
app.push_log(LogEntry::ToolOutput { lines: vec!["line 1".into(), "line 2".into()], collapsed: true });
app.push_log(LogEntry::Status("thinking".into()));
app.push_log(LogEntry::Error("connection lost".into()));
assert_eq!(app.log.len(), 8);
}
#[test]
fn test_tool_output_collapse_threshold() {
// Collapsed output shows max 5 lines + "... +N lines"
let lines: Vec<String> = (0..20).map(|i| format!("line {i}")).collect();
let entry = LogEntry::ToolOutput { lines: lines.clone(), collapsed: true };
if let LogEntry::ToolOutput { lines, collapsed } = &entry {
assert!(lines.len() > 5);
assert!(*collapsed);
}
}
#[test]
fn test_approval_prompt() {
let approval = ApprovalPrompt {
call_id: "test-1".into(),
tool_name: "bash".into(),
command: "cargo test".into(),
options: vec![
"Yes".into(),
"Yes, always allow bash".into(),
"No".into(),
],
selected: 0,
};
assert_eq!(approval.options.len(), 3);
assert_eq!(approval.selected, 0);
}
#[test]
fn test_approval_navigation() {
let mut approval = ApprovalPrompt {
call_id: "test-2".into(),
tool_name: "bash".into(),
command: "rm -rf".into(),
options: vec!["Yes".into(), "No".into()],
selected: 0,
};
// Navigate down
approval.selected = (approval.selected + 1).min(approval.options.len() - 1);
assert_eq!(approval.selected, 1);
// Navigate down again (clamped)
approval.selected = (approval.selected + 1).min(approval.options.len() - 1);
assert_eq!(approval.selected, 1);
// Navigate up
approval.selected = approval.selected.saturating_sub(1);
assert_eq!(approval.selected, 0);
}
#[test]
fn test_thinking_state() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
assert!(!app.is_thinking);
app.is_thinking = true;
assert!(app.is_thinking);
}
#[test]
fn test_input_cursor() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
app.input = "hello world".into();
app.cursor_pos = 5;
assert_eq!(&app.input[..app.cursor_pos], "hello");
}
#[test]
fn test_token_tracking() {
let mut app = App::new("sol", "main", "devstral-2", LogBuffer::new());
app.input_tokens = 1200;
app.output_tokens = 340;
assert_eq!(app.input_tokens / 1000, 1);
assert_eq!(app.output_tokens / 1000, 0);
}
}

sunbeam/src/lib.rs Normal file

@@ -0,0 +1,2 @@
// Thin library export for integration tests.
pub mod code;


@@ -1,4 +1,5 @@
mod cli;
mod code;
#[tokio::main]
async fn main() {


@@ -0,0 +1,346 @@
//! Integration test: starts a mock gRPC server and connects the client.
//! Tests the full bidirectional stream lifecycle without needing Sol or Mistral.
use std::pin::Pin;
use std::sync::Arc;
use futures::Stream;
use sunbeam_proto::sunbeam_code_v1::code_agent_server::{CodeAgent, CodeAgentServer};
use sunbeam_proto::sunbeam_code_v1::*;
use tokio::sync::mpsc;
use tokio_stream::wrappers::ReceiverStream;
use tonic::{Request, Response, Status, Streaming};
/// Mock server that echoes back user input as assistant text.
struct MockCodeAgent;
#[tonic::async_trait]
impl CodeAgent for MockCodeAgent {
type SessionStream = Pin<Box<dyn Stream<Item = Result<ServerMessage, Status>> + Send>>;
async fn session(
&self,
request: Request<Streaming<ClientMessage>>,
) -> Result<Response<Self::SessionStream>, Status> {
let mut in_stream = request.into_inner();
let (tx, rx) = mpsc::channel(32);
tokio::spawn(async move {
// Wait for StartSession
if let Ok(Some(msg)) = in_stream.message().await {
if let Some(client_message::Payload::Start(start)) = msg.payload {
let _ = tx.send(Ok(ServerMessage {
payload: Some(server_message::Payload::Ready(SessionReady {
session_id: "test-session-123".into(),
room_id: "!test-room:local".into(),
model: if start.model.is_empty() {
"devstral-2".into()
} else {
start.model
},
resumed: false,
history: vec![],
})),
})).await;
}
}
// Echo loop
while let Ok(Some(msg)) = in_stream.message().await {
match msg.payload {
Some(client_message::Payload::Input(input)) => {
let _ = tx.send(Ok(ServerMessage {
payload: Some(server_message::Payload::Done(TextDone {
full_text: format!("[echo] {}", input.text),
input_tokens: 10,
output_tokens: 5,
})),
})).await;
}
Some(client_message::Payload::End(_)) => {
let _ = tx.send(Ok(ServerMessage {
payload: Some(server_message::Payload::End(SessionEnd {
summary: "Session ended.".into(),
})),
})).await;
break;
}
_ => {}
}
}
});
Ok(Response::new(Box::pin(ReceiverStream::new(rx))))
}
async fn reindex_code(&self, _req: Request<ReindexCodeRequest>) -> Result<Response<ReindexCodeResponse>, Status> {
Ok(Response::new(ReindexCodeResponse { repos_indexed: 0, symbols_indexed: 0, error: "mock".into() }))
}
}
#[tokio::test]
async fn test_session_lifecycle() {
// Start mock server on a random port
let listener = tokio::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
let addr = listener.local_addr().unwrap();
tokio::spawn(async move {
let incoming = tokio_stream::wrappers::TcpListenerStream::new(listener);
tonic::transport::Server::builder()
.add_service(CodeAgentServer::new(MockCodeAgent))
.serve_with_incoming(incoming)
.await
.unwrap();
});
// Give server a moment to start
tokio::time::sleep(std::time::Duration::from_millis(100)).await;
// Connect client
let endpoint = format!("http://{addr}");
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
let mut client = CodeAgentClient::connect(endpoint).await.unwrap();
let (tx, client_rx) = mpsc::channel::<ClientMessage>(32);
let client_stream = ReceiverStream::new(client_rx);
let response = client.session(client_stream).await.unwrap();
let mut rx = response.into_inner();
// Send StartSession
tx.send(ClientMessage {
payload: Some(client_message::Payload::Start(StartSession {
project_path: "/test/project".into(),
prompt_md: "test prompt".into(),
config_toml: String::new(),
git_branch: "main".into(),
git_status: String::new(),
file_tree: vec!["src/".into(), "Cargo.toml".into()],
model: "test-model".into(),
client_tools: vec![],
})),
}).await.unwrap();
// Receive SessionReady
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::Ready(ready)) => {
assert_eq!(ready.session_id, "test-session-123");
assert_eq!(ready.model, "test-model");
}
other => panic!("Expected SessionReady, got {other:?}"),
}
// Send a chat message
tx.send(ClientMessage {
payload: Some(client_message::Payload::Input(UserInput {
text: "hello sol".into(),
})),
}).await.unwrap();
// Receive echo response
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::Done(done)) => {
assert_eq!(done.full_text, "[echo] hello sol");
assert_eq!(done.input_tokens, 10);
assert_eq!(done.output_tokens, 5);
}
other => panic!("Expected TextDone, got {other:?}"),
}
// End session
tx.send(ClientMessage {
payload: Some(client_message::Payload::End(EndSession {})),
}).await.unwrap();
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::End(end)) => {
assert_eq!(end.summary, "Session ended.");
}
other => panic!("Expected SessionEnd, got {other:?}"),
}
}
#[tokio::test]
async fn test_multiple_messages() {
let listener = tokio::net::TcpListener::bind("127.0.0.1:0").await.unwrap();
let addr = listener.local_addr().unwrap();
tokio::spawn(async move {
let incoming = tokio_stream::wrappers::TcpListenerStream::new(listener);
tonic::transport::Server::builder()
.add_service(CodeAgentServer::new(MockCodeAgent))
.serve_with_incoming(incoming)
.await
.unwrap();
});
// Give server a moment to start
tokio::time::sleep(std::time::Duration::from_millis(100)).await;
let endpoint = format!("http://{addr}");
use sunbeam_proto::sunbeam_code_v1::code_agent_client::CodeAgentClient;
let mut client = CodeAgentClient::connect(endpoint).await.unwrap();
let (tx, client_rx) = mpsc::channel::<ClientMessage>(32);
let client_stream = ReceiverStream::new(client_rx);
let response = client.session(client_stream).await.unwrap();
let mut rx = response.into_inner();
// Start
tx.send(ClientMessage {
payload: Some(client_message::Payload::Start(StartSession {
project_path: "/test".into(),
model: "devstral-2".into(),
..Default::default()
})),
}).await.unwrap();
let msg = rx.message().await.unwrap().unwrap();
assert!(
matches!(msg.payload, Some(server_message::Payload::Ready(_))),
"Expected SessionReady, got {:?}",
msg.payload
);
// Send 3 messages and verify each echo
for i in 0..3 {
tx.send(ClientMessage {
payload: Some(client_message::Payload::Input(UserInput {
text: format!("message {i}"),
})),
}).await.unwrap();
let msg = rx.message().await.unwrap().unwrap();
match msg.payload {
Some(server_message::Payload::Done(done)) => {
assert_eq!(done.full_text, format!("[echo] message {i}"));
}
other => panic!("Expected TextDone for message {i}, got {other:?}"),
}
}
}
// ══════════════════════════════════════════════════════════════════════════
// LSP integration tests (requires rust-analyzer on PATH)
// ══════════════════════════════════════════════════════════════════════════
mod lsp_tests {
use sunbeam::code::lsp::detect;
use sunbeam::code::lsp::manager::LspManager;
use sunbeam::code::tools;
#[test]
fn test_detect_servers_in_cli_project() {
let configs = detect::detect_servers(".");
assert!(!configs.is_empty(), "Should detect at least one language server");
let rust = configs.iter().find(|c| c.language_id == "rust");
assert!(rust.is_some(), "Should detect Rust (Cargo.toml present)");
}
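// Additional sketch: detection in a directory with no language manifests
// should yield no server configs. Assumes detect_servers returns an empty
// list when nothing matches; the temp-dir name here is hypothetical.
#[test]
fn test_detect_servers_empty_dir() {
let dir = std::env::temp_dir().join("sunbeam_lsp_detect_empty");
std::fs::create_dir_all(&dir).unwrap();
let configs = detect::detect_servers(dir.to_str().unwrap());
assert!(configs.is_empty(), "No manifests present, so no servers should be detected");
}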
#[test]
fn test_is_lsp_tool() {
assert!(tools::is_lsp_tool("lsp_definition"));
assert!(tools::is_lsp_tool("lsp_references"));
assert!(tools::is_lsp_tool("lsp_hover"));
assert!(tools::is_lsp_tool("lsp_diagnostics"));
assert!(tools::is_lsp_tool("lsp_symbols"));
assert!(!tools::is_lsp_tool("file_read"));
assert!(!tools::is_lsp_tool("bash"));
}
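// Additional sketch: is_lsp_tool should also reject empty and unprefixed
// names (assumes the check matches only the five registered lsp_* tools).
#[test]
fn test_is_lsp_tool_rejects_unknown_names() {
assert!(!tools::is_lsp_tool(""));
assert!(!tools::is_lsp_tool("definition")); // missing lsp_ prefix
assert!(!tools::is_lsp_tool("grep"));
}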
#[tokio::test]
async fn test_lsp_manager_initialize_and_hover() {
// This test requires rust-analyzer on PATH
if std::process::Command::new("rust-analyzer").arg("--version").output().is_err() {
eprintln!("Skipping: rust-analyzer not on PATH");
return;
}
let mut manager = LspManager::new(".");
manager.initialize().await;
if !manager.is_available() {
eprintln!("Skipping: LSP initialization failed");
return;
}
// Hover on a known file in this project
let result = manager.hover("src/main.rs", 1, 1).await;
assert!(result.is_ok(), "Hover should not error: {:?}", result.err());
manager.shutdown().await;
}
#[tokio::test]
async fn test_lsp_document_symbols() {
if std::process::Command::new("rust-analyzer").arg("--version").output().is_err() {
eprintln!("Skipping: rust-analyzer not on PATH");
return;
}
let mut manager = LspManager::new(".");
manager.initialize().await;
if !manager.is_available() {
eprintln!("Skipping: LSP initialization failed");
return;
}
let result = manager.document_symbols("src/main.rs").await;
assert!(result.is_ok(), "Document symbols should not error: {:?}", result.err());
let symbols = result.unwrap();
assert!(!symbols.is_empty(), "Should find symbols in main.rs");
// main.rs should have at least a `main` function
assert!(
symbols.to_lowercase().contains("main"),
"Should find main function, got: {symbols}"
);
manager.shutdown().await;
}
#[tokio::test]
async fn test_lsp_workspace_symbols() {
if std::process::Command::new("rust-analyzer").arg("--version").output().is_err() {
eprintln!("Skipping: rust-analyzer not on PATH");
return;
}
let mut manager = LspManager::new(".");
manager.initialize().await;
if !manager.is_available() {
eprintln!("Skipping: LSP initialization failed");
return;
}
// Wait for rust-analyzer to finish indexing (workspace symbols need the full index)
let mut found = false;
for _ in 0..10 {
tokio::time::sleep(std::time::Duration::from_secs(1)).await;
if let Ok(symbols) = manager.workspace_symbols("CodeCommand", None).await {
if symbols.contains("CodeCommand") {
found = true;
break;
}
}
}
if !found {
eprintln!("Skipping: rust-analyzer did not finish indexing within 10s");
manager.shutdown().await;
return;
}
manager.shutdown().await;
}
#[tokio::test]
async fn test_lsp_graceful_degradation() {
// Use a non-existent project path, so no server config can be detected
let mut manager = LspManager::new("/nonexistent/path");
manager.initialize().await;
assert!(!manager.is_available(), "Should not be available with bad path");
}
}