A fast, terminal-based AI coding agent that works with any LLM provider: Ollama, Anthropic, OpenAI, or Google Gemini. Built in Rust.
- Multi-provider: Ollama (local), Anthropic Claude, OpenAI, Google Gemini
- Full tool suite: file read/write/edit, shell execution, grep, glob search, HTTP, process management
- Autonomous agent loop: chains tool calls until the task is done, no hand-holding required
- File import: `/import <path>` injects any file into the agent's context
- Atomic file writes: temp-file + rename pattern; no corruption on power loss
- Real shell timeouts: commands are killed after their deadline, no hung threads
- Streaming responses revealed on completion: thinking spinner stays visible during generation
- Wrapping input box: grows as you type, no horizontal overflow
- TUI with markdown rendering: headers, bold, code blocks, lists in the chat view
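The atomic-write feature above follows the classic temp-file + rename pattern. Here is a minimal sketch of that technique in Rust (illustrative only, not TyCode's actual code; the function name `atomic_write` is hypothetical):

```rust
// Temp-file + rename pattern: write the full contents to a temporary
// file, then rename it over the destination. On POSIX systems a rename
// within the same filesystem is atomic, so readers see either the old
// file or the complete new one, never a partial write.
use std::fs::{self, File};
use std::io::Write;
use std::path::Path;

fn atomic_write(path: &Path, contents: &[u8]) -> std::io::Result<()> {
    let tmp = path.with_extension("tmp");
    let mut f = File::create(&tmp)?;
    f.write_all(contents)?;
    f.sync_all()?; // flush data to disk before the rename
    fs::rename(&tmp, path) // atomically replace the destination
}

fn main() -> std::io::Result<()> {
    let target = std::env::temp_dir().join("tycode_demo.txt");
    atomic_write(&target, b"hello")?;
    assert_eq!(fs::read(&target)?, b"hello");
    fs::remove_file(&target)?;
    Ok(())
}
```

If the process is killed mid-write, only the temp file is corrupted; the original stays intact.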
Prerequisites: Rust toolchain (rustup)
```bash
git clone https://github.com/AlphaGlider25/TyCode
cd TyCode
cargo build --release
```

Then symlink the binary so `tycode` is available globally:

```bash
mkdir -p ~/.local/bin
ln -sf "$PWD/target/release/tycode" ~/.local/bin/tycode
```

Make sure `~/.local/bin` is in your `PATH` (add to `~/.bashrc` / `~/.zshrc` if needed):

```bash
export PATH="$HOME/.local/bin:$PATH"
```

```bash
# Launch in the current directory; TyCode reads that folder as context
tycode

# From any project
cd ~/my-project
tycode
```

| Command | Description |
|---|---|
| `/help` | Show all commands and key bindings |
| `/model [name]` | Switch model or open the model picker |
| `/provider <name>` | Switch provider (ollama, anthropic, openai, gemini) |
| `/settings` | Edit API keys, model, provider, limits |
| `/import <path>` | Inject a file into the agent's context |
| `/clear` | Clear chat history and agent context |
| `/system <prompt>` | Set a custom system prompt |
| `/exit` | Clean exit (also Ctrl+C) |
| Key | Action |
|---|---|
| Enter | Send message |
| Up / Down | Navigate input history |
| PgUp / PgDn | Scroll chat |
| Ctrl+Home / Ctrl+End | Jump to top / bottom |
| Tab | Autocomplete slash commands |
| Esc | Clear input / close overlay |
Config is stored at `~/.tycode/config.json` and editable via `/settings` inside the TUI.
| Field | Default | Description |
|---|---|---|
| `provider` | `ollama` | Active provider |
| `model` | `gemma3` | Active model |
| `ollama_url` | `http://localhost:11434` | Ollama endpoint |
| `anthropic_api_key` | (empty) | Anthropic API key |
| `openai_api_key` | (empty) | OpenAI API key |
| `openai_base_url` | (empty) | Custom OpenAI-compatible endpoint |
| `google_api_key` | (empty) | Google Gemini API key |
| `max_iterations` | `100` | Max agent loop iterations per prompt |
| `max_tokens` | `8192` | Max tokens per model response |
API keys can also be set via environment variables: ANTHROPIC_API_KEY, OPENAI_API_KEY.
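For example, export the keys before launching (placeholder values shown; substitute your real keys):

```shell
# Placeholder values: replace with your actual API keys
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
```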
Install Ollama, then:

```bash
ollama pull gemma3
tycode
```

TyCode will start Ollama automatically if it isn't running.
```bash
tycode
# inside TyCode:
/provider anthropic
/settings   # enter your API key
```

```bash
/provider openai
/settings   # enter your API key
```

```bash
/provider gemini
/settings   # enter your Google API key
```

GNU General Public License v3.0. See LICENSE.