# Operator reference

The paths, providers, commands, packaging steps, and publishing notes you usually need close at hand.
## Paths
| Item | Path |
|---|---|
| Linux config | ~/.config/tamux/config.json |
| macOS config | ~/Library/Application Support/tamux/config.json |
| Windows config | %APPDATA%\tamux\config.json |
| Unix runtime data | ~/.tamux/ |
| Windows runtime data | %LOCALAPPDATA%\tamux\ |
| Source build outputs | target/debug or target/release |
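The config path from the table can be resolved in a shell script. A minimal sketch, using `uname`-based platform detection (an approximation; on Windows this assumes an MSYS-style environment where `$APPDATA` is set):

```shell
# Resolve the tamux config path per the table above.
case "$(uname -s)" in
  Darwin)               CONFIG="$HOME/Library/Application Support/tamux/config.json" ;;
  MINGW*|MSYS*|CYGWIN*) CONFIG="$APPDATA/tamux/config.json" ;;
  *)                    CONFIG="$HOME/.config/tamux/config.json" ;;
esac
echo "$CONFIG"
```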
## Providers
tamux supports multiple upstream provider types, including OpenAI-compatible, Anthropic Messages, Alibaba Coding Plan, and custom endpoints.
| Provider | ID | Default model | Base URL |
|---|---|---|---|
| Featherless | featherless | meta-llama/Llama-3.3-70B-Instruct | api.featherless.ai |
| OpenAI | openai | gpt-4o | api.openai.com |
| Anthropic | anthropic | claude-sonnet-4-20250514 | api.anthropic.com |
| Qwen | qwen | qwen-max | api.qwen.com |
| Qwen (DeepInfra) | qwen-deepinfra | Qwen/Qwen2.5-72B-Instruct | api.deepinfra.com |
| Kimi (Moonshot) | kimi | moonshot-v1-32k | api.moonshot.ai |
| Kimi Coding Plan | kimi-coding-plan | kimi-for-coding | api.kimi.com/coding |
| Z.AI | z.ai | glm-4-plus | api.z.ai |
| Z.AI Coding Plan | z.ai-coding-plan | glm-5 | api.z.ai/api/coding/paas/v4 |
| OpenRouter | openrouter | arcee-ai/trinity-large-thinking | openrouter.ai |
| Cerebras | cerebras | llama-3.3-70b | api.cerebras.ai |
| Together | together | meta-llama/Llama-3.3-70B-Instruct-Turbo | api.together.xyz |
| Groq | groq | llama-3.3-70b-versatile | api.groq.com |
| Ollama | ollama | llama3.1 | localhost:11434 |
| Chutes | chutes | deepseek-ai/DeepSeek-V3 | llm.chutes.ai |
| Hugging Face | huggingface | meta-llama/Llama-3.3-70B-Instruct | api-inference.huggingface.co |
| MiniMax | minimax | MiniMax-M1-80k | api.minimax.io |
| MiniMax Coding Plan | minimax-coding-plan | MiniMax-M2.7 | api.minimax.io/anthropic |
| Alibaba Coding Plan | alibaba-coding-plan | qwen3-coder | coding-intl.dashscope.aliyuncs.com |
| OpenCode Zen | opencode-zen | claude-sonnet-4-5 | opencode.ai/zen |
| Custom | custom | user-defined | user-defined |
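A provider entry in `config.json` might look like the following. The exact schema is not documented here, so the key names (`provider`, `model`, `api_key`) are illustrative assumptions only; the values come from the OpenRouter row above:

```json
{
  "provider": "openrouter",
  "model": "arcee-ai/trinity-large-thinking",
  "api_key": "YOUR_API_KEY"
}
```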
## Build

```bash
# Rust workspace
cargo build --release

# Individual crates
cargo build --release -p tamux-daemon
cargo build --release -p tamux-cli
cargo build --release -p tamux-gateway
cargo build --release -p tamux-mcp
cargo build --release -p tamux-protocol
```

```bash
# Frontend bundle
cd frontend
npm install
npm run build
```

```bash
# Electron package
cd frontend
npm run build:electron
```
## Run

```bash
# Start the daemon
cargo run --release --bin tamux-daemon
```

```bash
# Launch Electron
cd frontend
npm run dev:electron
```

```bash
# CLI examples
cargo run --release --bin tamux -- list
cargo run --release --bin tamux -- new --shell bash
cargo run --release --bin tamux -- attach <session-id>
cargo run --release --bin tamux -- ping
```
## Packaging

```bash
./scripts/build-release.sh
./scripts/build-production-releases.sh
./scripts/build-release-macos.sh
./scripts/build-release-wsl.sh
```
Typical output lands under dist-release/ with platform-specific Linux and Windows bundles, checksums, and release notes.
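Checksums can be verified with `sha256sum -c`. A self-contained sketch against a throwaway artifact; in practice, run the check inside `dist-release/` against the checksum file the build scripts emit (`SHA256SUMS` below is a placeholder name, not confirmed):

```shell
# Demonstrate checksum verification with a dummy artifact in a scratch dir.
demo=$(mktemp -d)
cd "$demo"
echo "demo artifact" > tamux-demo.tar.gz
sha256sum tamux-demo.tar.gz > SHA256SUMS   # placeholder checksum filename
sha256sum -c SHA256SUMS                     # prints "tamux-demo.tar.gz: OK"
```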
## Notifications & voice
tamux supports in-app attention notifications via OSC sequences and voice workflows through STT/TTS integrations.
```bash
printf '\033]9;Claude needs attention\007'
printf '\033]777;notify;Claude;Waiting for your input\007'
printf '\033]99;Codex finished task\007'
```
- TUI shortcuts for voice workflows include record/transcribe, speak selected message, and stop playback.
- Desktop app includes mic/speak controls backed by daemon configuration.
## GitHub Pages

- Keep authoring in `github-docs/`.
- Either copy the contents into a GitHub Pages-served location such as `/docs`, or publish the folder from a dedicated Pages branch or workflow.
- Because the site uses only relative links, it works cleanly under a repository subpath.
- Keep the included `.nojekyll` file to avoid Jekyll interfering with raw assets.
- No build step: the site is already plain HTML/CSS/JS and can be served directly.
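One way to publish via the `/docs` option is a straight copy. Sketched below against a scratch directory so it is self-contained; in practice, run the `cp` from the repo root:

```shell
# Mirror the authored site into docs/ for GitHub Pages
# ("Deploy from a branch", /docs folder). Demonstrated in a scratch dir.
demo=$(mktemp -d)
mkdir -p "$demo/github-docs"
printf '<!doctype html><title>tamux docs</title>\n' > "$demo/github-docs/index.html"
touch "$demo/github-docs/.nojekyll"
cd "$demo"
mkdir -p docs
cp -r github-docs/. docs/   # the trailing /. also copies dotfiles like .nojekyll
ls -A docs
```

The `/.` suffix on the source matters: it copies the directory's contents, including `.nojekyll`, rather than nesting `github-docs/` inside `docs/`.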