Crybot
Crybot is a personal AI assistant built in Crystal, inspired by nanobot (Python). Compared to its Python inspiration, it gains performance from Crystal's compiled native binaries, static typing, and lightweight concurrency (fibers).
Features
- Multiple LLM Support: Supports OpenAI, Anthropic, OpenRouter, vLLM, and z.ai / Zhipu GLM models
- Provider Auto-Detection: Automatically selects provider based on model name prefix
- Tool Calling: Built-in tools for file operations, shell commands, and web search/fetch
- MCP Support: Model Context Protocol client for connecting to external tools and resources
- Session Management: Persistent conversation history with JSONL storage
- Telegram Integration: Full Telegram bot support with message tracking and auto-restart on config changes
- Interactive REPL: Fancyline-powered REPL with syntax highlighting, autocomplete, and history
- Workspace System: Organized workspace with memory, skills, and bootstrap files
Yes, it DOES work.
It can even reconfigure itself.
Installation
- Clone the repository
- Install dependencies: shards install
- Build: shards build
Configuration
Run the onboarding command to initialize:
./bin/crybot onboard
This creates:
- Configuration file: ~/.crybot/config.yml
- Workspace directory: ~/.crybot/workspace/
Edit ~/.crybot/config.yml to add your API keys:
providers:
zhipu:
api_key: "your_api_key_here" # Get from https://open.bigmodel.cn/
openai:
api_key: "your_openai_key" # Get from https://platform.openai.com/
anthropic:
api_key: "your_anthropic_key" # Get from https://console.anthropic.com/
openrouter:
api_key: "your_openrouter_key" # Get from https://openrouter.ai/
vllm:
api_key: "" # Often empty for local vLLM
api_base: "http://localhost:8000/v1"
Selecting a Model
Set the default model in your config:
agents:
defaults:
model: "gpt-4o-mini" # Uses OpenAI
# model: "claude-3-5-sonnet-20241022" # Uses Anthropic
# model: "anthropic/claude-3.5-sonnet" # Uses OpenRouter
# model: "glm-4.7-flash" # Uses Zhipu (default)
Or use the provider/model format to explicitly specify:
model: "openai/gpt-4o-mini"
model: "anthropic/claude-3-5-sonnet-20241022"
model: "openrouter/deepseek/deepseek-chat"
model: "vllm/my-custom-model"
The provider is auto-detected from model name patterns:
- gpt-* → OpenAI
- claude-* → Anthropic
- glm-* → Zhipu
- deepseek-*, qwen-* → OpenRouter
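The actual matching lives in crybot's source; purely as an illustration (the function name, the explicit-prefix handling, and the Zhipu fallback below are assumptions, not the real code), prefix-based detection in Crystal might look like:

```crystal
# Hypothetical sketch of prefix-based provider detection, not crybot's actual
# implementation. An explicit "provider/model" prefix wins; otherwise the
# model name pattern decides, with an assumed fallback to Zhipu.
def detect_provider(model : String) : String
  # Explicit form, e.g. "openai/gpt-4o-mini" or "openrouter/deepseek/deepseek-chat"
  return model.split("/", 2).first if model.includes?("/")

  case model
  when .starts_with?("gpt-")    then "openai"
  when .starts_with?("claude-") then "anthropic"
  when .starts_with?("glm-")    then "zhipu"
  when .starts_with?("deepseek-"), .starts_with?("qwen-")
    "openrouter"
  else
    "zhipu" # assumed default
  end
end

puts detect_provider("gpt-4o-mini")                       # => openai
puts detect_provider("openrouter/deepseek/deepseek-chat") # => openrouter
```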
Usage
REPL Mode (Recommended)
The advanced REPL powered by Fancyline provides:
- Syntax highlighting for built-in commands
- Tab autocompletion for commands
- Command history (saved to ~/.crybot/repl_history.txt)
- History search with Ctrl+R
- Navigation with Up/Down arrows
- Custom prompt showing current model
./bin/crybot repl
Built-in REPL commands:
- help - Show available commands
- model - Display current model
- clear - Clear screen
- quit / exit - Exit REPL
Simple Interactive Mode
./bin/crybot agent -m "Your message here"
Voice Mode
Voice-activated interaction using whisper.cpp stream mode:
./bin/crybot voice
Requirements:
- Install whisper.cpp with whisper-stream:
  - Arch Linux: pacman -S whisper.cpp
  - From source:
    git clone https://github.com/ggerganov/whisper.cpp
    cd whisper.cpp
    make whisper-stream
- Run crybot voice:
  ./bin/crybot voice
How it works:
- whisper-stream continuously transcribes audio to text
- Crybot listens for the wake word (default: "crybot")
- When detected, the command is extracted and sent to the agent
- Response is both displayed and spoken aloud
- Press Ctrl+C to stop
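A minimal sketch of that loop, assuming a line-based read of whisper-stream's output and a placeholder agent call (none of the helper names below come from crybot's source):

```crystal
# Hypothetical sketch of the voice loop, not crybot's actual implementation.
# whisper-stream prints transcribed text; watch for the wake word, forward the
# rest of the utterance to the agent, and speak the reply.

WAKE_WORD = "crybot"

# Stand-in for the real agent call; crybot routes this through the LLM agent.
def ask_agent(prompt : String) : String
  "You said: #{prompt}"
end

def speak(text : String)
  # festival --tts reads text from stdin and speaks it (crybot's fallback TTS).
  Process.run("festival", ["--tts"], input: IO::Memory.new(text))
end

Process.run("whisper-stream", ["-m", "/path/to/ggml-base.en.bin"]) do |whisper|
  whisper.output.each_line do |line|
    text = line.strip.downcase
    next unless text.includes?(WAKE_WORD)

    command = text.split(WAKE_WORD, 2).last.strip
    next if command.empty?

    reply = ask_agent(command)
    puts reply
    speak(reply)
  end
end
```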
TTS (Text-to-Speech): Responses are spoken using Piper (neural TTS) or festival as fallback. Install on Arch: pacman -S piper-tts festival
Voice Configuration (optional, in ~/.crybot/config.yml):
voice:
wake_word: "hey assistant" # Custom wake word
whisper_stream_path: "/usr/bin/whisper-stream"
model_path: "/path/to/ggml-base.en.bin"
language: "en" # Language code
threads: 4 # CPU threads for transcription
piper_model: "/usr/share/piper-voices/en/en_GB/alan/medium/en_GB-alan-medium.onnx" # Piper voice model
piper_path: "/usr/bin/piper-tts" # Path to piper-tts binary
Telegram Gateway
./bin/crybot gateway
Configure Telegram in config.yml:
channels:
telegram:
enabled: true
token: "YOUR_BOT_TOKEN"
allow_from: [] # Empty = allow all users
Get a bot token from @BotFather on Telegram.
Auto-Restart: The gateway automatically restarts when you modify ~/.crybot/config.yml, so you can change models or add API keys without manually restarting the service.
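One simple way to implement such a watch, shown here only as a sketch under assumptions (crybot's actual mechanism may differ), is to poll the config file's modification time and re-exec the gateway when it changes:

```crystal
# Hypothetical sketch of config-change detection, not necessarily how crybot
# does it: poll the config file's mtime and restart the gateway on change.
config_path = Path.home / ".crybot" / "config.yml"
last_mtime  = File.info(config_path).modification_time

loop do
  sleep 2.seconds
  if File.info(config_path).modification_time != last_mtime
    puts "config.yml changed, restarting gateway..."
    # Replace the current process with a fresh `crybot gateway`
    Process.exec(Process.executable_path.not_nil!, ["gateway"])
  end
end
```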
Built-in Tools
File Operations
- read_file - Read file contents
- write_file - Write/create files
- edit_file - Edit files (find and replace)
- list_dir - List directory contents
System & Web
- exec - Execute shell commands
- web_search - Search the web (Brave Search API)
- web_fetch - Fetch and read web pages
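For illustration only (this is not crybot's implementation, and error handling is stripped down), a web_fetch-style tool essentially boils down to:

```crystal
# Hypothetical illustration of what a web_fetch-style tool reduces to:
# fetch a URL and return the body text.
require "http/client"

def web_fetch(url : String) : String
  response = HTTP::Client.get(url)
  raise "HTTP #{response.status_code}" unless response.success?
  response.body
end

puts web_fetch("https://example.com")[0, 200]
```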
Memory Management
- save_memory - Save important information to long-term memory (MEMORY.md)
- search_memory - Search long-term memory and daily logs for information
- list_recent_memories - List recent memory entries from daily logs
- record_memory - Record events or observations to the daily log
- memory_stats - Get statistics about memory usage
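As a rough sketch of how a record_memory-style tool could persist daily-log entries (the directory layout and file names here are assumptions, not crybot's actual workspace structure):

```crystal
# Hypothetical sketch of a record_memory-style tool: append a timestamped
# entry to a per-day log file inside the workspace.
def record_memory(entry : String)
  log_dir = Path.home / ".crybot" / "workspace" / "memory"
  Dir.mkdir_p(log_dir)

  log_file = log_dir / "#{Time.local.to_s("%Y-%m-%d")}.md"
  File.open(log_file, "a") do |f|
    f.puts "- [#{Time.local.to_s("%H:%M")}] #{entry}"
  end
end

record_memory("User prefers concise answers")
```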
MCP Integration
Crybot supports the Model Context Protocol (MCP), which allows it to connect to external tools and resources via stdio-based MCP servers.
Configuring MCP Servers
Add MCP servers to your ~/.crybot/config.yml:
mcp:
servers:
# Filesystem access
- name: filesystem
command: npx -y @modelcontextprotocol/server-filesystem /path/to/allowed/directory
# GitHub integration
- name: github
command: npx -y @modelcontextprotocol/server-github
# Requires GITHUB_TOKEN environment variable
# Brave Search
- name: brave-search
command: npx -y @modelcontextprotocol/server-brave-search
# Requires BRAVE_API_KEY environment variable
# PostgreSQL database
- name: postgres
command: npx -y @modelcontextprotocol/server-postgres "postgresql://user:pass@localhost/db"
Available MCP Servers
Find more MCP servers at https://github.com/modelcontextprotocol/servers
How It Works
- When Crybot starts, it connects to all configured MCP servers
- Tools provided by each server are automatically registered
- The agent can call these tools just like built-in tools
- MCP tools appear with the server name as prefix (e.g., filesystem/write_file)
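A rough sketch of that registration step, with made-up type and method names (not crybot's actual API), could look like:

```crystal
# Hypothetical sketch of MCP tool registration: each tool discovered on a
# server is registered under "<server name>/<tool name>" so names never collide.
record McpTool, name : String, description : String

class ToolRegistry
  def initialize
    @tools = {} of String => McpTool
  end

  def register_mcp_server(server_name : String, tools : Array(McpTool))
    tools.each do |tool|
      @tools["#{server_name}/#{tool.name}"] = tool
    end
  end

  def names
    @tools.keys
  end
end

registry = ToolRegistry.new
registry.register_mcp_server("filesystem", [
  McpTool.new("read_file", "Read a file"),
  McpTool.new("write_file", "Write a file")
])
puts registry.names # => ["filesystem/read_file", "filesystem/write_file"]
```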
Configuration Fields
| Field | Required | Description |
|---|---|---|
| name | Yes | Unique identifier for this server (used as tool name prefix) |
| command | No* | Shell command to start the stdio-based MCP server |
| url | No* | URL for HTTP-based MCP servers (not yet implemented) |
*Either command or url must be provided (currently only command is supported)
Example Session
If you configure the filesystem server:
mcp:
servers:
- name: fs
command: npx -y @modelcontextprotocol/server-filesystem /home/user/projects
Then tools like fs/read_file, fs/write_file, fs/list_directory will be automatically available to the agent.
Development
Run linter:
ameba --fix
Build:
shards build
License
MIT