autobot v0.1.2

Ultra-efficient personal AI assistant powered by Crystal

Fast • Secure • Efficient

Why Autobot?

Inspired by nanobot and picoclaw, rebuilt in Crystal with a security-first, efficiency-first approach.

🎯 Token Efficient: Structured tool results • Memory consolidation • Minimal context overhead • Session management
📊 Observable: Status-based logging • Credential sanitization • Token tracking • Operation audit trails
🔒 Secure: Docker/bubblewrap isolation • OS-level workspace restrictions • No manual path validation • SSRF protection • Command guards
⚡ Lightweight: Tiny binary • <50MB Docker image • Zero runtime deps • <100ms startup • Streaming I/O

🛡️ Production-Grade Security

Autobot uses kernel-enforced sandboxing via Docker or bubblewrap — not application-level validation. When the LLM executes commands:

  • Only workspace directory is accessible (enforced by Linux mount namespaces)
  • Everything else is invisible to the LLM — your /home, /etc, system files simply don't exist in the sandbox
  • No symlink exploits, TOCTOU, or path traversal — kernel guarantees workspace isolation
  • Process isolation — LLM can't see or interact with host processes
  • Auto-detected — Uses Docker (macOS/production) or bubblewrap (Linux/dev)

Example: When the LLM tries ls ../, it fails at the OS level because the parent directory simply isn't mounted. No regex patterns, no validation bypasses, just kernel namespaces.
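The boundary can be illustrated with a plain bubblewrap invocation (illustrative flags only; Autobot's actual command line may differ):

```shell
# Build a bubblewrap command that exposes ONLY the workspace read-write.
# Everything else is either absent or mounted read-only inside the sandbox.
WORKSPACE="$PWD/workspace"
SANDBOX="bwrap --unshare-all --die-with-parent \
  --ro-bind /usr /usr --ro-bind /lib /lib \
  --proc /proc --dev /dev \
  --bind $WORKSPACE /workspace --chdir /workspace"

# Inside this sandbox, 'ls ..' lists only the mounted directories above,
# never the host's /home or /etc:
echo "$SANDBOX ls .."
```

Because the restriction lives in the mount namespace, there is nothing for a path-traversal payload to bypass.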

Security Architecture

✨ Features

Core Engine

  • Multi-provider LLM (Anthropic, OpenAI, DeepSeek, Groq, Gemini, OpenRouter, vLLM)
  • JSONL sessions with memory consolidation
  • Built-in tools: file ops, shell exec, web search/fetch

Integrations

  • Chat channels: Telegram, Slack, WhatsApp
  • Cron scheduler with expressions and intervals
  • Plugin system for custom tools
  • Bash script auto-discovery as tools

Advanced

  • Skills: Markdown-based with frontmatter
  • Custom commands: macros or bash scripts
  • Subagents for parallel tasks
  • Full observability: tokens, files, operations
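Skills are Markdown files with frontmatter; a minimal skill file might look like this (the frontmatter field names are illustrative, not the documented schema):

```markdown
---
name: changelog
description: Summarize recent git history as a changelog entry
---

When asked for a changelog, run `git log --oneline -20` in the
workspace and group the results into features, fixes, and chores.
```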

🚀 Quick Start

1. Install

# macOS (Homebrew)
brew tap crystal-autobot/tap
brew install autobot

# Linux/macOS - Download binary
curl -L "https://github.com/crystal-autobot/autobot/releases/latest/download/autobot-$(uname -s | tr '[:upper:]' '[:lower:]')-$(uname -m)" -o autobot
chmod +x autobot
sudo mv autobot /usr/local/bin/

# Or build from source
git clone https://github.com/crystal-autobot/autobot.git
cd autobot
make release
sudo install -m 0755 bin/autobot /usr/local/bin/autobot

# Or use Docker (multi-arch: amd64, arm64)
docker pull ghcr.io/crystal-autobot/autobot:latest

2. Create a new bot

autobot new optimus
cd optimus

This creates an optimus/ directory with everything you need:

optimus/
├── .env              # API keys (add yours here)
├── .gitignore        # Excludes secrets, sessions, logs
├── config.yml        # Configuration (references .env vars)
├── sessions/         # Conversation history
├── logs/             # Application logs
└── workspace/        # Sandboxed LLM workspace
    ├── AGENTS.md     # Agent instructions
    ├── SOUL.md       # Personality definition
    ├── USER.md       # User preferences
    ├── memory/       # Long-term memory
    └── skills/       # Custom skills

3. Configure

Edit .env and add your API keys:

ANTHROPIC_API_KEY=sk-ant-...

The generated config.yml references these via ${ENV_VAR} — no secrets in config files.
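For example, a provider entry points at the variable instead of the literal key:

```yaml
providers:
  anthropic:
    api_key: "${ANTHROPIC_API_KEY}"   # resolved from .env at load time
```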

4. Run

# Validate configuration
autobot doctor

# Interactive mode
autobot agent

# Single command
autobot agent -m "Summarize this project"

# Gateway (all channels)
autobot gateway

Autobot automatically detects and logs the sandbox method on startup — Docker on macOS/production, bubblewrap on Linux.

Full Quick Start Guide

📚 Documentation

  • Quick Start: Installation and first steps
  • Configuration: Complete config reference
  • Security: Security model and best practices
  • Deployment: Production deployment with proper user/permissions
  • Architecture: System design and components
  • Plugins: Building and using plugins
  • Development: Contributing and local setup

💡 Examples

Telegram Bot with Custom Commands
channels:
  telegram:
    enabled: true
    token: "BOT_TOKEN"
    allow_from: ["your_username"]
    custom_commands:
      macros:
        summarize: "Summarize our conversation in 3 bullet points"
        translate: "Translate the following to English"
      scripts:
        deploy: "/home/user/scripts/deploy.sh"
        status: "/home/user/scripts/system_status.sh"

Use /summarize or /deploy in Telegram to trigger them.
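A script command is just an executable whose stdout is sent back to the channel; a hypothetical system_status.sh could be as simple as (contents invented for illustration):

```shell
#!/bin/sh
# Hypothetical system_status.sh: whatever this prints is what the
# /status command replies with in the chat (behavior assumed).
printf 'host: %s\n' "$(hostname)"
printf 'load: %s\n' "$(uptime)"
printf 'disk: %s\n' "$(df -h / | awk 'NR==2 {print $5 " of " $2 " used"}')"
```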

Cron Scheduler
# Daily morning greeting
autobot cron add --name "morning" \
  --message "Good morning! Here's today's summary" \
  --cron "0 9 * * *"

# Hourly reminder
autobot cron add --name "reminder" \
  --message "Stand up and stretch!" \
  --every 3600

# One-time meeting notification
autobot cron add --name "meeting" \
  --message "Team sync in 5 minutes!" \
  --at "2025-03-01T10:00:00"

Multi-Provider Setup
providers:
  anthropic:
    api_key: "${ANTHROPIC_API_KEY}"
  openai:
    api_key: "${OPENAI_API_KEY}"
  deepseek:
    api_key: "${DEEPSEEK_API_KEY}"
  vllm:
    api_base: "http://localhost:8000"
    api_key: "token"

agents:
  defaults:
    model: "anthropic/claude-sonnet-4-5"
    max_tokens: 8192
    temperature: 0.7
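The model string follows a provider/model convention, so switching the default to another configured provider is a one-line change (the DeepSeek model name below is illustrative):

```yaml
agents:
  defaults:
    model: "deepseek/deepseek-chat"   # the prefix selects the provider block
    max_tokens: 8192
    temperature: 0.7
```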

🔧 Development

Prerequisites

  • Crystal toolchain (the project is written in Crystal and builds via make)
  • make, plus ameba for make lint
Commands

make build          # Debug binary
make release        # Optimized binary (~2MB)
make test           # Run test suite
make lint           # Run ameba linter
make format         # Format code

make docker         # Build Docker image
make release-all    # Cross-compile for all platforms
make help           # Show all targets

Development Guide

License

MIT License
