# AI SDK for Crystal

A port of the Vercel AI SDK to Crystal.

**Status:** Active development. Phases 0-4 and 6-9 are complete (core SDK, OpenAI/Google providers, UI helpers, and agents). See Migration Status for details.
## Overview

The AI SDK for Crystal provides a unified interface for interacting with AI providers such as OpenAI and Google. It is a Crystal port of the popular JavaScript/TypeScript Vercel AI SDK.
## Implemented Features

- **Text Generation** (`AI.generate_text`): generate text with tool support
- **Streaming Text** (`AI.stream_text`): stream text generation with callbacks
- **Structured Output** (`AI.generate_object`): generate typed JSON objects
- **Streaming Objects** (`AI.stream_object`): stream partial objects as they generate
- **Embeddings** (`AI.embed`, `AI.embed_many`): generate embeddings
- **Image Generation** (`AI.generate_image`): generate images
- **Speech** (`AI.generate_speech`): text-to-speech
- **Transcription** (`AI.transcribe`): speech-to-text
- **Reranking** (`AI.rerank`): rerank documents
- **Tool Calling** (`AI.tool`): define and execute tools
- **Smooth Streaming** (`AI.smooth_stream`): word/line chunking for a smoother UX
- **Agents** (`AI.agent`): autonomous tool-calling loops
- **UI Helpers** (`AI::UI`): Server-Sent Events (SSE) and stream formatting for frontend integration
## Planned Features
- Provider Registry & Middleware (Phase 5)
- Additional Providers: Anthropic, Mistral, Cohere, Bedrock, etc. (Phase 11)
- Advanced Features: Gateway Provider, Message Pruning (Phase 10)
- Ecosystem: MCP (Model Context Protocol), Tool Call Repair (Phase 12)
- Telemetry & OpenTelemetry integration (Phase 10)
## Installation

Add the dependency to your `shard.yml`:

```yaml
dependencies:
  ai_sdk:
    github: krthr/ai-sdk
    branch: main
```

Then run:

```shell
shards install
```
## Usage

### Text Generation

```crystal
require "ai_sdk"

# Initialize model (uses OPENAI_API_KEY env var by default)
model = AI::OpenAI.chat("gpt-4o-mini")

result = AI.generate_text(
  model: model,
  prompt: "What is the meaning of life?"
)

puts result.text
```
### Streaming Text

```crystal
model = AI::OpenAI.chat("gpt-4o-mini")

result = AI.stream_text(
  model: model,
  prompt: "Write a short story about a robot"
)

# Stream text deltas
result.text_stream do |text|
  print text
end
```
### Structured Output (Objects)

```crystal
model = AI::OpenAI.chat("gpt-4o-mini")

result = AI.generate_object(
  model: model,
  schema: {
    "type" => "object",
    "properties" => {
      "recipe_name" => {"type" => "string"},
      "ingredients" => {
        "type" => "array",
        "items" => {"type" => "string"}
      }
    }
  },
  prompt: "Generate a simple cookie recipe"
)

puts result.object["recipe_name"]
```
### Tool Calling

```crystal
weather_tool = AI.tool(
  description: "Get the weather for a location",
  parameters: {
    "type" => "object",
    "properties" => {
      "location" => {"type" => "string"}
    },
    "required" => ["location"]
  },
  execute: ->(args : JSON::Any) {
    # In a real app, you'd call a weather API here
    JSON::Any.new({"temperature" => JSON::Any.new(72_i64), "unit" => JSON::Any.new("F")})
  }
)

model = AI::OpenAI.chat("gpt-4o-mini")

result = AI.generate_text(
  model: model,
  prompt: "What's the weather in San Francisco?",
  tools: {"get_weather" => weather_tool}
)

# If the model called the tool, the results are handled automatically
# and can be inspected afterwards:
result.tool_results.each do |tool_result|
  puts "Tool: #{tool_result.tool_name}, Result: #{tool_result.result}"
end
```
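### Agents

`AI.agent` (listed under Implemented Features) wraps the generate/execute-tools loop from the example above into an autonomous agent. The constructor arguments below are assumptions, not documented in this README, so treat this as a hedged sketch that mirrors the `model`/`tools` arguments of `AI.generate_text`; check the source under `src/ai/agent/` for the real signature.

```crystal
require "ai_sdk"

# Hypothetical sketch: the argument names (model, tools) and the
# `generate` method are assumptions about the AI.agent API.
agent = AI.agent(
  model: AI::OpenAI.chat("gpt-4o-mini"),
  tools: {"get_weather" => weather_tool}
)

# The agent loops autonomously: call the model, execute any requested
# tools, feed the results back, and stop at a final text answer.
result = agent.generate(prompt: "Compare the weather in SF and NYC")
puts result.text
```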
### Embeddings

```crystal
# embedding_model is an embedding model instance from your provider

# Single embedding
result = AI.embed(
  model: embedding_model,
  value: "Hello, world!"
)
puts result.embedding # => [0.1, 0.2, ...]

# Multiple embeddings
result = AI.embed_many(
  model: embedding_model,
  values: ["Hello", "World"]
)
puts result.embeddings.size # => 2
```
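A common next step is to compare embedding vectors with cosine similarity. The SDK's `result.embedding` values are plain float arrays, so a helper like the following works; note this helper is not part of `ai_sdk`, just a plain-Crystal sketch.

```crystal
# Cosine similarity between two equal-length embedding vectors.
# Not part of ai_sdk; a plain-Crystal helper for illustration.
def cosine_similarity(a : Array(Float64), b : Array(Float64)) : Float64
  raise ArgumentError.new("size mismatch") unless a.size == b.size
  dot = norm_a = norm_b = 0.0
  a.each_with_index do |x, i|
    dot += x * b[i]
    norm_a += x * x
    norm_b += b[i] * b[i]
  end
  dot / (Math.sqrt(norm_a) * Math.sqrt(norm_b))
end

puts cosine_similarity([1.0, 0.0], [1.0, 0.0]) # => 1.0
puts cosine_similarity([1.0, 0.0], [0.0, 1.0]) # => 0.0
```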
### Smooth Streaming

```crystal
# `result` is the return value of a prior AI.stream_text call
smoother = AI.smooth_stream(chunking: :word, delay_ms: 10)

result.full_stream do |part|
  smoother.transform(part) do |smoothed|
    # Smoothed parts arrive word-by-word
    print smoothed.as(AI::TextDeltaPart).text if smoothed.is_a?(AI::TextDeltaPart)
  end
end

smoother.flush { |p| print p.as(AI::TextDeltaPart).text if p.is_a?(AI::TextDeltaPart) }
```
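### UI Helpers (SSE)

`AI::UI` (listed under Implemented Features) handles Server-Sent Events formatting for frontend integration. Its actual API is not shown in this README, so the sketch below only illustrates the SSE wire format it targets, using a hypothetical plain-Crystal helper rather than the real `AI::UI` calls.

```crystal
# Conceptual sketch of the SSE wire format (`data: ...\n\n`) that
# AI::UI produces. `to_sse_event` is a hypothetical helper, NOT the
# real AI::UI API.
def to_sse_event(data : String, event : String? = nil) : String
  String.build do |io|
    io << "event: " << event << '\n' if event
    data.each_line { |line| io << "data: " << line << '\n' }
    io << '\n' # blank line terminates the event
  end
end

print to_sse_event("Hello", "text-delta")
# event: text-delta
# data: Hello
```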
## Project Structure

```text
ai-sdk-crystal/
├── src/
│   ├── ai/             # Core SDK (AI module)
│   │   ├── agent/      # Agent abstraction
│   │   ├── ui/         # UI & streaming helpers
│   │   ├── generate_text.cr
│   │   ├── stream_text.cr
│   │   └── ...
│   ├── provider/       # Provider interfaces (AI::Provider)
│   ├── provider_utils/ # Shared utilities (AI::ProviderUtils)
│   ├── openai/         # OpenAI provider
│   └── google/         # Google provider
├── spec/               # Tests (170 examples)
├── context/            # Migration documentation
└── shard.yml
```
## Documentation
- Migration Status - Overall migration progress
- Detailed Progress - Per-phase task tracking
- Test Compatibility - JS test porting status
- Crystal Patterns - Technical decisions and patterns
## Development

```shell
# Install dependencies
shards install

# Run tests (170 examples currently)
crystal spec

# Run linter
./bin/ameba

# Format code
crystal tool format
```
## Test Coverage
| Module | Tests |
|---|---|
| Provider Utils | 25 |
| Core SDK (generate_text, agents, etc.) | 81 |
| Providers (OpenAI, Google) | 35 |
| Streaming (stream_text, stream_object) | 21 |
| Smooth Stream | 8 |
| **Total** | **170** |
## Contributing

1. Fork it (https://github.com/krthr/ai-sdk-crystal/fork)
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
See AGENTS.md for detailed contribution guidelines.
## License

MIT License - see LICENSE
## Acknowledgments
- Vercel AI SDK - The original TypeScript SDK
- Crest - HTTP client for Crystal
## Contributors
- Wilson Tovar - creator and maintainer