# ai-sdk

**AI SDK for Crystal**

A Crystal port of the Vercel AI SDK that provides a unified interface for interacting with AI providers.
## Features

- **Unified API** - Single interface for multiple AI providers
- **Type-safe** - Full Crystal type safety with compile-time checks
- **Streaming support** - Real-time streaming responses via SSE
- **Multiple model types** - Language, embedding, and image models
- **Provider tools** - Google Search, Code Execution, Web Search, and more
## Supported Providers

| Provider | Language Models | Embeddings | Images | Status |
|---|---|---|---|---|
| Google Generative AI | Gemini 1.5/2.0 | text-embedding | imagen | Complete |
| OpenAI | GPT-4o, GPT-5, o3, o4-mini | text-embedding-3 | DALL-E 3 | Complete |
## Installation

Add to your `shard.yml`:

```yaml
dependencies:
  ai-sdk:
    github: krthr/ai-sdk
    version: ~> 0.1.0
```

Then run:

```shell
shards install
```
## Quick Start

### Google Generative AI

```crystal
require "ai_sdk"

# Set environment variable: GOOGLE_GENERATIVE_AI_API_KEY

# Create a model
model = AI::Google.google.chat("gemini-2.0-flash")

# Generate text
result = model.do_generate(
  AI::Provider::LanguageModel::CallOptions.new(
    prompt: [
      AI::Provider::LanguageModel::UserMessage.new(
        content: [AI::Provider::LanguageModel::TextPart.new(text: "Hello, how are you?")]
      ),
    ] of AI::Provider::LanguageModel::Message
  )
)

puts result.text
```
### OpenAI

```crystal
require "ai_sdk"

# Set environment variable: OPENAI_API_KEY

# Create a model
model = AI::OpenAI.openai.chat("gpt-4o")

# Generate text
result = model.do_generate(
  AI::Provider::LanguageModel::CallOptions.new(
    prompt: [
      AI::Provider::LanguageModel::UserMessage.new(
        content: [AI::Provider::LanguageModel::TextPart.new(text: "What is Crystal programming language?")]
      ),
    ] of AI::Provider::LanguageModel::Message
  )
)

puts result.text
```
## Streaming

```crystal
# Stream text generation
stream_result = model.do_stream(options)

stream_result.stream.each do |part|
  case part
  when AI::Provider::LanguageModel::TextStartPart
    # Text generation started
  when AI::Provider::LanguageModel::TextDeltaPart
    print part.text_delta
  when AI::Provider::LanguageModel::FinishPart
    puts "\nFinished: #{part.finish_reason.unified}"
  end
end
```
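For callers that want the whole response as a single string, the delta parts can be accumulated while streaming. A minimal sketch, assuming the same stream part types as shown above:

```crystal
# Accumulate streamed deltas into the full response text.
full_text = String::Builder.new

stream_result = model.do_stream(options)
stream_result.stream.each do |part|
  case part
  when AI::Provider::LanguageModel::TextDeltaPart
    print part.text_delta         # show progress as it arrives
    full_text << part.text_delta  # and keep a copy
  end
end

puts full_text.to_s
```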
## Embeddings

```crystal
# Google embeddings
embedding_model = AI::Google.google.embedding("text-embedding-004")

result = embedding_model.do_embed(
  AI::Provider::EmbeddingModel::CallOptions.new(
    values: ["Hello, world!"]
  )
)

puts result.embeddings.first # Array of Float64

# OpenAI embeddings
embedding_model = AI::OpenAI.openai.embedding("text-embedding-3-small")
```
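Since each embedding comes back as an `Array(Float64)`, similarity between texts can be computed directly in Crystal. A sketch using cosine similarity (the `do_embed` call mirrors the example above; the math is standard):

```crystal
# Cosine similarity between two embedding vectors.
def cosine_similarity(a : Array(Float64), b : Array(Float64)) : Float64
  dot = a.zip(b).sum { |x, y| x * y }
  norm_a = Math.sqrt(a.sum { |x| x * x })
  norm_b = Math.sqrt(b.sum { |x| x * x })
  dot / (norm_a * norm_b)
end

result = embedding_model.do_embed(
  AI::Provider::EmbeddingModel::CallOptions.new(
    values: ["Crystal is fast", "Crystal is a compiled language"]
  )
)
a, b = result.embeddings
puts cosine_similarity(a, b) # closer to 1.0 means more similar
```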
## Image Generation

```crystal
# Google image generation
image_model = AI::Google.google.image("imagen-3.0-generate-002")

result = image_model.do_generate(
  AI::Provider::ImageModel::CallOptions.new(
    prompt: "A beautiful sunset over mountains",
    n: 1
  )
)

# OpenAI image generation
image_model = AI::OpenAI.openai.image("dall-e-3")
```
## Tool Calling

```crystal
# Define a function tool
weather_tool = AI::Provider::LanguageModel::FunctionTool.new(
  name: "get_weather",
  description: "Get the current weather for a location",
  input_schema: JSON::Any.new({
    "type"       => JSON::Any.new("object"),
    "properties" => JSON::Any.new({
      "location" => JSON::Any.new({
        "type"        => JSON::Any.new("string"),
        "description" => JSON::Any.new("City name"),
      }),
    }),
    # JSON::Any arrays must hold JSON::Any elements, not raw Strings
    "required" => JSON::Any.new([JSON::Any.new("location")]),
  })
)

# `messages` is an Array(AI::Provider::LanguageModel::Message),
# built as in the Quick Start examples
result = model.do_generate(
  AI::Provider::LanguageModel::CallOptions.new(
    prompt: messages,
    tools: [weather_tool] of AI::Provider::LanguageModel::FunctionTool | AI::Provider::LanguageModel::ProviderTool
  )
)
```
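When the model decides to use the tool, the call appears in the result content rather than as text. A hedged sketch of reading it back; the `ToolCallPart` name and its `tool_name`/`input` accessors are assumptions based on the Vercel provider specification, so verify them against this port's actual types:

```crystal
# Sketch: inspect tool calls returned by the model.
# ToolCallPart, tool_name, and input are assumed names; check the real API.
result.content.each do |part|
  case part
  when AI::Provider::LanguageModel::ToolCallPart
    puts "Model wants to call: #{part.tool_name}"
    args = JSON.parse(part.input) # e.g. {"location" => "Bogotá"}
    # Execute your get_weather(args["location"].as_s) here, then send the
    # result back in a tool message on the next do_generate call.
  end
end
```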
## Provider-Specific Tools

```crystal
# Google Search (Google)
google_search = AI::Google::Tools.google_search

# Web Search (OpenAI)
web_search = AI::OpenAI::Tools.web_search(
  search_context_size: "medium",
  user_location: AI::OpenAI::Tools::UserLocation.new(
    type: "approximate",
    country: "US"
  )
)

# Code Interpreter (OpenAI)
code_interpreter = AI::OpenAI::Tools.code_interpreter
```
## Configuration

### Custom API Keys

```crystal
# Google
provider = AI::Google.create_google_generative_ai(
  api_key: "your-api-key",
  base_url: "https://custom-endpoint.example.com"
)

# OpenAI
provider = AI::OpenAI.create_openai(
  api_key: "your-api-key",
  base_url: "https://custom-endpoint.example.com",
  organization: "org-xxx",
  project: "proj-xxx"
)
```
### Environment Variables

| Provider | Environment Variables |
|---|---|
| Google Generative AI | `GOOGLE_GENERATIVE_AI_API_KEY` |
| OpenAI | `OPENAI_API_KEY`, `OPENAI_BASE_URL` |
## API Reference

### Model Types

- `LanguageModel::V3` - Text generation (chat completions)
- `EmbeddingModel::V3` - Text embeddings
- `ImageModel::V3` - Image generation

### Core Types

- `CallOptions` - Request configuration
- `GenerateResult` - Generation response
- `StreamResult` - Streaming response with iterator
- `Usage` - Token usage information
## Development

```shell
# Install dependencies
shards install

# Run tests
crystal spec

# Run linter
./bin/ameba

# Format code
crystal tool format
```
## Architecture

This SDK follows the Vercel AI SDK Provider Specification v3:

- **Synchronous API** - Blocking calls (Crystal fibers for concurrency)
- **JSON::Serializable** - Type-safe JSON handling
- **Iterator for Streaming** - Crystal iterators for SSE streams
- **Crest HTTP Client** - All HTTP via the Crest library

See `context/KNOWLEDGE.md` for detailed patterns and decisions.
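As a generic illustration of the Iterator-over-SSE pattern named above (not this SDK's actual internals), a lazy iterator over the `data:` lines of an SSE body might look like:

```crystal
# Generic sketch of the Iterator-over-SSE pattern: lazily yield the
# payload of each "data:" line until the stream ends.
class SSEIterator
  include Iterator(String)

  def initialize(@io : IO)
  end

  def next
    while line = @io.gets
      if line.starts_with?("data:")
        payload = line.lchop("data:").strip
        return payload unless payload == "[DONE]"
      end
    end
    stop
  end
end

io = IO::Memory.new("data: {\"delta\":\"Hi\"}\n\ndata: [DONE]\n")
SSEIterator.new(io).each { |event| puts event }
```

Because `Iterator` is lazy, each chunk is parsed only when the consumer asks for it, which is what lets `do_stream` hand back deltas as they arrive.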
## Contributing

1. Fork it
2. Create your feature branch (`git checkout -b feature/my-feature`)
3. Run tests (`crystal spec`)
4. Run the linter (`./bin/ameba`)
5. Commit your changes (`git commit -am 'Add my feature'`)
6. Push to the branch (`git push origin feature/my-feature`)
7. Create a Pull Request
## License

MIT License - see LICENSE for details.
## Acknowledgments

- Vercel AI SDK - Original TypeScript implementation
- Crest - HTTP client for Crystal