# Configuration

The SDK uses a cascading config system (highest priority first):

  1. Environment variables: `AGENT_SDK_MODEL_FAST`, `AGENT_SDK_DEFAULT_PROVIDER`, etc. (see the sketch after this list)
  2. Config file: `agent-sdk.config.yaml`, `.agent-sdk.json`, etc.
  3. Programmatic: `configure()` at runtime
  4. Defaults: built-in fallbacks
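
For example, an `AGENT_SDK_MODEL_FAST` environment variable beats whatever the config file assigns to the `fast` tier, because environment variables sit at the top of the cascade. A minimal sketch (exactly when the SDK reads the environment is an assumption):

```ts
import { loadConfig, resolveModel } from '@agntk/core';

// Highest priority: environment variable (normally exported in the shell)
process.env.AGENT_SDK_MODEL_FAST = 'x-ai/grok-4.1-fast';

// Lower priority: whatever the config file maps the "fast" tier to
loadConfig('./agent-sdk.config.json');

// Resolves to the environment value, since it outranks the file
const fastModel = resolveModel({ tier: 'fast' });
```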

Create `agent-sdk.config.yaml` (or `.json`) in your project root:

```json
{
  "models": {
    "defaultProvider": "openrouter",
    "tiers": {
      "fast": "x-ai/grok-4.1-fast",
      "standard": "google/gemini-3-flash-preview",
      "reasoning": "deepseek/deepseek-r1",
      "powerful": "anthropic/claude-sonnet-4"
    }
  },
  "tools": {
    "shell": {
      "timeout": 30000
    },
    "glob": {
      "maxFiles": 100
    }
  },
  "maxSteps": 25
}
```

Load and override the configuration programmatically:

```ts
import { loadConfig, configure, getConfig, resolveModel } from '@agntk/core';

// Load from file
const config = loadConfig('./agent-sdk.config.json');

// Override programmatically
configure({ models: { defaultProvider: 'openrouter' } });

// Get current config
const current = getConfig();

// Resolve models by tier
const model = resolveModel({ tier: 'powerful' });
const fastModel = resolveModel({
  tier: 'fast',
  provider: 'openrouter'
});
```

All providers use `@ai-sdk/openai-compatible` for unified access:

| Provider | Default | Description |
| --- | --- | --- |
| `openrouter` | ✓ | Routes to any model: Anthropic, Google, Meta, etc. |
| `openai` | | Direct OpenAI API |
| `ollama` | | Local models via Ollama |
| Custom | | Any OpenAI-compatible API via `customProviders` |
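
Because every provider goes through the same OpenAI-compatible layer, switching is just a matter of passing `provider` to `resolveModel`. A minimal sketch, assuming the provider names above are accepted as `provider` values and the matching API key or `OLLAMA_ENABLED` flag is set:

```ts
import { resolveModel } from '@agntk/core';

// Route tiers through specific providers instead of the default (openrouter)
const localModel = resolveModel({ tier: 'fast', provider: 'ollama' });
const openaiModel = resolveModel({ tier: 'standard', provider: 'openai' });
```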

Set your API key:

```bash
# Primary (recommended)
export OPENROUTER_API_KEY=sk-or-...

# Or use OpenAI directly
export OPENAI_API_KEY=sk-...

# For local models
export OLLAMA_ENABLED=true
```

Model tiers and their OpenRouter defaults:

| Tier | Purpose | OpenRouter Default |
| --- | --- | --- |
| `fast` | Quick responses, low cost | `x-ai/grok-4.1-fast` |
| `standard` | Balanced quality/cost | `google/gemini-3-flash-preview` |
| `reasoning` | Complex logic, planning | `deepseek/deepseek-r1` |
| `powerful` | Best quality, highest cost | `z-ai/glm-4.7` |

Override per-tier models via environment variables: `AGENT_SDK_MODEL_FAST`, `AGENT_SDK_MODEL_STANDARD`, `AGENT_SDK_MODEL_REASONING`, `AGENT_SDK_MODEL_POWERFUL`.
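
For example, to swap the `powerful` tier's default (`z-ai/glm-4.7`) for the Claude model used in the config example above:

```bash
# Pin the powerful tier to a specific model for this shell session
export AGENT_SDK_MODEL_POWERFUL=anthropic/claude-sonnet-4
```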

Configure tool behavior:

```json
{
  "tools": {
    "shell": {
      "timeout": 30000,
      "maxOutput": 10000
    },
    "glob": {
      "maxFiles": 100
    },
    "browser": {
      "headless": true
    }
  }
}
```
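
The same keys can also be set at runtime. `configure()` is shown earlier only with `models`, so treating it as accepting any slice of the config, including `tools`, is an assumption:

```ts
import { configure } from '@agntk/core';

// Assumption: configure() merges partial tool settings just like model settings
configure({
  tools: {
    shell: { timeout: 60000 },    // allow longer-running commands
    browser: { headless: false }  // watch the browser while debugging
  }
});
```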

Add any OpenAI-compatible API as a custom provider:

```json
{
  "models": {
    "customProviders": {
      "together": {
        "baseURL": "https://api.together.xyz/v1",
        "apiKeyEnv": "TOGETHER_API_KEY"
      }
    }
  }
}
```
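
To route requests through the custom provider, export the key named in `apiKeyEnv` and select the provider. Treating a custom provider name as valid for `defaultProvider` and for `resolveModel`'s `provider` option is an assumption here:

```ts
import { configure, resolveModel } from '@agntk/core';

// TOGETHER_API_KEY must be set in the environment (see "apiKeyEnv" above)
configure({ models: { defaultProvider: 'together' } });

// Assumption: custom providers can be selected per call like built-in ones
const model = resolveModel({ tier: 'standard', provider: 'together' });
```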