# Model Providers

Configure Akios to work with OpenAI, Anthropic, Mistral, Azure, or local models via Ollama.
## The Universal Model Interface
Switching models is a one-line change. Akios abstracts the API differences (input format, token counting, tool calling) so you can focus on the agent logic.
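To make the idea concrete, here is a minimal sketch of what "one interface, many providers" looks like. The `ModelProvider` interface and the stub providers below are hypothetical illustrations, not the real `@akios/sdk` types:

```typescript
// Hypothetical sketch: every provider implements the same interface,
// so agent code never changes when you swap models.
interface ModelProvider {
  name: string
  complete(prompt: string): Promise<string>
}

// Stub providers standing in for real API clients.
const openai: ModelProvider = {
  name: 'openai',
  complete: async (prompt) => `[openai] ${prompt}`,
}

const ollama: ModelProvider = {
  name: 'ollama',
  complete: async (prompt) => `[ollama] ${prompt}`,
}

// Agent logic depends only on the interface...
async function runAgent(model: ModelProvider, task: string): Promise<string> {
  return model.complete(task)
}

// ...so switching models is literally the one line where the provider is passed in:
runAgent(openai, 'summarize this').then(console.log)
runAgent(ollama, 'summarize this').then(console.log)
```

Because the agent only sees the interface, provider-specific details (auth, request shape, token accounting) stay inside each provider implementation.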
## Supported Providers
### OpenAI

The default provider. Best for general-purpose reasoning and complex tool use.
**openai.ts**

```ts
import { OpenAIProvider } from '@akios/sdk'

const model = new OpenAIProvider({
  apiKey: process.env.OPENAI_API_KEY,
  model: 'gpt-4o' // or 'gpt-3.5-turbo'
})
```

### Anthropic (Claude)
Excellent for writing and coding, with a large 200k-token context window.
**anthropic.ts**

```ts
import { AnthropicProvider } from '@akios/sdk'

const model = new AnthropicProvider({
  apiKey: process.env.ANTHROPIC_API_KEY,
  model: 'claude-3-opus-20240229'
})
```

### Local Models (Ollama)
Run Llama 3 or Mistral locally for free; no data ever leaves your machine.
> **Tool Use Limitations:** Smaller local models (around 7B parameters) often struggle to emit reliable JSON for tool calls. Use robust system prompts and validate the model's output before executing tools.
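One common mitigation is to parse and validate the model's raw output before dispatching a tool, retrying with a stricter prompt when validation fails. The helper below is a hypothetical sketch, not part of `@akios/sdk`:

```typescript
// Hypothetical helper: try to extract a valid tool call from raw model
// output, tolerating markdown fences and stray commentary that small
// models often add around the JSON.
interface ToolCall {
  tool: string
  args: Record<string, unknown>
}

function parseToolCall(raw: string): ToolCall | null {
  // Strip ```json fences that small models like to wrap output in.
  const cleaned = raw.replace(/```(?:json)?/g, '').trim()
  // Grab the first {...} span in case the model added prose around it.
  const match = cleaned.match(/\{[\s\S]*\}/)
  if (!match) return null
  try {
    const parsed = JSON.parse(match[0])
    if (
      typeof parsed.tool === 'string' &&
      typeof parsed.args === 'object' &&
      parsed.args !== null
    ) {
      return { tool: parsed.tool, args: parsed.args }
    }
  } catch {
    // Invalid JSON: fall through so the caller can retry.
  }
  return null
}
```

A `null` return is the caller's signal to re-prompt the model rather than crash the agent loop.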
**ollama.ts**

```ts
import { OllamaProvider } from '@akios/sdk'

const model = new OllamaProvider({
  baseUrl: 'http://localhost:11434',
  model: 'llama3'
})
```

### Azure OpenAI
For enterprise deployments that require an SLA and compliance guarantees.
**azure.ts**

```ts
import { AzureOpenAIProvider } from '@akios/sdk'

const model = new AzureOpenAIProvider({
  endpoint: process.env.AZURE_ENDPOINT,
  apiKey: process.env.AZURE_API_KEY,
  deploymentName: 'my-gpt4-deployment'
})
```
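Since each provider above is configured from environment variables anyway, deployments often select the whole provider from a single variable so environments can switch models without a code change. The factory below is a hypothetical pattern (the `PROVIDER` and `OLLAMA_URL` variable names are illustrative, not an Akios convention):

```typescript
// Hypothetical config factory: pick provider settings from a single
// PROVIDER environment variable. Unknown or missing values fall back
// to the OpenAI default.
type ProviderConfig =
  | { kind: 'openai'; model: string }
  | { kind: 'anthropic'; model: string }
  | { kind: 'ollama'; baseUrl: string; model: string }

function configFromEnv(env: Record<string, string | undefined>): ProviderConfig {
  switch (env.PROVIDER ?? 'openai') {
    case 'anthropic':
      return { kind: 'anthropic', model: 'claude-3-opus-20240229' }
    case 'ollama':
      return {
        kind: 'ollama',
        baseUrl: env.OLLAMA_URL ?? 'http://localhost:11434',
        model: 'llama3',
      }
    default:
      return { kind: 'openai', model: 'gpt-4o' }
  }
}
```

The returned config maps directly onto the constructor options shown in the provider examples above.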