Deployment
Deploy your autonomous agents to production. We support Edge Runtimes (Vercel, Cloudflare) and containerized environments (Docker, K8s).
Deployment Strategies
- Edge Runtime: best for low-latency, stateless agents. Runs on Vercel or Cloudflare Workers.
- Docker Container: best for long-running tasks, cron jobs, and agents requiring heavy dependencies.
- Akios Cloud: fully managed control plane with zero DevOps. (Enterprise only)
Deploy to Vercel
Akios agents are compatible with the Vercel AI SDK and Next.js App Router.
Edge Compatibility
Ensure you use the edge runtime in your route handler configuration for maximum performance.
1. Create Route Handler
app/api/agent/route.ts
import { Agent } from '@akios/core'
import { StreamingTextResponse } from 'ai'

export const runtime = 'edge'

export async function POST(req: Request) {
  const { messages } = await req.json()

  const agent = new Agent({
    name: 'edge-assistant',
    model: 'gpt-4-turbo',
    systemPrompt: 'You are a helpful assistant.'
  })

  // Stream the response back to the client
  const stream = await agent.stream(messages.at(-1).content)
  return new StreamingTextResponse(stream)
}
2. Deploy
bash
vercel deploy
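Once the route is deployed (or running locally with next dev), it streams plain text, so any client that can read a streamed fetch response can consume it. The sketch below is illustrative, not part of the Akios API; it assumes the messages shape the handler above expects, and the function name is a placeholder:
// Send one user message and accumulate the streamed reply
async function askAgent(prompt: string): Promise<string> {
  const res = await fetch('/api/agent', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages: [{ role: 'user', content: prompt }] })
  })
  if (!res.ok || !res.body) {
    throw new Error(`Agent request failed with status ${res.status}`)
  }

  // Decode text chunks as they arrive from the edge function
  const reader = res.body.getReader()
  const decoder = new TextDecoder()
  let reply = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    reply += decoder.decode(value, { stream: true })
  }
  return reply
}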
Deploy with Docker
For background workers or custom infrastructure, use our official base image.
Dockerfile
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
# Drop dev dependencies now that the build is complete
RUN npm prune --omit=dev
# Start the agent listener
CMD ["npm", "start"]
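The CMD above assumes your package.json start script launches a long-running listener. What that process looks like depends on your agent; as one possible sketch, the entry point below (a hypothetical src/index.ts) reuses the Agent API from the Vercel example behind a plain Node HTTP server. The port variable, request body shape, and chunk decoding are assumptions, not part of the Akios API:
src/index.ts
import { createServer } from 'node:http'
import { Agent } from '@akios/core'

// Same Agent configuration as the Vercel example above
const agent = new Agent({
  name: 'docker-assistant',
  model: 'gpt-4-turbo',
  systemPrompt: 'You are a helpful assistant.'
})

const decoder = new TextDecoder()

const server = createServer(async (req, res) => {
  if (req.method !== 'POST') {
    res.writeHead(405).end()
    return
  }

  // Read the JSON body, e.g. { "prompt": "..." } (hypothetical shape)
  let body = ''
  for await (const chunk of req) body += chunk
  const { prompt } = JSON.parse(body)

  // Forward the agent's streamed reply chunk by chunk
  res.writeHead(200, { 'Content-Type': 'text/plain; charset=utf-8' })
  const stream = await agent.stream(prompt)
  for await (const chunk of stream) {
    res.write(typeof chunk === 'string' ? chunk : decoder.decode(chunk, { stream: true }))
  }
  res.end()
})

server.listen(Number(process.env.PORT ?? 3000))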
Environment Variables
Ensure the following variables are set in your container runtime:
- OPENAI_API_KEY
- AKIOS_LICENSE_KEY (if using Enterprise features)
- REDIS_URL (for memory persistence)
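For example, when running the image directly with Docker (the image tag and Redis URL below are placeholders; in production these values usually come from your orchestrator's secret store):
bash
docker build -t akios-agent .
docker run -d \
  -e OPENAI_API_KEY=$OPENAI_API_KEY \
  -e AKIOS_LICENSE_KEY=$AKIOS_LICENSE_KEY \
  -e REDIS_URL=redis://redis:6379 \
  akios-agent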