Docker

Run Akshi in a container with Docker Compose.

Dockerfile

FROM rust:1.83-slim AS builder
WORKDIR /build
COPY . .
RUN cargo build --release --bin akshi

FROM debian:bookworm-slim
RUN apt-get update && apt-get install -y ca-certificates && rm -rf /var/lib/apt/lists/*
COPY --from=builder /build/target/release/akshi /usr/local/bin/akshi
RUN useradd --system akshi \
    && mkdir -p /opt/akshi \
    && chown akshi /opt/akshi
USER akshi
WORKDIR /opt/akshi
ENTRYPOINT ["akshi", "run"]
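Copying the whole repository into the build stage also sends local build output and data directories to the Docker daemon. A `.dockerignore` next to the Dockerfile keeps the build context small (a minimal sketch; adjust to your repository layout):

```
# .dockerignore — excluded from the build context sent to the daemon
target/
akshi-data/
.git/
```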

docker-compose.yml

services:
  akshi:
    build: .
    ports:
      - "3210:3210"
    volumes:
      - ./runtime.toml:/opt/akshi/runtime.toml:ro
      - ./agents:/opt/akshi/agents:ro
      - akshi-data:/opt/akshi/akshi-data
    environment:
      - AKSHI_TOKEN=${AKSHI_TOKEN}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
    restart: unless-stopped

  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama

volumes:
  akshi-data:
  ollama-data:
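Compose substitutes `${AKSHI_TOKEN}` and `${ANTHROPIC_API_KEY}` from the shell environment, or from a `.env` file placed next to `docker-compose.yml`. For example (values are placeholders):

```
# .env — keep out of version control
AKSHI_TOKEN=change-me
ANTHROPIC_API_KEY=sk-ant-...
```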

Run

# Start services
docker compose up -d

# Pull a model into Ollama
docker compose exec ollama ollama pull llama3.2

# Check status
curl http://127.0.0.1:3210/api/v1/health

Configuration notes

  • Mount runtime.toml read-only (:ro).
  • Use a named volume for akshi-data to persist journals and databases.
  • Set dashboard.bind_address = "0.0.0.0" in runtime.toml so the container listens on all interfaces.
  • Reference the Ollama container by service name in runtime.toml:
[[router.routes]]
name = "local"
provider = "ollama"
base_url = "http://ollama:11434"
model = "llama3.2"
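Since the compose file passes ANTHROPIC_API_KEY into the container, a second route can target Anthropic alongside the local one. A sketch only: the `anthropic` provider name and the model id are assumptions, not confirmed by this page — check the runtime.toml reference for the exact keys your version accepts.

```
[[router.routes]]
name = "claude"
provider = "anthropic"       # assumed provider name
model = "claude-3-5-haiku-latest"  # illustrative model id
```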

Updating agents

# Copy new WASM binary
cp new_agent.wasm agents/

# Reload without restart
docker compose exec akshi akshi reload