localgenai/pyinfra/framework/compose/openlit.yml
noisedestroyers 2c4bfefa95 Initial commit: localgenai stack
Containerized local LLM stack for the Framework Desktop / Strix Halo,
plus the OpenCode harness on the Mac side.

- pyinfra/framework/: pyinfra deploy targeting the box
  - llama.cpp (Vulkan), vLLM (ROCm), Ollama (ROCm with HSA override
    for gfx1151), OpenWebUI
  - Beszel (host + container + AMD GPU dashboard via sysfs)
  - OpenLIT (LLM fleet metrics)
  - Phoenix (per-trace agent waterfall)
  - OpenHands (autonomous agent in a Docker sandbox)
- opencode/: OpenCode config + Phoenix bridge plugin (OTel exporter)
  - install.sh deploys to ~/.config/opencode/
- StrixHaloSetup.md / StrixHaloMemory.md / Roadmap.md / TODO.md:
  documentation and planning
- testing/qwen3-coder-30b/: small evaluation harness

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
2026-05-08 11:35:10 -04:00


# OpenLIT — LLM observability (traces, costs, KV-cache, prompt/decode
# latencies, tokens/sec). https://openlit.io
#
# Two services:
# - clickhouse : columnar store for traces (internal only, no host port)
# - openlit : Next.js UI on :3001 (3000 is OpenWebUI)
#
# Why OpenLIT vs Langfuse/Phoenix/Laminar: it's the only OSS dashboard
# (May 2026) that auto-instruments Ollama AND vLLM via OpenTelemetry
# without adding code to client apps. For llama.cpp, start the server
# with --metrics (see ../llama/docker-compose.yml) and OpenLIT can scrape
# /metrics.
#
# To send traces from a Python script calling Ollama/vLLM:
# pip install openlit
# python -c "import openlit; openlit.init(otlp_endpoint='http://framework:4328')"
# (Note 4328, not the canonical 4318: OpenLIT's OTLP receivers are
# remapped on this host; see the ports section below.)
#
# To wire OpenWebUI → OpenLIT, install OpenLIT's pipeline middleware
# in OpenWebUI per https://openlit.io/blogs/openlit-openwebui.
services:
  clickhouse:
    image: clickhouse/clickhouse-server:25.3-alpine
    container_name: openlit-clickhouse
    restart: unless-stopped
    environment:
      CLICKHOUSE_USER: default
      CLICKHOUSE_PASSWORD: OPENLIT
      CLICKHOUSE_DB: openlit
    volumes:
      - /srv/docker/openlit/clickhouse:/var/lib/clickhouse
    ulimits:
      nofile:
        soft: 262144
        hard: 262144

  openlit:
    image: ghcr.io/openlit/openlit:latest
    container_name: openlit
    restart: unless-stopped
    depends_on:
      - clickhouse
    ports:
      # Host:container — UI on 3001 (OpenWebUI owns 3000).
      - "3001:3000"
      # OTLP receivers exposed on the host so SDKs running off-box can
      # ship traces here. gRPC + HTTP. Remapped (4327/4328 → 4317/4318)
      # because Phoenix owns the canonical 4317/4318 ports for OpenCode
      # traces — OpenLIT here is a secondary/fleet-metrics destination.
      - "4327:4317"
      - "4328:4318"
    environment:
      INIT_DB_HOST: clickhouse
      INIT_DB_PORT: "8123"
      INIT_DB_USERNAME: default
      INIT_DB_PASSWORD: OPENLIT
      INIT_DB_DATABASE: openlit
      SQLITE_DATABASE_URL: file:/app/client/data/data.db
    volumes:
      - /srv/docker/openlit/data:/app/client/data
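The header comments describe off-box SDKs shipping traces to the remapped OTLP/HTTP port. A minimal sketch of that client-side wiring, assuming the box resolves as `framework` (the hostname used in the comments); the env-var names are the standard OpenTelemetry exporter variables, not anything OpenLIT-specific:

```python
import os

# OpenLIT's OTLP/HTTP receiver as published by this compose file:
# container port 4318 is remapped to host port 4328, because Phoenix
# owns the canonical 4318 on this host.
OPENLIT_OTLP_HTTP = "http://framework:4328"  # "framework" = the Strix Halo box

# Any OpenTelemetry SDK picks up these standard env vars automatically,
# so already-instrumented clients need no code changes to target OpenLIT.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = OPENLIT_OTLP_HTTP
os.environ["OTEL_EXPORTER_OTLP_PROTOCOL"] = "http/protobuf"

# With the SDK installed (`pip install openlit`), explicit init looks like:
#   import openlit
#   openlit.init(otlp_endpoint=OPENLIT_OTLP_HTTP)
```

Pointing the env vars at 4327 instead (and protocol `grpc`) would select the gRPC receiver; either way the traces land in the same ClickHouse-backed store.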