Path B from VoiceModels.md — adds two new compose stacks alongside the
Wyoming pair so OpenWebUI/Conduit get voice without a Wyoming shim:
- compose/faster-whisper.yml — fedirz/faster-whisper-server CPU image,
large-v3-turbo by default, OpenAI /v1/audio/transcriptions on :8001.
Built-in web UI for ad-hoc transcription.
- compose/kokoro.yml — ghcr.io/remsky/kokoro-fastapi-cpu, Kokoro-82M,
OpenAI /v1/audio/speech on :8880.
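A minimal sketch of the two stacks, condensed into one file here for brevity.
Image tags, internal ports, env var names, and cache paths are assumptions to
verify against the upstream READMEs:

    services:
      faster-whisper:
        image: fedirz/faster-whisper-server:latest-cpu   # assumed tag
        environment:
          # assumed variable name; pins the default model served by the API
          - WHISPER__MODEL=Systran/faster-whisper-large-v3-turbo
        ports:
          - "8001:8000"   # host :8001 -> OpenAI-compatible /v1/audio/transcriptions
        volumes:
          - /srv/docker/faster-whisper/data:/root/.cache/huggingface   # assumed cache path
        restart: unless-stopped

      kokoro:
        image: ghcr.io/remsky/kokoro-fastapi-cpu:latest   # assumed tag
        ports:
          - "8880:8880"   # OpenAI-compatible /v1/audio/speech
        restart: unless-stopped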
Both run alongside (not instead of) Wyoming Whisper + Piper: Wyoming keeps
serving HA Assist, while the OpenAI-style endpoints serve OpenWebUI and
Conduit. The Strix Halo memory budget accommodates all of this plus
Qwen3-Coder loaded concurrently, with headroom to spare.
Homepage gets dedicated tiles for both. The README documents the OpenWebUI
Audio configuration that wires up the new endpoints; Conduit inherits voice
via OpenWebUI with no app-side setup.
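For reference, the env-var equivalent of that Admin Settings -> Audio setup
looks roughly like this; the variable names are from memory of Open WebUI's
env config and the model/voice ids are assumptions, so the README's UI
walkthrough stays the source of truth:

    services:
      open-webui:
        environment:
          - AUDIO_STT_ENGINE=openai
          - AUDIO_STT_OPENAI_API_BASE_URL=http://framework:8001/v1
          - AUDIO_STT_OPENAI_API_KEY=unused            # local endpoint, no auth assumed
          - AUDIO_STT_MODEL=Systran/faster-whisper-large-v3-turbo
          - AUDIO_TTS_ENGINE=openai
          - AUDIO_TTS_OPENAI_API_BASE_URL=http://framework:8880/v1
          - AUDIO_TTS_OPENAI_API_KEY=unused
          - AUDIO_TTS_MODEL=kokoro                     # assumed model id
          - AUDIO_TTS_VOICE=af_bella                   # assumed voice id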
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
- Move piper-compose.yaml / whisper-compose.yaml from the repo root into
  pyinfra/framework/compose/{piper,whisper}.yml; the bind paths move to
  /srv/docker/{piper,whisper}/data on the box (sketched after this list).
- deploy.py registers both stacks and provisions the data dirs.
- Homepage gets a "Voice" group with informational tiles (Wyoming has
no web UI, so tiles show container status without click-through).
- New VoiceModels.md captures the May 2026 STT/TTS landscape, why the
current Wyoming defaults aren't SOTA, and concrete upgrade paths
(whisper-large-v3-turbo + faster-whisper-server, Kokoro, Sesame CSM,
F5-TTS for cloning).
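The relocated Whisper stack, roughly (piper.yml is the same shape on :10200).
The image tag and command flags are assumptions; the bind path matches the
new layout:

    services:
      whisper:
        image: rhasspy/wyoming-whisper:latest        # assumed tag
        command: --model base-int8 --language en     # assumed current defaults
        ports:
          - "10300:10300"   # Wyoming protocol, consumed by HA Assist
        volumes:
          - /srv/docker/whisper/data:/data
        restart: unless-stopped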
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Previously bound to 127.0.0.1:3030, which was overcautious on a
Tailscale-only box where Phoenix/Beszel/OpenWebUI are all reached the same
way. Updated the homepage tile description and added a security note to the
README covering the case where the host ever leaves the tailnet.
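Assuming the binding lives in the compose ports mapping, the change amounts
to the following (the service name is a placeholder):

    services:
      example-service:        # placeholder name
        ports:
          # before: loopback only -- unreachable from other tailnet machines
          # - "127.0.0.1:3030:3030"
          # after: all interfaces; the box itself is only reachable over Tailscale
          - "3030:3030"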
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Homepage as the front door: a single page at framework:7575 with one tile per
service, live widgets where the upstream supports them (Ollama loaded models,
container state via docker.sock, etc.), and bookmarks for reference docs.
Config files are pyinfra-managed; the source of truth lives in
compose/homepage/, so sync by editing there and re-running ./run.sh.
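A condensed sketch of the two config files under compose/homepage/. The
docker server key, widget type, and field names are from memory of
gethomepage's docs and should be verified; group and tile names other than
Voice are illustrative:

    # docker.yaml -- exposes container state to the tiles
    local:
      socket: /var/run/docker.sock

    # services.yaml (excerpt)
    - Voice:
        - Wyoming Whisper:
            icon: mdi-microphone
            description: STT for HA Assist (no web UI)
            server: local
            container: whisper       # tile shows container status, no click-through
    - LLM:
        - Ollama:
            href: http://framework:11434
            widget:
              type: ollama           # assumed widget type; surfaces loaded models
              url: http://framework:11434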
OpenCode plugin now dual-exports spans to Phoenix and OpenLIT in
parallel. Phoenix remains the per-trace waterfall view; OpenLIT picks
up the same data for fleet-level metrics. Each destination has its own
batch processor so a hiccup at one doesn't block the other.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>