# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Commands
**Build and run (Docker):**

```sh
cp .env.example .env   # first time only
docker compose up -d --build
```

The UI is then available at http://127.0.0.1:8094/.
**Run locally without Docker:**

```sh
pip install -r requirements.txt
API_BASE=http://... OLLAMA_BASE_URL=http://... DB_PATH=./ui.db uvicorn app:app --host 0.0.0.0 --port 8094 --reload
```
**Publish container:**

```sh
docker build -t registry.aquantico.lan/claw/diarization-ui:latest .
docker push registry.aquantico.lan/claw/diarization-ui:latest
```
There are no tests and no linter configured.
## Architecture
The entire application lives in a single file: `app.py` (~1550 lines). There are no templates, no separate frontend build, and no additional modules.
### Request handling
All routes are FastAPI endpoints. HTML is returned as server-rendered Python f-strings using Bootstrap 5.3 (CDN) for layout. Inline `<script>` blocks handle client-side interactivity. The `layout()` function wraps every page with the sidebar navigation, topbar, shared CSS, and the global modal/JS helpers (`window.showModal`, `window.uiPrompt`, `window.uiConfirm`, `window.uiSelect`).
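A minimal sketch of this f-string pattern (the real `layout()` signature, markup, and Bootstrap URL in `app.py` may differ; in the app the string is returned from a FastAPI route via `HTMLResponse`):

```python
# Hypothetical sketch of the server-rendered f-string pattern used throughout
# app.py; the actual layout() arguments and markup are assumptions.
BOOTSTRAP_CSS = "https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css"

def layout(title: str, body: str) -> str:
    """Wrap page content with the shared chrome: sidebar, topbar, CSS, JS helpers."""
    return f"""<!doctype html>
<html>
<head>
  <meta charset="utf-8">
  <title>{title}</title>
  <link rel="stylesheet" href="{BOOTSTRAP_CSS}">
</head>
<body>
  <nav class="sidebar"><!-- shared navigation --></nav>
  <main class="container">{body}</main>
  <script>/* window.showModal, window.uiPrompt, window.uiConfirm, window.uiSelect */</script>
</body>
</html>"""
```

Because every page goes through one wrapper function, global chrome changes are a single-site edit.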
### Data model (SQLite)
The DB is a single SQLite file at `DB_PATH` (default `/data/ui.db`). Four tables:
- `projects` — named groupings for documents
- `prompts` — reusable LLM prompt templates
- `documents` — stored content; `kind` is either `transcript` (raw transcription output) or `analysis` (LLM result). An analysis document links back to its source via `source_document_id` and `prompt_id`.
- `jobs` — background task queue; `kind` is `upload` or `analysis`; `status` cycles through `queued → running → done|error|cancelled`.
Schema migrations are handled inline in `init_db()` with `ALTER TABLE … ADD COLUMN` wrapped in `try/except`.
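The inline-migration pattern can be sketched as follows (table and column names here are illustrative, not the actual schema):

```python
# Hypothetical sketch of the inline-migration pattern; the real init_db()
# creates all four tables and applies more migrations.
import sqlite3

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS documents (
        id INTEGER PRIMARY KEY,
        content TEXT
    )""")
    # Additive migration: adding a column that already exists raises
    # OperationalError, which is swallowed so init_db() stays idempotent
    # across restarts against an existing database file.
    try:
        conn.execute("ALTER TABLE documents ADD COLUMN kind TEXT")
    except sqlite3.OperationalError:
        pass
    conn.commit()
    return conn
```

This trades a formal migration tool for zero dependencies: each schema change is one `ALTER TABLE` that is a no-op on databases that already have it.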
### Background job processing
Jobs are processed by a `ThreadPoolExecutor(max_workers=2)`. `enqueue_job()` inserts a row and submits the worker function. Workers check for `cancelled` status at each checkpoint before making external calls.
- upload job (`_process_upload_job`): POSTs the audio file to `{API_BASE}/transcribe-diarize` and stores `formatted_text` from the response as a `transcript` document.
- analysis job (`_process_analysis_job`): POSTs to Ollama's `/api/generate` with the transcript content and the selected prompt, and stores the response as an `analysis` document.
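The analysis call can be sketched as below. Ollama's non-streaming `/api/generate` returns one JSON object whose `response` field holds the generated text; the prompt layout and helper names here are assumptions, not the actual code in `app.py`:

```python
# Hypothetical sketch of the analysis job's Ollama call.
import json
import os
import urllib.request

def build_payload(transcript: str, prompt: str, model: str = "qwen3.5:9b") -> dict:
    # How prompt and transcript are combined is an assumption.
    return {
        "model": model,
        "prompt": f"{prompt}\n\n{transcript}",
        "stream": False,  # ask for a single JSON object instead of streamed chunks
    }

def run_analysis(transcript: str, prompt: str) -> str:
    base = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
    model = os.environ.get("OLLAMA_MODEL", "qwen3.5:9b")
    data = json.dumps(build_payload(transcript, prompt, model)).encode()
    req = urllib.request.Request(
        f"{base}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```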
The jobs page auto-polls `/jobs/data` every 3 s when active jobs exist, and every 10 s otherwise.
### External dependencies
- `API_BASE` — the transcribe-diarize service (audio → transcript JSON with `formatted_text`)
- `OLLAMA_BASE_URL` / `OLLAMA_MODEL` — Ollama instance for LLM analysis (default model `qwen3.5:9b`)
### PWA
The app serves `/manifest.webmanifest`, `/icon.svg`, and `/sw.js` directly from route handlers so the app can be installed as a PWA on mobile.
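Since a web app manifest is plain JSON, it can be built inline and returned from a route handler; the name and icon entry below are assumptions about this app's actual manifest:

```python
# Hypothetical sketch of the manifest served at /manifest.webmanifest.
import json

MANIFEST = {
    "name": "Diarization UI",  # assumed display name
    "start_url": "/",
    "display": "standalone",
    "icons": [{"src": "/icon.svg", "sizes": "any", "type": "image/svg+xml"}],
}

def manifest_body() -> str:
    """Serialize the manifest; in app.py the route would return this with
    media_type="application/manifest+json"."""
    return json.dumps(MANIFEST)
```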