Streams the Ollama response token-by-token and stores the full LLM
prompt in the jobs table. A Debug button on each analysis job opens
a modal with two tabs: the sent prompt and the live-updating answer
(polls /jobs/{id}/debug-data every 500 ms while streaming; a badge
shows "● live"). The response is persisted to llm_response after the
job completes, so it remains viewable afterwards.
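A minimal sketch of the token accumulation this describes, assuming Ollama's NDJSON stream format from /api/generate (one JSON object per line, with a "response" fragment and a "done" flag); the helper name and the simulated chunks are illustrative, not taken from the actual implementation:

```python
import json

def accumulate_stream(lines):
    """Concatenate the 'response' fragments of Ollama-style NDJSON chunks.

    Each line is one JSON object; the final chunk carries "done": true.
    The joined string is what would be written to llm_response at the end.
    """
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Simulated stream chunks in Ollama's /api/generate shape
sample = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo", "done": true}',
]
print(accumulate_stream(sample))  # Hello
```

While streaming, the partial string would be what /jobs/{id}/debug-data returns on each 500 ms poll.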
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>