Handle the separate "thinking" field in Ollama streaming chunks

Ollama streaming chunks for thinking models carry reasoning text in a separate "thinking" field. Previously only "response" was captured, which left the debug window empty while the model reasoned. Both fields are now tracked independently: the thinking text is shown in blue above the final answer, and both are persisted, thinking to the new llm_thinking column and the answer to the existing llm_response column.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
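The per-chunk handling can be sketched roughly as follows; this is a minimal illustration of tracking the two fields independently, not the actual implementation, and `accumulate_stream` is a hypothetical helper name:

```python
import json

def accumulate_stream(ndjson_lines):
    """Accumulate Ollama NDJSON stream chunks into (thinking, response) buffers.

    Thinking models emit reasoning in a separate "thinking" field, while the
    final answer arrives in "response"; a chunk may carry either, both empty,
    or neither (e.g. the terminal "done" chunk).
    """
    thinking_parts, response_parts = [], []
    for line in ndjson_lines:
        chunk = json.loads(line)
        if chunk.get("thinking"):
            thinking_parts.append(chunk["thinking"])
        if chunk.get("response"):
            response_parts.append(chunk["response"])
    # The two joined strings map onto llm_thinking / llm_response respectively.
    return "".join(thinking_parts), "".join(response_parts)
```

For example, a stream of `{"thinking":"Hmm, "}`, `{"thinking":"so..."}`, `{"response":"42"}`, `{"done":true}` yields `("Hmm, so...", "42")`.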