Add initial implementation of noThinkProxy with Docker support and README documentation

2026-05-10 11:11:59 +02:00
parent deb0d5de9d
commit af75920edc
5 changed files with 286 additions and 62 deletions

2
.gitignore vendored

@@ -1 +1,3 @@
node_modules/
.env
*.v[0-9]*

14
Dockerfile Normal file

@@ -0,0 +1,14 @@
FROM node:22-alpine
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci --omit=dev
COPY app.js ./
ENV PORT=11435
EXPOSE 11435
CMD ["node", "app.js"]

96
README.md Normal file

@@ -0,0 +1,96 @@
# noThinkProxy
Anthropic API → Ollama proxy.
Lets Claude SDK clients (e.g. **Claude Code**) work against a local Ollama server, with think mode disabled and a freely configurable model.
## What the proxy does
- Accepts requests in the **Anthropic format** (`POST /v1/messages`)
- Converts them into the **Ollama format** (`POST /api/chat`) — see the sketch after this list
- Returns the response as an **Anthropic SSE stream**
- Disables think mode (`think: false`)
- Replaces every `claude-*` model name with the configured model
- Converts tool calls and tool results correctly into the Ollama format
- Deduplicates duplicate tool calls
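A minimal sketch of the request translation, with illustrative shapes only (the `claude-sonnet-4` name and the field values are examples, not taken from the code):
```js
// Illustrative only: an incoming Anthropic-style request body...
const anthropicRequest = {
  model: 'claude-sonnet-4',                       // any claude-* name is accepted
  max_tokens: 1024,
  stream: true,
  messages: [{ role: 'user', content: 'Hello' }]
};
// ...is forwarded to Ollama roughly as:
const ollamaRequest = {
  model: process.env.OLLAMA_MODEL || 'qwen3.6:35b-a3b-q4_K_M', // model substitution
  think: false,                                   // think mode disabled
  stream: true,
  messages: [{ role: 'user', content: 'Hello' }]
};
```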
## Quick start (Docker)
### 1. Build the image and push it to the registry
```bash
docker build -t registry.aquantico.lan/nothinkproxy:latest .
docker push registry.aquantico.lan/nothinkproxy:latest
```
### 2. Deploy on the server
```bash
# copy docker-compose.yml to the server
scp docker-compose.yml server:/opt/nothinkproxy/
# start it on the server
ssh server
cd /opt/nothinkproxy
docker compose up -d
```
The proxy is then reachable at `https://claude-proxy.aquantico.lan`.
## Configuration (environment variables)
| Variable | Default | Description |
|---------------|-----------------------------------------------|-------------------------------------|
| `PORT` | `11435` | Port the proxy listens on |
| `OLLAMA_URL` | `https://ollama.aquantico.de/api/chat` | Full Ollama chat API URL |
| `OLLAMA_MODEL`| `qwen3.6:35b-a3b-q4_K_M` | Model Ollama should use |
| `OLLAMA_AUTH` | *(Bearer token)* | Auth token for the Ollama server |
Adjust these in `docker-compose.yml` under `environment:`.
## Installing Claude Code
On the target machine (one-time installer):
```bash
curl -fsSL https://claude-proxy.aquantico.lan/install.sh | bash
```
The installer checks for Node.js, installs it if missing, and installs Claude Code via npm.
## Starting Claude Code with the proxy
```bash
ANTHROPIC_BASE_URL=https://claude-proxy.aquantico.lan claude
```
As a permanent alias in `~/.bashrc` or `~/.zshrc`:
```bash
alias claude-local='ANTHROPIC_BASE_URL=https://claude-proxy.aquantico.lan claude'
```
## Running locally (without Docker)
```bash
npm install
node app.js
```
With a different model / server:
```bash
OLLAMA_URL=http://localhost:11434/api/chat \
OLLAMA_MODEL=llama3.2 \
OLLAMA_AUTH=mytoken \
node app.js
```
Then start Claude Code:
```bash
ANTHROPIC_BASE_URL=http://localhost:11435 claude
```
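To verify the proxy without Claude Code, a quick smoke test can POST directly to the messages endpoint and print the raw SSE events. This is a sketch; the file name `smoke-test.mjs` and the prompt are arbitrary, and it assumes the proxy is running locally on port 11435:
```js
// smoke-test.mjs - hypothetical check against a locally running proxy (run with: node smoke-test.mjs)
const res = await fetch('http://localhost:11435/v1/messages', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'claude-sonnet-4',        // replaced with OLLAMA_MODEL by the proxy
    max_tokens: 256,
    stream: true,
    messages: [{ role: 'user', content: 'Say hello' }]
  })
});
// the proxy answers with an Anthropic-style SSE stream (message_start, content_block_delta, ...)
for await (const chunk of res.body) {
  process.stdout.write(Buffer.from(chunk).toString());
}
```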
## Info page
Opening `https://claude-proxy.aquantico.lan/` (no path) shows an HTML page with the current configuration and installation instructions.

213
app.js

@@ -1,20 +1,138 @@
// app.js - complete, with tool-call deduplication
const express = require('express');
const app = express();
const OLLAMA_URL = process.env.OLLAMA_URL || 'https://ollama.aquantico.de/api/chat';
const OLLAMA_MODEL = process.env.OLLAMA_MODEL || 'qwen3.6:35b-a3b-q4_K_M';
const OLLAMA_AUTH = process.env.OLLAMA_AUTH || '324GF44-50AA-4B57-9386-K435DLJ764DFR';
const PORT = parseInt(process.env.PORT || '11435', 10);
const colors = {
reset: '\x1b[0m',
cyan: '\x1b[36m',
green: '\x1b[32m',
magenta: '\x1b[35m',
yellow: '\x1b[33m',
blue: '\x1b[34m',
red: '\x1b[31m'
};
app.set('trust proxy', 1);
app.use(express.json({ limit: '50mb' }));
// ── Info page ────────────────────────────────────────────────────────────────
app.get('/', (req, res) => {
const host = `${req.protocol}://${req.get('host')}`;
res.setHeader('Content-Type', 'text/html; charset=utf-8');
res.send(`<!DOCTYPE html>
<html lang="de">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>noThinkProxy</title>
<style>
*{box-sizing:border-box;margin:0;padding:0}
body{font-family:system-ui,sans-serif;background:#0f0f0f;color:#e0e0e0;padding:2rem}
h1{color:#a78bfa;font-size:1.8rem;margin-bottom:.4rem}
h2{color:#7dd3fc;font-size:1.1rem;margin:2rem 0 .6rem}
p{color:#9ca3af;line-height:1.6;margin-bottom:.8rem}
code{background:#1e1e2e;color:#cba6f7;padding:.15rem .4rem;border-radius:.25rem;font-size:.9rem}
pre{background:#1e1e2e;border:1px solid #333;border-radius:.5rem;padding:1rem;overflow-x:auto;margin:.5rem 0 1rem}
pre code{background:none;padding:0;color:#a6e3a1}
.badge{display:inline-block;background:#1e3a5f;color:#7dd3fc;border-radius:.25rem;padding:.1rem .5rem;font-size:.8rem;margin-left:.5rem}
table{width:100%;border-collapse:collapse;margin:.5rem 0 1rem}
td,th{border:1px solid #333;padding:.4rem .8rem;text-align:left}
th{background:#1e1e2e;color:#7dd3fc}
.copy-btn{cursor:pointer;background:#2d2d4e;border:1px solid #555;color:#cba6f7;padding:.2rem .6rem;border-radius:.25rem;font-size:.75rem;margin-left:.5rem}
</style>
</head>
<body>
<h1>noThinkProxy <span class="badge">v1.0</span></h1>
<p>Anthropic API → Ollama proxy · think mode disabled · model substitution active</p>
<h2>Current configuration</h2>
<table>
<tr><th>Parameter</th><th>Value</th></tr>
<tr><td>Ollama URL</td><td><code>${OLLAMA_URL.replace(/\/api\/chat$/, '')}</code></td></tr>
<tr><td>Model</td><td><code>${OLLAMA_MODEL}</code></td></tr>
<tr><td>Context</td><td><code>262144 tokens (256k)</code></td></tr>
<tr><td>Think</td><td><code>false</code></td></tr>
<tr><td>Proxy-URL</td><td><code>${host}</code></td></tr>
</table>
<h2>Install localclaude</h2>
<p>Installs the <code>localclaude</code> script into <code>/usr/local/bin</code> (or <code>~/.local/bin</code>):</p>
<pre><code>curl -fsSL ${host}/install.sh | bash</code></pre>
<h2>Start</h2>
<pre><code>localclaude</code></pre>
<p><code>localclaude</code> automatically sets <code>ANTHROPIC_BASE_URL=${host}</code> and then runs <code>claude</code>.</p>
<h2>API endpoint</h2>
<pre><code>POST ${host}/v1/messages</code></pre>
<p>Compatible with the Anthropic SDK. All <code>claude-*</code> model names are automatically redirected to <code>${OLLAMA_MODEL}</code>.</p>
</body>
</html>`);
});
// ── Install script ───────────────────────────────────────────────────────────
app.get('/install.sh', (req, res) => {
const host = `${req.protocol}://${req.get('host')}`;
res.setHeader('Content-Type', 'text/plain; charset=utf-8');
res.send(`#!/usr/bin/env bash
set -euo pipefail
PROXY_URL="${host}"
INSTALL_DIR="/usr/local/bin"
NEEDS_PATH_UPDATE=false
echo ""
echo "=== noThinkProxy · localclaude Installer ==="
echo ""
# Determine the target directory (no sudo → ~/.local/bin)
if [ ! -w "\$INSTALL_DIR" ]; then
INSTALL_DIR="\$HOME/.local/bin"
mkdir -p "\$INSTALL_DIR"
fi
# Write the localclaude script
cat > "\$INSTALL_DIR/localclaude" <<'SCRIPT'
#!/usr/bin/env bash
export ANTHROPIC_BASE_URL="${host}"
exec claude "\$@"
SCRIPT
chmod +x "\$INSTALL_DIR/localclaude"
# Check PATH and add it to the shell config if necessary
if ! echo "\$PATH" | grep -q "\$INSTALL_DIR"; then
NEEDS_PATH_UPDATE=true
echo "» Adding \$INSTALL_DIR to ~/.bashrc and ~/.zshrc..."
echo "export PATH=\\"\$INSTALL_DIR:\$PATH\\"" >> "\$HOME/.bashrc"
echo "export PATH=\\"\$INSTALL_DIR:\$PATH\\"" >> "\$HOME/.zshrc" 2>/dev/null || true
fi
echo "✓ localclaude installiert in \$INSTALL_DIR"
echo ""
if [ "\$NEEDS_PATH_UPDATE" = "true" ]; then
echo "────────────────────────────────────────────"
echo "Führe diesen Befehl jetzt aus damit localclaude sofort verfügbar ist:"
echo ""
echo " export PATH=\\"\$INSTALL_DIR:\$PATH\\""
echo ""
echo "In neuen Shell-Sessions ist es automatisch verfügbar."
echo "────────────────────────────────────────────"
else
echo "Starte mit: localclaude"
fi
echo ""
`);
});
// ── Helper functions ─────────────────────────────────────────────────────────
function sanitizeToolSchema(schema) {
if (!schema || typeof schema !== 'object') {
return { type: 'object', properties: {} };
@@ -70,7 +188,6 @@ function stringifyToolResultContent(content) {
return JSON.stringify(content);
}
function convertAnthropicToOllama(anthropicBody) {
const ollamaMessages = [];
@@ -116,7 +233,6 @@ function convertAnthropicToOllama(anthropicBody) {
ollamaMessages.push(assistantMsg);
} else {
const pendingText = [];
for (const item of msg.content) {
@@ -178,9 +294,7 @@ function parseToolArguments(args) {
}
}
if (typeof args === 'object') return args;
return {};
}
@@ -193,6 +307,8 @@ function makeToolDedupeKey(tc) {
return `${name}:${argsString}`;
}
// ── Response handler ─────────────────────────────────────────────────────────
async function handleResponse(response, anthropicBody, res, requestNum) {
res.setHeader('Content-Type', 'text/event-stream');
res.setHeader('Cache-Control', 'no-cache');
@@ -200,7 +316,6 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
const messageId = 'msg_' + requestNum;
res.write(`event: message_start\ndata: ${JSON.stringify({
type: 'message_start',
message: {
@@ -226,7 +341,6 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
let messageFinished = false;
let buffer = '';
function processChunk(data) {
if (messageFinished) return;
@@ -253,21 +367,13 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
res.write(`event: content_block_start\ndata: ${JSON.stringify({
type: 'content_block_start',
index: currentBlockIndex,
content_block: { type: 'tool_use', id: toolUseId, name: toolName, input: {} }
})}\n\n`);
res.write(`event: content_block_delta\ndata: ${JSON.stringify({
type: 'content_block_delta',
index: currentBlockIndex,
delta: { type: 'input_json_delta', partial_json: JSON.stringify(toolInput) }
})}\n\n`);
res.write(`event: content_block_stop\ndata: ${JSON.stringify({
@@ -286,10 +392,7 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
res.write(`event: content_block_start\ndata: ${JSON.stringify({
type: 'content_block_start',
index: currentBlockIndex,
content_block: { type: 'text', text: '' }
})}\n\n`);
contentBlocks[currentBlockIndex] = '';
@@ -300,10 +403,7 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
res.write(`event: content_block_delta\ndata: ${JSON.stringify({
type: 'content_block_delta',
index: currentBlockIndex,
delta: { type: 'text_delta', text }
})}\n\n`);
contentBlocks[currentBlockIndex] += text;
@@ -321,17 +421,11 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
res.write(`event: message_delta\ndata: ${JSON.stringify({
type: 'message_delta',
delta: { stop_reason: emittedToolUse ? 'tool_use' : 'end_turn' },
usage: { output_tokens: data.eval_count || 0 }
})}\n\n`);
res.write(`event: message_stop\ndata: ${JSON.stringify({ type: 'message_stop' })}\n\n`);
console.log(`${colors.green}${colors.reset}\n`);
}
@@ -359,7 +453,6 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
}
}
if (buffer.trim()) {
try {
processChunk(JSON.parse(buffer.trim()));
@@ -379,22 +472,18 @@ async function handleResponse(response, anthropicBody, res, requestNum) {
res.write(`event: message_delta\ndata: ${JSON.stringify({
type: 'message_delta',
delta: { stop_reason: emittedToolUse ? 'tool_use' : 'end_turn' },
usage: { output_tokens: 0 }
})}\n\n`);
res.write(`event: message_stop\ndata: ${JSON.stringify({ type: 'message_stop' })}\n\n`);
}
res.end();
}
// ── Proxy endpoint ───────────────────────────────────────────────────────────
app.post('/v1/messages', async (req, res) => {
const requestNum = Date.now();
@@ -404,20 +493,20 @@ app.post('/v1/messages', async (req, res) => {
const anthropicBody = req.body;
if (anthropicBody.model?.startsWith('claude-')) {
anthropicBody.model = OLLAMA_MODEL;
}
const ollamaBody = convertAnthropicToOllama(anthropicBody);
console.log(
`${colors.magenta}[msgs=${ollamaBody.messages.length}, tools=${ollamaBody.tools?.length || 0}, ctx=256k, think=false, model=${OLLAMA_MODEL}]${colors.reset}`
);
const response = await fetch(OLLAMA_URL, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${OLLAMA_AUTH}`
},
body: JSON.stringify(ollamaBody)
});
@@ -435,10 +524,7 @@ app.post('/v1/messages', async (req, res) => {
if (!res.headersSent) {
res.status(500).json({
type: 'error',
error: { type: 'api_error', message: error.message }
});
} else {
res.end();
@@ -446,6 +532,9 @@ app.post('/v1/messages', async (req, res) => {
}
});
app.listen(PORT, () => {
console.log(`${colors.magenta}noThinkProxy: localhost:${PORT}${colors.reset}`);
console.log(`${colors.cyan} Ollama : ${OLLAMA_URL}${colors.reset}`);
console.log(`${colors.cyan} Model : ${OLLAMA_MODEL}${colors.reset}`);
console.log(`${colors.cyan} Ctx : 256k Think: false${colors.reset}\n`);
});

23
docker-compose.yml Normal file

@@ -0,0 +1,23 @@
services:
nothinkproxy:
image: registry.aquantico.lan/nothinkproxy:latest
container_name: nothinkproxy
restart: always
networks:
- traefik
environment:
- PORT=11435
- OLLAMA_URL=https://ollama.aquantico.de/api/chat
- OLLAMA_MODEL=qwen3.6:35b-a3b-q4_K_M
- OLLAMA_AUTH=324GF44-50AA-4B57-9386-K435DLJ764DFR
labels:
- "traefik.enable=true"
- "traefik.http.routers.nothinkproxy.rule=Host(`claude-proxy.aquantico.lan`)"
- "traefik.http.routers.nothinkproxy.entrypoints=websecure"
- "traefik.http.routers.nothinkproxy.tls=true"
- "traefik.http.services.nothinkproxy.loadbalancer.server.port=11435"
- "traefik.docker.network=traefik"
networks:
traefik:
external: true