Anthropic Messages API

Endpoint: POST /v1/messages
Docs: https://docs.anthropic.com/en/api/messages

Request

{
  "model":         "claude-opus-4-7",
  "max_tokens":    4096,
  "messages": [
    {
      "role": "user",
      "content": "string  OR  array of content blocks"
    }
  ],
  "system":        "string  OR  array of system blocks",
  "temperature":   0.7,
  "stop_sequences": ["---"],
  "tools": [
    {
      "name":        "tool_name",
      "description": "What this tool does",
      "input_schema": {
        "type": "object",
        "properties": {
          "param": { "type": "string", "description": "..." }
        },
        "required": ["param"]
      }
    }
  ],
  "tool_choice": { "type": "auto" },
  "stream": true
}
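
A minimal sketch of assembling this request in Python. The `build_request` helper and its defaults are illustrative, not part of any SDK; the header names and `anthropic-version` date follow the public API docs:

```python
import json

API_URL = "https://api.anthropic.com/v1/messages"

def build_request(api_key, model, messages, max_tokens=4096, **optional):
    """Assemble URL, headers, and JSON body for POST /v1/messages.

    `optional` may carry system, temperature, stop_sequences, tools,
    tool_choice, stream, etc. -- only non-None values are sent.
    """
    headers = {
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
    body = {"model": model, "max_tokens": max_tokens, "messages": messages}
    body.update({k: v for k, v in optional.items() if v is not None})
    return API_URL, headers, json.dumps(body)
```

The returned triple can be fed to any HTTP client; keeping body construction separate makes it easy to unit-test without hitting the network.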

Fields

| Field          | Type         | Required | Notes                                                        |
|----------------|--------------|----------|--------------------------------------------------------------|
| model          | string       | Yes      | e.g. claude-opus-4-7, claude-sonnet-4-6                      |
| max_tokens     | number       | Yes      | Max output tokens                                            |
| messages       | array        | Yes      | `role` is `user` or `assistant`; content is a string or array of content blocks |
| system         | string/array | No       | System prompt (separate from messages)                       |
| temperature    | number       | No       | 0.0–1.0                                                      |
| stop_sequences | array        | No       | Strings that stop generation                                 |
| tools          | array        | No       | Tool definitions with input_schema (JSON Schema)             |
| tool_choice    | object       | No       | `{"type": "auto"}`, `{"type": "any"}`, or `{"type": "tool", "name": "..."}` |
| stream         | boolean      | No       | Enable SSE streaming                                         |

Message Content Types

{ "type": "text",        "text": "string" }
{ "type": "image",       "source": { "type": "base64", "media_type": "image/jpeg", "data": "..." } }
{ "type": "tool_use",    "id": "toolu_abc", "name": "tool_name", "input": {} }
{ "type": "tool_result", "tool_use_id": "toolu_abc", "content": "result string or array" }
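
Small constructors for these block shapes keep request-building code tidy. An illustrative sketch (the helper names are made up; `is_error` is an optional flag the API accepts on tool_result blocks):

```python
def text_block(text):
    # {"type": "text", "text": ...}
    return {"type": "text", "text": text}

def image_block(data_b64, media_type="image/jpeg"):
    # base64-encoded image source, matching the shape above
    return {"type": "image",
            "source": {"type": "base64", "media_type": media_type, "data": data_b64}}

def tool_result_block(tool_use_id, content, is_error=False):
    # content may be a plain string or an array of content blocks
    block = {"type": "tool_result", "tool_use_id": tool_use_id, "content": content}
    if is_error:
        block["is_error"] = True
    return block
```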

Streaming SSE Events

SSE format: each event is two lines, `event: <type>` and `data: <json>`, followed by a blank line.

Event Order

  1. message_start
  2. For each content block: content_block_start → N× content_block_delta → content_block_stop
  3. message_delta
  4. message_stop

`ping` events may be interspersed at any time.
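
Under these framing rules, a buffered stream splits into typed events with a few lines of Python. A sketch that assumes the whole stream is already in memory and every event is terminated by a blank line; a production parser would read incrementally:

```python
import json

def parse_sse(raw):
    """Split a raw SSE string into (event_type, data_dict) pairs."""
    events = []
    event_type, data_lines = None, []
    for line in raw.splitlines():
        if line.startswith("event:"):
            event_type = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "" and event_type is not None:
            # blank line terminates the event: decode its JSON payload
            events.append((event_type, json.loads("\n".join(data_lines))))
            event_type, data_lines = None, []
    return events
```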

message_start

{
  "type": "message_start",
  "message": {
    "id":    "msg_01XFDUDYJgAACzvnptvVoYEL",
    "type":  "message",
    "role":  "assistant",
    "model": "claude-opus-4-7",
    "content": [],
    "stop_reason":   null,
    "stop_sequence": null,
    "usage": { "input_tokens": 25, "output_tokens": 1 }
  }
}

content_block_start

{ "type": "content_block_start", "index": 0,
  "content_block": { "type": "text", "text": "" } }

{ "type": "content_block_start", "index": 1,
  "content_block": { "type": "tool_use", "id": "toolu_abc", "name": "get_weather", "input": {} } }

content_block_delta

{ "type": "content_block_delta", "index": 0,
  "delta": { "type": "text_delta",       "text": "Hello" } }

{ "type": "content_block_delta", "index": 1,
  "delta": { "type": "input_json_delta", "partial_json": "{\"loc" } }

content_block_stop

{ "type": "content_block_stop", "index": 0 }
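
Putting the three block events together, a client can reassemble final content like this. An illustrative reducer, assuming the `input_json_delta` fragments of one block concatenate into a single valid JSON document:

```python
import json

def accumulate_blocks(events):
    """Fold content_block_start/delta/stop events into final content blocks."""
    blocks = {}
    for ev in events:
        t = ev["type"]
        if t == "content_block_start":
            blocks[ev["index"]] = dict(ev["content_block"])
            blocks[ev["index"]].setdefault("_json", "")  # buffer for tool input
        elif t == "content_block_delta":
            d = ev["delta"]
            if d["type"] == "text_delta":
                blocks[ev["index"]]["text"] += d["text"]
            elif d["type"] == "input_json_delta":
                blocks[ev["index"]]["_json"] += d["partial_json"]
        elif t == "content_block_stop":
            b = blocks[ev["index"]]
            raw = b.pop("_json", "")
            if raw:
                b["input"] = json.loads(raw)  # parse the assembled tool input
    return [blocks[i] for i in sorted(blocks)]
```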

message_delta

{
  "type": "message_delta",
  "delta": {
    "stop_reason":   "end_turn",
    "stop_sequence": null
  },
  "usage": { "output_tokens": 15 }
}

stop_reason values: end_turn | stop_sequence | max_tokens | tool_use
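
The final message is obtained by layering `message_delta` over the `message_start` snapshot. A sketch, assuming (as the counts above suggest) that `message_delta.usage.output_tokens` is the cumulative final figure that replaces the early estimate from `message_start` rather than adding to it:

```python
def finalize_message(message_start, message_delta):
    """Merge the message_start snapshot with message_delta's final fields."""
    msg = dict(message_start["message"])
    msg.update(message_delta["delta"])        # stop_reason, stop_sequence
    usage = dict(msg["usage"])
    usage.update(message_delta["usage"])      # final output_tokens wins
    msg["usage"] = usage
    return msg
```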

message_stop

{ "type": "message_stop" }

Non-Streaming Response

{
  "id":   "msg_abc",
  "type": "message",
  "role": "assistant",
  "model": "claude-opus-4-7",
  "content": [
    { "type": "text", "text": "Hello!" },
    { "type": "tool_use", "id": "toolu_abc", "name": "get_weather", "input": { "location": "Berlin" } }
  ],
  "stop_reason":   "tool_use",
  "stop_sequence": null,
  "usage": { "input_tokens": 100, "output_tokens": 50 }
}

Tool Flow

  1. Send request with tools array
  2. Model responds with stop_reason: "tool_use" and content block of type: "tool_use"
  3. Execute the tool locally
  4. Send next user message with type: "tool_result" content block referencing tool_use_id
  5. Continue until stop_reason: "end_turn"
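
The turn-building part of this loop (steps 2–4) can be sketched as pure data manipulation. `run_tool(name, input) -> str` is a caller-supplied, hypothetical executor; any local dispatch works:

```python
def tool_result_turn(assistant_message, run_tool):
    """Build the two messages to append after a stop_reason == "tool_use" reply."""
    results = []
    for block in assistant_message["content"]:
        if block["type"] == "tool_use":
            output = run_tool(block["name"], block["input"])  # step 3: run locally
            results.append({"type": "tool_result",
                            "tool_use_id": block["id"],
                            "content": output})
    # Echo the assistant turn verbatim, then answer with the tool results.
    assistant_turn = {"role": "assistant", "content": assistant_message["content"]}
    user_turn = {"role": "user", "content": results}
    return [assistant_turn, user_turn]
```

Append both turns to `messages` and re-send; repeat until the model stops with end_turn.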