Anthropic Integration

Full observability for the Anthropic Claude API. Track messages, tool use, token usage, and streaming responses across the Claude Sonnet, Opus, and Haiku model families.

Anthropic SDK >= 0.20.0 · Claude 3.5 · Claude 3 Opus/Sonnet/Haiku · Tool Use

Installation

Terminal
pip install turingpulse_sdk turingpulse_sdk_anthropic anthropic

Quick Start

main.py
from anthropic import Anthropic
from turingpulse_sdk import init, TuringPulseConfig
from turingpulse_sdk_anthropic import patch_anthropic

# Initialize TuringPulse
init(TuringPulseConfig(
    api_key="sk_live_your_api_key",
    workflow_name="my-project",
))

# Instrument Anthropic - wraps all API calls
patch_anthropic()

# Your code works exactly the same - now with full tracing!
client = Anthropic()
message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, Claude!"}],
)
print(message.content[0].text)

Streaming Support

streaming.py
client = Anthropic()

# Streaming is automatically tracked
with client.messages.stream(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a poem about AI"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)

# Trace captures time-to-first-token and full response
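If you want the time-to-first-token number locally as well as in the trace, it is easy to compute by hand. A minimal sketch (not part of the SDK) that wraps any text iterator, such as `stream.text_stream`:

```python
import time
from typing import Iterator, Optional, Tuple

def measure_ttft(chunks: Iterator[str]) -> Tuple[str, Optional[float]]:
    """Consume a text stream, returning the full text and the
    time-to-first-token in seconds (None if the stream was empty)."""
    start = time.monotonic()
    ttft = None
    parts = []
    for text in chunks:
        if ttft is None:
            # First chunk arrived: record elapsed time since the call started.
            ttft = time.monotonic() - start
        parts.append(text)
    return "".join(parts), ttft
```

Usage: `text, ttft = measure_ttft(stream.text_stream)` inside the `with` block above.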

Tool Use

tools.py
client = Anthropic()

tools = [
    {
        "name": "get_weather",
        "description": "Get the current weather for a location",
        "input_schema": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name"},
            },
            "required": ["location"],
        },
    },
]

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    tools=tools,
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
)

# Tool use blocks are captured in the trace
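When the response's `stop_reason` is `tool_use`, you execute the tool yourself and send the output back as a `tool_result` block. A sketch of that dispatch step, written over plain dicts for illustration (real response content blocks are typed objects with the same `type`, `name`, `input`, and `id` fields):

```python
def build_tool_results(content_blocks, handlers):
    """Run the matching local handler for each tool_use block and
    build the tool_result blocks to send back in the next request."""
    results = []
    for block in content_blocks:
        if block.get("type") == "tool_use":
            output = handlers[block["name"]](block["input"])
            results.append({
                "type": "tool_result",
                "tool_use_id": block["id"],
                "content": output,
            })
    return results

# Hypothetical local implementation of the get_weather tool.
handlers = {"get_weather": lambda inp: f"Sunny, 22C in {inp['location']}"}
```

Append the results as `{"role": "user", "content": results}` and call `messages.create` again; the follow-up call is traced like any other.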

With KPIs & Alerts

kpis.py
from anthropic import Anthropic
from turingpulse_sdk import instrument, KPIConfig, GovernanceDirective
from turingpulse_sdk_anthropic import patch_anthropic

patch_anthropic(name="anthropic-service", governance=GovernanceDirective(hatl=True))
client = Anthropic()

@instrument(
    name="anthropic-agent",
    kpis=[
        KPIConfig(kpi_id="latency_ms", use_duration=True, alert_threshold=5000),
        KPIConfig(kpi_id="cost_usd", from_result_path="cost", alert_threshold=0.10, comparator="gt"),
    ],
)
def my_agent(query: str):
    return client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[{"role": "user", "content": query}],
    )
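The `cost_usd` KPI reads a `cost` field from the function's result via `from_result_path`, so the result needs to carry one. A sketch of estimating cost from the response's token usage; the per-million-token rates below are illustrative placeholders, so verify against Anthropic's current pricing before relying on them:

```python
# Illustrative (input_usd, output_usd) rates per million tokens;
# check Anthropic's pricing page before using these in production.
RATES = {"claude-sonnet-4-20250514": (3.00, 15.00)}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate request cost in USD from token counts."""
    in_rate, out_rate = RATES[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000
```

The agent could then return `{"response": message, "cost": estimate_cost(message.model, message.usage.input_tokens, message.usage.output_tokens)}` so that `from_result_path="cost"` resolves.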

What Gets Captured

| Data Point | Description | Example |
|---|---|---|
| Messages | Full message request and response with model info | `claude-sonnet-4-20250514`, `stop_reason: end_turn` |
| Tool Use | Tool use blocks with inputs and tool results | `get_weather(location='Tokyo')` |
| Token Usage | Input and output token counts per message | `input: 250, output: 180` |
| Streaming | Time-to-first-token and streaming event traces | `ttft: 220ms, total: 2800ms` |
| Model Parameters | Temperature, max_tokens, top_p, and system prompt | `temp=0.7, max_tokens=1024` |
| Latency | End-to-end request timing | `total: 1900ms` |
| Errors | API errors with status codes and context | `OverloadedError: 529 Overloaded` |
💡 Cost Tracking: TuringPulse automatically calculates costs based on Anthropic pricing for each Claude model variant. View cost breakdowns in the dashboard.

Next Steps