2026-01-02 💡 Tips 'n' Tricks

SDK v0.45 / v0.35 Ship Structured Output Helpers & Batch API Gets a Lift


💡 Anthropic SDK v0.45 (Python) & v0.35 (TypeScript) — Structured Output Helpers

The first developer-focused release of 2026 has landed: Anthropic Python SDK v0.45.0 and TypeScript SDK v0.35.0 ship new structured output helper classes that make it significantly easier to enforce typed schemas on Claude's responses without manually wrapping every call in a validation layer. Both SDKs introduce a parse() method on the response object that validates the output against a Pydantic model (Python) or Zod schema (TypeScript) and raises a typed exception on mismatch.

# Python — structured output with automatic validation
from typing import Literal

from anthropic import Anthropic
from pydantic import BaseModel

client = Anthropic()

class AnalysisResult(BaseModel):
    sentiment: Literal["positive", "negative", "neutral"]
    confidence: float
    summary: str

response = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=512,
    messages=[{"role": "user", "content": "Analyse this review: ..."}]
)
result = response.parse(AnalysisResult)
print(result.sentiment)   # typed, validated
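The SDK's typed exception aside, the underlying validation is standard Pydantic, so the mismatch behaviour can be seen locally with no API call at all. A sketch using the same model, with `pydantic`'s own `ValidationError` standing in for the SDK's exception type:

```python
from typing import Literal

from pydantic import BaseModel, ValidationError

class AnalysisResult(BaseModel):
    sentiment: Literal["positive", "negative", "neutral"]
    confidence: float
    summary: str

# Well-formed model output validates into a fully typed object.
ok = AnalysisResult.model_validate_json(
    '{"sentiment": "positive", "confidence": 0.92, "summary": "Great."}'
)
assert ok.sentiment == "positive"

# A value outside the Literal set fails with a structured error,
# which is the class of mismatch parse() is designed to surface.
try:
    AnalysisResult.model_validate_json(
        '{"sentiment": "mixed", "confidence": 0.5, "summary": "Hmm."}'
    )
except ValidationError as e:
    print(e.error_count(), "validation error(s)")
```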

Both SDKs also add a stream_to_file() helper that pipes a streaming response directly into a file object, closing a longstanding gap for developers building document-generation pipelines. The Python SDK ships with updated type stubs generated from the OpenAPI spec, reducing IDE false-positives on newer parameters.
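The exact signature of stream_to_file() isn't documented here, but the boilerplate it replaces is easy to picture: draining text chunks from a stream into a file handle as they arrive. A minimal sketch of that pattern (the function below is an illustrative reimplementation, not the SDK's):

```python
import io
from typing import Iterable, TextIO

def stream_to_file(chunks: Iterable[str], fh: TextIO) -> int:
    """Write each text chunk to `fh` as it arrives; return characters written."""
    written = 0
    for chunk in chunks:
        fh.write(chunk)
        written += len(chunk)
    fh.flush()
    return written

# Stand-in for a streaming response's text chunks.
fake_stream = iter(["# Report\n", "First paragraph.\n"])
buf = io.StringIO()
n = stream_to_file(fake_stream, buf)
print(n, buf.getvalue().startswith("# Report"))
```

Writing chunk-by-chunk keeps memory flat regardless of response length, which is the point of having the helper in a document-generation pipeline.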


💡 Batch Messages API — Higher Limits & Real-Time Status Webhooks

Anthropic has updated the Batch Messages API with three improvements that address the most common friction points for teams running large-scale offline inference workloads. The changes take effect immediately for all tiers.

What changed

Batch pricing remains at 50% of the standard per-token rate. Anthropic notes that batch jobs now account for over 20% of total API token volume, driven primarily by data labelling, document classification, and evaluation pipeline use cases.
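Workloads like the labelling and classification pipelines mentioned above are submitted as one batch entry per document. A sketch of the request shape, with the document contents and prompt as illustrative placeholders (the submission call itself requires an API key and is shown commented out):

```python
def build_batch_requests(docs: list[str], model: str = "claude-sonnet-4-5") -> list[dict]:
    """One batch entry per document; custom_id lets results be matched back later."""
    return [
        {
            "custom_id": f"doc-{i}",
            "params": {
                "model": model,
                "max_tokens": 64,
                "messages": [
                    {"role": "user", "content": f"Classify this document: {d}"}
                ],
            },
        }
        for i, d in enumerate(docs)
    ]

requests = build_batch_requests(["invoice text ...", "support ticket ..."])

# Submission, per the Batch Messages API:
# from anthropic import Anthropic
# client = Anthropic()
# batch = client.messages.batches.create(requests=requests)
# print(batch.id, batch.processing_status)
```

Because results come back keyed by custom_id rather than in submission order, stable IDs like `doc-{i}` are what make large labelling runs joinable back to the source data.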
