
Anthropic

Connect Scrub AI to Anthropic to use Claude models with PHI protection.

Setup

  1. Get an Anthropic API key from console.anthropic.com
  2. Log in to your Scrub AI Dashboard
  3. Go to Providers and select Anthropic
  4. Paste your API key and save
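
Once the key is saved, a quick way to confirm everything is wired up is a minimal request through the OpenAI-compatible endpoint shown later on this page. This is a hedged sketch using only the Python standard library; it assumes SCRUB_API_KEY is set in your environment, and uses claude-haiku-4-5 (the fastest listed model) to keep the sanity check cheap:

```python
import os
import json
import urllib.request

API_URL = "https://api.scrub.health/v1/chat/completions"

def build_request(api_key: str) -> urllib.request.Request:
    """Build a minimal OpenAI-format chat request for the Scrub AI gateway."""
    payload = {
        "model": "claude-haiku-4-5",  # fastest listed model; cheap sanity check
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 10,
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

if __name__ == "__main__":
    req = build_request(os.environ["SCRUB_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

A 200 response with a short completion confirms the key and provider are configured; a 401 usually means the Scrub AI key (not the Anthropic key) is wrong or missing.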

Available Models

Model            | Description
-----------------|------------
claude-opus-4-5  | Premium model combining maximum intelligence with practical performance
claude-sonnet-4-5 | Our smart model for complex agents and coding
claude-haiku-4-5 | Our fastest model with near-frontier intelligence
claude-opus-4-1  | Legacy model
claude-sonnet-4-0 | Legacy model

Example Request

Use the same OpenAI-compatible request format; Scrub AI handles the translation to Anthropic's API:

curl https://api.scrub.health/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $SCRUB_API_KEY" \
  -d '{
    "model": "claude-sonnet-4-5",
    "messages": [
      {"role": "system", "content": "You are a helpful healthcare assistant."},
      {"role": "user", "content": "What are the symptoms of hypertension?"}
    ],
    "max_tokens": 1000
  }'

Using the OpenAI SDK

You can use the OpenAI SDK with Claude models:

Python:

from openai import OpenAI

client = OpenAI(
    api_key="your_scrub_api_key",
    base_url="https://api.scrub.health/v1"
)

response = client.chat.completions.create(
    model="claude-sonnet-4-5",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
    max_tokens=1000
)

print(response.choices[0].message.content)

TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.SCRUB_API_KEY,
  baseURL: 'https://api.scrub.health/v1',
});

const response = await client.chat.completions.create({
  model: 'claude-sonnet-4-5',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ],
  max_tokens: 1000,
});

console.log(response.choices[0].message.content);

Parameter Mapping

Scrub AI translates OpenAI parameters to Anthropic's format:

OpenAI Parameter | Anthropic Equivalent
-----------------|---------------------
messages         | messages (with system extracted)
max_tokens       | max_tokens
temperature      | temperature
top_p            | top_p
stop             | stop_sequences
stream           | stream
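
The table above can be sketched as a small translation function. This is a hypothetical illustration of the mapping, not Scrub AI's actual code; the real translation happens server-side inside the gateway, and system-message extraction is covered separately under System Messages below:

```python
def translate_params(openai_request: dict) -> dict:
    """Rename OpenAI-style parameters to their Anthropic equivalents.

    Hypothetical sketch of the parameter-mapping table; Scrub AI
    performs this translation inside the gateway. System-message
    extraction is handled as a separate step.
    """
    # Parameters whose names differ between the two APIs
    renames = {"stop": "stop_sequences"}
    # Parameters that pass through unchanged
    passthrough = {"model", "messages", "max_tokens", "temperature", "top_p", "stream"}

    anthropic_request = {}
    for key, value in openai_request.items():
        if key in renames:
            anthropic_request[renames[key]] = value
        elif key in passthrough:
            anthropic_request[key] = value
    return anthropic_request

print(translate_params({"model": "claude-sonnet-4-5", "stop": ["END"], "max_tokens": 100}))
```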

System Messages

Anthropic handles system messages differently: its Messages API takes the system prompt as a top-level system parameter rather than as a system role inside the messages array. Scrub AI automatically extracts system messages and formats them correctly for Claude:

// Your request (OpenAI format)
{
  "model": "claude-sonnet-4-5",
  "messages": [
    {"role": "system", "content": "You are a doctor."},
    {"role": "user", "content": "Hello!"}
  ]
}

// What Scrub AI sends to Anthropic
{
  "model": "claude-sonnet-4-5",
  "system": "You are a doctor.",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ]
}
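
The extraction step shown above can be sketched in a few lines. This is an illustrative helper, not the gateway's actual implementation; Scrub AI performs the equivalent translation server-side:

```python
def extract_system(messages: list) -> tuple:
    """Split OpenAI-style messages into Anthropic's top-level system
    string plus the remaining (non-system) messages.

    Illustrative only; Scrub AI does this translation server-side.
    Multiple system messages are joined with newlines; if there are
    none, the system value is None.
    """
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return ("\n".join(system_parts) or None), rest

system, messages = extract_system([
    {"role": "system", "content": "You are a doctor."},
    {"role": "user", "content": "Hello!"},
])
# system == "You are a doctor."
# messages == [{"role": "user", "content": "Hello!"}]
```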

Streaming

Streaming is fully supported with Claude models:

stream = client.chat.completions.create(
    model="claude-sonnet-4-5",
    messages=[{"role": "user", "content": "Tell me a story"}],
    max_tokens=1000,
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")