Anthropic

Connect Scrub to Anthropic to use Claude models with PHI protection.

Setup

  1. Get an Anthropic API key from console.anthropic.com
  2. Log in to your Scrub Dashboard
  3. Go to Providers and select Anthropic
  4. Paste your API key and save

Available Models

Model                Description
claude-3-opus        Most capable Claude model
claude-3-sonnet      Balanced performance and speed
claude-3-haiku       Fastest Claude model
claude-3-5-sonnet    Latest Claude 3.5 Sonnet
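
As the System Messages example below shows, Scrub resolves these aliases to Anthropic's dated model identifiers (for example, claude-3-sonnet is sent to Anthropic as claude-3-sonnet-20240229).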

Example Request

Use the same OpenAI-compatible request format; Scrub handles the translation to Anthropic's API:

curl https://api.scrub.health/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $SCRUB_API_KEY" \
  -d '{
    "model": "claude-3-sonnet",
    "messages": [
      {"role": "system", "content": "You are a helpful healthcare assistant."},
      {"role": "user", "content": "What are the symptoms of hypertension?"}
    ],
    "max_tokens": 1000
  }'

Using the OpenAI SDK

You can use the OpenAI SDK with Claude models by pointing the base URL at Scrub; examples in Python and TypeScript are shown below:

Python:

from openai import OpenAI

client = OpenAI(
    api_key="your_scrub_api_key",
    base_url="https://api.scrub.health/v1"
)

response = client.chat.completions.create(
    model="claude-3-sonnet",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
    max_tokens=1000
)

print(response.choices[0].message.content)

TypeScript:

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.SCRUB_API_KEY,
  baseURL: 'https://api.scrub.health/v1',
});

const response = await client.chat.completions.create({
  model: 'claude-3-sonnet',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ],
  max_tokens: 1000,
});

console.log(response.choices[0].message.content);

Parameter Mapping

Scrub translates OpenAI parameters to Anthropic's format:

OpenAI Parameter    Anthropic Equivalent
messages            messages (with system extracted)
max_tokens          max_tokens
temperature         temperature
top_p               top_p
stop                stop_sequences
stream              stream
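
For example, a request that sets temperature, top_p, and stop in the OpenAI format is forwarded with the same values, with stop renamed to stop_sequences. A minimal sketch, reusing the client from the Python SDK example above (the prompt and parameter values are illustrative):

response = client.chat.completions.create(
    model="claude-3-sonnet",
    messages=[{"role": "user", "content": "Summarize the warning signs of hypertension."}],
    max_tokens=500,
    temperature=0.2,   # forwarded to Anthropic as temperature
    top_p=0.9,         # forwarded to Anthropic as top_p
    stop=["\n\n"],     # forwarded to Anthropic as stop_sequences
)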

System Messages

Anthropic's Messages API takes the system prompt as a top-level system field rather than as a chat message. Scrub automatically extracts system messages from the messages array and formats them correctly for Claude:

// Your request (OpenAI format)
{
  "model": "claude-3-sonnet",
  "messages": [
    {"role": "system", "content": "You are a doctor."},
    {"role": "user", "content": "Hello!"}
  ]
}

// What Scrub sends to Anthropic
{
  "model": "claude-3-sonnet-20240229",
  "system": "You are a doctor.",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ]
}
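
Conceptually, the translation just pulls system-role entries out of the messages array and passes them as the top-level system field. The sketch below is a simplified illustration in Python, not Scrub's actual implementation:

def to_anthropic_format(openai_request):
    # Illustrative only: split system messages out of an OpenAI-style payload.
    messages = openai_request["messages"]
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    anthropic_request = {
        # Scrub also resolves the alias to a dated Anthropic model ID (not shown here).
        "model": openai_request["model"],
        "messages": [m for m in messages if m["role"] != "system"],
        "max_tokens": openai_request.get("max_tokens", 1000),
    }
    if system_parts:
        anthropic_request["system"] = "\n\n".join(system_parts)
    return anthropic_request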

Streaming

Streaming is fully supported with Claude models:

stream = client.chat.completions.create(
    model="claude-3-sonnet",
    messages=[{"role": "user", "content": "Tell me a story"}],
    max_tokens=1000,
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")