# Anthropic

Connect Scrub to Anthropic to use Claude models with PHI protection.
## Setup

- Get an Anthropic API key from console.anthropic.com
- Log in to your Scrub Dashboard
- Go to Providers and select Anthropic
- Paste your API key and save
## Available Models

| Model | Description |
|---|---|
| claude-3-opus | Most capable Claude model |
| claude-3-sonnet | Balanced performance and speed |
| claude-3-haiku | Fastest Claude model |
| claude-3-5-sonnet | Latest Claude 3.5 Sonnet |
## Example Request

Use the same OpenAI-compatible format; Scrub handles the translation:

```bash
curl https://api.scrub.health/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $SCRUB_API_KEY" \
  -d '{
    "model": "claude-3-sonnet",
    "messages": [
      {"role": "system", "content": "You are a helpful healthcare assistant."},
      {"role": "user", "content": "What are the symptoms of hypertension?"}
    ],
    "max_tokens": 1000
  }'
```
## Using the OpenAI SDK

You can use the OpenAI SDK with Claude models:

**Python**

```python
from openai import OpenAI

client = OpenAI(
    api_key="your_scrub_api_key",
    base_url="https://api.scrub.health/v1"
)

response = client.chat.completions.create(
    model="claude-3-sonnet",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ],
    max_tokens=1000
)

print(response.choices[0].message.content)
```
**TypeScript**

```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.SCRUB_API_KEY,
  baseURL: 'https://api.scrub.health/v1',
});

const response = await client.chat.completions.create({
  model: 'claude-3-sonnet',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ],
  max_tokens: 1000,
});

console.log(response.choices[0].message.content);
```
## Parameter Mapping

Scrub translates OpenAI parameters to Anthropic's format:

| OpenAI Parameter | Anthropic Equivalent |
|---|---|
| messages | messages (with system extracted) |
| max_tokens | max_tokens |
| temperature | temperature |
| top_p | top_p |
| stop | stop_sequences |
| stream | stream |
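Most parameters pass through unchanged; the one name that differs is `stop`. A minimal Python sketch of that rename (a hypothetical helper for illustration, not Scrub's actual implementation):

```python
# Hypothetical helper mirroring the mapping table above; not Scrub's actual code.
def translate_params(params: dict) -> dict:
    """Rename OpenAI parameter names to their Anthropic equivalents."""
    renamed = {"stop": "stop_sequences"}  # the only name that differs
    return {renamed.get(key, key): value for key, value in params.items()}

print(translate_params({"temperature": 0.7, "stop": ["\n\n"]}))
# → {'temperature': 0.7, 'stop_sequences': ['\n\n']}
```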
## System Messages

Anthropic handles system messages differently. Scrub automatically extracts system messages and formats them correctly for Claude.

Your request (OpenAI format):

```json
{
  "model": "claude-3-sonnet",
  "messages": [
    {"role": "system", "content": "You are a doctor."},
    {"role": "user", "content": "Hello!"}
  ]
}
```

What Scrub sends to Anthropic:

```json
{
  "model": "claude-3-sonnet-20240229",
  "system": "You are a doctor.",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ]
}
```
## Streaming

Streaming is fully supported with Claude models:

```python
stream = client.chat.completions.create(
    model="claude-3-sonnet",
    messages=[{"role": "user", "content": "Tell me a story"}],
    max_tokens=1000,
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
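Because the stream is OpenAI-shaped, the loop above can be factored into a reusable collector. A self-contained sketch using stand-in chunk objects in place of a live response (illustrative only; a real stream comes from the SDK call):

```python
from types import SimpleNamespace

def collect_stream(stream) -> str:
    """Concatenate the text deltas from an OpenAI-style streaming response."""
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:
            parts.append(delta)
    return "".join(parts)

# Stand-in chunks shaped like the SDK's streaming objects (for illustration)
def fake_chunk(text):
    return SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=text))])

print(collect_stream([fake_chunk("Once"), fake_chunk(" upon"), fake_chunk(None)]))
# → Once upon
```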