# Google (Gemini)
Connect Scrub AI to Google AI to use Gemini models with PHI protection.
## Setup
- Get a Google AI API key from Google AI Studio (aistudio.google.com, formerly makersuite.google.com)
- Log in to your Scrub AI Dashboard
- Go to Providers and select Google
- Paste your API key and save
## Available Models
| Model | Description |
|---|---|
| gemini-3-pro-preview | Most capable model for multimodal understanding |
| gemini-3-flash-preview | Balanced model built for speed, scale, and frontier intelligence |
| gemini-2.5-pro | State-of-the-art thinking model |
| gemini-2.5-flash | Best price-performance, offering well-rounded capabilities |
| gemini-2.5-flash-lite | Fastest Flash model, optimized for cost-efficiency and high throughput |
## Example Request
Use the same OpenAI-compatible format; Scrub AI handles the translation:

```bash
curl https://api.scrub.health/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $SCRUB_API_KEY" \
  -d '{
    "model": "gemini-3-flash-preview",
    "messages": [
      {"role": "system", "content": "You are a helpful healthcare assistant."},
      {"role": "user", "content": "What are the symptoms of hypertension?"}
    ]
  }'
```
## Using the OpenAI SDK
You can use the OpenAI SDK with Gemini models:
```python
from openai import OpenAI

client = OpenAI(
    api_key="your_scrub_api_key",
    base_url="https://api.scrub.health/v1",
)

response = client.chat.completions.create(
    model="gemini-3-flash-preview",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message.content)
```
The same works with the Node.js SDK:

```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.SCRUB_API_KEY,
  baseURL: 'https://api.scrub.health/v1',
});

const response = await client.chat.completions.create({
  model: 'gemini-3-flash-preview',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' },
  ],
});
console.log(response.choices[0].message.content);
```
## Parameter Mapping
Scrub AI translates OpenAI parameters to Google's format:
| OpenAI Parameter | Google Equivalent |
|---|---|
| messages | contents |
| max_tokens | maxOutputTokens |
| temperature | temperature |
| top_p | topP |
| stop | stopSequences |
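As a rough sketch, this translation amounts to renaming keys before forwarding the request. The mapping and function below are illustrative only, not Scrub AI's actual implementation:

```python
# Illustrative sketch of the parameter translation table above --
# not Scrub AI's actual implementation.
PARAM_MAP = {
    "max_tokens": "maxOutputTokens",
    "temperature": "temperature",
    "top_p": "topP",
    "stop": "stopSequences",
}

def translate_params(openai_params: dict) -> dict:
    """Rename OpenAI generation parameters to Google's generationConfig keys."""
    return {PARAM_MAP[k]: v for k, v in openai_params.items() if k in PARAM_MAP}

print(translate_params({"max_tokens": 256, "top_p": 0.9}))
# {'maxOutputTokens': 256, 'topP': 0.9}
```

Unmapped keys (like `model`, which selects the upstream endpoint rather than a generation setting) are simply dropped from this dictionary in the sketch.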
## Message Format Translation
Scrub AI automatically converts OpenAI message format to Google's format:
```jsonc
// Your request (OpenAI format)
{
  "model": "gemini-3-flash-preview",
  "messages": [
    {"role": "system", "content": "You are a doctor."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there!"},
    {"role": "user", "content": "How are you?"}
  ]
}

// Scrub AI sends to Google
{
  "contents": [
    {"role": "user", "parts": [{"text": "You are a doctor.\n\nHello!"}]},
    {"role": "model", "parts": [{"text": "Hi there!"}]},
    {"role": "user", "parts": [{"text": "How are you?"}]}
  ]
}
```
## System Messages
Google Gemini doesn't have a dedicated system message field. Scrub AI prepends system content to the first user message automatically.
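A minimal sketch of this conversion, matching the translated example above (illustrative only; Scrub AI's internal logic may differ):

```python
def to_gemini_contents(messages: list[dict]) -> list[dict]:
    """Convert OpenAI-style messages to Gemini 'contents':
    rename 'assistant' to 'model' and prepend any system message
    to the first user message. Illustrative sketch only."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    contents = []
    for m in messages:
        if m["role"] == "system":
            continue
        text = m["content"]
        if m["role"] == "user" and system_parts:
            text = "\n\n".join(system_parts + [text])
            system_parts = []  # prepend the system content only once
        role = "model" if m["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [{"text": text}]})
    return contents
```

Running this on the four-message example above yields the three-entry `contents` array shown in the previous section.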
## Streaming
Streaming requests use the standard OpenAI pattern:

```python
stream = client.chat.completions.create(
    model="gemini-3-flash-preview",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True,
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
Note: Streaming translation from Google's format is currently in development. Non-streaming requests are fully supported.