Google (Gemini)

Connect Scrub to Google AI to use Gemini models with PHI protection.

Setup

  1. Get a Google AI API key from Google AI Studio (aistudio.google.com, formerly makersuite.google.com)
  2. Log in to your Scrub Dashboard
  3. Go to Providers and select Google
  4. Paste your API key and save

Available Models

| Model | Description |
|---|---|
| `gemini-pro` | Balanced Gemini model |
| `gemini-1.5-pro` | Latest Gemini 1.5 Pro with a 1M-token context window |
| `gemini-1.5-flash` | Fast and efficient |

Example Request

Use the same OpenAI-compatible request format; Scrub handles the translation to Google's API:

```shell
curl https://api.scrub.health/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $SCRUB_API_KEY" \
  -d '{
    "model": "gemini-1.5-pro",
    "messages": [
      {"role": "system", "content": "You are a helpful healthcare assistant."},
      {"role": "user", "content": "What are the symptoms of hypertension?"}
    ]
  }'
```

Using the OpenAI SDK

You can use the OpenAI SDK with Gemini models:

Python:

```python
from openai import OpenAI

client = OpenAI(
    api_key="your_scrub_api_key",
    base_url="https://api.scrub.health/v1"
)

response = client.chat.completions.create(
    model="gemini-1.5-pro",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)
```

TypeScript:

```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.SCRUB_API_KEY,
  baseURL: 'https://api.scrub.health/v1',
});

const response = await client.chat.completions.create({
  model: 'gemini-1.5-pro',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ],
});

console.log(response.choices[0].message.content);
```

Parameter Mapping

Scrub translates OpenAI parameters to Google's format:

| OpenAI Parameter | Google Equivalent |
|---|---|
| `messages` | `contents` |
| `max_tokens` | `maxOutputTokens` |
| `temperature` | `temperature` |
| `top_p` | `topP` |
| `stop` | `stopSequences` |
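The sampling-parameter portion of the table boils down to a key rename. A minimal illustrative sketch (not Scrub's actual code; `PARAM_MAP` and `translate_params` are hypothetical names):

```python
# Hypothetical sketch of the OpenAI -> Google parameter rename.
PARAM_MAP = {
    "max_tokens": "maxOutputTokens",
    "temperature": "temperature",
    "top_p": "topP",
    "stop": "stopSequences",
}

def translate_params(openai_params: dict) -> dict:
    """Map OpenAI-style sampling parameters to Google's generationConfig keys,
    dropping anything without a known equivalent."""
    return {
        PARAM_MAP[key]: value
        for key, value in openai_params.items()
        if key in PARAM_MAP
    }

print(translate_params({"max_tokens": 256, "top_p": 0.9}))
# {'maxOutputTokens': 256, 'topP': 0.9}
```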

Message Format Translation

Scrub automatically converts OpenAI message format to Google's format:

Your request (OpenAI format):

```json
{
  "model": "gemini-1.5-pro",
  "messages": [
    {"role": "system", "content": "You are a doctor."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi there!"},
    {"role": "user", "content": "How are you?"}
  ]
}
```

What Scrub sends to Google:

```json
{
  "contents": [
    {"role": "user", "parts": [{"text": "You are a doctor.\n\nHello!"}]},
    {"role": "model", "parts": [{"text": "Hi there!"}]},
    {"role": "user", "parts": [{"text": "How are you?"}]}
  ]
}
```

System Messages

Google Gemini doesn't have a dedicated system message field. Scrub prepends system content to the first user message automatically.
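The role mapping and system-message merge described above can be sketched as follows (illustrative only, not Scrub's implementation; `to_gemini_contents` is a hypothetical helper):

```python
def to_gemini_contents(messages: list[dict]) -> list[dict]:
    """Convert OpenAI-style messages to Gemini-style 'contents':
    'assistant' becomes 'model', and system content is folded into
    the first user message. Illustrative sketch only."""
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    contents = []
    for m in messages:
        if m["role"] == "system":
            continue  # merged into the first user turn below
        role = "model" if m["role"] == "assistant" else "user"
        text = m["content"]
        if role == "user" and system_parts:
            text = "\n\n".join(system_parts + [text])
            system_parts = []  # prepend only once
        contents.append({"role": role, "parts": [{"text": text}]})
    return contents
```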

Streaming

Streaming requests use the standard OpenAI pattern (see the note below on current support):

```python
stream = client.chat.completions.create(
    model="gemini-1.5-pro",
    messages=[{"role": "user", "content": "Tell me a story"}],
    stream=True
)

for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```

Note: Streaming translation from Google's format is currently in development. Non-streaming requests are fully supported.
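The translation the note refers to amounts to mapping each streamed Google response chunk (`candidates[].content.parts[].text`) onto an OpenAI-style `chat.completion.chunk` delta. A rough, purely illustrative sketch (ignoring finish reasons and usage data; `gemini_chunk_to_openai_delta` is a hypothetical name):

```python
def gemini_chunk_to_openai_delta(chunk: dict) -> dict:
    """Map one streamed Gemini response chunk onto an OpenAI-style
    chat.completion.chunk. Illustrative sketch, not Scrub's code."""
    text = chunk["candidates"][0]["content"]["parts"][0]["text"]
    return {
        "object": "chat.completion.chunk",
        "choices": [{"index": 0, "delta": {"content": text}}],
    }
```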