This integration is for the Respan gateway.
Overview
The OpenAI SDK provides the most robust integration method for accessing multiple model providers. Since most AI providers prioritize OpenAI SDK compatibility, you can seamlessly call all 250+ models available through the Respan platform gateway.
Quickstart
Step 1: Install OpenAI SDK
- Get a Respan API key
- Add your provider credentials
- Install the OpenAI SDK package (see the command below)
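For Python, installing the client is a single package (assuming pip is available):

```bash
pip install openai
```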
Step 2: Initialize Client
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="YOUR_RESPAN_API_KEY",  # Get from the Respan dashboard
)
```
Step 3: Make Your First Request
```python
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
```
Switch models
```python
# OpenAI GPT models
model = "gpt-4o"

# Anthropic and Google models work through the same client
# model = "claude-3-5-sonnet-20241022"
# model = "gemini-1.5-pro"

response = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Your message"}],
)
```
Supported parameters
OpenAI parameters
All standard OpenAI parameters are supported and can be passed directly in the request body.
```python
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a story"}],
    temperature=0.7,        # Control randomness
    max_tokens=1000,        # Limit response length
    top_p=0.9,              # Nucleus sampling
    frequency_penalty=0.1,  # Reduce repetition
    presence_penalty=0.1,   # Encourage topic diversity
    stream=True,            # Enable streaming
)
```
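Because `stream=True` is set, the call returns an iterator of chunks rather than a single completion object. A minimal sketch of consuming the stream, assuming the gateway returns OpenAI-compatible streaming chunks:

```python
# Print text deltas as they arrive from the stream
for chunk in response:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```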
Respan parameters
Respan-specific parameters can be passed through `extra_body` for request tracking, fallbacks, and customization.
```python
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a story"}],
    extra_body={
        "customer_identifier": "user_123",        # Track specific users
        "fallback_models": ["gpt-3.5-turbo"],     # Automatic fallbacks
        "metadata": {"session_id": "abc123"},     # Custom metadata
        "thread_identifier": "conversation_456",  # Group related messages
        "group_identifier": "team_alpha",         # Organize by groups
    },
)
```
Azure OpenAI
To call Azure OpenAI models, you don't need Azure OpenAI's dedicated client; the easier way is to use the standard OpenAI client with the Respan gateway:
1. Go to [Respan Providers](https://platform.respan.ai/platform/api/providers)
2. Add your Azure OpenAI credentials
3. Configure your Azure deployment settings
4. Use Azure models through the same Respan endpoint, as shown in the sketch below
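For example, a request might look like the following minimal sketch. The model identifier here is a placeholder: the actual name depends on the Azure deployment you configured in Respan.

```python
# Hypothetical deployment name; replace with the model/deployment name
# configured in your Respan provider settings.
response = client.chat.completions.create(
    model="my-azure-gpt-4o-deployment",
    messages=[{"role": "user", "content": "Hello from Azure!"}],
)
print(response.choices[0].message.content)
```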
View your analytics
Access your Respan dashboard to see detailed analytics.
Next Steps