# Model Providers
Every agent needs a model. All providers implement the same `ModelProvider` protocol, so you can swap providers without changing any agent code.
## Overview
| Provider | Module | Auth | Platform |
|---|---|---|---|
| AWS Bedrock | StrandsBedrockProvider | Cognito / IAM | macOS, iOS |
| Anthropic | StrandsAnthropicProvider | ANTHROPIC_API_KEY | macOS, iOS |
| OpenAI | StrandsOpenAIProvider | OPENAI_API_KEY | macOS, iOS |
| Google Gemini | StrandsGeminiProvider | GOOGLE_API_KEY | macOS, iOS |
| MLX (local) | StrandsMLXProvider | None | macOS Apple Silicon |
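Because every provider conforms to `ModelProvider`, switching backends is a one-line change at construction time. A minimal sketch (the `Strands` core import and the `any ModelProvider` existential are assumptions based on the protocol name above):

```swift
import Strands               // assumed core module exposing ModelProvider and Agent
import StrandsAnthropicProvider
import StrandsMLXProvider

// Pick a backend at startup; the agent code below is identical either way.
let useLocal = ProcessInfo.processInfo.environment["USE_LOCAL_MODEL"] == "1"

let provider: any ModelProvider = useLocal
    ? MLXProvider(modelId: "mlx-community/Qwen3-8B-4bit")
    : AnthropicProvider()    // reads ANTHROPIC_API_KEY

let agent = Agent(model: provider)
```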
## AWS Bedrock
Uses the AWS Bedrock ConverseStream API. Supports Claude, Titan, Llama, Mistral, and any model available in your AWS region.
**Development (IAM credentials from environment)**

```swift
import StrandsBedrockProvider

let provider = try BedrockProvider(config: BedrockConfig(
    modelId: "us.anthropic.claude-sonnet-4-20250514-v1:0",
    region: "us-east-1" // optional; defaults to AWS_DEFAULT_REGION
))
```
**Production (Cognito)**

```swift
import Amplify
import AWSCognitoAuthPlugin

try Amplify.add(plugin: AWSCognitoAuthPlugin())
try Amplify.configure()
try await Amplify.Auth.signIn(username: email, password: password)

// BedrockProvider picks up Cognito credentials automatically
let provider = try BedrockProvider(config: BedrockConfig(
    modelId: "us.anthropic.claude-sonnet-4-20250514-v1:0"
))
```
### Common model IDs
| Model | ID |
|---|---|
| Claude Sonnet 4 | us.anthropic.claude-sonnet-4-20250514-v1:0 |
| Claude Haiku 3.5 | us.anthropic.claude-3-5-haiku-20241022-v1:0 |
| Llama 3.3 70B | us.meta.llama3-3-70b-instruct-v1:0 |
## Anthropic
Direct calls to the Anthropic Messages API. No additional AWS dependencies.
```swift
import StrandsAnthropicProvider

// From an explicit key
let provider = AnthropicProvider(config: AnthropicConfig(
    apiKey: "sk-ant-...",
    model: "claude-sonnet-4-5-20251001"
))

// From the ANTHROPIC_API_KEY environment variable
let provider = AnthropicProvider()
```
## OpenAI
Uses the OpenAI Chat Completions API. Compatible with any OpenAI-compatible endpoint (Groq, Together AI, local Ollama, etc.).
```swift
import StrandsOpenAIProvider

// OpenAI
let provider = OpenAIProvider(config: OpenAIConfig(
    apiKey: "sk-...",
    model: "gpt-4o"
))

// Groq (OpenAI-compatible)
let provider = OpenAIProvider(config: OpenAIConfig(
    apiKey: "gsk_...",
    model: "llama-3.3-70b-versatile",
    baseURL: URL(string: "https://api.groq.com/openai/v1")!
))

// From the OPENAI_API_KEY environment variable
let provider = OpenAIProvider()
```
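The same config also reaches a local Ollama server through its OpenAI-compatible endpoint. A sketch assuming Ollama's default port and path, and that `ollama serve` is running with the model pulled (Ollama ignores the API key, but a placeholder value is passed in case the initializer requires one):

```swift
import StrandsOpenAIProvider

// Local Ollama (OpenAI-compatible); assumes `ollama pull llama3.2` was run
let provider = OpenAIProvider(config: OpenAIConfig(
    apiKey: "ollama", // placeholder; Ollama does not check the key
    model: "llama3.2",
    baseURL: URL(string: "http://localhost:11434/v1")!
))
```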
## Google Gemini
```swift
import StrandsGeminiProvider

let provider = GeminiProvider(config: GeminiConfig(
    apiKey: "AIza...",
    model: "gemini-2.0-flash"
))

// From the GOOGLE_API_KEY environment variable
let provider = GeminiProvider()
```
## MLX (Local)
Runs a quantized language model entirely on Apple Silicon. See the Local Inference page for full details including recommended models and hybrid routing.
```swift
import StrandsMLXProvider

let provider = MLXProvider(modelId: "mlx-community/Qwen3-8B-4bit")
let agent = Agent(model: provider)
```
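One pattern the Local Inference page describes is hybrid routing. A hedged sketch of the simplest variant, choosing the backend at build time (the `#if` check and the `any ModelProvider` existential are assumptions, not the SDK's routing API):

```swift
import StrandsMLXProvider
import StrandsAnthropicProvider

// Prefer the on-device model on Apple Silicon macOS; otherwise use Anthropic.
func makeProvider() -> any ModelProvider {
    #if arch(arm64) && os(macOS)
    return MLXProvider(modelId: "mlx-community/Qwen3-8B-4bit")
    #else
    return AnthropicProvider() // reads ANTHROPIC_API_KEY
    #endif
}

let agent = Agent(model: makeProvider())
```

Runtime routing (for example, falling back to the cloud when a local model is still downloading) is covered on the Local Inference page.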