Model Providers

Every agent needs a model. All providers implement the same ModelProvider protocol, so you can swap providers without changing any agent code.
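A minimal sketch of that interchangeability, assuming the core module is named Strands and exports the ModelProvider protocol and the Agent(model:) initializer shown in the MLX example at the end of this page:

```swift
import Strands
import StrandsAnthropicProvider
import StrandsOpenAIProvider

// Both providers conform to ModelProvider, so either value
// can back the same agent without any other code changes.
let anthropic: any ModelProvider = AnthropicProvider()  // reads ANTHROPIC_API_KEY
let openai: any ModelProvider = OpenAIProvider()        // reads OPENAI_API_KEY

// Agent code is identical regardless of which provider you pass in.
let agent = Agent(model: anthropic)
```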

Overview

Provider        Module                    Auth                  Platform
AWS Bedrock     StrandsBedrockProvider    Cognito / IAM         macOS, iOS
Anthropic       StrandsAnthropicProvider  ANTHROPIC_API_KEY     macOS, iOS
OpenAI          StrandsOpenAIProvider     OPENAI_API_KEY        macOS, iOS
Google Gemini   StrandsGeminiProvider     GOOGLE_API_KEY        macOS, iOS
MLX (local)     StrandsMLXProvider        None                  macOS Apple Silicon

AWS Bedrock

Uses the AWS Bedrock ConverseStream API. Supports Claude, Titan, Llama, Mistral, and any model available in your AWS region.

Development (IAM credentials from environment)

Swift
import StrandsBedrockProvider

let provider = try BedrockProvider(config: BedrockConfig(
    modelId: "us.anthropic.claude-sonnet-4-20250514-v1:0",
    region: "us-east-1"   // optional, defaults to AWS_DEFAULT_REGION
))
Production (Cognito)

Swift
import Amplify
import AWSCognitoAuthPlugin

try Amplify.add(plugin: AWSCognitoAuthPlugin())
try Amplify.configure()
try await Amplify.Auth.signIn(username: email, password: password)

// BedrockProvider picks up Cognito credentials automatically
let provider = try BedrockProvider(config: BedrockConfig(
    modelId: "us.anthropic.claude-sonnet-4-20250514-v1:0"
))

Common model IDs

Model              ID
Claude Sonnet 4    us.anthropic.claude-sonnet-4-20250514-v1:0
Claude 3.5 Haiku   us.anthropic.claude-3-5-haiku-20241022-v1:0
Llama 3.3 70B      us.meta.llama3-3-70b-instruct-v1:0

Anthropic

Direct calls to the Anthropic Messages API. No additional AWS dependencies.

Swift
import StrandsAnthropicProvider

// From an explicit key
let provider = AnthropicProvider(config: AnthropicConfig(
    apiKey: "sk-ant-...",
    model: "claude-sonnet-4-5-20251001"
))

// From environment variable ANTHROPIC_API_KEY
let provider = AnthropicProvider()

OpenAI

Uses the OpenAI Chat Completions API. Compatible with any OpenAI-compatible endpoint (Groq, Together AI, local Ollama, etc.).

Swift
import StrandsOpenAIProvider

// OpenAI
let provider = OpenAIProvider(config: OpenAIConfig(
    apiKey: "sk-...",
    model: "gpt-4o"
))

// Groq (OpenAI-compatible)
let provider = OpenAIProvider(config: OpenAIConfig(
    apiKey: "gsk_...",
    model: "llama-3.3-70b-versatile",
    baseURL: URL(string: "https://api.groq.com/openai/v1")!
))

// From environment variable OPENAI_API_KEY
let provider = OpenAIProvider()
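
The same pattern reaches a local Ollama server. This is a sketch assuming OpenAIConfig accepts the same parameters as in the Groq example above, that Ollama is running on its default port 11434, and that a model such as llama3.3 has already been pulled:

```swift
import StrandsOpenAIProvider

// Ollama exposes an OpenAI-compatible endpoint at /v1 on localhost.
// Ollama does not check the API key, but a placeholder is passed here
// on the assumption that the config requires one.
let provider = OpenAIProvider(config: OpenAIConfig(
    apiKey: "ollama",
    model: "llama3.3",
    baseURL: URL(string: "http://localhost:11434/v1")!
))
```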

Google Gemini

Swift
import StrandsGeminiProvider

let provider = GeminiProvider(config: GeminiConfig(
    apiKey: "AIza...",
    model: "gemini-2.0-flash"
))

// From environment variable GOOGLE_API_KEY
let provider = GeminiProvider()

MLX (Local)

Runs a quantized language model entirely on Apple Silicon. See the Local Inference page for full details including recommended models and hybrid routing.

Swift
import StrandsMLXProvider

let provider = MLXProvider(modelId: "mlx-community/Qwen3-8B-4bit")
let agent = Agent(model: provider)