
Provider Setup



AgentForce ADK supports multiple AI providers, giving you flexibility to choose between local and cloud-based models based on your needs.

Ollama

Local · Free · Privacy

Best for:

  • Development and testing
  • Privacy-sensitive applications
  • Offline environments
  • Cost-effective solutions

OpenRouter

Cloud · Multiple providers · API key required

Best for:

  • Production applications
  • Access to latest models
  • Scalable solutions
  • Multiple model providers

Ollama allows you to run large language models locally on your machine.

```shell
# Install Ollama using Homebrew
brew install ollama

# Start Ollama service
brew services start ollama

# Verify installation
ollama --version
```

Pull models based on your hardware and use case:

```shell
# Fast and efficient models
ollama pull phi4-mini-reasoning   # 3.8B params, great small thinking model
ollama pull gemma3:4b             # 4B params, balanced performance
ollama pull llama3.2:3b           # 3B params, Meta's model
```

Test your Ollama installation:

```shell
# Check if Ollama is running
ollama list

# Test a model
ollama run gemma3:12b "Hello, how are you?"

# Check Ollama API
curl http://localhost:11434/api/tags
```
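The same API check can also be done from TypeScript, which is handy for scripted health checks. A minimal sketch assuming Ollama's default port 11434; the helper `listOllamaModels` is hypothetical, not part of the ADK:

```typescript
// List locally installed Ollama models via the REST API (default port 11434).
async function listOllamaModels(baseUrl = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama API returned ${res.status}`);
  // /api/tags responds with { "models": [{ "name": "gemma3:12b", ... }, ...] }
  const data = (await res.json()) as { models?: { name: string }[] };
  return (data.models ?? []).map((m) => m.name);
}

try {
  console.log(await listOllamaModels());
} catch {
  console.log("Ollama not reachable on localhost:11434");
}
```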
Once a model is pulled, use it from an agent:

```typescript
import { AgentForceAgent } from '@agentforce/adk';

const agent = new AgentForceAgent({ name: "LocalAgent" })
  .useLLM("ollama", "gemma3:12b") // provider: "ollama", model: "gemma3:12b"
  .systemPrompt("You are a helpful assistant running locally")
  .prompt("What are the benefits of local AI models?");

const response = await agent.output("text");
console.log(response);
```

OpenRouter provides access to multiple AI providers through a single API.

  1. Visit OpenRouter.ai
  2. Sign up for a free account
  3. Navigate to API Keys
  4. Create a new API key
```shell
# Add to your shell profile (.bashrc, .zshrc, etc.)
export OPENROUTER_API_KEY="sk-or-v1-your-api-key-here"

# Or add to .env file
echo "OPENROUTER_API_KEY=sk-or-v1-your-api-key-here" >> .env
```
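Before wiring the key into an agent, it can help to confirm the variable is actually visible to your process and well-formed. A minimal sketch; the helper `checkOpenRouterKey` is hypothetical, not part of the ADK:

```typescript
// Hypothetical helper: verify the OpenRouter key is set and looks well-formed.
// OpenRouter keys start with "sk-or-v1-" (see the export above).
function checkOpenRouterKey(key: string | undefined): string | null {
  if (!key) return "OPENROUTER_API_KEY is not set";
  if (!key.startsWith("sk-or-v1-")) return "key does not start with sk-or-v1-";
  return null; // key looks OK
}

const problem = checkOpenRouterKey(process.env.OPENROUTER_API_KEY);
if (problem) console.error(`OpenRouter setup issue: ${problem}`);
```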

OpenRouter provides access to models from multiple providers:

🔥 Free Models

```typescript
// Great for development and testing
.useLLM("openrouter", "moonshotai/kimi-k2:free")
.useLLM("openrouter", "openai/gpt-oss-20b:free")
.useLLM("openrouter", "z-ai/glm-4.5-air:free")
```

🚀 OpenAI Models

```typescript
// OpenAI's latest models
.useLLM("openrouter", "openai/gpt-5-chat")
.useLLM("openrouter", "openai/gpt-5-mini")
.useLLM("openrouter", "openai/o3")
```

🧠 Anthropic Models

```typescript
// Claude models
.useLLM("openrouter", "anthropic/claude-opus-4.1")
.useLLM("openrouter", "anthropic/claude-opus-4")
.useLLM("openrouter", "anthropic/claude-sonnet-4")
```

🦙 Meta Models

```typescript
// Llama models
.useLLM("openrouter", "meta-llama/llama-4-maverick")
.useLLM("openrouter", "meta-llama/llama-4-scout")
.useLLM("openrouter", "meta-llama/llama-3.3-70b-instruct")
```
Use any of these models in your agent:

```typescript
import { AgentForceAgent } from '@agentforce/adk';

const agent = new AgentForceAgent({ name: "CloudAgent" })
  .useLLM("openrouter", "z-ai/glm-4.5v") // provider: "openrouter", model: "z-ai/glm-4.5v"
  .systemPrompt("You are a cloud-powered AI assistant")
  .prompt("What are the advantages of cloud-based AI models?");

const response = await agent.output("text");
console.log(response);
```

Monitor your OpenRouter usage:

  1. Check your OpenRouter Settings
  2. Set up budget alerts
  3. Use free models for development
  4. Consider model costs for production
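One way to act on the last two points is to select the model from the environment, so development defaults to a free model. A sketch; `pickOpenRouterModel` is hypothetical, and the model names are examples from the lists above:

```typescript
// Hypothetical helper: free model while developing, paid model in production.
function pickOpenRouterModel(env: string | undefined): string {
  return env === "production"
    ? "openai/gpt-5-mini"          // paid model for production traffic
    : "moonshotai/kimi-k2:free";   // free model for development and testing
}

const model = pickOpenRouterModel(process.env.NODE_ENV);
// ...then: .useLLM("openrouter", model)
```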
| Feature     | Ollama (Local)        | OpenRouter (Cloud)  |
| ----------- | --------------------- | ------------------- |
| Cost        | Free                  | Pay-per-use         |
| Privacy     | Complete              | Provider-dependent  |
| Latency     | Low (local)           | Network-dependent   |
| Models      | Limited selection     | 100+ models         |
| Setup       | Installation required | API key only        |
| Offline     | ✅ Yes                | ❌ No               |
| Scalability | Hardware-limited      | Unlimited           |
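These trade-offs suggest a simple pattern: prefer Ollama when its API is reachable, and fall back to OpenRouter otherwise. A sketch; the fallback helper is hypothetical, not an ADK feature, and assumes Ollama's default port 11434:

```typescript
// Hypothetical provider selection: use Ollama if its API responds, else OpenRouter.
async function pickProvider(): Promise<["ollama" | "openrouter", string]> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    if (res.ok) return ["ollama", "gemma3:12b"];
  } catch {
    // Ollama not reachable; fall through to the cloud provider.
  }
  return ["openrouter", "z-ai/glm-4.5-air:free"];
}

const [provider, model] = await pickProvider();
// ...then: .useLLM(provider, model)
```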

Ollama not starting:

```shell
# Check if Ollama is running
ps aux | grep ollama

# Start Ollama manually
ollama serve

# Check server logs (default location on macOS)
cat ~/.ollama/logs/server.log
```

Model not found:

```shell
# List available models
ollama list

# Pull missing model
ollama pull gemma3:12b

# Check model info
ollama show gemma3:12b
```

Connection refused:

```shell
# Check if Ollama API is accessible
curl http://localhost:11434/api/tags

# Restart Ollama service
brew services restart ollama    # macOS
sudo systemctl restart ollama   # Linux
```

API key not working:

  • Verify key at OpenRouter Keys
  • Check environment variable: echo $OPENROUTER_API_KEY
  • Ensure key starts with sk-or-v1-

Rate limiting:

  • Check your usage at OpenRouter Activity
  • Consider upgrading your plan
  • Use free models for development

Model not available:

  • Check OpenRouter Models for availability
  • Some models require approval or higher tier plans

Now that your AI providers are set up, you're ready to put them to work:

  1. Quick Start - Create your first agent
  2. Server Mode - Run agents in server mode as endpoints
  3. Basic Agents Guide - Learn agent fundamentals

Great! You now have AI providers configured and ready to power your AgentForce ADK agents. Choose Ollama for local development and privacy, or OpenRouter for access to the latest cloud models.