
Quick Start


5 minutes · Beginner friendly

Get up and running with AgentForce ADK in just a few minutes. This guide will walk you through creating your first AI agent.

Let’s create a simple conversational agent:


my-first-agent.ts
import { AgentForceAgent } from '@agentforce/adk';

// Create agent configuration
const agentConfig = {
  name: "MyFirstAgent"
};

// Create and configure your agent
const agent = new AgentForceAgent(agentConfig)
  .useLLM("ollama", "gemma3:12b")
  .systemPrompt("You are a helpful AI assistant specialized in TypeScript development")
  .prompt("Hello! Can you help me understand TypeScript interfaces?");

// Generate response
const textResponse = await agent.output("text");
console.log("Agent Response:", textResponse);

Run your agent:

bun run my-first-agent.ts

Let’s break down what’s happening:

const agentConfig = {
  name: "MyFirstAgent"
};

Every agent needs a name for identification and logging.

Learn more about Agent Configuration.

const agent = new AgentForceAgent(agentConfig)
  .useLLM("ollama", "gemma3:12b")
  .systemPrompt("You are a helpful AI assistant")
  .prompt("Hello! Tell me a joke");

AgentForce ADK uses method chaining for intuitive configuration:

  • .useLLM() - Configure the AI provider and model
  • .systemPrompt() - Set the system instructions
  • .prompt() - Set the user input

All Agent Methods are documented in the Agent Reference.
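
Because each configuration method returns the agent instance (an assumption based on the chained style shown above, not something stated explicitly here), you can also configure an agent step by step. A minimal sketch:

import { AgentForceAgent } from '@agentforce/adk';

// Equivalent to the chained form above, assuming each configuration
// method returns the agent instance (as the chaining style implies).
const jokeAgent = new AgentForceAgent({ name: "JokeAgent" })
  .useLLM("ollama", "gemma3:12b");

jokeAgent
  .systemPrompt("You are a helpful AI assistant")
  .prompt("Hello! Tell me a joke");

const joke = await jokeAgent.output("text");
console.log(joke);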

To generate a response, call .output() with the desired format:

const agent = new AgentForceAgent(config)
  .useLLM("ollama", "gemma3:12b")
  .systemPrompt("You are a helpful assistant")
  .prompt("Explain quantum computing in simple terms");

const textResponse = await agent.output("text"); // Get formatted output
console.log(textResponse);

Ollama (Local)

.useLLM("ollama", "gemma3:12b")
.useLLM("ollama", "phi4-mini:latest")
.useLLM("ollama", "llama3.2:latest")

OpenRouter (Cloud)

.useLLM("openrouter", "openai/gpt-4")
.useLLM("openrouter", "anthropic/claude-3-sonnet")
.useLLM("openrouter", "meta-llama/llama-3.1-8b-instruct")

Here’s a more comprehensive example that creates a story generator agent:

import { AgentForceAgent } from '@agentforce/adk';

async function createStoryGenerator() {
  const storyAgent = new AgentForceAgent({
    name: "StoryGenerator"
  })
    .useLLM("ollama", "gemma3:12b")
    .systemPrompt(`
      You are a creative writing assistant.
      Create engaging short stories with:
      - Compelling characters
      - Clear plot structure
      - Vivid descriptions
      - Satisfying conclusions
      Keep stories under 500 words.
    `)
    .prompt("Write a short story about a robot who learns to paint");

  // Generate the story as text
  const story = await storyAgent.output("text");
  console.log("Generated Story:\n", story);
  return story;
}

// Run the story generator
createStoryGenerator().catch(console.error);
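
Instead of only logging the story, you could persist it to disk. A minimal sketch using Bun's file API (story.txt is just an illustrative filename):

// Write the generated story to a file instead of only logging it
createStoryGenerator()
  .then((story) => Bun.write("story.txt", story))
  .catch(console.error);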

Always include proper error handling in your agents:

import { AgentForceAgent } from '@agentforce/adk';

async function robustAgent() {
  try {
    const agent = new AgentForceAgent({
      name: "RobustAgent"
    })
      .useLLM("ollama", "gemma3:12b")
      .systemPrompt("You are a helpful assistant")
      .prompt("Hello world!");

    const response = await agent.output("text");
    return response;
  } catch (error) {
    console.error("Agent execution failed:", error);
    return "Sorry, I encountered an error. Please try again.";
  }
}

// Run the robust agent
robustAgent().then(response => {
  console.log("Agent Response:", response);
});

Now that your first agent is up and running, here are some good next steps:

  1. Server Mode - Run your agents as server endpoints
  2. Basic Agents Guide - Learn agent fundamentals
  3. Advanced Agents Guide - Explore complex agent configurations

Congratulations! You’ve successfully created your first AgentForce ADK agent. The framework’s method chaining approach makes it easy to build powerful AI-powered applications with minimal code.