# AgentForceServer

The `AgentForceServer` class provides HTTP server functionality for serving agents and workflows as web services. It is built on the Hono framework for lightweight, fast HTTP handling.

The constructor accepts a ServerConfig object with the following properties:

  • name (string): Required name identifier for the server
  • logger? (AgentForceLogger): Optional custom logger instance
Configuration methods:

| Method | Type | Required | Description |
| --- | --- | --- | --- |
| `.addRoute()` | Chainable | Optional | Add static routes that return predefined data |
| `.addRouteAgent()` | Chainable | Optional | Add agent-powered routes with optional schema validation |
| `.addFormTrigger()` | Chainable | Optional | Add HTML form triggers that execute agents on form submission |
| `.addWorkflowTrigger()` | Chainable | Optional | Add workflow triggers that execute workflow files |
Compatibility routing methods:

| Method | Type | Required | Description |
| --- | --- | --- | --- |
| `.useOpenAICompatibleRouting()` | Chainable | Optional | Enable OpenAI API compatible endpoints for drop-in replacement |
| `.useOllamaCompatibleRouting()` | Chainable | Optional | Enable Ollama API compatible endpoints for local model serving |
Execution method:

| Method | Type | Required | Description |
| --- | --- | --- | --- |
| `.serve()` | Execute (Async) | Required | Start the HTTP server with optional host and port configuration |
  • Chainable: Returns the server instance for method chaining
  • Terminal: Starts the server (ends the chain)
  • Async: Returns a Promise and must be awaited

Define input validation and output structure for agent-powered routes:

```typescript
interface RouteAgentSchema {
    input: string[];  // Required input fields from request body
    output: string[]; // Fields to include in response
}
```
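As a rough sketch of what schema-based input validation implies (this is not the library's actual implementation; `missingInputFields` is a hypothetical helper), a server could reject a request body that lacks any field listed in `input`:

```typescript
interface RouteAgentSchema {
    input: string[];  // required input fields from the request body
    output: string[]; // fields to include in the response
}

// Hypothetical helper: returns the names of required fields missing from the body.
function missingInputFields(
    schema: RouteAgentSchema,
    body: Record<string, unknown>
): string[] {
    return schema.input.filter((field) => body[field] === undefined);
}

const userStorySchema: RouteAgentSchema = {
    input: ["prompt", "persona"],
    output: ["success", "persona", "prompt", "response"],
};

// Body missing "persona" would fail validation; a complete body passes.
const missing = missingInputFields(userStorySchema, { prompt: "As a user..." });
const none = missingInputFields(userStorySchema, { prompt: "x", persona: "PO" });
```

A route handler would typically respond with a 400 status listing the missing fields before ever invoking the agent.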
A complete example serving an agent behind a schema-validated endpoint:

```typescript
import { AgentForceAgent, AgentForceServer, type RouteAgentSchema } from '@agentforce/adk';

// Create an agent
const userStoryAgent = new AgentForceAgent({ name: "StoryCreationAgent" })
    .useLLM("ollama", "gemma3:12b")
    .systemPrompt("You are a Product Owner agent. You will respond with a user story.")
    .withTemplate("examples/templates/user-story.md");

// Define schema for custom endpoints
const userStorySchema: RouteAgentSchema = {
    input: ["prompt", "persona"],
    output: ["success", "persona", "prompt", "response"]
};

// Create and configure server
new AgentForceServer({ name: "StoryCreationServer" })
    // Add static routes
    .addRoute("GET", "/health", { status: "ok" })
    // Add custom routes with schemas
    .addRouteAgent("POST", "/create-user-story", userStoryAgent, userStorySchema)
    // Start server with default host and port (0.0.0.0:3000)
    .serve();
```
Enabling OpenAI-compatible routing exposes the standard chat completions endpoint, so existing OpenAI clients can point at the server unchanged:

```typescript
import { AgentForceAgent, AgentForceServer } from '@agentforce/adk';

const chatAgent = new AgentForceAgent({ name: "ChatAgent" })
    .useLLM("ollama", "gemma3:12b")
    .systemPrompt("You are a helpful assistant.");

new AgentForceServer({ name: "OpenAICompatibleServer" })
    .useOpenAICompatibleRouting(chatAgent)
    .serve("localhost", 8080);

// Now accessible via OpenAI SDK:
// POST /v1/chat/completions
```
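For illustration, a client would send the standard OpenAI chat-completions request shape to that endpoint. The helper below is a hypothetical sketch; the `model` and `messages` fields follow the public OpenAI API format, not anything AgentForce-specific:

```typescript
// Minimal OpenAI-style chat completions request payload.
interface ChatMessage {
    role: "system" | "user" | "assistant";
    content: string;
}

interface ChatCompletionRequest {
    model: string;
    messages: ChatMessage[];
}

// Hypothetical helper: builds the JSON body for POST /v1/chat/completions.
function buildChatRequest(model: string, userPrompt: string): ChatCompletionRequest {
    return {
        model,
        messages: [{ role: "user", content: userPrompt }],
    };
}

const body = buildChatRequest("gemma3:12b", "Write a haiku about servers.");

// A client would then send it, e.g.:
// await fetch("http://localhost:8080/v1/chat/completions", {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify(body),
// });
```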
Ollama-compatible routing exposes the Ollama generate endpoint, here served on Ollama's default port:

```typescript
import { AgentForceAgent, AgentForceServer } from '@agentforce/adk';

const localAgent = new AgentForceAgent({ name: "LocalAgent" })
    .useLLM("ollama", "gemma3:12b");

new AgentForceServer({ name: "OllamaCompatibleServer" })
    .useOllamaCompatibleRouting(localAgent)
    .serve("0.0.0.0", 11434);

// Now accessible via Ollama API:
// POST /api/generate
```
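Again for illustration, a client would call that endpoint with the public Ollama API request shape. The helper below is a hypothetical sketch assuming the standard `model`/`prompt`/`stream` fields of Ollama's `/api/generate`:

```typescript
// Minimal Ollama-style generate request payload (public Ollama API shape).
interface GenerateRequest {
    model: string;
    prompt: string;
    stream: boolean;
}

// Hypothetical helper: builds the JSON body for POST /api/generate.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
    return { model, prompt, stream: false }; // stream: false requests a single JSON response
}

const req = buildGenerateRequest("gemma3:12b", "Summarize the Hono framework.");

// A client would then send it, e.g.:
// await fetch("http://localhost:11434/api/generate", {
//     method: "POST",
//     headers: { "Content-Type": "application/json" },
//     body: JSON.stringify(req),
// });
```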
Multiple agents and static routes can be combined on a single server:

```typescript
import { AgentForceAgent, AgentForceServer } from '@agentforce/adk';

// Create multiple agents
const codeAgent = new AgentForceAgent({ name: "CodeReviewer" })
    .useLLM("ollama", "gemma3:12b")
    .systemPrompt("You are a code review assistant.");

const docsAgent = new AgentForceAgent({ name: "DocumentationAgent" })
    .useLLM("ollama", "gemma3:12b")
    .systemPrompt("You generate documentation from code.");

// Configure server with multiple routes
new AgentForceServer({ name: "MultiServiceServer" })
    .addRoute("GET", "/health", { status: "healthy" })
    .addRoute("GET", "/version", { version: "1.0.0" })
    .addRouteAgent("POST", "/review-code", codeAgent)
    .addRouteAgent("POST", "/generate-docs", docsAgent)
    .serve("localhost", 3000);
```