# AgentForceServer
The `AgentForceServer` class provides HTTP server functionality for serving agents and workflows as web services. It is built on the Hono framework for lightweight, fast HTTP handling.
## ServerConfig

The constructor accepts a `ServerConfig` object with the following properties:
- `name` (string): Required name identifier for the server
- `logger?` (AgentForceLogger): Optional custom logger instance
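Assuming only the two properties above, the config shape can be sketched as follows. The `AgentForceLogger` stand-in here is a minimal illustrative interface, not the package's actual type:

```typescript
// Minimal stand-in for the package's logger type (assumption for illustration).
interface AgentForceLogger {
  info(message: string): void;
  error(message: string): void;
}

// Sketch of the ServerConfig shape described above.
interface ServerConfig {
  name: string;              // Required name identifier for the server
  logger?: AgentForceLogger; // Optional custom logger instance
}

// A config with only the required field is valid:
const config: ServerConfig = { name: "StoryCreationServer" };
```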
## Route Management Methods

| Method | Type | Required | Description |
| --- | --- | --- | --- |
| `.addRoute()` | Chainable | Optional | Add static routes that return predefined data |
| `.addRouteAgent()` | Chainable | Optional | Add agent-powered routes with optional schema validation |
| `.addFormTrigger()` | Chainable | Optional | Add HTML form triggers that execute agents on form submission |
| `.addWorkflowTrigger()` | Chainable | Optional | Add workflow triggers that execute workflow files |
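The form-trigger idea can be illustrated with a small self-contained stub: a GET handler renders an HTML form, and the POST handler feeds the submitted fields to an agent. This is a conceptual sketch only; the actual `.addFormTrigger()` signature and internals belong to the library:

```typescript
// Conceptual stub of an agent: the real one would call an LLM.
type AgentFn = (prompt: string) => string;
const echoAgent: AgentFn = (prompt) => `User story for: ${prompt}`;

// GET: render a simple HTML form posting back to the same path.
function renderForm(path: string): string {
  return `<form method="POST" action="${path}">
  <input name="prompt" />
  <button type="submit">Run agent</button>
</form>`;
}

// POST: execute the agent with the submitted form field.
function handleSubmit(agent: AgentFn, form: Record<string, string>): string {
  return agent(form["prompt"] ?? "");
}

const html = renderForm("/create-user-story");
const result = handleSubmit(echoAgent, { prompt: "login page" });
```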
## Compatibility Methods

| Method | Type | Required | Description |
| --- | --- | --- | --- |
| `.useOpenAICompatibleRouting()` | Chainable | Optional | Enable OpenAI API compatible endpoints for drop-in replacement |
| `.useOllamaCompatibleRouting()` | Chainable | Optional | Enable Ollama API compatible endpoints for local model serving |
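For orientation, an OpenAI-compatible endpoint accepts the standard chat-completions request body, which is why existing OpenAI clients can point at the server unchanged. A sketch of the payload such a client posts to `/v1/chat/completions` (field names follow the OpenAI wire format; the model name is illustrative):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// The request body an OpenAI-style client posts to /v1/chat/completions.
interface ChatCompletionRequest {
  model: string;
  messages: ChatMessage[];
}

const body: ChatCompletionRequest = {
  model: "gemma3:12b",
  messages: [{ role: "user", content: "Hello!" }],
};

const payload = JSON.stringify(body);
```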
## Execution Methods

| Method | Type | Required | Description |
| --- | --- | --- | --- |
| `.serve()` | Terminal, Async | Required | Start the HTTP server with optional host and port configuration |
## Method Types

- Chainable: Returns the server instance for method chaining
- Terminal: Starts the server (ends the chain)
- Async: Returns a Promise and must be awaited
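The chaining contract can be sketched with a minimal stub: chainable methods record their configuration and return `this`, while the terminal `serve()` is async and ends the chain. This illustrates the pattern only, not the library's implementation:

```typescript
class MiniServer {
  routes: Array<{ method: string; path: string }> = [];

  // Chainable: records the route and returns the instance.
  addRoute(method: string, path: string): this {
    this.routes.push({ method, path });
    return this;
  }

  // Terminal + async: the real server would bind a listener here.
  async serve(host = "0.0.0.0", port = 3000): Promise<string> {
    return `listening on ${host}:${port} with ${this.routes.length} route(s)`;
  }
}

const server = new MiniServer()
  .addRoute("GET", "/health")
  .addRoute("GET", "/version");
```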
## Route Agent Schema

Define input validation and output structure for agent-powered routes:
```typescript
interface RouteAgentSchema {
  input: string[];  // Required input fields from request body
  output: string[]; // Fields to include in response
}
```

## Quick Start Example
```typescript
import { AgentForceAgent, AgentForceServer, type RouteAgentSchema } from '@agentforce/adk';

// Create an agent
const userStoryAgent = new AgentForceAgent({ name: "StoryCreationAgent" })
  .useLLM("ollama", "gemma3:12b")
  .systemPrompt("You are a Product Owner agent. You will respond with a user story.")
  .withTemplate("examples/templates/user-story.md");

// Define schema for custom endpoints
const userStorySchema: RouteAgentSchema = {
  input: ["prompt", "persona"],
  output: ["success", "persona", "prompt", "response"]
};

// Create and configure server
new AgentForceServer({ name: "StoryCreationServer" })
  // Add static routes
  .addRoute("GET", "/health", { status: "ok" })
  // Add custom routes with schemas
  .addRouteAgent("POST", "/create-user-story", userStoryAgent, userStorySchema)
  // Start server with default host and port (0.0.0.0:3000)
  .serve();
```

## Advanced Usage
### OpenAI Compatibility
```typescript
import { AgentForceAgent, AgentForceServer } from '@agentforce/adk';

const chatAgent = new AgentForceAgent({ name: "ChatAgent" })
  .useLLM("ollama", "gemma3:12b")
  .systemPrompt("You are a helpful assistant.");

new AgentForceServer({ name: "OpenAICompatibleServer" })
  .useOpenAICompatibleRouting(chatAgent)
  .serve("localhost", 8080);

// Now accessible via OpenAI SDK:
// POST /v1/chat/completions
```

### Ollama Compatibility
```typescript
import { AgentForceAgent, AgentForceServer } from '@agentforce/adk';

const localAgent = new AgentForceAgent({ name: "LocalAgent" })
  .useLLM("ollama", "gemma3:12b");

new AgentForceServer({ name: "OllamaCompatibleServer" })
  .useOllamaCompatibleRouting(localAgent)
  .serve("0.0.0.0", 11434);

// Now accessible via Ollama API:
// POST /api/generate
```

### Multiple Routes
```typescript
import { AgentForceAgent, AgentForceServer } from '@agentforce/adk';

// Create multiple agents
const codeAgent = new AgentForceAgent({ name: "CodeReviewer" })
  .useLLM("ollama", "gemma3:12b")
  .systemPrompt("You are a code review assistant.");

const docsAgent = new AgentForceAgent({ name: "DocumentationAgent" })
  .useLLM("ollama", "gemma3:12b")
  .systemPrompt("You generate documentation from code.");

// Configure server with multiple routes
new AgentForceServer({ name: "MultiServiceServer" })
  .addRoute("GET", "/health", { status: "healthy" })
  .addRoute("GET", "/version", { version: "1.0.0" })
  .addRouteAgent("POST", "/review-code", codeAgent)
  .addRouteAgent("POST", "/generate-docs", docsAgent)
  .serve("localhost", 3000);
```
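Once a server like the one above is running, each route is a plain HTTP endpoint. As a sketch, a client could call the code-review route as follows (the `prompt` field name is an assumption for illustration; the fields an agent route actually expects depend on its `RouteAgentSchema`):

```typescript
// Build the request a client would send to the /review-code route.
// The "prompt" field name is assumed for illustration.
const reviewRequest = new Request("http://localhost:3000/review-code", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ prompt: "function add(a, b) { return a - b; }" }),
});

// Sending it is one line once the server is running:
// const response = await fetch(reviewRequest);
// const review = await response.json();
```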