Centralized, type-safe configuration for the IBM i Agent Infrastructure. The system uses two configuration files:
- `.env` - API keys, database credentials, MCP connection (sensitive data)
- `config.yaml` - Agent behavior, model selection, UI settings (version controlled)
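As a sketch of what "type-safe" loading of the `.env` values can look like, here is a minimal dataclass-based loader. The `MCPConfig` class and its `from_env` helper are illustrative, not the project's actual schema; only the `MCP_URL` and `MCP_TRANSPORT` variable names come from the example file below.

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class MCPConfig:
    """Typed view of the MCP-related .env settings (hypothetical schema)."""
    url: str
    transport: str

    @classmethod
    def from_env(cls) -> "MCPConfig":
        # Fail fast with a clear error if a required variable is missing
        url = os.environ.get("MCP_URL")
        if url is None:
            raise RuntimeError("MCP_URL is required (see infra/.env.example)")
        return cls(
            url=url,
            transport=os.environ.get("MCP_TRANSPORT", "streamable-http"),
        )

# Example: simulate a loaded .env, then build the typed config
os.environ["MCP_URL"] = "http://ibmi-mcp-server:3010/mcp"
cfg = MCPConfig.from_env()
print(cfg.url)
```

Reading everything through one frozen dataclass keeps a single place where defaults and required-variable errors live, instead of scattering `os.environ` lookups through the code.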
Create infra/.env with your API keys and connection settings:
```bash
cp infra/.env.example infra/.env
```

```bash
# ============================================================
# MCP Server (Required) - IBM i database access
# ============================================================
# The MCP server runs automatically in Docker Compose
# Use ibmi-mcp-server:3010 for container-to-container communication
MCP_URL=http://ibmi-mcp-server:3010/mcp
MCP_TRANSPORT=streamable-http

# ============================================================
# AI Model Provider (Choose at least one)
# ============================================================
# Option 1: watsonx (IBM Cloud)
# Get keys from: https://cloud.ibm.com
WATSONX_API_KEY=your_ibm_cloud_api_key
WATSONX_PROJECT_ID=your_project_id
WATSONX_URL=https://us-south.ml.cloud.ibm.com
WATSONX_MODEL_ID=meta-llama/llama-3-3-70b-instruct

# Option 2: OpenAI
# Get key from: https://platform.openai.com/api-keys
OPENAI_API_KEY=sk-your_openai_key

# Option 3: Anthropic
# Get key from: https://console.anthropic.com
ANTHROPIC_API_KEY=sk-your_anthropic_key
```

Note: The default model is `anthropic:claude-haiku-4-5`.
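Since at least one provider must be configured, a small startup check can catch a missing key early. This is a sketch (the `configured_providers` helper is hypothetical); the variable names are the ones from the example `.env` above.

```python
import os

# Provider key names from the example infra/.env above
PROVIDER_KEYS = ("WATSONX_API_KEY", "OPENAI_API_KEY", "ANTHROPIC_API_KEY")

def configured_providers() -> list[str]:
    """Return the provider key names that are set and non-empty."""
    return [k for k in PROVIDER_KEYS if os.environ.get(k)]

# Simulate a loaded .env with only an Anthropic key present
for k in PROVIDER_KEYS:
    os.environ.pop(k, None)
os.environ["ANTHROPIC_API_KEY"] = "sk-example"

providers = configured_providers()
if not providers:
    raise SystemExit("Configure at least one provider key in infra/.env")
print(providers)
```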
watsonx (IBM Cloud):
- Sign up at cloud.ibm.com
- Navigate to IBM watsonx.ai
- Create a project and note the Project ID
- Generate an API key from IAM (Identity & Access Management)
OpenAI:
- Sign up at platform.openai.com
- Navigate to API Keys
- Click "Create new secret key"
- Copy the key immediately (shown only once)
Anthropic:
- Sign up at console.anthropic.com
- Navigate to API Keys
- Create a new API key
- Copy the key immediately (shown only once)
The MCP server runs automatically as part of the Docker Compose stack:
- Container name: `ibmi-mcp-server`
- Port: `3010`
- Health endpoint: `http://localhost:3010/health`
- MCP endpoint: `http://ibmi-mcp-server:3010/mcp` (inside Docker network)
Network Configuration:
- Inside Docker network: Use `http://ibmi-mcp-server:3010/mcp` (recommended)
- From host machine: Use `http://localhost:3010/mcp`
- From external: Use `http://host.docker.internal:3010/mcp`
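The three network contexts above can be captured in one lookup table, so callers never hard-code the wrong endpoint for their environment. A sketch; the context labels (`"docker"`, `"host"`, `"external"`) are illustrative, not a project API.

```python
# Map each network context described above to its MCP endpoint
MCP_URLS = {
    "docker": "http://ibmi-mcp-server:3010/mcp",     # container-to-container
    "host": "http://localhost:3010/mcp",             # from the host machine
    "external": "http://host.docker.internal:3010/mcp",
}

def mcp_url(context: str) -> str:
    """Return the MCP endpoint for a network context, or raise on a typo."""
    try:
        return MCP_URLS[context]
    except KeyError:
        raise ValueError(f"unknown network context: {context!r}") from None

print(mcp_url("docker"))
```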
The MCP server configuration is loaded from the monorepo root .env file (not infra/.env).
The config.yaml file controls agent behavior, model selection, and UI settings.
A performance monitoring agent example:

```yaml
agents:
  # Default model for all agents (can be overridden per agent)
  default_model: "anthropic:claude-haiku-4-5"

  # Performance monitoring agent
  ibmi-performance-monitor:
    # Uses default_model when not specified
    # model:
    enable_reasoning: false
    debug_mode: false
```

Override the default model per agent as needed. Supported models include:
- watsonx: `meta-llama/llama-3-3-70b-instruct`, etc.
- OpenAI: `gpt-4o`, etc.
- Anthropic: `claude-sonnet-4-5-20250929`, etc.
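The model identifiers above follow a `provider:model-id` convention. A sketch of parsing that form (the `split_model_id` helper is hypothetical, not part of the project):

```python
def split_model_id(spec: str) -> tuple[str, str]:
    """Split a 'provider:model-id' spec into (provider, model).

    Splits only on the first ':' so model ids that contain ':' survive.
    """
    provider, sep, model = spec.partition(":")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider:model-id', got {spec!r}")
    return provider, model

print(split_model_id("anthropic:claude-haiku-4-5"))
print(split_model_id("watsonx:meta-llama/llama-3-3-70b-instruct"))
```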