
## Create

```sh
mesh create my-agent \
  --skill external-agent-bridge \
  --skill fast-analytics-query \
  -d "My agent description" \
  --model claude-sonnet-4-6
```

This generates:

```
agents/my_agent/
  agent.yaml    # Manifest
  main.py       # Bootstrap: registers with control plane
  skills.py     # @router.skill() endpoints for each tool
  reasoners.py  # @router.reasoner() with system prompt + tools="discover"
  skill_*.py    # Tool implementations copied from library
```

| File | Purpose |
| --- | --- |
| `agent.yaml` | Manifest: skills, model, version, status |
| `main.py` | Agent bootstrap: connects to the control plane |
| `reasoners.py` | LLM reasoning loop with system prompt |
| `skills.py` | Tool endpoints discovered at runtime |
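To make the decorator shapes concrete, here is a minimal, self-contained sketch of the pattern the generated `skills.py` and `reasoners.py` follow. The `Router` class below is a stub standing in for the framework's real router, and the two endpoint functions are hypothetical examples, not the actual generated code.

```python
class Router:
    """Stub router: records decorated endpoints by name, standing in
    for the framework's real control-plane registration."""

    def __init__(self):
        self.skills = {}
        self.reasoners = {}

    def skill(self):
        def register(fn):
            self.skills[fn.__name__] = fn
            return fn
        return register

    def reasoner(self, system_prompt="", tools=None):
        def register(fn):
            self.reasoners[fn.__name__] = fn
            return fn
        return register


router = Router()


@router.skill()
def fast_analytics_query(sql: str) -> dict:
    # In a real agent, the implementation lives in a copied skill_*.py module.
    return {"rows": [], "query": sql}


@router.reasoner(system_prompt="You answer analytics questions.", tools="discover")
def analytics_reasoner(question: str) -> str:
    # In a real agent, this runs the LLM loop against the discovered tools.
    return f"answering: {question}"
```

The decorators register each function under its own name, which is how the skill and reasoner endpoints become discoverable at runtime.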

In the TUI or Web UI, ask the builder:

“Create an agent that queries Fetch analytics data”

The builder will select appropriate skills, generate a system prompt, and scaffold the agent.

Any model from Bedrock, OpenAI, or Anthropic can be used. The model string format is `provider/model-id`; LiteLLM handles routing.
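A minimal sketch of how a `provider/model-id` string splits, assuming only the format described above (the helper name is hypothetical, not part of the CLI):

```python
def split_model_string(model: str) -> tuple[str, str]:
    """Split a LiteLLM-style model string into (provider, model_id).

    A bare id with no slash is returned with an empty provider,
    leaving provider resolution to the router.
    """
    provider, sep, model_id = model.partition("/")
    if not sep:
        return "", model
    return provider, model_id
```

For example, `split_model_string("bedrock/us.anthropic.claude-sonnet-4-6")` yields `("bedrock", "us.anthropic.claude-sonnet-4-6")`; note that only the first `/` separates the provider, since model ids may themselves contain dots and dashes.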

**Bedrock**

| Model | Tier | Notes |
| --- | --- | --- |
| `claude-sonnet-4-6` | Premium | Default. Best quality, clean tool calls |
| `claude-haiku-4-5` | Fast | Best value, very fast |
| `claude-opus-4-6` | Premium | Most capable, deep reasoning |
| `llama-4-scout-17b` | Budget | Clean tool calls, great price/performance |
| `llama-4-maverick-17b` | Budget | 1M context, needs tool call dedup |
| `minimax-m2-1` | Reasoning | Needs higher maxTokens |

**OpenAI**

| Model | Tier | Notes |
| --- | --- | --- |
| `gpt-5-mini` | Fast | Reliable tool use |
| `gpt-4.1-mini` | Fast | Good balance of speed and quality |
| `o3-mini` | Reasoning | OpenAI reasoning model |

**Anthropic**

| Model | Tier | Notes |
| --- | --- | --- |
| `claude-sonnet-4-6-direct` | Premium | Via the Anthropic API; needs `ANTHROPIC_API_KEY` |

Or pass any LiteLLM-compatible model string directly: `bedrock/us.anthropic.claude-sonnet-4-6`, `openai/gpt-5-mini`, etc.