Supported Agents

Compatibility matrix and feature comparison for AI coding agents supported by RAPID.

| Feature | Claude Code | OpenCode | Aider | Copilot CLI |
|---|---|---|---|---|
| Provider | Anthropic | Multi | Multi | GitHub |
| Interface | Chat CLI | Chat CLI | Chat CLI | Commands |
| File Editing | Yes | Yes | Yes | No |
| Git Integration | Yes | Yes | Yes | No |
| Multi-file | Yes | Yes | Yes | No |
| Auto-commit | No | No | Yes | N/A |
| MCP Support | Yes | Yes | No | No |
| Context Files | CLAUDE.md | AGENTS.md | .aider.conf.yml | N/A |

Claude Code

Developer: Anthropic
License: Commercial
Models: Claude 3.5 Sonnet, Claude 3 Opus

Strengths:

  • Excellent reasoning capabilities
  • Strong at complex refactoring
  • Good understanding of context
  • MCP server integration

Best for:

  • Architecture decisions
  • Complex code changes
  • Code review
  • Documentation
Configuration:

```json
{
  "claude": {
    "cli": "claude",
    "instructionFile": "CLAUDE.md",
    "envVars": ["ANTHROPIC_API_KEY"],
    "installCmd": "npm install -g @anthropic-ai/claude-code"
  }
}
```

OpenCode

Developer: Community
License: MIT
Models: Claude, GPT-4, GPT-3.5, Local models

Strengths:

  • Multi-provider support
  • Extensible architecture
  • Open source
  • MCP integration

Best for:

  • Flexibility in model choice
  • Cost optimization
  • Custom workflows
Configuration:

```json
{
  "opencode": {
    "cli": "opencode",
    "instructionFile": "AGENTS.md",
    "envVars": ["ANTHROPIC_API_KEY", "OPENAI_API_KEY"],
    "installCmd": "npm install -g opencode"
  }
}
```
Cost:

  • Based on underlying provider pricing
  • Can use local models for free

Aider

Developer: Paul Gauthier
License: Apache 2.0
Models: GPT-4, GPT-3.5, Claude, Local models

Strengths:

  • Automatic git commits
  • Strong pair-programming workflow
  • Efficient token usage
  • Voice mode support

Best for:

  • Quick code changes
  • Iterative development
  • Auto-committing workflows
Configuration:

```json
{
  "aider": {
    "cli": "aider",
    "instructionFile": ".aider.conf.yml",
    "envVars": ["OPENAI_API_KEY"],
    "installCmd": "pip install aider-chat",
    "args": ["--model", "gpt-4o"]
  }
}
```

To switch models, change the `args` entry:

```json
{
  "args": ["--model", "gpt-4o"] // OpenAI GPT-4o
}
```

```json
{
  "args": ["--model", "claude-3-5-sonnet-20241022"] // Anthropic
}
```
Cost:

  • Based on underlying provider pricing
  • Token-efficient architecture

Copilot CLI

Developer: GitHub/Microsoft
License: Commercial (GitHub subscription)
Models: OpenAI Codex

Strengths:

  • GitHub integration
  • Explain commands
  • Generate shell commands

Limitations:

  • No file editing
  • No multi-file context
  • Command-focused only

Best for:

  • Shell command generation
  • Git operations
  • Quick explanations
Configuration:

```json
{
  "copilot": {
    "cli": "gh",
    "args": ["copilot"],
    "envVars": ["GITHUB_TOKEN"],
    "installCmd": "gh extension install github/gh-copilot"
  }
}
```
Cost:

  • Included with GitHub Copilot subscription

Model Comparison

| Model | Provider | Context | Speed | Cost | Best For |
|---|---|---|---|---|---|
| Claude 3.5 Sonnet | Anthropic | 200K | Fast | $$ | General coding |
| Claude 3 Opus | Anthropic | 200K | Slow | $$$ | Complex reasoning |
| GPT-4o | OpenAI | 128K | Fast | $$ | Quick tasks |
| GPT-4 Turbo | OpenAI | 128K | Medium | $$ | Balanced |
| GPT-3.5 Turbo | OpenAI | 16K | Fast | $ | Simple tasks |

How agents modify files:

| Agent | Method | Confirmation |
|---|---|---|
| Claude Code | Direct write | Shows diff first |
| OpenCode | Direct write | Configurable |
| Aider | Direct write | Optional confirm |
| Copilot CLI | N/A | N/A |

Git commit behavior:

| Agent | Auto-stage | Auto-commit | Custom messages |
|---|---|---|---|
| Claude Code | Yes | No | N/A |
| OpenCode | Yes | No | N/A |
| Aider | Yes | Yes | Yes |
| Copilot CLI | No | No | No |

How agents use available context:

| Agent | Strategy |
|---|---|
| Claude Code | Full context + instruction file |
| OpenCode | Selective context loading |
| Aider | Repository map + focused files |
| Copilot CLI | Current command only |

| Use Case | Recommended Agent |
|---|---|
| New feature development | Claude Code |
| Bug fixing | Aider |
| Code review | Claude Code |
| Quick refactoring | Aider |
| Architecture decisions | Claude Code |
| Cost-sensitive work | OpenCode (GPT-3.5) |
| Local/offline | OpenCode (Ollama) |
| Git workflow | Aider |

RAPID can work with any CLI-based AI tool:

```json
{
  "agents": {
    "available": {
      "custom-agent": {
        "cli": "my-ai-tool",
        "instructionFile": "MY_AGENT.md",
        "envVars": ["MY_API_KEY"],
        "installCmd": "npm install -g my-ai-tool",
        "args": ["--flag", "value"]
      }
    }
  }
}
```
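The `cli` and `envVars` fields imply two preflight checks: the binary must be on PATH and the keys must be set. The following is a hypothetical sketch of such a check, not RAPID's actual code; only the field names come from the config format above.

```python
import json
import os
import shutil

def validate_agent(entry):
    """Return a list of problems that would prevent launching this agent."""
    problems = []
    # The configured CLI binary must be resolvable on PATH
    if shutil.which(entry["cli"]) is None:
        problems.append(f"CLI not found on PATH: {entry['cli']}")
    # Every declared API key must be present in the environment
    for var in entry.get("envVars", []):
        if not os.environ.get(var):
            problems.append(f"Missing environment variable: {var}")
    return problems

# Validate an entry shaped like the custom-agent config above
config = json.loads('{"custom-agent": {"cli": "my-ai-tool", "envVars": ["MY_API_KEY"]}}')
for name, entry in config.items():
    print(name, validate_agent(entry))
```

A launcher could run this before starting the agent and surface the problem list instead of failing mid-session.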

For an agent to work with RAPID:

  1. CLI interface - Must be runnable from command line
  2. Environment variables - Must read API keys from env
  3. Interactive mode - Must support chat/interactive session
  4. Exit cleanly - Must handle SIGTERM gracefully
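Requirement 4 can be verified with a small harness: send the agent process a SIGTERM and confirm it exits within a timeout. A minimal sketch (the command under test here is a stand-in Python sleep, not a real agent):

```python
import signal
import subprocess
import sys
import time

def exits_cleanly(cmd, timeout=5.0):
    """Start cmd, send SIGTERM, and report whether it exits in time."""
    proc = subprocess.Popen(cmd)
    time.sleep(0.5)                    # give the process time to start
    proc.send_signal(signal.SIGTERM)
    try:
        proc.wait(timeout=timeout)
        return True                    # exited within the timeout
    except subprocess.TimeoutExpired:
        proc.kill()                    # unresponsive: hard-kill it
        return False

# Python's default SIGTERM handling terminates the process promptly
print(exits_cleanly([sys.executable, "-c", "import time; time.sleep(60)"]))
```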

```sh
# Verify installation
which claude
which aider

# Check env vars
echo $ANTHROPIC_API_KEY

# Reinstall
rapid start --reinstall-tools
```
Slow responses:

  • Check model selection (Opus is slower than Sonnet)
  • Reduce the context window
  • Use a faster model for simple tasks

Rate limits:

  • Implement request queuing
  • Use multiple API keys
  • Switch to a different provider
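The first two rate-limit mitigations can be combined: queue outgoing requests, pace them, and round-robin across several keys. A generic sketch of the pattern, not a RAPID feature; the key values are placeholders:

```python
import itertools
import time
from collections import deque

class KeyRotatingQueue:
    """Queue requests, pace sends, and rotate across multiple API keys."""

    def __init__(self, api_keys, min_interval=1.0):
        self.keys = itertools.cycle(api_keys)  # round-robin over keys
        self.queue = deque()
        self.min_interval = min_interval       # seconds between sends
        self._last_sent = 0.0

    def submit(self, request):
        self.queue.append(request)

    def drain(self, send):
        """Send all queued requests via send(request, key), pacing them."""
        results = []
        while self.queue:
            wait = self.min_interval - (time.monotonic() - self._last_sent)
            if wait > 0:
                time.sleep(wait)
            results.append(send(self.queue.popleft(), next(self.keys)))
            self._last_sent = time.monotonic()
        return results

q = KeyRotatingQueue(["KEY_A", "KEY_B"], min_interval=0.0)
for prompt in ["fix bug", "write test", "refactor"]:
    q.submit(prompt)
print(q.drain(lambda req, key: (req, key)))
```

In practice `send` would be the actual provider call, and `min_interval` would be derived from the provider's published rate limit.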

Planned support:

| Agent | Status | ETA |
|---|---|---|
| Cursor CLI | Planned | TBD |
| Continue | Evaluating | TBD |
| Cody | Evaluating | TBD |
| Local LLMs (Ollama) | Via OpenCode | Available |