AI agents are great at small tasks but derail on big features. They forget context, rebuild things that already exist, and go off-spec halfway through. Shotgun fixes this. It reads your entire codebase, plans the full feature upfront, then splits it into staged PRs—each with file-by-file instructions your AI agent can actually follow. Instead of one 10k-line monster PR nobody will review, you get 5 focused PRs that ship. Works great with Cursor, Claude Code, Antigravity, or Codex. BYOK or use Shotgun credits ($10 = $10 in usage).
Select your operating system below to view installation instructions:

**macOS**

Step 1: Install uv

```bash
# Using Homebrew
brew install uv

# Or using curl
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Step 2: Run Shotgun

```bash
uvx shotgun-sh@latest
```

**Linux**

Step 1: Install uv

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

Step 2: Run Shotgun

```bash
uvx shotgun-sh@latest
```

**Windows**

Open PowerShell and run these commands:

```powershell
# Set execution policy (one-time)
Set-ExecutionPolicy RemoteSigned -Scope CurrentUser -Force

# Install uv
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

# Add to PATH (or restart terminal)
$env:Path = "C:\Users\$env:USERNAME\.local\bin;$env:Path"

# OPTIONAL: Enable code indexing (run as Administrator)
Import-Module BitsTransfer
Start-BitsTransfer -Source "https://aka.ms/vs/17/release/vc_redist.x64.exe" -Destination "$env:TEMP\vc_redist.x64.exe"
Start-Process -FilePath "$env:TEMP\vc_redist.x64.exe" -ArgumentList "/install", "/quiet", "/norestart" -Wait

# Run Shotgun
uvx --python 3.12 shotgun-sh@latest
```

| Supported | Not Supported |
|---|---|
| Windows x64 (regular PCs) | 32-bit Python |
| Python 3.11-3.13 | Python 3.14+ (no wheels yet) |
Important: Run in PowerShell, not Command Prompt or VS Developer shells.
💡 Restart your terminal after installation
Why uv? It's 10-100x faster than pip and handles binary wheels reliably—no cmake/build tool errors.
Need help? View uv installation docs
When you launch Shotgun, it will guide you through:
| Step | What Happens |
|---|---|
| 1. Codebase Indexing | Builds a searchable graph of your entire repository |
| 2. LLM Setup | Configure OpenAI, Anthropic, or Gemini |
| 3. First Research | Start generating codebase-aware specs |
💡 Pro tip: Run Shotgun in your IDE's terminal for the best experience.
Watch the full demo on YouTube.
See install instructions for your platform first!
```bash
uvx shotgun-sh@latest
```

| Mode | How It Works | When to Use It |
|---|---|---|
| Planning (default) | Shotgun proposes an execution plan, shows each step, and asks for confirmation before running agents that change files. You get checkpoints, can refine the plan, and can confirm or skip cascaded updates when one change affects other docs. | When you want control, visibility, and the ability to refine the plan before execution. |
| Drafting | Shotgun runs the full plan in one go, without intermediate confirmations. Progress is still tracked internally, but you won’t be prompted at each step. | When you’re confident in the plan and want fast, end-to-end execution. |
The TUI opens automatically. Press Shift+Tab to switch between Planning and Drafting, or press / to open the command palette.
Under the hood, the Router relies on specialized sub-agents. You don’t select or manage them manually.
🔬 Research (explore & understand) → 📝 Specify (define requirements) → 📋 Plan (create roadmap) → ✅ Tasks (break into steps) → 📤 Export (format for AI)
Planning and Drafting are the only execution modes you control; everything else is handled by the Router.
Mode switching: Shift+Tab cycles through modes
| Shortcut | Action |
|---|---|
| Shift+Tab | Switch modes |
| / | Open command palette |
| Ctrl+C | Cancel operation (or copy if text selected) |
| Escape | Exit Q&A / stop agent |
| Ctrl+U | View usage stats |
| Do This | Not This |
|---|---|
| ✅ Research how we handle auth | ❌ Jump straight to building |
| ✅ Shotgun please ask me questions first | ❌ Assume Shotgun knows your needs |
| ✅ I'm working on payments, need refunds | ❌ Add refunds (no context) |
| ✅ Start in Planning mode, let Shotgun propose and refine a plan with you, then run it | ❌ Blast everything in one go without reviewing the plan first (unless you intentionally switch to Drafting mode) |
Result: Your AI coding agent gets complete context—what exists, why, and what to build.
Note: A CLI is available (see docs/CLI.md), but the TUI is recommended.
The Research agent can fetch up-to-date library documentation via Context7. When configured, the agent will prefer Context7 over web search for documentation lookups.
To enable it, set your Context7 API key:

```bash
shotgun config set-context7 --api-key <your-context7-api-key>
```

To remove it:

```bash
shotgun config clear-context7
```

Sharing specs to a workspace is available on paid Shotgun plans.
Shotgun lets you share specs externally by publishing them to a workspace. This creates a versioned, shareable snapshot your team can access outside the repo.
1. Hit / → select Share specs to workspace
2. Choose one option:
   - Create new spec — publish a fresh spec from your current .shotgun/ files
   - Add new version — publish an updated version of an existing spec
3. Wait for upload to complete. When finished, you can:
   - Open in Browser — view the shared spec in the workspace
   - Copy URL — share the link with your team
   - Done — return to Shotgun
Your local .shotgun/*.md files remain unchanged.
The workspace contains a shareable, versioned snapshot of the spec.
| Feature | Shotgun | Other Tools |
|---|---|---|
| Codebase Understanding | Reads your entire repository before generating specs. Finds existing patterns, dependencies, and architecture. | Require manual context or search each time. No persistent understanding of your codebase structure. |
| Research Phase | Starts with research—discovers what you already have AND what exists externally before writing anything. | Start at specification. Build first, discover problems later. |
| Dedicated Agents Per Mode | Each mode (research, spec, plan, tasks, export) uses a separate specialized agent with prompts tailored specifically for that phase. 100% user-controllable via mode switching. | Single-agent or one-size-fits-all prompts. |
| Structured Workflow | Router-driven flow with Planning and Drafting modes; internally it runs Research → Spec → Plan → Tasks → Export with checkpoints in Planning mode. | No structure. Just "prompt and hope." |
| Export Formats | AGENTS.md files ready for Cursor, Claude Code, Windsurf, Lovable—your choice of tool. | Locked into a specific IDE or coding agent. |
We had to implement payments. Cursor, Claude Code, and Copilot all suggested building a custom payment proxy — 3-4 weeks of development.
⭐ Shotgun's research found LiteLLM Proxy instead—30 minutes to discover, 5 days to deploy, first customer in 14 hours.
80% less dev time. Near-zero technical debt.
- 🚀 Onboarding - New developer? Shotgun maps your entire architecture and generates docs that actually match the code
- 🔧 Refactoring - Understand all dependencies before touching anything. Keep your refactor from becoming a rewrite
- 🌱 Greenfield Projects - Research existing solutions globally before writing line one
- ➕ Adding Features - Know exactly where your feature fits. Prevent duplicate functionality
- 📦 Migration - Map the old, plan the new, track the delta. Break migration into safe stages
📚 Want to see a detailed example? Check out our Case Study showing Shotgun in action on a real-world project.
Q: Does Shotgun collect any stats or data?
A: We only gather minimal, anonymous events (e.g., install, server start, tool call). We don't collect the content itself—only that an event occurred. We use PostHog for analytics and error reporting to improve stability.
Q: Does my code leave my computer when indexing?
A: No. When you index your codebase, all indexing happens locally on your machine. The index is stored in ~/.shotgun-sh/codebases/ and never sent to any server. Your code stays on your computer.
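As a quick sanity check, you can confirm the index exists on disk. This is a sketch, not official tooling: the root path `~/.shotgun-sh/codebases/` comes from the answer above, but the layout inside that directory is an assumption.

```python
from pathlib import Path

# Root of Shotgun's local index store (per the FAQ answer above).
index_root = Path.home() / ".shotgun-sh" / "codebases"

if index_root.is_dir():
    # Each indexed codebase presumably gets its own entry here
    # (the per-codebase layout is an assumption).
    for entry in sorted(index_root.iterdir()):
        print(entry.name)
else:
    print(f"No local index yet at {index_root}")
```

Nothing here talks to the network, which is the point: the index is just files on your machine.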
Q: Are local LLMs supported?
A: Planned. We'll publish compatibility notes and local provider integrations.
Q: What LLM providers are supported?
A: Currently OpenAI, Anthropic (Claude), and Google Gemini. Local LLM support is on the roadmap.
Q: Can I use Shotgun offline?
A: You need an internet connection for LLM API calls, but your codebase stays local.
Q: How does the code graph work?
A: Shotgun indexes your codebase using tree-sitter for accurate parsing and creates a searchable graph of your code structure, dependencies, and relationships.
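Shotgun's real index is built with tree-sitter and is far richer, but as a rough illustration of what a code graph captures, here is a minimal sketch using Python's stdlib `ast` module to extract who-calls-whom relationships from a snippet (all function names in the snippet are made up for the example):

```python
import ast
from collections import defaultdict

SOURCE = """
def load_user(uid):
    return db_fetch(uid)

def handler(req):
    user = load_user(req.uid)
    return render(user)
"""

def build_call_graph(source: str) -> dict[str, set[str]]:
    """Map each function definition to the names it calls."""
    tree = ast.parse(source)
    graph: dict[str, set[str]] = defaultdict(set)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            for inner in ast.walk(node):
                # Record direct calls to plain names (e.g. load_user(...)).
                if isinstance(inner, ast.Call) and isinstance(inner.func, ast.Name):
                    graph[node.name].add(inner.func.id)
    return dict(graph)

graph = build_call_graph(SOURCE)
print(graph)  # handler depends on load_user and render; load_user on db_fetch
```

A searchable graph like this is what lets a tool answer "what breaks if I change `load_user`?" without re-reading every file.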
Shotgun is open-source and we welcome contributions. Whether you're fixing bugs, proposing features, improving docs, or spreading the word—we'd love to have you as part of the community.
- Bug Report: Found an issue? Create a bug report
- Feature Request: Have an idea to make Shotgun better? Submit a feature request
- Documentation: See something missing in the docs? Request documentation
Not sure where to start? Join our Discord and we'll help you get started!
- Contributing Guide - Setup, workflow, and guidelines
- Git Hooks - Lefthook, trufflehog, and security scanning
- CI/CD - GitHub Actions and automated testing
- Observability - Telemetry, Logfire, and monitoring
- Docker - Container setup and deployment
Planning → Drafting — Two execution modes that give AI agents the full picture, backed by internal phases for Research → Specify → Plan → Tasks → Export.
```bash
uvx shotgun-sh@latest
```

License: MIT | Python: 3.11+ | Homepage: shotgun.sh
To uninstall:

```bash
uv tool uninstall shotgun-sh
```
