# Orca
The AI Orchestrator for 100x builders.
Run Claude Code, Codex, or OpenCode side-by-side across repos — each in its own worktree, tracked in one place.
Available for macOS, Windows, and Linux.
## Supported Agents
Orca supports any CLI agent (not just this list).
- Claude Code
- Codex
- Gemini
- Pi
- Hermes Agent
- OpenCode
- Goose
- Amp
- Auggie
- Charm
- Cline
- Codebuff
- Continue
- Cursor
- Droid
- GitHub Copilot
- Kilocode
- Kimi
- Kiro
- Mistral Vibe
- Qwen Code
- Rovo Dev
## Features
- No login required — Bring your own Claude Code or Codex subscription.
- Worktree-native — Every feature gets its own worktree. No stashing, no branch juggling. Spin up and switch instantly.
- Multi-agent terminals — Run multiple AI agents side-by-side in tabs and panes. See which ones are active at a glance.
- Built-in source control — Review AI-generated diffs, make quick edits, and commit without leaving Orca.
- GitHub integration — PRs, issues, and Actions checks linked to each worktree automatically.
- Notifications — Know when an agent finishes or needs attention. Mark threads unread to come back later.
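The worktree-native model builds on git's own `git worktree` feature: each feature branch gets its own checkout directory, so the main checkout never needs stashing or branch switching. Orca manages this for you; a minimal plain-git sketch of the underlying mechanism (paths and branch names here are illustrative, not Orca's actual layout):

```shell
# Demonstrate the git mechanism Orca automates, in a throwaway repo.
set -e
tmp=$(mktemp -d)
git init -q "$tmp/main"
cd "$tmp/main"
git -c user.email=dev@example.com -c user.name=dev \
  commit -q --allow-empty -m "init"

# One worktree per feature: its own directory and its own branch.
# The main checkout stays untouched while the agent works in ../feature-x.
git worktree add -q ../feature-x -b feature-x
git worktree list
```

Switching between features is then just changing directories; removing a finished one is `git worktree remove ../feature-x`.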
## Install
- Download from onOrca.dev
- Or download the latest binaries via the GitHub Releases page.
## [New] Hot Swap Codex Accounts
Multiple Codex accounts? Switch in one click.
If you run multiple Codex accounts to get the best token deal, Orca lets you hot-swap between them instantly — no re-login, no config files. Just pick an account and keep building.
## [New] Per-Worktree Browser & Design Mode
See your app. Click any element. Drop it into the chat.
Orca ships with a built-in browser right inside your worktree. Preview your app as you build, then switch to Design Mode — click any UI element and it lands directly in your AI chat as context. No screenshots, no copy-pasting selectors. Just point at what you want to change and tell the agent what to do.
## [New] Introducing the Orca CLI
Agent orchestration from your terminal.
Let your AI agent control your IDE: add repos, spin up worktrees, and update the current worktree's comment with meaningful progress checkpoints, all from the terminal. Ships with the Orca IDE (install under Settings).
```shell
npx skills add https://github.com/stablyai/orca --skill orca-cli
```
## Community & Support
- Discord: Join the community on Discord.
- Twitter / X: Follow @orca_build for updates and announcements.
- Feedback & Ideas: We ship fast. Missing something? Request a new feature.
- Show Support: Star this repo to follow along with our daily ships.
## Developing
Want to contribute or run locally? See our CONTRIBUTING.md guide.