Open Orchestra – Multi-Agent Orchestration System
Background
Modern AI workflows often require multiple specialized agents working together. Open Orchestra is a plugin for OpenCode that enables spawning, managing, and coordinating AI workers with a central orchestrator.
The goal: make multi-agent coordination as simple as calling a function, while still handling the complexity that real-world tasks demand.
Architecture
Hub-and-Spoke Model
                    ┌─────────────────┐
                    │  Orchestrator   │
                    │  (Central Hub)  │
                    └────────┬────────┘
                             │
        ┌────────────────────┼────────────────────┐
        │                    │                    │
        ▼                    ▼                    ▼
┌───────────────┐    ┌───────────────┐    ┌───────────────┐
│    Vision     │    │ Documentation │    │     Code      │
│    Analyst    │    │   Librarian   │    │  Implementer  │
└───────────────┘    └───────────────┘    └───────────────┘
The orchestrator delegates tasks to specialized workers, each optimized for specific task categories. Workers report progress back, and the orchestrator synthesizes results.
Neo4j Memory System
Unlike flat key-value stores, Open Orchestra uses a graph database for memory:
- Project scope: Context specific to the current project
- Global scope: Knowledge that persists across projects
- Relationships: Entities connect semantically, not just by key
This enables queries like "What did we learn about authentication when working on the payments service?" that traverse connections in the knowledge graph.
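As a rough illustration, such a question could be expressed against the memory API shown later in this document; the parameter names below are assumptions, not the plugin's documented schema:

// Hypothetical traversal sketch: find what was learned about authentication
// while working on the payments service. Edge and field names are illustrative.
const lessons = await memory.query({
  from: 'PaymentsService',
  relationship: 'learned-about',
  to: 'Authentication',      // assumed filter on the target entity
  scope: 'project',
})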
Worker Profiles
Six built-in profiles cover common development tasks:
- Vision Analyst – Analyzes images, screenshots, and visual content
- Documentation Librarian – Finds, reads, and synthesizes documentation
- Code Implementer – Writes and modifies code with full context
- System Architect – Designs systems and evaluates trade-offs
- Code Explorer – Navigates and understands unfamiliar codebases
- Memory Graph Curator – Manages and organizes the knowledge graph
Each profile is customizable via JSON configuration.
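As a rough illustration, a profile override might look like the object below, shown as a TypeScript literal mirroring the JSON shape; every field name here is an assumption rather than the plugin's documented schema:

// Hypothetical profile customization; field names and values are illustrative only.
const codeImplementerProfile = {
  name: 'code-implementer',
  description: 'Writes and modifies code with full context',
  model: 'your-preferred-model',      // assumed: which model the worker runs on
  tools: ['read', 'write', 'bash'],   // assumed: tools exposed to this worker
  maxConcurrentTasks: 2,              // assumed: per-worker concurrency limit
}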
Tool APIs
The 22+ tool APIs fall into three categories:
Worker Management
// Spawn a new worker
await orchestrator.spawnWorker({
  profile: 'code-implementer',
  task: 'Implement the user authentication module',
  context: projectContext,
})

// Check worker status
const status = await orchestrator.getWorkerStatus(workerId)

// Terminate worker
await orchestrator.stopWorker(workerId)
Task Delegation
// Delegate a task with dependencies
await orchestrator.delegateTask({
  task: 'Build the API layer',
  dependencies: ['database-schema', 'auth-module'],
  assignTo: 'code-implementer',
})

// Parallel task execution
await orchestrator.parallelTasks([
  { task: 'Write unit tests', profile: 'code-implementer' },
  { task: 'Update documentation', profile: 'documentation-librarian' },
])
Memory Operations
// Store knowledge
await memory.store({
  entity: 'AuthenticationModule',
  facts: ['Uses JWT tokens', 'Refresh tokens expire in 7 days'],
  scope: 'project',
})

// Query relationships
const related = await memory.query({
  from: 'AuthenticationModule',
  relationship: 'depends-on',
})
Key Features
Dynamic Port Allocation
When spawning workers, Open Orchestra automatically assigns ports to prevent conflicts. This allows running dozens of workers simultaneously without manual configuration.
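The plugin's internals aren't shown here, but the standard Node.js technique is to let the OS pick a free port by binding to port 0; a minimal sketch:

import net from 'node:net'

// Ask the OS for a currently free port by binding to port 0, then release it
// so the spawned worker can claim it. (There is a small window in which another
// process could grab the port; a real allocator would retry on bind failure.)
function allocatePort(): Promise<number> {
  return new Promise((resolve, reject) => {
    const server = net.createServer()
    server.once('error', reject)
    server.listen(0, () => {
      const { port } = server.address() as net.AddressInfo
      server.close(() => resolve(port))
    })
  })
}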
Context Pruning
Inspired by DCP (Dynamic Context Pruning), the system automatically truncates large tool outputs and manages context for long sessions. This prevents context window overflow while preserving important information.
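As a sketch of the idea, assuming a simple character budget (the plugin's real heuristics aren't documented here):

// Hypothetical output truncation under a fixed character budget.
const MAX_OUTPUT_CHARS = 8_000 // assumed budget, not the plugin's actual limit

function pruneToolOutput(output: string, budget = MAX_OUTPUT_CHARS): string {
  if (output.length <= budget) return output
  // Keep the head and tail, which usually carry the most signal
  // (command echo, final result), and elide the middle.
  const half = Math.floor(budget / 2)
  const omitted = output.length - half * 2
  return `${output.slice(0, half)}\n…[${omitted} characters pruned]…\n${output.slice(-half)}`
}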
Configuration Layers
┌─────────────────────────────┐
│       Project Config        │  ← .orchestra/config.json
├─────────────────────────────┤
│        Global Config        │  ← ~/.orchestra/config.json
├─────────────────────────────┤
│       Default Config        │  ← Built-in defaults
└─────────────────────────────┘
Project-level overrides global, which overrides defaults. JSON schema validation ensures configurations are valid.
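The precedence rule amounts to a layered merge; a minimal sketch for flat objects (the real plugin also validates each layer against its JSON schema and merges nested keys):

// Minimal sketch of layered config resolution, assuming flat JSON objects.
type Config = Record<string, unknown>

function resolveConfig(defaults: Config, globalCfg: Config, projectCfg: Config): Config {
  // Later spreads win: project overrides global, which overrides defaults.
  return { ...defaults, ...globalCfg, ...projectCfg }
}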
Use Cases
1. Complex Refactoring
A System Architect analyzes the codebase and proposes a refactoring plan. Multiple Code Implementers execute changes in parallel. The Documentation Librarian updates docs as changes land.
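Wired to the APIs above, such a workflow might be scripted roughly as follows (the task strings and the 'system-architect' profile id are illustrative):

// Illustrative only: mapping the use case onto the delegation APIs shown earlier.
await orchestrator.delegateTask({
  task: 'Analyze the billing package and propose a refactoring plan',
  assignTo: 'system-architect',
})

// Execute independent parts of the plan in parallel, with the
// Documentation Librarian keeping docs in sync as changes land.
await orchestrator.parallelTasks([
  { task: 'Extract the invoicing module', profile: 'code-implementer' },
  { task: 'Extract the tax calculation module', profile: 'code-implementer' },
  { task: 'Update architecture docs for the new module layout', profile: 'documentation-librarian' },
])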
2. Bug Investigation
A Code Explorer traces the bug through the codebase. A Vision Analyst examines screenshots of the issue. The orchestrator synthesizes findings and delegates the fix.
3. Knowledge Base Building
The Memory Graph Curator processes documentation and code comments, building a queryable knowledge graph that persists across sessions.
Technical Decisions
Why Neo4j?
Graph databases excel at representing relationships. When an agent learns that "Service A calls Service B which writes to Database C," that relationship is first-class, not buried in a text blob.
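For instance, that chain could be recorded as explicit edges rather than prose; memory.relate below is a hypothetical call used for illustration, not a documented API:

// Hypothetical sketch: relationships stored as first-class graph edges.
await memory.relate({ from: 'ServiceA', relationship: 'calls', to: 'ServiceB', scope: 'project' })
await memory.relate({ from: 'ServiceB', relationship: 'writes-to', to: 'DatabaseC', scope: 'project' })

// Later, the chain can be traversed directly instead of re-parsed from notes.
const callees = await memory.query({ from: 'ServiceA', relationship: 'calls' })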
Why TypeScript?
The plugin integrates with OpenCode, which is TypeScript-native. Type safety catches integration errors at compile time rather than runtime.
Why Hub-and-Spoke?
Flat peer-to-peer agent systems quickly become unmanageable. A central orchestrator provides:
- Clear responsibility boundaries
- Unified progress tracking
- Consistent error handling
- Resource management
Results
Open Orchestra has been used for:
- Refactoring 50k+ line codebases with coordinated changes
- Building documentation from scratch by analyzing code
- Debugging complex multi-service issues with parallel investigation
The hub-and-spoke model scales to ~10 concurrent workers before coordination overhead becomes significant.
Future Work
- Distributed execution across multiple machines
- Agent marketplace for sharing worker profiles
- Visual workflow editor for non-developers
- Integration with CI/CD for automated code review
Open Orchestra is open source at github.com/0xSero/open-orchestra.