
connector.json

Declarative specification for AI-to-AI orchestration. Where MCP defines how AI connects to tools, connector.json defines how AI coordinates with other AI.

What It Solves

MCP:             AI → tool/data (one direction, passive target)
connector.json:  AI ↔ AI ↔ AI  (multi-direction, active participants)
Every AI CLI tool (Claude Code, Codex, Gemini CLI, Aider) runs in isolation. There is no standard way to:
  • Route tasks to different LLMs based on cost or capability
  • Run multiple AI sessions in parallel with isolated environments
  • Define multi-phase workflows across heterogeneous AI tools
  • Fail over to a different provider when one goes down
connector.json is that standard.

Spec Overview

{
  "connector": "2.0",
  "name": "feature-implementation",

  "session": {
    "worktree": "feat-auth-fix",
    "sandbox": true,
    "scratchpad": true,
    "timeout_ms": 3600000
  },

  "models": {
    "primary": "claude",
    "fallback": "gemini",
    "research": "haiku",
    "implementation": "opus"
  },

  "pipeline": {
    "phases": [
      { "name": "research",   "workers": 3, "model": "research" },
      { "name": "implement",  "workers": 1, "model": "implementation" },
      { "name": "verify",     "workers": 2, "model": "primary" }
    ]
  }
}

Core Sections

session — Environment

How to set up each AI session. Maps to daemon-external plugins that transform PTY session configuration without modifying the terminal engine.
Field       Type    Description
worktree    string  Git worktree slug for isolated workspace
sandbox     bool    OS-level process isolation (no network, restricted writes)
scratchpad  bool    Shared knowledge directory across workers
env         object  Environment variables injected into session
permission  string  Permission policy for the session
timeout_ms  number  Session timeout in milliseconds
resume      string  Resume an existing worktree session
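
A session block that combines these fields might look like the sketch below. The specific values (the env variable, the "read-only" permission name) are illustrative assumptions, not values defined by the spec excerpt above:

```json
{
  "session": {
    "worktree": "feat-auth-fix",
    "sandbox": true,
    "scratchpad": true,
    "env": { "NODE_ENV": "test" },
    "permission": "read-only",
    "timeout_ms": 3600000
  }
}
```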

models — Routing

Which AI to use for which role. Named aliases that pipeline phases reference.
{
  "primary": "claude",
  "fallback": "gemini",
  "research": "haiku",
  "implementation": "opus",
  "review": "codex"
}
When primary is unavailable, the runtime automatically falls back to fallback. This enables cost-aware routing: cheap models for research, expensive models for implementation.
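
Pipeline phases reference these aliases by name rather than hard-coding a provider, so swapping the underlying model is a one-line change to the models block:

```json
{
  "models": { "research": "haiku" },
  "pipeline": {
    "phases": [
      { "name": "research", "workers": 3, "model": "research" }
    ]
  }
}
```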

pipeline — Orchestration

Multi-phase workflow with parallel workers.
{
  "phases": [
    {
      "name": "research",
      "workers": 3,
      "model": "research",
      "session": { "sandbox": true }
    }
  ],
  "scratchpad": true,
  "mailbox": true
}
Each phase can override session settings. Workers within a phase run in parallel; phases execute sequentially. Results flow through the scratchpad and mailbox. The workers field accepts either a number (N identical workers sharing the phase's model) or an array of worker objects, each with its own model, prompt, and machine.
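
The array form of workers might look like the following sketch. The prompt and machine values are illustrative assumptions:

```json
{
  "phases": [
    {
      "name": "research",
      "workers": [
        { "model": "haiku",  "prompt": "Survey existing auth flows" },
        { "model": "gemini", "prompt": "Audit token handling", "machine": "ci-runner-2" }
      ]
    }
  ]
}
```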

Beyond Pipeline — Communication Topologies

Pipeline is only one topology; connector.json supports several:
Type       Field                              Description
Pipeline   "pipeline": { "phases": [...] }    Sequential phases, parallel workers within each
Dialogue   phases[].type: "dialogue"          Two agents in multi-round conversation
Meeting    "type": "meeting"                  All participants hear every voice
Mesh       "type": "mesh"                     Any agent talks to any agent directly
Recursive  Agent runs its own connector.json  Unlimited depth, agents spawn sub-teams
See: Bidirectional Communication, Meeting Protocol, N-to-N Topology, Recursive Teams, Dimensional Growth.
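
As a sketch, a dialogue phase could be declared like this. The participants and rounds fields are assumptions about how two agents and the conversation length would be named; only the type field comes from the table above:

```json
{
  "pipeline": {
    "phases": [
      {
        "name": "design-review",
        "type": "dialogue",
        "participants": ["implementation", "review"],
        "rounds": 3
      }
    ]
  }
}
```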

Relationship to MCP

MCP is a field inside connector.json, not a competing spec.
{
  "session": { ... },
  "models": { ... },
  "pipeline": { ... },
  "tools": {
    "mcp": ["chrome-devtools", "github", "postgres"]
  }
}
MCP = tool access protocol (AI → tool). connector.json = workflow orchestration spec (AI ↔ AI, including tools).

Relationship to OpenCLIs

OpenCLIs concept    connector.json field
wrapper_steps       pipeline.phases[].steps
initiate_md         agent.flow
trust_domains       trust.domains
MCP CLI wrapping    tools.mcp
connector.json unifies these into a single declarative file.
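
Under that mapping, one file could carry all four concepts. The field shapes below are a sketch; only the field paths come from the table above, and the specific values are illustrative assumptions:

```json
{
  "pipeline": {
    "phases": [
      { "name": "implement", "steps": ["plan", "edit", "test"] }
    ]
  },
  "agent": { "flow": "initiate.md" },
  "trust": { "domains": ["github.com", "registry.npmjs.org"] },
  "tools": { "mcp": ["github"] }
}
```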

Runtime

connector.json is a spec. The runtime that executes it is the NIIA daemon + headless PTY infrastructure.
connector.json → niia daemon reads it
  → creates headless PTY sessions
  → applies session plugins (worktree, sandbox, scratchpad)
  → routes to correct LLM per phase
  → manages pipeline execution
  → collects results via mailbox
  → handles failover automatically
No code modification needed. Drop a JSON file, run niia run connector.json.

Design Principles

  1. Declarative — describe what, not how
  2. LLM-agnostic — any AI CLI that runs in a terminal
  3. Infrastructure-level — orchestration in daemon, not in prompts
  4. MCP-complementary — MCP for tools, connector.json for workflows
  5. File is the plugin — new JSON = new workflow, no code changes

Status

Component                 Status
Spec v2.0                 Draft
session.worktree          Implemented
session.sandbox           Implemented (macOS)
models routing            Designed
pipeline orchestration    Designed
niia run connector.json   Planned