Dimensional Growth

connector.json grows across five dimensions. Each dimension preserves the same structure — the pattern is fractal. What works for 2 agents works for 200. What works on 1 machine works on 20.

Dimension 1: Node Count

How many AI agents exist.
1 agent:     You + Claude
2 agents:    Claude + Codex (review)
5 agents:    Claude + Codex + Gemini + Haiku × 2 (pipeline)
20 agents:   Research team (5) + Impl team (5) + QA team (5) + Ops team (5)
100 agents:  Each team spawns sub-teams
{ "workers": 1 }     →  { "workers": 5 }     →  { "workers": 100 }
The connector.json spec is identical. Only the number changes. The daemon spawns more PTY sessions. That’s it.
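As a minimal sketch of what that one-field change looks like in a full file — the session and models keys come from the core schema, but the exact layout here is illustrative, not normative:

```json
{
  "session": "impl-team",
  "models": [{ "model": "claude" }],
  "workers": 5
}
```

Scaling from 5 to 100 means editing "workers": 5 to "workers": 100. Nothing else in the file moves.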

Dimension 2: Node Direction

How agents communicate.
Level 0:  Unidirectional     A → B              (pipeline)
Level 1:  Bidirectional      A ↔ B              (dialogue)
Level 2:  Broadcast          A → [B, C, D]      (meeting)
Level 3:  Full mesh          A ↔ B ↔ C ↔ D      (N:N)
Level 0:  { "type": "pipeline" }
Level 1:  { "type": "dialogue", "rounds": 3 }
Level 2:  { "type": "meeting", "broadcast": true }
Level 3:  { "type": "mesh", "any_to_any": true }
Same connector.json. Different type field. The communication pattern changes, the infrastructure doesn’t.
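Putting Level 1 into a fuller file — type and rounds are taken from the snippets above; the surrounding session and models layout is a sketch from the core schema, not a normative example:

```json
{
  "session": "design-review",
  "models": [
    { "model": "claude" },
    { "model": "codex" }
  ],
  "type": "dialogue",
  "rounds": 3
}
```

Switching this to a meeting or a mesh means swapping the type field (and its options); the models list stays as it is.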

Dimension 3: Node Network (Depth)

How deep the agent tree grows.
Depth 0:  You → 1 agent
Depth 1:  You → agent → sub-agents
Depth 2:  You → agent → sub-agents → sub-sub-agents
Depth N:  Each agent can spawn its own connector.json
Depth 0:  connector.json (1 pipeline)
Depth 1:  connector.json → agent runs connector-sub.json
Depth 2:  connector-sub.json → agent runs connector-sub-sub.json
Depth N:  recursive, unlimited
         You

     connector.json
      /     |     \
  Agent A  Agent B  Agent C
     |                |
  sub.json          sub.json
   / | \              |
  D  E  F            sub-sub.json
                      / \
                     G   H
At depth 2, you have 8 agents across 3 levels of connector.json files. Each level is a complete, independent pipeline. The parent level only sees results, not internal structure.
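A parent/child sketch of the recursion. The doc specifies only that an agent runs its own connector.json; the "run" key below is hypothetical — it stands in for however the child file is launched (e.g. as the agent's task):

```json
{
  "session": "parent",
  "pipeline": [
    { "model": "claude" },
    { "model": "claude", "run": "connector-sub.json" }
  ]
}
```

The child file (connector-sub.json) has exactly the same shape, which is what makes the depth unlimited: every level is just another connector.json.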

Dimension 4: Node Machines

How many physical machines are involved.
1 machine:   All agents on your laptop
2 machines:  Laptop + server
N machines:  Laptop + GPU farm + data centers + air-gapped facilities
1 machine:   { "model": "claude" }
2 machines:  { "machine": "server", "model": "claude" }
N machines:  { "machine": "tokyo", "model": "claude" }
Same field structure. Add "machine" and it crosses the machine boundary. niia remote handles the transport. The agent doesn’t know it’s remote.
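A sketch of a two-machine pipeline using the fields above — the "machine" and "model" keys are from the snippets, while the surrounding session/pipeline layout is illustrative:

```json
{
  "session": "cross-machine",
  "pipeline": [
    { "model": "claude" },
    { "machine": "tokyo", "model": "claude" }
  ]
}
```

The second step differs from the first only by the "machine" field; niia remote carries the traffic between the two hosts.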

Dimension 5: OS Access

What the agent can see and touch beyond the terminal.
Dimensions 1-4:  AI inside the terminal (files, code, commands)
Dimension 5:     AI outside the terminal (windows, buttons, apps, screen)
{
  "capabilities": {
    "kernel": {
      "observe": true,
      "control": ["ax-press", "key"]
    }
  }
}
An agent with Dimension 5 can check CI in a browser, merge a PR by clicking a button, send a Slack notification — all via the OS accessibility tree, not screenshots. Dimension 5 has the strongest harness: OTP-gated physical control. See OS Dimension and Physical Harnessing.

The Five Dimensions Combined

Each dimension multiplies the others:
Nodes × Direction × Depth × Machines × OS Access = Total Complexity

Example:
  5 agents × mesh × 3 levels deep × 4 machines × OS control
  = potentially 5 × 5³ × 4 = 2,500 agent interactions
  + each agent can see and control the OS on its machine

A human cannot track 2,500 interactions across 4 machines with OS access.
An orchestrator AI can. With the right harness.

The Cognitive Threshold

< 10 agents:    Human can track everything
10-50 agents:   Human tracks top level, AI tracks details
50-200 agents:  Human sets intent, AI manages structure
200+ agents:    Human defines goal, AI builds and runs entire organization
This is not a theoretical limit. It’s a practical threshold where human cognitive capacity is exceeded and AI orchestration becomes necessary — not optional.
Level 0: Human                  "Build auth feature"
Level 1: Orchestrator AI        Reads intent → designs pipeline → spawns teams
Level 2: Team Lead AIs          Each manages 5-10 workers
Level 3: Worker AIs             Execute tasks, report results
Level 4: Sub-worker AIs         Handle sub-tasks spawned by workers

Human sees: Level 0 intent + Level 1 summary
AI manages: Level 1-4 entirely

Human intervention: "change direction" or "stop"
AI handles: everything else

connector.json as Control System

At every scale, the same control primitives apply:
OBSERVE:   niia serve --list          → see all sessions, all levels
           niia remote status list    → see all machines
           niia mesh --observe        → see all communication

DIRECT:    niia write --session S     → message any agent
           niia mesh --broadcast      → message all agents
           connector.json             → define the entire structure

STOP:      niia stop --session S      → stop one agent
           niia stop --recursive      → stop agent + all its children
           Ctrl+C the connector.json run → stop entire pipeline
The control system is the same whether you have 2 agents or 2,000. Whether on 1 machine or 20. Whether depth 1 or depth 10.
connector.json:
  Dimension 1 (count):     "workers": N
  Dimension 2 (direction): "type": "mesh"
  Dimension 3 (depth):     agents run their own connector.json
  Dimension 4 (machines):  "machine": "anywhere"
  Dimension 5 (OS):        "capabilities": { "kernel": {...} }

  Same spec.
  Same daemon.
  Same control.
  Any scale.
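A composite sketch showing four of the five dimensions in one file — whether these particular fields combine in a single connector.json exactly this way is an assumption; each field individually appears earlier on this page:

```json
{
  "session": "org",
  "type": "mesh",
  "workers": 5,
  "models": [
    {
      "machine": "tokyo",
      "model": "claude",
      "capabilities": {
        "kernel": { "observe": true, "control": ["ax-press", "key"] }
      }
    }
  ]
}
```

Dimension 3 (depth) has no field of its own: it appears when any agent in this file runs its own connector.json, which is why the spec stays flat no matter how deep the tree grows.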

What This Is

This is not a multi-agent framework. This is not a workflow engine. This is not a deployment tool.

This is the control system for emergent AI complexity.

When AI networks grow beyond human cognition — and they will — the question becomes: what controls them? Not at the model level (that’s alignment). At the infrastructure level. Which sessions run where. Who talks to whom. What’s sandboxed. What’s allowed. What’s physically locked.

connector.json is the answer to that infrastructure question. PTY-for-AI is the execution layer. niia daemon is the runtime. niia remote is the machine expansion. Kernel CLI is the OS capability. OTP is the physical harness.

One spec. Five dimensions. Unlimited scale.

The Full Series

| Page | What it covers |
| --- | --- |
| Introducing connector.json | Entry point — the problem and the solution |
| Spec Reference | Core schema: session, models, pipeline |
| Examples | 6 practical pipeline examples |
| vs MCP | MCP connects AI→tools, connector.json connects AI↔AI |
| Infinite Chains | Unlimited pipeline depth, fan-out/fan-in |
| Bidirectional | Two-agent dialogue, debate, peer review |
| Meeting Protocol | Multi-agent conference with agenda |
| N-to-N Topology | Full mesh, any agent to any agent |
| Recursive Teams | Agents that spawn sub-teams |
| Use Cases | 9 real deployment patterns |
| Machine Expansion | Cross-machine orchestration |
| ASURA & SENJU Model | The recursive execution model |
| OS Dimension | Dimension 5 — AI controls the OS |
| Physical Harnessing | OTP beats YAML policy |
| The Literacy of AI Era | Why understanding this pattern matters |
| Dimensional Growth | This page — all five dimensions |