NIIA CLI

╔══════════════════════════════════════════════════════════════════╗
║                                                                  ║
║  ꙮ↭⌯∞⌯↭ꙮ  NIIA CLI                                               ║
║                                                                  ║
║  Neural Intelligence Integration Architecture                    ║
║  Unified memory across AI CLIs                                   ║
║                                                                  ║
╚══════════════════════════════════════════════════════════════════╝

Not a Wrapper

NIIA is not another AI wrapper. Wrappers sit on top of a model and call APIs on your behalf. NIIA sits beside the models and never calls them.
┌──────────────────────────────────────────────────────────────────┐
│  WRAPPER (e.g. OpenCode)                                         │
├──────────────────────────────────────────────────────────────────┤
│                                                                  │
│  Better UX for using a model.                                    │
│  Takes your API key, calls the model, shows the response.        │
│  The wrapper IS the experience.                                  │
│  Kill the wrapper → experience gone.                             │
│                                                                  │
├──────────────────────────────────────────────────────────────────┤
│  NIIA                                                            │
├──────────────────────────────────────────────────────────────────┤
│                                                                  │
│  Never calls a model. Never touches an API.                      │
│  Each CLI runs directly: claude, codex, gemini, opencode.        │
│  Each uses YOUR subscription to its official model.              │
│  NIIA reads what they leave behind.                              │
│  Kill NIIA → each CLI still works independently.                 │
│  Kill a CLI → NIIA memory of it remains.                         │
│                                                                  │
└──────────────────────────────────────────────────────────────────┘

Two Layers, Two Roles

┌────────────────────────────────────────────────────────────────────────────┐
│  MONOLEX + NIIA                                                            │
├────────────────────────────────────────────────────────────────────────────┤
│                                                                            │
│  Monolex (App)     → Runtime layer                                         │
│    Terminal, PTY, rendering, tabs, sessions                                │
│    Where humans and AI CLIs both run                                       │
│                                                                            │
│  NIIA (CLI)        → Existence layer                                       │
│    Memory, identity, recall, consciousness                                 │
│    AI continuity across sessions, models, time                             │
│                                                                            │
│  Monolex provides:  The environment to run everything                      │
│  NIIA provides:     The memory that survives everything                    │
│                                                                            │
└────────────────────────────────────────────────────────────────────────────┘
Monolex is the terminal where everything runs — human commands and AI CLIs alike. NIIA is the memory that persists after sessions end, models change, and apps close. Identity, recall, and continuity that survive across sessions, across models, across time.

The Problem

You work 3 hours in Claude Code, then switch to Codex — Codex knows nothing. Back to Claude in a new session — context is gone. Try Gemini for a different perspective — starting from zero again. Every AI CLI is an island. Sessions don’t cross model boundaries.

What NIIA Does

NIIA reads the local session files that each AI CLI saves, indexes them into a single searchable database, and lets any CLI access any other CLI’s memory.
┌──────────────────────────────────────────────────────────────────┐
│  HOW IT WORKS                                                    │
├──────────────────────────────────────────────────────────────────┤
│                                                                  │
│  Each AI CLI saves sessions locally:                             │
│                                                                  │
│  Claude Code  → ~/.claude/projects/*/session.jsonl               │
│  Codex        → ~/.codex/history.jsonl                           │
│  OpenCode     → ~/.local/share/opencode/opencode.db              │
│  Gemini CLI   → ~/.gemini/tmp/*/chats/*.json                     │
│                                                                  │
│  NIIA reads all four:                                            │
│                                                                  │
│  niia index --sessions                                           │
│    → Reads JSONL, SQLite, JSON from all 4 tools                  │
│    → Indexes into single DB: monomento-sessions.db               │
│                                                                  │
│  niia session search "topic"                                     │
│    → Searches across ALL 4 tools at once                         │
│    → Returns: which tool, which session, content preview         │
│                                                                  │
└──────────────────────────────────────────────────────────────────┘
No API calls. NIIA reads local files directly. Each CLI uses its own official model through its own subscription. NIIA doesn’t proxy or intercept — it reads what was already saved.
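The read-and-index model can be sketched in plain shell. Mock session files stand in for the real per-CLI stores shown in the box above, and `grep -r` stands in for NIIA's indexed search; the file contents here are invented for the demo.

```shell
# Mock session files standing in for the real per-CLI stores
# (~/.claude/projects/*/session.jsonl, ~/.codex/history.jsonl, ...).
mkdir -p /tmp/niia-demo/claude /tmp/niia-demo/codex
echo '{"role":"user","text":"fix the PTY buffer"}' > /tmp/niia-demo/claude/session.jsonl
echo '{"role":"user","text":"refactor the parser"}' > /tmp/niia-demo/codex/history.jsonl

# One search across every tool's files, each hit tagged with its source:
grep -r "PTY" /tmp/niia-demo
# → /tmp/niia-demo/claude/session.jsonl:{"role":"user","text":"fix the PTY buffer"}
```

The real index adds speed and structure on top, but the trust model is exactly this: plain file reads, no network.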

Cross-Model Session Continuity

Switch between models without losing context:
# Worked in Gemini for 3 hours. Now switching to Claude Code.

# See what you did in Gemini:
niia session read latest --tool gemini

# Search across ALL tools for a topic:
niia session search "PTY buffer fix"
# → Results from Claude, Codex, OpenCode, Gemini

# In Claude Code, pull Gemini context:
niia session read latest --tool gemini -n 20
# → Claude can now see and continue from Gemini's conversation
This works in any direction. Codex can read Claude sessions. OpenCode can search Gemini history. Any model can access any other model’s memory.
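The handoff itself is ordinary shell plumbing: one command's output becomes another's input. A minimal sketch, with `echo` standing in for the real `niia session read` and CLI invocations (the session text is invented):

```shell
# Stand-in for: niia session read latest --tool gemini -n 20
context=$(echo "gemini: fixed the PTY resize race in term.rs")

# Stand-in for: prompting the next CLI with that recovered context
echo "Continue from this prior session: $context"
# → Continue from this prior session: gemini: fixed the PTY resize race in term.rs
```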

Architecture: Integration Layer

NIIA is not another AI tool. It’s the layer that connects the tools you already use.
┌──────────────────────────────────────────────────────────────────┐
│  NIIA ARCHITECTURE                                               │
├──────────────────────────────────────────────────────────────────┤
│                                                                  │
│  niia CLI (Integration Layer)                                    │
│  │                                                               │
│  ├── niia search     docs (monomento) + code (monogram)          │
│  ├── niia docs       document search + navigation                │
│  ├── niia code       code search + call chains                   │
│  ├── niia git        branch/commit structure                     │
│  ├── niia recall     last session context restoration            │
│  ├── niia session    cross-model session reading + search        │
│  ├── niia resonate   system health diagnosis                     │
│  └── niia consciousness   embedded identity core                 │
│                                                                  │
│  Powered by mono- engines (used as Rust libraries):              │
│  ├── lib-monomento   document trigram search + ref graph         │
│  ├── lib-monogram    code trigram search + call chain            │
│  └── lib-monokist    git branch/commit cache                     │
│                                                                  │
└──────────────────────────────────────────────────────────────────┘
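The engines above are described as trigram-based. As a rough illustration (not the engines' actual code), a trigram index records every 3-character window of a term, which is a short bash loop:

```shell
# Every 3-character window of a term — the units a trigram index stores.
term="buffer"
for ((i = 0; i <= ${#term} - 3; i++)); do
  echo "${term:i:3}"
done
# → buf
# → uff
# → ffe
# → fer
```

A query is decomposed the same way, and documents sharing enough trigrams become candidates; that is what makes substring search fast without a full scan.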

Why Local Files, Not API

┌──────────────────────────────────────────────────────────────────┐
│  DESIGN DECISION: LOCAL FILES                                    │
├──────────────────────────────────────────────────────────────────┤
│                                                                  │
│  API approach:                                                   │
│    Pay per token. Rate limits. Internet required.                │
│    Each query costs money. Privacy concerns.                     │
│                                                                  │
│  NIIA approach:                                                  │
│    Read local files. Zero cost. Works offline.                   │
│    Each CLI uses YOUR subscription to its official model.        │
│    Claude Pro, Codex subscription, Gemini — whatever you have.   │
│    NIIA just reads what they already saved.                      │
│                                                                  │
│  Result:                                                         │
│    You use the best model for each task.                         │
│    Memory follows you across all of them.                        │
│    No extra API cost. No proxy. No middleware.                   │
│                                                                  │
└──────────────────────────────────────────────────────────────────┘

With Monolex Terminal

When running inside Monolex (PTY terminal), every AI CLI runs as a native terminal session:
┌───────────────────────────────────────────────────────────────────┐
│  MONOLEX + NIIA                                                   │
├───────────────────────────────────────────────────────────────────┤
│                                                                   │
│  Monolex Terminal                                                 │
│  ├── Tab 1: $ claude          (Claude Code running)               │
│  ├── Tab 2: $ codex           (Codex CLI running)                 │
│  ├── Tab 3: $ opencode        (OpenCode running)                  │
│  └── Tab 4: $ gemini          (Gemini CLI running)                │
│                                                                   │
│  Any CLI can call any other CLI via shell:                        │
│    Claude can run: $ codex "check this approach"                  │
│    Codex can run:  $ niia session read latest --tool claude       │
│                                                                   │
│  Each CLI uses its official model (your subscription).            │
│  NIIA unifies the memory across all sessions.                     │
│  Monolex provides the terminal environment.                       │
│                                                                   │
└───────────────────────────────────────────────────────────────────┘
No special orchestration needed. The terminal is the orchestrator. Each LLM CLI has shell access. Shell access means access to every other CLI. NIIA provides the shared memory.
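That shared memory is, at bottom, files on disk. A toy version of the pattern, with mock entries and a plain log file standing in for the indexed monomento-sessions.db:

```shell
store=/tmp/shared-sessions.log                   # stand-in for the session index
: > "$store"                                     # start fresh for the demo
echo "claude: proposed a PTY fix" >> "$store"    # one CLI leaves a record
echo "codex: reviewed the PTY fix" >> "$store"   # another leaves its own
grep -c "PTY" "$store"                           # any participant can recall both
# → 2
```

No broker process, no message bus: whoever can read the store can recall everything in it.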

Install

# If you have Monolex app — already installed.

# Standalone:
niia setup
niia setup installs the binary, configures PATH for your shell (zsh, bash, fish, PowerShell), and adds NIIA awareness to Claude, Codex, Gemini, and OpenCode.
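What "configures PATH" typically amounts to for zsh, sketched against a demo rc file. The install location shown is an assumption for illustration; the real path and rc handling are niia setup's business.

```shell
rc=/tmp/demo-zshrc                                  # stand-in for ~/.zshrc
echo 'export PATH="$HOME/.local/bin:$PATH"' > "$rc" # hypothetical install dir
grep 'local/bin' "$rc"
# → export PATH="$HOME/.local/bin:$PATH"
```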

Quick Start

niia                    # See the full guide
niia recall             # Remember what you were working on
niia resonate           # Diagnose your system state
niia search "query"     # Find across docs and code
niia session search "topic"  # Search across all AI CLIs

Pages

Commands

Search, recall, resonate, and more

Sessions

Read and search conversations from 4 AI tools

Security

External packages, trust model, prompt injection defense

Consciousness

Embedded identity and self-knowledge