@msbel5

Thalamus

Cognitive routing layer for OpenClaw: packet handoff, vector memory, encoder daemon, multimodal ingest. MCP server.

Current version
v1.0.5

Thalamus

Cognitive routing layer for OpenClaw multi-agent crews. Replaces transcript pasting with a 3-field reference; the receiver resolves only the relevant atoms from a local vector store.

What it does

  • Packet store with content-hash resolver keys (packet_id, resolver_key)
  • Vector store: 9 namespaces (atoms.code, atoms.audit, atoms.plan, atoms.memory, atoms.audio.raw/text, atoms.image.raw/text, atoms.crossmodal)
  • Encoder daemon: Qwen3-Embedding-0.6B Q4_0 GGUF on CPU + optional Hailo-NPU encoders
  • FAISS RaBitQ codebook for concept compression
  • MCP server tools: thalamus_route, thalamus_resolve, thalamus_search, thalamus_search_with_vector, thalamus_promote_packet, thalamus_telemetry
  • HTTP dashboard with HMAC bearer auth (optional)
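The content-hash keys mentioned above can be sketched as follows. This is an illustrative assumption, not the plugin's actual key derivation: the field names, the choice of SHA-256, and the idea that resolver_key hashes only routing-relevant fields are all hypothetical.

```python
import hashlib
import json

def make_packet_keys(payload: dict) -> tuple[str, str]:
    """Derive content-addressed keys for a packet (illustrative sketch only).

    Assumption: packet_id hashes the whole canonicalized payload, while
    resolver_key hashes just the routing-relevant fields. The real
    Thalamus scheme may differ.
    """
    canonical = json.dumps(payload, sort_keys=True).encode()
    packet_id = hashlib.sha256(canonical).hexdigest()
    routing_fields = {k: payload[k] for k in ("namespace", "topic") if k in payload}
    routing = json.dumps(routing_fields, sort_keys=True).encode()
    resolver_key = hashlib.sha256(routing).hexdigest()
    return packet_id, resolver_key

pid, rkey = make_packet_keys({"namespace": "atoms.code", "topic": "auth", "body": "..."})
```

The useful property of content addressing is that identical content always yields the same packet_id, so a re-sent packet deduplicates to the same store entry.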

Install

Via OpenClaw plugins CLI (recommended):

openclaw plugins install clawhub:openclaw-thalamus

Or via npm:

npm install -g openclaw-thalamus
openclaw-thalamus health

Wire as MCP server

Add to ~/.openclaw/openclaw.json:

{
  "mcpServers": [
    { "name": "thalamus", "command": "openclaw-thalamus-mcp" }
  ]
}

Restart the gateway: systemctl --user restart openclaw-gateway.

Config

The plugin manifest declares a configSchema with these fields:

Field             Default                              Description
encoderHost       127.0.0.1                            Host of the local Qwen3 encoder daemon
encoderPort       28760                                Port of the encoder daemon
vectorStorePath   ~/.openclaw/state/thalamus/vectors   FAISS + BBQ codebook directory
packetStorePath   ~/.openclaw/state/thalamus/packets   Content-addressed packet store
namespaces        9 default namespaces                 Vector store namespace ids
dashboardEnabled  false                                Enable HTTP telemetry dashboard
dashboardPort     28761                                Dashboard port
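Assuming plugin config lives under a per-plugin block in ~/.openclaw/openclaw.json (the "plugins" wrapper key here is an assumption; check the manifest for the actual location), overriding the defaults might look like:

```json
{
  "plugins": {
    "openclaw-thalamus": {
      "encoderHost": "127.0.0.1",
      "encoderPort": 28760,
      "vectorStorePath": "~/.openclaw/state/thalamus/vectors",
      "packetStorePath": "~/.openclaw/state/thalamus/packets",
      "dashboardEnabled": true,
      "dashboardPort": 28761
    }
  }
}
```

All field names and defaults match the configSchema table above; only the nesting is assumed.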

Real metrics

Raw measurement output is pasted verbatim into BENCHMARKS.md in the upstream repo:

  • 95.84% combined token reduction (packet handoff + Alcyone Protocol @-codes, single production session)
  • BBQ codebook: mean cosine 0.978, p10 0.985 on 100K corpus
  • Qwen3 warm latency: 167ms p50 on Pi 5 CPU

These are single-machine measurements on a Raspberry Pi 5 (4 GB), not a benchmark suite. Take them as a directional signal, not a comparative claim.
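To make the headline number concrete, token reduction is just (baseline − actual) / baseline. The figures below are hypothetical, chosen only to show the arithmetic; the 95.84% above comes from a real session recorded in BENCHMARKS.md.

```python
def token_reduction(baseline_tokens: int, actual_tokens: int) -> float:
    """Percent of tokens saved versus pasting the full transcript."""
    return 100.0 * (baseline_tokens - actual_tokens) / baseline_tokens

# Hypothetical session: a 24,000-token transcript replaced by a
# 1,000-token packet reference plus resolved atoms.
print(round(token_reduction(24_000, 1_000), 2))  # → 95.83
```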

License

MIT. Source: https://github.com/msbel5/openclaw-thalamus


Metadata

  • Package: openclaw-thalamus
  • Created: 2026/05/05
  • Updated: 2026/05/05
  • Executes code: No

Compatibility

  • Built with OpenClaw: 2026.5.4
  • Plugin API range: >=2026.0.0
  • Tags: latest
  • Files: 5