Systems

The stack under Dhee is cognitive infrastructure.

We focus on the systems around the model that make agents usable on long-running work: durable memory, routed context, and shared state for collaboration.

How we think

Build the layer the model cannot own well by itself.

Provider models will keep changing. We care about the substrate that should survive those changes and keep making agents better.

Principle

Own the cognitive layer

The differentiator is the memory and routing layer around the model.

Strategy

Route context instead of replaying everything

Useful cognition comes from selecting the right memory for the task.

Direction

Remove the orchestrator bottleneck

Any coding agent should be able to collaborate through shared memory.

The stack

Three systems, one direction.

Each layer exists to make memory more durable, context more precise, and multi-agent work less dependent on central orchestration.

Memory substrate

Durable state outside the model

Tasks, artifacts, and decisions survive past a single run.

Active

Context router

The right slice at the right time

Dhee decides what the active agent should see now and what should stay compressed.

Active

Shared agent fabric

Collaboration without an orchestrator

Agents can hand off work through shared memory instead of a separate orchestrator.

Active

Memory Graph

Persistent cognition substrate

Active

Stores what matters across sessions: tasks, artifacts, preferences, digests, checkpoints, and evolving state that should survive beyond one model run.
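The core idea — durable records that outlive any single model run — can be sketched in a few lines of Python. This is an illustrative assumption, not Dhee's actual API: `MemoryGraph`, `MemoryRecord`, and the JSON file backing are invented names for the sketch.

```python
import json
from dataclasses import dataclass, asdict
from pathlib import Path

# Hypothetical sketch: a store whose records survive process restarts.
# Names and schema are illustrative, not Dhee's real interface.

@dataclass
class MemoryRecord:
    kind: str        # e.g. "task", "artifact", "preference", "digest", "checkpoint"
    key: str
    payload: dict
    session_id: str  # the run that wrote it; the record outlives that run

class MemoryGraph:
    def __init__(self, path: Path):
        self.path = path
        self.records: list[MemoryRecord] = []
        if path.exists():
            # Reload everything written by earlier sessions.
            self.records = [MemoryRecord(**r) for r in json.loads(path.read_text())]

    def write(self, record: MemoryRecord) -> None:
        self.records.append(record)
        self.path.write_text(json.dumps([asdict(r) for r in self.records]))

    def query(self, kind: str) -> list[MemoryRecord]:
        return [r for r in self.records if r.kind == kind]
```

A second process opening the same path sees the first process's tasks and checkpoints, which is the property that matters here: state lives outside the model run.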

Context Router

Selective context delivery

Active

Chooses the right slice of memory for the current task so agents work with useful state instead of drowning in transcript replay.
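The selection step can be sketched with a toy relevance score. Everything here is an assumption for illustration — the word-overlap scoring, the fixed budget, and the `route_context` name stand in for whatever the real router does:

```python
# Hypothetical sketch of selective context delivery: rank stored memory
# entries against the current task, hand the agent only the top slice in
# full, and reduce the rest to short digests instead of replaying it all.

def route_context(task: str, memories: list[dict], budget: int = 2) -> dict:
    task_words = set(task.lower().split())

    def score(memory: dict) -> int:
        # Toy relevance: shared words with the task description.
        return len(task_words & set(memory["text"].lower().split()))

    ranked = sorted(memories, key=score, reverse=True)
    return {
        "active": [m["text"] for m in ranked[:budget]],                 # full detail
        "compressed": [m["text"][:40] + "…" for m in ranked[budget:]],  # digest only
    }
```

The shape of the output is the point: a small active slice plus compressed leftovers, rather than the whole transcript.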

Collaboration Fabric

Shared continuity for multiple agents

Active

Lets coding agents coordinate through common memory and artifacts directly, removing the need for a separate orchestrator agent to control every step.

Why this matters

Better agents come from better state management.

When memory survives provider churn, context is routed instead of dumped, and collaboration happens through shared state, agents become more reliable without pretending the model alone solved cognition.

Why now

Model-native memory is not enough

Session-scoped memory locked inside a single provider breaks as soon as the work runs long or the provider changes.

What we build

Infrastructure around the model, not another wrapper

We focus on memory, routing, and shared continuity around the model.

Where it leads

Self-evolving agents with stable context

As the substrate improves, agents get better at resuming, coordinating, and adapting.