Beyond π-calculus: How Datom.world Unifies Streams and Continuations as First-Class Values
The Problem with Distributed Systems Today
Modern distributed systems are stuck in a conceptual trap. We have:
- Message passing systems (Kafka, NATS) that move data efficiently
- Remote procedure call systems (gRPC, HTTP/REST) that invoke code remotely
- Container orchestration (Kubernetes) that migrates entire processes
But these are three separate mechanisms for three separate concerns. Want to build a system where agents coordinate, migrate between devices, and share both information and behavior? You'll need to stitch together multiple paradigms, each with its own assumptions and limitations.
What if there were a single, unified model that handled all three naturally?
Enter π-calculus: The Theory of Mobile Channels
In 1989, Robin Milner, together with Joachim Parrow and David Walker, introduced the π-calculus, a formal model of concurrent computation. Its key innovation was channel mobility - the ability to pass communication channels themselves as values:
x̄⟨y⟩.P | x(z).Q
Process 1: Send channel y over channel x
Process 2: Receive a channel (call it z) from x, then use z

This simple idea enabled something profound: dynamic communication topology. Processes could learn about new communication channels at runtime, share them with others, and reconfigure the network on the fly.
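To make the mobility concrete, here is a sketch of the same two processes in Clojure using core.async channels (an illustration only - core.async is not part of Datom.world):

```clojure
;; Channel mobility sketched with clojure.core.async:
;; channel y is itself sent as a value over channel x.
(require '[clojure.core.async :as a])

(let [x (a/chan)   ; the carrier channel
      y (a/chan)]  ; the channel transmitted as a value
  ;; Process 1: send channel y over channel x, then use y
  (a/thread
    (a/>!! x y)
    (a/>!! y :hello))
  ;; Process 2: receive a channel (call it z) from x, then read from z
  (let [z (a/<!! x)]
    (println (a/<!! z))))  ; prints :hello
```

Process 2 never knew about y ahead of time; it learned the channel at runtime by receiving it as an ordinary value.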
π-calculus became a theoretical touchstone for concurrent programming - it directly underlies languages such as Pict and occam-π, is frequently cited in connection with Erlang-style message passing, and influenced everything from the actor model to cloud computing.
But π-calculus has a limitation: processes themselves are not first-class values. You can send channel names, but not the code that runs on those channels.
The Missing Piece: Code Mobility
Several extensions tried to add code mobility:
- Higher-Order π-calculus (1992) - processes become transmissible values, but remain a type distinct from channels
- Mobile Ambients (1998) - processes migrate between locations
- Join Calculus (1996) - sophisticated synchronization patterns
Each added code mobility, but each maintained a type distinction between data, channels, and processes: they are treated as fundamentally different kinds of things, with different transmission mechanisms and different handling.
This matters in practice. When you want to build a system where:
- Agents discover new communication channels (channel mobility)
- Agents share successful behaviors (code mobility)
- Agents migrate between devices carrying both channels and code
- All of this happens through the same uniform mechanism
...you hit conceptual walls. The type systems fight you. The protocols diverge. The implementation becomes a maze of special cases.
Datom.world: Everything is Data
Datom.world takes a radically different approach, grounded in a simple observation:
Streams don't have to be objects. Continuations don't have to be runtime entities. Both can be data.
Streams as Data Structures
In Datom.world, a stream isn't a file handle or socket connection. It's a data structure:
{:id #uuid "550e8400-e29b-41d4-a716-446655440000"
:name "stream://sensors/temp"
:metadata {:storage :persistent
:retention {:days 30}
:created-at 1734567890000}
:buffer [[:sensor-1 :temperature 23.5 1734567890001 {}]
[:sensor-1 :temperature 24.1 1734567890002 {}]]
:subscribers #{:agent-1 :agent-2}}

Just data. You can serialize it, send it, store it, inspect it. An interpreter materializes the actual I/O when needed.
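Because the stream is plain EDN, the "serialize it, send it, store it" claim can be demonstrated directly. A minimal sketch, with values invented for illustration:

```clojure
;; A stream definition round-trips through a string with no special
;; machinery - it is just data.
(def stream
  {:id #uuid "550e8400-e29b-41d4-a716-446655440000"
   :name "stream://sensors/temp"
   :metadata {:storage :persistent}
   :buffer [[:sensor-1 :temperature 23.5 1734567890001 {}]]
   :subscribers #{:agent-1}})

(= stream (read-string (pr-str stream)))  ; => true
```

The same string could be written to disk, published on another stream, or sent across the network - the representation never changes.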
Continuations as Data Structures
Similarly, a continuation isn't a call stack or thread state. It's a data structure:
{:code (fn [state streams]
(let [datom (first (pull (first streams) 0 1))]
(process-datom state datom)))
:state {:position [10 20]
:energy 100
:counter 42}
:streams [#uuid "stream-1-id" #uuid "stream-2-id"]}

Just data. You can serialize it, send it, store it. An interpreter executes it when needed.
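One practical caveat worth noting: a compiled Clojure fn does not print readably, so a continuation that must cross the wire would typically store its :code as a quoted form and let the receiving interpreter eval it. A minimal sketch (not the project's actual mechanism):

```clojure
;; Storing :code as a quoted form so the whole continuation survives
;; pr-str/read-string (a compiled fn object would not).
(def cont
  {:code '(fn [state streams]
            (update state :counter inc))
   :state {:counter 41 :energy 100}
   :streams [#uuid "550e8400-e29b-41d4-a716-446655440000"]})

;; Round-trip through a string, then materialize and run the code.
(def restored (read-string (pr-str cont)))
(def step (eval (:code restored)))
(step (:state restored) (:streams restored))
;; :counter is now 42
```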
The Universal Container: The Datom
Both flow through the same mechanism - the datom, a 5-tuple:
[entity attribute value timestamp metadata]
;; Examples where value is different types:
[:sensor-1 :temperature 42 1734567890000 {}]
[:agent-1 :discovered "stream://food" 1734567890001 {}]
[:agent-1 :stream-def {:id #uuid "..." :name "..." ...} 1734567890002 {}]
[:agent-1 :behavior (fn [state] ...) 1734567890003 {}]

One type. One transmission mechanism. One storage format.
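An interpreter gives these tuples meaning by inspecting the value slot. A sketch with a hypothetical handle-datom helper - the dispatch rules here are invented for illustration:

```clojure
;; One handler, one tuple shape - the value slot decides the meaning.
(defn handle-datom [[e a v t m]]
  (cond
    (and (map? v) (:buffer v)) {:kind :stream-definition :from e}
    (seq? v)                   {:kind :behavior :from e}  ; quoted code form
    :else                      {:kind :data :value v :from e}))

(handle-datom [:sensor-1 :temperature 42 1734567890000 {}])
;; => {:kind :data, :value 42, :from :sensor-1}
```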
The Power of Unification
This seemingly simple move - treating streams and continuations as data - unlocks capabilities that are awkward or impossible in traditional process calculi.
1. Self-Contained Migration
An agent can migrate between devices carrying everything it needs:
;; Agent packages itself
(def agent-package
{:id :agent-42
:continuation
{:code (fn [state streams]
(let [pheromones (pull (first streams) 0 10)]
(follow-trail pheromones state)))
:state {:position [10 20]
:energy 100}
:pc 42}
:streams
[{:id #uuid "a1b2c3d4-..."
:name "stream://pheromones/trail-1"
:metadata {:storage :ephemeral
:created-at 1734567890000}
:buffer [[:ant-1 :pheromone 0.9 1734567890001 {:location [10 20]}]
[:ant-1 :pheromone 0.8 1734567890002 {:location [11 21]}]]
:subscribers #{:agent-42}}
{:id #uuid "e5f6g7h8-..."
:name "stream://food/locations"
:metadata {:storage :persistent}
:buffer [[:scout :food-found [50 60] 1734567890003 {}]]
:subscribers #{:agent-42}}]})
;; Send entire package as one datom
(publish "stream://migration/device-b"
[:migration :agent agent-package (now) {}])

The agent carries not just references to streams, but the complete stream definitions. No global registry needed. No name resolution required. Self-contained.
2. Behavioral Evolution Through Transmission
Successful behaviors spread like information:
;; Ant discovers efficient foraging strategy
(def smart-behavior
(fn [state]
(let [pheromones (sense-environment state)]
(if (strong-pheromone? pheromones)
(follow-gradient pheromones state)
(explore-efficiently state)))))
;; Share behavior as a value
(publish "stream://behaviors"
[:ant-1 :strategy smart-behavior (now)
{:success-rate 0.85
:generation 5}])
;; Other ants receive and adopt
(subscribe "stream://behaviors"
(fn [[entity _ behavior _ meta]]
(when (> (:success-rate meta) @my-success-rate)
;; Adopt the better behavior
(reset! my-behavior behavior)
(println "Adopted behavior from" entity))))

Code propagates through the same channels as data. No separate deployment mechanism. No distinction between "sending information about food" and "sending how to find food."
3. Stigmergic Programming
Perhaps most striking: code can be deposited in the environment like pheromones:
;; Leave behavior in environment
(defn leave-code-trail [location behavior]
(publish (str "stream://env/" location "/behaviors")
[:code :trail behavior (now)
{:strength 1.0
:decay-rate 0.95
:deposited-at location}]))
;; Agents encounter and execute environmental code
(defn ant-explore [location]
(let [behaviors (pull (str "stream://env/" location "/behaviors")
0 10)]
;; Execute behaviors found in environment
(doseq [[_ _ behavior _ meta] behaviors]
(when (> (:strength meta) 0.5)
;; Execute code found in environment!
(try
(behavior @my-state)
(catch Exception e
(println "Behavior failed:" (.getMessage e))))))))
;; Example: Leave "return to nest" behavior near food
(when (found-food? @state)
(leave-code-trail
(:position @state)
(fn [state]
(assoc state :mode :returning-home
:target nest-location))))

The environment contains not just data but executable behavior. Agents program each other indirectly by leaving code in shared spaces. This is stigmergic computation - coordination through environmental modification, extended to code itself.
4. Complete System Introspection
Since everything is data, the system can examine itself completely:
;; System snapshot
(defn introspect-system []
{:agents (vec (map (fn [agent]
{:id (:id agent)
:continuation (serialize-continuation (:cont agent))
:streams (:streams agent)})
@active-agents))
:streams (vec (vals @stream-registry))
:topology (build-topology-graph @active-agents)
:timestamp (now)})
;; Serialize entire system to disk
(spit "system-snapshot.edn"
(pr-str (introspect-system)))
;; Later, restore exact state
(defn restore-system [snapshot-path]
(let [snapshot (read-string (slurp snapshot-path))]
(doseq [agent-data (:agents snapshot)]
(spawn-agent agent-data))
(doseq [stream-data (:streams snapshot)]
(restore-stream stream-data))))

No special introspection APIs. No debugger protocols. The system's state is data, so it's introspectable by definition.
5. Protocol Evolution Without Downtime
Streams can evolve by sending new definitions:
;; Stream v1
(def stream-v1
{:id #uuid "550e8400-..."
:name "stream://sensors/temp"
:metadata {:version 1
:schema {:v :number}}})
;; Stream v2 with richer data and migration function
(def stream-v2
{:id #uuid "550e8400-..." ; Same ID
:name "stream://sensors/temp"
:metadata {:version 2
:schema {:v {:temp :number
:humidity :number}}
:migrate-from-v1
(fn [old-datom]
(let [[e a v t m] old-datom]
[e a {:temp v :humidity nil} t
(assoc m :migrated true)]))}})
;; Publish upgrade
(publish "stream://system/updates"
[:system :stream-upgrade stream-v2 (now) {}])
;; Agents upgrade themselves
(subscribe "stream://system/updates"
(fn [[_ _ new-stream-def _ _]]
(when (= (:id new-stream-def)
(:id @my-stream))
(upgrade-stream! new-stream-def)
(println "Upgraded to version"
(get-in new-stream-def [:metadata :version])))))

No restart required. No coordination protocol. Just send the new definition as data.
The Theoretical Implication
What Datom.world has created is, to our knowledge, the first process calculus with complete unification of channels and processes as values.
The formal system would look like:
Syntax:
P ::= 0 (nil)
| x̄⟨d⟩.P (output datom)
| x(d).P (input datom)
| P | Q (parallel)
| (νx)P (new stream)
| !P (replication)
where:
d = [e, a, v, t, m] (datom)
v ∈ {data, stream-data, continuation-data, nested}
Key properties:
- No type distinction between values
- Same transmission mechanism for all
- Interpreters materialize meaning
- System is reflexive (can represent itself)

This could be called Uniform π-calculus or Datom calculus - a process calculus where the distinction between data, channels, and processes collapses into interpreted data structures.
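The grammar above still needs a communication rule. A sketch in the same notation - ours, not a published definition - would mirror the standard π-calculus reduction, substituting the transmitted datom:

x̄⟨d⟩.P | x(d').Q → P | Q{d/d'}

Because the value slot of d may carry plain data, stream-data, or continuation-data, this single rule covers ordinary message passing, channel mobility, and code mobility respectively.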
Why Interpreters Make This Possible
The crucial insight that enables unification is this:
Don't reify abstractions into runtime objects. Keep everything as data. Let interpreters give meaning contextually.
Traditional systems:
// Stream = Object with methods
class Stream {
private Socket socket;
public void write(byte[] data) { ... }
}
// Process = Thread
Thread thread = new Thread(() -> { ... });
// Problem: Can't serialize objects, can't transmit threads

Datom.world:
;; Stream = Data structure
{:id #uuid "..."
:name "stream://x"
:metadata {...}}
;; Continuation = Data structure
{:code (fn [s] ...)
:state {...}}
;; Interpreters materialize meaning
(defn materialize-stream [stream-data]
(case (get-in stream-data [:metadata :storage])
:tcp (tcp-connect (get-in stream-data [:metadata :host])
(get-in stream-data [:metadata :port]))
:file (io/file (get-in stream-data [:metadata :path]))
:memory (atom (:buffer stream-data))))
(defn execute-continuation [cont-data]
(let [{:keys [code state streams]} cont-data]
(code state streams)))

The stream data structure is passive until an interpreter materializes it as an actual connection. The continuation data structure is inert until an interpreter executes it.
Meaning emerges from interpretation, not from inherent object identity.
This is similar to Lisp's homoiconicity (code as data), but applied to distributed systems: everything the system manipulates - data, channels, code - is represented uniformly as data structures.
Practical Implications
This isn't just theoretical elegance. The unification enables:
For Mobile Agent Systems:
- Agents migrate between devices fully self-contained
- No need for global name registries
- Automatic adaptation to new environments
For Swarm Coordination:
- Behaviors evolve and spread organically
- Successful strategies propagate naturally
- System learns through code transmission
For IoT/Edge Computing:
- Processes migrate to where data is
- Computation follows opportunity
- Network topology reconfigures dynamically
For Distributed Debugging:
- Complete system state is inspectable
- Snapshots capture everything
- Time-travel debugging is natural
For Organic System Evolution:
- Protocols upgrade without coordination
- New behaviors can be injected
- System programs itself
A Complete Example: Ant Swarm Evolution
Let's see how this all comes together in a practical example:
;; Generation 1: Simple wandering ant
(def gen1-behavior
(fn [state]
(if (see-food? state)
(collect-food state)
(random-walk state))))
;; Ant uses this behavior, succeeds 40% of time
(def ant-1
{:id :ant-1
:behavior gen1-behavior
:success-rate (atom 0.4)
:streams [{:id #uuid "env-1-..."
:name "stream://environment/sector-1"}]})
;; Generation 2: Ant improves behavior
(def gen2-behavior
(fn [state]
(let [pheromones (sense-pheromones state)]
(if (strong-pheromone? pheromones)
(follow-gradient pheromones state)
(gen1-behavior state))))) ; Incorporates previous!
;; Ant-2 discovers this works better (70% success)
;; Shares the improved behavior
(when (> @(:success-rate ant-2) 0.6)
(publish "stream://swarm/behaviors"
[:ant-2 :improved-strategy gen2-behavior (now)
{:success-rate @(:success-rate ant-2)
:parent-behavior :gen1
:innovation "pheromone-following"}]))
;; Other ants adopt successful behaviors
(subscribe "stream://swarm/behaviors"
(fn [[entity _ new-behavior _ meta]]
(when (> (:success-rate meta) @my-success-rate)
(println "Ant" @my-id "adopting behavior from" entity)
(reset! my-behavior new-behavior)
;; Mutate and share further improvements
(future
(Thread/sleep 60000)
(when (improved? @my-success-rate)
(let [mutated (add-random-exploration new-behavior)]
(publish "stream://swarm/behaviors"
[@my-id :mutation mutated (now)
{:success-rate @my-success-rate
:parent entity}])))))))
;; Generation 3: System evolves organically
;; Best behaviors spread, bad ones die out
;; No central coordination needed
;; Code literally evolves through transmission

The Road Ahead
Datom.world is being built for real-world applications - ant-inspired swarm coordination, agent-based systems, and distributed computation on mesh networks. But the theoretical foundation suggests something broader: a new way of thinking about distributed systems where the boundaries between data, communication, and computation blur into a unified whole.
The key moves:
- Streams as data structures (not objects)
- Continuations as data structures (not runtime entities)
- Uniform transmission (same mechanism for all)
- Interpretation materializes meaning (context-dependent)
- System is reflexive (can represent itself completely)
By treating everything as data and using interpreters to materialize meaning, we achieve what process calculi have been reaching toward for decades: true unification of information mobility, channel mobility, and code mobility.
The future of distributed systems may not be about better type systems or more sophisticated protocols, but about collapsing these distinctions entirely - recognizing that at the deepest level, it's all just data flowing through interpreted structures.
Learn More
Datom.world is being developed as an open platform for distributed, stigmergic computation. The project combines:
- DaoStream: The streaming protocol with unified values
- Yin.VM: Continuation-based virtual machine with migration
- Stigmergic coordination: Ant-inspired swarm intelligence