Everything is a plugin. The framework just stewards them.
Evo is a Rust framework for appliance-class devices where features are independent, signed plugins composed by a central steward that knows nothing about the domain. Adding a capability is a new plugin. Replacing one is replacing one signed file. The system stays coherent as the plugin set grows.
The problem
Appliance-class devices keep being built on stacks that were designed for something else. A web runtime running as a backend daemon. An event loop holding global state that two plugins both want to mutate. A monolithic application that admits "extensions" but still owns the truth. The result is a class of device that starts clean, accumulates race conditions, develops a slow response surface, and ships a new feature only by editing the centre. The user feels it as the device becoming worse over time. The maintainer feels it as fear of the next change. Both are symptoms of the same architectural choice, made early, which the system can no longer escape.
The move
Evo inverts the arrangement. The framework has no domain knowledge. The device is described by a catalogue file, a small data document that declares which concerns the device has, plus a set of plugins that satisfy those concerns. There is no central application that plugins are guests of. The catalogue is the device's specification; the framework, which evo calls the steward, administers it. Plugins contribute, and the steward composes their contributions into the coherent thing the user sees.
Why this is different
If you have used a device whose backend is a Node.js or Python application server: the application owns the truth. Plugins are callbacks the application invokes. Two plugins interested in the same state coordinate through shared variables, and the application is responsible for serialising them, which it usually does by accident. Race conditions are the default; absence of races is a feature you pay for plugin by plugin. In evo, the steward is the only place state changes; plugins never see each other; the race condition is not mitigated, it is structurally absent.
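The shape of that guarantee can be sketched outside evo: when exactly one thread owns the state and everything else communicates by message, there is nothing left to serialise. A minimal illustration in Python (not evo code; every name here is invented for the sketch):

```python
import threading, queue

def steward(inbox: queue.Queue, state: dict):
    # The steward thread is the only code that ever mutates `state`.
    while True:
        msg = inbox.get()
        if msg is None:          # shutdown sentinel
            break
        key, value = msg
        state[key] = value       # sole mutation point; no locks needed

inbox = queue.Queue()
state = {}
t = threading.Thread(target=steward, args=(inbox, state))
t.start()

# "Plugins" never touch `state`; they only enqueue contributions.
inbox.put(("volume", 40))
inbox.put(("source", "mpd"))
inbox.put(None)
t.join()
print(state)  # {'volume': 40, 'source': 'mpd'}
```

Two plugins interested in the same key cannot race: their messages arrive in some order, and the single owner applies them in that order.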
If you have built a plugin for an audio appliance: the plugin model is usually a fixed set of extension points (decoder, source, output) hung off the side of a core application. The application owns playback, metadata, networking, the UI; plugins narrow the gaps. Replacing the playback engine is forking the project. In evo, the playback engine itself is a plugin. So is metadata, artwork, the network manager. Nothing is privileged. Replacing the playback engine is replacing one signed artefact.
If you have shipped firmware with a release cycle measured in months: every change ships together because the build is monolithic. Fixing a typo in an ALSA parameter means cutting a release. In evo, a typo is a one-line config edit on the device; a plugin bug fix is replacing one binary; a framework update is a deliberate act. Three independent release cadences, three independent signing keys, three planes (source, artefact, operational) that never blur into each other.
What this discipline buys
- Adding a feature is a new plugin. The framework is unchanged.
- Replacing a component is replacing one signed file. Nothing else moves.
- Two vendors building similar devices share the same brand-neutral plugins; only the vendor layer differs.
- Fixing a typo in a hardware parameter is a one-line config edit, not a rebuild and a re-flash.
- The integration cost of the plugin ecosystem stays flat as it grows. Plugin authors learn the contract once.
- Anywhere with a CPU, RAM, and a bit of storage is in scope. From a wearable counting heart-rate to a datacenter inference rack, the same fabric vocabulary applies.
The framework runs on Unix today (eight architectures, three ARM profiles, glibc and musl). The contracts the framework defines do not assume Unix; ports to RTOS, bare-metal, and embedded runtimes are open invitations. The distributions page shows the spread.
Light by design
Pure Rust. No garbage collector, no JavaScript runtime, no PHP, no JVM, no shell pipelines holding state. The framework is one long-running process and the plugins it admits, and every architectural choice above this line is also an efficiency choice.
A typical evo steward uses tens of MB of memory at idle and a fraction of a percent of CPU. The same workload on a Node.js or Python application server typically costs hundreds of MB of memory, a JIT or interpreter warm-up window, and periodic GC pauses. The runtime overhead a VM-hosted stack pays is simply not paid here.
The consequences are practical. A wearable's battery lasts longer. A datacenter's cooling bill is smaller. A satellite's solar panel covers more workload. Less work uses less power; that is the whole story.
Three tiers
Evo is not framework + your app. It is three tiers, each with its own release cadence, its own signing key, and its own job.
The framework holds the middle. The four contracts at the top boundary are the SDK, the wire protocol, the packaging shape, and the catalogue shape. The single contract at the bottom is the client socket. Inside the steward, the orbital flow is what the steward does continuously: admit, place, compose, dispatch, project, notify. It is the same heartbeat the framework keeps everywhere.
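The orbital flow reads naturally as a fixed pipeline run on every tick. A toy Python rendering, purely illustrative: the phase functions below are invented stand-ins for the doc's vocabulary, not the framework's API.

```python
def admit(plugins):            # reject anything unsigned
    return [p for p in plugins if p.get("signed")]

def place(admitted):           # index contributions by shelf
    return {p["shelf"]: p for p in admitted}

def compose(placed):           # fold contributions into one view
    return {shelf: p["contribution"] for shelf, p in placed.items()}

def dispatch(view, request):   # route a request against the view
    return view.get(request)

def project(result):           # shape the result for clients
    return {"payload": result}

def notify(projection):        # would push to subscribed clients
    return projection

def tick(plugins, request):
    view = compose(place(admit(plugins)))
    return notify(project(dispatch(view, request)))

plugins = [
    {"shelf": "echo",  "signed": True,  "contribution": "hello"},
    {"shelf": "rogue", "signed": False, "contribution": "ignored"},
]
print(tick(plugins, "echo"))  # {'payload': 'hello'}
```

The point of the sketch is the ordering: nothing is dispatched that was not first admitted, placed, and composed, which is what makes the heartbeat the same everywhere.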
A real device exists
evo-device-audio
The reference generic device for the audio domain. Brand-neutral plugins under org.evoframework.*: MPD playback, ALSA composition, file-tag metadata, local artwork. Used today as the canonical demonstration of every framework function.
Your distribution here
The framework supports an open set of vendor distributions. Adopt a reference generic device by name, add catalogue choices, branding, product-specific plugins, and packaging. The pattern is the same for every domain. The BOUNDARY document is the normative checklist.
Sixty seconds
A steward, a one-shelf catalogue, and a client. No audio, no hardware. The fabric is the same shape regardless of domain.
# Clone, build, and run a steward locally:
git clone https://github.com/foonerd/evo-core.git && cd evo-core
cargo build --workspace
mkdir -p /tmp/evo
cat > /tmp/evo/catalogue.toml <<'EOF'
[[racks]]
name = "example"
family = "domain"
kinds = ["registrar"]
charter = "Minimal example rack."
[[racks.shelves]]
name = "echo"
shape = 1
description = "Echoes inputs back."
EOF
cargo run -p evo -- \
--catalogue /tmp/evo/catalogue.toml \
--socket /tmp/evo/evo.sock \
--log-level info

In another terminal, probe it:
import socket, struct, json, base64
s = socket.socket(socket.AF_UNIX); s.connect("/tmp/evo/evo.sock")
req = json.dumps({
    "op": "request", "shelf": "example.echo", "request_type": "echo",
    "payload_b64": base64.b64encode(b"hello").decode()
}).encode()
s.sendall(struct.pack(">I", len(req)) + req)  # sendall: send() may write short
n = struct.unpack(">I", s.recv(4))[0]
resp = json.loads(s.recv(n).decode())
print(base64.b64decode(resp["payload_b64"]).decode())  # -> "hello"

The full client protocol with example clients in seven languages is in CLIENT_API.md.
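The framing the client relies on, a 4-byte big-endian length prefix followed by a JSON body, is easy to factor into two helpers. A sketch (the helper names are ours, not from CLIENT_API.md), with the read side looping because a single recv() is allowed to return fewer bytes than asked:

```python
import struct, json

def frame(obj) -> bytes:
    """Encode a dict as length-prefixed JSON: 4-byte big-endian length, then body."""
    body = json.dumps(obj).encode()
    return struct.pack(">I", len(body)) + body

def read_frame(sock) -> dict:
    """Read one length-prefixed JSON message, looping until all bytes arrive."""
    def recv_exact(n):
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("peer closed mid-frame")
            buf += chunk
        return buf
    (n,) = struct.unpack(">I", recv_exact(4))
    return json.loads(recv_exact(n).decode())
```

With these, the probe above collapses to a frame() on send and a read_frame() on receive, and the short-read case is handled once instead of in every client.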
The words evo uses
Evo's vocabulary is load-bearing. Every word below appears with the same meaning across the framework, the documentation, and the catalogue files.
Where to go next
Project status
Pre-1.0. The framework runtime and the SDK are written; the audio reference generic device exists and is the canonical demonstration of every core function; vendor distributions are an open invitation, not yet announced.
The framework is licensed under the Business Source License 1.1; the SDK, the operator CLI, the trust primitive, the proc macro, and the example plugins are licensed under the Apache License 2.0. The names "evo" and "evoframework" are trademarks; see the trademark policy.
Single-maintainer-driven. Open to contributions on the contracts the engineering documents name. Issue and PR flow lives on GitHub.