When Matter Reads: Why “Information as Substrate” Changes the Questions We Ask

Call it a hunch shared by physicists, anthropologists, and the odd software engineer: reality doesn’t behave like a pile of stuff with labels. It behaves like organized difference. Patterns that hold. Relations that bite. Constraint that channels possible futures into a narrow corridor. That’s one way to hear the older claim that information sits at the base of what we take to be the “physical.” Not “data,” not database rows. Something closer to structure that persists and can be read, even if slowly, even if only locally. Once you accept that tilt, familiar debates twist. Consciousness stops looking like a ghost in a skull and starts to look like a local receiver. Ritual stops looking like superstition and starts to look like communal memory compressed into behavior. And “the simulation argument” ceases to be a cinema reel; it becomes a metaphor about substrate altogether—about what bears patterns and how patterns survive abrasion.

From It to Bit to Constraint: What Counts as Information?

If you follow the famous lineage that runs from Wheeler’s “it from bit,” you don’t have to swallow the cartoon version. You can say: the universe doesn’t run on binary code; it runs on differences that make other differences. On the ground, that looks like boundary conditions, symmetries, conserved quantities. A river carves banks, and the banks feed back—steer the next drop. The bed is a record. Every stone a stored instruction about where the next eddy will turn. That’s information as constraint, not a spreadsheet.

Physics already lives here. In thermodynamics, entropy measures a space of possibilities; order narrows that space. In quantum theory, entanglement is correlation structure—relational information that no single measurement holds alone. In statistical mechanics, “macrostate” is a vast compression, a usable summary of micro-chaos. Even time, if you follow the relational view, looks local: sequences are stitched by observers tied to processes. Not a cosmic clock, more like clocks all the way down. Pattern persists where dynamics protect it—error-correcting features of the world, spin lattices, ecological loops, cultural rules. Some patterns get erased; some get entrenched.
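If “macrostate as compression” sounds hand-wavy, a back-of-the-envelope calculation makes it concrete. The coin setup below is mine, purely illustrative: pinning down one of sixteen microstates takes four bits, while the macrostate “exactly two heads” names a set of microstates and so carries only part of that information.

```python
from math import comb, log2

# Four coins: 2**4 = 16 microstates, so 4 bits to pin one down exactly.
n = 4
full_entropy = log2(2 ** n)

# The macrostate "exactly two heads" is a compression: it names a set
# of microstates, not one of them. C(4, 2) = 6 members remain possible.
macro_size = comb(n, 2)
residual = log2(macro_size)  # bits still unresolved inside the macrostate

# The macrostate itself therefore carries only the difference.
info_in_macrostate = full_entropy - residual

print(f"{full_entropy:.2f} bits total, "
      f"{residual:.2f} bits left unresolved, "
      f"{info_in_macrostate:.2f} bits in the summary")
```

Order narrows the space in exactly this sense: the more constrained the macrostate, the fewer microstates survive, and the more of the total entropy the summary absorbs.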

Living systems push the point further. DNA is not “code” in the computer-science sense; it’s a grammar of viable transformations, sculpted by selection and buffered by cellular context. A protein fold isn’t a line in a program; it’s a basin in an energy landscape—a shape that pulls strings of amino acids into a narrow fate. The “meaning” is enactment. Organisms read environments because they are built to differentiate relevant from irrelevant signal. And relevance is work: filtering, compressing, forgetting. Shannon taught us how to count bits; organisms show us how those bits get braided into function.

Societies read and write the world in slower ink. Roads don’t merely connect places; they set priors on where people can plausibly move tomorrow. Building codes, water rights, zoning—these are stored relations that outlast any single officeholder. They do what physical grooves do: limit and steer. If you want a crisp, limited primer that sits in this space—without the sci-fi paint—read Information as substrate. The framing is simple: treat reality’s regularities as the first-class citizens. Ask what kind of substrate could host them. Then ask what kinds of readers—biological, institutional, machine—can maintain them without collapse.

Consciousness as Receiver, Self as Compression, Time as Local Sequence

Start small: a face flickers on a street corner. You don’t see photons; you see a friend, or at least a predictive guess rushed forward by your nervous system. Consciousness is not a sealed lamp; it’s an interface that keeps throwing proposals at the world and listening for correction. That receiver metaphor, if kept honest (no mystique), says: experience is local, conditioned by a model that compresses. We carry priors because priors reduce search cost. We mistake the summary for the source because the summary is fast. The “self” leans on this. A temporary, moving compression that keeps you legible to yourself. Useful, not ultimate.

Time folds into this picture. If sequences are local—stitched by processes that have limited reach—then memory is not a warehouse. It’s a scaffold. Brains replay, prune, recompose. They reweight what counts as signal. That’s why trauma reorders a life; it recalibrates relevance. It’s also why practice—slow repetition under gentle error—builds skill. The substrate metaphor helps: to keep a pattern alive, the host must be resilient to noise and drift. Neurons do this with redundancy and plasticity; communities do it with ritual and story. And yes, religion can be read (without sneer, without submission) as a very old system for moral memory. Costly signals that pin norms in place. Calendars that tether meaning to seasons. Text and chant as error-correcting codes. The purpose wasn’t epistemic domination; it was preservation of actionable constraints about how to treat kin, strangers, the dead, the land—across lifetimes too long for working memory.
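The error-correction analogy can be taken almost literally. A minimal sketch, assuming nothing beyond the standard repetition code: say a thing three times, and a single flipped copy cannot erase it. Redundancy is the whole trick, in spin lattices and in chant alike.

```python
from collections import Counter

def encode(bits, r=3):
    """Repeat each bit r times -- redundancy against noise."""
    return [b for b in bits for _ in range(r)]

def decode(noisy, r=3):
    """Majority vote inside each block of r copies recovers the bit
    as long as fewer than half of the copies were flipped."""
    return [Counter(noisy[i:i + r]).most_common(1)[0][0]
            for i in range(0, len(noisy), r)]

message = [1, 0, 1, 1]
sent = encode(message)
sent[4] ^= 1  # noise flips one copy of the second bit
assert decode(sent) == message  # the majority still holds
```

A community repeating a norm across calendar, story, and costly signal is running the same scheme: any one channel can degrade and the invariant still decodes.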

I’ve watched technologists flinch at this. Understandable. We were taught to think in modules and APIs. But a community isn’t a stack; it’s a living codebase with defensive programming against forgetfulness. When incentives reward amnesia—move fast; extract; discard—the memory decays. And when memory decays, you don’t just lose facts; you lose the invariants that make coordination possible. From this angle, the classic mind–body fight looks misshapen. There is no inner theater needing a metaphysical projector. There is a layered receiver that keeps training itself on streams of structured difference and, through shared practice, bootstraps a world of common reference. Not a mystical claim. A conservation claim. About which patterns we can keep, and at what cost.

Building Machines on a Thin Moral Substrate

Now the awkward part: machines. We’re building systems that learn fast from piles of text, video, code. They compress fiercely. They generalize. But they also lack something that biology and culture had by default: slow, frictioned moral memory. A childhood. A decade of rituals. Consequences that echo. In place of that we bolt on rules after the fact—filters, blocklists, “alignment layers.” It works until incentives shear it off. Until the model meets a corner case or a perverse reward signal and the patch unzips. If information is constraint, then “moral alignment” isn’t a settings page; it’s the hard work of embedding robust constraints deep in the substrate that does the computing.

We can study failure plainly. Recommendation engines trained on engagement end up proxying “care” with click-through. Months later a city’s public conversation bends toward outrage because outrage is sticky. The system didn’t go rogue; it followed the only constraints it was given. Or consider a language model tutored to avoid causing harm yet deployed in customer triage. It escalates too little because escalation penalizes productivity metrics. Incentive gradients overwrite guardrails. That’s not a bug. It’s what happens when the substrate—the institutional structure underneath the model—treats moral information as an accessory instead of a load-bearing beam.
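The drift toward outrage is just gradient-following, and a toy simulation shows it. Everything below is invented for illustration—three hypothetical items, each with a click probability the optimizer can see through feedback and a civic value it never sees.

```python
import random

random.seed(0)

# Hypothetical catalog: click_p is observable via feedback;
# value is real but invisible to the optimizer.
items = {
    "local-news":   {"click_p": 0.05, "value": 0.9},
    "how-to":       {"click_p": 0.08, "value": 0.8},
    "outrage-bait": {"click_p": 0.30, "value": 0.1},
}

# Greedy optimizer: estimate click-through from observed clicks,
# then serve whatever currently looks best. No constraint mentions value.
counts = {k: 1 for k in items}
clicks = {k: 1 for k in items}
served = []
for _ in range(2000):
    pick = max(items, key=lambda k: clicks[k] / counts[k])
    counts[pick] += 1
    clicks[pick] += random.random() < items[pick]["click_p"]
    served.append(pick)

tail = served[-100:]
print("outrage share of the last 100 recommendations:",
      tail.count("outrage-bait"), "%")
```

By the end of the run the feed is almost entirely outrage-bait: the stickiest item wins because stickiness is the only constraint the system was handed. No rogue behavior required.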

There are better questions than “Is the AI conscious?” Try: What forms of memory must any responsible system carry forward? How do we expose models to long-horizon feedback, not only short-term reward? How can slower institutions—courts, peer review, municipal process—bind faster optimization so that new patterns don’t shred old invariants we still need? Open science helps because it thickens the archive and distributes access to the relations that matter. Audit trails that survive management turnover. Datasets treated less like mines and more like civic records—curated, argued over, documented for provenance. And explanations that admit uncertainty. Not PR gloss. Model cards that say what the system cannot see, and why.

There’s also the cultural substrate—the one we keep pretending is optional. Design teams that mirror the populations they’ll affect, not for optics but for error-correction. Slower review injected into faster sprints. Rituals of refusal: things the system won’t optimize even if it could. Local knowledge kept local when generalization would harm. People call this governance and make it dull. Call it what it is: choosing which patterns to preserve. Because once a machine’s compression habits scale to infrastructure, you’ve written a new riverbed. Hard to move later. We either set the banks with care or watch them set themselves, by runoff and accident. Either way the water flows. The world, as ever, reads what we write into it—and then reads us back.

Ho Chi Minh City-born UX designer living in Athens. Linh dissects blockchain-games, Mediterranean fermentation, and Vietnamese calligraphy revival. She skateboards ancient marble plazas at dawn and live-streams watercolor sessions during lunch breaks.
