A Process-Based Ontology of Time: Update Ordering, Stable Cycles, and Emergent Physics

Summary

Epistemic status: exploratory framework + toy model; qualitative claims only; looking for failure modes and relevant prior work.

I’m exploring a process-first framework in which time is not a primitive continuous parameter, but the ordering of discrete update events. In this view, matter corresponds to stable, localized cycles of updates, and familiar physical structure (causal speed limits, time dilation, effective geometry) may emerge from local update rules rather than being postulated a priori.

This is not a finished theory and not a claim to replace existing physics. It’s an attempt to articulate a coherent ontology underneath known formalisms, supported by a simple toy dynamical model and numerical exploration. I’m posting this to invite critique, identify failure modes, and locate relevant prior work.


Motivation

What originally pushed me down this path was a persistent conceptual discomfort with time dilation.

Relativistic time dilation works extraordinarily well mathematically, but ontologically it is strange: two clocks can be said to exist “for the same duration” in one sense, and yet one objectively accumulates fewer ticks than the other. This suggests that something physical is being counted along a worldline, rather than time being a passive background parameter.

Once that question is taken seriously, it becomes hard to avoid asking:

  • What is being counted?

  • Why does counting differ between trajectories?

  • What does it mean for causation if nothing “happens” between arbitrarily close instants of continuous time?

This led me to explore a framework where events are primary, states are secondary, and time is the ordering of events rather than an independently flowing dimension.


Core Ontological Commitments

The framework rests on four commitments. Everything else is downstream of these.

  1. Reality is fundamentally process, not static geometry.
    Events and updates are ontologically primary; states are abstractions over them.

  2. Time is not a background parameter.
    Time is the ordering of discrete update events. There is no external clock.

  3. Matter corresponds to stable, localized update cycles.
    What we call a “particle” is a persistent dynamical pattern with an internal clock, not a primitive object.

  4. Causation is productive, not merely correlational.
    Causes generate new events; effects are not just relations among pre-existing states.

Continuity, smooth spacetime, and differential equations are treated as effective descriptions that arise under coarse-graining.


A Minimal Toy Dynamical Model

To make the ontology concrete, I explored a simple toy model.

  • Space: a discrete 3D lattice (nearest-neighbor graph)

  • Time: discrete global update steps

  • State: a scalar value a_i(t) at each lattice site

At each update step, a node evolves according to a neighbor relaxation rule with state-dependent coupling, e.g.:

a_i(t+1) = a_i(t) + λ(a_i(t)) * (S_i(t) − d * a_i(t))

where S_i(t) = sum_{j in N(i)} a_j(t) is the neighbor sum, d = |N(i)| is the coordination number, and:

λ(a) = 1 / (1 + α |a|), with α > 0.

Each site updates by relaxing toward its neighbors, with a state-dependent coupling that slows updates at large amplitudes. In plain terms:

Each site is pulled toward the average of its neighbors, but the pull weakens as the site’s amplitude grows.

This simple nonlinearity turns out to matter a lot.
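The update rule above can be written down directly. Below is a minimal NumPy sketch on a periodic cubic lattice; the lattice size, α = 1, and the periodic boundary are assumptions made here for illustration, not part of the model's claims.

```python
import numpy as np

def lam(a, alpha=1.0):
    # State-dependent coupling: lambda(a) = 1 / (1 + alpha * |a|)
    return 1.0 / (1.0 + alpha * np.abs(a))

def step(a, alpha=1.0):
    """One synchronous update on a periodic cubic lattice.

    a_i(t+1) = a_i(t) + lambda(a_i(t)) * (S_i(t) - d * a_i(t)),
    where S_i is the nearest-neighbor sum and d = 2 * ndim
    (the coordination number, 6 in 3D).
    """
    s = np.zeros_like(a)
    for axis in range(a.ndim):
        s += np.roll(a, 1, axis=axis) + np.roll(a, -1, axis=axis)
    d = 2 * a.ndim
    return a + lam(a, alpha) * (s - d * a)

# Sanity check: a uniform field satisfies S_i = d * a_i at every site,
# so it is a fixed point of the update.
a = np.full((8, 8, 8), 0.5)
assert np.allclose(step(a), a)
```

The synchronous, nearest-neighbor structure is what makes the later claims about locality checkable: each site's next value depends only on its current value and its immediate neighbors.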


Observed Numerical Behavior

In simulations:

  • Small-amplitude disturbances propagate as wave-like signals with finite speed.

  • Large-amplitude regions slow locally and form stable oscillatory structures.

  • These localized oscillations persist over many updates and have intrinsic periods.

I interpret these stable oscillations as matter-like structures with internal clocks, and the propagating disturbances as energy-like.

I’m deliberately not claiming novelty in the mathematics. The interest is in what this behavior suggests ontologically.


Interpretive Implications (Qualitative)

Given the above commitments, several familiar phenomena admit candidate reinterpretations:

Time dilation

Different trajectories accumulate different numbers of update events. Clocks slow because their internal update cycles slow relative to propagating disturbances, not because “time itself” flows differently.
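One way to make "what is being counted" concrete, under an assumption of mine that is not derived from the model: treat λ(a) as a per-step tick fraction, so a clock's elapsed proper time is the accumulated λ along its trajectory. Two trajectories of equal coordinate duration then accumulate different tick counts.

```python
def lam(a, alpha=1.0):
    # State-dependent coupling, reused here as a per-step tick fraction
    # (an interpretive assumption, not something the model derives).
    return 1.0 / (1.0 + alpha * abs(a))

def proper_time(amplitudes):
    """Accumulated internal ticks along a trajectory of amplitudes."""
    return sum(lam(a) for a in amplitudes)

# Two 'worldlines' of equal coordinate duration (10 global update steps):
low  = [0.1] * 10   # low-amplitude region: fast internal cycling
high = [9.0] * 10   # high-amplitude region: slowed internal cycling

assert proper_time(high) < proper_time(low)
```

Here the asymmetry between the two clocks is just arithmetic over update events; nothing about "time itself" appears in the bookkeeping.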

Finite causal speed

A universal speed limit can emerge from locality of updates: no influence propagates faster than the update ordering allows. (I’m not claiming this reproduces Lorentz invariance; that’s an open question below.)
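The locality bound is structural and easy to check numerically. A sketch on a 1D ring (my simplification of the 3D lattice, again with α = 1): after t updates, a disturbance that started at one site can have nonzero support only within t sites of its origin, because each update reads only nearest neighbors.

```python
import numpy as np

def step(a, alpha=1.0):
    # 1D ring version of the update rule; coordination number d = 2.
    lam = 1.0 / (1.0 + alpha * np.abs(a))
    s = np.roll(a, 1) + np.roll(a, -1)
    return a + lam * (s - 2 * a)

a = np.zeros(101)
a[50] = 1.0  # a single-site disturbance
for t in range(1, 11):
    a = step(a)
    support = np.nonzero(a)[0]
    # No influence outside the causal cone of radius t sites.
    assert support.min() >= 50 - t
    assert support.max() <= 50 + t
```

Note this only establishes a maximum signal speed of one site per step; whether the emergent dispersion relation is anything like Lorentz-invariant is a separate, harder question.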

Matter vs energy

Energy corresponds to propagating update disturbances. Matter corresponds to energy trapped in stable, localized update cycles.

Virtual particles

Virtual particles can be interpreted as transient, non-self-sustaining update distortions that mediate interactions without forming stable objects.

Entanglement (sketch)

Entanglement can be framed as shared update ancestry rather than superluminal influence: correlations persist because two systems originate from a non-factorizable update history, not because information travels at the moment of measurement.

Geometry

Effective spacetime geometry may arise as bookkeeping over spatial variation in update-propagation times, rather than being a primitive metric.


What This Is Not

To be explicit, this framework is not:

  • a denial of quantum mechanics or general relativity,

  • an appeal to observers or consciousness,

  • a claim of experimental novelty,

  • a finished or rigorously derived theory.

It is a foundational ontology proposal paired with a toy dynamical model to keep the discussion grounded.


Open Questions / Failure Modes

Some places where I expect this framework could fail (and where critique would be most valuable):

  • Can this kind of update-based ontology reproduce quantitative bounds (e.g., Bell/Tsirelson) rather than just qualitative features?

  • Does treating time as update ordering genuinely add explanatory power, or merely redescribe existing formalisms?

  • Are there known no-go theorems that rule out this class of models while preserving Lorentz invariance?

  • Does the notion of “shared update ancestry” avoid smuggling in superdeterminism?

Pointers to relevant literature (e.g., causal set theory, process philosophy, relational QM, cellular automata approaches) are very welcome.


Why I’m Posting This Here

I’m not trying to convince anyone that this framework is correct. I’m trying to determine whether it is coherent, novel in the right sense, or already refuted by known results.

If it fails, I want to understand where and why.

Closing

My intuition is that many persistent confusions around time, causation, and quantum “weirdness” stem from treating states as primary and processes as secondary. This post is an attempt to see how far one can get by reversing that priority and taking update events seriously as the primitive substrate.

I appreciate careful criticism more than agreement.
