[Link] Does a Simulation Really Need to Be Run?

http://blog.regehr.org/archives/546

John Regehr, an associate professor of computer science at the University of Utah, writes about two algorithmic speedups, Hashlife for Conway’s Game of Life and Time Warp for parallel discrete-event simulation, and speculates on the implications for self-aware entities in simulations.

Hashlife is a clever optimization that treats the Life grid as a hierarchy of quadtrees. Because the maximum speed of signal propagation in a Life configuration is one cell per step, the center of a square region can be evolved a number of steps proportional to the region’s size using only that region’s contents; Hashlife memoizes these evolved results in a hash table, so identical regions of the grid are only ever computed once. Hashlife is amazing to watch: it starts out slow, but as the hashtable fills up it suddenly “explodes” into exponential progress. To see it in action, I recommend Golly. Hashlife is one of my ten all-time favorite algorithms.
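To make the recursion concrete, here is a toy sketch of Hashlife’s core step in Python. The tuple-based node encoding and the function names are my own, not Golly’s: a level-0 node is a cell, a level-k node is a 4-tuple of level-(k-1) quadrants, and `lru_cache` plays the role of the hash table. Production implementations additionally intern nodes, grow the universe on demand, and garbage-collect the cache.

```python
from functools import lru_cache

def life(cell, neighbors):
    """Conway's B3/S23 rule for a single cell."""
    n = sum(neighbors)
    return 1 if n == 3 or (cell and n == 2) else 0

def grid4(node):
    """Flatten a level-2 node (four 2x2 quadrants) into a 4x4 grid."""
    a, b, c, d = node
    return ((a[0], a[1], b[0], b[1]),
            (a[2], a[3], b[2], b[3]),
            (c[0], c[1], d[0], d[1]),
            (c[2], c[3], d[2], d[3]))

@lru_cache(maxsize=None)            # the Hashlife hash table
def result(node):
    """The center of `node`, advanced 2**(level-2) generations."""
    a, b, c, d = node
    if isinstance(a[0], int):       # level 2: one brute-force Life step
        g = grid4(node)
        def step(r, s):
            nbs = [g[i][j] for i in (r - 1, r, r + 1)
                           for j in (s - 1, s, s + 1) if (i, j) != (r, s)]
            return life(g[r][s], nbs)
        return (step(1, 1), step(1, 2), step(2, 1), step(2, 2))
    # Level k >= 3: nine overlapping half-size squares, each advanced
    # 2**(k-3) steps, then four reassembled squares advanced 2**(k-3) more.
    n00, n02, n20, n22 = result(a), result(b), result(c), result(d)
    n01 = result((a[1], b[0], a[3], b[2]))    # top middle
    n10 = result((a[2], a[3], c[0], c[1]))    # left middle
    n11 = result((a[3], b[2], c[1], d[0]))    # center
    n12 = result((b[2], b[3], d[0], d[1]))    # right middle
    n21 = result((c[1], d[0], c[3], d[2]))    # bottom middle
    return (result((n00, n01, n10, n11)), result((n01, n02, n11, n12)),
            result((n10, n11, n20, n21)), result((n11, n12, n21, n22)))

# Example: a 4x4 block whose second row holds a horizontal blinker.
blinker = ((0, 0, 0, 1), (0, 0, 1, 1), (0, 0, 0, 0), (0, 0, 0, 0))
print(result(blinker))   # -> (0, 1, 0, 1): the 2x2 center one step later
```

Because equal tuples hash equal, any region that recurs anywhere in the grid, at any time, hits the cache instead of being re-evolved; that cache-warming is what produces the “explosion” described above.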

Those who have read Greg Egan’s Permutation City will find the concept of Hashlife familiar.

Another ingenious simulation speedup (developed, as it happens, around the same time as Hashlife) is Time Warp, which relaxes synchronization requirements, permitting a processor to run well ahead of its neighbors. This opens up the possibility that a processor will at some point receive a message that violates causality: one stamped with a simulation time the processor has already passed, so it needed to be executed in the past. Clearly this is a problem. The solution is to roll the simulation state back to the time of the message and resume execution from there. If rollbacks are infrequent, overall performance may improve because processors spend less time waiting on one another. This is a form of “optimistic concurrency,” and it can be shown to preserve the meaning of a simulation: the Time Warp implementation must always return the same final answer as the non-optimistic implementation.
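As a rough illustration, here is a single-process sketch of the rollback mechanism in Python. The class and field names are my own invention, and a real Time Warp system does much more: it also sends “anti-messages” to cancel output produced by rolled-back computation, and tracks a global virtual time below which saved history can be discarded.

```python
import bisect
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    time: int                          # virtual timestamp (sort key)
    payload: int = field(compare=False)

class LogicalProcess:
    def __init__(self):
        self.state = 0        # the simulation state: here, a running sum
        self.lvt = 0          # local virtual time, how far we have run
        self.processed = []   # events already executed, in time order
        self.snapshots = []   # state saved *before* each processed event

    def execute(self, ev):
        """Optimistically execute one event, saving state for rollback."""
        self.snapshots.append(self.state)
        self.processed.append(ev)
        self.state += ev.payload
        self.lvt = ev.time

    def receive(self, ev):
        if ev.time >= self.lvt:
            self.execute(ev)           # in-order message: just run it
            return
        # Straggler: a message from our past. Roll back to just before
        # the first event at or after its timestamp, then re-execute the
        # undone events, plus the straggler, in correct order.
        i = bisect.bisect_left(self.processed, ev)
        redo = self.processed[i:]
        self.state = self.snapshots[i]
        del self.processed[i:], self.snapshots[i:]
        self.lvt = self.processed[-1].time if self.processed else 0
        for e in sorted(redo + [ev]):
            self.execute(e)

lp = LogicalProcess()
for t, v in [(1, 10), (5, 20), (9, 30)]:
    lp.receive(Event(t, v))
lp.receive(Event(3, 100))              # straggler: forces a rollback
assert lp.state == 160 and lp.lvt == 9 # same answer as in-order execution
```

The final assertion is the correctness property in miniature: whatever interleaving of timely messages and stragglers arrives, rollback and re-execution leave the process in the same state a fully synchronized run would have produced.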