Reversible computation will not only allow the energy cost of computation to fall far below Landauer's limit; it can also take an unreliable but highly energy-efficient bit-deletion process and turn it into a very reliable and energy-efficient one. Let me explain.
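For context, Landauer's principle puts the minimum energy dissipated by erasing one bit at

$$E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\,\mathrm{J/K}) \cdot (300\,\mathrm{K}) \cdot 0.693 \approx 2.9 \times 10^{-21}\,\mathrm{J}$$

at room temperature. Reversible operations, which erase nothing, are not subject to this bound.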
Suppose that one has $n$ bits of garbage information that one wants to delete. We can safely assume that about half of these bits are 0's and the other half are 1's. Now let $1/2 < p < 1$. Suppose that we have an unreliable bit-deletion process where, after the deletion process runs, about $p \cdot n$ of these bits will be 0's and the remaining $(1-p) \cdot n$ bits will be 1's. Then these data will have a Shannon entropy of about $n \cdot H(p)$ bits, where $H(p) = -p \log_2(p) - (1-p) \log_2(1-p)$. This means that one can run these data through a reversible compression algorithm that consumes a negligible amount of energy thanks to energy-efficient reversible computing; after compression, the data will be about $n \cdot H(p)$ bits long, and one will have about $n \cdot (1 - H(p))$ bits of reliably deleted information. Reversible computation (in particular, reversible data compression algorithms) can even bring the energy cost of irreversible bit deletion down to $k_B T \ln 2$ per bit deleted.
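To make the bookkeeping concrete, here is a small Python sketch; the block size $n$ and bias $p$ are hypothetical parameters chosen for illustration. It computes $H(p)$ and the split between the compressed residue and the reliably deleted bits:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a biased bit, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

n = 1_000_000  # number of garbage bits (hypothetical block size)
p = 0.99       # probability the unreliable deletion leaves a 0

h = binary_entropy(p)
compressed_bits = n * h        # residue left after reversible compression
deleted_bits = n * (1 - h)     # bits reliably deleted

print(f"H({p}) = {h:.4f} bits per bit")
print(f"compressed residue: ~{compressed_bits:,.0f} bits")
print(f"reliably deleted:   ~{deleted_bits:,.0f} bits")
```

With $p = 0.99$, $H(p) \approx 0.081$, so roughly 92% of the original bits come out reliably deleted even though each individual deletion succeeded only 99% of the time.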
I wonder if the problem of glitch tokens can be mitigated by splitting up text into tokens in a non-unique way and considering all tokenizations of text at the same time.
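As a toy illustration of what "all tokenizations at once" could mean, here is a Python sketch that enumerates every segmentation of a string into vocabulary tokens; the vocabulary here is made up, and a real tokenizer's vocabulary would be far larger. A model could then score or marginalize over all of these instead of committing to a single tokenization:

```python
from functools import lru_cache

# Hypothetical toy vocabulary for illustration only.
VOCAB = {"un", "believ", "able", "unbeliev", "a", "ble", "b", "le"}

def all_tokenizations(text: str) -> list[list[str]]:
    """Enumerate every way to split `text` into tokens from VOCAB."""
    @lru_cache(maxsize=None)
    def split(i: int) -> tuple[tuple[str, ...], ...]:
        # All tokenizations of the suffix text[i:].
        if i == len(text):
            return ((),)
        results = []
        for j in range(i + 1, len(text) + 1):
            piece = text[i:j]
            if piece in VOCAB:
                for rest in split(j):
                    results.append((piece,) + rest)
        return tuple(results)
    return [list(t) for t in split(0)]

for toks in all_tokenizations("unbelievable"):
    print(toks)
```

This prints six distinct tokenizations of "unbelievable", so no single token sequence (and hence no single glitch token) is privileged.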