> **Theorem 6.** The class of Transformer networks with positional encodings is Turing complete. Moreover, Turing completeness holds even in the restricted setting in which the only non-constant values in the positional embedding pos(n) of n, for n ∈ ℕ, are n, 1/n, and 1/n², and Transformer networks have a single encoder layer and three decoder layers.

## Claude 3 Opus can operate as a Turing machine

Link post

Posted on Twitter:

Here is the prompt code for the Turing machine: https://github.com/SpellcraftAI/turing
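To make concrete what "operating as a Turing machine" demands of the model, here is a minimal sketch of a classical Turing-machine step loop. This is an illustration of my own, not the code in the linked repository: the machine (states, alphabet, transitions) and the function name are hypothetical, but a prompt-driven simulation has to reproduce exactly this kind of read/write/move bookkeeping at every step.

```python
def run_turing_machine(transitions, tape, state="q0", accept="halt", max_steps=1000):
    """Run a TM; transitions maps (state, symbol) -> (new_state, write, move).

    The tape is sparse (a dict from position to symbol); "_" is the blank.
    """
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = cells.get(head, "_")
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    out = "".join(cells[i] for i in sorted(cells)).strip("_")
    return state, out

# Example machine: a unary incrementer — scan right past the 1s,
# then write one more 1 and halt.
INC = {
    ("q0", "1"): ("q0", "1", "R"),
    ("q0", "_"): ("halt", "1", "R"),
}

state, result = run_turing_machine(INC, "111")
print(state, result)  # → halt 1111
```

The point of the challenge is that the model, given the transition table in the prompt, must carry out this loop faithfully step by step rather than pattern-match to an answer.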

This is the fully general counterpoint to @VictorTaelin's A::B challenge (he put his money where his mouth is and earned praise from Yudkowsky for it).

*Attention is Turing Complete* was already a claim in 2021: