I am going to apply my own dimensionality reduction algorithm to a quantum channel (or matrices) obtained from the Okubo algebra in order to demonstrate the compatibility between my dimensionality reduction and the Okubo algebra.
TL;DR version: I trained my own machine learning algorithms on Okubo algebras, and the squares of the fitness levels of the local maxima were usually either rational numbers or quadratic algebraic numbers. This suggests that my machine learning algorithm behaves mathematically.
Origin of algorithm: I originally created this dimensionality reduction algorithm to analyze the cryptographic security of block ciphers for the cryptocurrency that I have created. If you want to discuss cryptocurrency technologies, please contact me privately off this site, since I really do not feel comfortable talking about that stuff here.
After obtaining the dimensionality reduction algorithm, I noticed that such algorithms behaved mathematically for reasons that I still can't explain, and I have concluded that such mathematical behavior is needed to construct inherently interpretable and safe machine learning algorithms. Of course, if we want inherently interpretable and safe AI, we need machine learning algorithms that can train models with many layers to solve sophisticated tasks. I am well on my way towards creating these algorithms too, despite a complete and total lack of support.
Mathematics: The Okubo algebra[1] is a close cousin of the octonions and satisfies many of the same properties as the octonions.
The underlying set of the Okubo algebra is the set of all $3\times 3$-complex Hermitian matrices with trace 0. Observe that the set of all $3\times 3$-complex Hermitian matrices forms a real vector space of dimension $9$. Therefore, the Okubo algebra's underlying set has dimension $8$. Let $\mu$ be the complex number with $3\mu(1-\mu)=1$. Then $\mu=\frac{3+i\sqrt{3}}{6}$ up to complex conjugation. The Okubo algebra is endowed with a bilinear operation $*$ defined by $x*y=\sqrt{6}\bigl(\mu xy+(1-\mu)yx-\frac{1}{3}\operatorname{Tr}(xy)I_3\bigr)$ (I scaled the operation by a factor of $\sqrt{6}$ so that the norm on the Okubo algebra is just the Frobenius norm). The operation $*$ satisfies the property $\|x*y\|=\|x\|\cdot\|y\|$ where $\|\cdot\|$ refers to the Frobenius norm and $x,y$ belong to the Okubo algebra.
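The composition property can be checked numerically. The sketch below is my own and makes explicit assumptions: the product $\mu xy+(1-\mu)yx-\frac{1}{3}\operatorname{Tr}(xy)I_3$ with $\mu=\frac{3+i\sqrt{3}}{6}$, scaled by $\sqrt{6}$ so that the Frobenius norm becomes multiplicative.

```python
import numpy as np

# A minimal numerical check of the composition property. Assumption: the
# (scaled) Okubo product is x*y = sqrt(6)*(mu*xy + (1-mu)*yx - Tr(xy)/3 * I)
# with mu = (3 + i*sqrt(3))/6; the sqrt(6) scaling is chosen so that the
# Frobenius norm is multiplicative.

MU = (3 + 1j * np.sqrt(3)) / 6

def okubo(x, y):
    """Scaled Okubo product of two trace-zero 3x3 Hermitian matrices."""
    xy, yx = x @ y, y @ x
    return np.sqrt(6) * (MU * xy + (1 - MU) * yx - (np.trace(xy) / 3) * np.eye(3))

def random_traceless_hermitian(rng):
    a = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
    h = (a + a.conj().T) / 2
    return h - (np.trace(h).real / 3) * np.eye(3)

rng = np.random.default_rng(0)
x = random_traceless_hermitian(rng)
y = random_traceless_hermitian(rng)
z = okubo(x, y)

# The product stays trace-zero Hermitian, and ||x*y|| = ||x||*||y||.
print(abs(np.trace(z)))                # ~0
print(np.linalg.norm(z - z.conj().T))  # ~0
print(np.linalg.norm(z), np.linalg.norm(x) * np.linalg.norm(y))
```

Note that $\mu+\bar{\mu}=1$, which is why the product of two Hermitian matrices is again Hermitian.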
Let $\iota:\mathbb{R}^8\rightarrow O$ be an isomorphism between inner product spaces, where $O$ denotes the Okubo algebra. Then define an operation $*$ on $\mathbb{R}^8$ by setting $x*y=\iota^{-1}(\iota(x)*\iota(y))$. Then define orthogonal matrices $A_1,\dots,A_8$ by $A_jx=e_j*x$ where $(e_1,\dots,e_8)$ is the standard basis for the real Euclidean space $\mathbb{R}^8$.
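These orthogonal matrices can be realized concretely. The sketch below is my own construction: it picks a Frobenius-orthonormal basis of the trace-zero Hermitian matrices (normalized Gell-Mann matrices) and represents left multiplication by each basis element as an $8\times 8$ matrix; the scaled product is the same assumption as before.

```python
import numpy as np

# Sketch: realize the 8 matrices A_j as left-multiplication operators
# e_j * (-) with respect to a Frobenius-orthonormal basis of the trace-zero
# 3x3 Hermitian matrices (normalized Gell-Mann matrices). The sqrt(6)-scaled
# product and the choice of basis are my own assumptions.

MU = (3 + 1j * np.sqrt(3)) / 6

def okubo(x, y):
    xy, yx = x @ y, y @ x
    return np.sqrt(6) * (MU * xy + (1 - MU) * yx - (np.trace(xy) / 3) * np.eye(3))

def basis():
    bs = []
    for i in range(3):
        for j in range(i + 1, 3):
            s = np.zeros((3, 3), complex); s[i, j] = s[j, i] = 1
            a = np.zeros((3, 3), complex); a[i, j] = -1j; a[j, i] = 1j
            bs += [s / np.sqrt(2), a / np.sqrt(2)]
    bs.append(np.diag([1, -1, 0]).astype(complex) / np.sqrt(2))
    bs.append(np.diag([1, 1, -2]).astype(complex) / np.sqrt(6))
    return bs  # 8 elements, orthonormal under <a, b> = Tr(ab)

E = basis()
# (A_j)_{k,l} = <e_k, e_j * e_l>; the entries are real since every e_k
# and every product e_j * e_l is Hermitian.
A = [np.array([[np.trace(ek @ okubo(ej, el)).real for el in E] for ek in E])
     for ej in E]

for Aj in A:
    print(np.linalg.norm(Aj @ Aj.T - np.eye(8)))  # ~0: each A_j is orthogonal
```

Orthogonality follows from the composition property: left multiplication by a unit-norm element is an isometry, and an isometry of $\mathbb{R}^8$ expressed in an orthonormal basis is an orthogonal matrix.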
If $A_1,\dots,A_r$ are $n\times n$-complex matrices and $B_1,\dots,B_r$ are $d\times d$-complex matrices, then define the $L_2$-spectral radius similarity between $(A_1,\dots,A_r)$ and $(B_1,\dots,B_r)$ by
$\|(A_1,\dots,A_r)\simeq(B_1,\dots,B_r)\|=\frac{\rho(A_1\otimes\overline{B_1}+\dots+A_r\otimes\overline{B_r})}{\rho(A_1\otimes\overline{A_1}+\dots+A_r\otimes\overline{A_r})^{1/2}\rho(B_1\otimes\overline{B_1}+\dots+B_r\otimes\overline{B_r})^{1/2}}$
where $\rho$ denotes the spectral radius.
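The similarity is straightforward to compute. The sketch below assumes the definition above (spectral radii of sums of Kronecker products); the function names are my own.

```python
import numpy as np

# Hedged sketch of the L2-spectral radius similarity, assuming the definition
#   rho(sum_j A_j (x) conj(B_j))
#   / ( rho(sum_j A_j (x) conj(A_j))^(1/2) * rho(sum_j B_j (x) conj(B_j))^(1/2) )
# where rho is the spectral radius and (x) is the Kronecker product.

def spectral_radius(m):
    return np.max(np.abs(np.linalg.eigvals(m)))

def l2_similarity(As, Bs):
    cross = sum(np.kron(a, b.conj()) for a, b in zip(As, Bs))
    aa = sum(np.kron(a, a.conj()) for a in As)
    bb = sum(np.kron(b, b.conj()) for b in Bs)
    return spectral_radius(cross) / np.sqrt(spectral_radius(aa) * spectral_radius(bb))

rng = np.random.default_rng(1)
As = [rng.standard_normal((4, 4)) for _ in range(3)]
s = l2_similarity(As, As)
print(s)  # a tuple compared with itself has similarity 1
```

When the two tuples coincide, the numerator and each factor in the denominator are spectral radii of the same matrix, so the similarity is exactly 1.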
Computational results: The following facts are suggested by computer experiments but have not been rigorously proven. To run the computer experiments, I used gradient ascent to locally maximize the $L_2$-spectral radius similarity. By maximizing the $L_2$-spectral radius similarity, we reduce the dimension of a tuple of matrices, and I call this dimensionality reduction an $L_{2,d}$-spectral radius dimensionality reduction (LSRDR).
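The procedure can be sketched generically. The code below is my own illustration, not the author's implementation: it runs a naive finite-difference gradient ascent (with backtracking) on a random tuple of matrices rather than the Okubo tuple, and it keeps the $B_j$ real for simplicity.

```python
import numpy as np

# Sketch of an LSRDR by naive gradient ascent: given n x n matrices
# A_1..A_r, search for d x d matrices B_1..B_r (real, for simplicity)
# that locally maximize the L2-spectral radius similarity. The
# finite-difference ascent below is a stand-in for whatever optimizer
# the author actually used.

def rho(m):
    return np.max(np.abs(np.linalg.eigvals(m)))

def similarity(As, Bs):
    cross = sum(np.kron(a, b.conj()) for a, b in zip(As, Bs))
    aa = sum(np.kron(a, a.conj()) for a in As)
    bb = sum(np.kron(b, b.conj()) for b in Bs)
    return rho(cross) / np.sqrt(rho(aa) * rho(bb))

def lsrdr(As, d, iters=60, eps=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    r = len(As)
    x = rng.standard_normal(r * d * d)
    f = lambda v: similarity(As, list(v.reshape(r, d, d)))
    for _ in range(iters):
        fx = f(x)
        g = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                      for e in np.eye(x.size)])
        step = 0.5
        while step > 1e-12 and f(x + step * g) <= fx:
            step /= 2  # backtrack so the fitness never decreases
        if step > 1e-12:
            x = x + step * g
    return list(x.reshape(r, d, d)), f(x)

rng = np.random.default_rng(1)
As = [rng.standard_normal((4, 4)) for _ in range(3)]
Bs, fit = lsrdr(As, d=2)
print(fit)  # the fitness of the local maximum, a number in (0, 1]
```

The fitness value at the local maximum is the quantity whose square the post reports as rational or quadratic algebraic in the Okubo case.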
The maximum value of $\|(A_1,\dots,A_8)\simeq(B_1,\dots,B_8)\|$ among the real $d\times d$-matrices $B_1,\dots,B_8$ is . Let be the maximum value of $\|(A_1,\dots,A_8)\simeq(B_1,\dots,B_8)\|$ among the $d\times d$-complex, real symmetric, complex symmetric, complex anti-symmetric, and complex Hermitian matrices. Then
.
Similar facts seem to hold for the other values (but I have not completely performed the calculations due to numerical instabilities that I do not want to fix). For example, and for .
The fitness levels that I have obtained are simple, but they are not too simple. This indicates that LSRDRs of Okubo algebras are mathematically interesting.
[1] Alberto Elduque, Okubo algebras: automorphisms, derivations and idempotents, 2013. https://api.semanticscholar.org/CorpusID:119713330