In deployment, we should expect our actions to line up with our values, thus triggering the “ruin the universe for as many as possible” behavior.
It seeks to max harms?
This is the scariest example of nominative determinism I have ever seen.