Are LLMs utility maximizers? Do they have to be?
There’s definitely a whole question about what sorts of things you can do with LLMs and how dangerous they are. This post isn’t about that, though, and I’d rather not discuss it here. Could you instead ask this in a top-level post or question? I’d be happy to discuss it there.