If you have to give it a task, is it really an agent? Is there some other word for “system that comes up with its own tasks to do”?
Did you come up with your hunger drive on your own? Sex drive? Pain aversion? Humans count as agents, and we have these built in. Isn’t it enough that the agent can come up with subgoals to accomplish the given task?
The described “next image” bot doesn’t have goals like that, though. Can you take the pre-trained bot and give it a drive to “make houses” and have it do that? When all the local wood is used up, will it know to move elsewhere, or plant trees?
Yes, maybe? That kind of thing is presumably in the training data, and the generator is designed for longer-term coherence. Maybe the coherence window isn’t long enough for plans that take a long time to execute, so I’m not sure whether Sora per se can do this without trying it (and we don’t have access), but it seems like the kind of thing a system like this might be able to do.