Some points:
- Tokens can’t be shut down, but they are traceable by design. Building a money-laundering scheme to cover its trail would be an incredibly tall order for the agent.
- Even with evolution, they would leave a large trail through sheer volume, which would make studying and starving them far easier (a toy sketch follows these points).
- Like any replicator, they are constrained by their ecosystem and its available resources, and with API calls their collective appetite will deplete those rapidly. Excessive efficiency is a real phenomenon in evolution: a replicator that consumes its resource base too fast undermines itself.
- Biological organisms do not have sharp edges between species; species boundaries are mostly a didactic tool. The “no fertile hybrid” rule has too many exceptions, and sharing DNA across lineages is a common unicellular feature.

Nothing here is an insurmountable obstacle for those replicators, but it all adds crucial nuance.
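To make the volume point concrete, here is a toy sketch of the kind of per-key aggregation a provider could run. Everything in it is hypothetical (the log, the keys, the threshold); a real system would compare against historical baselines rather than a fixed cutoff:

```python
from collections import Counter

# Hypothetical request log: one (api_key, tokens_used) pair per call.
request_log = [
    ("key-a", 1_200), ("key-a", 900), ("key-b", 300),
    ("key-a", 1_500), ("key-c", 200), ("key-a", 2_000),
]

# Aggregate token volume per key.
volume = Counter()
for api_key, tokens in request_log:
    volume[api_key] += tokens

# Flag keys whose total volume exceeds a simple threshold.
THRESHOLD = 3_000
flagged = [key for key, total in volume.items() if total > THRESHOLD]
print(flagged)  # ['key-a']
```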
If an inference provider knows of a specific sequence of tokens that is used only and always by a particular agent, finding instances of that agent is easy, but I think you’re underestimating how large a hurdle obtaining that knowledge is. Inference providers are already collectively serving tens of trillions of tokens per day. Successful autonomous agents will do their best to blend into that traffic, and it’s not clear what (if any) tractably detectable features will prevent that.
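To illustrate both halves of this: scanning traffic for a known, fixed signature is cheap, as in the minimal sketch below (the token sequence is made up); the hard part is that nothing guarantees an agent exposes such a signature in the first place.

```python
def contains_signature(tokens: list[int], signature: list[int]) -> bool:
    """Return True if `signature` appears as a contiguous run in `tokens`."""
    n, m = len(tokens), len(signature)
    return any(tokens[i:i + m] == signature for i in range(n - m + 1))

# Hypothetical: a token sequence believed to be unique to one agent.
AGENT_SIGNATURE = [101, 7, 2048, 13]

# Scanning a request is trivial; the bottleneck is having a signature at all.
request = [5, 101, 7, 2048, 13, 99]
print(contains_signature(request, AGENT_SIGNATURE))  # True
```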