have been declared street-legal and are functioning in a roadway and regulatory system that humans (chose to) set up
Does not rule out the regulator screwing up the standards by not setting them high enough, or a builder flubbing the implementation.
For instance, pure imitation learning has a real likelihood of starting to drive like a bad driver if it is trained on bad drivers, or once it does one thing that a bad driver would do. (We have seen this failure mode in LLMs, where one bug in the context induces further bugs.)
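A toy sketch of that compounding failure mode (all functions and numbers here are made up for illustration, not any real self-driving stack): a behavior-cloned policy that only saw near-center driving has nothing useful to do once a single bad action pushes it off-distribution, while the expert it imitated recovers.

```python
# Toy sketch (invented numbers, not any real self-driving system):
# how pure imitation learning can let one bad action snowball.
# The "expert" always steers back toward lane center; the "cloned" policy
# only imitates the expert on offsets it saw during training and does
# nothing useful once it drifts outside that range.

def expert_policy(offset: float) -> float:
    # Expert steering: correct half of the current lane offset each step.
    return -0.5 * offset

def cloned_policy(offset: float, seen_limit: float = 0.5) -> float:
    # Imitator: matches the expert on in-distribution states,
    # but never learned what to do beyond the offsets in its training data.
    if abs(offset) <= seen_limit:
        return expert_policy(offset)
    return 0.0  # off-distribution: fails to correct at all

def rollout(policy, steps: int = 30, drift: float = 0.02, disturbance_at: int = 5):
    # Simulate lane offset over time; one "bad driver" moment at t=5
    # pushes the car outside the states seen during training.
    offset, trajectory = 0.0, []
    for t in range(steps):
        offset += drift              # constant pull (road curvature) the driver must keep correcting
        if t == disturbance_at:
            offset += 0.6            # the single bad-driver action
        offset += policy(offset)
        trajectory.append(round(offset, 2))
    return trajectory

print("expert:", rollout(expert_policy))
print("cloned:", rollout(cloned_policy))
```

Running this, the expert trace settles back near lane center after the disturbance, while the cloned trace never corrects again and keeps drifting further out of its lane.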
Similarly, the best way to push down crash rates for self-driving cars is to reconstruct every accident and train the vehicle to avoid it.
But if you mess up your training process, or if there are weird regulations, you don’t get this, and you can end up with a system that consistently crashes in particular situations because of how it generalizes from its training data.
A good example of this kind of misimplementation is the timeout after a collision warning on Waymo vehicles, which has caused multiple crashes without getting fixed. If something like that just slides, you don’t end up safer.
If the only defense against this is action from the regulator, then arguing that the regulator should get out of the way in order to make things safer does not work.
The very complexity of getting declared street-legal means that safety is an open question.