Why Isn’t Tesla Level 3?

Many people who’ve used Tesla’s “Full Self Driving” software are
pretty excited about it:
I can step into a Tesla today, press a destination, and go there
without touching the wheel or pedals. Sure it won’t be flawless but
the fact is, I can. I can’t do the same in any other consumer car, and
the closest thing is a Waymo. The effort is there, I think it’s just
a matter of time before we start seeing the legal stuff play out.
I think this is mostly not a legal issue. Let’s take perhaps the most
favorable conditions for driverless cars (sketched in code after this
list):
A specific highway the manufacturer has checked
Heavy traffic, under 40mph
No rain or snow
No work zones
Daytime
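To make the idea concrete, here’s a minimal sketch (in Python, with all names and data hypothetical) of how restrictions like these could be encoded as an operational-design-domain check; real systems are of course far more involved:

```python
from dataclasses import dataclass

# Highways the manufacturer has validated (hypothetical examples).
VALIDATED_HIGHWAYS = {"I-90", "US-101"}

@dataclass
class Conditions:
    """A snapshot of the current driving context (all fields hypothetical)."""
    highway_id: str           # which road the car is on
    traffic_speed_mph: float  # current speed of surrounding traffic
    precipitation: bool       # rain or snow detected
    work_zone: bool           # work zone detected or mapped ahead
    daytime: bool             # sun is up

def within_odd(c: Conditions) -> bool:
    """True only when every restriction in the list above holds.

    Outside this operational design domain, the system would refuse
    to engage, or hand control back to the human driver.
    """
    return (
        c.highway_id in VALIDATED_HIGHWAYS
        and c.traffic_speed_mph < 40
        and not c.precipitation
        and not c.work_zone
        and c.daytime
    )
```

Restricting the domain this narrowly is part of what makes the safety case tractable: each condition is individually cheap to verify.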
Tesla could demonstrate that their system works reliably enough in
these conditions that the person in the driver’s seat could safely
work, read, or watch a movie, and Tesla could take full legal
responsibility for any crash. We know Tesla could legally do this
because (despite what the Secretary
of Transportation thinks) Mercedes already does, with Drive
Pilot, which they launched in Germany in 2022
and the US in 2023.
Then Tesla could gradually remove restrictions, as they were able to
demonstrate that we could trust their system with more complex
scenarios.
Tesla fans will often claim that Tesla could easily do
this (“FSD is practically perfected, with no accidents whatsoever,
under such conditions already”), but I don’t think it’s so clear.
This is a situation where getting to “impressive” levels of operation
is quite doable (ex: here’s Cruise seven years ago
in SF, dealing with many unusual situations) but getting to
“reliable enough that you don’t have to supervise it” has been
incredibly hard (ex: Cruise is shutting
down, after dragging
a pedestrian under a car last year). And unlike most of its
competitors (including Mercedes), Tesla’s vehicles don’t have LIDAR,
which makes it even harder to get to that level of reliability.
Tesla has been making bold promises here since at least 2016,
when they claimed that Self-Driving was limited only by “extensive
software validation and regulatory approval”:
Full Self-Driving Capability
Build upon Enhanced Autopilot and order Full Self-Driving Capability
on your Tesla. … Please note that Self-Driving functionality is
dependent upon extensive software validation and regulatory approval,
which may vary widely by jurisdiction. It is not possible to know
exactly when each element of the functionality described above will be
available, as this is highly dependent on local regulatory
approval. Please note also that using a self-driving Tesla for car
sharing and ride hailing for friends and family is fine, but doing so
for revenue purposes will only be permissible on the Tesla Network,
details of which will be released next year.
(They were still saying the same thing, including “will be released
next year”, in
2019.)
I would be very happy to see Tesla succeed, and make a car that did
not need a supervising driver, but if their hardware and software
were up to the task, they would have worked to get Level 3
certification (where, within defined conditions, the car drives
itself and the human doesn’t have to supervise) already.