The Google car also uses machine learning. That still doesn’t mean it tries to emulate a human driver.
The article doesn’t say that the car predicts what a human driver would do.
How do you enforce that the AI “tries to drive with as little risk as possible”?
There’s the example of the Google car waiting for the woman in the wheelchair who was chasing ducks. That’s behavior you get from the way Google’s algorithm cares about safety, and that you wouldn’t get from emulating human drivers.
Google uses machine learning, but its car isn’t based on it. There is a difference between a special-purpose “stop sign detector” function and an “end to end” approach where a single algorithm learns everything.
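To make the distinction concrete, here is a minimal sketch (not Google’s or comma.ai’s actual code; all function names and the toy frame format are made up for illustration). In the modular style, a dedicated detector feeds a hand-written planner; in the end-to-end style, one learned function maps raw input straight to controls:

```python
def detect_stop_sign(frame):
    # Stand-in for a special-purpose "stop sign detector" module.
    return "stop_sign" in frame["objects"]

def plan_controls(stop_sign_ahead):
    # Hand-written planning rule that consumes the detector's output.
    if stop_sign_ahead:
        return {"brake": 1.0, "throttle": 0.0}
    return {"brake": 0.0, "throttle": 0.3}

def modular_controller(frame):
    """Pipeline of separate components: detect, then plan."""
    return plan_controls(detect_stop_sign(frame))

def end_to_end_controller(frame, policy):
    """A single learned function maps raw input directly to controls."""
    return policy(frame)

def toy_policy(frame):
    # Stand-in for a trained neural network's input-to-output mapping.
    brake = 1.0 if "stop_sign" in frame["objects"] else 0.0
    return {"brake": brake, "throttle": 0.3 * (1.0 - brake)}

frame = {"objects": ["stop_sign"]}
print(modular_controller(frame))                 # {'brake': 1.0, 'throttle': 0.0}
print(end_to_end_controller(frame, toy_policy))  # {'brake': 1.0, 'throttle': 0.0}
```

The point is structural: in the first style you can inspect and constrain each stage, while in the second the behavior lives entirely inside the learned policy.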
Comma.ai’s business model is to pay people to upload their dashcam footage and to train neural networks on it. As far as I can tell, what I described is their approach.
I would be surprised if they set up their system in a way where they couldn’t tell a car to approach a red light using less fuel than human drivers do.
As far as accidents go, the idea that automatic braking should take over in emergency situations is already implemented in many cars on the road. It’s unlikely that such a system would react the way a human-driven car would have reacted a decade ago.
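The core idea behind that kind of automatic emergency braking can be sketched in a few lines: brake when the time-to-collision to the object ahead drops below a threshold, regardless of what the driver is doing. This is an illustrative toy, not any manufacturer’s implementation; the threshold value is an assumption.

```python
TTC_THRESHOLD_S = 1.5  # assumed trigger threshold, in seconds

def time_to_collision(distance_m, closing_speed_mps):
    # If we are not closing in on the obstacle, there is no collision course.
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def should_emergency_brake(distance_m, closing_speed_mps):
    # Override the driver when the time-to-collision falls below the threshold.
    return time_to_collision(distance_m, closing_speed_mps) < TTC_THRESHOLD_S

print(should_emergency_brake(30.0, 10.0))  # TTC = 3.0 s -> False
print(should_emergency_brake(10.0, 10.0))  # TTC = 1.0 s -> True
```

A rule like this fires the same way every time, which is exactly why it doesn’t behave like a human driver from a decade ago.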