I didn’t miss your point.
The faster these cars learn how to drive, the more lives we'll save. People are always going to die in car crashes, even once self-driving tech is refined; someone probably died in one during this forum conversation. This is a MAJOR problem that deserves our full attention. As tragic as this incident was, death has been, and will continue to be, a part of automobiles. We shouldn't stop pursuing autonomous technology because of an accident.
Uber took their cars off the roads, re-evaluated the situation, and will eventually redeploy them for further testing. Do I fully trust self-driving cars? Of course not. But testing only on private roads is a paradox: how can you be sure the test environment reflects the real one? You can't, which is why these cars are sharing the roads with us.
Buy me a Waymo van and I'll gladly give you my Mazda.
As I said in the post you quoted:
Self-driving cars should be pursued, but there need to be minimum safety requirements, independent testing, and auditing well before innocent human lives are placed at risk.
We, as a society acting through a proper regulatory body, need minimum safety requirements that define which collision-avoidance systems must be in place, how they must operate, and what performance is reasonable to expect in given scenarios.
These systems need reporting, audits, and validation testing before being placed on public roads, to ensure that safety is weighted as highly as reasonably possible and that safety issues are resolved quickly when discovered. That includes sufficient alerts to let human safety drivers intervene properly, as they are tasked to do.
It is inexcusable that Uber was able to put such a deficient system on public roads and kill someone. That was extremely negligent, and it should give everyone pause.