Wow, lots of cynical people here who have clearly never tested software of any kind.
Can we assume that Apple does not want to sell you software that causes collisions? If you can't agree with that premise, you can stop reading now since this discussion will go nowhere.
If you're still reading, let's be clear: there is NO BUG-FREE SOFTWARE. Autonomous cars especially cannot be tested in a vacuum.
From the way this reads:
The proposed requirement in §227.50(b)(3)(B)(vi) to describe the type of incident that would have happened without the disengagement should be removed. It requires speculation about future events that have not occurred.
Pretend for a moment you're a software engineer. You know which parts of your software work and which don't. The way the law is written apparently means you have to write PERFECT software before it's ever tested.
If you're in the early stages of development, imagine writing up fail after fail after fail, even when you KNOW that in some situations the code WILL fail, expectedly. Some news outlet gets ahold of that testing data and spins the hell out of it without fully knowing all the facts. In fact, this comment section is proof of that happening.
Let's say Company A did some testing and the safety driver had to intervene 100 times.
Company B did some testing and the safety driver had to intervene 50 times.
Does that make Company B's car more foolproof or Company A's software more buggy? Hardly. It could also simply mean that Company A tests a lot more than Company B, so what matters is interventions per mile driven, not the raw count (see the sketch below). Again, to get something right, you might have to "fail" a lot.
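To make that concrete, here's a minimal sketch with entirely made-up numbers: the intervention counts from above plus assumed test mileages (neither company publishes figures in this thread, so the mileages are purely hypothetical). It just shows why a raw intervention count tells you nothing without the miles driven behind it:

```python
# Hypothetical numbers only -- illustrating why raw disengagement counts mislead.
# Company A logs more interventions simply because it drives far more test miles.

def disengagements_per_1000_miles(interventions: int, test_miles: float) -> float:
    """Normalize a raw intervention count by miles driven."""
    return interventions / test_miles * 1000

company_a = {"interventions": 100, "test_miles": 50_000}  # assumed figures
company_b = {"interventions": 50, "test_miles": 2_500}    # assumed figures

for name, data in (("Company A", company_a), ("Company B", company_b)):
    rate = disengagements_per_1000_miles(data["interventions"], data["test_miles"])
    print(f"{name}: {data['interventions']} interventions over "
          f"{data['test_miles']} miles -> {rate:.1f} per 1,000 miles")

# Output:
# Company A: 100 interventions over 50000 miles -> 2.0 per 1,000 miles
# Company B: 50 interventions over 2500 miles -> 20.0 per 1,000 miles
```

With those assumed mileages, Company A "fails" twice as often in absolute terms but ten times less often per mile, which is the number that actually says something about the software.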
This is also not the horse and buggy era. News gets reported instantly nowadays with zero fact checking. This is part of the problem of fake news. I can see how Apple—or anyone else—would be extremely annoyed by this.
Can you imagine if computer software developers were required by law to report every time their software crashed while they were testing it? If I know the software program is going to "crash" at a certain point in the code, I will manually intervene and force quit the process before it happens. This is exactly what a safety driver does.
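For anyone who hasn't done this kind of testing, here's a toy sketch of what that looks like in practice: a supervising script that watches a test process and force quits it before a known failure point, which is roughly the software equivalent of a safety driver grabbing the wheel. The child command and the trigger string are purely hypothetical.

```python
# Toy "safety driver" for software: supervise a test process and force quit it
# before a failure we already know is coming. Everything here is illustrative;
# "simulated_test_drive.py" and the marker string are made up.
import subprocess

KNOWN_FAILURE_MARKER = "entering unvalidated code path"  # condition we know will crash

def run_with_supervisor(cmd: list[str]) -> str:
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True)
    try:
        for line in proc.stdout:
            print(line, end="")
            if KNOWN_FAILURE_MARKER in line:
                # Intervene before the expected crash, the way a safety driver
                # takes over ahead of a situation the car can't handle yet.
                proc.terminate()
                return "intervened"
        return "completed"
    finally:
        proc.wait()

if __name__ == "__main__":
    result = run_with_supervisor(["python", "simulated_test_drive.py"])
    print(f"run ended: {result}")
```

Every "intervened" result here is expected and planned for; forcing each one to be written up as a near-disaster is what the objection is about.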
How do you regulate close calls? You can't.
These are also just the PROPOSED rules for driverless cars.
And these systems are not even on sale yet. I would imagine the regulations for putting these autonomous cars through the wringer before they reach the public will be even stricter.
It's not just Apple that wants clarification; Google, Uber, and Tesla are each asking for revisions.
https://consumerist.com/2017/04/28/...fornia-to-revise-rules-for-self-driving-cars/