Apple Asks California DMV to Make Changes to Autonomous Vehicle Testing Policies

Discussion in 'MacRumors.com News Discussion' started by MacRumors, Apr 28, 2017.

  1. cmwade77 macrumors 65816

    Joined:
    Nov 18, 2008
    #51
    Umm, based on what I have seen here, Apple wants the rules changed for everyone, not just themselves.

    And to sum it up, Apple wants the reports to include cases where there were problems with the system, but they don't want to have to report when, say, the driver intentionally disengaged the system because that was part of the test. Construction zones, software bugs, and sensor dropouts, though, I think should still have to be reported.
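
    To make the distinction concrete, here's a minimal sketch in Python (the category names are my own, purely for illustration; they are not DMV or Apple terminology):

    ```python
    # Hypothetical sketch of the reporting split described above.
    # Category names are invented for illustration, not DMV terms.
    from enum import Enum, auto

    class DisengagementCause(Enum):
        PLANNED_TEST = auto()        # driver disengages on purpose as part of a test
        END_OF_RUN = auto()          # driver parks the car, takes the key, and leaves
        CONSTRUCTION_ZONE = auto()   # system hands back control near roadwork
        SOFTWARE_BUG = auto()        # system fault
        SENSOR_DROPOUT = auto()      # hardware/perception fault

    # Causes that would still have to be reported under this view.
    REPORTABLE = {
        DisengagementCause.CONSTRUCTION_ZONE,
        DisengagementCause.SOFTWARE_BUG,
        DisengagementCause.SENSOR_DROPOUT,
    }

    def must_report(cause: DisengagementCause) -> bool:
        """True if the disengagement reflects a system problem
        rather than a planned or routine handover."""
        return cause in REPORTABLE
    ```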
     
  2. 69650 Suspended

    69650

    Joined:
    Mar 23, 2006
    Location:
    England
    #52
    These cars are on public roads, so we should have the right to know the full details.
     
  3. YegorH macrumors regular

    Joined:
    Jul 9, 2010
    #53
    - Apple Asks California DMV to Make Changes to Autonomous Vehicle Testing Policies.

    - DMV asks Apple to send the request via telegram.
     
  4. Macaholic868 macrumors 6502

    Macaholic868

    Joined:
    Feb 2, 2017
    #54
    I wasn't aware that corporations could be "excited" since ... you know ... they aren't living things. I look forward to the day when the majority of Americans come to their collective senses and flush the idea that "corporations are people and the exchange of money is speech" down the toilet where it belongs.
     
  5. Samford macrumors member

    Joined:
    Jan 24, 2011
    #55
    Well, I think you need to update your thinking on drugs and driving. The latest roadside testing shows that drug-affected driving is much more common than alcohol-affected driving. In fact, most country-road truck accidents in Australia are a combination of uppers and fatigue.
     
  6. itsamacthing macrumors 6502a

    Joined:
    Sep 26, 2011
    Location:
    Bangkok
    #56
    Play by the rules, Apple; you are not that special. CA, ask Apple to open a factory in Cali in exchange.
     
  7. GoodWheaties macrumors 6502a

    GoodWheaties

    Joined:
    Jul 8, 2015
    #57
    There are myriad reasons why the driver should probably regain control that shouldn't count as incidents: construction, unmarked roads, congested traffic, and such.
     
  8. RamsayBolton, Apr 28, 2017
    Last edited: Apr 28, 2017

    RamsayBolton macrumors member

    RamsayBolton

    Joined:
    Apr 20, 2017
    #58
    In other words, Timmy Cook's Apple, like a spoiled child, wants special rules for themselves.
    Because Timmy Cook's Apple is not like every other company. Apple is special. Apple is magical. Apple is courageous.
     
  9. kdarling, Apr 28, 2017
    Last edited: Apr 28, 2017

    kdarling macrumors P6

    kdarling

    Joined:
    Jun 9, 2007
    Location:
    First university coding class = 47 years ago
    #59
    If the reason is marked in the reports, it seems like valuable info for the public to know.

    For example, if it turns out to be common to have to take control for ANY reason, say a half dozen times a day, it can temper our real-life expectations for self-driving cars and how much attention you have to pay as the designated emergency driver.

    Even having it happen on an average of once a day could make it impossible for the blind or elderly to rely on such vehicles.
     
  10. 69Mustang macrumors 604

    69Mustang

    Joined:
    Jan 7, 2014
    Location:
    In between a rock and a hard place
    #60
    We agree? I change my mind. I disagree with everything you just said. In doing so, I completely understand I am disagreeing with everything I previously said. So be it. Ever the contrarian, never let it be said that Mustang could be reasonable or amenable to consensus... on anything. It's just my nature. I will sting the frog every.single.time.:D <-- not sure of your age so that reference may be too dated. Trust me, it makes the quote funnier.:)
     
  11. mkldev macrumors regular

    Joined:
    Apr 1, 2003
    #61
    Ah, but what you're missing there is that the one drunk idiot won't kill others if the drunk driver isn't driving. Drunk driving deaths are 100% caused by human error, so if you take the bag of meat completely out from behind the wheel, drunk-driving deaths will drop to zero, as will drug-influenced-driving deaths, sleep-deprived-driving deaths, cell-phone-distracted driving deaths....

    The only deaths that can't be prevented by self-driving cars, assuming a perfectly ideal implementation, are deaths caused by freakishly sudden, catastrophic vehicle failures (e.g. blowouts in the middle of tight curves, drive shafts breaking and impaling the gas tank, etc.).
    --- Post Merged, Apr 28, 2017 ---
    That depends on why, when, and how it kicks out.

    If it happens once per day without warning, and forces the driver to immediately take over, that would arguably be worse than nothing, because drivers will become reliant on the automatic driving systems, and when they kick out, drivers often won't be paying enough attention and will crash into a bridge abutment.

    If it kicks out only when it drives into a parking lot because it can't park, or when it enters a construction zone, or at other, highly predictable times with plenty of advance warning, it would be fine for all but the blind, and with proper route planning and navigation software that knows how to avoid construction zones, it might even be fine for the blind (provided that they can ask somebody to park the car when they get there).

    And obviously there's a wide continuum between those extremes.
     
  12. trifid, Apr 28, 2017
    Last edited: Apr 29, 2017

    trifid macrumors 68000

    trifid

    Joined:
    May 10, 2011
    #62
    My theory: Apple's car tech relies on Apple Maps, which sucks, in part because it's often not updated as fast (or at all) for construction or road changes as Google Maps is. Hence Apple is experiencing a lot more "disengagements" than Google and doesn't want all of that to be reported.
     
  13. MH01 Suspended

    MH01

    Joined:
    Feb 11, 2008
    #63
    From a NSW report I read a year ago, alcohol was still higher than drugs, unless this has changed since 2015.

    Also, using professional truck drivers isn't fair in this comparison, because their employment circumstances mean they resort to drugs to force themselves to stay awake; it's not for pleasure.

    For the most common country-road deaths not related to professional drivers, is it still not alcohol?
     
  14. alexgowers macrumors 65816

    Joined:
    Jun 3, 2012
    #64
    Different rules for corps = Corruption

    Enjoy the bad PR, Apple; it'll really help sales.

    Driving on PUBLIC roads and risking the safety and health of the PUBLIC should mean they're required to report to the PUBLIC on the safety of these devices, which can then be approved or withdrawn by the PUBLIC. Sorry, Apple, but it's a requirement; don't try to bend the rules.
     
  15. MH01 Suspended

    MH01

    Joined:
    Feb 11, 2008
    #65
    Apple, safety comes first, and that is not for you to judge. Reporting stays the same for everyone. These are people's lives at stake, not a new fashion product.
     
  16. gnasher729 macrumors P6

    gnasher729

    Joined:
    Nov 25, 2005
    #66
    Some safety regulations: Apple thinks they shouldn't have to report when the self-driving system disengages because the driver stops the car, takes out the key, and leaves. Absolutely, apolloa; that mustn't be allowed to happen.

    Apple thinks they shouldn't have to report when the self-driving system disengages during a test that is _intended_ to end in disengagement. So when Apple creates a situation that _should_ lead to disengagement and it works as planned, that should be reported. Absolutely.

    My god, you are such a hater. For you, _everything_ is a negative. What's going on with you?

    Because when you read what they are asking for, these are things that absolutely make sense. Google, for example, is asking to be allowed to sell formerly self-driving cars after all the self-driving equipment has been removed, turning them into ordinary cars, instead of having a one-year-old car worth maybe $15-20K that can only be put into the shredder.
    --- Post Merged, Apr 29, 2017 ---
    FYI: Freeways are the safest place; the most dangerous are cities and rural areas. Cities have more accidents, while rural areas have more deaths because it takes longer for help to arrive.
     
  17. dilbert99 macrumors 68020

    Joined:
    Jul 23, 2012
    #67
    No, it doesn't.
    I think they should report bugs and sensor dropouts.
     
  18. GoodWheaties macrumors 6502a

    GoodWheaties

    Joined:
    Jul 8, 2015
    #68
    Yes, it is valuable information, but what you're forgetting is that they are in the testing stage right now. They are purposely disengaging the system or putting it in situations where it may disengage, and that is artificially inflating the numbers. The only exemption I disagree with is the one for system failures; the rest would eliminate useless chaff from the numbers.
     
  19. TechGeek76 Suspended

    Joined:
    Jul 18, 2016
    #69
    I'm sorry, but if the user manually disengages the system, or disables it, that should very well be known.
     
  20. ravenstar macrumors regular

    Joined:
    Jan 12, 2005
    #70
    Please, please, let's start by banning all human drivers. I drive a mere 15 miles to work each day, and barely a day goes by without some inattentive idiot crossing the centerline directly at me so that I have to take evasive action. I haven't heard anything yet to suggest that the self-driving cars currently on the road are less safe than the average human driver.
    --- Post Merged, Apr 29, 2017 ---
    I agree, this seems like precisely the situation that should be reported.

    But if the goal is consistent reporting, it does seem that reporting all disengagements that occur while driving on public roads, along with their causes, is the only way to be certain of consistency. It is as important to safety to understand how often the operational parameters of the system are exceeded as it is to know how often the system fails. Inconsistencies are resolved by better definitions of the reporting categories, not by eliminating reporting.
    --- Post Merged, Apr 29, 2017 ---
    Why should these situations not be considered in evaluating the safety and reliability of self-driving systems? Looking at it another way, should any driver be able to obtain a license without being able to safely handle such situations? It may be reasonable to end a test due to a situation beyond the design constraints during system development, but the public and the government should have this information when they are asked to approve self-driving vehicles for more than just testing with a trained test driver.
     
  21. groovyd Suspended

    groovyd

    Joined:
    Jun 24, 2013
    Location:
    Atlanta
    #71
    The only things that should be reported are total miles and time driven in autonomous mode, and any real accidents while in autonomous mode.
     
  22. pat500000 Suspended

    pat500000

    Joined:
    Jun 3, 2015
    #72
    That means you shouldn't drive either.
     
  23. ravenstar macrumors regular

    Joined:
    Jan 12, 2005
    #73
    I have no problem with that if it means other drivers will not be on the roads either. I'm no more infallible than the next guy.
     
  24. SpinThis!, Apr 29, 2017
    Last edited: Apr 29, 2017

    SpinThis! macrumors 6502

    Joined:
    Jan 30, 2007
    Location:
    Inside the Machine (Green Bay, WI)
    #74
    Wow, lots of cynical people here who have clearly never tested software of any kind.

    Can we assume that Apple does not want to sell you software that causes collisions? If you can't agree with that premise, you can stop reading now since this discussion will go nowhere.

    If you're still reading, let's be clear: there is NO BUG FREE SOFTWARE. Autonomous cars especially cannot be tested in a vacuum.

    From the way this reads:

    The proposed requirement in §227.50(b)(3)(B)(vi) to describe the type of incident that would have happened without the disengagement should be removed. It requires speculation about future events that have not occurred.

    Pretend for a moment you're a software engineer. You know which parts of your software work and which don't. The way the law is written apparently means you have to write PERFECT software before it's ever tested.

    If you're in the early stages of development, imagine writing up fail after fail after fail, even if you KNOW in some situations the code WILL fail. Expectedly. Some news outlet gets ahold of this testing and spins the hell out of it without fully knowing all the facts. In fact, this comment section is proof of this happening.

    Let's say a company did some testing and the safety driver had to intervene 100 times.
    Another company did some testing and the safety driver had to intervene 50 times.

    Does that make Company B's car more foolproof or Company A's software more buggy? Hardly. It could also simply mean that Company A tests a lot more than company B. Again, to get something right, you might have to "fail" a lot.
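
    A quick back-of-the-envelope sketch makes this concrete (the mileage figures below are invented purely for illustration): raw intervention counts mean nothing without a denominator.

    ```python
    # Raw intervention counts vs. interventions per 1,000 miles.
    # All figures are hypothetical, invented for illustration.
    fleets = {
        "Company A": {"interventions": 100, "miles_tested": 500_000},
        "Company B": {"interventions": 50,  "miles_tested": 20_000},
    }

    for name, fleet in fleets.items():
        rate = fleet["interventions"] / fleet["miles_tested"] * 1_000
        print(f"{name}: {fleet['interventions']} interventions, "
              f"{rate:.2f} per 1,000 miles")

    # Company A: 100 interventions, 0.20 per 1,000 miles
    # Company B: 50 interventions, 2.50 per 1,000 miles
    # The fleet with more raw interventions looks far safer once you
    # account for how much testing it actually did.
    ```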

    This is also not the horse and buggy era. News gets reported instantly nowadays with zero fact checking. This is part of the problem of fake news. I can see how Apple—or anyone else—would be extremely annoyed by this.

    Can you imagine if computer software developers were required by law to report every time their software crashed while they were testing it? If I know the software program is going to "crash" at a certain point in the code, I will manually intervene and force quit the process before it happens. This is exactly what a safety driver does.

    How do you regulate close calls? You can't.

    These are also just the PROPOSED rules for driverless cars.

    And these systems are not even on sale yet. I would imagine the regulations for putting these autonomous cars through the wringer are even stricter.

    It's not just Apple that wants clarification; Google, Uber and Tesla each are asking for revisions.

    https://consumerist.com/2017/04/28/...fornia-to-revise-rules-for-self-driving-cars/
     
  25. gnasher729 macrumors P6

    gnasher729

    Joined:
    Nov 25, 2005
    #75
    I interpret the "construction zone" case as a car with a driver approaching a construction zone, where the driver then decides to take over control. There's no risk to the public here. It would be different if the car went into the construction zone while the driver allowed it to drive itself, and suddenly things went wrong and the driver had to take over.
    --- Post Merged, Apr 29, 2017 ---
    Imagine hypothetically that Google has self-driving software that works really well. And they decide to test extreme cases. So this Google car is driving along and someone pushes a baby trolley right into its way (with a doll inside, not a baby). They would fully expect this to go wrong, record the results, and use the results to improve the software. Do people really think this should be reported?
     