Tesla and Google Face Regulator Scrutiny After Self-Driving Cars Crash

Discussion in 'Mac Blog Discussion' started by MacRumors, Jul 12, 2016.

  1. MacRumors macrumors bot

    MacRumors

    Joined:
    Apr 12, 2001
    #1


    Google's self-driving car project has appointed its first general counsel after a number of crashes involving the company's vehicles caught the attention of regulators (via Reuters).

    The National Highway Traffic Safety Administration (NHTSA) said it was collecting information after a minor incident in March when a Google self-driving car struck a municipal bus in California. On that occasion, it did not open a formal probe.


    Tesla, however, is facing more intense pressure after one of its own cars was implicated in a fatal road accident recently. The NHTSA has opened a formal investigation into the May 7 death of a Tesla Motors Model S driver in Florida whose car was operating in "Autopilot" mode when it crashed into a semi-trailer.

    Tesla's Autopilot system uses cameras and radar, but not lidar - a special sensor that uses lasers to identify environmental obstacles more accurately. The company said its system would have had trouble distinguishing a white semi-trailer positioned across a road against a bright sky.

    Reuters reports that the United States Securities and Exchange Commission (SEC) is also looking into whether Tesla breached securities laws by not telling investors about the fatal May 7 Autopilot crash.

    The SEC investigation aims to determine whether the accident should have been labeled a "material event" by Tesla, or one that investors are likely to consider important, when the company sold $2 billion in stock on May 18.

    In a blog post written in response to a Fortune article on the subject, Tesla explained that all it knew when it notified the NHTSA of the accident was that the driver had died, not that Autopilot was involved. The SEC investigation continues.

    Industry executives and analysts told Reuters they expect the Tesla crash will spur investment in self-driving vehicle systems that combine multiple kinds of sensors, including lidar.

    Goldman Sachs forecasts the market for advanced driver assistance systems and autonomous vehicles will grow from about $3 billion last year to $96 billion in 2025 and $290 billion in 2035. More than half of that revenue in 20 years will come from radar, cameras and lidar, Goldman estimates.

    Meanwhile, U.S. regulators are currently lagging behind in issuing written regulations for autonomous vehicles. Regulations were meant to be unveiled by July 14, but U.S. Transportation Secretary Anthony Foxx announced last month they might not be released until later this summer.

    Apple has met with California DMV officials regarding self-driving car laws within the state and multiple reports from The Wall Street Journal indicate that the Cupertino company is exploring the functionality with the possibility of including it in a later iteration of the much-rumored Apple Car.

    The bulk of Apple's car research and development is thought to be taking place in secretive buildings in Sunnyvale, California, where late night "motor noises" have been heard in recent months.

    Multiple sources have indicated that the Apple Car could be finalized by 2019 or 2020, but a more precise timeframe remains unclear due to possible internal setbacks and other unforeseen circumstances. Tesla CEO Elon Musk recently called the Apple Car an "open secret," as his company aims to fulfil more than 325,000 pre-orders for its lower-priced Model 3 by late 2017.

    Article Link: Tesla and Google Face Regulator Scrutiny After Self-Driving Cars Crash
     
  2. dysamoria macrumors 6502a

    dysamoria

    Joined:
    Dec 8, 2011
    #2
    Exactly the kind of thing I was expecting to happen from day one with this idiotic technology. It was especially inevitable after the various references to "fast tracking" and "waivers" from standard regulatory requirements being granted to these companies.

    I hope this self-driving BS dies on the vine faster than google glass. We do not have artificial intelligence. Even the most irresponsible human operator with a license has more capability in terms of vision and judgment than any computer system possible today.
     
  3. koulmj Suspended

    koulmj

    Joined:
    Mar 18, 2013
    #3
    You can't take the human element out of everything. If a human soul weren't needed to drive, even my ex-girlfriend would be able to drive without hitting stuff...
     
  4. ThunderSkunk macrumors 68030

    ThunderSkunk

    Joined:
    Dec 31, 2007
    Location:
    Milwaukee Area
    #4
    Yep. There's nothing like relying on good old human senses. Those only screw up enough to cause 6,400,000 accidents and kill 30,000 people a year with mature technology. Three accidents and a single fatality by early versions of 2 totally different systems definitely closes the book on all this technology.

    Back to horses and buggies everyone!
     
  5. keysofanxiety macrumors G3

    keysofanxiety

    Joined:
    Nov 23, 2011
    #5
    I have absolutely no doubt that the next decade will prove this statement to be inaccurate.
     
  6. JohnApples macrumors 65816

    Joined:
    Mar 7, 2014
    #6
    Having auto-pilot in cars is tough. It can only get better if more people use it, but I know that I definitely don't want to be using it in one of its earliest implementations.

    "It's a gen 1 product, of course there's issues. By gen 3 or 4 it will be amazing" has often been said here. But that kind of logic doesn't make me comfortable when it comes to autonomous vehicles.
     
  7. Mildredop macrumors 68020

    Joined:
    Oct 14, 2013
    #7
    Whilst I kind of agree deep down, that's quite an unfair comparison.

    Driverless cars do a minuscule fraction of the miles that driven cars do, and always on carefully selected or even private roads.

    Humans are very capable of driving safely - it's distraction, ignoring the rules and taking chances that cause the accidents.

    I'd like to see better use of technology in cars rather than full-on autonomy. For example, a speed limiter linked to GPS.
     
  8. freepomme Suspended

    Joined:
    Oct 30, 2015
    Location:
    Boston, MA
    #8
    I told y'all self driving cars are dangerous and that it can't be done. We're not smart enough to make it, we're not that smart. We're just not!
     
  9. bennibeef macrumors 6502

    Joined:
    May 22, 2013
    #9
    I can just imagine what it sounded like when the first people tried to make a fuel engine which blew up in their faces and people said WE JUST CANT DO IT...

    Or the first planes which dropped out of the sky...

    Wow what is up with you guys
     
  10. ftaok macrumors 603

    ftaok

    Joined:
    Jan 23, 2002
    Location:
    East Coast
    #10
    I find it extremely difficult to believe that Tesla did not know that Autopilot was involved in the fatal accident.

    I recall a story where Elon Musk personally responded to a Journalist who wrote a critical article about being stranded in a Tesla without the ability to charge his car. Musk pointed out that the journalist had driven around in a downtown NYC parking garage for a long duration to deplete the battery before setting off on his journey. Then, Musk pointed out that the journalist had passed several Supercharger stations prior to running out of juice.

    I believe (and this is where I could be wrong), Tesla got this data via the car's communication system. Meaning that they didn't need to be in possession of the car to retrieve the data. If this is true, then I'm sure Tesla knew that Autopilot was operational as soon as they knew that it had been in an accident.
     
  11. koulmj Suspended

    koulmj

    Joined:
    Mar 18, 2013
    #11
    I would take a horse over a car...
     
  12. smacrumon, Jul 12, 2016
    Last edited: Jul 12, 2016

    smacrumon macrumors 68030

    smacrumon

    Joined:
    Jan 15, 2016
    #12
    I'm a big fan of autonomous driving and an even bigger fan of the Tesla electric car project so it was greatly disappointing and saddening to hear about the first fatality and the unusual aspects of why the system failed.

    Tesla will need to ensure safety is a #1 priority and I'm certain Elon Musk and team are on to this.

    I still wait in great anticipation of receiving the Model 3.
     
  13. soupcan macrumors 6502a

    soupcan

    Joined:
    Nov 21, 2014
    Location:
    Netherlands
    #13
    Seriously. The Model S and X continuously tell you to stay alert at all times and take over when needed. If you're not doing that, you're going to have a bad time.

    Tesla stated it themselves. First fatal accident with Autopilot with over 130 million miles driven. The system is still very much in beta, and if you still believe that system is 100% bug-free all the time you're the one to blame.
     
  14. mw360 macrumors 65816

    mw360

    Joined:
    Aug 15, 2010
    #14
    The car in question was owned by Tesla, loaned to the journalist, returned to Tesla at the end of the drive and Tesla purposely perform extra logging of 'media test drives' after a previous journalistic 'incident'.
     
  15. smacrumon macrumors 68030

    smacrumon

    Joined:
    Jan 15, 2016
    #15
    Statistically speaking, this "self-driving BS," as you put it, is actually much safer than human operators on average. What is idiotic technology is a bonnet full of a thousand different oil-filled parts ready to fail at any given point in time, combined with a driver who might fail at any given point in time as well.

    Autonomous driving is one option in the future. There's nothing to fear about it. Instead, fear and get outraged about the current norm, if you actually must fear and be outraged about something.
     
  16. mw360 macrumors 65816

    mw360

    Joined:
    Aug 15, 2010
    #16
    I support the self-driving initiative but something is a little worrying and hasty about Tesla's autopilot rollout. Self-drive should either totally work, or totally not. I don't know that a confessed beta should really be in the hands of the general public with a brief warning. That's a very muddy assignment of responsibility.

    130 million miles per fatality isn't great. I think the US average is about 7 fatalities per billion miles, so it's, I guess, about what you'd expect; but for all the fear and suspicion surrounding these things, we need much better results than this.
     
  17. iKen1 macrumors member

    Joined:
    Oct 16, 2012
    #17
    @keysofanxiety you cannot possibly be correct. He said "any computer system possible today", that isn't going to change in the next decade.
     
  18. Doogh2004 macrumors newbie

    Joined:
    Jun 16, 2016
    #18
    People said the same thing about computers playing chess, and now they are virtually unbeatable.

    In the future (within 15 years), I guarantee self-driving cars will be safer than human-driven cars. Thinking otherwise really shows a lack of understanding of neural network AI. These algorithms use big data to learn and have proven able to exceed human capacity eventually.

    As the tech leaders have said, the next decade will be explosive in terms of AI.
     
  19. keysofanxiety macrumors G3

    keysofanxiety

    Joined:
    Nov 23, 2011
    #19
    Haha, well spotted. You've won the Pedant of the Day award, I suppose...? ¯\_(ツ)_/¯
     
  20. swarmster macrumors 6502a

    swarmster

    Joined:
    Jun 1, 2004
    #20
    IMO the "system" that failed wasn't Tesla's, it was the one that allowed the highway-entering truck that cut him off to drive without under-ride guards. They're required in most other first-world countries, and would likely have saved this person's life (as well as about 250 others' this year). But sure, let's investigate the victim instead.
     
  21. milo macrumors 604

    Joined:
    Sep 23, 2003
    #21
    Isn't the Tesla system just assisted driving and not self driving? I've read that the person is still doing the driving and it's supposed to help with things like staying in a lane, crash avoidance, etc. Sounds like the biggest issue is calling it "autopilot" which sounds like self driving but isn't.
     
  22. RichTeer macrumors member

    RichTeer

    Joined:
    Aug 13, 2014
    Location:
    Kelowna, BC, Canada
    #22
    I don't fear self-driving car tech, and as a self-confessed geek I find it interesting. But I'm a computer programmer, so I'm not ready to put my life into an AI's hands.

    Even if that were not the case, the crux of the matter for me is very simple: I like driving, so I'm not particularly interested in embracing self-driving cars.
     
  23. Gasu E. macrumors 601

    Gasu E.

    Joined:
    Mar 20, 2004
    Location:
    Not far from Boston, MA.
    #23
    Illustrations of the above would include:

    speeding
    tail-gating
    lane weaving
    talking on a phone
    changing radio stations

    And that's not to ignore the more egregious examples:

    texting
    extreme geriatric driving
    extreme examples of the previous list

    This combined list probably covers ~95% of US drivers at some point. And 99% of drivers in urban developing economy settings.

    I would bet on a fully autonomous vehicle at least equaling the safety of the average driver with today's tech. With tech available in road test vehicles in five years, I have no doubt the autonomous machines will be much better.
    --- Post Merged, Jul 12, 2016 ---

    "7 fatalities per billion miles" = 1 fatality per ~140 million miles, so those are the same within the statistical margin of error.
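    The conversion above can be sanity-checked in a couple of lines (a quick Python sketch; the figures are just the ones quoted in this thread, not independently verified):

```python
# Compare the quoted US average fatality rate with Tesla's reported
# Autopilot figure, converting both to miles per fatality.
us_fatalities_per_billion_miles = 7      # rough US average quoted above
tesla_miles_per_fatality = 130e6         # 1 fatality in ~130M Autopilot miles

# Invert the US rate for a like-for-like comparison.
us_miles_per_fatality = 1e9 / us_fatalities_per_billion_miles

print(f"US average: ~{us_miles_per_fatality / 1e6:.0f}M miles per fatality")    # ~143M
print(f"Tesla Autopilot: ~{tesla_miles_per_fatality / 1e6:.0f}M miles per fatality")  # ~130M
```

    So the two rates are indeed within the same ballpark, as the post says.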

    However the one fatality was for autopilot not full autonomy. There is a widely-held belief that full autonomy is safer than a hybrid approach such as Tesla's autopilot.
     
  24. Zimmie macrumors member

    Joined:
    Feb 16, 2015
    #24
    Correct! Luxury car manufacturers have had the technologies Tesla has released as "Autopilot" for 15+ years now. Adaptive cruise control, lane follow assist, automatic braking (sort of related to the adaptive cruise control), automatic parallel parking, collision mitigation (such as Mercedes-Benz' Pre-Safe system), adaptive high beams (the car detects oncoming traffic and steers the headlights away from it), speed limit assistance (recognizes speed limit signs and warns you if you're speeding), and so on. To the best of my knowledge, none of them have marketed their cars as having "autopilot".

    Edited to add: Some of the systems I mentioned aren't even currently offered by Tesla. I don't think they do speed limit assistance or collision mitigation yet, and I'm not sure about their headlights.
     
  25. smacrumon macrumors 68030

    smacrumon

    Joined:
    Jan 15, 2016
    #25
    Let's investigate the victim? Sure, under-ride guards should be a requirement. But... no, the failure is in Tesla's system, unfortunately. There are always going to be cases where vehicles and other obstructions cause issues on the road. The onus will always need to be on the self-driving car to identify all threats and respond appropriately without harm. Preventing harm to the occupants and to those outside the vehicle must always be a paramount responsibility for the autonomous vehicle. The bar is incredibly high, and it will be possible for Tesla to reach and rise above it; I'm confident of this.
     