Apple's Self-Driving Car Performance May Not Be So Bad After All

So companies can just report numbers based on completely different, unclear conditions?

That raises the question of what the reports are actually good for.

Not really a lot. There was much discussion of their meaninglessness when the requirement was first announced. It's mostly politics, I think.
 
Question: How do you know Apple's disengagement rate is worse than Google's?

It's not exactly from an official news agency, but...
https://mashable.com/article/waymo-beats-apple-self-driving-car-report/#8VNWvR_gWiqH

Waymo has 0.09 disengagements per 1,000 miles... that works out to roughly 11,017 miles between disengagements.
That's still a lot better than Apple's rate of one "important disengagement" every 2,005 miles.

Of course, I have no idea how, or by what standard, Waymo reported disengagements... But why doesn't Apple just tweak the way they report and come out on top then? Why stop at 2,005 miles?
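For anyone who wants to check the arithmetic, converting between the two ways of quoting the rate is trivial. A quick sketch; note the 0.09 figure is rounded, which is why 1,000/0.09 doesn't land exactly on the 11,017 from the report:

```python
# Convert a disengagement rate per 1,000 miles into miles between
# disengagements, and vice versa. Figures are the ones quoted above;
# 0.09 is a rounded rate, so the result differs slightly from 11,017.
def miles_between_disengagements(rate_per_1000):
    return 1000.0 / rate_per_1000

def rate_per_1000_miles(miles_between):
    return 1000.0 / miles_between

print(miles_between_disengagements(0.09))  # ~11,111 miles (report: 11,017)
print(rate_per_1000_miles(2005))           # Apple: ~0.499 per 1,000 miles
```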
 
As of July 2018, however, Apple stopped reporting its total number of disengagements and instead began focusing on "important disengagements", i.e. disengagements that might have resulted in a safety-related event (an accident) or a violation of the rules of the road.

Anyone starting to notice a trend here?

"Don't like the figures? Change/stop the reporting."

Good grief Apple. o_O
 
Of course, I have no idea how, or by what standard, Waymo reported disengagements... But why doesn't Apple just tweak the way they report and come out on top then? Why stop at 2,005 miles?
Because the numbers are meaningless. Driver A might take control at the slightest bit of difficult traffic, driver B might only take control to avoid an imminent accident. Driver C might look for very easy terrain with little traffic, driver D might test the car in the middle of Bangkok during rush hour.
 
... But why doesn't Apple tweak the way they report and come out on top then? Why stop at 2,005 miles?

It's called integrity. Despite some conspiracy theories, Apple has it, and it runs through everything they do. Combined with their secrecy, Apple sometimes appears to be hiding, covering up, or ignoring things, but the reality is never so interesting.

Since there is no legal definition of "disengagement", you could imagine this as a completely plausible explanation. I'm not saying it is the explanation, but it is a plausible one: Apple's cars are fully capable of driving themselves and are continually calculating what steering, throttle, and braking inputs they would use to maintain course and route safely, but the computer isn't actually controlling anything; the safety driver is. Any time the computed inputs differ from the safety driver's actual inputs by more than some allowance factor, that event is counted as a disengagement.
Example: the driver knew, intuitively as a human, that a car in front making a right turn would clear traffic and pose no threat at the current speed, but the computer wasn't sure of that, so it would have decelerated slightly. Boom, disengagement.

I don't know how many people have actually been in an autonomous car, never mind lived with one, but I do. While not official from the factory, my car will drive autonomously to a significant degree, and that scenario is one I regularly encounter: the computer insists on braking or throwing up a collision warning when, as a human, I know the situation is 100% safe at my current speed, and I'm not talking about allowing close calls here.
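The "shadow mode" counting speculated about above could be sketched roughly like this. To be clear, this is purely hypothetical: every name and threshold here is invented for illustration, and nothing public specifies how Apple actually counts.

```python
# Hypothetical sketch of "shadow mode" disengagement counting: the
# planner computes inputs but never controls the car; any mismatch with
# the human driver's actual inputs beyond a tolerance gets logged as a
# "disengagement". All names and thresholds here are invented.
def count_shadow_disengagements(frames, steer_tol_deg=5.0, speed_tol_mps=2.0):
    count = 0
    for computed, actual in frames:
        steer_delta = abs(computed["steer_deg"] - actual["steer_deg"])
        speed_delta = abs(computed["target_mps"] - actual["target_mps"])
        if steer_delta > steer_tol_deg or speed_delta > speed_tol_mps:
            count += 1  # computer and human disagreed: one "disengagement"
    return count

# The right-turn example above: the planner would have slowed to 10 m/s
# while the human held 13 m/s, so the second frame gets logged.
frames = [
    ({"steer_deg": 0.0, "target_mps": 13.0}, {"steer_deg": 0.0, "target_mps": 13.0}),
    ({"steer_deg": 0.0, "target_mps": 10.0}, {"steer_deg": 0.0, "target_mps": 13.0}),
]
print(count_shadow_disengagements(frames))  # 1
```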
 
It's not exactly from an official news agency, but...
https://mashable.com/article/waymo-beats-apple-self-driving-car-report/#8VNWvR_gWiqH

Waymo has 0.09 disengagements per 1,000 miles... that works out to roughly 11,017 miles between disengagements.
That's still a lot better than Apple's rate of one "important disengagement" every 2,005 miles.

Of course, I have no idea how, or by what standard, Waymo reported disengagements... But why doesn't Apple just tweak the way they report and come out on top then? Why stop at 2,005 miles?

Waymo is, and has been, the clear leader in the field. They started early and have invested countless sums of money into the project.

It would be unfair to compare Apple to Waymo, given the latter's head start, but those comparisons will be made anyway, much as Google Maps vs. Apple Maps was (seven years in that instance).

That said, Kool-Aid rationalization pieces like this are more of what I expect from AI, not MR.

Does anyone really believe that others in the field don't follow similar procedures? (Well, maybe not Uber).

Look, no reasonable observer is going to expect the same performance at this stage from Apple, vs. what Waymo can achieve. Such development time and testing experience can't be rushed.

If anything, trying to spin it (like the iPP warpage issues) is only going to make it worse.

Apple's true peers in this situation are the others who also got later starts (which includes some big OEMs), and compared to them, Apple is on par. Not Waymo, not Cruise.

But, such stories, and attendant followups, do serve the media machine well.
 
Can’t be so hard to standardise the reporting, can it? To me it doesn’t matter why the software disengaged: if it wasn’t able to handle a scenario while on public roads (or the driver had to, or wanted to, intervene), it failed at its purpose of driving autonomously. Excluding disengagements due to ‘unexpected objects on the road’ or other arbitrary reasons from the statistic takes away any comparability.

This number should be a benchmark for how capable the different systems are of adapting to any situation on the road, yet Apple openly states that they excluded certain data because their software wasn’t able to operate under those conditions. I don’t want to hate on Apple specifically, since we don’t know how the other companies report their numbers, but this statement doesn’t sound very convincing. A couple of weeks ago I saw a dash-cam video of a Tesla Model 3’s Autopilot stopping the car from spinning and steering it back into its original lane, while Apple’s system gets put off by roadworks...
 
Really, whether it was every 1.1 miles or every 2,002 miles, that's still very poor compared to the competition, obviously years behind. The best comparison I can make: the Apple Car will be like their HomePod, which came out late and still had bad AI. Google is literally miles ahead of everyone else in this space. I'm not sure anyone can catch them in the next 3-5 years, and by then, who knows where Google will be.
 
Apple Maps and Siri have been in sleep mode for years under Cook's watch. Apple Music was late to the party, and the revolutionary TV service has yet to show up. I think we need someone else to go.

Did you just completely ignore what I said? There's a new head of AI at Apple which is now in charge of Machine Learning and AI (which means there's a new manager of Siri now). That thing you said about Apple should "switch up management" and to "bring in new blood", Apple did that several months ago.

So what are you trying to say? Get rid of the newly hired John Giannandrea before he can do anything with Siri?
 
Fascinating. And not surprising considering that Apple has a history of reporting numbers conservatively.

All that said, I still get a little ooky when I think about ceding control of my vehicle to a robot. I am typically an early adopter, but in the case of autonomous vehicles, I will happily wait until it's proven beyond all reasonable doubt; call me a Luddite.
I was a little hesitant myself, but eager to try it. Three years ago, I bought a car with Adaptive Cruise Control (ACC), and once I figured out its limits, it made commuting so much easier, more enjoyable, and relaxing. Last year I bought a car with semi-autonomous driving. It also took a little time to adjust, but I wouldn’t want a car without it any more. I recommend being extra alert at first and learning what limits the system has. Using either ACC or Pilot Assist, the car generally begins braking for the car in front of me before I would have; that is, early and gently.
 
Did you just completely ignore what I said? There's a new head of AI at Apple which is now in charge of Machine Learning and AI (which means there's a new manager of Siri now). That thing you said about Apple should "switch up management" and to "bring in new blood", Apple did that several months ago.

So what are you trying to say? Get rid of the newly hired John Giannandrea before he can do anything with Siri?

I read what you said; I don't think it's enough. That move should have happened years ago. Why did Cook only wake up now? The problem is that the top of Apple lacks the drive and vision to hold the rest of the company accountable.
 
I hope Apple is not only testing those in California...
 
The statistic I'd be most interested in is the number of accidents or near-accidents for a self-driving car versus the number for a comparable number of miles driven in a regular car by an average driver.

I don't expect that self-driving cars will ever be absolutely perfect, but I'd like to know when they're better, on average, than we humans are.

It very well might be that self-driving cars are already safer than human-driven cars. But the self-driving cars are not driving in all conditions, and they drive very conservatively. For example, a Waymo car will wait nearly "forever" to make a left-hand turn, until there is no oncoming car in sight. That is safe, but at the cost of being overly slow.

I also read that Apple's cars drove only 56,000 miles. That is VERY low for a fleet of two dozen cars. To date, Waymo has driven about 10 million miles. But to get good, meaningful statistics you need to drive at least a billion miles. You need thousands of cars to do that.


Apple has a LONG way to go. 56K miles is "nothing" and not even statistically meaningful. Even Waymo's 10,000,000 miles is not enough for us to know whether it is safer than human drivers.
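A back-of-the-envelope way to see why a billion miles is the right order of magnitude: US human drivers average very roughly one fatal crash per 100 million miles, and by the statistical "rule of three", driving n event-free miles only bounds the true event rate below about 3/n at 95% confidence. A sketch (the human baseline is an approximate outside figure, not from this thread's source):

```python
# Rule-of-three sketch: how many event-free miles are needed before we
# can claim, at ~95% confidence, an event rate below the human baseline?
# The baseline (~1.1 fatal crashes per 100 million miles) is a rough US
# figure; the fleet mileages are the ones quoted in the thread.
HUMAN_FATAL_RATE = 1.1e-8  # fatal crashes per mile, approximate

def upper_bound_rate(event_free_miles):
    # 95% upper confidence bound on the rate after zero observed events
    return 3.0 / event_free_miles

for miles in (56_000, 10_000_000, 1_000_000_000):
    bound = upper_bound_rate(miles)
    verdict = "below human baseline" if bound < HUMAN_FATAL_RATE else "inconclusive"
    print(f"{miles:>13,} miles -> rate < {bound:.2e}/mile ({verdict})")
```

With 56,000 or even 10 million event-free miles the bound is still far above the human rate; only around a billion miles does it drop below, which matches the post's claim.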
 
Are disengagement reports a government requirement? It seems kind of weird to report on something that hasn't even been announced yet, let alone in production.

Yes, they are required, so that these companies can hit the public roads, which is exactly what they do!
We want to know whether, and how much of, a safety risk they are.
It very well might be that self-driving cars are already safer than human-driven cars. But the self-driving cars are not driving in all conditions, and they drive very conservatively. For example, a Waymo car will wait nearly "forever" to make a left-hand turn, until there is no oncoming car in sight. That is safe, but at the cost of being overly slow.

I also read that Apple's cars drove only 56,000 miles. That is VERY low for a fleet of two dozen cars. To date, Waymo has driven about 10 million miles. But to get good, meaningful statistics you need to drive at least a billion miles. You need thousands of cars to do that.


Apple has a LONG way to go. 56K miles is "nothing" and not even statistically meaningful. Even Waymo's 10,000,000 miles is not enough for us to know whether it is safer than human drivers.

Overall speed doesn't depend on how fast you jump into the line!
It doesn't matter whether AI cars are safer (which they will be in the long run anyway); that's entirely the wrong narrative. We simply need and want them. It's called service, or comfort. And we can free up a ton of workforce and costs.
 
I read what you said; I don't think it's enough. That move should have happened years ago. Why did Cook only wake up now? The problem is that the top of Apple lacks the drive and vision to hold the rest of the company accountable.

You can't really change the past, so looking forward, there's not much Cook can do now but to let the new guy do his job. Apple now has probably the best guy in the world to improve Siri. So to ask for a change in management for Siri today doesn't make sense.
 
There AREN'T any rules about Disengagement Reporting;
Not true. From https://www.dmv.ca.gov/portal/dmv/detail/vr/autonomous/testing, see https://www.dmv.ca.gov/portal/wcm/c...essAV_Adopted_Regulatory_Text.pdf?MOD=AJPERES

a) Upon receipt of a Manufacturer’s Testing Permit or a Manufacturer’s Testing Permit –Driverless Vehicles, a manufacturer shall commence retaining data related to the disengagement of the autonomous mode. For the purposes of this section, “disengagement” means a deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle, or in the case of driverless vehicles, when the safety of the vehicle, the occupants of the vehicle, or the public requires that the autonomous technology be deactivated.
It then goes on to describe what the report should include.
 
It is still too early in development for these numbers to be of any value or an indication of anything. They are not doing user-acceptance testing or anything like that. These cars are driven by engineers who want to produce usable data. Every disengagement event represents a data point that will be fed back to the software for analysis. A disengagement event is a good thing; it's the human telling the 'puter how to do it right.

If you want to teach the thing as fast as possible you want disengagements to happen frequently and not every 1000 miles.

But this doesn't matter, not one bit. It's far too early in development.
 
Somebody needs a wake up call and it isn’t Apple

Everybody who thinks that Apple is the technology leader company needs a wake up call.

The only really good stuff Apple still has is macOS and iOS. They're already old, but still the best in the industry, leaving the rest far behind :-( It is sad; I really liked Apple when it was the good company, with a warm heart and the best products.
 
It's not exactly from an official news agency, but...
https://mashable.com/article/waymo-beats-apple-self-driving-car-report/#8VNWvR_gWiqH

Waymo has 0.09 disengagements per 1,000 miles... that works out to roughly 11,017 miles between disengagements.
That's still a lot better than Apple's rate of one "important disengagement" every 2,005 miles.

Of course, I have no idea how, or by what standard, Waymo reported disengagements... But why doesn't Apple just tweak the way they report and come out on top then? Why stop at 2,005 miles?

Yes, my question was exactly that. We don't know whether Apple was being more honest in the first place. So without a standard, these comparative numbers mean very little. When Apple came up with an explanation, many of us started saying Apple is tweaking the numbers. Perhaps they are simply trying to readjust based on what others do? We just don't know! Apple may not be the best at making AI software, but they are not so bad.
 
Example: the driver knew, intuitively as a human, that a car in front making a right turn would clear traffic and pose no threat at the current speed, but the computer wasn't sure of that, so it would have decelerated slightly. Boom, disengagement.

I don't know how many people have actually been in an autonomous car, never mind lived with one, but I do. While not official from the factory, my car will drive autonomously to a significant degree, and that scenario is one I regularly encounter: the computer insists on braking or throwing up a collision warning when, as a human, I know the situation is 100% safe at my current speed, and I'm not talking about allowing close calls here.

Hmm!

Finally, we seem to be at a point where my car and I could drive in partnership, but manufacturers insist I don't want to drive at all.
 